I have uploaded a decent-sized (~250 MB) FITS table to my VOSpace with the datalab command-line tool. I now want to try accessing it with SQL-like queries, so I've used qc.mydb_import to import it into MyDB. With a small test file this works: it creates a table that I can query successfully. With the larger file, it returns the error below after several minutes. This has happened three times in a row, so it doesn't seem to be just a glitch. It also doesn't matter whether I pass an authentication token in the call.
'<html>\r\n<head><title>504 Gateway Time-out</title></head>\r\n<body>\r\n<center><h1>504 Gateway Time-out</h1></center>\r\n<hr><center>nginx/1.27.3</center>\r\n</body>\r\n</html>\r\n'
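For reference, this is roughly the sequence I'm running (file and table names are placeholders, and I'm paraphrasing the mydb_import arguments from memory, so don't treat the exact call as gospel):

    # Shell: upload the FITS table to VOSpace with the datalab CLI
    #   datalab login
    #   datalab put fr=./mytable.fits to=vos://mytable.fits

    from getpass import getpass
    from dl import authClient as ac, queryClient as qc

    token = ac.login('my_username', getpass())

    # Import the uploaded table into MyDB; this is the call that comes
    # back with the 504 for the ~250 MB file (arguments paraphrased)
    qc.mydb_import('mytable', 'vos://mytable.fits', token=token)

    # Sanity check once the import finishes
    print(qc.query(sql='SELECT COUNT(*) FROM mydb://mytable', fmt='csv'))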
Is this error triggered by a specific file-size or time limit? Is there a practical limit on how large a FITS table imported this way can be?
[Edit: at least one of these attempts actually loaded the table into MyDB. Actually, it seems at least two did, because I wound up with a table twice the intended length, presumably because the repeated imports appended rather than overwrote. I deleted the table and started over (roughly as sketched below) and managed to get a working copy. So this approach does seem to work, more or less, despite the error message. But I'd still like an answer to the general question.]
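For what it's worth, the cleanup was roughly along these lines (placeholder names again; treat it as a sketch, not a recipe):

    from dl import queryClient as qc

    # Row count came back at twice the expected length
    print(qc.query(sql='SELECT COUNT(*) FROM mydb://mytable', fmt='csv'))

    # Drop the doubled table and import again from scratch
    qc.mydb_drop('mytable')
    qc.mydb_import('mytable', 'vos://mytable.fits')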
Ultimately, I'm looking for a way to extract samples from a 17 GB FITS file. That would obviously be a lot more challenging than my 250 MB file, both for the upload and for the query. Maybe I'm headed in the wrong direction here and should look for a solution outside Data Lab.
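To make the goal concrete, here is a minimal sketch of the kind of sampling I mean, done locally with astropy rather than Data Lab (file name is a placeholder):

    import numpy as np
    from astropy.io import fits
    from astropy.table import Table

    # memmap=True avoids loading the whole 17 GB table into memory;
    # only the rows we index actually get read from disk
    with fits.open('big_catalog.fits', memmap=True) as hdul:
        nrows = hdul[1].header['NAXIS2']       # row count of the table HDU
        rng = np.random.default_rng(42)
        # e.g. a 1% random subsample of rows
        idx = np.sort(rng.choice(nrows, size=nrows // 100, replace=False))
        sample = Table(hdul[1].data[idx])      # fancy indexing copies the selected rows

    sample.write('sample.fits', overwrite=True)

Something like that works locally, but I'd still like to know whether the upload-to-MyDB route is viable at these sizes.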