
1 Answer


In principle, no. But if your query is expected to return a very large number of rows, it may trigger a timeout, or even an error message stating that your query is not suitable for synchronous mode.

In that case, please submit your query in asynchronous mode. In a Jupyter notebook, you would write something like the following (note that async became a reserved word in Python 3.7, so recent versions of the Data Lab client expect async_=True instead):

from dl import queryClient as qc

query = 'SELECT ...'  # my complex or long query string
jobid = qc.query(token, query, async=True)  # submit the job asynchronously
# and after a while...
status = qc.status(token, jobid)
if status == 'COMPLETED':
    result = qc.results(token, jobid)
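
If you do not want to check the status by hand, you can poll it in a small loop. This is only a sketch: wait_for_job and its poll/timeout parameters are made up for illustration, and only qc.status() is the actual Data Lab call:

import time
from dl import queryClient as qc

def wait_for_job(token, jobid, poll=5.0, timeout=600.0):
    """Poll qc.status() until the job completes, errors out, or we give up."""
    waited = 0.0
    while waited < timeout:
        status = qc.status(token, jobid)
        if status in ('COMPLETED', 'ERROR'):
            return status
        time.sleep(poll)   # wait a bit before asking again
        waited += poll
    raise TimeoutError('job %s did not finish within %.0f s' % (jobid, timeout))

You could then call status = wait_for_job(token, jobid) and fetch the results with qc.results() as above once the status is COMPLETED.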

Another problem, even in asynchronous mode, is that you may try to load too many rows into memory (e.g. if you are working on the Jupyter notebook server). Instead, write your query results directly to a file in your VOSpace:

jobid = qc.query(token, query, async=True, out='vos://results.csv')
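
When the job finishes, you can copy the output file from your VOSpace to local disk with the Data Lab store client. A minimal sketch, assuming the storeClient.get(token, fr, to) interface (check the Data Lab documentation for the exact signature):

from dl import storeClient as sc

# copy the results file from VOSpace to the local working directory;
# the argument order (token, source, destination) is an assumption here
sc.get(token, 'vos://results.csv', './results.csv')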
by robnik (1.0k points)
