I've created a connector that pulls data from a large Dataset (~2M records) using the Dataset Query API.
Since the Dataset Query API does not support exporting huge datasets as CSV, I query in batches of 50,000 records using a limit and an incremental offset, so I'm making several HTTP requests to the Dataset.
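Roughly like this in Python (the dataset id, access token, and table alias are placeholders, and I'm assuming the standard query/execute endpoint):

```python
import requests

API_BASE = "https://api.domo.com/v1/datasets/query/execute"
DATASET_ID = "<dataset-id>"     # placeholder
ACCESS_TOKEN = "<oauth-token>"  # placeholder
PAGE_SIZE = 50000

def fetch_page(offset):
    """Fetch one page of rows using LIMIT/OFFSET pagination."""
    sql = f"SELECT * FROM table LIMIT {PAGE_SIZE} OFFSET {offset}"
    resp = requests.post(
        f"{API_BASE}/{DATASET_ID}",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Accept": "application/json"},
        json={"sql": sql},
    )
    resp.raise_for_status()
    return resp.json().get("rows", [])

rows, offset = [], 0
while True:
    page = fetch_page(offset)
    if not page:          # empty page means we've read everything
        break
    rows.extend(page)
    offset += PAGE_SIZE
```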
I keep issuing these requests, incrementing the offset, until all the data is fetched.
The problem is that the DOMO Query API periodically returns a 400 status code with the message `There was a problem executing the SQL query: Underlying service error: Internal Server Error`. I assume the DOMO API server is under load and fails to execute queries for some period of time; if you retry the same request a few minutes later, it succeeds.
The question is: is there any way to prevent these 400 responses, or any way to determine that the server is under load? I have retry logic for requests, but that causes another issue: the dataset contains federated data that updates periodically, and since I cannot pull all the data through the API in one request, retries stretch out the extraction window and can cause data inconsistency.
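For reference, the retries can be wrapped like this (the retry count and backoff delay are just illustrative values):

```python
import time
import requests

MAX_RETRIES = 5
BACKOFF_SECONDS = 60  # the error seems to clear after a few minutes

def fetch_page_with_retry(offset):
    """Retry a page request when the Query API returns a transient error."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            return fetch_page(offset)  # fetch_page from the snippet above
        except requests.HTTPError as err:
            status = err.response.status_code
            # the 400 "Internal Server Error" responses appear to be transient
            if status in (400, 500, 502, 503) and attempt < MAX_RETRIES:
                time.sleep(BACKOFF_SECONDS * attempt)  # simple linear backoff
                continue
            raise
```

Each retry adds minutes between pages, which is exactly what widens the window for the federated data to change underneath the export.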