Query a DataSet: random 400 "Internal Server Error" responses

I've created a connector that pulls data from a large dataset (~2M records) using the Dataset Query API.

Since the Dataset Query API does not support exporting huge datasets as CSV, I query 50,000 records at a time using LIMIT with an incrementing OFFSET, so I'm making several HTTP requests to the dataset like:

  1. POST https://api.domo.com/v1/datasets/query/execute/ce79d23f-ef7d-4318-9787-ebde54a8c5b4
     Accept: application/json
     Authorization: bearer <your-valid-oauth-access-token>
     {"sql": "SELECT * FROM table ORDER BY \"ID\" LIMIT 50000"}
  2. POST https://api.domo.com/v1/datasets/query/execute/ce79d23f-ef7d-4318-9787-ebde54a8c5b4
     Accept: application/json
     Authorization: bearer <your-valid-oauth-access-token>
     {"sql": "SELECT * FROM table ORDER BY \"ID\" LIMIT 50000 OFFSET 50000"}
  3. ... {"sql": "SELECT * FROM table ORDER BY \"ID\" LIMIT 50000 OFFSET 100000"}

I'm making requests until all the data is fetched.
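The paging loop described above can be sketched as follows. This is a minimal illustration using only the Python standard library; the `rows` field name in the response body, the `build_sql` helper, and the stopping condition (a short page means the last page) are my assumptions, not confirmed details of the Query API response format.

```python
import json
import urllib.request

DATASET_ID = "ce79d23f-ef7d-4318-9787-ebde54a8c5b4"
QUERY_URL = f"https://api.domo.com/v1/datasets/query/execute/{DATASET_ID}"
PAGE_SIZE = 50000

def build_sql(offset, page_size=PAGE_SIZE):
    # A deterministic ORDER BY is required for LIMIT/OFFSET pages to be stable.
    sql = f'SELECT * FROM table ORDER BY "ID" LIMIT {page_size}'
    return sql + (f" OFFSET {offset}" if offset else "")

def query_page(token, offset):
    """POST one page query to the Query API and return the decoded JSON body."""
    req = urllib.request.Request(
        QUERY_URL,
        data=json.dumps({"sql": build_sql(offset)}).encode(),
        headers={
            "Accept": "application/json",
            "Content-Type": "application/json",
            "Authorization": f"bearer {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def fetch_all(token):
    """Page through the dataset until a short (final) page is returned."""
    offset, rows = 0, []
    while True:
        body = query_page(token, offset)
        page = body.get("rows", [])   # response field name is an assumption
        rows.extend(page)
        if len(page) < PAGE_SIZE:     # short page => no more data
            return rows
        offset += PAGE_SIZE
```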

The problem is that the DOMO Query API periodically returns responses with a 400 status code and the message `There was a problem executing the SQL query: Underlying service error: Internal Server Error`. I assume the DOMO API server is under load and fails to execute queries for some period of time; if you retry the same request a few minutes later, it succeeds.

The question is: is there any way to prevent these 400 responses, or any way to determine that the server is under load? I have retry logic for requests, but that creates another issue: the dataset contains federated data that updates periodically, and since I cannot pull all the data through the API at once, retries that span an update can cause data inconsistency.
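For reference, the retry logic I mean is a generic exponential-backoff wrapper like the sketch below, which treats the intermittent 400 "Internal Server Error" as transient. The helper name, attempt counts, and delays are arbitrary choices of mine; it does not solve the consistency problem, since a retry minutes later may hit a refreshed federated dataset.

```python
import random
import time

def with_retries(call, max_attempts=5, base_delay=2.0,
                 is_transient=lambda exc: True):
    """Call `call()` and retry transient failures with exponential backoff.

    Delays grow as base_delay * 2**attempt, plus jitter so that many
    clients do not retry in lockstep against an overloaded server.
    """
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception as exc:
            # Give up on the last attempt, or on non-transient errors.
            if attempt == max_attempts - 1 or not is_transient(exc):
                raise
            delay = base_delay * (2 ** attempt) + random.uniform(0, base_delay)
            time.sleep(delay)
```

Usage would wrap each page request, e.g. `with_retries(lambda: query_page(token, offset))`, with `is_transient` checking for the "Internal Server Error" message in the response body.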

Comments

  • amehdad

    It doesn't appear that the Query API is a good fit for your use case, given the constant request load placed on the server (even with limiting and offsetting). Have you reached out to Support [support.domo.com] to explore alternative solutions?