Data Import Questions

Importing large datasets takes a long time, and some large datasets cannot be imported into Domo at all. What are the maximum capacity and the maximum data transfer speed for Domo? Does data import to Domo differ by connectivity method? Which type of connectivity is the best way to transfer data to Domo?

Best Answer

  • JacobFolsom

    Answer ✓

    @Wenling_Zhen,

    There are some inherent limitations with each connector, depending on many factors, from vendor API limitations like a date-range cap to usage throttling imposed to prevent you from querying too much data. It could also be an issue in the Domo connector code that needs to be investigated.

    I think the best thing to do from here would be to open a specific case for each connector issue to get to the bottom of it. We do have connector specialists in our Support department who can investigate the specifics of your dataset failures. Please reach out with the details, if you haven't already, at http://support.domo.com. Best of luck to you!

    It is also worth noting that API queries sometimes have limitations that require workarounds, such as using the Email Dataset Connector to email in reports that aren't available through the vendor's API framework. This can be a handy alternative.

    Thanks,

    Jacob Folsom
    **Say “Thanks” by clicking the “heart” in the post that helped you.
    **Please mark the post that solves your problem by clicking on "Accept as Solution"

Answers

  • @Wenling_Zhen,

    There are three upload options in Domo:

    • Workbench
    • Connector Framework
    • Dataset API and Stream API

    Within the connector framework there are some limited options: the standard File Upload connector is limited to 256 MB files from a local file connection, while the Advanced CSV connector pulls larger files from an HTTP or SFTP location. Workbench can also push up large files from behind your firewall.
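    To make that decision concrete, here is a minimal sketch in Python. The helper is hypothetical (not part of any Domo SDK); the only fact it encodes is the 256 MB File Upload connector limit described above.

```python
# Hypothetical decision helper -- not a Domo API. It only encodes the
# size threshold and firewall consideration described in this answer.
FILE_UPLOAD_LIMIT_BYTES = 256 * 1024 * 1024  # 256 MB File Upload connector cap

def suggest_upload_method(size_bytes, behind_firewall=False):
    """Suggest an upload path based on file size and network location."""
    if behind_firewall:
        # Workbench runs on-premises and pushes data out through the firewall.
        return "Workbench"
    if size_bytes <= FILE_UPLOAD_LIMIT_BYTES:
        return "File Upload connector"
    # Larger files need an HTTP/SFTP source or a chunked stream.
    return "Advanced CSV connector (HTTP/SFTP) or Stream API"
```

    For example, a 500 MB extract would fall past the File Upload limit and route to the Advanced CSV connector or the Stream API.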


    The Stream API (https://developer.domo.com/docs/stream/overview) breaks a large dataset into smaller chunks and streams them in through the API. It requires setting up a process on your end with a script designed to hit Domo's API.
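    The chunking step can be sketched in Python. This is a minimal illustration, not Domo's SDK: the function name is hypothetical, and the endpoint paths in the closing comment are assumptions to verify against the Stream API docs linked above.

```python
import csv
import io

def split_csv_for_stream(csv_text, rows_per_part):
    """Split a CSV string (first row = header) into headerless part strings.

    Stream uploads typically send each data part without a header row,
    since the schema is declared when the stream is created.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    data_rows = rows[1:]  # drop the header; the stream's schema carries it
    parts = []
    for start in range(0, len(data_rows), rows_per_part):
        buf = io.StringIO()
        csv.writer(buf, lineterminator="\n").writerows(
            data_rows[start:start + rows_per_part]
        )
        parts.append(buf.getvalue())
    return parts

# Each part would then be uploaded (gzip-compressed) to the stream
# execution's part endpoint and the execution committed once all parts
# succeed -- see the Stream API docs above for exact endpoints and auth.
```

    Splitting a 4-line CSV with `rows_per_part=2` yields two parts of two and one data rows respectively, each ready to upload as its own chunk.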


    What specific method are you having trouble with, and what is your file size? I can advise you from there.


    Thanks,


    Jacob Folsom
  • Thanks Jacob for the information. It is helpful!

    I encountered data import difficulties with NetSuite and with Jira.

    Example 1: I had a NetSuite case dataset with around 45,800 rows and 31 columns. The data import failed often.

    Example 2: I want to import the full Jira issue dataset (all 191 columns), but can only import a 3-month date range.

  • Thanks so much!!

    The information is super helpful!

This discussion has been closed.