DOMO slow to pull in data

Hi guys,


I have a complaint about Domo not pulling in all of our data.

For example, we run a job every 15 minutes to pull in transaction data, but it doesn't seem to pull everything in; sometimes there's a 2-5 hour delay. This started happening in the last 1-2 weeks, I think.


If I run the exact same query in the SQL console, it shows the most recent orders, whilst in Domo the data seems to lag a few hours behind. Even when manually FORCING an update, it still doesn't pull the data in. Weird, confusing, and it's cost me a lot of headache :(

Best Answer

  • jaeW_at_Onyx (Budapest / Portland, OR)
    Accepted Answer

    If you're using Domo for real-time reporting, you should be on Platform 3+. If you ask Domo support or your account representative about 15-minute updates, they'll probably tell you you're not paying for that level of service (unless it's explicitly written into your contract).


    I hear your point: 'it worked before and now it isn't working.' If I were your consultant, I'd tell you, 'previously it should not have worked, and you got away with it because you had smaller data.'


    If you're looking at a low-latency use case, you should probably be using UPSERT, or incremental loads + APPEND, to get your data into Domo. If you're not doing that, I don't see a reasonable way you can consistently expect 15-minute updates.


    Make sure you're not building a pipeline that plans to transform the entire dataset in Domo every 15 minutes. For low-latency workflows, you must process as little data as necessary. If you're doing an incremental load, you may be able to get away with transforming just the incremental load using Magic ETL (MySQL and Redshift dataflows will not consistently come in under the 15-minute mark) before appending your data to the full history, but I don't recommend doing ETL in Domo at all if you need sub-15-minute latency. Do your transformation outside of Domo.
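The incremental load + APPEND idea can be sketched like this. This is a minimal, hypothetical illustration (the in-memory tables and the watermark logic are made up stand-ins; in practice the filter would be a WHERE clause in your MariaDB query and the append would be the dataset's update mode):

```python
from datetime import datetime, timezone

# Hypothetical stand-ins for the source table and the Domo-side dataset.
source_rows = [
    {"order_id": 1, "updated_at": datetime(2021, 6, 1, 9, 0, tzinfo=timezone.utc)},
    {"order_id": 2, "updated_at": datetime(2021, 6, 1, 9, 20, tzinfo=timezone.utc)},
    {"order_id": 3, "updated_at": datetime(2021, 6, 1, 9, 40, tzinfo=timezone.utc)},
]
domo_dataset = []

def incremental_append(watermark):
    """Pull only rows newer than the watermark and append them.

    Returns the new watermark so the next 15-minute run starts where
    this one left off, instead of rescanning the full history.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    domo_dataset.extend(new_rows)  # APPEND, not REPLACE
    return max((r["updated_at"] for r in new_rows), default=watermark)

# First run picks up everything after 9:10; the second run finds nothing new.
wm = incremental_append(datetime(2021, 6, 1, 9, 10, tzinfo=timezone.utc))
wm = incremental_append(wm)
```

The point is that each run touches only the rows that changed since the last run, which is what keeps the per-run work small enough to fit in a 15-minute window.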


    Lastly, yes, you can use nested Fusions, but again, performance may suffer. I would recommend Dataset Views if you need more robust transformation tooling.


  • GrantSmith (Indiana)

    Hi @Sandis 


    How are you pulling in your data? Are you pulling in only the most recent records, or your entire dataset? You may want to look into utilizing UPSERT in Workbench 5 to insert only the records that changed, thus reducing the amount of processing time needed.
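For intuition, UPSERT means keying each record on a unique column and inserting or overwriting by that key, rather than replacing the whole table. A rough sketch of the semantics (the record shapes and key name here are hypothetical, not Workbench's actual format):

```python
# Existing Domo-side data, keyed by a unique transaction id.
existing = {
    101: {"id": 101, "status": "pending", "amount": 50},
    102: {"id": 102, "status": "shipped", "amount": 75},
}

# Only the rows that changed since the last run are sent across.
changed = [
    {"id": 101, "status": "shipped", "amount": 50},  # updates an existing row
    {"id": 103, "status": "pending", "amount": 20},  # brand-new row
]

def upsert(table, rows, key="id"):
    """Insert new rows and overwrite existing rows that share the key."""
    for row in rows:
        table[row[key]] = row
    return table

upsert(existing, changed)
```

Because only the changed rows cross the wire, the per-run payload stays small no matter how large the full history grows.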

  • jaeW_at_Onyx (Budapest / Portland, OR)

    Updating data every 15 minutes... unless you have a genuine low-latency workflow requirement, it's not recommended. Talk to your AE, but 15-minute updates are a non-trivial strain on Domo if you don't use the 'right' workflow.


    Also, if you have ETL AFTER data ingestion, that's another bit of lag in your dataflow, so if you don't have some sort of incremental load model, yes, your dataflows will take longer to execute.


    If you're using 15-minute updates, Magic ETL is virtually the only method that can even come close to delivering the throughput you need.

  • Thank you @GrantSmith   and @jaeW_at_Onyx 


    It was never a problem before, but it seems to have started just recently.


    I don't want to add an extra tool, as we're pulling from MariaDB. I'm thinking perhaps I need to split this into 2 tables: one that loads everything older than 1 week, say once per day, and one that downloads the most recent orders every 15 minutes. (Whether 15 minutes is long or short depends on how you look at it; we're trying to get as close as possible to real-time transaction reporting.)


    And then probably join the 2 tables via ETL?
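The proposed split could be sketched like this (hypothetical names and an in-memory stand-in for the two queries; in Domo each list would be its own dataset on its own refresh schedule, unioned back together downstream):

```python
from datetime import datetime, timedelta, timezone

now = datetime(2021, 6, 8, 12, 0, tzinfo=timezone.utc)  # pretend "now"
cutoff = now - timedelta(weeks=1)

orders = [
    {"order_id": 1, "created_at": now - timedelta(days=30)},  # old history
    {"order_id": 2, "created_at": now - timedelta(days=2)},   # this week
    {"order_id": 3, "created_at": now - timedelta(hours=1)},  # just now
]

# Dataset 1: everything older than a week, refreshed once per day.
history = [o for o in orders if o["created_at"] < cutoff]

# Dataset 2: the last week only, refreshed every 15 minutes.
recent = [o for o in orders if o["created_at"] >= cutoff]

# Downstream, a union stitches the two back into one reporting table.
combined = history + recent
```

Since the two datasets partition on the same cutoff, a plain union (no join keys needed) reassembles the full table.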

  • GrantSmith (Indiana)


    You can use a Fusion to union the two tables together, which would likely be more efficient than an ETL, but it's up to you.

  • Interesting @GrantSmith 


    Can a Fusion then be pulled into another Fusion?