Reduce loading time of input dataset in Magic ETL

I have a Magic ETL dataflow that uses an input dataset with 10M rows. It takes around 20 minutes just to prepare this input dataset, which consumes most of the ETL processing time.


This ETL is used by a custom app, so ideally we would like it to finish as fast as possible. Currently it runs far too long, and the bottleneck seems to be the loading of the input dataset.


Is there a better way of doing ETL with such a large input dataset? Or is there any way to improve its performance?


Comments

  • MarkSnodgrass (Portland, Oregon)

    If you aren't doing much in your Magic ETL, you might consider DataFusion, also known as Blend. It is much better equipped to handle datasets in the millions of rows.

    https://knowledge.domo.com/Prepare/Magic_Transforms/DataFusion


  • Thanks @MarkSnodgrass for the reply. However, we need to do some complex calculations, and the ETL needs to be triggered by a custom app, so Blend is not a feasible solution for us.

  • Hiya - I just ran a small test in Magic ETL using an input of 10.5M records and simply output the result to another dataset. The whole process took < 4 mins. You might want to isolate your load process from the actual transformation process to confirm that the load is indeed the bottleneck. If it is, then it is worth contacting the Domo helpdesk to check the configuration of your workspace.


    (attachment: history.png)

  • Thanks @tadashii for the advice. We'll raise the issue to support.