Reduce loading time of input dataset in Magic ETL
I have a Magic ETL dataflow whose input dataset has 10M rows. Preparing this input dataset alone takes around 20 minutes and consumes most of the ETL processing time.
This ETL is used by a custom app, so ideally we would like it to finish as fast as possible. Currently it runs far too long, and the bottleneck appears to be loading the input dataset.
Is there a better way to run an ETL over input data this large, or any way to improve its performance?
Comments
If you aren't doing much in your Magic ETL, you might consider DataFusion, also known as Blend. It is much better equipped to handle datasets in the millions of rows.
https://knowledge.domo.com/Prepare/Magic_Transforms/DataFusion
**Make sure to like any user's post that helped you.
**Please mark as accepted the post that solved your issue.
Thanks @MarkSnodgrass for the reply. However, we need to do some complex calculations and the ETL needs to be triggered by a custom app, so Blend is not a feasible solution.
Hiya - I just ran a small test in Magic ETL using an input of 10.5M records and simply output the result to another dataset. The whole process took less than 4 minutes. You might want to isolate your load process from the actual transformation process to see whether the load really is the bottleneck. If it is, it's worth contacting Domo helpdesk to check the configuration of your workspace.
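If you pull the data outside Magic ETL (e.g. via the Domo API) for debugging, the same isolation idea can be sketched with a small timing harness. This is a minimal, hypothetical sketch: `load_input` and `transform` are placeholder stand-ins for your real dataset pull and transformation logic, not Domo APIs.

```python
import time

def timed(label, fn):
    """Run fn(), print how long it took, and return its result."""
    start = time.perf_counter()
    result = fn()
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.1f}s")
    return result

# Hypothetical stand-ins for the two phases being compared.
# In practice, load_input would fetch the 10M-row input dataset and
# transform would apply the dataflow's calculations.
def load_input():
    return list(range(1_000_000))  # placeholder rows

def transform(rows):
    return [r * 2 for r in rows]   # placeholder calculation

rows = timed("load", load_input)
out = timed("transform", lambda: transform(rows))
```

Comparing the two printed timings tells you whether the load or the transform phase dominates, which is the same question the test above answers inside Magic ETL.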