Export a very large dataset (millions of rows) to CSV in chunks
Answers
-
Hi @user094816
I don't believe Domo's API supports pagination / chunking of exports. You could use the API to read the entire dataset and then handle the splitting logic yourself in your Python script.
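For example, here is a rough (untested) sketch of that approach using the public DataSet API export endpoint (GET /v1/datasets/{id}/data) with an OAuth client-credentials token; the dataset ID, credentials, and chunk size are placeholders you would replace:

```python
import requests

API_HOST = "https://api.domo.com"
DATASET_ID = "YOUR-DATASET-ID"        # placeholder
CLIENT_ID = "YOUR-CLIENT-ID"          # placeholder
CLIENT_SECRET = "YOUR-CLIENT-SECRET"  # placeholder
ROWS_PER_FILE = 1_000_000             # rows per output chunk file

# OAuth client-credentials token with the "data" scope.
token = requests.post(
    f"{API_HOST}/oauth/token",
    params={"grant_type": "client_credentials", "scope": "data"},
    auth=(CLIENT_ID, CLIENT_SECRET),
).json()["access_token"]

# Stream the full CSV export so the ~38 GB never has to fit in memory at once.
resp = requests.get(
    f"{API_HOST}/v1/datasets/{DATASET_ID}/data",
    params={"includeHeader": "true"},
    headers={"Authorization": f"Bearer {token}", "Accept": "text/csv"},
    stream=True,
)
resp.raise_for_status()
resp.encoding = "utf-8"

lines = resp.iter_lines(decode_unicode=True)
header = next(lines)  # first line of the export is the column header

# Note: this splits on raw newlines, so it assumes no embedded newlines inside
# quoted CSV fields; switch to the csv module if your data contains them.
chunk_no, rows_in_file, out = 0, 0, None
for line in lines:
    if out is None or rows_in_file >= ROWS_PER_FILE:
        if out:
            out.close()
        chunk_no += 1
        rows_in_file = 0
        out = open(f"export_chunk_{chunk_no:04d}.csv", "w", encoding="utf-8")
        out.write(header + "\n")  # repeat the header in every chunk file
    out.write(line + "\n")
    rows_in_file += 1
if out:
    out.close()
```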
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
-
Hi @GrantSmith, thank you for your feedback. Exporting data from cards in Domo is slow (even with a date-range filter applied to keep the row count below 10 million) for two reasons:
1) The dataset is very large.
2) Many filters are applied on the card.
I have addressed the second point by moving all the filters into the ETL itself to reduce the data and generate a new dataset.
Now I am looking to export this large dataset: 104 million rows, about 38.5 GB. What is the best approach?
I tried filtering and exporting the dataset from the Domo UI, but it does not export only the filtered rows.
-
Either use Dataset Views to construct filtered views of your dataset, or use the Domo CLI (https://knowledge.domo.com/Administer/Other_Administrative_Tools/Command_Line_Interface_(CLI)_Tool#section_34) and its query-data or export-data commands to create filtered exports.
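If you would rather script it in Python than use the CLI, a comparable (untested) sketch is to pull one filtered slice at a time with the DataSet Query API (POST /v1/datasets/query/execute/{id}) and write each slice to its own CSV. The date column name (record_date), the month ranges, and the credentials below are hypothetical placeholders:

```python
import csv
import requests

API_HOST = "https://api.domo.com"
DATASET_ID = "YOUR-DATASET-ID"        # placeholder
CLIENT_ID = "YOUR-CLIENT-ID"          # placeholder
CLIENT_SECRET = "YOUR-CLIENT-SECRET"  # placeholder

token = requests.post(
    f"{API_HOST}/oauth/token",
    params={"grant_type": "client_credentials", "scope": "data"},
    auth=(CLIENT_ID, CLIENT_SECRET),
).json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}

# One query per month keeps each result set far smaller than the full 104M rows.
# `record_date` is a hypothetical column; use whatever date column your dataset has.
months = [("2023-01-01", "2023-02-01"), ("2023-02-01", "2023-03-01")]  # extend as needed

for start, end in months:
    sql = (
        "SELECT * FROM table "
        f"WHERE record_date >= '{start}' AND record_date < '{end}'"
    )
    resp = requests.post(
        f"{API_HOST}/v1/datasets/query/execute/{DATASET_ID}",
        headers=headers,
        json={"sql": sql},
    )
    resp.raise_for_status()
    result = resp.json()

    out_path = f"export_{start}_to_{end}.csv"
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(result["columns"])  # column names returned by the query
        writer.writerows(result["rows"])    # row values as lists
    print(f"wrote {len(result['rows'])} rows to {out_path}")
```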
Jae Wilson
Check out my 🎥 Domo Training YouTube Channel 👨💻
**Say "Thanks" by clicking the ❤️ in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"0