Export very large dataset (millions of rows) to CSV in chunks
Answers
Hi @user094816
I don't believe Domo's API supports paginating / chunking data exports. You could use the API to read the entire dataset and then handle the splitting logic yourself in your Python script — a rough sketch of that splitting step is below.
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
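To illustrate that split, here is a minimal Python sketch. It assumes the full dataset has already been pulled down to a local CSV (for example via the API export); the file name and chunk size are hypothetical placeholders. pandas reads the file in fixed-size chunks, so the whole export never has to sit in memory at once.

```python
import pandas as pd

# Hypothetical local file produced by a full dataset export from the Domo API.
SOURCE_CSV = "full_export.csv"
CHUNK_ROWS = 1_000_000  # rows per output file; tune to your memory budget

# read_csv with chunksize yields one DataFrame per chunk instead of loading
# the entire file, so even a multi-GB export can be split on a small machine.
for i, chunk in enumerate(pd.read_csv(SOURCE_CSV, chunksize=CHUNK_ROWS)):
    chunk.to_csv(f"export_part_{i:04d}.csv", index=False)
```

With a 1 M-row chunk size, a 104 M-row export would come out as roughly 104 part files, which can then be moved or loaded independently.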
Hi @GrantSmith , thank you for your feedback. Exporting data from cards in Domo is slow (even after applying a date-range filter to keep the row count under 10 M) for 2 reasons:
1) the data is very large
2) many filters are applied on the card
I have solved the 2nd issue by moving all the filters into the ETL itself to reduce the data and generate a new dataset.
Now I am looking to export this huge dataset: 104 M rows, 38.5 GB in size. What is the best approach?
I tried to filter and export from the UI on the Domo dataset, but it does not export only the filtered rows.
Either use Dataset Views to construct filtered views of your dataset, or use the Domo CLI (https://knowledge.domo.com/Administer/Other_Administrative_Tools/Command_Line_Interface_(CLI)_Tool#section_34) to query-data or export-data and create filtered exports; a sketch of the same filtered-query idea via the API follows below.
Jae Wilson
Check out my 🎥 Domo Training YouTube Channel 👨💻
**Say "Thanks" by clicking the ❤️ in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"0
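If the CLI is not an option, the same filtered-export idea can be scripted directly against the Domo Data API. The sketch below reflects my reading of Domo's developer docs and should be treated as an assumption: verify the OAuth and query endpoints, the `table` alias in the SQL, and the response fields (`columns`, `rows`) against current documentation. The credentials, dataset ID, column name, and date range are placeholders.

```python
import csv
import requests

# Assumed endpoints per Domo's developer documentation; verify before use.
API_HOST = "https://api.domo.com"
CLIENT_ID = "your-client-id"          # hypothetical credentials
CLIENT_SECRET = "your-client-secret"
DATASET_ID = "your-dataset-id"        # hypothetical dataset id

# Exchange client credentials for an access token with the "data" scope.
token_resp = requests.get(
    f"{API_HOST}/oauth/token",
    params={"grant_type": "client_credentials", "scope": "data"},
    auth=(CLIENT_ID, CLIENT_SECRET),
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Query one date range at a time so each request returns a manageable slice
# instead of the full 104 M rows. "table" and the `Date` column are assumed.
sql = "SELECT * FROM table WHERE `Date` >= '2023-01-01' AND `Date` < '2023-02-01'"
query_resp = requests.post(
    f"{API_HOST}/v1/datasets/query/execute/{DATASET_ID}",
    headers={"Authorization": f"Bearer {access_token}"},
    json={"sql": sql},
)
query_resp.raise_for_status()
result = query_resp.json()

# Assumed response shape: column names plus row arrays; write one CSV chunk.
with open("export_2023-01.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(result["columns"])
    writer.writerows(result["rows"])
```

Looping that query over successive date ranges produces the chunked, filtered CSV files the thread is asking for, without relying on the card export path.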