Importing a huge amount of data from MySQL: is it possible to partition a datasource?
I'm writing this because it seems I cannot properly solve my problem on my own, and I need your help.
So, I have a huge table in MySQL (50 million rows, 100 columns), and we used to import it entirely every day into Domo to pick up all the new data. Of course, this is getting slower and slower, and we cannot keep managing it with a full-reload approach.
For simplicity, consider that we have one date column that we can use for partitioning the table.
Considering how we work on this table, I am actually interested only in the last 7 days of data, and I would like to update only those rows.
I've considered using the merge option in the connector, but this does not work in my case: rows from the last 7 days may also be deleted from one day to the next, and merge would not let me remove those rows.
So, my question is: is it possible to partition my datasource so that every day I delete the last 7 days from Domo (based on the partitioning column) and insert them again from MySQL (so as to have updated information)? Of course, I want to do this without touching all the other rows.
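The daily refresh I have in mind can be sketched as follows. This is only a minimal illustration of the partition-key logic on the extraction side; the table name `orders` and date column `order_date` are hypothetical placeholders, and the actual delete-and-replace of partitions would still have to happen through whatever partition support Domo exposes:

```python
from datetime import date, timedelta

def partition_keys_to_refresh(today: date, window_days: int = 7) -> list[str]:
    """Return the date-partition keys (YYYY-MM-DD) covering the last
    `window_days` days, i.e. the partitions to drop in Domo and re-upload."""
    return [(today - timedelta(days=n)).isoformat() for n in range(window_days)]

def extraction_query(table: str, date_column: str, keys: list[str]) -> str:
    """Build the MySQL query that re-extracts only the rows belonging
    to the partitions being refreshed (hypothetical table/column names)."""
    in_list = ", ".join(f"'{k}'" for k in keys)
    return f"SELECT * FROM {table} WHERE {date_column} IN ({in_list})"

keys = partition_keys_to_refresh(date(2023, 5, 10))
print(keys[0])  # '2023-05-10'
print(extraction_query("orders", "order_date", keys[:2]))
# SELECT * FROM orders WHERE order_date IN ('2023-05-10', '2023-05-09')
```

The point is that the query only ever touches 7 days' worth of rows, so the MySQL side stays cheap regardless of how large the full table grows; the open question is whether Domo can drop and replace exactly those partitions on its side.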
I have already tried using a dataflow that takes the old datasource and the new 7-day rows as inputs and performs the delete + insert manually, but this takes far too long: we need to load the whole dataset (very slow), perform the delete + insert (very fast), and then write the updated dataset back out (very slow again).
If we could just manage somehow to delete rows from a datasource based on a partitioning column, this would make things extremely fast.
Thank you in advance.