Committing data before update in dataflow
Hi, I would like to know if there is a way to commit data before an update runs in a dataflow.
We have used Kafka to push records from a file to a dataset. With Kafka we can set how many records we expect, so before pushing to the dataset we can confirm the number of records in the file and then commit it.
Now we would like to avoid using a file and instead push the data to our dataset/dataflow as a constant stream.
Let's say our stream consists of 100 records.
If only 50 records get consumed and the other 50 are delayed, how can we prevent the data from being pushed to the dataset until all 100 records are ready?
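In other words, we want something like a client-side buffer that holds consumed records and only releases them for upload once the expected count has arrived. A minimal sketch of that idea (not a Domo or Kafka API; names like `RecordBuffer` and `push_to_dataset` are hypothetical placeholders for the consume/upload steps):

```python
# Sketch: buffer incoming records and release them for upload only once
# the expected record count has arrived, so partial batches never commit.

class RecordBuffer:
    def __init__(self, expected_count):
        self.expected_count = expected_count  # e.g. 100 records per batch
        self.records = []

    def add(self, record):
        """Buffer one consumed record; return the full batch once complete."""
        self.records.append(record)
        if len(self.records) == self.expected_count:
            batch, self.records = self.records, []
            return batch          # caller pushes this complete batch
        return None               # still waiting for delayed records


def push_to_dataset(batch):
    # Placeholder for the real upload/commit step to the dataset.
    print(f"committing {len(batch)} records")


buf = RecordBuffer(expected_count=100)
batch = None
for i in range(50):                       # only 50 of 100 records arrive...
    assert buf.add({"id": i}) is None     # ...so nothing is committed yet
for i in range(50, 100):                  # the delayed records arrive later
    batch = buf.add({"id": i})
if batch:
    push_to_dataset(batch)                # commit only when all 100 are in
```

The same pattern works with a streaming consumer: keep consuming into the buffer and trigger the dataset push (or stream commit) only when the batch is complete.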