Has anyone had any success getting near real-time data set refreshes using Domo Stream API? Would you mind sharing the design pattern you used to achieve this?
I'm not sure the Domo Stream API will work for us. I hadn't noticed this part of the documentation before:
The Stream API only supports the ability to execute a “commit” every 15 minutes.
That being said, for data sets that can tolerate 15 minutes of latency, the Stream API would still be an improvement over the Redshift connector we are currently using, provided we can query an existing dataflow for the max source-system change date and then push only new or changed records since that date.
Will the Stream API support this type of design pattern?
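For what it's worth, here is a minimal sketch of the watermark half of that pattern using pydomo's `ds_query` helper. The column name `change_date`, the dataset id, and the exact shape of the query response are all assumptions to verify against your pydomo version; the Query API addresses the dataset as `table` in SQL.

```python
def get_watermark(domo, dataset_id, date_column):
    """Ask Domo for the newest change date already loaded.

    `domo` is a pydomo Domo client. `date_column` is whatever your
    source-system change-date column is called (an assumption here).
    """
    # pydomo's ds_query runs SQL against an existing DataSet; the
    # Query API always refers to the dataset as `table`.
    result = domo.ds_query(dataset_id, f"SELECT MAX(`{date_column}`) FROM table")
    # The Query API returns rows as lists; check the response shape
    # for your pydomo version before relying on this indexing.
    return result["rows"][0][0]

def filter_new_records(records, watermark):
    """Keep only rows changed strictly after the watermark.

    ISO-8601 timestamp strings compare correctly as plain strings,
    so no datetime parsing is needed here.
    """
    return [r for r in records if r["change_date"] > watermark]
```

You would call `get_watermark` once before each push, then upload only the rows that survive `filter_new_records`, so each 15-minute commit carries just the delta.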
I would like to use the Stream API to upload a large number of CSV files arriving at high frequency. Each input file has a date_time appended to its name, so the input files change but the target dataset stays the same. If you have used the Stream API for a similar need, I would appreciate it if you could share your experience, along with Python code for it. We are relatively new to Python, but we are told that the pydomo library has stream support that lets us feed data into Domo directly without using Workbench. Your help on this would be greatly appreciated. Thank you!
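Not a full answer, but here is a rough sketch of how that could look with pydomo's stream client (`create_execution`, `upload_part`, `commit_execution`). The filename pattern `name_YYYYMMDD_HHMMSS.csv` is a hypothetical stand-in for your naming convention, and the dict-style `execution["id"]` access is an assumption to check against your pydomo version. It assumes the stream itself was already created once (e.g. with `CreateStreamRequest` and `UpdateMethod.APPEND`).

```python
import re
from datetime import datetime

def timestamp_from_filename(path):
    """Parse the date_time suffix from an incoming file name.

    Assumes a hypothetical convention like sales_20240105_120000.csv;
    adjust the regex to match your actual files.
    """
    m = re.search(r"_(\d{8}_\d{6})\.csv$", path)
    if m is None:
        raise ValueError(f"no timestamp found in {path!r}")
    return datetime.strptime(m.group(1), "%Y%m%d_%H%M%S")

def upload_batch(domo, stream_id, csv_paths):
    """Upload a batch of CSV files as parts of one Stream execution.

    `domo` is a pydomo Domo client (domo.streams is its stream client).
    One execution per batch keeps you within the one-commit-per-15-minutes
    limit; each file becomes one part, uploaded in filename order.
    """
    streams = domo.streams
    execution = streams.create_execution(stream_id)
    ordered = sorted(csv_paths, key=timestamp_from_filename)
    for part_num, path in enumerate(ordered, start=1):
        with open(path) as f:
            # upload_part takes the raw CSV text for this part
            streams.upload_part(stream_id, execution["id"], part_num, f.read())
    # A single commit makes all parts visible in the DataSet together
    streams.commit_execution(stream_id, execution["id"])
```

In practice we accumulate incoming files for up to 15 minutes, then call `upload_batch` once per window, so the dataset stays the same while the file names keep changing.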