API to perform 'Run Now' for raw dataset (Orchestrating pushing data into Domo)
I’d like to ask for advice on Domo’s API and what would be the best possible way to solve a specific use case.
My team is currently tasked with pushing data into Domo from S3. The data is first processed in Databricks in a separate ETL step, and we orchestrate all of the steps in Airflow.
After trying out different methods, it seems like the Domo S3 connector is the most stable and performant approach. However, we're not able to find an API to trigger a raw dataset update (essentially the same thing as the 'Run Now' option, but via API).
It seems that Domo raw datasets based on connectors can only be scheduled inside Domo, but that would split our orchestration into two independent parts that implicitly depend on each other's timing.
So far we've been able to orchestrate the data upload with the Domo Dataset API, but that seems far less efficient and reliable; besides, we had to implement several hacks and workarounds.
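For context, this is roughly what our current Dataset API workaround looks like: authenticate with the OAuth client-credentials flow, then replace the dataset contents with a CSV body via PUT /v1/datasets/{id}/data. A minimal sketch (credentials and dataset ID are placeholders):

```python
import base64
import json
import urllib.request

API_BASE = "https://api.domo.com"

def build_token_request(client_id: str, client_secret: str) -> urllib.request.Request:
    """OAuth client-credentials request for the Domo public API."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return urllib.request.Request(
        f"{API_BASE}/oauth/token?grant_type=client_credentials&scope=data",
        method="POST",
        headers={"Authorization": f"Basic {creds}"},
    )

def build_data_replace_request(token: str, dataset_id: str,
                               csv_body: str) -> urllib.request.Request:
    """PUT /v1/datasets/{id}/data replaces the dataset contents with a CSV body."""
    return urllib.request.Request(
        f"{API_BASE}/v1/datasets/{dataset_id}/data",
        method="PUT",
        data=csv_body.encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "text/csv"},
    )

def upload(client_id: str, client_secret: str,
           dataset_id: str, csv_body: str) -> None:
    # Exchange credentials for a bearer token, then push the CSV.
    with urllib.request.urlopen(build_token_request(client_id, client_secret)) as resp:
        token = json.load(resp)["access_token"]
    with urllib.request.urlopen(
            build_data_replace_request(token, dataset_id, csv_body)) as resp:
        resp.read()
```

This works, but it means Airflow has to download the data from S3 and re-upload it through Domo, which is the part that feels far less efficient than letting the S3 connector pull directly.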
We've also checked the documentation for the Domo Streams API, which is said to be built for pushing large amounts of data and could be a way to go. However, as we understand it, a new dataset is created every time an upload is performed, which isn't an option, because the downstream Domo ETL would then have to be rebuilt every time (i.e., start from a different input dataset).
So from what we've gathered, the best way would be to go with the Domo S3 connector, if it could be launched externally via some kind of HTTP request. Could anybody please advise whether there's an API for that, or maybe a beta feature that could be enabled?
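To make the ask concrete, something like the following is what we're hoping exists. The /execute endpoint here is purely hypothetical (it's what we're looking for, not a documented Domo API):

```python
import urllib.request

def build_hypothetical_run_now_request(token: str,
                                       dataset_id: str) -> urllib.request.Request:
    # HYPOTHETICAL endpoint -- not a real, documented Domo API.
    # We want a call that tells Domo: "run the connector behind this
    # dataset now", equivalent to pressing 'Run Now' in the UI.
    return urllib.request.Request(
        f"https://api.domo.com/v1/datasets/{dataset_id}/execute",
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
```

With something like this, the Airflow task after the Databricks ETL would just fire this request and poll for completion, and all orchestration would stay in one place.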
TLDR: Is there a way to perform a 'Run Now' operation for a raw dataset based on an S3 connector via an API / HTTP request?
Thank you in advance, any help is appreciated.