Hi! If this isn't the right forum to post this, please let me know.
TL;DR: I'm trying to automate alerts/tasks from the output of a dataflow so that each row in the output gets its own task, instead of the entire updated dataset getting one bulk "dataset is updated, here you go" task. Two failed attempts are detailed below.
I've created a discrepancy report so that the output of a DataFlow is only the errant rows that need to be addressed by a person. I'd like to automate creating a task for each row in the output dataset to track how each item is dealt with.
Current Recursive DataFlow
On each iteration, a single unprocessed row (identified by a hash computed in another DataFlow; these hashes will always be unique) is added to a staging table, and an alert + task is driven off of the single-row staging table. The hash is then logged to a master hash list against which all future iterations are checked. The exit condition is when there are no rows left to process; the continuation condition is when unprocessed hashes remain.
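For clarity, here's that loop expressed as a rough Python sketch. Everything here (rows, row_hash, staging) is an illustrative placeholder, not a real Domo object; in Domo this logic lives in the visual Magic ETL tiles rather than in code:

```python
# Rough sketch of the recursive DataFlow logic described above.
# All names (rows, row_hash, staging) are illustrative placeholders.

rows = [
    {"row_hash": "a1f3", "issue": "negative quantity"},
    {"row_hash": "b7c2", "issue": "missing vendor"},
]

master_hashes = set()  # master hash list, checked on every iteration
staging = []           # single-row staging table driving the alert/task

def process_next_row():
    """Stage one unprocessed row and log its hash; return False when done."""
    unprocessed = [r for r in rows if r["row_hash"] not in master_hashes]
    if not unprocessed:       # exit condition: no rows left to process
        return False
    row = unprocessed[0]      # take a single non-processed row
    staging.clear()
    staging.append(row)       # the alert + task fire off this one-row table
    master_hashes.add(row["row_hash"])
    return True               # continuation condition: unprocessed hashes remain

while process_next_row():
    pass  # in Domo, each pass would re-trigger the alert/task
```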
usually you'd use that in the context of Domo's writeback connector offering. or at least... that's the dream they sell 😛
a writeback connector is essentially a tile that runs at the end of a Magic ETL flow and usually just... sends data from Domo to another system via API.
in your case, instead of just pushing one JSON array, it sounds like you'd want to loop over each row in the output dataset and send a POST request to whatever endpoint creates a task in your tracking system. something like the sketch below:
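a minimal sketch, assuming you've exported the output dataset to CSV and that your task system exposes a hypothetical POST /tasks endpoint. the URL, token, file name, and payload fields are all placeholders you'd swap for your real system:

```python
# Hypothetical sketch: create one task per errant row.
# TASK_API_URL, API_TOKEN, the CSV file, and the payload fields are
# placeholders -- substitute your actual task system's endpoint and schema.
import csv
import requests

TASK_API_URL = "https://tasks.example.com/api/tasks"  # placeholder endpoint
API_TOKEN = "your-token-here"                          # placeholder auth

with open("discrepancy_report.csv", newline="") as f:  # exported output dataset
    for row in csv.DictReader(f):
        resp = requests.post(
            TASK_API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            json={
                "title": f"Resolve discrepancy {row['row_hash']}",
                "description": str(row),  # full row for context
            },
            timeout=30,
        )
        resp.raise_for_status()  # fail loudly if a task isn't created
```

a real writeback connector would add batching, retries, and logging on top, but the core of it is exactly this: one POST per row.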