Is it possible to point a new Dataflow to an existing Dataset? I have an existing dataflow and an existing dataset. I have created a new Dataflow and want to point it to the existing Dataset.
Hi @user19085 ,
How did you create the new dataflow? You would use the dataset that is required to create the dataflow; without a dataset you can't create a dataflow.
However, if you are using SQL to create the dataflow, you can copy and reuse the same SQL with a different dataset; just change the dataset name and column names if required.
@user19085 - Are you wanting to have the new dataflow point to the existing dataset as the output and overwrite the existing dataset or are you wanting to use the existing dataset as an input to your new dataflow?
If you're looking to overwrite the existing dataset, I don't think that's an option with Magic ETL or MySQL ETL, since they create new datasets when the dataflow is first run.
Thanks Grant. Yes, I have a new dataflow and want to point it to an existing dataset and overwrite the data. What I'm finding is that the dataset created by a dataflow seems to be unique to that dataflow. In my mind the target dataset/table should not care where the data is coming from.
Thanks Neeti. I have an existing dataflow with an existing target dataset. I have created a new dataflow to replace the existing one, but what we are finding is that when the existing cards are repointed to the new dataset/table, most of the Beast Modes don't migrate, requiring an extensive rebuild of the cards.
Can you combine the new dataflow into the old dataflow and utilize the old dataflow's output dataset?
If you need to keep track of the existing dataflow data, you could create a new output dataset and rename it to match the old one (even though Domo would still treat it as a different dataset).
You can alter dataflows if you know how to monitor network traffic and send curl requests. Alternatively, you can update the JSON definition of a dataflow using the Java CLI (my recommendation over curl or Postman) and the list-dataflow command.
This is definitely one of those ... use at your own risk kind of things.
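To make the "update the JSON definition" idea concrete, here's a minimal Python sketch of repointing a dataflow's output to an existing dataset. The `outputs` / `dataSourceId` field names and the overall shape of the definition are assumptions for illustration only; inspect the actual JSON that list-dataflow returns in your instance before changing anything.

```python
import copy

def repoint_output(dataflow_def: dict, new_dataset_id: str) -> dict:
    """Return a copy of a dataflow definition with every output
    repointed at new_dataset_id.

    NOTE: the 'outputs' / 'dataSourceId' keys are assumed for
    illustration -- check the real JSON from list-dataflow first.
    """
    updated = copy.deepcopy(dataflow_def)  # leave the original untouched
    for output in updated.get("outputs", []):
        output["dataSourceId"] = new_dataset_id
    return updated

# Hypothetical definition shaped roughly like a dataflow export
definition = {
    "name": "My Dataflow",
    "inputs": [{"dataSourceId": "abc-123"}],
    "outputs": [{"dataSourceId": "old-456"}],
}

updated = repoint_output(definition, "existing-789")
print(updated["outputs"][0]["dataSourceId"])  # existing-789
```

You would then push the edited definition back through the CLI or API; again, at your own risk, and ideally after saving a backup of the original JSON.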
Creating a workflow where you
1) create a copy of the ETL (call it DEV_ETL), then iterate and QA, and
2) copy the contents of DEV_ETL into PROD_ETL
will probably give you the results you're looking for while ensuring that you don't accidentally blow up cards while you're testing and iterating.
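The promote step above can be sketched the same way: copy the transformation logic from the DEV definition into PROD while keeping PROD's name and output dataset, so existing cards stay pointed at the same dataset. The `actions` / `inputs` / `outputs` keys are hypothetical; verify them against the dataflow JSON your instance actually returns.

```python
import copy

def promote(dev_def: dict, prod_def: dict) -> dict:
    """Copy DEV's transformation logic into PROD while preserving
    PROD's identity (name and output dataset), so cards keep working.

    NOTE: the 'actions' / 'inputs' / 'outputs' keys are assumed for
    illustration; check the JSON your instance actually returns.
    """
    promoted = copy.deepcopy(prod_def)
    promoted["actions"] = copy.deepcopy(dev_def.get("actions", []))
    promoted["inputs"] = copy.deepcopy(dev_def.get("inputs", []))
    # deliberately keep promoted["name"] and promoted["outputs"] from PROD
    return promoted

dev = {"name": "DEV_ETL",
       "actions": [{"type": "filter"}],
       "inputs": [{"dataSourceId": "in-1"}],
       "outputs": [{"dataSourceId": "dev-out"}]}
prod = {"name": "PROD_ETL",
        "actions": [],
        "inputs": [],
        "outputs": [{"dataSourceId": "prod-out"}]}

result = promote(dev, prod)
print(result["name"], result["outputs"][0]["dataSourceId"])  # PROD_ETL prod-out
```

The key design point is that only the logic moves; the output dataset ID never changes, which is what keeps Beast Modes and cards intact.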