New addition to Webform not flowing through all dataflows

I have a webform that houses site-level details (store name, Region, and Site IDs) in different columns, which I join with datasets from multiple sources so it's easy to filter on things like Region and Site. I recently added a new site, along with all of its details, to this webform. It has flowed through to several cards and pages as you would hope, but there are still some it hasn't reached, and I can't figure out why. The ones it hasn't appeared on are fairly simple dataflows: sugar + webform, and that's it. Any insight or recommendation on what I might be missing would be appreciated. I'm guessing it's something fairly simple that I'm not accounting for.

Answers

  • Hey @user048760, what's the unique identifier across all the datasets that you use to match up? Is it added into the Webform properly? I've had instances in the past where two separate rows showed up in subsequent datasets because the originating webform had some type of hidden special character in the unique identifier that made it just a little different. I would fix it by copying the correct value into the field in the webform (and I could tell it was different because the "unique values" subheader in the column head would stay inflated until I overwrote all the incorrect values).
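
    A minimal sketch of the hidden-character problem, in plain Python outside of Domo (the Site ID values here are made up for illustration):

```python
# Two Site IDs that look identical on screen but differ by a hidden
# zero-width space (U+200B); the values are hypothetical.
from_dataset = "Site-1042"
from_webform = "Site-1042\u200b"  # pasted value carrying a hidden character

print(from_dataset == from_webform)  # False: the join key will not match
print(repr(from_webform))            # repr() exposes the invisible character

def normalize(key: str) -> str:
    """Trim edges and drop non-printable characters such as U+200B."""
    return "".join(ch for ch in key.strip() if ch.isprintable())

print(normalize(from_dataset) == normalize(from_webform))  # True
```

    Printing `repr()` of a suspect key is a quick way to spot characters the webform grid renders invisibly.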

  • GrantSmith
    GrantSmith Indiana 🔴

    Hi @user048760

    What type of join are you doing? Are you doing an inner join? If so, does the specific field you're joining on have the same value in both datasets?
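
    The difference matters because the two join types fail in different ways. A toy reproduction using sqlite3 as a stand-in for a Domo SQL dataflow (table names, columns, and values are all hypothetical):

```python
import sqlite3

# Toy tables showing how a key that differs by a trailing space behaves
# under an inner join vs. a left join.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE transactions (site_id TEXT, sales INT)")
cur.execute("CREATE TABLE webform (site_id TEXT, region TEXT)")
cur.executemany("INSERT INTO transactions VALUES (?, ?)",
                [("S1", 100), ("S2", 200)])
# 'S2 ' carries a trailing space, so it will not equal 'S2' above
cur.executemany("INSERT INTO webform VALUES (?, ?)",
                [("S1", "East"), ("S2 ", "West")])

inner_rows = cur.execute(
    "SELECT t.site_id, w.region FROM transactions t "
    "JOIN webform w ON t.site_id = w.site_id "
    "ORDER BY t.site_id").fetchall()
left_rows = cur.execute(
    "SELECT t.site_id, w.region FROM transactions t "
    "LEFT JOIN webform w ON t.site_id = w.site_id "
    "ORDER BY t.site_id").fetchall()

print(inner_rows)  # [('S1', 'East')] : the mismatched site vanishes
print(left_rows)   # [('S1', 'East'), ('S2', None)] : row kept, region NULL
```

    With an inner join the mismatched row disappears entirely; with a left join it survives but the webform columns come back NULL, which is why cards filtered on Region would silently miss the new site.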

  • @GrantSmith it is a left outer join. I took the value from the dataset and copied it into the webform in the appropriate column, and that is the column I am joining on in both.


  • jaeW_at_Onyx
    jaeW_at_Onyx Budapest / Portland, OR 🟤

    a LEFT JOIN will only match values from your webform that actually exist in your Transactional Data. Make sure to validate that the new values you added exist on the LEFT side.


    When you update a webform, it can sometimes take a few seconds, or even minutes, before the actual data is updated in Domo / Vault (the storage layer). Make sure to wait until it's updated before you trigger the downstream ETL.

    Lastly, depending on your data entry, it isn't unheard of to accidentally add white space in unexpected places, so do quadruple-check for that.
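
    One way to make the join robust to stray spaces is to trim the key on both sides of the join condition. A sketch using sqlite3 in place of a Domo SQL dataflow (names and values are hypothetical):

```python
import sqlite3

# Defensively trimming the join key on both sides so stray spaces from
# data entry cannot break the match.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE transactions (site_id TEXT, sales INT)")
cur.execute("CREATE TABLE webform (site_id TEXT, region TEXT)")
cur.execute("INSERT INTO transactions VALUES ('S2', 200)")
cur.execute("INSERT INTO webform VALUES (' S2 ', 'West')")  # padded key

rows = cur.execute(
    "SELECT t.site_id, w.region FROM transactions t "
    "LEFT JOIN webform w ON TRIM(t.site_id) = TRIM(w.site_id)").fetchall()
print(rows)  # [('S2', 'West')] : the padded key now matches
```

    Note this only guards against leading/trailing whitespace; hidden non-space characters still need to be cleaned at the source.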


    Actually lastly :P for the use case you're describing, and to make my lookup workflow bulletproof, I usually do this using a recursive dataflow plus a hack with the CLI to output a webform.


    P.P.S. I document all sorts of cool hacks and best practices on the DataCrew community site: https://datacrew.circle.so/