Best practice for managing datasets and dataflows: when the resulting dataflow is "gospel"

Our organization has only been using Domo for six months, but I'm already seeing some issues that could arise around datasets and dataflows.

Here's our situation: we have a number of datasets coming in from scripts run against a SQL database. That data arrives into our Domo instance, but it isn't ready for production; I need to run it through one or more ETL passes before it finally becomes a dataset ready for prime time.

Can the Dojo speak to some of the measures your organizations take to make sure users are consuming the "correct" data and not something that might still need additional work?

What about other datasets users create that might be considered offspring of these "gospel" sets? We have wanted to allow our users some degree of freedom and transparency to work with data as they see fit, but I (as the MajorDomo, now having to deal with what can only be described as mutant datasets) am having to come up with some sort of plan.

Many thanks!


Brian

Comments

  • AS

    Hi Brian


    It's a pivotal time for your organization at this stage in your Domo maturation, so take charge of the opportunity and lay the groundwork for data governance best practices, well-scoped access rights, and training.


    See this article for some ideas from the Domo data team; we follow quite a few of them. Find what you think will make sense for you, get a system in place, and then train administrators, users, and superusers. In my mind, training is of utmost importance at this stage: if users don't know better, they won't do better.
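    One concrete measure worth pairing with those practices is a naming convention that separates production-ready ("gospel") datasets from work-in-progress ones, plus a periodic audit of the dataset list against it. Here's a minimal sketch; the "PROD_" and "WIP_" prefixes and the helper name are hypothetical, not Domo features, and the list of names could be pulled from Domo's Dataset API or exported by hand:

    ```python
    def audit_dataset_names(names):
        """Partition dataset names into certified, work-in-progress,
        and unclassified buckets based on an agreed prefix convention.

        The prefixes here are an example convention, not anything Domo
        enforces; adapt them to whatever your organization agrees on.
        """
        buckets = {"certified": [], "wip": [], "unclassified": []}
        for name in names:
            if name.startswith("PROD_"):
                buckets["certified"].append(name)
            elif name.startswith("WIP_"):
                buckets["wip"].append(name)
            else:
                # Anything unprefixed is a candidate for review --
                # this is where the "mutant" datasets tend to surface.
                buckets["unclassified"].append(name)
        return buckets
    ```

    Running that over your full dataset list once a week gives you a short review queue of unclassified sets instead of an open-ended cleanup project.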


    Aaron
    MajorDomo @ Merit Medical

    **Say "Thanks" by clicking the heart in the post that helped you.
    **Please mark the post that solves your problem by clicking on "Accept as Solution"
This discussion has been closed.