Data Reconciliation?

The data that I am working with requires a 30-day reconciliation period. Every day we delete the prior 29 days and pull the most recent 30.


Is this possible in Domo in general?


Also, what is the difference between the SQL Server connector and Workbench with ODBC? Is one a pipeline and the other a "make table" system, or are both just "make table" systems?


  • Hi all,

    Can anybody help out @lesliefenwick?


  • @lesliefenwick Depending on what kind of data source you're pulling from, you can accomplish this a number of ways. If you have a connection to a structured data source like MySQL, you can pull that data with a typical SQL query, probably using Workbench with the appropriate ODBC drivers. For example, we have several Oracle datasets that run every day, pulling the most recent 7 days of data, and those 7 days are refreshed on each run.

    If your data is in the cloud and accessed via APIs, your connector configuration will probably allow you to specify a timeframe as well.


    To your other question: Workbench is an application that typically sits on a server behind your firewall, connects via ODBC or other drivers to your on-premises source system, and runs on a schedule (using Windows Task Scheduler in the background) to push data up to Domo's servers. The SQL Server connector opens a scheduled connection from Domo's side, using the authentication credentials and SQL query you provide, and pulls the data up to Domo. In either case, you decide whether the returned rows are appended to the existing data in Domo or replace what is already there. Both Workbench and the SQL Server connector require credentials, a query, a schedule, and an append/replace setting.
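    As a rough sketch of what that scheduled pull looks like, the query just filters to a trailing window. Everything below is illustrative, not from the thread: the `sales` table and its columns are hypothetical, and sqlite3 stands in for whatever ODBC source Workbench would actually hit.

    ```python
    import sqlite3
    from datetime import date, timedelta

    # Stand-in for the on-premises source system; table and column
    # names here are hypothetical.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (sale_date TEXT, amount REAL)")
    today = date(2024, 6, 30)
    rows = [((today - timedelta(days=d)).isoformat(), 100.0 + d) for d in range(90)]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

    # The scheduled query pulls only the trailing 30 days. Workbench (or the
    # SQL Server connector) would run this on its schedule and, with the
    # "replace" setting, overwrite the Domo dataset with the result.
    cutoff = (today - timedelta(days=29)).isoformat()
    recent = conn.execute(
        "SELECT sale_date, amount FROM sales WHERE sale_date >= ? ORDER BY sale_date",
        (cutoff,),
    ).fetchall()
    print(len(recent))  # 30 rows, one per day in the trailing window
    ```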

  • Thanks for the reply!


    I have set up all the connections to the dataset and I am running everything through workbench.


    What I ultimately need an answer to is: can I delete 29 days from a dataset of 365 days and add 30 to the existing dataset within Domo every day, or do I have to replace the entire dataset every day? I need about 3 years of data plus ongoing data for YoY analysis, which equates to about 15 million records. So if I could delete and replace just the last x days each day, that would be great.



  • You certainly could replace all 15 million rows every day, but that would likely be too taxing, and it's overkill. Unfortunately you can't replace only a section of those 15 million rows directly on the source dataset, but you can implement a dataflow that does this for you.


    There are a couple of articles and posts here that go into more detail about this process, but essentially you run a process that takes two inputs: one (the large set) containing all of the historical rows, and one (the small set) containing the most recent 30 days of rows. The process removes the overlapping 29 days from the large set and appends the small set, creating a new output. Tomorrow that output becomes the new large input, and the cycle repeats.
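    The merge step above can be sketched in a few lines. This is only an illustration of the logic, not Domo's actual dataflow tooling; the `(iso_date, value)` row shape and the `reconcile` helper are assumptions made up for the example:

    ```python
    from datetime import date, timedelta

    def reconcile(history, fresh):
        """One dataflow cycle: drop any history rows whose date falls inside
        the fresh 30-day window, then append the fresh rows. Rows are
        (iso_date, value) tuples; the shape is illustrative only."""
        window = {d for d, _ in fresh}
        kept = [row for row in history if row[0] not in window]
        return sorted(kept + fresh)

    # Yesterday's output: 365 days of history ending 2024-06-29.
    end = date(2024, 6, 29)
    history = [((end - timedelta(days=d)).isoformat(), 1.0) for d in range(365)]

    # Today's 30-day pull: 2024-06-01 .. 2024-06-30
    # (29 restated days plus 1 new day).
    today = date(2024, 6, 30)
    fresh = [((today - timedelta(days=d)).isoformat(), 2.0) for d in range(30)]

    merged = reconcile(history, fresh)
    print(len(merged))  # 366: the overlapping 29 days were replaced, 1 day added
    ```

    The overlapping dates carry the freshly pulled values, so any late-arriving corrections inside the 30-day window flow through automatically each cycle.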


