Old columns of data continue to appear in updated DataSet

We recently updated a Salesforce App and many of the field names changed at the API level. I've updated the Domo DataSet to collect the new fields, but I still see all of the old fields/columns of data too. 


If I refresh the preview on that DataSet, the old columns drop off, but it's only temporary. If I go out of the DataSet and go back in, the old columns are back. When I try to use the DataSet anywhere else, say a DataFlow, I see all of the old, stale columns along with the new ones.


I have the DataSet marked as Replace, not Append --- so why isn't the old data gone?


Any help would be great!


  • @Andrea_F is there new data coming into the old columns? Was the column change the addition of new columns or is this a change in the name of the column?

    Alex Peay
    Product Manager
  • This is a dataset where I use the check boxes to pull 35 fields of data from Salesforce.


    I've changed the dataset so that 20 of the original fields are still selected, 15 were unselected, and 15 new fields were selected.


    Now, when I look at my data, I have all 50 columns of data.


    So, no, new data isn't coming into the old columns -- and I would like the old columns to drop off.


    Obviously, I could just make a new dataset in situations like this, but depending on how many cards feed off the dataset in question, that could be a real pain. 

  • I would assume that you are doing an append on the DataSet when new data comes in, is that correct?


    Do you need to exclude the old columns explicitly? Based on what you have explained, this is working as designed.


    You have a DataSet with 35 columns, and you have it set up with an Append update method. The DataSet runs and brings in data for X days. At some point you decide to change the columns being brought in, removing 15 columns and adding 15 new ones. Since this is an append, you will no longer get data in the 'old' columns, but they will continue to exist in the DataSet. You will begin to get data in the 'new' columns.


    Given that this is a SFDC DataSet you will want this running as an Append update method so you don't lose data over time.


    If you must remove the 'old' columns, you will want to remove them through a DataFlow or Fusion, but continue to append the data so the historical data remains in the 20 original columns that did not get removed.

    Alex Peay
    Product Manager
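The column behavior Alex describes, and the downstream pruning he suggests, can be sketched outside Domo with a small pandas example. This is only an illustration of the mechanics: pandas stands in for the DataSet's storage, and the column names and sample values are hypothetical.

```python
import pandas as pd

# Day 1: rows arriving under the DataSet's original schema (hypothetical columns).
old_rows = pd.DataFrame({"Project__c": ["P-1"], "Status__c": ["Open"]})

# Day 2: the connector now selects a renamed column instead of the old one.
new_rows = pd.DataFrame({"MPM4_BASE__Project__c": ["P-2"], "Status__c": ["Open"]})

# Append semantics: new rows stack onto the old ones, so the stored schema
# becomes the UNION of old and new columns. The old column does not drop off;
# it simply holds no values (NaN) for the newly appended rows.
appended = pd.concat([old_rows, new_rows], ignore_index=True)
print(list(appended.columns))
# ['Project__c', 'Status__c', 'MPM4_BASE__Project__c']

# Downstream pruning (what a DataFlow or Fusion step would do): select only
# the columns you still want before anything else consumes the data.
pruned = appended[["MPM4_BASE__Project__c", "Status__c"]]
print(list(pruned.columns))
```

The same union-of-schemas effect is why the stale columns show up in every DataFlow fed by the DataSet: consumers see the stored schema, not the connector's current field selection.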
  • No, this is not an append - it's a replace. That's why I'm surprised that the old data still remains.

  • It took a little while but I think I am understanding now. 


    When you said 'field names changed in the API' does that mean that you changed these columns in the source system? In this case SFDC?


    Then when you came to Domo, the old columns no longer showed up in the dropdown/check list, but you selected the new ones?


    Of the 15 old columns that you no longer want in the report, did all of those names change in SFDC? I am thinking what may be happening is that when you changed those columns in SFDC, they dropped out of what we call 'discovery' (which is what we use to determine which fields you have access to), but since they had previously been selected they are still running, and we have not given you a chance to de-select or reset your selections for those fields.


    This would only be happening for columns that changed in the source system. Did any of the columns that you no longer want keep the same name? If so, were you able to successfully unselect them and have them be removed from the report?


    Alex Peay
    Product Manager
  • I have seen this issue a few times - sometimes it worked and sometimes it didn't. When it didn't, I found that deleting the original DataSet and creating a new DataSet with the same name often fixed the problem and allowed for minimal maintenance. Other times I just made a second DataSet and changed where the cards pointed - if there are not a lot of cards, this worked well. Neither approach is perfect, but both are fairly quick and solve the issue when the replace did not work.
    Dojo Community Member
    ** Please like responses by clicking on the thumbs up
    ** Please Accept / check the answer that solved your problem / answered your question.
  • JoSaCh

    I too have run into this issue fairly consistently. My understanding of the issue per Domo Support is that it's a caching issue. While you can delete DataSets and create new ones, that's really more of a band-aid approach and doesn't address the root cause. Clearly, the solution can't and shouldn't be to delete the DataSet and create a new one; this essentially requires everyone to "get it 100% right" the first time.

  • @JoSaCh can you explain the process you went through to get to this issue?


    I am trying to zero in on what is causing the issue and I would like to confirm the process that leads to this. Main items are:


    1.) Did a change happen in the 3rd party DataSource (i.e. Salesforce) and then the columns didn't update in Domo?

    2.) Are you also doing a replace?

    3.) Are you bringing in the same data just with a different column/report name?


    I agree that delete and start again is never the right answer. We are looking into why this is happening and will get back to you as we have more information. The more detail you can share about what leads to this issue, the faster I can get to the root cause.


    Thanks for your feedback and for using Domo.

    Alex Peay
    Product Manager
  • For me it was Magic ETL (if I remember) and I was making a lot of changes - I thought perhaps I made changes too quickly and the system did not keep up. I didn't worry too much about it. It has happened two or three times, but I have not been doing new ETLs recently. Sorry I can't give you more details!
    Dojo Community Member
  • Glad to know I'm not the only one seeing this! As far as re-creating this in your environment, here's my situation. Hope this helps!


    The Salesforce "application" we used for tracking our projects got updated --- and with it many of the underlying API field names.


    So, for example, Project__c is now MPM4_BASE__Project__c and the associated Account record went from Account__c to MPM4_BASE__Account__c.


    My main data source is pulling fields from Salesforce using the "Browse Objects and Fields" method. I have unchecked the old fields and checked the new ones. The DataSet is doing a REPLACE each day.


    So, while I do see the new columns of project data, I'm also still seeing the old columns too. 


    I've got several DataFlows and Magic ETLs that run off of this primary data source --- and those all see the new fields, but still see the old ones too.
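For a rename like Project__c becoming MPM4_BASE__Project__c, one way to keep history continuous while the old column lingers is to coalesce the old and new columns into a single stable column in a downstream step. A minimal pandas sketch of that idea (the DataFrame stands in for the raw DataSet, the sample values are hypothetical, and nothing here is Domo-specific):

```python
import pandas as pd

# Raw DataSet after the SFDC app update: older rows populated the old
# column, newer rows populate the renamed one (hypothetical sample data).
raw = pd.DataFrame({
    "Project__c": ["P-1", None],
    "MPM4_BASE__Project__c": [None, "P-2"],
})

# Coalesce: prefer the new column, fall back to the old one, then drop both
# originals so downstream cards see a single, stable column name.
raw["Project"] = raw["MPM4_BASE__Project__c"].fillna(raw["Project__c"])
clean = raw.drop(columns=["Project__c", "MPM4_BASE__Project__c"])
print(clean["Project"].tolist())  # ['P-1', 'P-2']
```

The equivalent in a SQL DataFlow would be a COALESCE over the two columns; either way, cards keep pointing at one column name across the rename.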

  • I see a very similar issue in my Google Analytics DataSets - one of the view names was changed, and yet the column still displays the old view name on every row rather than updating it to the new view profile naming convention, even though we are using a replace rather than an append.