I have a Domo dataset created from a source (Teradata) via some tool. That Domo dataset has 4 weeks of data. I found that one of the four weeks of data is incorrect and would like to delete only that 4th week's data. I checked all the options in Domo and did not find an easy way to delete that week's data.
FYI, I have admin access but am still not able to delete data from a Domo dataset. One thing I tried is creating another dataset from the existing dataset and leaving out the week with the dirty data. But the weekly data is huge, about 130 million rows, so creating one more derived dataset is itself redundant and takes up space. Do we have any interface in Domo where I can write a simple SQL statement like the one below?
delete from domodataset1 where item=1234;
This way I can easily delete data.
There isn't an interface where you can simply take a dataset and delete rows. However, as you mentioned, you can use SQL DataFlows to write a statement that filters out the rows you don't want, then create a secondary dataset as the output. Once you have that, you might not need your original dataset anymore and can delete it if you want to clean things up. That is, if your original dataset wouldn't have been updated anyway (replaced or appended to), you can get rid of it.
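A minimal sketch of such a SQL DataFlow transform. The table name `domodataset1` comes from the question; the `week_start` column and the date value are placeholders you'd swap for whatever identifies the bad week in your data:

```sql
-- Hypothetical DataFlow transform: keep every row EXCEPT the bad week.
-- Column name (week_start) and the literal date are assumptions.
SELECT *
FROM domodataset1
WHERE week_start <> '2019-01-21'
```

Set this transform's result as the DataFlow's output dataset, repoint your cards at it, and then (if the original is no longer being updated) delete the original to reclaim the space.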
In the works is a tool that will let you use Excel to load a dataset straight from Domo, make any changes you want, then publish that file back to Domo over the original dataset. But in this case, 130M rows is a lot, and Excel wouldn't handle that.
There's a hidden function on Domo datasets. If you add ?_f=dataRepair to the end of a dataset URL, a new tab called Data Repair will pop up. There you can see all the uploaded data versions. If you want to clear the dataset, just delete all the versions.
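For example, you'd append the flag to whatever dataset URL you're already on. The instance name and dataset ID below are placeholders, and the exact path may vary by Domo version:

```
https://yourinstance.domo.com/datasources/<dataset-id>?_f=dataRepair
```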
Hidden URL flags like this are mostly a backdoor for Domo employees, but they can be helpful for those who know about them.
With Data Repair specifically, you can only remove pieces of the dataset that came in as part of the historical append chain. Datasets that replace themselves will only have one link in this append chain.
This was an amazing find.
I had duplicate data appended.
I was going to blow away the entire dataset and spend days doing another incremental upload.
This was such a gift.