Is there a way to add a "Last Updated On....." label to a card? Our data updates roughly every 2-3 hours and we would like the users to see on the card when the data was last updated. Example: Last Updated: 2/14/2020 10:21AM MST
A couple options for this. At the bottom of the card, it shows when the dataset was last updated. It looks like this:
You can also create a beast mode field and make it display in the summary number area. Your beast mode formula would look something like this:
CONCAT('Last run at: ',MAX(`_BATCH_LAST_RUN_`))
And it would display like this:
You could apply some additional date/time functions within your beast mode formula to get it formatted to your preference.
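For example, Beast Modes support MySQL-style date functions, so something like the following should get you close to the format in the original question (assuming the `_BATCH_LAST_RUN_` column exists on your dataset; exact function availability can vary by instance):

```sql
-- Hypothetical formatting sketch, not a verified formula:
-- %c/%e = month/day without leading zeros, %l:%i%p = 12-hour time with AM/PM
CONCAT('Last Updated: ', DATE_FORMAT(MAX(`_BATCH_LAST_RUN_`), '%c/%e/%Y %l:%i%p'))
```

One caveat: the stored timestamp is typically UTC, so you may need to adjust it before formatting if you want a specific timezone label like "MST".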
Not all datasets necessarily have the `_BATCH_LAST_RUN_` date column (which ... admittedly is a bit of an oversight).

In the absence of the column, you could pull that data from the Domo Governance dataset and JOIN it into your dataset. Just be aware that the data in the Governance dataset will only be current as of the last time it was run.
No, that's not at all what I said!
Of course you can get it from the Dataset API.
If I needed up-to-date information like this in Domo, I think I'd try to find a clever way to script the collection of this information for a subset of datasets into a CSV via the Java CLI, and then automate pushing that data into a dataset periodically.
This tutorial video isn't specifically about your use case, but it does walk you through setting up and using the Java CLI if you haven't touched it before. https://www.youtube.com/watch?v=ysUHYaICYD8&list=PLUy_qbtzH0S4eYAu13c-NMnyOuxJwspzh&index=1
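If you go the API route instead, the public Dataset API returns an `updatedAt` timestamp for each dataset that you could collect on a schedule. A minimal sketch of turning that response into a "Last Updated" label (the endpoint URL is Domo's public API; the token and dataset id are placeholders, not values from this thread):

```python
import json
from datetime import datetime, timezone

def format_last_updated(dataset_json: str) -> str:
    """Build a 'Last Updated' label from a Dataset API response body.

    The API returns updatedAt as an ISO-8601 UTC timestamp,
    e.g. "2020-02-14T17:21:00Z".
    """
    info = json.loads(dataset_json)
    ts = datetime.strptime(info["updatedAt"], "%Y-%m-%dT%H:%M:%SZ")
    ts = ts.replace(tzinfo=timezone.utc)
    return ts.strftime("Last Updated: %m/%d/%Y %I:%M%p UTC")

# Hypothetical usage (requires the `requests` package, an OAuth access
# token, and a real dataset id -- all placeholders here):
#
#   import requests
#   resp = requests.get(
#       f"https://api.domo.com/v1/datasets/{DATASET_ID}",
#       headers={"Authorization": f"Bearer {TOKEN}"},
#   )
#   print(format_last_updated(resp.text))
```

From there, the formatted values for your subset of datasets could be written to a CSV and pushed back into Domo on a schedule, as described above.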
Thank you for your reply. Actually, this raises another question for me.
Is there any feature for deleting specific records in a dataset other than using a dataflow (ETL / MySQL, etc.)?
I know the data repair feature can delete the records from each update, but what if we want to delete only part of the records from the last run, for instance?
Ultimately the answer is 'yes, sort of'. But it's much more nuanced.
The files in Domo's Vault (where your dataset gets stored and backed up) are most akin to a file-based data lake. Think Amazon S3, or even Dropbox.
Because data is functionally stored as a text file, you can't really ask to "delete a record" because ... it's a text file. There's no easy way to access one row of data in the CSV without piping the whole thing through some sort of parsing / processing engine (like an ETL).
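To make that concrete: "deleting a row" from a CSV means streaming the entire file through a filter and writing a new file. A minimal sketch (the column name and keys are hypothetical):

```python
import csv
import io

def delete_rows(csv_text: str, key_column: str, keys_to_drop: set) -> str:
    """Rewrite a CSV, keeping every row whose key_column value is not in keys_to_drop.

    There is no in-place deletion: the whole file is re-parsed and re-written,
    which is exactly the ETL-style pass described above.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        if row[key_column] not in keys_to_drop:
            writer.writerow(row)
    return out.getvalue()
```

So even a one-row "delete" costs a full pass over the file, which is why the paradigm shift below matters.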
You're accustomed to 'just' deleting records in a database table because that recordset is stored in a structure that's fundamentally designed for that task (CRUD operations). The power of data lake architecture is its ability to CHEAPLY and RAPIDLY store massive text files (but not necessarily to apply CRUD operations).
So ... change your paradigm as you think about how the data is stored in Domo / Data Lakes.
THAT SAID: yes, you can 'just delete' a row or set of rows if you use PARTITIONING or UPSERT to store the data. PARTITIONING will (like the physical encyclopedias of yesteryear) divide your dataset into volumes and store each volume as a separate file. So, just like DELETE-ing or REPLACE-ing one (entire) book in an encyclopedia set, you can delete or replace one partition of a PARTITIONed dataset.
With UPSERT, your data is defined as having a key column that uniquely identifies each row of your dataset. You can use the Java CLI or API to delete or replace a list of upsert keys (and their associated values).
You don't delete records with partitioning so much as replace a volume. This is a Google-able topic, but this might give you an idea: https://www.tutorialgateway.org/table-partitioning-in-sql-server/
Flags / options in the Java CLI's upload commands will allow you to define the partition column and values, so just use 'help' in the CLI for examples.
If you're unfamiliar with the topic, it may be a good idea to reach out to Domo Professional Services or a consultant / partner who can walk you through it.