Cards on main dataset are very slow when editing

Major Blue Belt

Cards built on our main dataset are painfully slow to respond when editing. Even the simplest cards (gauges, bar charts, etc.) take a long time to render. Any suggestions as to what might be causing this?



All Replies
Black Belt

@Sweep_The_Leg ,

 

Data volume can be a contributing factor.

Filtering on DateTime instead of Date doesn't help either.

If a dataset was updated recently, performance can feel sluggish until the cache / query history warms up.

Are you working with a Fusion?

 

Maybe build cards on a 'Dev dataset' that has a subset of the data, and once a card is finished, transfer it to the Prod dataset.
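One way to carve out that kind of 'Dev dataset' is to keep just a recent slice of the production rows. A minimal sketch in Python (the column names and cutoff are made up for illustration, not from the thread):

```python
from datetime import date

# Hypothetical production rows; a dev copy only needs enough data
# to build and validate cards quickly.
prod_rows = [
    {"order_date": date(2018, 5, 1), "amount": 10.0},
    {"order_date": date(2020, 8, 15), "amount": 25.0},
    {"order_date": date(2020, 9, 2), "amount": 40.0},
]

# Keep only recent history for the dev dataset.
cutoff = date(2020, 1, 1)
dev_rows = [row for row in prod_rows if row["order_date"] >= cutoff]

print(len(dev_rows))  # 2
```

In Domo itself the equivalent would be a dataflow with a date filter feeding a separate output dataset; the point is just that card-building iterations run against far fewer rows.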

 

Then once your dashboard is complete, kick it over to support and ask if they can optimise the dashboard for you.


Jae Wilson
Check out my Domo Training YouTube Channel

**Say "Thanks" by clicking the heart in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"
Major Blue Belt

The dataset does have ~5M rows, but I'd say this all started about 2-3 months ago. We've been using this dataset for a few years, and obviously it's grown with transactional volume, but the cards themselves are quite simple: current quarter, date (not date-time), currency, filters by employee name, etc. It does have a lot of beast modes that have accumulated over the years. Do you think that could be a factor?

Black Belt

If this is a production dataset, you should absolutely do everything possible to minimize the strain on the system. Content-management activities like minimizing beast modes, removing unnecessary columns, removing unused cards, etc. will all help.

 

Additionally, follow the best practices a DBA would implement to improve query performance: integrate CASE statements into your ETL, avoid COUNT DISTINCT, materialize date-based functions into columns on the dataset, etc.
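As a generic illustration of moving per-card logic into the ETL layer (SQLite standing in for the dataflow engine here; the table and column names are hypothetical, not Domo syntax):

```python
import sqlite3

# Hypothetical transactions table standing in for the production dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (amount REAL, created_at TEXT)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [(120.0, "2020-09-01 14:32:00"), (40.0, "2020-09-02 09:15:00")],
)

# Instead of a beast mode like
#   CASE WHEN amount >= 100 THEN 'Large' ELSE 'Small' END
# and DATE(created_at) being evaluated on every card render,
# materialize both as plain columns once, in the ETL step:
conn.execute("""
    CREATE TABLE transactions_enriched AS
    SELECT
        amount,
        created_at,
        CASE WHEN amount >= 100 THEN 'Large' ELSE 'Small' END AS size_bucket,
        DATE(created_at) AS created_date   -- plain date column to filter on
    FROM transactions
""")

rows = conn.execute(
    "SELECT size_bucket, created_date FROM transactions_enriched"
    " ORDER BY created_date"
).fetchall()
print(rows)  # [('Large', '2020-09-01'), ('Small', '2020-09-02')]
```

Cards then group and filter on precomputed columns instead of re-running the CASE and date functions per query.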

 

You can use the Domo governance datasets to get a feel for what's being used, but more important is identifying what is NOT being used. This will require some clever ETL and dataset engineering, but the 'not used' list is arguably more valuable than the 'used' list, because that's the content putting unnecessary strain on the system.
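The 'not used' check described above is essentially an anti-join between a list of cards and an activity log. A plain-Python sketch; the input shapes here are made-up stand-ins for whatever the Domo governance datasets actually expose:

```python
# Hypothetical extracts: one row per card, one row per card view.
cards = [
    {"card_id": 1, "title": "Revenue by Quarter"},
    {"card_id": 2, "title": "Old Pipeline Gauge"},
    {"card_id": 3, "title": "Headcount Trend"},
]
activity_log = [
    {"card_id": 1, "viewed_at": "2020-09-01"},
    {"card_id": 3, "viewed_at": "2020-09-03"},
]

# Anti-join: cards with no matching activity are candidates for removal.
viewed_ids = {row["card_id"] for row in activity_log}
unused_cards = [card for card in cards if card["card_id"] not in viewed_ids]

print([card["title"] for card in unused_cards])  # ['Old Pipeline Gauge']
```

In practice you would do the same thing in a dataflow (left join the cards dataset to the activity log and keep the rows with no match), but the logic is identical.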


Jae Wilson
Check out my Domo Training YouTube Channel

**Say "Thanks" by clicking the heart in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"

