Workbench UX

I would like to preface this by saying that, for the most part, I'm very impressed with the capabilities of the software and the ease with which a user is able to harness those capabilities. That being said, there are a few minor things in Domo's Workbench that I find mildly frustrating and that occur fairly often.


  1. Add a way to save connection settings that can be referenced globally
    1. If you have on-site data storage, you end up with lots of jobs that all share the same connection settings
    2. Saving those connections would make adding new jobs significantly quicker
    3. Also, when a connection changes (the server gets migrated, credentials change, the database gets renamed, switching from dev to prod, etc.), all DataSet Jobs that use that connection could be updated simultaneously by having them reference a saved connection instead of storing all the settings within each job
    4. An ancillary benefit is that those settings wouldn't need to be validated by each DataSet Job individually
  2. Add a way to change or at least display the "Update Method" setting on the "Source" edit screen.
    1. I have a tendency to forget to check whether it's set to Append or Replace when I'm switching between back-filling data and scheduling incremental updates
    2. This has caused data sets containing millions of rows to either get wiped out or duplicated, and when the job is scheduled this occurs repeatedly until the problem is discovered
  3. Change the "verifying transport settings" validation window that pops up whenever you go to the Source Editor screen
    1. Having to wait for the modal window to finish and close every time I go to change or even just view a query is distracting
    2. Even though it only takes a few seconds, the check is often unnecessary because nothing has changed since the last time it was run, which adds to the frustration
    3. Possible solutions...
      1. Remove the pop-up window and move the check to a background thread with a progress bar at the bottom of the main program window
      2. Only run the check when the connection settings are changed, the job is manually run, or changes to the job are saved
      3. Add a button to the source editor screen that manually validates the settings and add a setting to disable the automatic validation when the source page is viewed
      4. Split the query and connection settings and only run the validation when the connection setting editor is viewed
      5. After the validation passes once within an instance of Workbench, set a flag that skips it the next time that job's source page is viewed, unless changes to the connection have been made
      6. Add a Close/Cancel button to the pop up window
      7. Add a Minimize/Ignore button to the pop up window that moves the validation to a background thread
  4. Add a setting to disable the auto-save that gets called after any DataSet Job completes (I'm not sure whether this is currently triggered only by manually run jobs or also when scheduled jobs complete).
    1. A common workflow I go through is to back-fill a large data set, then change the query, update method, and schedule so that the job appends incremental updates automatically rather than back-filling manually.
      1. The auto-save at the end means I can't make any of those changes until the back-fill finishes (which could take an hour or more) because my changes will get overwritten
      2. This has led to three common frustrating scenarios…
        1. I make the changes before the back-fill finishes, then watch as my changes get saved over, and I must make all of those changes again
        2. I'm required to switch gears and start working on something else before the job completes, and the changes to the job never get made. When I realize the job wasn't scheduled, I must determine what data was missed and append just that data. Only after that completes can I change the job to schedule the updates, which opens me up to repeating one of these scenarios
        3. I make the changes before the job finishes and forget that they are going to be overwritten, which leaves me with the frustrations of both prior scenarios
  5. Add a "Save Job As" option
    1. Saving time when a new job is needed that happens to share the base configuration of an existing job
    2. Creating multiple versions of jobs to handle different tasks (back-filling, scheduled appends, repairing missing data, etc.) all related to a single data set
    3. Testing changes/updates to jobs without losing the job's original settings, in case they need to be referenced while making the changes or if the changes need to be reverted
      1. This occurs most often when trying to optimize performance
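To illustrate the globally saved connections proposed in item 1, here is a minimal sketch of the idea: jobs store only a connection *name*, and the actual settings live in one shared registry. All names and structures below are hypothetical; Workbench does not currently expose anything like this.

```python
# Hypothetical sketch: a shared connection registry (not a real Workbench API).
# Migrating a server means editing a single registry entry instead of every job.

connections = {
    "warehouse-prod": {
        "server": "sql01.example.com",
        "database": "Sales",
        "user": "wb_service",
    }
}

jobs = [
    {"name": "DailyOrders", "connection": "warehouse-prod"},
    {"name": "Backfill2019", "connection": "warehouse-prod"},
]

def resolve(job):
    """Look up a job's connection settings by the name it references."""
    return connections[job["connection"]]

# One edit updates every job that references this connection:
connections["warehouse-prod"]["server"] = "sql02.example.com"
assert all(resolve(j)["server"] == "sql02.example.com" for j in jobs)
```

Because every job resolves the same registry entry, validation would also only need to run once per connection rather than once per job.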

I'm new to the platform and personally struggle with attention/memory more than most, so it's possible that I'm alone in my critiques. If that's the case, I would appreciate any suggestions for how I should alter my workflow to avoid some of these headaches.

11 votes



  • btm: Thank you for submitting this @kwmier. I am assigning this to our product manager @michaelf to review and comment.

  • I don't understand @kwmier. Having to enter the same connection parameters to our SQL database for every.single.job. is the highlight of my time in Workbench.

    Love the suggestions you have here. Almost every single one of these I have personally wondered about while using Workbench.

  • Thanks @kwmier for the detailed feedback. We love hearing your feedback. We are working on a few of these and I've created a ticket to further discuss the rest of the items with the team. Thanks!

This discussion has been closed.