Checking the status of a job in the command line

Hi Everyone,


Is there a way to check the status of a workbench job from the command line without combing the log files?


We use A LOT of automated scripting to route files both before and after they are uploaded into Domo, and I'd like to be able to make sure we don't accidentally move a file while it is queued for import or actively being imported. It's important to note these are jobs through an FTP connection, not local.



Best Answer

  • anafziger
    Answer ✓

    Not to talk to myself, but with the help of the team we are working with, I can answer this question for anyone else wondering something similar.


    The command line runs the workbench in the context of the machine (background processes), so no job status is reported back to the command line interface. Viewing the logs (workbench) or looking in the Data Center in Domo is the only way to validate successful job completion. - A Domo engineer


    We plan on using some PowerShell and scraping the log files as a workaround.
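
    For anyone taking a similar approach, here is a minimal sketch of what that log scraping could look like. The log path, job name, and completion wording are all assumptions; check where your Workbench install actually writes its logs and what its completion lines say before relying on this.

    ```powershell
    # Hypothetical sketch -- the log path and the "completed successfully"
    # wording are assumptions; adjust for your Workbench install and version.
    $logPath = "C:\ProgramData\DomoWorkbench\Logs\workbench.log"
    $jobName = "Daily FTP Import"

    # Grab the most recent log line that mentions the job.
    $lastEntry = Select-String -Path $logPath -Pattern $jobName |
        Select-Object -Last 1

    if ($lastEntry -and $lastEntry.Line -match "completed successfully") {
        Write-Output "Job '$jobName' finished; safe to move files."
    } else {
        Write-Output "Job '$jobName' still running or not found; holding files."
    }
    ```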


  • Hi all,

    Can anybody help @anafziger out?


  • For anyone who runs across this thread and is encountering a similar dilemma, here is an important update.


    There are two commands to run Domo jobs from the command line (I could only find one in the documentation):

    1. queue-job - places the job in a queue, where queued jobs run one at a time. The CLI is freed up immediately, so a PowerShell script will continue executing right away.
    2. run-job - runs the job immediately, in parallel with any other jobs that may be running. The CLI will not accept another command until the job has completed, so a PowerShell script will wait at that line before executing the rest of the script.
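
    To make the difference concrete, here is a rough sketch. The wb.exe path and the way the job is identified are assumptions, not confirmed syntax; check your Workbench version's CLI documentation for the exact arguments.

    ```powershell
    # Hypothetical sketch -- executable path and job arguments are assumptions.
    $wb = "C:\Program Files\Domo\Workbench\wb.exe"
    $jobArgs = "..."  # however your Workbench version identifies the job

    # queue-job returns immediately: the next line runs while the job
    # is still waiting its turn in the queue.
    & $wb queue-job $jobArgs
    Write-Output "Reached immediately -- job is queued, not necessarily done."

    # run-job blocks: the next line will not run until the job finishes.
    & $wb run-job $jobArgs
    Write-Output "Job has completed (successfully or otherwise)."
    ```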

    Hopefully this helps someone else as much as it did us!


    Just curious, what documentation were you referring to? I am trying to build a PowerShell script to execute jobs in workbench.

  • shindig

    I'll add to this thread in case it helps anyone. The difference between queue-job vs. run-job played a role for us.


    We needed to run jobs based on whether conditions had been met. To start I wrote a PowerShell script that used queue-job to start the workbench jobs, so when our ETL events wrote to a log the script would check against it for dependencies and queue up jobs that were ready. This was fine and worked well for the most part, but occasionally the queue-job command would error in some way that the job never actually got queued. I also wanted to get the workbench job status, but like @anafziger said it's not accessible via command-line. We left that script as-is for a couple of years because it was working well enough that it wasn't a priority.


    To get the job status at completion, you can use run-job instead, since that will wait for the job to be run before allowing subsequent commands. To scale this, we're using the PowerShell Start-Job cmdlet so each workbench job execution can be kept as a PowerShell job. This way, you know when a job is complete. Once it's complete, we use export-job to write the job details and pull them from the JSON file. This includes last execution details like how long the job took, success status, execution message, etc., which are really helpful for getting a good picture of how your workbench jobs are looking without having to open workbench up every time. Once it's out of workbench you can build alerting off these elements, too.
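
    In case a sketch helps, here is roughly what that pattern looks like. The wb.exe path, the export-job arguments, and the shape of the exported JSON are assumptions; inspect a file exported from your own Workbench version for the real argument syntax and property names.

    ```powershell
    # Hypothetical sketch of the run-job + Start-Job + export-job pattern.
    # Paths, job arguments, and JSON property names are assumptions.
    $wb = "C:\Program Files\Domo\Workbench\wb.exe"
    $jobArgs = "..."        # however your Workbench version identifies the job
    $exportPath = "C:\Temp\job-details.json"

    # run-job blocks until the workbench job finishes, so wrapping it in
    # Start-Job lets multiple workbench jobs run as parallel PowerShell jobs.
    $psJob = Start-Job -ScriptBlock {
        param($wb, $jobArgs)
        & $wb run-job $jobArgs
    } -ArgumentList $wb, $jobArgs

    Wait-Job $psJob | Out-Null   # returns once the workbench job is done

    # Export the job details and read the last-execution info from the JSON.
    & $wb export-job $jobArgs    # plus wherever the export path goes (assumed)
    $details = Get-Content $exportPath -Raw | ConvertFrom-Json
    $details                     # duration, success status, execution message, etc.
    ```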


    Hope that helps someone. Please let me know if there are better alternatives; this is just what we ended up doing.


    @ChrisGainus didn't get a response but I work with him so he's set.

This discussion has been closed.