My ETL fails with no indication why

I had an ETL to create some aggregated data sets from a raw dataset. It ran great for months: new data would get added to the raw dataset, the ETL would run, and the aggregates would be updated. Maybe the raw dataset grew too large, because now it fails and gives no error or indication of why.

I tried creating an ETL to split the dataset into smaller datasets, but that ETL fails as well, still with no indication why. Any help would be appreciated.

Best Answer

  • MarkSnodgrass (Portland, Oregon)
    Accepted Answer

    Have you gone to the History tab and to the right of the Successful/Failed text clicked on the 3 dots to view the details? This should tell you what step in the ETL process is having the issue.

    One common issue is a data type changing for a column in the source dataset that is being used for aggregation.


  • jaeW_at_Onyx (Budapest / Portland, OR)

    If you just open the ETL and there's an obvious error, the tile will be red.

    It is unlikely that the dataflow failed because the dataset is too large, unless the dataflow takes more than 24 hours to run (in which case Domo will force-kill it).

  • stephen_lofgren
    edited May 15

    Thanks, that helped me identify the problem. All the tasks failed and were red, with no indication why. Hovering over the job before clicking to see the details gave me the error. I am still trying to figure out how to solve it, though. The error says

    "Failed to parse data 'nan' as type Number for column" with a specific column.

    I tried creating an ETL to fix it, but that fails with the same error. I even tried changing the collector that creates the data so that the column's type is text, but I still get the same error.

  • jaeW_at_Onyx (Budapest / Portland, OR)

    If I had to guess, when Domo previewed your data (it scans the first N-thousand rows), it determined your data type to be Number. When it got to row N-thousand + 1, it came across the text 'nan', which obviously is not a number, hence the failure.
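    The failure mode guessed at above can be sketched in plain Python. This is a hedged illustration of sample-based type inference, not Domo's actual implementation; the sample size of 1000 rows and the helper names are illustrative only.

```python
# Hedged sketch: infer a column type from the first N rows only,
# the way a data platform's preview might. A bad value beyond the
# sample goes unnoticed until load time.
def looks_numeric(value):
    try:
        parsed = float(value)
    except ValueError:
        return False
    return parsed == parsed  # treat the literal string 'nan' as non-numeric

def infer_column_type(values, sample_size=1000):
    sample = values[:sample_size]
    return "Number" if all(looks_numeric(v) for v in sample) else "Text"

# 1500 clean rows, then a bad one: the sample never sees 'nan',
# so the column is typed Number and the later row fails to parse.
values = ["1.0"] * 1500 + ["nan"]
print(infer_column_type(values))  # Number
print(looks_numeric(values[-1]))  # False
```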

    Can you reupload your data?

    If you have Magic 2.0, you can explicitly tell Domo to set the column type to Text, and then use logic (a formula tile) to correct the non-numeric values. Or define error handling rules in the INPUT tile.
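    The correction a formula tile would perform can be sketched in plain Python: coerce each value and map non-numeric text like 'nan' to NULL. The column name "amount" is hypothetical, and Domo's formula tiles use their own expression syntax, not Python; this only shows the logic.

```python
# Hedged sketch: coerce text to a number, returning None (NULL)
# for anything that does not parse, including the string 'nan'.
def coerce_numeric(value):
    try:
        result = float(value)
    except (TypeError, ValueError):
        return None
    return None if result != result else result  # float('nan') parses, so drop NaN too

rows = [{"amount": "12.5"}, {"amount": "nan"}, {"amount": "7"}]
cleaned = [coerce_numeric(r["amount"]) for r in rows]
print(cleaned)  # [12.5, None, 7.0]
```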

    ALTERNATIVELY, you can build a dataset view where you explicitly set the column type to Text, and then use that view inside a Magic 2.0 dataflow.