Unable to load this DataSet

I get the error message "Unable to load this DataSet" when trying to select an input dataset in a MySQL DataFlow. Anyone else encounter this issue?

Comments

  • I did get this once when I deleted a DataSet that was created from an upstream Magic ETL and then created a similar, but different, one with the same name. Ultimately I had to recreate the downstream Magic ETL and delete the one that could not load a DataSet; I think I corrupted it a bit by making the changes and deleting it. This doesn't solve your problem, but recreating may be the fastest way to fix it.
  • @tomer, did Shevy's reply help you out?

  • Thanks for following up! I think our issues were different. Recreating the dataset didn't help me, but I ended up figuring it out -- the issue was that the dataset name was too long. That's something Domo should consider: rather than just saying the DataSet couldn't be loaded, it would be helpful to know that it's because the name is too long.

    Thanks for the help!

    Tomer

  • Glad to hear it! Feel free to submit your idea to our "Ideas" section.

  • Yes, agreed. I just encountered this problem as well and all it told me was "Unable to load this dataset". Like you said, it would be helpful to know why. Thankfully I found your suggestion, but had I not, I would have been very frustrated as to why my dataset was not loading.

  • Hi,

    I am also facing the same error "Unable to load this DataSet" when trying to write SQL. It's not because of name length; I tried shortening the name, but it still isn't working. On the other hand, another DataSet with a longer name works fine. Can anyone help me with this?

    Thanks,

    Lakshmi

  • I hope this helps - 

    https://knowledge.domo.com/?cid=troubleshooting


    I am getting an "Unable to load this DataSet" error when attempting to load a DataSet into a MySQL DataFlow

    This error is usually caused by one of the following:

    • You may have one or more column header names that exceed the 64-character limit.

    • You may have row level data in one or more columns that exceeds the 1024-character limit.

    If the problem is in the column header names, you should be able to see this by previewing the DataSet.

    To find out if you have row level data that exceeds the limit, you will have to do more research. Look at the data previews to see whether any columns appear wide (as if there were sentences in the rows).

    You can also pull the DataSet into a Redshift DataFlow, then use the LEN function to find the length of the data or the MAX function to find the largest character count for each column. For more information, see Creating an SQL DataFlow.

     
    Note: Magic does not have header and row limits as MySQL and Redshift do. Cards will load columns with more than 1024 characters but will automatically truncate the data in a given row to the limit.
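    If you can export the DataSet as a CSV, you can also check for both limits locally before building a Redshift DataFlow. This is just a rough sketch, not an official Domo tool; the function name, file path, and limit values are my own illustration of the 64-character header and 1024-character cell limits described above:

    ```python
    import csv

    def find_limit_violations(csv_path, max_header_len=64, max_cell_len=1024):
        """Report column headers longer than 64 characters and cell values
        longer than 1024 characters (the MySQL DataFlow limits)."""
        violations = []
        with open(csv_path, newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            headers = next(reader)
            # Check column header lengths against the 64-character limit.
            for col in headers:
                if len(col) > max_header_len:
                    violations.append(("header", col[:40], len(col)))
            # Check each cell against the 1024-character limit.
            for row_num, row in enumerate(reader, start=2):
                for col, value in zip(headers, row):
                    if len(value) > max_cell_len:
                        violations.append(("cell", col, row_num, len(value)))
        return violations
    ```

    Running it on an exported CSV lists every offending header and cell, which narrows down which column to shorten or truncate upstream.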
  • Thank you. Unfortunately, no. I have another DataSet that is a superset of the problematic one; the bigger DataSet isn't causing issues, but the subset is. If there is anything else I can check, please share.