Pulling numbers as DECIMAL over SQL

I'm migrating cards to some new DataSets and am checking for conflicts in advance. We've been pushing Excel spreadsheets and will now be pulling from Postgres.

 

Domo has three numeric types: LONG, DOUBLE, and DECIMAL. I cannot figure out how to pull data in as DECIMAL. I care only because the values come in as DOUBLE, and Domo flags that as a type conflict error when you migrate.

 

Does anyone know how to pull numbers out of a SQL source and have them import as DECIMAL? I've tried casting to every standard numeric type in Postgres without luck. I've also tried TRUNC to specific numbers of decimal places (2, 4, and 6), and that hasn't worked, at least as I tried it.

 

Suggestions very welcome and much appreciated!

 

Best Answer

  • DataSquirrel
    Accepted Answer

    I've now resolved this issue with Tech Support and want to update this thread for the sake of the archives. Here's the short version:

    * The Postgres Connector (and probably others) doesn't support pulling data into DECIMAL. Can't be done.
    * When you migrate a card, you get a DataSet mismatch error.
    * The difference between DOUBLE and DECIMAL on charts is... there is no difference.

    So, the only real problem is that you get a warning during migration that you have to ignore. The main feature to add here would be a better report from the migration tool: exact details of which column(s) are missing and/or have mismatched types. I put this on the Ideas forum some time back but can't find it now.

    For reference, here are the solutions available if you really want to get data into a DECIMAL type and your connector doesn't support it:

    * Use a MySQL DataFlow. (I think these work even if your source data is from something other than MySQL.)
    * Create an ETL DataFlow.
    * Use Workbench with a Schema transformation.
    * Not worry about it.

    The DataFlows feel like overkill and a maintenance hassle, so I'm going with "don't worry about it." We've used the APIs to pull the DataSet definitions down, so we can pre-screen them for differences in detail. Without that, it would be pretty scary to say "Yeah, the DataSets don't match...whatever." I've blown up a few cards in the past, and when they're messed up there's no way to get them back, so far as I have found.
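    For anyone who wants to run the same pre-screen, here's a minimal sketch (Python) of the comparison, assuming the schema you pull from the API can be reduced to a list of column name/type pairs — the exact payload shape will depend on the endpoint you use:

    ```python
    # Compare two DataSet schemas (as pulled from the Domo API) and report
    # column-level differences before migrating cards.
    # NOTE: the schema shape here -- a list of {"name": ..., "type": ...}
    # dicts -- is an assumption; adapt it to your actual API payload.

    def diff_schemas(old_schema, new_schema):
        old = {col["name"]: col["type"] for col in old_schema}
        new = {col["name"]: col["type"] for col in new_schema}
        return {
            "missing": sorted(old.keys() - new.keys()),   # in old, not new
            "added": sorted(new.keys() - old.keys()),     # in new, not old
            "type_mismatch": sorted(
                name for name in old.keys() & new.keys() if old[name] != new[name]
            ),
        }

    # Example: Workbench push vs. Postgres connector pull
    workbench = [{"name": "amount", "type": "DECIMAL"}, {"name": "qty", "type": "LONG"}]
    postgres = [{"name": "amount", "type": "DOUBLE"}, {"name": "qty", "type": "LONG"}]
    print(diff_schemas(workbench, postgres))
    # {'missing': [], 'added': [], 'type_mismatch': ['amount']}
    ```

    Any non-empty entry in the report is something to chase down before ignoring the migration warning.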

    For reference, here's a good bit of info from Justin Luczak at Domo on the subject of DOUBLE versus DECIMAL:

     

    Yeah, I wouldn't worry about it. On a practical level it is not going to make a difference to you at all in Domo. It will work the same on the charts: you will see all the values that are in the data and be able to visualize them accordingly.

     

    I think setting up the dataflows will not be beneficial given the time it will take. As far as the Domo environment goes, it is not going to make a difference for you. As I mentioned, we are just storing it as efficiently as possible. So if the values in the column are 6 digits with 4 decimal places, then we will store that column at that size, and if the data changes, the parser will recognize that and allocate more storage. You would see the same thing if you had a Char column specified as Char(1024) but whose largest string is 5 characters long: we won't store it as 1024, we will make it a VARCHAR and adjust. In the dataflows and cards you will still see the same correct strings.
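    The DOUBLE-versus-DECIMAL distinction Justin describes is easy to see outside Domo. A minimal Python illustration of floating-point versus fixed-point values:

    ```python
    # Illustration (in Python, not Domo) of the difference between
    # floating-point (DOUBLE-like) and fixed-point (DECIMAL-like) values.
    from decimal import Decimal

    as_double = float("82.03")    # binary floating point
    as_decimal = Decimal("82.03")  # exact fixed-point value

    # Both display the same for charting purposes...
    print(as_double)    # 82.03
    print(as_decimal)   # 82.03

    # ...but the float is only a binary approximation of the decimal value.
    print(Decimal(as_double))  # a long run of digits that is not exactly 82.03
    ```

    Which is consistent with the point above: for rendering charts the two look identical, and the distinction only matters if you need exact decimal arithmetic.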

     

Answers

  • AS

    You're using Workbench to get the data, then?

     

    In the Workbench "Schema" menu there are options to rename destination columns as well as change column types (which, as you mentioned, you can also do directly in your queries).

    data types.PNG

    Pick decimal from that list if you haven't already tried that.

  • Thanks for answering! It sounds like I wasn't clear: we were using Workbench, and we are moving to a Postgres pull. With our running Workbench setup, the data comes into Domo as DECIMAL. I haven't been able to find any way to match that with the Postgres pull. I've tried every base numeric type in Postgres (smallint, int, decimal, double, real, float, numeric... any I've forgotten), and they come through as either LONG or DOUBLE, never DECIMAL. I've also tried using TRUNC to 2, 4, or 6 decimal places, and it still comes through as a DOUBLE.

     

    Does anyone know how to pull numbers from a SQL source and have Domo set them as DECIMAL?

     

    Thanks!

  • AS

    What data transfer method are you using to pull from PostgreSQL? Are you using the API connector in the Domo web interface? I'm not aware of a way to override the data types coming out of those APIs.

    Could you possibly use Workbench and an ODBC connection with a DSN on that server to connect to your database? That would be pretty flexible, and as long as it's on a Windows machine, it should work.


  • @AS wrote:

    What data transfer method are you using to pull from PostgreSQL? Are you using the API connector in the Domo web interface? I'm not aware of a way to override the data types coming out of those APIs.

    Could you possibly use Workbench and an ODBC connection with a DSN on that server to connect to your database? That would be pretty flexible, and as long as it's on a Windows machine, it should work.


    Right now I'm using the PostgreSQL Connector and will be switching to the SSH version of the connector shortly.

     

    Clever suggestion on using Workbench as a kind of relay; I hadn't thought of that. We're getting away from Workbench in this case, and Postgres will be on Linux. I hate ODBC with a hatred that burns with the fire of 10,000 suns. I'd also rather avoid Workbench, as it would double all of the activity and add another possible point of failure. It feels a bit tail-wags-dog to enlist Workbench just to cast a number properly.

     

    That makes me think: does it matter? Does Domo chart DECIMAL and DOUBLE values differently? One option is to change the existing Workbench setup to make the numbers DOUBLE and see if anything breaks. (Would anything break?) If not, then the new DataSet will match the existing DataSet without Domo seeing any type conflict.

     

    And, for sure, the Postgres connector doesn't have any knobs to turn like Workbench does. You just set up the connection and drop in a single query. But I have found that you can have some impact by using a CAST statement in the SQL of the query. (In my case, the SQL is in a view, but that doesn't change what it does, just where it's stored.)

     

    In Postgres, you can change types using this syntax:

     

     

    value::type

     

    So

     

    mynumbervalue::decimal

    It's the same thing as using CAST, just a different syntax.

     

     

    Anyway, this can make a difference, but I can't find a version that convinces Domo to treat the number as a DECIMAL instead of a DOUBLE. Since I've gone this far, I might as well spell out what I've found.

     

    Below are Postgres types that translate to LONG in Domo:

    smallint
    integer

     

    Below are Postgres types that translate to DOUBLE in Domo:

    bigint
    decimal
    numeric
    real
    float

     

    I think those are all of the base types, which is why I tried TRUNC(value::decimal, 2) (and 4 and 6) to see if they might come through as a DECIMAL.

     

    No luck so far.
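    For what it's worth, the observed mapping above boils down to a small lookup (Python; this reflects only what I saw through the PostgreSQL connector, not documented behavior):

    ```python
    # Postgres-to-Domo type mapping as observed via the PostgreSQL connector.
    # This is a snapshot of observed behavior, not documented guarantees.
    PG_TO_DOMO = {
        "smallint": "LONG",
        "integer": "LONG",
        "bigint": "DOUBLE",
        "decimal": "DOUBLE",
        "numeric": "DOUBLE",
        "real": "DOUBLE",
        "float": "DOUBLE",
    }

    def expected_domo_type(pg_type: str) -> str:
        """Return the Domo column type this connector produced for a given
        Postgres type, or 'UNKNOWN' for types not tested here."""
        return PG_TO_DOMO.get(pg_type.lower(), "UNKNOWN")

    print(expected_domo_type("numeric"))   # DOUBLE
    print(expected_domo_type("integer"))   # LONG
    ```

    Notice there is no Postgres type in the observed set that maps to DECIMAL, which matches the accepted answer above.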

     

    Workbench can do it, so it can be done... I just don't know how. Workbench is just pushing data at Domo over the wire using a combination of public and unpublished APIs... it's not magic. I guess I could set up a capture proxy on the Workbench machine, see what's going over the wire, and then figure out a SQL expression that produces the same format. But that seems like a lot of time to sort out some basic information that should be in the manuals.

     

    Thanks for trying to help out!

     

     

     

  • I don't use the Magic ETL features in Domo, as I prefer to prepare the data in advance, but I did check DataFlows out a few months back. Super cool UI, very nice. I don't want to use a DataFlow in this case because it makes our setup more complicated... but I figured it's worth a try.

     

    I can't get it to work. I get "Unexpected conversion error while converting value" and "Failed to convert data" error strings. I don't care much now as I don't want to use a DataFlow anyway.

     

    ...but this screen showed me something that I didn't expect: The column type names differ between this screen and the API results:

     

    API        UI
    LONG       WHOLE NUMBER
    DOUBLE     DECIMAL
    DECIMAL    DECIMAL (FIXED)

     

     

    I'm not sure that this makes much difference, but it does make things a bit harder to track down.
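    If you're scripting against both the API and the UI, the naming difference above amounts to a small translation table (Python, illustrative):

    ```python
    # Translation between the column type names the Domo API reports and
    # the labels the Magic ETL UI shows, as observed in this thread.
    API_TO_UI = {
        "LONG": "WHOLE NUMBER",
        "DOUBLE": "DECIMAL",
        "DECIMAL": "DECIMAL (FIXED)",
    }

    def ui_label(api_type: str) -> str:
        """Map an API type name to its UI label; pass through unknown names."""
        return API_TO_UI.get(api_type.upper(), api_type)

    print(ui_label("DOUBLE"))  # DECIMAL -- confusingly, not DECIMAL (FIXED)
    ```

    The confusing part is that the UI label "DECIMAL" corresponds to the API type DOUBLE, not to the API type DECIMAL.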

     

    I contacted Tech Support about this issue a few days back and have updated the case with details as I turn them up. So far, I've had no technical response. If I do get back some useful information, I'll post it here for the benefit of the archives.

     

  • Just as a sample, here's one of the errors I was seeing while trying to convert a string column with values like "82.03" into a DECIMAL. Note that the source string column is populated from a numeric source and that there are no blank rows, so I don't think the problem is an obvious flaw in the source data.

     

     

    fail_string_to_decimal.png 
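    A quick way to rule out bad rows before blaming the conversion step is to confirm that every value in the column actually parses as a decimal. A rough sketch in Python, assuming you can export the column values as strings:

    ```python
    # Sanity-check that every value in a string column parses as a decimal,
    # and report the offenders by row index -- useful before blaming the
    # conversion step for "Failed to convert data" errors.
    from decimal import Decimal, InvalidOperation

    def find_unparseable(values):
        bad = []
        for i, raw in enumerate(values):
            try:
                Decimal(raw.strip())
            except (InvalidOperation, AttributeError):
                bad.append((i, raw))  # AttributeError catches None rows
        return bad

    sample = ["82.03", "17.50", " 3.1415 ", "N/A", None]
    print(find_unparseable(sample))
    # [(3, 'N/A'), (4, None)]
    ```

    An empty result means the failure is in the tool, not the data.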

     

     

     

  • And here's a remedial question from me: how do I specify an output DataSet when trying a column conversion? I managed to find a combination that doesn't immediately fail, but the DataFlow Editor doesn't see any valid input source to build an output from.

    how_do_I_name_the_output_dataset.png

    Thanks!

     

     

  • AS

    As far as chart rendering goes, I don't think Domo generally cares. They're just numbers. I agree: don't implement Workbench just for data type changes.

    And making a DataFlow to change data types is probably overkill. I avoid DataFlows for just one or two minor things. If you're doing other prep work, DataFlows are great, but if you have a DataFlow for just a data type change, then you essentially have two almost identical DataSets (the input and the output).

     

  • AS

    I think here you'd need a successfully configured intermediate step in order to specify the output dataset.  So if you can't resolve the error on the prior step, you can't finalize the last step.

  • Aaron, thanks for the comments! I'll see if Tech Support has a fix but, agreed, a DataFlow to fix one thing feels really over the top and is best avoided.

     

    Thanks

This discussion has been closed.