Applying calculations to hundreds of columns in a dataset - best practice?
I have a dataset that has ~1,200 columns and I need to perform calculations on 495 of those columns (simple multiplication and division) and create a new column for each with the calculated value...
Setting this up in ETL would take forever (I'd need 3 calculators, 2 of which would have all 495 columns specified in them), so I thought I'd do it as an SQL transform, but I've run into the "Identifier name must be 64 characters or less" problem. I can't edit the field names before the data gets into DOMO, as it comes from a 3rd-party source, so I figured I could use an ETL to rename the columns that are too long, BUT the "select columns" transformation screen hangs my browser, presumably because there are ~1,200 columns (I'm on a new MacBook Pro and have tried Chrome, Firefox and Safari with the same result).
So I figured I'd do the calculations in Beast Mode and save the results back to the dataset - but surely this isn't best practice?!
Suggestions of the "correct" way to tackle this most welcome!