Applying calculations to hundreds of columns in a dataset - best practice?
I have a dataset with ~1,200 columns, and I need to perform calculations on 495 of those columns (simple multiplication and division), creating a new column for each calculated value...
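To illustrate the shape of the problem, here's a minimal pandas-style sketch of the kind of per-column calculation I mean. The column names and the formula are placeholders, not the real fields:

```python
import pandas as pd

# Hypothetical example: for each target column, multiply by "rate" and divide
# by "count", writing the result to a new "<col>_calc" column.
df = pd.DataFrame({
    "rate": [2.0, 3.0],
    "count": [4.0, 5.0],
    "metric_a": [10.0, 20.0],
    "metric_b": [30.0, 40.0],
})

target_cols = ["metric_a", "metric_b"]  # in reality, 495 columns
for col in target_cols:
    df[f"{col}_calc"] = df[col] * df["rate"] / df["count"]
```

The point is that the transformation itself is trivial; the problem is purely the number of columns it has to be repeated over.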
Doing this in ETL would take forever to set up (it would need 3 calculators, 2 of which would have all 495 columns specified in them), so I thought I'd do it as an SQL transform, but I've run into the "Identifier name must be 64 characters or less" problem. I can't edit the field names before the data gets into DOMO, as it comes from a 3rd-party source, so I figured I could use an ETL to rename the columns that are too long. However, the "Select Columns" transformation screen hangs my browser, presumably because of the ~1,200 columns (I'm on a new MacBook Pro and have tried Chrome, Firefox and Safari with the same result).
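For what it's worth, the renaming step itself is mechanically simple — here's a rough sketch of what I'd want the rename to do, written as a hypothetical pandas helper (names and the de-duplication scheme are my own assumptions, not anything Domo provides):

```python
import pandas as pd

def shorten_columns(df: pd.DataFrame, limit: int = 64) -> pd.DataFrame:
    """Truncate column names longer than `limit` characters so a downstream
    SQL transform will accept them, de-duplicating clashes with a numeric
    suffix."""
    seen: dict[str, int] = {}
    new_names = []
    for name in df.columns:
        short = name[:limit]
        if short in seen:
            seen[short] += 1
            suffix = f"_{seen[short]}"
            short = short[: limit - len(suffix)] + suffix
        else:
            seen[short] = 0
        new_names.append(short)
    return df.rename(columns=dict(zip(df.columns, new_names)))
```

So the question is really where (if anywhere) in the Domo pipeline a bulk operation like this can run without the UI getting in the way.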
So I figured I'd do the calculations in Beast Mode and save the results back to the dataset - but surely this isn't best practice?!
Suggestions on the "correct" way to tackle this are most welcome!