How to overcome column length limits?
I have a dataset where one column is open text, typically over 2000 characters. When I load this dataset into Domo and try to parse that field with text functions (e.g., SUBSTRING, INSTR), the values in the field are truncated to a limit of about 1024 bytes/characters. When I export the data to CSV, the full text is there, but when I use DataFlows (Magic and MySQL) or Beast Modes, I hit the 1024 limit again.
Any ideas on how to expose the full values? I know the VARCHAR type can support much more than 1024 bytes. I even tried an ALTER TABLE statement to change the column type to VARCHAR(10000), for example, but it returned the same truncated results as before.
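One workaround I'm considering is splitting the long text into chunks of at most 1024 characters before the data reaches Domo, so each piece fits under the limit, and then re-joining them with CONCAT in a MySQL transform or Beast Mode. This is just a sketch of the pre-load side, assuming you can preprocess the rows yourself; the column names (`comment`, `comment_1`, etc.) are made up for illustration:

```python
# Hypothetical pre-load step: split one long text column into several
# <=1024-character columns so no single field exceeds the dataflow limit.
# Column names here are illustrative, not anything Domo-specific.

CHUNK = 1024

def split_text(value, chunk=CHUNK):
    """Return a list of <=chunk-character slices of value (at least one slice)."""
    return [value[i:i + chunk] for i in range(0, len(value), chunk)] or [""]

def explode_column(rows, column, n_chunks):
    """Replace `column` in each row dict with `column_1` .. `column_<n_chunks>`.

    Chunks beyond the text's length are filled with empty strings, so every
    row ends up with the same set of columns.
    """
    out = []
    for row in rows:
        pieces = split_text(row.pop(column))
        for i in range(n_chunks):
            row[f"{column}_{i + 1}"] = pieces[i] if i < len(pieces) else ""
        out.append(row)
    return out
```

On the Domo side, the pieces could then be stitched back together in a MySQL dataflow with something like `CONCAT(comment_1, comment_2, comment_3)` — though I haven't confirmed whether the concatenated result itself escapes the 1024-character limit, so treat that part as untested.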
Comments
Does anyone know how to help on this issue?
I've run into this issue as well. Very long strings in a MySQL dataflow get truncated to 1024 characters. Can someone from Domo comment?
Thank you!
Can somebody help answer?