Splitting comma delimited keywords and combining in one column

Use case:

For every account I have a set of comma-delimited keywords stored in one column (think "SaaS,business,b2b,etc."). I ultimately want to create a list of all unique keywords, then compare that list against my dataset to measure frequency ("How many accounts contain 'SaaS'?").


My feeble attempt:

I've managed to split the column into 25 keyword columns, but I'm stumped as to how to turn those 25 columns back into one column with one keyword per row. From there I want to de-dupe and end up with a clean set of unique keywords.


I wish I could just union these 25 columns into one (think copy-pasting each column below the other).


Any ideas?

Best Answer

  • Valiant
    Valiant 🔵
    Answer ✓

    If you've converted the list into multiple columns, you should be able to then take that table of columns into the ETL section and use the Collapse Columns widget. This allows you to specify a column that will 'hold the column labels' and another to hold the values. If you have no values, you can just use a placeholder.


    Collapsing those columns should give you the single column you're looking for, with one row per value.

    From there you can do a simple SELECT with a GROUP BY to de-duplicate the values.
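    For anyone who wants to see the collapse-then-de-dupe logic spelled out, here's a rough sketch in plain Python. The column names `Keyword_1` .. `Keyword_25` are hypothetical placeholders for your 25 split columns, not actual Domo field names:

    ```python
    # Sketch of collapsing the 25 keyword columns into one de-duplicated list.
    # Column names Keyword_1 .. Keyword_25 are assumed for illustration.
    def collapse_keywords(rows, n_cols=25):
        """Unpivot the keyword columns into one flat, de-duplicated list."""
        seen = set()
        unique = []
        for row in rows:
            for i in range(1, n_cols + 1):
                # Missing columns and blanks are skipped after trimming spaces
                value = (row.get(f"Keyword_{i}") or "").strip()
                if value and value not in seen:
                    seen.add(value)
                    unique.append(value)
        return unique

    rows = [
        {"Keyword_1": "SaaS", "Keyword_2": "B2B"},
        {"Keyword_1": "SaaS", "Keyword_2": "B2C"},
    ]
    print(collapse_keywords(rows))  # ['SaaS', 'B2B', 'B2C']
    ```

    The Collapse Columns widget does the unpivot part for you; the de-dupe here corresponds to the GROUP BY step.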


    Hope that helps,


    **Please mark "Accept as Solution" if this post solves your problem
    **Say "Thanks" by clicking the "heart" in the post that helped you.



  • This solved the first step very well! Thank you!


    Now I have a single column dataset with individual keywords as rows, nicely trimmed of spaces and de-duplicated ("SaaS", "B2B", etc)


    But I'm stumped on step 2...


    I want to go back and compare how many accounts match a given keyword. In Excel I would just do something like SEARCH("SaaS", keywords) and count the matches. Can I do a left join on a substring (keywords to accounts)?


    For example


    Account A: SaaS, B2B

    Account B: SaaS, B2C


    Keyword "SaaS" = 2

    Keyword "B2B" = 1
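    In other words, the count I'm after behaves like this Python sketch (the account data just mirrors the toy example above; the actual dataset layout is different):

    ```python
    # Sketch of the count step: split each account's comma-delimited keyword
    # string and tally how many accounts contain each keyword.
    from collections import Counter

    accounts = {
        "Account A": "SaaS, B2B",
        "Account B": "SaaS, B2C",
    }

    counts = Counter()
    for keyword_string in accounts.values():
        # Use a set so a keyword repeated within one account counts once
        keywords = {k.strip() for k in keyword_string.split(",") if k.strip()}
        counts.update(keywords)

    print(counts["SaaS"])  # 2
    print(counts["B2B"])   # 1
    ```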


    Again, thanks for the help Valiant!