It's easy to remove duplicates… how do I KEEP only the duplicates in a dataset?

I have a large dataset that I am constantly adding to, and the unique identifier for each item is a 17- or 22-character text string. It is critical for me to quickly identify whether I have duplicated a previous item when I add to the dataset.

ETL makes it simple to remove duplicates from a dataset… but is there a way to eliminate everything BUT the duplicates? Ideally, I'd like to either:

1. Create an alert any time a new duplicate value is added to the dataset, or

2. Create an output dataset that consists ONLY of the rows that have a duplicate value in a specific column.
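Outside of ETL, the "alert on a new duplicate" idea in option 1 amounts to a set-membership check at insert time. A minimal Python sketch (the function name and sample ID are illustrative assumptions, not part of any ETL tool):

```python
# Minimal sketch of option 1: flag a duplicate the moment it is added.
# Assumes each identifier is a plain string, e.g. a 17-character ID.

def add_item(existing_ids, new_id):
    """Add new_id to the set; return True if it was already present (a duplicate)."""
    is_duplicate = new_id in existing_ids
    existing_ids.add(new_id)
    return is_duplicate

ids = set()
add_item(ids, "ABC123DEF456GHI78")         # first time: not a duplicate
print(add_item(ids, "ABC123DEF456GHI78"))  # second time: prints True
```

In a real pipeline the `existing_ids` set would be replaced by whatever store holds the identifiers already in the dataset.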

 

Thanks in advance for any help.

Best Answer

  • Anthony_G
    Accepted Answer

    Try this:

Items you will need:

  • An Input DataSet (your dataset)
  • A Group By
  • 2 Filters
  • A Join
  • An Output DataSet

Connect the Group By to your input dataset.

In step 1), put the field you want to check for duplicates.

In step 2), also put the same field, but do a Count on it and name it "Count".

Connect this Group By to a Filter, and filter for "Count Greater Than 1".

    Drag another connector from your Input Dataset to the Join, and drag a connector from your first Filter to the Join (basically joining the dataset to itself, after grouping and filtering). Join these two based on the field you were checking for duplicates.

Connect your Join to the 2nd Filter, and filter for "Count is not null".

Connect this Filter to an Output DataSet, which should produce a table showing ONLY the rows whose value appears more than once in the field you were checking.

Optionally, you can also add a Select Columns between the 2nd Filter and the Output DataSet, selecting all columns except the 2nd copy of the field you were checking for duplicates. Because both sides of the Join contain that field, it will appear twice in the final dataset unless you remove it this way.
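The Group By / Filter / Join flow above boils down to: count occurrences per key, keep the keys with a count greater than 1, then join back to the original rows. A plain-Python sketch of that logic (the column name `id` and the sample rows are illustrative assumptions):

```python
from collections import Counter

# Sample rows; "id" stands in for the 17/22-character identifier column.
rows = [
    {"id": "A" * 17, "value": 1},
    {"id": "B" * 17, "value": 2},
    {"id": "A" * 17, "value": 3},  # duplicate of the first row's id
]

# Group By: count occurrences of each id.
counts = Counter(row["id"] for row in rows)

# Filter 1: keep only the ids whose count is greater than 1.
dup_ids = {i for i, c in counts.items() if c > 1}

# Join back to the original rows: keep every row whose id is duplicated.
only_duplicates = [row for row in rows if row["id"] in dup_ids]

print(only_duplicates)  # both rows whose id occurs more than once
```

The second filter in the tile flow ("Count is not null") does the same job as the membership test here: after the join, only rows that matched a duplicated key survive.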

     

    Good luck!

Answers

  • Thank you......that worked perfectly, and your instructions were perfectly clear and easy to implement!

  • Worked a treat. Thanks!

  • user052846

    I tried this but I keep getting duplicates.

    What Join did you use, and which columns from which dataset did you drop?

    My filter found no nulls.
