Salesforce with date stamp

Does anyone have experience visualizing Salesforce data with a date stamp? I am trying to see how our total # of leads is progressing each month. It seems to me that I have to join the Salesforce data with the Domo calendar to do so.

 

Basically, I need to be able to see the # of active leads in any given month in the past and see the trend from month to month. It is like a running total, taking into consideration new leads added minus leads lost in the month.

 

I tried joining the dataset with the Domo calendar and the numbers just do not add up. I am not sure what I did wrong. Here's the SQL code I used to join the two datasets:

 

SELECT d.dt
,h.*
FROM domo_calendar AS d
INNER JOIN `advanced_salesforce_data` AS h
ON CASE WHEN h.`CreatedDate` > h.`CloseDate` THEN d.dt >= h.`CreatedDate` ELSE (d.dt >= h.`CreatedDate` AND d.dt <= IfNull(h.`CloseDate`, CURRENT_DATE())) END
WHERE d.dt >= (SELECT MIN(`CreatedDate`) FROM `advanced_salesforce_data`)
AND d.dt <= CURRENT_DATE()

ORDER BY d.dt
,h.`Account.Name`

 

Any help would be appreciated. 

 

Thanks

 

J

 

Comments

  • I think your best bet is to convert your timestamp to a plain date and use that to join your two datasets.

     

    In a MySQL dataflow you can try:

    select h.*, d.`dt`
    from `advanced_salesforce_data` h
    inner join `calendar_domo_dimensions` d
    on date_format(d.`dt`,'%m-%d-%Y') = date_format(h.`CreatedDate`,'%m-%d-%Y')

     

    If you're using Redshift, you can use:

    select h.*, d."dt"
    from "advanced_salesforce_data" h
    inner join "calendar_domo_dimensions" d
    on d."dt" = trunc(h."CreatedDate")

     

    As a side point, I just worked through something similar, looking at our Salesforce data to see what was open in past months. Because Salesforce doesn't have a robust audit history, I had to do some ETL work to build the full opportunity histories: created date and close date alone were insufficient to tell whether something had been active, since our close date field is manual entry and not systematically tied to stages. This may be something to consider if you haven't dug into your Salesforce data yet.

  • jaeW_at_Onyx (Budapest / Portland, OR)

    So much to chew on here.

     

    1) Don't JOIN on a CASE. Implement your date calculations as separate columns so it's easier to see what you're doing and to troubleshoot. It's Salesforce and SQL; performance 'does not matter', so the time to generate an extra table is negligible. (Or you could use ALTER and UPDATE commands if you'd rather do in-place transforms.)

     

    Logic like "h.`CreatedDate` > h.`CloseDate`" could be an 'isInclude' flag so you can see the results. Also, as @mhouston noted, you're probably not building a comprehensive history if you're doing a daily replace. Consider Recursive Queries or an incremental load strategy.

     

    IfNull(h.`CloseDate`, CURRENT_DATE()) should be a materialized column so that you have this value in your cards. Think about it from the perspective of the end user: I don't want to have to troll through code if I disagree with your filter / output dataset. You could easily include a 'CloseDate_Adjusted' column.
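
    For example, a sketch of that preprocessing step (the flag and column names are illustrative, and the output table name is up to you):

    -- Stage the date logic as plain columns so cards and
    -- troubleshooting can see exactly what was decided
    SELECT h.*
    , IFNULL(h.`CloseDate`, CURRENT_DATE()) AS `CloseDate_Adjusted`
    , CASE WHEN h.`CreatedDate` > h.`CloseDate` THEN 1 ELSE 0 END AS `isInclude`
    FROM `advanced_salesforce_data` h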

     

    If you do all the preprocessing, your goal should be a simple JOIN on BETWEEN and no WHERE clause.
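
    With a staged table like that (hypothetically named `salesforce_staged` here, with `CloseDate_Adjusted` already materialized), the join collapses to something like:

    SELECT d.dt
    , h.*
    FROM domo_calendar AS d
    INNER JOIN `salesforce_staged` AS h
    ON d.dt BETWEEN h.`CreatedDate` AND h.`CloseDate_Adjusted`
    ORDER BY d.dt
    , h.`Account.Name`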

     

    Do have both a date and a dateTime column. For your analysis, the time element seems superfluous, and from the visualization perspective you'll have better performance in Analyzer with a lower-cardinality date-only column.
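
    In MySQL that can be as simple as materializing a DATE() cast next to the timestamp (the output column name is illustrative):

    SELECT h.*
    -- low-cardinality, date-only version for cards
    , DATE(h.`CreatedDate`) AS `CreatedDate_Only`
    FROM `advanced_salesforce_data` h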

     

    @mhouston is right: you'll have difficulty talking about the change in pipeline if you don't start accumulating history. For projects I've worked on, I set up a pipeline where each row of data represents a daily snapshot of the opportunities that changed. This is the easiest way to build a dataset that lets you analyze the change in your pipeline day by day or month over month.
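
    A rough sketch of that snapshot append (the history table name is illustrative, and it assumes a standard Salesforce LastModifiedDate field; in Domo you'd typically implement this as a recursive / append dataflow rather than hand-run SQL):

    -- Append today's state of any opportunity modified today,
    -- stamped with the snapshot date, to a running history table
    INSERT INTO `opportunity_history`
    SELECT CURRENT_DATE() AS `snapshot_date`
    , h.*
    FROM `advanced_salesforce_data` h
    WHERE DATE(h.`LastModifiedDate`) = CURRENT_DATE()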