Is it possible to send an e-mail as the final step of a DataFlow chain?



We are developing a DataFlow that needs to send an email to a list of users after all the steps are done.

I tried to use the smtplib library in a Python script as follows:



with smtplib.SMTP_SSL("", port, context=context) as server:
    server.login(email, password)

This gives the following error:


socket.gaierror: [Errno -2] Name or service not known
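For reference, a complete smtplib sketch of what the snippet above is attempting (host, port, credentials, and addresses are all placeholders; the gaierror typically means the SMTP hostname could not be resolved from the environment where the script runs):

```python
import smtplib
import ssl
from email.message import EmailMessage

# All of these values are placeholders -- substitute your own
smtp_host = "smtp.example.com"  # an empty/unresolvable host raises socket.gaierror
port = 465
sender = "sender@example.com"
password = "app-password"
recipients = ["user1@example.com", "user2@example.com"]

# Build the notification message
msg = EmailMessage()
msg["Subject"] = "DataFlow finished"
msg["From"] = sender
msg["To"] = ", ".join(recipients)
msg.set_content("The DataFlow has completed.")

def send(msg):
    # Open an implicit-TLS connection, authenticate, and send
    context = ssl.create_default_context()
    with smtplib.SMTP_SSL(smtp_host, port, context=context) as server:
        server.login(sender, password)
        server.send_message(msg)
```

Note that this only works if the environment running the script has outbound network access to the SMTP host.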


Is there any way to send emails that fits our needs? Keep in mind that we don't want to save the results in an output dataset, export it as CSV, and send it manually.


Thank you for your help.


Adrián Insua

Best Answer

  • NewsomSolutions



    Answer ✓

    @adrianiy why don't you just create an additional dataset at the end of your DataFlow called something like 'DF Finished Status', and create a card pointing to it. Then have a field 'Finished' that can equal 1 or 2, plus a date, and set an alert on the card to fire when Finished = 1 and date < today. That way the alerts would send the emails/texts out instead of some complex script.


  • Hmm, I don't believe that's an option; I was just throwing out an idea that's sort of what you want. You could have a Python script that gets the metadata about the output dataset of the DataFlow. You'll get a response that includes "dataCurrentAt" for each dataset. You could then add fairly simple logic: if the current date/time minus dataCurrentAt is 1 hour or less, download the dataset to CSV and send it out, all in Python using the Domo API. You would then have your script run hourly so this process happens automatically. I haven't sent emails from Python before, but you would do something like this to authenticate:

    import requests
    import json
    import base64

    # Build the Base64 "clientID:secret" pair for Basic auth
    # (client_id / client_secret are placeholders from the Domo developer portal)
    client_id = 'YOUR_CLIENT_ID'
    client_secret = 'YOUR_CLIENT_SECRET'
    base64auth = base64.b64encode(f'{client_id}:{client_secret}'.encode()).decode()
    api_host = ''  # placeholder: fill in your API host

    # Get token from Domo (token URL placeholder; its query string included "scope=audit data")
    token_url = ''
    token = requests.get(token_url, headers={'Authorization': 'Basic ' + base64auth})
    token = json.loads(token.text)
    token = token['access_token']

    # GET request for the dataset list data
    dataset_endpoint = ''  # placeholder: fill in the dataset-list endpoint
    dataset_response = requests.get(dataset_endpoint, headers={'Accept': 'application/json', 'Authorization': 'Bearer ' + token})


    This gives you a list of all your datasets. You would then loop through it searching for the dataset ID in question, and use the dataCurrentAt value for the calculation.
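    The loop-and-compare step above can be sketched as follows (the field names come from the example response below; `is_fresh` and `find_dataset` are hypothetical helper names, not Domo API calls):

    ```python
    from datetime import datetime, timezone, timedelta

    def is_fresh(dataset, max_age=timedelta(hours=1)):
        # Timestamps in the response look like '2019-06-07T07:01:56Z'
        current_at = datetime.strptime(
            dataset["dataCurrentAt"], "%Y-%m-%dT%H:%M:%SZ"
        ).replace(tzinfo=timezone.utc)
        # True when the dataset was refreshed within the last max_age
        return datetime.now(timezone.utc) - current_at <= max_age

    def find_dataset(datasets, dataset_id):
        # Pick out the one dataset you care about from the list response
        return next((d for d in datasets if d["id"] == dataset_id), None)
    ```

    An hourly run of the script would then call `find_dataset` on the parsed response and, when `is_fresh` returns True, download the CSV and send the email.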


    Here's an example response from the above code


    {'columns': 10,
    'createdAt': '2017-11-28T06:33:31Z',
    'dataCurrentAt': '2019-06-07T07:01:56Z',
    'description': '',
    'id': '0e1044be-b700-4ad2-97bf-90e5476d501f',
    'name': 'Pages Viewed By Person',
    'owner': {'id': 0, 'name': 'DomoMetrics'},
    'pdpEnabled': False,
    'rows': 1852,
    'updatedAt': '2019-06-07T07:01:56Z'}



    That being said I don't know the full capabilities and limitations of the Python block in ETL as I don't have it enabled in my system.

  • Thank you for both solutions.


    I tried what @NewsomSolutions suggested and I think it's what we need: easy for us to implement and easy for users to use. The alert triggers every time the card updates and the count of rows is greater than 0.