What's the best and most cost-effective way to connect Domo in AWS to Azure Databricks?
@kenhui521 can you ask a more specific question or provide some context? There are many ways of retrieving data from Domo... What are you trying to accomplish? How much data do you need to move, and with what frequency? What have you tried? What have you evaluated or discounted?
Thank you, @jaeW_at_Onyx
We built a data lake on top of Azure Databricks and are looking for the best practice to integrate the two.
Currently it's still in the design phase, so there's lots of flexibility.
I looked into the Azure Databricks BI integrations docs: https://docs.microsoft.com/en-us/azure/databricks/integrations/bi/power-bi
There are lots of connectors there (Power BI, Tableau, Looker, etc.), but not Domo.
I also looked through the Domo connector list and couldn't find one either: https://www.domo.com/appstore/apps?appType=Connector
Any guidance on the best practice would be appreciated.
If Databricks is your data layer but there's no prebuilt Domo connector, you have three major options:
1. the ODBC driver via Domo Workbench
2. transferring data via the APIs (use your favorite orchestration tool; PyDomo is the Python Domo SDK)
3. the Domo CLI tool: https://knowledge.domo.com/Administer/Other_Administrative_Tools/Command_Line_Interface_(CLI)_Tool#section_34
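For the API route, here's a minimal sketch in Python. It assumes the PyDomo SDK (`pip install pydomo`); the client ID / secret are placeholders you'd generate at developer.domo.com, and the dataset name is just an example. The `rows_to_csv` helper is only needed if you skip the SDK and hit the raw DataSet import endpoint yourself with CSV payloads:

```python
import csv
import io


def rows_to_csv(rows, columns):
    """Serialize query rows into a CSV payload, the format Domo's
    raw DataSet import endpoint accepts if you bypass the SDK."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(columns)
    writer.writerows(rows)
    return buf.getvalue()


def push_to_domo(df, dataset_name, client_id, client_secret):
    """Create a Domo DataSet from a pandas DataFrame via PyDomo.

    client_id / client_secret are placeholders for credentials from
    developer.domo.com; dataset_name is whatever you want it called
    in Domo. Returns the new DataSet's id.
    """
    from pydomo import Domo  # pip install pydomo
    domo = Domo(client_id, client_secret, api_host='api.domo.com')
    return domo.ds_create(df, dataset_name)
```

On a schedule (Airflow, ADF, cron, etc.) you'd pull from Databricks, build the DataFrame, and call `push_to_domo` (or `domo.ds_update` for subsequent refreshes).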
Ask your CSM / TSM, though. Databricks certainly isn't a new player in the space, and I wonder if Domo has unpublished connectors for that use case.
We got a connector created: a Simba Spark JDBC connector, which we use to connect to Databricks.
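For anyone taking the JDBC/ODBC route without the Simba driver, Databricks also publishes a Python SQL connector (`pip install databricks-sql-connector`) that can pull query results you then hand to Workbench or the Domo APIs. A rough sketch; the server hostname, HTTP path, and access token below are placeholders you'd copy from your own workspace:

```python
def records_from_cursor(description, rows):
    """Turn a DB-API cursor's (description, rows) into a list of dicts,
    keyed by column name."""
    columns = [col[0] for col in description]
    return [dict(zip(columns, row)) for row in rows]


def fetch_from_databricks(server_hostname, http_path, access_token, query):
    """Run a query against a Databricks SQL endpoint and return the
    result rows as dicts. All three connection parameters are
    placeholders from your Databricks workspace settings."""
    from databricks import sql  # pip install databricks-sql-connector
    with sql.connect(server_hostname=server_hostname,
                     http_path=http_path,
                     access_token=access_token) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            return records_from_cursor(cur.description, cur.fetchall())
```

This keeps the extract step in plain Python, so the same script can feed PyDomo, Workbench, or the CLI without an intermediate staging location.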
My company is also creating a data layer in Databricks. We can't use the JDBC driver. I would like to upvote a Databricks connector so we don't have to move the data to a separate location just to get it into Domo.