How to connect Azure Data Factory to Salesforce Commerce Cloud? - azure-data-factory

Is there a way to connect Azure Data Factory to Salesforce Commerce Cloud?
In Data Factory I only see connectors for Salesforce Service & Marketing Cloud.
If it's possible, I'd appreciate it if someone could show me an example.
Thank you!

From the Azure Data Factory connector overview (https://docs.microsoft.com/en-us/azure/data-factory/connector-overview), we can see that Salesforce Commerce Cloud is not supported.
The only way is to implement the extraction at the code level, then call that code from Data Factory via an Azure Function, a custom (Python) activity, or a notebook activity.
There isn't an existing code example we can provide; you will need to design it yourself.
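As a rough illustration only (not an official sample), here is a minimal Python sketch of that code-level approach, suitable for running from an Azure Function or custom activity: it calls a Commerce Cloud (OCAPI) endpoint and stages the response in Azure Blob Storage so a Data Factory copy activity can pick it up. The instance host, site ID, client ID, container, and blob path are placeholders, not a verified setup.

```python
import requests
from azure.storage.blob import BlobServiceClient

# Placeholder values - replace with your own instance, credentials and storage details.
OCAPI_URL = "https://<your-instance>.demandware.net/s/<site-id>/dw/shop/v20_4/products/<product-id>"
CLIENT_ID = "<commerce-cloud-client-id>"
BLOB_CONN_STR = "<azure-storage-connection-string>"

def extract_and_stage():
    # Call the Commerce Cloud OCAPI endpoint (the exact URL shape depends on your instance).
    resp = requests.get(OCAPI_URL, headers={"x-dw-client-id": CLIENT_ID})
    resp.raise_for_status()

    # Stage the raw JSON in Blob Storage so a Data Factory copy activity can pick it up.
    blob_service = BlobServiceClient.from_connection_string(BLOB_CONN_STR)
    blob_client = blob_service.get_blob_client(container="staging", blob="commerce_cloud/products.json")
    blob_client.upload_blob(resp.text, overwrite=True)

if __name__ == "__main__":
    extract_and_stage()
```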

Related

Azure Data Factory Connector for Crunchbase

Can someone let me know if there is an Azure Data Factory connector for Crunchbase?
Crunchbase is a leader in private-company data.
Alternatively, can someone let me know if it's possible to connect to Crunchbase via REST in Data Factory?
I have checked the list of connectors given here: Supported data stores, but unfortunately there is no connector for Crunchbase.
If you need to move data to/from a data store that is not in the service's built-in connector list, here are some extensible options:
For databases and data warehouses, you can usually find a corresponding ODBC driver, with which you can use the generic ODBC connector.
For SaaS applications:
If it provides RESTful APIs, you can use the generic REST connector.
If it has an OData feed, you can use the generic OData connector.
If it provides SOAP APIs, you can use the generic HTTP connector.
If it has an ODBC driver, you can use the generic ODBC connector.
For others, check if you can load the data to, or expose it as, any supported data store (e.g. Azure Blob/File/FTP/SFTP), then let the service pick it up from there. You can invoke a custom data-loading mechanism via Azure Function, Custom activity, Databricks/HDInsight, Web activity, etc. (a rough sketch of such a custom loader follows the reference below).
Reference - https://learn.microsoft.com/en-us/azure/data-factory/connector-overview
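As one hedged sketch of the custom-loading option, the following timer-triggered Azure Function pulls from a REST endpoint and lands the payload in Azure Blob Storage, which Data Factory can then copy onward. The Crunchbase endpoint path, auth parameter, container, and blob path are assumptions for illustration, not a verified API surface; check the Crunchbase API docs.

```python
import requests
import azure.functions as func
from azure.storage.blob import BlobServiceClient

# Placeholders - the Crunchbase endpoint and auth parameter are assumptions, not a verified API surface.
CRUNCHBASE_URL = "https://api.crunchbase.com/api/v4/entities/organizations/<permalink>"
API_KEY = "<crunchbase-api-key>"
BLOB_CONN_STR = "<azure-storage-connection-string>"

def main(mytimer: func.TimerRequest) -> None:
    """Timer-triggered Azure Function: pull from the REST API and stage the result in Blob Storage."""
    resp = requests.get(CRUNCHBASE_URL, params={"user_key": API_KEY})
    resp.raise_for_status()

    # Land the payload in a supported store (Blob) so a Data Factory pipeline can pick it up.
    service = BlobServiceClient.from_connection_string(BLOB_CONN_STR)
    blob = service.get_blob_client(container="staging", blob="crunchbase/organization.json")
    blob.upload_blob(resp.text, overwrite=True)
```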

Is there any way to call the Bing Ads API through a pipeline and load the data into BigQuery through Google Data Fusion?

I'm creating a pipeline in Google Data Fusion that allows me to export my Bing Ads data into BigQuery using my Bing Ads developer token. I couldn't find any suitable data sources to add to my pipeline in Data Fusion. Is fetching data from API calls even supported in Google Data Fusion, and if it is, how can it be done?
HTTP-based sources for Cloud Data Fusion are currently in development and will be released by Q3. Could you elaborate on your use case a little more, so we can make sure that your requirements will be covered by those plugins? For example, are you looking to build a batch or real-time pipeline?
In the meantime, you have the following two, more immediate options/workarounds:
If you are OK with storing the data in a staging area in GCS before loading it into BigQuery, you can use the HTTPToHDFS plugin that is available in the Hub. Use a path that starts with gs://<bucket>/path/to/file
Alternatively, we also welcome contributions, so you can also build the plugin using the Cloud Data Fusion APIs. We are happy to guide you, and can point you to documentation and samples.
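While waiting for the HTTP plugins, here is a minimal Python sketch of the GCS-staging workaround, assuming you have already exported a Bing Ads report to a local CSV; the bucket, blob path, and table names are placeholders.

```python
from google.cloud import bigquery, storage

# Placeholder names - replace with your own bucket, dataset and table.
BUCKET = "my-staging-bucket"
BLOB_PATH = "bing_ads/report.csv"
TABLE_ID = "my_project.my_dataset.bing_ads_report"

def stage_and_load(local_csv: str) -> None:
    # Stage the exported report in GCS.
    bucket = storage.Client().bucket(BUCKET)
    bucket.blob(BLOB_PATH).upload_from_filename(local_csv)

    # Load the staged file into BigQuery.
    bq = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )
    load_job = bq.load_table_from_uri(f"gs://{BUCKET}/{BLOB_PATH}", TABLE_ID, job_config=job_config)
    load_job.result()  # Wait for the load job to finish.

if __name__ == "__main__":
    stage_and_load("report.csv")
```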

Does Azure Data Factory use Data Catalog services?

I am planning to use Azure Data Factory for an ETL process. I would like to know whether Azure Data Factory uses the metamodel that is captured in Data Catalog. Please advise.
No, currently you can't reuse metadata stored in Azure Data Catalog in Azure Data Factory directly. You could try to reuse some of the metadata by retrieving Data Assets via the REST API (https://learn.microsoft.com/en-us/rest/api/datacatalog/data-catalog-data-asset), but I think it will be faster to do the setup in Azure Data Factory. Also be aware that the main focus of Data Factory is on data movement and orchestration. For big-data transformations you would use e.g. Databricks activities; for "classic" ETL you can integrate SSIS.
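As a rough sketch (not an official sample), this is what calling the Data Catalog search endpoint from Python to retrieve data assets could look like, assuming you have already acquired an Azure AD access token and that your catalog uses the default name; see the REST API reference linked above for the exact contract.

```python
import requests

# Assumptions: an already-acquired AAD bearer token and the default catalog name.
ACCESS_TOKEN = "<aad-access-token>"
CATALOG = "DefaultCatalog"

def search_assets(search_terms: str) -> dict:
    # Search Data Catalog for assets matching the given terms.
    url = f"https://api.azuredatacatalog.com/catalogs/{CATALOG}/search/search"
    resp = requests.get(
        url,
        params={"searchTerms": search_terms, "api-version": "2016-03-30"},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for result in search_assets("sales").get("results", []):
        print(result)
```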

Integrating external objects into SF without Salesforce Connect or Lightning Connect (from Postgres tables)

I have some tables from a Postgres database to be integrated into Salesforce as external objects. I went through some video tutorials and documentation in which I was recommended to use Salesforce Connect, which supports providers with "OData" protocol support. Is it possible to integrate Postgres tables into Salesforce as external objects without Salesforce Connect?
Thanks.
Be careful with the phrase "external objects". To me, the use of those particular words implies the specific implementation of external data access/federation delivered with Salesforce Connect. I don't believe that there is any alternative if your goal is to create "real" external objects (named "objectname__x") within Salesforce.
There are, though, Salesforce integration solutions from the likes of Progress, Jitterbit, MuleSoft, Informatica, and others that can be used to access PostgreSQL, with varying degrees of coding required. You won't get "external objects", but you will be able to access data residing off-cloud in a PostgreSQL database from your Salesforce system.
Hope this helps.
Currently, the way to integrate data from external storage (Postgres in your case) without Salesforce Connect is to implement your own synchronization logic using the REST or SOAP API, Apex classes and triggers, and Salesforce Workflows and Flows. You will also need to implement appropriate interfaces on the side of your data storage. The complexity of these steps depends on the complexity of your existing data model and the infrastructure around it.
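A minimal sketch of one direction of that custom synchronization (Postgres → Salesforce) in Python, assuming you already have a Salesforce access token and a custom object to write into; the instance URL, object and field names, connection string, and query are all placeholders.

```python
import psycopg2
import requests

# Placeholders - instance URL, token, object/field names and the query are assumptions.
SF_INSTANCE = "https://yourInstance.my.salesforce.com"
SF_TOKEN = "<salesforce-access-token>"
PG_DSN = "dbname=mydb user=me password=secret host=localhost"

def sync_rows():
    # Read rows from Postgres.
    with psycopg2.connect(PG_DSN) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT id, name FROM my_table")
            rows = cur.fetchall()

    # Push each row into Salesforce via the REST API (an upsert by external ID would be more robust).
    headers = {"Authorization": f"Bearer {SF_TOKEN}", "Content-Type": "application/json"}
    for row_id, name in rows:
        payload = {"External_Id__c": row_id, "Name": name}
        resp = requests.post(
            f"{SF_INSTANCE}/services/data/v57.0/sobjects/My_Object__c",
            json=payload,
            headers=headers,
        )
        resp.raise_for_status()

if __name__ == "__main__":
    sync_rows()
```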

Is there a way to use AWS Data Pipeline for an ETL project?

I have a data transformation task at hand and currently need to implement an SSIS-style package using AWS Data Pipeline. Is it possible to write custom code using its SDK to retrieve data from third-party SOAP-based web services?
I need to pull data from a third-party SOAP service and then do a lot of data massaging of my own before I can dump that data into Amazon S3 storage.
Any help in this direction is welcome.
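As one hedged sketch of what such custom code could look like (for example, run by a ShellCommandActivity in Data Pipeline), using the zeep SOAP client and boto3; the WSDL URL, operation name, bucket, and key are placeholders, and the transformation step is only illustrative.

```python
import json

import boto3
from zeep import Client
from zeep.helpers import serialize_object

# Placeholders - the WSDL URL, operation name, bucket and key are assumptions for illustration.
WSDL_URL = "https://example.com/thirdparty/service?wsdl"
BUCKET = "my-etl-bucket"
KEY = "soap-extracts/records.json"

def extract_transform_load():
    # Call the third-party SOAP operation via zeep.
    client = Client(WSDL_URL)
    raw = client.service.GetRecords()  # operation name depends on the actual WSDL

    # "Massage" the data - placeholder transformation turning SOAP objects into plain dicts.
    records = serialize_object(raw) if raw else []

    # Dump the result to S3.
    s3 = boto3.client("s3")
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=json.dumps(records, default=str))

if __name__ == "__main__":
    extract_transform_load()
```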