Use Azure to GET from RESTful API

I would like to use Azure to retrieve JSON data from a REST API and then store that data in a table. Data retrieval would occur daily, and a parameter would be passed to the API to restrict the results to the prior day's data.
Which Azure component/mechanism should I use for calling the API?
The data would be the foundation for a data warehouse. Should I use an Azure SQL table or Azure Table storage?
I have recently begun exploring Azure and am not sure how to do this.
I look forward to feedback.
Thank you.

Take a look at Azure Functions. You can create an Azure Function that is invoked periodically (a timer trigger); it has input bindings for different sources (or you can add some C# code to read from a URL), and it can then place the results into an Azure SQL Database.
Here is an example of an Azure Function that sends JSON to a stored procedure:
https://www.codeproject.com/Articles/1169531/Sending-events-from-Azure-Event-Hub-to-Azure-SQL-D
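The linked article uses C#; as a rough Python sketch of the same idea (the API URL, date parameter, table schema, and connection-string setting below are placeholders, and a function.json with a timer-trigger binding is assumed), the function could fetch the prior day's data and insert it into a staging table:

```python
import datetime
import json
import logging
import os

import azure.functions as func
import pyodbc
import requests


def main(mytimer: func.TimerRequest) -> None:
    # Query the API for the prior day's data (parameter name is illustrative).
    yesterday = (datetime.date.today() - datetime.timedelta(days=1)).isoformat()
    resp = requests.get("https://api.example.com/records", params={"date": yesterday})
    resp.raise_for_status()
    rows = resp.json()

    # Write each record into a staging table in Azure SQL Database.
    conn = pyodbc.connect(os.environ["SqlConnectionString"])
    cursor = conn.cursor()
    for row in rows:
        cursor.execute(
            "INSERT INTO dbo.StagingRecords (LoadDate, Payload) VALUES (?, ?)",
            yesterday, json.dumps(row),
        )
    conn.commit()
    logging.info("Loaded %d records for %s", len(rows), yesterday)
```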

Related

Tool for Azure cognitive search similar to Logstash?

My company has a lot of data (database: PostgreSQL), and the requirement now is to add a search feature on top of it; we have been asked to use Azure Cognitive Search.
I want to know how we can transform the data and send it to the Azure search engine.
There are a few cases we have to handle:
1. How will we transfer the existing data and upload it to the search engine's index?
2. What is the easiest way to update the data in the search engine when new records arrive in our production database? (For now we are using Java back-end code to transform the data and update the index, but it is very time consuming.)
3. What is the best way to handle an update to the existing database structure? How can we update the indexer without lots of work recreating the indexers every time?
Is there any way to automatically update the index whenever database records change?
You can either write code to push data from your PostgreSQL database into the Azure Search index via the /docs/index API, or you can configure an Azure Search indexer to do the data ingestion. The upside of configuring an indexer is that you can also have it monitor the data source on a schedule and reflect updates into the search index automatically, for example via the SQL Integrated Change Tracking Policy.
PostgreSQL is a supported data source for Azure Search indexers, although the data source is in preview (not yet generally available).
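For the push model, a minimal sketch with the azure-search-documents Python SDK (the service endpoint, index name, key, and table/column names below are placeholders) might look like this:

```python
import psycopg2
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Connect to the target search index (placeholder service/index/key).
search_client = SearchClient(
    endpoint="https://<service>.search.windows.net",
    index_name="products",
    credential=AzureKeyCredential("<admin-api-key>"),
)

# Pull the rows to index from PostgreSQL (table and columns are illustrative).
conn = psycopg2.connect("<postgres-connection-string>")
cur = conn.cursor()
cur.execute("SELECT id, title, description FROM products")

# Push the rows into the index via the /docs/index API.
docs = [{"id": str(r[0]), "title": r[1], "description": r[2]} for r in cur.fetchall()]
result = search_client.upload_documents(documents=docs)
print(f"Uploaded {len(result)} documents")
```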
Besides the answer above, which involves coding on your end, there is a solution you may implement using the Azure Data Factory PostgreSQL connector with a custom query that picks up recent records, and a pipeline copy activity that sinks to an Azure Blob Storage account.
Then, within Data Factory, you can chain a pipeline activity that copies to an Azure Cognitive Search index and add a trigger so the pipeline runs at specified times.
Once the staged data is in the storage account in delimitedText format, you can also use the built-in Azure Blob indexer with change tracking enabled.
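As a rough sketch of that last step, assuming the azure-search-documents SDK (service endpoint, keys, container, and index names below are placeholders), you could create the Blob data source and a scheduled indexer like this:

```python
from datetime import timedelta

from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexerClient
from azure.search.documents.indexes.models import (
    IndexingSchedule,
    SearchIndexer,
    SearchIndexerDataContainer,
    SearchIndexerDataSourceConnection,
)

client = SearchIndexerClient(
    endpoint="https://<service>.search.windows.net",
    credential=AzureKeyCredential("<admin-api-key>"),
)

# Data source pointing at the container the Data Factory pipeline sinks to.
data_source = SearchIndexerDataSourceConnection(
    name="staged-blobs",
    type="azureblob",
    connection_string="<storage-connection-string>",
    container=SearchIndexerDataContainer(name="staging"),
)
client.create_data_source_connection(data_source)

# Scheduled indexer; the Blob indexer tracks blob LastModified timestamps,
# so only new or changed blobs are re-indexed on each run.
indexer = SearchIndexer(
    name="staged-blobs-indexer",
    data_source_name="staged-blobs",
    target_index_name="products",
    schedule=IndexingSchedule(interval=timedelta(hours=1)),
)
client.create_indexer(indexer)
```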

How to Ingest SAP ODP OData services with a Delta approach via Azure Data Factory?

We are trying to consume SAP ODP OData services (See Using the OData Service for Extracting ODP Data), and due to large volumes we would like to use their Delta Token approach (i.e. CDC) so that we only need to get changes after the initial load. This seems like a common use case to me but I cannot find any Azure Data Factory (ADF) documentation that addresses this.
What ADF Connector should we use?
The ADF OData connector does not appear to allow HTTP request headers to be sent, so we cannot pass Prefer: odata.track-changes.
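For context, the delta flow we are after looks roughly like the sketch below when driven by plain HTTP calls (for example from an Azure Function or a custom activity); the host, service path, entity set, and credentials are placeholders:

```python
import requests

session = requests.Session()
session.auth = ("<sap-user>", "<sap-password>")
base = "https://<sap-host>/sap/opu/odata/sap/<ODP_SERVICE>/<EntitySet>"

# Initial full load: the Prefer header asks SAP to start change tracking
# (this is the header the ADF OData connector currently cannot send).
resp = session.get(base, headers={"Prefer": "odata.track-changes"}, params={"$format": "json"})
resp.raise_for_status()
payload = resp.json()["d"]
initial_rows = payload["results"]

# SAP returns a delta link; persist it and call it on the next scheduled run
# to receive only records that changed since the previous extraction.
delta_link = payload.get("__delta")

resp = session.get(delta_link, headers={"Accept": "application/json"})
resp.raise_for_status()
changed_rows = resp.json()["d"]["results"]
```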

Can XML be mapped to a SQL Server table in ADF?

We would like to use Azure Data Factory to read an XML document and be able to map the columns in the document to a SQL Server table so we can move the data contained in the document to a SQL table. Is this possible in ADF?
Please note that, per the documentation at the time, the XML file type was not supported in the copy activity.
I suggest voting up an idea submitted by another Azure customer.
All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
As a workaround, maybe you could get some clues from this link.
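The linked workaround isn't reproduced here, but the general idea of shredding the XML outside ADF and loading it yourself could be sketched like this (the file path, element names, and target table are placeholders):

```python
import xml.etree.ElementTree as ET

import pyodbc

# Parse the XML document and pick out the fields to map to table columns.
tree = ET.parse("orders.xml")  # placeholder file
rows = [
    (item.findtext("Id"), item.findtext("Customer"), item.findtext("Amount"))
    for item in tree.getroot().findall("Order")  # placeholder element names
]

# Bulk-insert the rows into the SQL Server target table (placeholder schema).
conn = pyodbc.connect("<sql-server-connection-string>")
cursor = conn.cursor()
cursor.fast_executemany = True
cursor.executemany(
    "INSERT INTO dbo.Orders (Id, Customer, Amount) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
```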
Azure Data Factory now supports the XML format in both the copy activity and mapping data flow.
Learn more from https://techcommunity.microsoft.com/t5/azure-data-factory/azure-data-factory-adds-support-for-xml-format/ba-p/1529012.

Does Azure Data Factory use Data Catalog services?

I am planning to use Azure Data Factory for an ETL process, and I would like to know whether Azure Data Factory uses the metamodel that is captured in the Data Catalog. Please advise.
No, currently you can't reuse metadata stored in Azure Data Catalog directly in Azure Data Factory. You could try to reuse some of the metadata by retrieving Data Assets via the REST API (https://learn.microsoft.com/en-us/rest/api/datacatalog/data-catalog-data-asset), but I think it will be faster to do the setup in Azure Data Factory. Also be aware that the main focus of Data Factory is on data movement and orchestration. For big data transformations you would use, e.g., Databricks activities; for "classic" ETL you would integrate SSIS.
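As a rough illustration of the REST call mentioned above (the endpoint shape and api-version follow the linked documentation, but treat the catalog name, search term, and version as assumptions to verify against those docs):

```python
import requests

# An Azure AD access token for the Data Catalog resource, acquired separately.
aad_token = "<access-token>"

catalog = "DefaultCatalog"  # placeholder catalog name
url = f"https://api.azuredatacatalog.com/catalogs/{catalog}/search/search"
params = {"searchTerms": "sales", "api-version": "2016-03-30"}
headers = {"Authorization": f"Bearer {aad_token}"}

resp = requests.get(url, headers=headers, params=params)
resp.raise_for_status()
assets = resp.json().get("results", [])
for asset in assets:
    print(asset)
```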

Is there a way to use AWS Data Pipeline for ETL project?

I have a data transformation task at hand and currently need to implement an SSIS-style package using AWS Data Pipeline. Is it possible to write custom code using its SDK to retrieve data from third-party SOAP-based web services?
I obviously need to pull data from the third-party SOAP service and then do a lot of data massaging of my own before I can dump that data into Amazon S3 storage.
Any help in this direction is welcome.
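For the custom-code portion of that flow (regardless of which orchestrator runs it), a minimal sketch using zeep and boto3 might look like the following; the WSDL URL, operation name, and bucket are placeholders:

```python
import json

import boto3
from zeep import Client
from zeep.helpers import serialize_object

# Call the third-party SOAP service (WSDL URL and operation name are placeholders).
client = Client("https://example.com/service?wsdl")
response = client.service.GetDailyRecords(date="2023-01-01")

# Convert the SOAP response into plain Python structures and massage as needed.
records = serialize_object(response)

# Dump the transformed data into S3 for downstream processing.
s3 = boto3.client("s3")
s3.put_object(
    Bucket="my-etl-staging-bucket",
    Key="soap-extracts/2023-01-01.json",
    Body=json.dumps(records, default=str),
)
```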