Can XML be mapped to a SQL Server table in ADF?

We would like to use Azure Data Factory to read an XML document and map its columns to a SQL Server table, so that we can move the data contained in the document into SQL. Is this possible in ADF?

Please note that, according to the documentation at the time, the XML file type was not supported in the Copy activity.
I suggest voting up an idea submitted by another Azure customer.
All of the feedback you share in these forums is monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
As a workaround, you may be able to get some clues from this link.

Azure Data Factory now supports the XML format in both the Copy activity and Mapping Data Flows.
Learn more at https://techcommunity.microsoft.com/t5/azure-data-factory/azure-data-factory-adds-support-for-xml-format/ba-p/1529012.
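For reference, the XML dataset and the copy activity that maps it into SQL are authored as JSON; below is a minimal sketch of those definitions written as Python dictionaries for readability. The linked service names, container, file name, and column mapping are placeholders, not values from the question.

```python
# Minimal sketch of the JSON you would author in ADF for an XML-to-SQL copy,
# expressed as Python dicts. Linked services, the blob container, the file,
# and the mapped columns below are all placeholders.

xml_dataset = {
    "name": "SourceXmlDataset",
    "properties": {
        "type": "Xml",
        "linkedServiceName": {
            "referenceName": "MyBlobLinkedService",  # placeholder
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "input",        # placeholder container
                "fileName": "orders.xml",    # placeholder file
            }
        },
    },
}

copy_activity = {
    "name": "CopyXmlToSql",
    "type": "Copy",
    "inputs": [{"referenceName": "SourceXmlDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "SinkSqlDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "XmlSource"},
        "sink": {"type": "AzureSqlSink"},
        # The translator ties XML node paths to SQL column names.
        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                {"source": {"path": "$['Order']['Id']"},  # hypothetical node
                 "sink": {"name": "OrderId"}},             # hypothetical column
            ],
        },
    },
}
```

In practice most people configure the translator mapping interactively in the ADF mapping UI rather than by hand.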

Related

Tool for Azure cognitive search similar to Logstash?

My company has lots of data (database: PostgreSQL), and the new requirement is to add a search feature over it; we have been asked to use Azure Cognitive Search.
I want to know how we can transform the data and send it to the Azure search engine.
There are a few cases we have to handle:
1. How will we transfer and upload the existing data to the search engine's index?
2. What is the easiest way to update the data on the search engine with new records in our production database? (For now we are using Java back-end code to transform the data and update the index, but it is very time consuming.)
3. What is the best way to handle an update to the existing database structure? How can we update the indexer without the overhead of recreating indexers every time?
4. Is there any way to automatically update the index whenever a database record changes?
You can either write code to push data from your PostgreSQL database into the Azure Search index via the /docs/index API, or you can configure an Azure Search indexer to do the data ingestion. The upside of configuring an indexer is that you can also have it monitor the data source on a schedule for updates and reflect those updates into the search index automatically, for example via the SQL Integrated Change Tracking policy.
PostgreSQL is a supported data source for Azure Search indexers, although the data source is in preview (not yet generally available).
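As a sketch of the push model, here is what the code route can look like with the azure-search-documents Python SDK. The service endpoint, admin key, index name, table, column names, and the `updated_at` change-tracking column are all placeholders/assumptions, not details from the question.

```python
# Minimal sketch of the push model: read recently changed rows from PostgreSQL
# and upload them to an Azure Cognitive Search index.
import psycopg2
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search = SearchClient(
    endpoint="https://<your-service>.search.windows.net",  # placeholder
    index_name="products",                                 # placeholder index
    credential=AzureKeyCredential("<admin-key>"),          # placeholder key
)

conn = psycopg2.connect("dbname=mydb user=me")  # placeholder connection string
with conn.cursor() as cur:
    # Assumes an updated_at column exists for tracking recent changes.
    cur.execute(
        "SELECT id, name, description FROM products "
        "WHERE updated_at > now() - interval '1 day'"
    )
    docs = [
        {"id": str(r[0]), "name": r[1], "description": r[2]}
        for r in cur.fetchall()
    ]

# merge_or_upload updates existing documents and inserts new ones.
if docs:
    result = search.merge_or_upload_documents(documents=docs)
    print(f"Indexed {sum(1 for x in result if x.succeeded)} of {len(docs)} documents")
```

Run on a schedule (cron, Azure Functions timer, etc.), this covers both the initial load and incremental updates.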
Besides the answer above that involves coding on your end, there is a solution you may implement using Azure Data Factory PostgreSQL connector with a custom query that tracks for recent records and create a Pipeline Activity that sinks to an Azure Blob Storage account.
Then within Data Factory you can link to a Pipeline Activity that copies to an Azure Cognitive Search index and add a trigger to the pipeline to run at specified times.
Once the staged data is in the storage account in delimitedText format, you can also use built-in Azure Blob indexer with change tracking enabled.
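For the last step, the data source and indexer can be created programmatically; below is a minimal sketch using the azure-search-documents management classes. The service endpoint, key, storage connection string, container, and index names are placeholders, and the target index is assumed to already exist.

```python
# Sketch: wire up the built-in blob indexer over the staged delimited-text files.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexerClient
from azure.search.documents.indexes.models import (
    SearchIndexer,
    SearchIndexerDataContainer,
    SearchIndexerDataSourceConnection,
)

client = SearchIndexerClient(
    endpoint="https://<your-service>.search.windows.net",  # placeholder
    credential=AzureKeyCredential("<admin-key>"),          # placeholder
)

# Data source pointing at the blob container that the ADF pipeline sinks into.
data_source = SearchIndexerDataSourceConnection(
    name="staged-csv",
    type="azureblob",
    connection_string="<storage-connection-string>",  # placeholder
    container=SearchIndexerDataContainer(name="staging"),  # placeholder
)
client.create_data_source_connection(data_source)

# The blob indexer detects new and changed blobs by LastModified automatically.
# Set parsingMode to "delimitedText" in the indexer parameters so each CSV row
# becomes one search document (the exact parameters model varies by SDK version).
indexer = SearchIndexer(
    name="staged-csv-indexer",
    data_source_name="staged-csv",
    target_index_name="products",  # placeholder; index must already exist
)
client.create_indexer(indexer)
```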

How to connect API as data source in Tableau?

I need to use two data sources: one is SQL, and the other is the response from a REST API.
I tried to implement a Web Data Connector (WDC), but it needs an HTML page and requires the user to interact with the UI to get the response.
I don't want to create an HTML page.
Is there any way to use an API response as a data source in Tableau?
The short answer is that you cannot use an API directly as a data source; you should build a pipeline that transforms the response into a flat file or populates a database table.
The alternative is to use Python to connect to the REST API. You can use TabPy or follow a pre-built solution like this one. Personally, I can't speak to the performance.
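The flat-file route can be as small as the sketch below: pull the REST response, flatten it, and write a CSV that Tableau can open as a text data source. The URL and the shape of the response are assumptions for illustration.

```python
# Minimal sketch: REST API -> CSV that Tableau can read as a text file source.
import csv
import requests

resp = requests.get("https://api.example.com/v1/records", timeout=30)  # placeholder URL
resp.raise_for_status()
records = resp.json()  # assumes a non-empty JSON array of flat objects

with open("records.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=sorted(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
```

Scheduled with cron or Task Scheduler, this keeps the file fresh for Tableau extract refreshes.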

Is it possible to catalog data inside csv files inside Azure Blob Storage using Azure Data Catalog?

I want to catalog data stored in CSV files in Azure Blob Storage. I looked into ways to get metadata from Blob Storage and found that Data Catalog is an option. The thing is, a CSV file is handled as a blob type, so we cannot profile it. I want CSV files in Blob Storage to act as tables.
Is this possible using Azure Data Catalog?
Yes, you can use Data Catalog. For updated Data Catalog features, though, use the new Azure Purview service, which offers unified data governance for your entire data estate; I would recommend Azure Purview (this is still possible through Data Catalog, however).
Registering assets from a data source copies the assets' metadata to Azure, but the data remains in the existing data-source location.
Introduction to Azure Purview (preview) - Azure Purview
This article provides an overview of Azure Purview, including its features and the problems it addresses. Azure Purview enables any user to register, discover, understand, and consume data sources.
This article outlines how to register an Azure Blob Storage account in Purview and set up a scan.
For more information, see blob index tags: they categorize data in your storage account using key-value tag attributes. These tags are automatically indexed and exposed as a searchable multi-dimensional index, making it easy to find data. The article "Use blob index tags to manage and find data on Azure Blob Storage" shows how to set, get, and find data using blob index tags.
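To make that concrete, here is a minimal sketch of setting and querying blob index tags with the azure-storage-blob Python SDK. The connection string, container, blob path, and tag names are placeholders.

```python
# Sketch: tag a CSV blob and then find it across the account by its tags.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")  # placeholder

# Tag a CSV blob so it can be found without listing the whole container.
blob = service.get_blob_client(container="data", blob="sales/2021-01.csv")  # placeholder
blob.set_blob_tags({"dataset": "sales", "year": "2021"})

# Find all blobs across the account that carry matching tags.
for item in service.find_blobs_by_tags("\"dataset\" = 'sales' AND \"year\" = '2021'"):
    print(item.name)
```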

Use Azure to GET from RESTful API

I would like to use Azure to retrieve JSON data from a REST api then store that data into a table. Data retrieval would occur daily and a parameter would be passed to the api to restrict the results to the prior day's data.
Which Azure component/mechanism should I use for calling the api?
The data would be the foundation for a data warehouse. Should I use an Azure SQL table or Azure Table storage?
I have recently begun exploring Azure and am not sure how to do this.
I look forward to feedback.
Thank you.
Take a look at Azure Functions. You can create an Azure Function that is invoked periodically; it has input bindings for different sources (or you can add some C# code to read from a URL) and can then place the results into an Azure database.
Here is example of Azure Function that sends JSON to stored procedure:
https://www.codeproject.com/Articles/1169531/Sending-events-from-Azure-Event-Hub-to-Azure-SQL-D
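If you prefer Python, a timer-triggered Function covers the same ground. Below is a minimal sketch using the Python v2 programming model; the API URL, date parameter, staging table, and the SQL_CONNECTION_STRING app setting are all hypothetical.

```python
# Sketch of a timer-triggered Azure Function (Python v2 model) that GETs
# yesterday's data from a REST API and inserts it into Azure SQL.
import datetime
import os

import azure.functions as func
import pyodbc
import requests

app = func.FunctionApp()

@app.timer_trigger(schedule="0 0 6 * * *", arg_name="timer")  # daily at 06:00 UTC
def pull_daily_data(timer: func.TimerRequest) -> None:
    yesterday = (datetime.date.today() - datetime.timedelta(days=1)).isoformat()
    resp = requests.get(
        "https://api.example.com/data",      # placeholder endpoint
        params={"date": yesterday},          # restrict to the prior day's data
        timeout=30,
    )
    resp.raise_for_status()

    # Connection string comes from an app setting; table name is hypothetical.
    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        cur = conn.cursor()
        for row in resp.json():  # assumes a JSON array of objects
            cur.execute(
                "INSERT INTO staging.DailyData (EventDate, Payload) VALUES (?, ?)",
                yesterday, str(row),
            )
        conn.commit()
```

For a data-warehouse foundation, Azure SQL (as here) is usually the better fit than Azure Table storage, since you will want joins and typed columns downstream.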

How edit the content of a data extract in Tableau?

I'm creating an extract from a table hosted on MS SQL Server in Tableau.
After I create the data extract, is there any way I can enable the end users to edit the data extract content? Something like an interface?
Thanks in advance
The Tableau Data Extract API (and publicly released products) do not allow you to modify the contents of an extract.
You can append new data rows to an extract, or refresh (i.e. regenerate) an extract.
Think of extracts like datamarts -- read-only snapshots of a portion of some other data store, designed to allow efficient analysis and reporting. They aren't intended to replace databases.
If you want users to make live updates, consider using a database and some sort of tech stack to allow form based updates.
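If appending rows is enough, the newer Tableau Hyper API (the successor to the Extract API mentioned above) supports it programmatically. A minimal sketch, assuming a .hyper extract with the default "Extract"."Extract" table that Tableau generates; the file name and row values are placeholders and must match the extract's column types.

```python
# Sketch: append rows to an existing .hyper extract with the Hyper API.
from tableauhyperapi import Connection, HyperProcess, Inserter, TableName, Telemetry

with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint, database="orders.hyper") as connection:
        table = TableName("Extract", "Extract")  # default schema Tableau creates
        with Inserter(connection, connection.catalog.get_table_definition(table)) as inserter:
            inserter.add_row(["2021-01-02", "Widget", 4])  # placeholder row
            inserter.execute()
```

This still isn't end-user editing of published data, though; for that, a database with a form-based front end remains the right tool.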
This is called web authoring in Tableau.
You can publish the extract to Tableau Server and allow users to connect to this data source to create their own reports.
Using the server version of Tableau, this can also be referred to as ad hoc reporting.