Establish a connection between Azure Cosmos DB SQL API and Google BigQuery - apache-kafka

I'm using Azure Cosmos DB SQL API and I'm looking for a way to send data from Cosmos DB to Google BigQuery. I'm
planning to use Kafka or Azure ADF for this, but I'm not sure these are the right tools for the job.
Is there a best practice, tool, or connector I can use to send data from Cosmos DB to Google BigQuery?

Data Factory supports Azure Cosmos DB (SQL API) as both source and sink, but it doesn't support Google BigQuery as a sink.
That means Data Factory alone cannot copy the data from Cosmos DB (SQL API) to Google BigQuery; you would need Kafka Connect or a small custom pipeline instead.
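As a rough illustration of the custom-code route, here is a minimal Python sketch using the azure-cosmos and google-cloud-bigquery SDKs. The account endpoint, key, container, query fields, and target table are all placeholders, and it assumes the BigQuery table schema already matches the rows you build:

```python
# pip install azure-cosmos google-cloud-bigquery
from azure.cosmos import CosmosClient
from google.cloud import bigquery

# Placeholder connection details for the Cosmos DB SQL API account.
cosmos = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = cosmos.get_database_client("<database>").get_container_client("<container>")

# BigQuery client authenticates via GOOGLE_APPLICATION_CREDENTIALS.
bq = bigquery.Client()
table_id = "<project>.<dataset>.<table>"  # placeholder target table

rows, batch_size = [], 500
for item in container.query_items(
    query="SELECT c.id, c.name, c._ts FROM c",  # project only the fields you need
    enable_cross_partition_query=True,
):
    rows.append({"id": item["id"], "name": item.get("name"), "ts": item["_ts"]})
    if len(rows) >= batch_size:
        errors = bq.insert_rows_json(table_id, rows)  # streaming insert
        assert not errors, errors
        rows = []

if rows:  # flush the final partial batch
    errors = bq.insert_rows_json(table_id, rows)
    assert not errors, errors
```

For continuous replication you would read from the Cosmos DB change feed instead of running a full query each time, but the write path into BigQuery stays the same.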

Related

Migrate data from Azure MongoDB to Azure Search

I'm using the Microsoft Azure cloud provider in my project, where I have MongoDB installed on a VM in Azure, and I also have an Azure Cognitive Search instance. What I want to do is migrate the data I have in MongoDB to Azure Search in order to create indexes and then use the RESTful APIs from the client application.
My question is: is there a way to move data from MongoDB to Azure Search?
Unfortunately there is no built-in MongoDB connector for Azure Search as of now. However, you have two options:
Migrate from MongoDB to Azure Cosmos DB (Mongo API) and then create an Azure Search indexer for the Azure Cosmos DB account in question; see https://learn.microsoft.com/en-us/azure/dms/tutorial-mongodb-cosmos-db-online
Write a custom application that pulls data from MongoDB and pushes it to your Azure Search indexes using the push API, as sketched below; take a look at https://learn.microsoft.com/en-us/azure/search/tutorial-optimize-indexing-push-api
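For the second option, a minimal Python sketch of the pull-and-push loop using pymongo and the azure-search-documents SDK. The hosts, index name, field mapping, and credentials are placeholders, and the search index is assumed to already exist:

```python
# pip install pymongo azure-search-documents
from pymongo import MongoClient
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Placeholder MongoDB source running on the Azure VM.
mongo = MongoClient("mongodb://<vm-host>:27017")
collection = mongo["<database>"]["<collection>"]

# Placeholder Azure Cognitive Search target index (must exist already).
search = SearchClient(
    endpoint="https://<service>.search.windows.net",
    index_name="<index>",
    credential=AzureKeyCredential("<admin-key>"),
)

batch = []
for doc in collection.find():
    batch.append({
        "id": str(doc["_id"]),          # Azure Search document keys must be strings
        "title": doc.get("title", ""),  # map your Mongo fields to index fields here
    })
    if len(batch) == 1000:              # the push API accepts up to 1000 docs per call
        search.upload_documents(documents=batch)
        batch = []

if batch:
    search.upload_documents(documents=batch)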

Does Azure Data Factory (ADF) support linked service Azure Cosmos DB (Table API)?

Does Azure Data Factory (ADF) support a linked service for Azure Cosmos DB (Table API)? If not, is it possible to create an Azure Table Storage linked service and provide the connection string of Cosmos DB (Table API)?
Thank you!
Data Factory doesn't support Azure Cosmos DB (Table API); it only supports Azure Cosmos DB (SQL API).
Please reference: Supported data stores
When you create an Azure Table Storage linked service, there is no way to provide a Cosmos DB (Table API) connection string instead.
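That said, if the end goal is just to read or move the Table API data programmatically, the azure-data-tables SDK accepts a Cosmos DB (Table API) connection string even though ADF does not. A minimal Python sketch, with the connection string and table name as placeholders:

```python
# pip install azure-data-tables
from azure.data.tables import TableServiceClient

# Works with both Azure Table Storage and Cosmos DB (Table API) connection strings.
service = TableServiceClient.from_connection_string("<cosmos-table-api-connection-string>")
table = service.get_table_client("<table>")

for entity in table.list_entities():
    print(entity["PartitionKey"], entity["RowKey"], dict(entity))
```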
Hope this helps.

MS Access DB to Azure

I know we can migrate an MS Access database to MS SQL and then migrate to Azure SQL. Is there any other option to migrate an MS Access database to a NoSQL database in Azure (like Azure Tables), or to any cheaper database in Azure?
Please help.
Is there any other option to migrate MS Access database to NoSQL database in azure (like azure tables)
Sure. Take a look at the Azure Data Factory copy activity, which can transfer your data from Access DB to other destinations, including Azure SQL DB, Azure Table Storage, etc.
Please follow the tutorial above to configure Access DB as the source dataset and Table Storage as the sink dataset in the copy activity, then execute it in a pipeline.
From a cost point of view, Azure Storage is relatively cheap for NoSQL storage. Please see its pricing details.
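If you would rather script the transfer than build an ADF pipeline, here is a minimal Python sketch using pyodbc (which requires the Access ODBC driver to be installed) and the azure-data-tables SDK. The file path, table, columns, and the PartitionKey/RowKey mapping are all placeholders:

```python
# pip install pyodbc azure-data-tables
import pyodbc
from azure.data.tables import TableServiceClient

# Read rows from the local Access database (driver name per the Access ODBC redistributable).
access = pyodbc.connect(
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\data\mydb.accdb;"
)
cursor = access.cursor()
cursor.execute("SELECT Id, Category, Name FROM Customers")  # placeholder table/columns

# Write each row as an entity; the PartitionKey/RowKey choice should match your access patterns.
service = TableServiceClient.from_connection_string("<storage-connection-string>")
table = service.create_table_if_not_exists("Customers")

for row in cursor.fetchall():
    table.upsert_entity({
        "PartitionKey": row.Category,  # placeholder partitioning scheme
        "RowKey": str(row.Id),
        "Name": row.Name,
    })
```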

How can I transfer DevOps data to any BI tool?

I have data in Azure DevOps which gets updated every 5 minutes. Using this data I want to create a dashboard in some BI tool that provides a consolidated view of the data. I am currently using MicroStrategy, which does not support DevOps, and using Power BI is not an option.
I want an indirect way to pull the data in DevOps into MicroStrategy, maybe through Azure Cosmos DB. So, can I transfer data from DevOps to Cosmos DB?
You can create a simple application that gets the data from DevOps with the Azure DevOps REST API (link) and uses the Azure Cosmos DB API (link) to store the data in Cosmos DB; a sketch follows after the links below.
This is a sample project provided by Microsoft, introducing how to store and access data from an ASP.NET MVC application hosted on Azure Websites using the Azure Cosmos DB service:
https://github.com/Azure-Samples/documentdb-dotnet-todo-app
You can also check the tutorial provided by Microsoft:
https://learn.microsoft.com/en-us/azure/devops/pipelines/targets/cosmos-db?view=azure-devops
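As a minimal Python sketch of that application, this pulls recent builds through the Azure DevOps REST API with a personal access token and upserts them into Cosmos DB with the azure-cosmos SDK. The organization, project, credentials, container, and the choice of fields are placeholders:

```python
# pip install requests azure-cosmos
import requests
from azure.cosmos import CosmosClient

ORG, PROJECT, PAT = "<organization>", "<project>", "<personal-access-token>"

# Azure DevOps REST API: list recent builds (the PAT goes in as the basic-auth password).
resp = requests.get(
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds?api-version=6.0",
    auth=("", PAT),
)
resp.raise_for_status()

# Placeholder Cosmos DB SQL API account and container.
cosmos = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = cosmos.get_database_client("<database>").get_container_client("<container>")

for build in resp.json()["value"]:
    container.upsert_item({
        "id": str(build["id"]),  # Cosmos DB requires a string id
        "buildNumber": build["buildNumber"],
        "status": build["status"],
        "result": build.get("result"),
    })
```

Run something like this on a 5-minute schedule (an Azure Function timer trigger, for example) and point MicroStrategy at the Cosmos DB container.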

Connect to Azure SQL Database from Databricks Notebook

I wanted to load data from Azure Blob Storage into Azure SQL Database using a Databricks notebook. Could anyone help me with this?
I'm new to this, so I cannot comment, but why use Databricks for this? It would be much easier and cheaper to use Azure Data Factory.
https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-dot-net
If you really need to use Databricks, you would need to either mount your Blob Storage account, or access it directly from your Databricks notebook or JAR, as described in the documentation (https://docs.azuredatabricks.net/spark/latest/data-sources/azure/azure-storage.html).
You can then read the files into DataFrames, whatever format they are in, and use the SQL JDBC connector to create a connection for writing the data to SQL (https://docs.azuredatabricks.net/spark/latest/data-sources/sql-databases.html).
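Putting those two steps together, a minimal PySpark sketch for a Databricks notebook (where `spark` is predefined). The storage account, container, file format, and SQL connection details are all placeholders:

```python
# Databricks notebook (PySpark): direct access to Blob Storage with an account key.
spark.conf.set(
    "fs.azure.account.key.<storage-account>.blob.core.windows.net",
    "<storage-account-key>",
)

# Read the source files into a DataFrame (CSV assumed here).
df = (
    spark.read.format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load("wasbs://<container>@<storage-account>.blob.core.windows.net/<path>")
)

# Write to Azure SQL Database over JDBC (the SQL Server driver ships with Databricks).
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<database>"
(
    df.write.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.<target-table>")
    .option("user", "<sql-user>")
    .option("password", "<sql-password>")
    .mode("append")
    .save()
)
```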