Tool for Azure Cognitive Search similar to Logstash?

My company has a lot of data (database: PostgreSQL), and the new requirement is to add a search feature over it; we have been asked to use Azure Cognitive Search.
I want to know how we can transform the data and send it to the Azure search engine.
There are a few cases we have to handle:
1. How will we transfer and upload the existing data into the search engine's index?
2. What is the easiest way to update the data in the search engine with new records from our production database? (For now we are using Java back-end code to transform the data and update the index, but it is very time-consuming.)
3. What is the best way to handle an update to the existing database structure? How can we update the indexer without doing lots of work, recreating the indexers every time?
4. Is there any way we can automatically update the index whenever there is a change in the database records?

You can either write code to push data from your PostgreSQL database into the Azure Search index via the /docs/index REST API (a sketch follows below), or you can configure an Azure Search indexer to do the data ingestion. The upside of configuring an indexer is that you can also have it monitor the data source on a schedule for updates and reflect those updates into the search index automatically, for example via the SQL Integrated Change Tracking Policy.
PostgreSQL is a supported data source for Azure Search indexers, although that data source is in preview (not yet generally available).
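For the push model, indexing is a single POST per batch of documents. Here is a minimal sketch using Java 11's HttpClient; the service name, index name, key, and document fields are placeholders, not values from the question:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SearchPushExample {
    public static void main(String[] args) throws Exception {
        // Placeholders -- substitute your own service name, index name, and admin key.
        String serviceName = "my-search-service";
        String indexName = "products";
        String apiKey = System.getenv("AZURE_SEARCH_API_KEY");

        // Each document in the batch carries a @search.action;
        // mergeOrUpload inserts the document or updates it if the key already exists.
        String batch = "{ \"value\": [ "
            + "{ \"@search.action\": \"mergeOrUpload\", \"id\": \"1\", \"name\": \"Widget\" } "
            + "] }";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://" + serviceName + ".search.windows.net/indexes/"
                + indexName + "/docs/index?api-version=2020-06-30"))
            .header("Content-Type", "application/json")
            .header("api-key", apiKey)
            .POST(HttpRequest.BodyPublishers.ofString(batch))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

In practice you would build the batch from your database rows with a JSON library rather than by hand, and send documents in batches (up to 1000 per request) rather than one at a time.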

Besides the answer above, which involves coding on your end, there is a solution you can implement using the Azure Data Factory PostgreSQL connector: define a custom query that tracks recent records and create a pipeline activity that sinks to an Azure Blob Storage account.
Then, within Data Factory, you can chain a pipeline activity that copies to an Azure Cognitive Search index, and add a trigger so the pipeline runs at specified times.
Once the staged data is in the storage account in delimitedText format, you can also use the built-in Azure Blob indexer with change tracking enabled.
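Whichever route you pick, the incremental piece usually reduces to a high-water-mark query against PostgreSQL. A hedged JDBC sketch of that idea follows; the connection string, table, and modified_at column are assumptions, and an ADF custom query would express the same filter:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Timestamp;

public class IncrementalExtract {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; requires the PostgreSQL JDBC driver on the classpath.
        Connection conn = DriverManager.getConnection(
            "jdbc:postgresql://localhost:5432/mydb", "user", "password");

        // High-water mark: the last modification timestamp already sent to the index,
        // persisted somewhere durable between runs.
        Timestamp lastRun = Timestamp.valueOf("2021-01-01 00:00:00");

        // Assumes a modified_at column maintained by the application or a trigger.
        PreparedStatement stmt = conn.prepareStatement(
            "SELECT id, name, modified_at FROM products "
            + "WHERE modified_at > ? ORDER BY modified_at");
        stmt.setTimestamp(1, lastRun);

        try (ResultSet rs = stmt.executeQuery()) {
            while (rs.next()) {
                // Transform each changed row into a search document and push it
                // (e.g. with the /docs/index call above), then advance the high-water mark.
                System.out.println(rs.getString("id") + " changed at "
                    + rs.getTimestamp("modified_at"));
            }
        }
        conn.close();
    }
}
```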

Related

MongoDB Trigger for Azure Functions

Azure Functions don't have a trigger for MongoDB right out of the box. Is there some custom MongoDB trigger out there that will allow me to take advantage of Change Streams in MongoDB? Ideally, I would like to find a MongoDB trigger equivalent to the CosmosDB trigger in Azure Functions, which takes advantage of the Change Feed. If that doesn't exist, is there some other way I can take advantage of MongoDB change streams with Azure Functions? We use Azure Functions extensively and need some way to integrate them with MongoDB; specifically, we need a trigger for database changes in MongoDB. I've seen examples of using Azure Functions with MongoDB via an HTTP trigger, but we need a trigger that makes use of the MongoDB change stream.
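For reference, the change-stream pattern the question describes looks like the following with the MongoDB Java driver. This is only a sketch under assumed names (the connection string, database, and collection are placeholders); since there is no built-in Functions trigger, a watcher like this would have to run in a separate long-lived process and invoke an HTTP-triggered Function per event:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import org.bson.Document;

public class ChangeStreamWatcher {
    public static void main(String[] args) {
        // Placeholder connection string; change streams require a replica set.
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> coll =
                client.getDatabase("mydb").getCollection("orders");

            // Blocks and iterates over change events as they occur.
            for (ChangeStreamDocument<Document> change : coll.watch()) {
                // Forward each event to an HTTP-triggered Azure Function here
                // (e.g. POST the change document to the function's endpoint).
                System.out.println(change.getOperationType() + ": "
                    + change.getFullDocument());
            }
        }
    }
}
```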

How can I connect Power BI to Queries in Azure DevOps?

I need to get the data from the Queries in Azure DevOps. I'm trying to establish a direct connection between them. I'm able to access all the other items like Boards, tasks, work items, etc., but I'm unable to see the Queries. How can I rectify this issue?
Thanks in advance.
Generally, you can pull data from Analytics into Power BI in one of three ways:
Connect using the OData queries
Connect using the Azure DevOps Data Connector
Connect using Power BI's OData Feed connector
For more details, please check the following link:
https://learn.microsoft.com/en-us/azure/devops/report/powerbi/overview?view=azure-devops#supported-data-connection-methods
It seems you are using the second way. That connector only works with Boards data (work items) and does not support other data types, so you cannot establish a direct connection between a Query and Power BI. However, since a Query just lists work items based on field criteria you specify, you can create a custom Analytics view in Azure DevOps, add filters by those field criteria, and then connect to that custom Analytics view from Power BI.
https://learn.microsoft.com/en-us/azure/devops/report/powerbi/analytics-views-create?view=azure-devops
Or you can use OData queries to filter by field criteria directly.
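For example, a direct Analytics OData query that filters work items by field criteria might look like the following (the organization, project, and filter values are placeholders):
https://analytics.dev.azure.com/{organization}/{project}/_odata/v3.0-preview/WorkItems?$filter=State eq 'Active'&$select=WorkItemId,Title,State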

Is there any way to call the Bing Ads API through a pipeline and load the data into BigQuery through Google Data Fusion?

I'm creating a pipeline in Google Data Fusion that should let me export my Bing Ads data into BigQuery using my Bing Ads developer token. I couldn't find any suitable data sources to add to my pipeline in Data Fusion. Is fetching data from API calls even supported in Google Data Fusion, and if it is, how can it be done?
HTTP-based sources for Cloud Data Fusion are currently in development and will be released by Q3. Could you elaborate on your use case a little more, so we can make sure your requirements will be covered by those plugins? For example, are you looking to build a batch or real-time pipeline?
In the meantime, you have the following two, more immediate options/workarounds:
If you are OK with storing the data in a staging area in GCS before loading it into BigQuery, you can use the HTTPToHDFS plugin that is available in the Hub; use a path that starts with gs://<bucket>/path/to/file.
Alternatively, we welcome contributions, so you can also build the plugin yourself using the Cloud Data Fusion APIs. We are happy to guide you and can point you to documentation and samples.

How to edit the content of a data extract in Tableau?

I'm creating an extract from a table hosted on MS SQL Server in Tableau.
After I create the data extract, is there any way I can enable the end users to edit the data extract content? Something like an interface?
Thanks in advance
The Tableau Data Extract API (and publicly released products) do not allow you to modify the contents of an extract.
You can append new data rows to an extract, or refresh (i.e. regenerate) an extract.
Think of extracts like datamarts -- read-only snapshots of a portion of some other data store, designed to allow efficient analysis and reporting. They aren't intended to replace databases.
If you want users to make live updates, consider using a database and some sort of tech stack to allow form-based updates.
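For reference, appending rows with the legacy Extract API looks roughly like the following in Java. This is a sketch from memory of the old TDE SDK (since superseded by the Hyper API): the file, table, and column names are placeholders, and the exact class and method names should be verified against the SDK documentation.

```java
import com.tableausoftware.TableauException;
import com.tableausoftware.common.Type;
import com.tableausoftware.extract.Extract;
import com.tableausoftware.extract.ExtractAPI;
import com.tableausoftware.extract.Row;
import com.tableausoftware.extract.Table;
import com.tableausoftware.extract.TableDefinition;

public class AppendToExtract {
    public static void main(String[] args) throws TableauException {
        ExtractAPI.initialize();

        // Opens the .tde if it exists, creates it otherwise (file name is hypothetical).
        Extract extract = new Extract("orders.tde");
        Table table;
        if (extract.hasTable("Extract")) {
            table = extract.openTable("Extract");
        } else {
            TableDefinition schema = new TableDefinition();
            schema.addColumn("OrderId", Type.INTEGER);
            schema.addColumn("Customer", Type.UNICODE_STRING);
            table = extract.addTable("Extract", schema);
        }

        // Append-only: new rows can be inserted, but existing rows
        // cannot be edited or deleted through this API.
        Row row = new Row(table.getTableDefinition());
        row.setInteger(0, 1001);
        row.setString(1, "Acme Corp");
        table.insert(row);

        extract.close();
        ExtractAPI.cleanup();
    }
}
```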
Alternatively, you can publish the extract to Tableau Server and allow users to connect to this data source to create their own reports; this is called web authoring in Tableau.
Using the server version of Tableau, this can also be referred to as ad-hoc reporting.

SQL Azure data synchronisation and maintaining the history of the database

I have an on-premises database and the same database in the cloud. When the on-premises database gets updated, the SQL Azure database should also be updated; only the changed fields should be updated, and the rest should remain the same. How can this be achieved in minimal time?
There is a no-code solution called SQL Azure Data Sync CTP2, but you need to request access, which unfortunately has stopped for now (http://connect.microsoft.com/sqlazurectps).
You could try using the Sync Framework. Have a look at this article: http://blogs.msdn.com/b/sync/archive/2010/08/31/sql-server-to-sql-azure-synchronization-using-sync-framework-2-1.aspx
Just a note: neither Sync Framework nor SQL Azure Data Sync does column-level change tracking or synchronization. When a column in a row is changed, the entire row is sent during synchronization.
As Paras mentioned, SQL Azure Data Sync is in the CTP stage (CTP2 now, with CTP3 supposed to come out this summer).
Sync Framework 2.1, however, already supports syncing with Azure.
Check out "Synchronizing with SQL Azure using Sync Framework" for links to various walkthroughs/samples.