How can I connect PowerBi to Queries in Azure Devops? - azure-devops

I need to get the data from the queries in Azure DevOps. I'm trying to establish a direct connection between them. I'm able to access all the other items like Boards, tasks, work items, etc., but I'm unable to see the Queries. How can I rectify this issue?
Thanks in advance.

Generally, you can pull data from Analytics into Power BI in one of three ways:
Connect using the OData queries
Connect using the Azure DevOps Data Connector
Connect using Power BI's OData feed connector
For more details, please check the following link:
https://learn.microsoft.com/en-us/azure/devops/report/powerbi/overview?view=azure-devops#supported-data-connection-methods
It seems you are using the second option. This connector only works with Boards data (work items) and does not support other data types, so you cannot establish a direct connection between a query and Power BI. However, since a query just lists work items based on the field criteria you specify, you can create a custom Analytics view in Azure DevOps, add filters by field criteria, and then connect to that custom Analytics view in Power BI.
https://learn.microsoft.com/en-us/azure/devops/report/powerbi/analytics-views-create?view=azure-devops
Alternatively, you can use OData queries to filter on field criteria directly.
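As a rough sketch of the OData approach, the Analytics endpoint can be queried directly with a personal access token. Everything below (organization, project, PAT, and the specific $filter criteria) is a placeholder you would replace with your own values; the endpoint format matches the Analytics docs linked above.

```python
import base64
import json
import urllib.request

# Placeholders -- substitute your own organization, project, and PAT.
ORG = "my-org"
PROJECT = "my-project"
PAT = "my-personal-access-token"

# Analytics OData endpoint; v3.0-preview matches the docs linked above.
BASE_URL = f"https://analytics.dev.azure.com/{ORG}/{PROJECT}/_odata/v3.0-preview"

# Reproduce a query's field criteria with an OData $filter,
# e.g. active bugs with their title and state.
QUERY = (
    "/WorkItems"
    "?$filter=WorkItemType eq 'Bug' and State eq 'Active'"
    "&$select=WorkItemId,Title,State"
)

def fetch_work_items():
    # A PAT is sent as Basic auth with an empty user name.
    token = base64.b64encode(f":{PAT}".encode()).decode()
    req = urllib.request.Request(
        BASE_URL + QUERY.replace(" ", "%20"),
        headers={"Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```

The same URL (without the Python wrapper) can be pasted into Power BI's OData feed connector.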

Related

Azure IoT hub message routing push with blob index tags

I have a setup which consists of devices sending data to Azure cloud IoT Hub using message routing (to storage endpoints), which ends up as blobs in a container. The frequency of data pushes is high. On the other end, I want to be able to query my blob container to pull files based on specific dates.
I came across blob index tags, which look like a promising solution for querying and are supported by the Azure SDK for .NET.
I was thinking of adding tags to each blob, e.g. processedDate: <dd/mm/yyyy>, which would help me query on them later.
I found out that while uploading blobs manually it is possible to add the tags, but I'm not sure how or where to configure the same in the message routing flow, where blobs are created on the fly. So I am looking for a solution to add those tags in flight as the blobs are pushed to the container.
Any help on this will be much appreciated.
Thanks much!
Presently, Azure IoT Hub doesn't have a feature to populate custom metadata (headers, properties, tags, etc.) on a custom endpoint.
However, for a storage custom endpoint such as yours, you can use an EventGridTrigger function to tag each blob based on your needs.
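A minimal sketch of that idea, assuming the azure-functions and azure-storage-blob Python packages and a hypothetical account key placeholder: a function triggered by the Microsoft.Storage.BlobCreated event that sets a processedDate index tag on the new blob.

```python
from datetime import datetime, timezone

# Placeholder -- substitute your storage account key (or another credential).
ACCOUNT_KEY = "my-storage-account-key"

def build_tags(now=None):
    # Blob index tags are plain string key/value pairs;
    # processedDate uses the dd/mm/yyyy format from the question.
    now = now or datetime.now(timezone.utc)
    return {"processedDate": now.strftime("%d/%m/%Y")}

def main(event):  # bound as an EventGridTrigger in function.json
    # Imported here so the tagging logic above stays dependency-free.
    from azure.storage.blob import BlobClient

    # The BlobCreated event payload carries the new blob's URL.
    blob_url = event.get_json()["url"]
    blob = BlobClient.from_blob_url(blob_url, credential=ACCOUNT_KEY)
    blob.set_blob_tags(build_tags())
```

Because message routing writes blobs on the fly, the tag is applied moments after creation rather than at upload time, which is usually close enough for date-based querying.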

Tool for Azure cognitive search similar to Logstash?

My company has lots of data (database: PostgreSQL), and the requirement now is to add a search feature on top of it; we have been asked to use Azure Cognitive Search.
I want to know how we can transform the data and send it to the Azure search engine.
There are a few cases which we have to handle:
1. How will we transfer and upload the existing data into the search engine's index?
2. What will be the easiest way to update the data in the search engine with new records in our production database? (For now we are using Java back-end code for transforming the data and updating the index, but it is very time consuming.)
3. What will be the best way to handle a change to the existing database structure? How can we update the indexer without the overhead of recreating the indexers every time?
4. Is there any way we can automatically update the index whenever there is a change in database records?
You can either write code to push data from your PostgreSQL database into the Azure Search index via the /docs/index API, or you can configure an Azure Search Indexer to do the data ingestion. The upside of configuring an Indexer is that you can also configure it to monitor the datasource on a schedule for updates and have those updates reflected into the search index automatically, for example via the SQL Integrated Change Tracking Policy.
PostgreSQL is a supported datasource for Azure Search Indexers, although the datasource is in preview (not yet generally available).
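For the push approach, here is a hedged sketch of posting a document batch to the /docs/index endpoint. The service name, index name, and admin key are placeholders; the batch shape ("value" array with an @search.action per document) follows the Azure Cognitive Search REST API.

```python
import json
import urllib.request

# Placeholders -- substitute your own search service, index, and admin key.
SERVICE = "my-search-service"
INDEX = "my-index"
API_KEY = "my-admin-key"
API_VERSION = "2020-06-30"

def build_batch(rows):
    # mergeOrUpload inserts new documents and updates existing ones, so
    # the same batch shape works for the initial load and incremental sync.
    return {"value": [{"@search.action": "mergeOrUpload", **row} for row in rows]}

def push(rows):
    url = (f"https://{SERVICE}.search.windows.net"
           f"/indexes/{INDEX}/docs/index?api-version={API_VERSION}")
    req = urllib.request.Request(
        url,
        data=json.dumps(build_batch(rows)).encode(),
        headers={"api-key": API_KEY, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The rows themselves would come from your PostgreSQL queries (e.g. a SELECT over records changed since the last sync), replacing the Java transformation code mentioned in the question.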
Besides the answer above, which involves coding on your end, there is a solution you can implement using the Azure Data Factory PostgreSQL connector with a custom query that tracks recent records, and a pipeline copy activity that sinks to an Azure Blob Storage account.
Then, within Data Factory, you can chain a pipeline activity that copies to an Azure Cognitive Search index and add a trigger to run the pipeline at specified times.
Once the staged data is in the storage account in DelimitedText format, you can also use the built-in Azure Blob indexer with change tracking enabled.

Power BI Testreport: Failed Testrun without Bug

I want to report failed test runs without a related bug. I think a really detailed entity-relationship diagram of the DevOps databases would give the answer; unfortunately, I have found only parts that don't answer my question.
Thanks for any hint that kicks me forward.
Joe
You can refer to this document below to do it:
Connect to Analytics data by using the Power BI OData feed
Simple steps:
Open Power BI
Get data > Other > OData feed
Specify the URL, in the format: https://analytics.dev.azure.com/{OrganizationName}/_odata/v3.0-preview/
Connect (you may choose Basic and specify a personal access token)
Select the related table

How to retrieve Velocity data that's used for Azure Dev-Ops Analytics Velocity Dashboard Widget?

I want to retrieve the backend data that drives the Azure DevOps Analytics Velocity dashboard widget [Committed and Completed points for each iteration]. I would like to pull this data either using an OData feed or the Visual Studio Team Services (Beta) connector in order to create custom Power BI reports.
[1] An example of the report I'm trying to pull the data from can be found here: https://learn.microsoft.com/en-us/azure/devops/report/analytics/_img/velocity-ax-catalog.png?view=vsts
Why do you have to retrieve the data? You can directly create Power BI reports using Analytics views.
Please see What are Analytics views for details.
When using the Power BI Data Connector, these same default views appear in the Navigator dialog. The view you select determines the set of records, fields, and history which are loaded into Power BI. Default Analytics views work well for customers with small datasets. To learn more, see Default Analytics views.
If the default Analytics views do not meet your needs, you can create custom views to fine-tune the records, fields, and history returned to Power BI.
Please see below links for more information:
Simplify creation of your Power BI reports using Analytics views
Create an Analytics view in Azure DevOps
Power BI integration overview
I was able to replicate the Velocity report by using the "Stories - All history" view. You would need to modify the view to track daily history and then use the Iteration start date/Iteration end date along with the "date" column to filter out the Committed and Completed data.
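If you would rather pull the raw numbers than filter a view, the same idea can be sketched as Analytics OData queries over the daily work item snapshot. This is only an illustration under stated assumptions (User Stories carrying Story Points, and the WorkItemSnapshot entity with Iteration navigation fields per the Analytics OData model); the query builder below just assembles the filter and aggregation strings.

```python
def velocity_query(snapshot_date, state=None):
    # Sum Story Points from the daily WorkItemSnapshot on a given day of
    # the iteration: the snapshot at Iteration/StartDate approximates
    # "committed" points, and the snapshot at Iteration/EndDate restricted
    # to State eq 'Closed' approximates "completed" points.
    criteria = f"WorkItemType eq 'User Story' and DateValue eq {snapshot_date}"
    if state:
        criteria += f" and State eq '{state}'"
    return ("/WorkItemSnapshot?$apply="
            f"filter({criteria})"
            "/groupby((IterationSK,Iteration/IterationName),"
            "aggregate(StoryPoints with sum as Points))")

committed = velocity_query("Iteration/StartDate")
completed = velocity_query("Iteration/EndDate", state="Closed")
```

Each string would be appended to your organization's Analytics OData base URL, giving one row of summed points per iteration, which is the same shape the Velocity widget charts.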

Power BI API push data to an existing dataset from Power BI Desktop

I was able to create a dataset and push data to it via an application by Power BI Rest API.
Now I have a dataset that I imported from a .pbix file (Power BI Desktop). Can I push data to it via the Power BI REST API? I have tried, and I always get an error.
It seems that the REST APIs can only manipulate datasets and tables that were also created via the REST APIs.
You can check this idea, Push data via API into an existing model, and vote it up.
If you are using a push dataset, you are using a live connection, which means you can only have one push dataset in your .pbix report.
Pushing data with the REST API applies only to push datasets.
*** This is true at the moment; it might change in future updates of Power BI.
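For completeness, here is a hedged sketch of what pushing rows looks like when the target is a push dataset (i.e. one created through the REST API, not imported from a .pbix). The dataset ID, table name, and Azure AD access token are placeholders; the rows endpoint and {"rows": [...]} payload shape follow the Power BI REST API.

```python
import json
import urllib.request

# Placeholders: DATASET_ID must refer to a dataset created as a push
# dataset through the REST API, and TOKEN is an Azure AD access token.
DATASET_ID = "my-dataset-id"
TABLE = "Sales"
TOKEN = "my-access-token"

def build_payload(rows):
    # The add-rows API expects {"rows": [...]} with one JSON object per row.
    return {"rows": rows}

def push_rows(rows):
    url = (f"https://api.powerbi.com/v1.0/myorg"
           f"/datasets/{DATASET_ID}/tables/{TABLE}/rows")
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(rows)).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
    )
    urllib.request.urlopen(req).close()
```

Running this against a dataset imported from a .pbix file is exactly the case that fails with an error, which matches the limitation described above.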