I am trying to get usage data from Power BI into Power BI Desktop in order to create an admin report. The report will show the usage of the different reports in Power BI.
To get this data I am using the Power BI REST APIs, specifically calls such as:
GET https://api.powerbi.com/v1.0/myorg/admin/datasets -- To get datasets
GET https://api.powerbi.com/v1.0/myorg/admin/apps?$top={$top} -- To get apps
To get datasets in Power Query I can then write:
let
    Source = Json.Document(
        Web.Contents(
            "https://api.powerbi.com/v1.0/myorg/admin/datasets",
            [Headers = [Authorization = "Bearer MYKEY"]]
        )
    )
in
    Source
This does retrieve the datasets. However, the key used is taken from https://learn.microsoft.com/en-us/rest/api/power-bi/admin/apps-get-apps-as-admin#code-try-0
That page lets a user try the API, which issues a temporary key. To get a refreshable token/key, another call must be made to the API. To make that call I have created an application in Azure which has been granted rights by our admin. To retrieve the refreshable token, I have written the following in Power Query (using an Azure HTTP POST request for an access token from Power BI):
() =>
let
    apiUrl = "https://login.windows.net/MY TENANT ID/oauth2/token",
    body = [
        client_id = "My Client ID",
        grant_type = "client_credentials",
        client_secret = "My Client Secret",
        resource = "https://analysis.windows.net/powerbi/api"
    ],
    Source = Json.Document(
        Web.Contents(
            apiUrl,
            [
                Headers = [Accept = "application/json"],
                Content = Text.ToBinary(Uri.BuildQueryString(body))
            ]
        )
    )
in
    Source
This call is successful and returns the following (photo):
The natural progression would then be to paste the generated access token into my first query, but this gives me an access error: "Expression.Error: Access to the resource is forbidden." When I change the data source settings from Anonymous to Windows I get another error: "Expression.Error: The 'Authorization' header is only supported when connecting anonymously..."
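For reference, the combined query looks roughly like this (assuming the token function above is saved as a query named GetToken; the name is illustrative):
let
    token = GetToken()[access_token],
    Source = Json.Document(
        Web.Contents(
            "https://api.powerbi.com/v1.0/myorg/admin/datasets",
            [Headers = [Authorization = "Bearer " & token]]
        )
    )
in
    Source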
Any ideas on what to do in order to get the data into Power BI would be greatly appreciated. Thanks.
It is probably not the answer you are looking for, but my team was attempting to do the very same thing and found that the only way to get this to work was to add some form of data store between the PBI admin logs and the PBI dataset. We use CSVs and a SQL database: an Azure job runs on a schedule, collects the data, and stores it locally, and then PBI reads that stored data.
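As a very rough illustration of what such a scheduled collection step can look like (this sketch assumes the MicrosoftPowerBIMgmt PowerShell module and an account with admin API access; all names and paths are placeholders):
# Sketch only: pull admin dataset metadata and stage it as a CSV for the report to read
Connect-PowerBIServiceAccount

$json = Invoke-PowerBIRestMethod -Url "admin/datasets" -Method Get
($json | ConvertFrom-Json).value |
    Select-Object id, name, webUrl, configuredBy |
    Export-Csv -Path "datasets.csv" -NoTypeInformation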
This is not a direct answer to your question, but you can build admin reports from the Power BI Service data using the Power BI REST API Connector from GitHub (link). You can then connect to the service directly from Power BI Desktop, without dealing with OAuth and/or AAD authentication yourself. The connector has some limitations, but it was very useful for our reporting.
I don't know if you can install this custom connector in the Power BI Service, but it works perfectly in Power BI Desktop.
Related
I'm trying to pull data from HubSpot into my SQL Server database through an Azure Data Factory pipeline using a REST dataset. I'm having problems setting up the right pagination rules. I've already spent a day on Google and MS guides, but I find it hard to get it working properly.
This is the source API. I am able to connect and pull the first set of 20 rows. The response body returns an offset that can be used with vidOffset=.
I need to feed the vid-offset value from the response body into the next HTTP request, and the process needs to stop when has-more returns 'false'.
I tried to reproduce the same in my environment and got the results below:
First, I created a linked service with this URL: https://api.hubapi.com/contacts/v1/lists/all/contacts/all?hapikey=demo&vidOffset
Then I created the pagination rule with an end condition on $.has-more and an AbsoluteUrl rule.
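For reference, the same idea expressed in the copy activity's REST source JSON looks roughly like this (a sketch only; this variant feeds vid-offset back into the vidOffset query parameter rather than using AbsoluteUrl, and the exact rule and value syntax should be checked against the ADF REST connector's pagination documentation):
"source": {
    "type": "RestSource",
    "paginationRules": {
        "QueryParameters.vidOffset": "$['vid-offset']",
        "EndCondition:$['has-more']": "Const:false"
    }
}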
For demo purposes, I used a storage account as the sink.
The pipeline run was successful; see the image below for reference.
For more information, refer to this MS document.
I'm trying to get the refresh logs for each dataset from the Power BI REST API with a PowerShell script.
Documentation for the API: https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/getrefreshhistory
But I'm getting the below error in PowerShell. Could you please help me?
At first sight, you should use Invoke-PowerBIRestMethod instead of the generic Invoke-RestMethod. The first one takes care of adding the authorization token to the request for you, which you must add manually when using the generic cmdlet (and you didn't).
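For example, a minimal sketch using the MicrosoftPowerBIMgmt module (the dataset ID is a placeholder):
# Sign in first; Invoke-PowerBIRestMethod then adds the Authorization header for you
Connect-PowerBIServiceAccount

$datasetId = "<dataset id>"

# The URL is relative and is resolved against https://api.powerbi.com/v1.0/myorg/
$response = Invoke-PowerBIRestMethod -Url "datasets/$datasetId/refreshes" -Method Get

($response | ConvertFrom-Json).value |
    Select-Object refreshType, status, startTime, endTime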
I am currently working on a PowerShell script that will automatically download the billing data from my company's CSP platform. I use the following PowerShell module (https://github.com/Microsoft/Partner-Center-PowerShell)
And I use the following API (which the module calls, I guess) to get the overall price for the last month across all my subscriptions: https://learn.microsoft.com/en-us/partner-center/develop/get-a-subscriptions-resource-usage-information. With this PowerShell module, which is wonderful, I've managed to retrieve good data about my Azure resources and print it out to a CSV file for Power BI to create a report from.
My question is about using Power BI to create graphs of my cost and usage. I don't see the names of my resources (VM, storage, SQL); instead I see the names of the meter types (Read Operations, LRS Write Additional IO, ...). Of course that is also a good indicator, but I would love to see the resource (the name of the VM, storage account, or SQL instance) with the highest cost and usage, not just the type. The ResourceName in the response of the API is not exactly right. The resource name (in resource URL format), however, is available with this API: https://learn.microsoft.com/en-us/partner-center/develop/get-a-customer-s-utilization-record-for-azure.
But there I cannot retrieve the cost of my Azure resources. I tried to combine the two APIs (one to retrieve the cost and the other to retrieve the resource URL, with the resource ID as the join key), but strangely enough some customers have data in the usage API and not in the utilization API, so that didn't work out well. My question today is: is it possible to retrieve the resource name or URL from the response data I get from this API: https://learn.microsoft.com/en-us/partner-center/develop/get-a-subscriptions-resource-usage-information? Or is there another way to provide the name of the object instead of the service?
The resource usage feature will not return the resource name. So, you will need to combine data from Get-PartnerCustomerSubscriptionUtilization and Get-PartnerAzureRateCard to get the resource name and the partner cost associated with each billable meter. Since you are working with Power BI, it might be a good idea to check out Partner-Center-Query. It is another project that allows you query data from the Partner Center API through Power Query.
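As a rough sketch of that combination with the PartnerCenter module (the customer and subscription IDs are placeholders, and the property names should be verified against the objects the module actually returns):
# Sketch: join utilization records (which carry the resource URI) with the rate card
# (which carries the price per billable meter)
Connect-PartnerCenter

$customerId     = "<customer id>"
$subscriptionId = "<subscription id>"

$usage = Get-PartnerCustomerSubscriptionUtilization -CustomerId $customerId `
    -SubscriptionId $subscriptionId -Start (Get-Date).AddDays(-30) -Granularity Daily -ShowDetails

# Index rate card meters by meter id so each usage record can be priced
$meters = @{}
foreach ($m in (Get-PartnerAzureRateCard).Meters) { $meters[$m.Id] = $m }

$report = foreach ($r in $usage) {
    $meter = $meters[$r.Resource.Id]
    $price = $null
    # First rate tier only; a real report also has to handle included quantities and tiers
    if ($meter) { $price = $meter.Rates.Values | Select-Object -First 1 }
    [pscustomobject]@{
        ResourceUri = $r.InstanceData.ResourceUri   # the actual resource (VM, storage account, ...)
        MeterName   = $r.Resource.Name
        Quantity    = $r.Quantity
        UnitPrice   = $price
    }
}

$report | Export-Csv -Path "usage.csv" -NoTypeInformation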
I've set up an Azure Data Factory pipeline to transfer the data from one table in our SQL Server Database to our new Azure Search service. The transfer job continuously fails giving the following error:
Copy activity encountered a user error at Sink side:
GatewayNodeName=SQLMAIN01,ErrorCode=UserErrorAzuerSearchOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error happened when writing data to Azure Search Index '001'.,Source=Microsoft.DataTransfer.ClientLibrary.AzureSearch,''Type=Microsoft.Rest.Azure.CloudException,Message=Operation returned an invalid status code 'RequestEntityTooLarge',Source=Microsoft.Azure.Search,'.
From what I've read thus far, the Request Entity Too Large error is the standard HTTP 413 error in REST APIs. Of all the research I've done, though, nothing helps me understand how to truly diagnose and resolve this error.
Has anyone dealt with this specifically in the context of Azure? I would like to find out how to get all of our database data into our Azure Search service. If there are adjustments that can be made on the Azure side to increase the allowed request size, the process for doing so certainly is not readily available anywhere I've seen on the internet, nor in the Azure documentation.
This error means that the batch size written by the Azure Search sink into Azure Search is too large. The default batch size is 1000 documents (rows). You can decrease it to a value that balances size and performance by using the writeBatchSize property of the Azure Search sink. See Copy Activity Properties in the "Push data to an Azure Search index by using Azure Data Factory" article.
For example, writeBatchSize can be configured on the sink as follows:
"sink": { "type": "AzureSearchIndexSink", "writeBatchSize": 200 }
I'm developing an Azure application using this stack:
(Client) Angular/Breeze
(Server) Web API/Breeze Server/Entity Framework/SQL Server
With every request I want to ensure that the user actually has the authorization to execute that action using server-side code. My question is how to best implement this within the Breeze/Web API context.
Is the best strategy to:
Modify the Web API controller and try to analyze the contents of the Breeze request before passing it further down the chain?
Modify the EFContextProvider and add an authorization test to every method exposed?
Move the security all into the database layer and make sure that a User GUID and Tenant GUID are required parameters for every query and only return relevant data?
Some other solution, or some combination of the above?
If you are using SQL Azure, then one option is to use SQL Azure Federations to do exactly that.
In very simplistic terms: if you have a TenantId column in a table that stores data from multiple tenants, then before you execute a query like SELECT Col1 FROM Table1, you execute a USE FEDERATION... statement to restrict the query results to a particular TenantId only, and you don't need to add WHERE TenantId = @TenantId to your query.
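For illustration, a minimal sketch of the idea (the federation name and distribution key are made up; see the USE FEDERATION example link below for the exact syntax):
-- Connect to the federation member that holds tenant 42's data;
-- FILTERING = ON scopes queries to that tenant automatically
USE FEDERATION TenantFederation (TenantId = 42) WITH RESET, FILTERING = ON
GO

-- No WHERE TenantId = ... needed; only tenant 42's rows come back
SELECT Col1 FROM Table1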
USE FEDERATION example: http://msdn.microsoft.com/en-us/library/windowsazure/hh597471.aspx
Note that the use of SQL Azure Federations comes with lots of strings attached when it comes to building a DB schema; one of the best blogs I have found about it is http://blogs.msdn.com/b/cbiyikoglu/archive/2011/04/16/schema-constraints-to-consider-with-federations-in-sql-azure.aspx.