Getting an error while pulling Azure DevOps data using the OData feed in Power BI for WorkItemRevisions - azure-devops

I am building a team velocity chart for Azure DevOps (ADO) in Power BI. It requires the planned and completed work items for each sprint.
I am unable to get the planned work items for previous sprints.
I tried pulling WorkItemRevisions details through the OData feed, but it gives me an error when I try to expand the IterationPath and AreaPath details.
Can anyone help me solve this issue?
Here are the details of the error. Initially I used this simple query:
=OData.Feed("https://analytics.dev.azure.com/organization/project/_odata/V3.0-preview/WorkItemRevisions?$filter=WorkItemType in ('User Story')", null, [Implementation="2.0"])
It gives the error:
Data source error: A null value was found for the property named 'AreaPath', which has the expected type 'Edm.String[Nullable=False]'. The expected type does not allow null values.
After that I used [OmitValues=ODataOmitValues.Nulls, ODataVersion=4].
But then it gives the error:
DataFormat.Error: We expected a property 'IterationPath', but the OData service omitted it from the response data. This can occur for a number of reasons and does not necessarily imply that the data does not exist or is of a default value.
Details: IterationPath
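Combining both options in a single query (a sketch based on the question's URL; the OmitValues option only takes effect together with ODataVersion = 4) would look roughly like this:

```m
let
    // OmitValues = ODataOmitValues.Nulls tells the connector to tolerate
    // properties the service omits instead of sending explicit nulls;
    // this requires ODataVersion = 4.
    Source = OData.Feed(
        "https://analytics.dev.azure.com/organization/project/_odata/V3.0-preview/WorkItemRevisions?$filter=WorkItemType in ('User Story')",
        null,
        [Implementation = "2.0", OmitValues = ODataOmitValues.Nulls, ODataVersion = 4]
    )
in
    Source
```

Narrowing the query with $select to only the columns you actually need can also reduce how many non-nullable properties the feed has to materialize.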

Related

Mapping Data Flows Error The stream is either not connected or column is unavailable

I have a metadata-driven pipeline and a mapping data flow to load my data. When I try to run this pipeline, I get the following error.
{"message":"at Derive 'TargetSATKey'(Line 42/Col 26): Column 'PersonVID' not found. The stream is either not connected or column is unavailable. Details:at Derive 'TargetSATKey'(Line 42/Col 26): Column 'PersonVID' not found. The stream is either not connected or column is unavailable","failureType":"UserError","target":"Data Vault Load","errorCode":"DFExecutorUserError"}
When I debug the mapping data flow, all the components in the data flow work as intended.
I guess that my source connection parameters aren't flowing through properly. Below is an image of my source connection.
Please let me know if you have any thoughts or questions.
I found a resolution to my problem. The error was that the data being passed in was a string, but when the variable was unpacked, my variable value didn't have quotes around it. Putting in the quotes fixed it.
For example: 'BusinessEntityID'
Please let me know if there are any questions.
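As a sketch of that fix (the variable name here is hypothetical): when a pipeline expression passes a column name into a data flow string parameter, the value has to arrive wrapped in single quotes so the data flow treats it as a string literal rather than a column reference:

```
// Hypothetical pipeline expression for a data flow string parameter.
// Without the surrounding quotes the data flow receives BusinessEntityID
// as a bare identifier and fails with "Column ... not found".
@concat('''', variables('KeyColumnName'), '''')
```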

OData query in Power BI fails on computed column

I've been following the tutorial here:
Tutorial
I can get the OData URI put together just fine and get a JSON response from Azure DevOps that looks exactly like I expect. However, when I take that same URI and use it as the OData source in Power BI, I get the error:
Details: "OData: The property 'PartiallySuccessfulRate' does not exist on type 'Microsoft.VisualStudio.Services.Analytics.Model.PipelineRun'. Make sure to only use property names that are defined by the type or mark the type as open type."
If I remove the computed columns, the query works fine in Power BI.
Is there a way to make Power BI accept the computed columns? Or do I have to do the calculation in Power BI?
I would rather do these small calculations in Power BI. I use OData queries for Dynamics most of the time as well. My main purpose with an OData query is to fetch only the required data, not millions of records.
Once that is achieved, I let Power BI do some calculations for me.
This also makes it easier for my team to collaborate, since they can update or change things easily.
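One way to move the calculation out of the OData query and into Power BI is a DAX measure. This is only a sketch: the table name PipelineRuns and the two count columns are assumptions about what your base query returns.

```dax
PartiallySuccessfulRate =
DIVIDE (
    SUM ( PipelineRuns[PartiallySuccessfulCount] ),  -- assumed column
    SUM ( PipelineRuns[TotalCount] )                 -- assumed column
)
```

DIVIDE also handles the divide-by-zero case gracefully, which a raw division in the OData $apply clause would not.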

How to force a pipeline's status to Failed

I'm using the Copy Data activity.
When there is a data error, I export the bad rows to a blob.
But in this case, the pipeline's status is still Succeeded. I want to set it to Failed. Is that possible?
When there is some data error.
It depends on what error you mean here.
1. If you mean a common incompatibility or mismatch error, ADF supports a built-in feature named fault tolerance in the Copy activity, which supports the following three scenarios:
Incompatibility between the source data type and the sink native type.
Mismatch in the number of columns between the source and the sink.
Primary key violation when writing to SQL Server/Azure SQL Database/Azure Cosmos DB.
If you configure to log the incompatible rows, you can find the log file at this path: https://[your-blob-account].blob.core.windows.net/[path-if-configured]/[copy-activity-run-id]/[auto-generated-GUID].csv.
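A sketch of the Copy activity typeProperties that turn on fault tolerance and redirect incompatible rows to that blob path (the source/sink types, linked service name, and container are placeholders):

```json
"typeProperties": {
    "source": { "type": "SqlSource" },
    "sink": { "type": "BlobSink" },
    "enableSkipIncompatibleRow": true,
    "redirectIncompatibleRowSettings": {
        "linkedServiceName": "AzureBlobStorageLinkedService",
        "path": "errorlogs"
    }
}
```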
If you want to abort the job as soon as any error occurs, you can configure the Copy activity that way instead.
Please see this case: Fault tolerance and log the incompatible rows in Azure Blob storage
2. If you are talking about your own logic for the data error, maybe some business logic, I'm afraid ADF can't detect that for you, though I think it's also a common requirement. However, you could follow this case (How to control data failures in Azure Data Factory Pipelines?) as a workaround. The main idea is to use a custom activity to divert the bad rows before the Copy activity executes. In the custom activity, you can upload the bad rows into Azure Blob Storage with the .NET SDK as you want.
Update:
Since you want to log all incompatible rows and make the job fail at the same time, I'm afraid that cannot be implemented in the Copy activity directly.
However, I came up with an idea: you could use an If Condition activity after the Copy activity to check whether the output contains rowsSkipped. If so, output False; then you will know some data was skipped, and you can check it in the blob storage.
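The If Condition check described above could be expressed roughly like this (the activity name 'Copy Data' is taken from the question; rowsSkipped only appears in the output when fault tolerance actually skipped rows, hence the coalesce):

```
@greater(coalesce(activity('Copy Data').output.rowsSkipped, 0), 0)
```

When this evaluates to true, the True branch can run an activity that deliberately errors, forcing the overall pipeline status to Failed.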

In which case does this error occur: "Restricted dimension(s): ga:userAgeBracket, ga:userGender can only be queried under certain conditions"?

I'm using the Google Analytics Core Reporting API v4. When I query using the dimensions ga:userAgeBracket and ga:userGender, I get the following error:
Restricted dimension(s): ga:userAgeBracket, ga:userGender can only be queried under certain conditions
Can someone tell me why this error occurs?
Not all dimensions and metrics can be queried together. This can be for several reasons: it may not make sense to mix them, or a relation between them may not exist.
My guess would be that there is no relation between ga:userAgeBracket and ga:userGender. Gender comes from the DoubleClick cookie.

SSRS: Dropdown is not populated in filter in Report Builder

Whenever I try to apply a filter to an attribute which has ValueSelection=Dropdown, the dropdown is not populated, and the error message "The requested list could not be retrieved because the query is not valid or a connection could not be made to the data source" is shown instead.
If I set ValueSelection=List, I get a different error message:
An attempt has been made to use a semantic query extension associated with the data extension 'SQL' that is not registered for this report server.
(Microsoft.ReportingServices.SemanticQueryEngine)
This happens within the BIDS environment and was observed in both SQL 2005 and SQL 2008.
I've already studied articles discussing similar problems, but none of them applied to my case. The user account in the data source has all the necessary rights, and data can be retrieved without any problem (for example, if I try "Explore data" in the data source view). SQL Profiler shows that no query is sent to SQL Server when there is an attempt to populate the dropdown. So nothing is wrong with the query; it is simply never executed.
Your connection is not working. Test your connection with a simple table and query output first.
This will let you verify the connection before trying anything advanced.
I had this problem, and in my case it was caused by a wrong connection string in the Data Source: instead of just a SQL Server name like "SOMESQLSERVER_MACHINE", I had, for some reason, "SOMESQLSERVER_MACHINE.our.corp.domain". They should point to the same machine, but then I realized the domain was wrong; after removing it, everything worked like a charm again. That said: it's always a good idea to start with detailed checks of your basic settings.
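For comparison, a minimal working data source connection string (reusing the machine name above; the database name is a placeholder) would look like:

```
Data Source=SOMESQLSERVER_MACHINE;Initial Catalog=MyReportDatabase;Integrated Security=True
```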
Otherwise this could be a problem with permissions to the folders on Report Manager.