Azure Data Factory CI/CD error: The document creation or update failed because of invalid reference

All, when running a build pipeline using Azure DevOps with an ARM template, the process consistently fails when trying to deploy a dataset, or a reference to a dataset, with this error:
ARM Template deployment: Resource Group scope (AzureResourceManagerTemplateDeployment)
BadRequest: The document creation or update failed because of invalid reference 'dataset_1'.
I've tried renaming the dataset and also recreating it to see if that would help.
I then deleted the dataset_1.json file from the repo and still get the same message, so I think it's some reference to this dataset rather than the dataset itself. I've looked through all the other files for references to it, but they all look fine.
Any ideas on how to troubleshoot this?
thanks

Try this:
It looks like you have created the 'myTestLinkedService' linked service and tested the connection, but haven't published it yet, and you are trying to reference that linked service in the new dataset you are creating with PowerShell.
To reference any Data Factory entity from PowerShell, please make sure those entities are published first. Try publishing the linked service from the portal first, and then run your PowerShell script to create the new dataset/activity.
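For illustration, a minimal sketch of that ordering with the Az.DataFactory cmdlets; the resource group, factory name, and definition file paths here are placeholders, not values from this thread:

# Publish the linked service first, then the dataset that references it.
Set-AzDataFactoryV2LinkedService -ResourceGroupName "myRG" -DataFactoryName "myADF" -Name "myTestLinkedService" -DefinitionFile ".\linkedService\myTestLinkedService.json"

# Only once the linked service exists in the factory, create the dataset that references it.
Set-AzDataFactoryV2Dataset -ResourceGroupName "myRG" -DataFactoryName "myADF" -Name "dataset_1" -DefinitionFile ".\dataset\dataset_1.json"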

I think I found the issue. When I went into the detailed logs, I found that in addition to this error there was a message about an invalid SQL connection string, and I suspected it might be related, since the dataset in question uses an Azure SQL Database linked service.
I adjusted the connection string and this seems to have solved the issue.
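For anyone else hitting this, a rough sketch of what a corrected Azure SQL Database linked service definition looks like and how to push it with PowerShell. The server, database, and credential values are placeholders; the point is that a malformed connectionString in the linked service can surface as an "invalid reference" error on the datasets that use it:

# Hypothetical corrected definition; replace server/database/credentials with your own.
$definition = @"
{
  "name": "AzureSqlLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=myuser;Password=<secret>;Encrypt=True;"
    }
  }
}
"@
Set-Content -Path .\AzureSqlLinkedService.json -Value $definition
Set-AzDataFactoryV2LinkedService -ResourceGroupName "myRG" -DataFactoryName "myADF" -Name "AzureSqlLinkedService" -DefinitionFile .\AzureSqlLinkedService.json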

Related

ADF Error code 2011: Required property 'connectionString' is not provided in connection properties

I am trying to connect to Snowflake using a linked service and copy data from ADLS to Snowflake using an ADF pipeline. I created the linked service and tested the connection; it works fine. Even debugging the pipeline works fine. Although, when I manually try to trigger the pipeline, I get "Required property 'connectionString' is not provided in connection properties".
Thanks in advance.
This looks like a configuration issue. This behavior is seen when the linked service is configured to use parameters (linked service parameterization) or a key vault, and the connection string value isn't passed to the linked service at runtime.
Since it works in debug mode, I would recommend publishing your linked service; if it is parameterized or uses AKV, please make sure that the connection string value is being passed and evaluated at runtime.
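As an illustration, a sketch of a Snowflake linked service whose connection string is resolved from Azure Key Vault at runtime; the vault linked service name and secret name are hypothetical. If the AKV reference (or a linked service parameter) cannot be resolved at trigger time, this "connectionString is not provided" error is the kind of failure you see:

# Hypothetical definition; "MyKeyVaultLinkedService" and "snowflake-connection-string" are placeholders.
$definition = @"
{
  "name": "SnowflakeViaAkv",
  "properties": {
    "type": "Snowflake",
    "typeProperties": {
      "connectionString": {
        "type": "AzureKeyVaultSecret",
        "store": { "referenceName": "MyKeyVaultLinkedService", "type": "LinkedServiceReference" },
        "secretName": "snowflake-connection-string"
      }
    }
  }
}
"@
Set-Content -Path .\SnowflakeViaAkv.json -Value $definition
Set-AzDataFactoryV2LinkedService -ResourceGroupName "myRG" -DataFactoryName "myADF" -Name "SnowflakeViaAkv" -DefinitionFile .\SnowflakeViaAkv.json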

Release Pipeline error when using Azure Dacpac Task

I'm new to using Azure release pipelines and have been fighting issues trying to deploy a database project to a new Azure SQL database. Currently the pipeline is giving me the following error:
TargetConnectionString argument cannot be used in conjunction with any other Target database arguments
I've tried deploying with and without the TargetConnectionString included in my publish profile. Any suggestions or something else to try? I'm out of ideas.
TargetConnectionString
Specifies a valid SQL Server/Azure connection string to the target database. If this parameter is specified it shall be used exclusively of all other target parameters. (short form /tcs)
So please remove all other TargetXXX arguments.
(If you don't have any, can you show what arguments you pass inline and in the publish profile, without the sensitive data of course?)
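A sketch of what the publish invocation then looks like with /tcs used exclusively; the dacpac path and connection values are placeholders. In the Azure Dacpac task, the equivalent would go in the additional SqlPackage.exe arguments field:

# /TargetConnectionString replaces /TargetServerName, /TargetDatabaseName, /TargetUser, etc.
& sqlpackage.exe /Action:Publish /SourceFile:"bin\Release\MyDatabase.dacpac" /TargetConnectionString:"Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=myuser;Password=<secret>;Encrypt=True;"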

Not Able to Publish ADF Incremental Package

As posted in an earlier thread about syncing data from on-premises MySQL to Azure SQL (over here, referring to this article), I found that the Lookup component for watermark detection is only available for SQL Server.
So I tried a workaround: while using the "Copy" data flow task, pick only the data greater than the last watermark stored from MySQL.
Issue:
I am able to validate the package successfully, but not able to publish it.
Question:
In the Copy data flow task I'm using the query below to get data from MySQL greater than the available watermark.
Can't we use a query like the one below on other relational sources like MySQL?
select * from #{item().TABLE_NAME} where #{item().WaterMark_Column} > '#{activity('LookupOldWaterMark').output.firstRow.WatermarkValue}'
Screenshots: CopyTask SQL query preview; validation succeeds; publish error with no details; debug succeeds.
Errors after following the steps mentioned by Franky:
Azure SQL linked service error (resolved by reconfiguring the connection / editing credentials in the connection tab)
Source query went blank (resolved by re-selecting the source type and rewriting the query)
Could you verify whether you have access to create a template deployment in the Azure portal?
1) Export the ARM template: in the top-right of the ADFv2 portal, click on ARM Template -> Export ARM Template, extract the zip file, and copy the content of the "arm_template.json" file.
2) Create the ARM template deployment: go to https://portal.azure.com/#create/Microsoft.Template and log in with the same credentials you use in the ADFv2 portal (you can also get to this page in the Azure portal by clicking "Create a resource" and searching for "Template deployment"). Now click on "Build your own template in the editor", paste the ARM template from the previous step into the editor, and Save.
3) Deploy the template: click on the existing resource group and select the same resource group as the one where your Data Factory is. Fill out the missing parameters (for this test it doesn't really matter whether the values are valid); the factory name should already be there. Agree to the terms and click Purchase.
4) Verify that the deployment succeeded. If not, let me know the error; it might be an access issue, which would explain why your publish fails. (The ADF team is working on providing a better error for this issue.)
Did any of the objects publish into your Data Factory?
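If the portal route is awkward, roughly the same access test can be done from PowerShell; the resource group and file names are placeholders, with the parameter file being the one the ADFv2 export zip contains alongside arm_template.json:

# Deploy the exported template into the factory's resource group.
New-AzResourceGroupDeployment -ResourceGroupName "myRG" -TemplateFile ".\arm_template.json" -TemplateParameterFile ".\arm_template_parameters.json" -Mode Incremental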

Using Azure Data Factory to update Azure Machine Learning models

When I use Data Factory to update Azure ML models as the documentation describes (https://learn.microsoft.com/en-us/azure/data-factory/v1/data-factory-azure-ml-update-resource-activity), I ran into one problem:
The blob reference: test/model.ilearner has an invalid or missing file extension. Supported file extensions for this output type are: ".csv, .tsv, .arff".
I searched for the problem and found this solution:
https://disqus.com/home/discussion/thewindowsazureproductsite/data_factory_create_predictive_pipelines_using_data_factory_and_machine_learning_microsoft_azure/
But my linked services for the outputs of the training pipeline and the update pipeline are already different.
How can I solve this problem?

Data Factory job failing to submit to Azure Batch?

I'm trying to submit a Data Factory pipeline to Azure Batch compute, using a linked service that I have used before and that works fine.
However, the pipeline is failing with the following message:
Azure Batch entity not found. Code: 'JobNotFound' Message: 'The specified job does not exist. RequestId:d8f2b8d6-b34b-4823-9a06-9037ff549185 Time:2016-05-26T10:21:43.1480686Z'
The two sentences seem inconsistent: one states that the Batch entity wasn't found, though the error code says JobNotFound, which refers to an Azure Batch job.
I would appreciate any help.
I fixed the problem by deleting the batch account, and creating a new one with a different name.
I could not figure out what was causing the issue.
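For anyone debugging the same error, a quick diagnostic sketch with the current Az.Batch cmdlets to see what the Batch account actually contains; the account and resource group names are placeholders:

# Build a Batch context from the account keys, then list pools and jobs.
$ctx = Get-AzBatchAccountKey -AccountName "mybatchaccount" -ResourceGroupName "myRG"
Get-AzBatchPool -BatchContext $ctx   # the pool referenced by the ADF linked service should appear here
Get-AzBatchJob -BatchContext $ctx    # JobNotFound suggests the job ADF expects is missing from this list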