Data Factory job failing to submit to Azure Batch? - azure-data-factory

I'm trying to submit a Data Factory pipeline to Azure Batch compute, using a linked service that I have used before and that works fine.
However, the pipeline fails with the following message:
Azure Batch entity not found. Code: 'JobNotFound' Message: 'The specified job does not exist. RequestId:d8f2b8d6-b34b-4823-9a06-9037ff549185 Time:2016-05-26T10:21:43.1480686Z'
The two parts of the message seem inconsistent: one states that the Batch entity wasn't found, but the code says JobNotFound, which refers to an Azure Batch job.
Any help would be appreciated.

I fixed the problem by deleting the Batch account and creating a new one with a different name.
I could not figure out what was causing the issue.
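For anyone hitting the same error, one sanity check worth trying before recreating the account is to confirm that the Batch account and pool referenced by the Azure Batch linked service actually exist. A rough PowerShell sketch, assuming the Az.Batch module; the account, resource group and pool names below are placeholders, not values from the original post:

# Placeholder names; swap in the values from your Azure Batch linked service.
$ctx = Get-AzBatchAccountKey -AccountName "mybatchaccount" -ResourceGroupName "my-rg"

# List the pools in the account and compare against the poolName in the linked service.
# A mismatched or deleted pool can surface as confusing "not found" errors when
# Data Factory tries to create its job on the Batch account.
Get-AzBatchPool -BatchContext $ctx | Select-Object Id, State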

Related

How to delete only the tables from the target database that are not in the source DB, using a DACPAC or SqlPackage.exe arguments? [Azure DevOps]

I have tried multiple ways to delete only the tables from the target database that are not in the source database using a dacpac.
If anyone has a better suggestion or solution for keeping the source and target DBs identical in terms of tables only, please suggest a solution using any of these:
dacpac file.
SQL Project in .NET.
SqlPackage.exe arguments.
I added these SqlPackage.exe arguments:
/p:DropObjectsNotInSource=true /p:BlockOnPossibleDataLoss=false /p:AllowDropBlockingAssemblies=true
I'm facing these errors:
*** Could not deploy package.
Warning SQL72012:
Warning SQL72015:
Warning SQL72014:
Warning SQL72045:
These Stack Overflow links (and similar ones) didn't help me:
Deploy DACPAC with SqlPackage from Azure Pipeline is ignoring arguments and dropping users
Is it possible to exclude objects/object types from sqlpackage?
I am expecting my source and target DBs to end up with the same set of tables.
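For reference, a full SqlPackage.exe publish invocation using those properties would look roughly like the sketch below (server, database and file names are placeholders, not taken from the question). Note that DropObjectsNotInSource applies to every object type, not just tables, which is why it is usually combined with ExcludeObjectTypes to keep users, logins and permissions from being dropped:

SqlPackage.exe /Action:Publish `
  /SourceFile:"MyDb.dacpac" `
  /TargetServerName:"myserver.database.windows.net" `
  /TargetDatabaseName:"MyTargetDb" `
  /p:DropObjectsNotInSource=true `
  /p:BlockOnPossibleDataLoss=false `
  /p:AllowDropBlockingAssemblies=true `
  /p:ExcludeObjectTypes="Users;Logins;RoleMembership;Permissions"

(The backticks are PowerShell line continuations; put everything on one line if calling it from cmd.)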

Azure Data Factory CICD error: The document creation or update failed because of invalid reference

All, when running a build pipeline in Azure DevOps with an ARM template, the process consistently fails when trying to deploy a dataset, or a reference to a dataset, with this error:
ARM Template deployment: Resource Group scope (AzureResourceManagerTemplateDeployment)
BadRequest: The document creation or update failed because of invalid reference 'dataset_1'.
I've tried renaming the dataset and also recreating it to see if that would help.
I then deleted the dataset_1.json file from the repo and still get the same message, so I think it's some reference to this dataset rather than the dataset itself. I've looked through all the other files for references to it, but they all look fine.
Any ideas on how to troubleshoot this?
thanks
Try this:
It looks like you have created the 'myTestLinkedService' linked service and tested the connection, but haven't published it yet, and you are trying to reference that linked service in the new dataset you are creating with PowerShell.
In order to reference any Data Factory entity from PowerShell, make sure those entities are published first. Try publishing the linked service from the portal first, and then run your PowerShell script to create the new dataset/activity.
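If you want to stay entirely in PowerShell rather than publishing from the portal, a minimal sketch (assuming the Az.DataFactory module; the resource group, factory name and definition file paths are placeholders) is simply to deploy the linked service before any dataset that references it:

# Placeholder resource group, factory name and file paths.
Set-AzDataFactoryV2LinkedService -ResourceGroupName "my-rg" -DataFactoryName "my-adf" `
  -Name "myTestLinkedService" -DefinitionFile ".\linkedService\myTestLinkedService.json"

# Create the dataset only after the linked service it references exists in the factory.
Set-AzDataFactoryV2Dataset -ResourceGroupName "my-rg" -DataFactoryName "my-adf" `
  -Name "dataset_1" -DefinitionFile ".\dataset\dataset_1.json"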
I think I found the issue. When I went into the detailed logs, I found that in addition to this error there was an error message about an invalid SQL connection string, so I thought it might be related, since the dataset in question uses an Azure SQL Database linked service.
I adjusted the connection string and this seems to have solved the issue.

Release Pipeline error when using Azure Dacpac Task

I'm new to using Azure release pipelines and have been fighting issues trying to deploy a database project to a new Azure SQL database. Currently the pipeline is giving me the following error...
TargetConnectionString argument cannot be used in conjunction with any other Target database arguments
I've tried deploying with and without the TargetConnectionString included in my publish profile. Any suggestions or something else to try? I'm out of ideas.
TargetConnectionString
Specifies a valid SQL Server/Azure connection string to the target database. If this parameter is specified it shall be used exclusively of all other target parameters. (short form /tcs)
So please remove all other TargetXXX arguments.
(If you don't have any, can you show which arguments you pass inline and which come from the publish profile, without actual credentials or data, of course?)
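For clarity, a publish call that uses TargetConnectionString should carry no other Target* argument at all, roughly like this (the connection string values are placeholders):

# TargetConnectionString replaces TargetServerName, TargetDatabaseName, TargetUser, TargetPassword, etc.
SqlPackage.exe /Action:Publish `
  /SourceFile:"MyDb.dacpac" `
  /TargetConnectionString:"Server=tcp:myserver.database.windows.net,1433;Initial Catalog=MyTargetDb;User ID=myuser;Password=<secret>;"

Alternatively, drop TargetConnectionString and pass /TargetServerName and /TargetDatabaseName instead; the point is not to mix the two styles between the task's inline arguments and the publish profile.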

Using Azure Data Factory to update Azure Machine Learning models

When I used Data Factory to update Azure ML models as described in the documentation (https://learn.microsoft.com/en-us/azure/data-factory/v1/data-factory-azure-ml-update-resource-activity),
I ran into one problem:
The blob reference: test/model.ilearner has an invalid or missing file extension. Supported file extensions for this output type are: ".csv, .tsv, .arff".'.
I searched for this problem and found this solution:
https://disqus.com/home/discussion/thewindowsazureproductsite/data_factory_create_predictive_pipelines_using_data_factory_and_machine_learning_microsoft_azure/ .
But the linked services for the outputs of my training pipeline and my update-resource pipeline are already different.
How can I solve this problem?

Spring GS - Creating a Batch Service: missing output from DB query

I have run the complete source for Getting Started - Creating a Batch Service.
Knowing that the sample uses the in-memory database provided by @EnableBatchProcessing, is the DB query result expected, or will it only be available if the data is persisted permanently?
After adding some debug lines, it seems that the DB query is executed before the job runs. Is this the expected behavior?
Is there anything I'm missing here?
Thanks,
Alex
You aren't missing anything. This is related to issue number 8 for that guide (https://github.com/spring-guides/gs-batch-processing/issues/8). I just created a pull request to address this issue. You can view the PR here (https://github.com/spring-guides/gs-batch-processing/pull/9) until it's merged.
UPDATE
The PR has been merged and the guide has been updated. The new version should no longer have this issue.