Diagnostic settings to log latest status of each Azure Data Factory Pipeline run - azure-data-factory

I have an ADF pipeline that runs once every day. I use Diagnostic settings to log pipeline runs to a Log Analytics workspace and use the following Kusto query:
AzureDiagnostics
| order by TimeGenerated desc
The results contain a row for every status event of each run (screenshot omitted).
Is there a way to display only the last status (Succeeded/Failed) for each pipeline run?

You could use the arg_max() aggregation function. For example:
AzureDiagnostics
| where TimeGenerated > ago(1d)
| summarize arg_max(TimeGenerated, *) by runId
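Building on that, you can also narrow the results to pipeline-run records and project just the columns you care about. This is a sketch: the column names `runId_g`, `pipelineName_s`, and `status_s`, and the `Category` value `PipelineRuns`, are assumptions about the diagnostic schema in your workspace — check an actual row first and adjust.

```kusto
AzureDiagnostics
| where TimeGenerated > ago(1d)
| where Category == "PipelineRuns"   // keep only pipeline-run records (assumed category name)
| summarize arg_max(TimeGenerated, status_s) by runId_g, pipelineName_s
| project pipelineName_s, runId_g, status_s, TimeGenerated
```

Because arg_max keeps the row with the latest TimeGenerated per group, each run ID surfaces only its final status.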

Related

How to print Talend Job Description and other details like job creation date?

Similar to the jobName variable, what is the variable for the job description? Along with jobName I want to print the job description and the job creation date (screenshot omitted).
Job metadata such as the description, author, and creation date are not added to the generated Java code, so it is not possible to retrieve them from the job at runtime; they are only stored in the source code. You can, however, print the project name, job name, and version using these variables:
job name: jobName
job version: jobVersion
project: projectName
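In a tJava component you would print these generated fields directly (e.g. `System.out.println(jobName)`). The snippet below is a standalone stand-in with hard-coded values, since the real fields only exist inside the generated job class:

```java
public class JobInfo {
    // Stand-ins for the fields Talend generates on the job class
    static String jobName = "MyJob";
    static String jobVersion = "0.1";
    static String projectName = "MYPROJECT";

    // Builds the same line a tJava component could print
    static String describe() {
        return "Job: " + jobName + " v" + jobVersion + " (project " + projectName + ")";
    }

    public static void main(String[] args) {
        System.out.println(describe());
    }
}
```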

How to refer to a previous task and stop the build in Azure DevOps if there is no new data to publish an artifact

Getstatus.exe will report "new data available" or "no new data available". If new data is available, the next jobs should be executed; otherwise nothing should run. How should I do this? (I am working in the classic editor.)
For example, I have a set of tasks, say four:
task-1: builds the solution
task-2: runs Getstatus.exe, which reports whether data is available
task-3: should use the result of the task above in a condition (or via some API query) and proceed to publish an artifact if data is available; if not, it should cleanly break out and stop the build without publishing the artifact or moving to the next task
task-4: publishes the artifact
First, what you need is to set a variable in the task where you run Getstatus.exe, and then set a condition such as eq(variables['doThing'], 'Yes') on the subsequent tasks. If you set doThing to any value other than Yes, those tasks will be skipped.
Since we need to execute different tasks depending on the result of running Getstatus.exe, we need to set the condition based on that result.
To resolve this, as Krzysztof Madej said, we could set variable(s) based on the return value of Getstatus.exe in an inline PowerShell task:
# Capture the output of Getstatus.exe (adjust the path/parsing to how your tool reports status)
$dataAvailable = & .\Getstatus.exe
if ($dataAvailable -eq "True")
{
    Write-Host "##vso[task.setvariable variable=Status]Yes"
}
elseif ($dataAvailable -eq "False")
{
    Write-Host "##vso[task.setvariable variable=Status]No"
}
Then set the different condition for next task:
You could check the document Specify conditions for more details.
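On the downstream publish task, the custom condition can then reference that variable. A minimal sketch, assuming the variable name Status set in the script above:

```
and(succeeded(), eq(variables['Status'], 'Yes'))
```

With this condition, the publish task runs only when all previous tasks succeeded and Status was set to Yes; otherwise it is skipped.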

Reading from a configuration database inside of Data Factory

I need to create a Data Factory pipeline to move data from SFTP to Blob storage. At this point I'm only doing a POC, and I'd like to know how I would read configuration settings and kick off my pipeline based on them.
Example of config settings would be (note that there would be around 1000 of these):
+--------------------+-----------+-----------+------------------------+----------------+
| sftp server | sftp user | sftp pass | blob dest | interval |
+--------------------+-----------+-----------+------------------------+----------------+
| sftp.microsoft.com | alex | myPass | /myContainer/destroot1 | every 12 hours |
+--------------------+-----------+-----------+------------------------+----------------+
How do you kick off a pipeline using some external configuration file/store?
Take a look at the Lookup activity and linked service parameterization.
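As a sketch of how those fit together: a Lookup activity reads the config table, and a ForEach loops over the returned rows. The names below (ConfigTable, the AzureSqlSource type, and the config column referenced in the note) are placeholders, not a definitive implementation:

```json
{
  "name": "CopyFromSftpByConfig",
  "properties": {
    "activities": [
      {
        "name": "LookupConfig",
        "type": "Lookup",
        "typeProperties": {
          "source": { "type": "AzureSqlSource" },
          "dataset": { "referenceName": "ConfigTable", "type": "DatasetReference" },
          "firstRowOnly": false
        }
      },
      {
        "name": "ForEachServer",
        "type": "ForEach",
        "dependsOn": [
          { "activity": "LookupConfig", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "items": {
            "value": "@activity('LookupConfig').output.value",
            "type": "Expression"
          },
          "activities": []
        }
      }
    ]
  }
}
```

Inside the ForEach, a Copy activity would reference the current row via expressions like @item().sftpServer and pass them into parameterized linked services. Note that the interval column would map to a trigger rather than the pipeline itself.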

Azure Data Factory Lookup

I have a lookup activity which reads data from a SQL table and the output of this is passed onto multiple Execute Pipeline tasks as parameters. The flow is as follows
Lookup -> Exct Pipeline 1 -> Exct Pipeline 2 -> Exct Pipeline 3
This works fine for the first pipeline; however, the second Execute Pipeline fails with the following error.
> "The template validation failed: 'The inputs of template action 'Exct
> Pipeline 2' at line '1 and column '178987' cannot reference action
> 'Lookup'. Action 'Lookup' must either be in 'runAfter' path or within
> a scope action on the 'runAfter' path of action 'Execute Pipeline 3',
> or be a Trigger"
Another point to be noted is that the pipeline runs fine when triggered. It only fails in debug.
Has anyone else seen this issue?

Report: Veritas Backup Exec 16 list of servers with the last successful job associated

I am using the PowerShell module BEMCLI and I want to create a report with these columns: the list of servers, and the jobs associated with each server along with their last successful run.
I can get the list of servers with: Get-BEAgentServer
I can also get the list of successful jobs in a period with:
Get-BEJobHistory -JobStatus Succeeded -FromStartTime (Get-Date).AddHours(-24) | ft -auto
Is there an easy way to get what I want?
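One way to get close is to sort the successful history newest-first and keep the first entry per job. This is only a sketch: the property names (Name, EndTime) are assumptions about what Get-BEJobHistory returns — inspect one record first with `Get-BEJobHistory | Select-Object -First 1 | Format-List *` and adjust, then join the result back to the servers from Get-BEAgentServer by the job's target server.

```powershell
# Latest successful run per job (property names Name/EndTime are assumptions)
Get-BEJobHistory -JobStatus Succeeded |
    Sort-Object -Property EndTime -Descending |
    Group-Object -Property Name |
    ForEach-Object { $_.Group | Select-Object -First 1 } |
    Select-Object Name, JobStatus, EndTime
```

Grouping by job name and taking the first element of each (already sorted) group is what reduces the history to one "last successful" row per job.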