I am now working on Talend Open Studio. I have many jobs.
I need to modify the content of my single context repository, for instance to add a new context variable. I then want to propagate this new context variable to all the jobs I have.
For now, I have had to open each job and manually add the context variable I want to propagate into the jobs.
Is there a way to directly propagate a context variable to all my jobs from the context repository I have modified?
This is a bug I have come across as well. On some occasions, just updating and saving the context group propagates the updates across all the jobs using that context group. At other times, it does not.
As per the link below, Shong from the Talend team says that this is supposed to be manual, as not all context variables are needed in all jobs. However, I personally feel it should be the other way round: whatever is not needed should be manually removed, and otherwise all updates should be reflected in all jobs.
https://www.talendforge.org/forum/viewtopic.php?id=19199
I have a reasonably complex release pipeline in Azure DevOps that releases a number of Azure apps, a database, etc.
Each step is genericised using a library variable for the environment. For example:
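A minimal sketch of what such a step looks like, assuming a library variable named Environment supplied by the linked variable group (the variable name and the app naming convention are illustrative):

    # Hypothetical inline PowerShell step: $(Environment) is substituted from
    # the linked library variable group before the script runs (e.g. "dev", "prod").
    $appName = "myapp-$(Environment)"
    Write-Host "Deploying to $appName"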
But library variable groups are linked to a release or a selection of stages.
Currently I have to clone the entire pipeline and link a new library variable group in the clone to publish to a different environment, but this means a lot of unwanted duplication and maintenance.
How can I run the same release pipeline with different library variables?
If I could do this, it would be possible to have a release for a given branch, for example, but I cannot see a way to do it.
As of this time, it is not supported to select which variable groups to use when you create a release.
If you only have one or a few variables, I think you can use pipeline variables instead of variable groups, so that you can update them at release time. Here are the detailed steps:
1. Go to your pipeline editing page and select the "Variables" tab. Click "Add" to add a variable, then check the option "Settable at release time".
2. Re-create your release. You will find the variables defined in step 1, and you can edit them before creating the release.
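Once such a variable exists, a task can read it in the usual ways. A minimal sketch inside an inline PowerShell task, using a hypothetical variable named TargetEnvironment:

    # $(TargetEnvironment) is macro syntax, substituted by Azure DevOps before
    # the script runs; the same variable is also exposed as an environment
    # variable with the name upper-cased and dots replaced by underscores.
    Write-Host "Deploying to $(TargetEnvironment)"
    Write-Host "Deploying to $env:TARGETENVIRONMENT"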
If you have many variables, I suggest you try to change the structure of your pipeline to make it more suitable for deployment to multiple environments. As Daniel said, you can use a stage for each environment, and then scope the variable groups to individual stages.
I am configuring a release pipeline in Azure DevOps and I want the variables that get generated by the tasks to persist across re-executions of the same release. I wanted to know if that is possible.
The main goal is to create a pipeline that I can redeploy in case of a failure. If, for example, I have a release pipeline with 30 tasks, I would want to handle skipping the tasks that were already completed, but once I reach the relevant task, I need the persisted variable values.
I have looked online and I see it isn't possible to persist variables across phases, but does that also mean they cannot be persisted in the same release pipeline if I redeploy it?
From searching Stack Exchange and Google I got to the following GitHub issue on the subject; I just wasn't sure if it also affects my situation in the same way.
https://github.com/Microsoft/azure-pipelines-tasks/issues/4743
You have that by default, unless I am misinterpreting you. When redeploying the same release, the pipeline variable values you defined (in the pipeline) do not change.
Calculated values, however, are not persisted: anything a task sets at runtime is lost between deployments.
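For example, a value calculated by an inline PowerShell task with the Azure DevOps logging command falls into that category (the variable name here is illustrative):

    # BuildLabel is visible to subsequent tasks in the same deployment, but it
    # is recomputed from scratch if the release is redeployed.
    $label = "deploy-$(Get-Date -Format yyyyMMddHHmmss)"
    Write-Host "##vso[task.setvariable variable=BuildLabel]$label"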
How can I run multiple instances of a Talend job at the same time with different context groups?
You can't run the same Talend Job with multiple context groups, as a group is a collection of context variables that is assigned to a Job.
You can, however, run multiple instances of the same Job in different contexts by passing the context at runtime.
e.g. --context=Prod
This assumes that you have considered all other conflicts that may occur, for example, directories and files that the job may use.
I would suggest, if you have not already done this, that you externalise your context values so that, when you pass your context at runtime, the values are dynamically loaded and you can have different values for different contexts.
Once your job is built as a jar, you can run multiple instances at the same time.
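For example (a minimal sketch, assuming a built job named MyJob exported with Talend's generated launcher script; --context and --context_param are standard Talend runtime options, while the paths and parameter names are illustrative):

    # Two instances of the same built job, each running with its own context:
    .\MyJob\MyJob_run.bat --context=Prod --context_param dbHost=prod-db
    .\MyJob\MyJob_run.bat --context=Dev --context_param dbHost=dev-db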
I would like to update some of the parameters of an ADF pipeline (e.g. the concurrency level) across lots of mappings. I am not able to find any cmdlet that lets me do this through PowerShell.

I know I can drop the existing pipeline and create a new one, but that will start reprocessing all the Ready slices for that pipeline's active period, which I don't want, because it would mean working out up to what point the existing pipeline has already processed slices. And the change is only temporary; at some stage I am going to revert the settings. I just want the pipeline to change one of its properties.

Doing this manually through the UI is slow and tedious. I am guessing there is no way around this, but let me know if you know of one.
You can still use "New-AzureRmDataFactoryPipeline" for this Update scenario:
https://msdn.microsoft.com/en-us/library/mt619358.aspx
Use it with the -Force parameter to force it to proceed even if the message reads "... may overwrite the existing resource".
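A minimal sketch (the resource group, data factory name, and JSON file path are placeholders):

    # Performs the HTTP PUT that updates the existing pipeline in place,
    # instead of stopping at the "may overwrite the existing resource" prompt.
    New-AzureRmDataFactoryPipeline `
        -ResourceGroupName "MyResourceGroup" `
        -DataFactoryName "MyDataFactory" `
        -File ".\MyPipeline.json" `
        -Force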
Under the hood, it's the same HTTP PUT API call used by the Azure portal. You can verify that with Fiddler.
The already executed slices won't be re-run unless you set their status back to PendingExecution.
This rule applies to LinkedService and Dataset as well, but NOT to the top-level DataFactory resource. A New-AzureRmDataFactory will cause the service to delete the existing DF along with all its sub-resources and create a brand new one, so be careful there.
The result of SqlWorkflowInstanceStore.WaitForEvents does not tell me what type of workflow is runnable. The constructor of WorkflowApplication takes a workflow definition, and at a minimum, I need to be able to store a workflow ID in the store and query it, so that I can determine which workflow definition to load for the WorkflowApplication.
I also don't want to create a SqlWorkflowInstanceStore for each custom workflow type, since there may be thousands of different workflows.
I thought about trying to use WorkflowServiceHost, but not every workflow has a Receive activity and I don't think it is feasible to have thousands of WorkflowServiceHosts running, each supporting a different workflow type.
Ideally, I just want to query the database for a runnable workflow, determine its workflow definition ID, load the appropriate XAML from a workflow definition table, instantiate WorkflowApplication with the workflow definition, and call LoadRunnableInstance().
I would like to have a way to correlate which workflow is related to a given HasRunnableWorkflowEvent raised by the SqlWorkflowInstanceStore (along with the custom workflow definition ID), or have an alternate way of supporting potentially thousands of different custom workflow types created at runtime. I must also load balance the execution of workflows across multiple application servers.
There's a free product from Microsoft that does pretty much everything you say there, and then some. Oh, and it's excellent too.
Windows Server AppFabric. No, not Azure.
http://www.microsoft.com/windowsserver2008/en/us/app-main.aspx
-Oisin