How to update a Dataflow pipeline (add a node to an existing pipeline) - dataflow

I added a node to an existing pipeline and tried to update the running job in Dataflow using the --update command, but it was unsuccessful. How do I add a node to an existing pipeline?

Related

How to add a new container to an existing task definition using CloudFormation?

I have set up CI/CD using CodePipeline and CodeBuild to deploy my application to an ECS Fargate container. This is the setup for the frontend of my app.
For the backend I have another, identical CI/CD pipeline. But this time, I want to deploy my backend to the same ECS Fargate container using CloudFormation. I know I have to update the task definition.
How can I add the backend container of my app to the existing task definition that we already use for the frontend, without affecting the frontend container?
Is there any workaround for this?
You can't do that. Task definitions are immutable. You can only create a new revision of a task definition and deploy that new revision; you can't change an existing revision. From the docs:
A task definition revision is a copy of the current task definition with the new parameter values replacing the existing ones. All parameters that you do not modify are in the new revision.
To update a task definition, create a task definition revision. If the task definition is used in a service, you must update that service to use the updated task definition.
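In CLI terms, the flow above looks roughly like this. It is only a sketch: the family, container names, and images are placeholders, and the aws calls are shown as comments so the JSON manipulation can be demonstrated against a local stand-in file.

```shell
# Stand-in for the container list of the current (frontend-only) revision;
# in practice you would pull it with:
#   aws ecs describe-task-definition --task-definition my-app \
#     --query 'taskDefinition.containerDefinitions' > current-containers.json
cat > current-containers.json <<'EOF'
[{"name": "frontend", "image": "frontend:1.0", "memory": 512, "essential": true}]
EOF

# Append the backend container; the frontend entry is left untouched.
jq '. + [{"name": "backend", "image": "backend:1.0", "memory": 512, "essential": true}]' \
   current-containers.json > new-containers.json

# Registering under the SAME family creates revision N+1 (the old revision
# is immutable and stays deployable), then the service is pointed at it:
#   aws ecs register-task-definition --family my-app \
#     --container-definitions file://new-containers.json
#   aws ecs update-service --cluster my-cluster --service my-service \
#     --task-definition my-app
cat new-containers.json
```

The same idea applies in CloudFormation: changing the TaskDefinition resource in your template makes CloudFormation register a new revision rather than mutating the old one.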

AWS CLI - Create New Revision of Task Definition

In the AWS ECS console I can create a new revision of a task definition:
I go to Task Definitions -> select my task definition -> select my revision -> click Create new revision.
With the console, the container definition properties are copied across from the old revision to the new revision.
With the AWS CLI, how do I copy the container definition across from the old revision to the new revision? Is there a simple CLI command I can use without having to manually extract properties from the old definition and then create the new definition?
This is my AWS CLI solution so far:
I'm getting the image with:
aws ecr describe-images ...
And the container definition with:
aws ecs describe-task-definition ...
I'm then extracting the container definition properties, placing them in a json string $CONTAINER_DEFINITION and then creating a new revision with:
aws ecs register-task-definition --family $TASK_DEFINITION --container-definitions $CONTAINER_DEFINITION
When I check the UI, the old revision's container definition properties are not copied across to the new revision's container definition.
I expected the container definition properties to be copied across from the old revision to the new revision, as that would be the same behaviour as AWS UI.
I am trying to do exactly the same - create a new revision of an existing task definition using an updated container version. I suspect your approach is registering an entirely new task definition, rather than creating an incremental version of an existing one.
Update: I managed to get this working using PowerShell and the AWS CLI. The PS commands below read the task definition, edit the container version in the container definitions, then get the container defs as JSON and pass them back into the register command.
$taskDef = aws ecs describe-task-definition --task-definition <task definition name> --region=eu-west-1 | ConvertFrom-Json
$taskDef.taskDefinition.containerDefinitions[0].image = "<container>:<version>"
$containerDefinitions = $taskDef.taskDefinition.containerDefinitions | ConvertTo-Json -Depth 10
aws ecs register-task-definition --family "<task definition name>" --container-definitions $containerDefinitions --memory 8192
The trick to generating a revision (rather than a new task definition) appeared to be the family parameter, which is set to the existing task definition name. I'm not sure why it required the memory parameter, but it worked.
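For anyone on Bash rather than PowerShell, the same flow can be sketched with jq. The task family and image below are placeholders, and the aws calls are commented out so the transformation itself runs against a local stand-in:

```shell
TASK_FAMILY="my-task"
NEW_IMAGE="my-repo/app:v2"

# Stand-in for the output of:
#   aws ecs describe-task-definition --task-definition "$TASK_FAMILY" \
#     --query 'taskDefinition' > taskdef.json
cat > taskdef.json <<'EOF'
{"family": "my-task", "containerDefinitions": [{"name": "app", "image": "my-repo/app:v1", "memory": 512}]}
EOF

# Swap the image on the first container and keep every other property,
# mirroring what the console's "Create new revision" does.
jq --arg img "$NEW_IMAGE" \
   '.containerDefinitions[0].image = $img | .containerDefinitions' \
   taskdef.json > containerdefs.json

# Registering with the SAME --family value produces a new revision of the
# existing definition rather than a brand-new one:
#   aws ecs register-task-definition --family "$TASK_FAMILY" \
#     --container-definitions file://containerdefs.json
cat containerdefs.json
```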

Run Kubectl DevOps task with run-time specified resource details

We're building out a release pipeline in Azure DevOps which pushes to a Kubernetes cluster. The first step in the pipeline is to run an Azure CLI script which sets up all the resources - this is an idempotent script so we can run it each time we run the release pipeline. Our intention is to have a standardised release pipeline which we can run against several clusters, existing and new.
The final step in the pipeline is to run the Kubectl task with the apply command.
However, this pipeline task requires specifying in advance (at the time of building the pipeline) the names of the resource group and cluster against which it should be executed. But the point of the idempotent script in the first step is to check that those resources exist and to create them if they don't.
So there's the possibility that neither the resource group nor the cluster will exist before the pipeline is run.
How can I achieve this in a DevOps pipeline if the Kubectl task requires a resource group and a cluster to be specified at design time?
The Kubectl task works with the service connection type Azure Resource Manager, and it requires you to fill in the Resource group and Kubernetes cluster fields after you select the Azure subscription.
After testing, we found that these two fields support variables. So you can use a variable in each of these fields, and use a PowerShell task to set the variable values before the Kubectl task runs. See Set variables in scripts for details.
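Concretely, a script step earlier in the pipeline can emit Azure DevOps logging commands to set those variables at run time; the Kubectl task's Resource group and Kubernetes cluster fields then simply reference $(resourceGroup) and $(clusterName). A minimal Bash sketch, where the names are placeholders for whatever your idempotent Azure CLI script created:

```shell
# Values your idempotent Azure CLI step discovered or created;
# the literals here are placeholders.
RESOURCE_GROUP="my-rg"
CLUSTER_NAME="my-aks-cluster"

# Azure DevOps "logging commands": when a script prints these lines,
# the agent registers pipeline variables that later tasks can consume
# as $(resourceGroup) and $(clusterName).
echo "##vso[task.setvariable variable=resourceGroup]$RESOURCE_GROUP"
echo "##vso[task.setvariable variable=clusterName]$CLUSTER_NAME"
```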

How to handle ECS deploys in CodePipeline for changes in the task definition

I am deploying an ECS Fargate task with two containers: 1 reverse proxy nginx and 1 python server. For each I have an ECR repository, and I have a CI/CD CodePipeline set up with
CodeCommit -> CodeBuild -> CodeDeploy
This flow works fine for simple code changes. But what if I want to add another container? I can certainly update my buildspec.yml to add the building of the container, but I also need to 1) update my task definition, and 2) assign this task definition to my service.
Questions:
1) If I use the CLI in my CodeBuild stage to create a new task definition and associate it with my service, won't this trigger a deploy? And then my CodeDeploy will try to deploy again, so I'll end up deploying twice?
2) This approach ends up creating a new task definition and updating the service on every single deploy. Is this bad? Should I have some logic to pull down the LATEST task revision and diff the JSON from CodeCommit version and only update if there is a difference?
Thanks!
CodePipeline's ECS job worker copies the task definition, updates the image and image tag for the containers specified in the imagedefinitions.json file, and then updates the ECS service with this new task definition. The job worker cannot add a new container to the task definition.
If I use the CLI in my CodeBuild stage to create a new task definition and associate it with my service, won't this trigger a deploy? And then my CodeDeploy will try to deploy again, so I'll end up deploying twice?
I don't think so. Is there a CloudWatch event rule that triggers CodeDeploy in such fashion?
This approach ends up creating a new task definition and updating the service on every single deploy. Is this bad? Should I have some logic to pull down the LATEST task revision and diff the JSON from CodeCommit version and only update if there is a difference?
The ECS Deploy Job worker creates a new task definition revision every time a deployment occurs so if this is official behaviour, I wouldn't consider it bad as such.
I would question why you need to add new containers to your task definition at deploy time. In general a task definition should change infrequently, with only the image:tag in it changing regularly; that is exactly what the ECS Deploy action helps you achieve.
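For reference, the file the ECS Deploy action consumes is a plain JSON array mapping container names to image URIs, usually written at the end of the buildspec. The container names must match the ones in the task definition; the registry URI and names below are placeholders:

```shell
# Placeholder registry and tag; in a real buildspec these typically come
# from environment variables and the build's resolved source version.
REPOSITORY_URI="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app"
IMAGE_TAG="latest"

# imagedefinitions.json: one entry per container whose image the ECS
# Deploy action should swap. The "name" must match the container name
# in the task definition; other task definition settings are copied over.
printf '[{"name":"nginx","imageUri":"%s"},{"name":"python-server","imageUri":"%s"}]\n' \
  "$REPOSITORY_URI/nginx:$IMAGE_TAG" "$REPOSITORY_URI/python:$IMAGE_TAG" \
  > imagedefinitions.json
cat imagedefinitions.json
```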

How to activate a particular node through java code / workflow in Day CQ

I want to activate only the modified node in my Day CQ project.
Node structure is:
Parent Node
  Child Node 1
  Child Node 2
  Child Node 3
Requirement is:
If I am creating any node under parent node workflow should run.
Workflow should activate only newly created child node and parent node and not all child nodes.
Is it possible? Please also share a solution in the comments.
This can be easily realised using the workflow console.
Open the workflow console and create a workflow model in the Models tab.
Then open the model and add the "Activate Page/Asset Step [com.day.cq.wcm.workflow.process.ActivatePageProcess]" to the model.
Save the model and switch to the Launcher tab.
Create a new launcher for the path of your parent node and select the event type "Created".