How can CodePipeline run a DB migration to RDS with ECS - amazon-ecs

I set up CodePipeline to deploy a container to ECS.
Now I'm trying to run a DB migration task via a Deploy step in CodePipeline.
I can run a one-off task manually from the Task Definition.
But CodePipeline cannot run a one-off task.
Any ideas? Do I need other services?
Best regards, Takuto.

A common solution for "one-off" tasks is to run them as a separate action before you deploy. For example, you might have a separate CodeBuild action run the migrations first, then run your fleet deployment.
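For illustration, here is a minimal CodeBuild buildspec sketch of such a migration action. It launches the existing one-off ECS task (the one you can already run manually) and fails the pipeline if the migration exits non-zero. The cluster, task definition, subnet, and security group names are all hypothetical, and the CodeBuild service role is assumed to have the needed `ecs:RunTask`, `ecs:DescribeTasks`, and `iam:PassRole` permissions:

```yaml
version: 0.2

phases:
  build:
    commands:
      # Launch the one-off migration task (hypothetical names throughout).
      - |
        TASK_ARN=$(aws ecs run-task \
          --cluster my-cluster \
          --task-definition my-migration-task \
          --launch-type FARGATE \
          --network-configuration "awsvpcConfiguration={subnets=[subnet-abc123],securityGroups=[sg-abc123],assignPublicIp=DISABLED}" \
          --query 'tasks[0].taskArn' --output text)
      # Block until the task stops, then fail this action if it exited
      # non-zero, so a failed migration halts the pipeline before the
      # fleet deployment action runs.
      - aws ecs wait tasks-stopped --cluster my-cluster --tasks "$TASK_ARN"
      - |
        EXIT_CODE=$(aws ecs describe-tasks --cluster my-cluster --tasks "$TASK_ARN" \
          --query 'tasks[0].containers[0].exitCode' --output text)
        test "$EXIT_CODE" = "0"
```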

Related

Is there a way to deploy all the CloudFormation templates from a GitLab repository using a GitLab pipeline in AWS as a single stack?

I'm looking for an option to pick up all the templates from the repository without hardcoding the YAML template file names, so that if new templates are added in the future, the pipeline automatically picks them all up and deploys them to create a single stack in the AWS environment, without any modification to the gitlab-ci.yml/pipeline file.
I tried using the deploy CLI command; it deploys all the templates, but each deploy then updates the stack and starts deleting the previous resources one by one, so only the last template's resources are available after the pipeline execution is complete.
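For reference, that behaviour matches what a loop like this hypothetical .gitlab-ci.yml fragment produces: each `aws cloudformation deploy` call replaces the whole template of the same stack, so every iteration removes the previous template's resources and only the last template survives.

```yaml
# Hypothetical sketch of the looping approach described above.
deploy:
  image:
    name: amazon/aws-cli
    entrypoint: [""]   # override the image's `aws` entrypoint for GitLab CI
  script:
    # Each iteration replaces the whole template of the SAME stack,
    # so only the resources from the last template remain afterwards.
    - |
      for template in templates/*.yml; do
        aws cloudformation deploy \
          --template-file "$template" \
          --stack-name my-single-stack \
          --capabilities CAPABILITY_NAMED_IAM
      done
```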
Is there an option to do this?

How to run a script on AWS through a Jenkins build

I am new to Jenkins and AWS. I have MongoDB scripts on an AWS EC2 instance. The first script needs to run before the Jenkins build and stores a snapshot of the DB. The second script needs to run post-build to restore that snapshot. The scripts are done and ready to be used. I just couldn't find an exact way to reach AWS from the build and implement this in a Jenkins job. Any help would be appreciated. Thanks
You can use Jenkins stages to do the pre-build operation, the build, and then the post-build operations. Within those stages you can use a plugin like SSH Pipeline Steps to remotely execute commands on your EC2 instance.
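A minimal Jenkinsfile sketch of that approach, assuming the SSH Pipeline Steps plugin is installed and an SSH private-key credential with ID `ec2-ssh-key` exists in Jenkins; the host name and script paths are hypothetical:

```groovy
// Hypothetical sketch: snapshot the DB before the build, restore it afterwards.
def remoteFor(user, keyFile) {
    // Target EC2 host is hypothetical; allowAnyHosts skips host-key checking.
    return [name: 'mongo-ec2', host: 'ec2-12-34-56-78.compute-1.amazonaws.com',
            user: user, identityFile: keyFile, allowAnyHosts: true]
}

pipeline {
    agent any
    stages {
        stage('Snapshot DB') {
            steps {
                withCredentials([sshUserPrivateKey(credentialsId: 'ec2-ssh-key',
                        keyFileVariable: 'KEYFILE', usernameVariable: 'SSH_USER')]) {
                    script {
                        sshCommand remote: remoteFor(env.SSH_USER, env.KEYFILE),
                                   command: '/opt/scripts/snapshot-db.sh'
                    }
                }
            }
        }
        stage('Build') {
            steps {
                sh './build.sh'  // your actual build goes here
            }
        }
    }
    post {
        always {
            // Restore the snapshot after the build, whether it passed or failed.
            withCredentials([sshUserPrivateKey(credentialsId: 'ec2-ssh-key',
                    keyFileVariable: 'KEYFILE', usernameVariable: 'SSH_USER')]) {
                script {
                    sshCommand remote: remoteFor(env.SSH_USER, env.KEYFILE),
                               command: '/opt/scripts/restore-db.sh'
                }
            }
        }
    }
}
```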

Azure Pipeline Parallelism option

I am new to Azure Pipelines; I've started learning and am in the process of creating my very first YAML pipeline.
My project is private. I am using a multi-stage templated pipeline on self-hosted agents, as I need to concurrently deploy a Java web application to 7 VMs using the mvn tomcat7 plugin's run command, so as to run Selenium automation tests in parallel across all the VMs. The template pipeline, which is called 7 times to deploy to all the VMs, needs to stay running because of the embedded Tomcat instance on each of the VMs; that in turn requires parallelism to be enabled, which I would have to pay extra for.
My question is: is there another way, without having to pay extra for parallelism or turning my project into a public one?
I think what you want is parallel jobs. Only jobs can execute publish tasks in parallel.
And from this document, you can use parallel jobs for free when you change your project to public, and each job can run for up to 360 minutes (6 hours).
What you need to do is: under Project Settings --> Overview, change Visibility to Public.
After that, under the pipeline, add the publish task for each new agent job, so that you can execute the publish tasks in parallel.
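As a rough sketch of that layout (the file names, the pool name, and the VM names are all hypothetical): the template defines one deploy job, and the pipeline calls it once per VM. Jobs in the same stage run in parallel when enough parallel jobs and agents are available.

```yaml
# deploy-vm.yml -- one deploy job per VM.
parameters:
  - name: vmName
    type: string

jobs:
  - job: Deploy_${{ parameters.vmName }}
    pool: SelfHosted          # hypothetical self-hosted agent pool name
    steps:
      - script: mvn tomcat7:run
        displayName: Deploy and keep Tomcat running on ${{ parameters.vmName }}
```

```yaml
# azure-pipelines.yml -- call the template once per VM.
stages:
  - stage: DeployAndTest
    jobs:
      - template: deploy-vm.yml
        parameters:
          vmName: vm1
      - template: deploy-vm.yml
        parameters:
          vmName: vm2
      # ...repeat for vm3 through vm7
```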

How to deploy an ASP.NET docker containerized application to an on-premise server, using Azure CI/CD Pipelines?

I have a multi-layer ASP.NET application running. Due to its multi-layer nature, I have to build a container for it and deploy it as a container.
Is there any way I can deploy it to an existing server using Azure Pipelines?
All the other guidance I'm finding online is related to deploying to Azure App Services; however, I would like to deploy to an existing production environment.
Since you are deploying to a local environment, you can use a Self-hosted Agent (build pipeline and release pipeline) or a Deployment Group (release pipeline).
Then you could try a pipeline setup along the lines sketched below.
Here is a blog about ASP.NET application deployment in Docker for Windows.
You could use the Command Line task to run the docker commands. In this way, you can move the local build and deploy process into Azure DevOps.
By the way, if you have a container registry service connection in Azure DevOps, you could use the Docker task or the Docker Compose task.
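A minimal sketch of such a pipeline, assuming a Linux self-hosted agent with Docker installed, running on (or able to reach) the target server, plus a registry service connection named `my-registry`; the image name, port, and registry host are all hypothetical:

```yaml
trigger:
  - main

pool: OnPremPool   # hypothetical self-hosted agent pool on the target server

steps:
  - task: Docker@2
    displayName: Build and push image
    inputs:
      containerRegistry: my-registry
      repository: myapp
      command: buildAndPush
      Dockerfile: '**/Dockerfile'
      tags: |
        $(Build.BuildId)

  - script: |
      # Replace the running container with the freshly built image.
      docker pull myregistry.example.com/myapp:$(Build.BuildId)
      docker stop myapp || true
      docker rm myapp || true
      docker run -d --name myapp -p 80:80 myregistry.example.com/myapp:$(Build.BuildId)
    displayName: Redeploy container on this server
```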

serverless deploy: Stop watching after CloudFormation has the update

I'm using Bitbucket Pipelines to do CD for a Serverless app. I want to use as few "build minutes" as possible for each deployment. The lifecycle of the serverless deploy command, when using AWS as the backing, seems to be:
1. Push the package to CloudFormation. (~60 seconds)
2. Sit around watching the logs from CloudFormation until the deployment finishes. (~20-30 minutes)
Because of the huge time difference, I don't want to do step two. So my question is simple: how do I deploy a serverless app such that it only does step one and returns success or failure based on whether or not CloudFormation successfully accepted the new package?
I've looked at the docs for serverless deploy and I can't see any option to enable that. Also, there are AWS-specific options in the serverless deploy command already, so maybe this is an option the Serverless team would consider if there is no other way to do this.
N.B. As for "how will you know if CloudFormation fails?": I would rather set up notifications to come from CloudFormation directly. The build can just have the responsibility of pushing to CloudFormation.
I don't think you can do it with serverless deploy. You can try the serverless package command, which will store the package in the .serverless folder, or you can specify the path using --package. Packaging will create a CloudFormation template file, e.g. cloudformation-template-update-stack.json. You can then call the CreateStack API action to create the stack. It will return the stack ID without waiting for all the resources to be created.
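A rough bitbucket-pipelines.yml sketch of that idea, assuming a build image that provides both Node.js and the AWS CLI, with serverless as a devDependency, and assuming the packaged artifacts end up wherever the generated template expects them; the stack name and capabilities are hypothetical:

```yaml
pipelines:
  branches:
    main:
      - step:
          name: Package and hand off to CloudFormation
          script:
            - npm ci
            - npx serverless package --package .serverless
            # create-stack returns as soon as CloudFormation accepts the
            # template; it does not wait for the resources to be created,
            # so the build finishes after step one.
            - >
              aws cloudformation create-stack
              --stack-name my-serverless-app
              --template-body file://.serverless/cloudformation-template-update-stack.json
              --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM
```

For subsequent deployments you would call update-stack instead, since create-stack fails if the stack already exists.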