AWS ECS does not update a container with the latest source code from ECR

I have an ECS service with an EC2 launch-type task running on an EC2 instance. I am building and pushing a Docker image to ECR using GitHub Actions.
The ECS task has been updated with the latest ECR image, but the problem is that the running container does not reflect the latest source code.
When I pull the latest Docker image from ECR to my local machine and run it, it works correctly with the updated code. On ECS, it does not.
Can anyone explain why this is happening and what the solution is?

I resolved it. The cause was that I had mounted the app root onto an EFS volume, so the old code stored on EFS was shadowing the new code inside the container, even though the ECS agent had pulled the new Docker image from ECR.
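For illustration, a minimal sketch of the problematic task definition: an EFS volume mounted over the application root hides whatever code is baked into the image at that path (the family, paths, names, and IDs here are all hypothetical):
{
  "family": "my-app",
  "containerDefinitions": [
    {
      "name": "app",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest",
      "memory": 512,
      "essential": true,
      "mountPoints": [
        {
          "sourceVolume": "app-efs",
          "containerPath": "/var/www/app"
        }
      ]
    }
  ],
  "volumes": [
    {
      "name": "app-efs",
      "efsVolumeConfiguration": {
        "fileSystemId": "fs-12345678"
      }
    }
  ]
}
Mounting EFS at a narrower data path (for example /var/www/app/uploads) instead of the whole app root lets the image's code show through while still persisting user data.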

Related

Kubernetes: how to make a Deployment update its image automatically (CI/CD)

I am using GCP and Kubernetes.
I have a GCP source repository and a container registry.
I have a trigger that builds the container after a push to the master branch.
I don't know how to set up an automatic trigger that deploys the new version of the container.
How can I automate the deployment process?
You need some extra pieces to do this; for example, if you use Helm to package your deployment, you can use Flux to trigger automated deployments.
https://helm.sh/
https://fluxcd.github.io/flux/
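As a rough sketch of how Flux (v1) automation looks on a plain manifest (Flux can watch unpackaged manifests as well as Helm releases): annotate the workload and Flux updates it whenever a matching image tag appears in the registry. The image name and tag filter below are placeholders, and the annotation names follow the Flux v1 docs as best I recall:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
  annotations:
    fluxcd.io/automated: "true"
    fluxcd.io/tag.app: glob:master-*
spec:
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: app
          image: gcr.io/my-project/my-app:master-abc1234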
There are two solutions here.
You can expand the build step: Cloud Build can also push changes to your GKE cluster (see the Cloud Build documentation on deploying to GKE).
What you currently have is a solid CI pipeline; for the CD part, you can use Spinnaker for GCP, which was released recently. It integrates well with GCE, GKE, and GAE and allows you to automate the CD portion.
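A minimal sketch of the first option's cloudbuild.yaml, assuming an existing Deployment named my-app in a cluster named my-cluster (the names and zone are placeholders):
steps:
  # Build and push an image tagged with the commit SHA
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA', '.']
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA']
  # Point the running Deployment at the freshly pushed image
  - name: 'gcr.io/cloud-builders/kubectl'
    args: ['set', 'image', 'deployment/my-app', 'my-app=gcr.io/$PROJECT_ID/my-app:$SHORT_SHA']
    env:
      - 'CLOUDSDK_COMPUTE_ZONE=us-central1-a'
      - 'CLOUDSDK_CONTAINER_CLUSTER=my-cluster'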

SBT deploy image to ECR and then use the image in Elastic Beanstalk

I cannot for the life of me figure out how to associate an image from the Elastic Container Registry with an Elastic Beanstalk environment. I'm modifying a project that used to upload the image to S3, and I'm changing it so that it stores the image in ECR instead.
The registry for the image and the EB environment are both currently set up with Terraform. Previously the S3 bucket for the image was set in the Docker settings of my build.sbt file, but I cannot find an equivalent for specifying the name of the ECR repository. I am able to successfully deploy my Docker image to ECR using the sbt-ecr plugin, but I cannot find anything that would then let Elastic Beanstalk know where this image is.
This is my first time using EB so I might just be misunderstanding how it works, but I honestly can't see any link between the code and EB. The deployment pipeline previously just ran
sbt release with-defaults
and then
aws elasticbeanstalk update-environment --environment-name <env-name> --version-label <version>
How am I supposed to tell it to look for the image in the ECR repository? I don't even understand how it was working before, let alone how to get it to work now. All of the EB plugins I've found just deploy a local image to EB rather than referencing one that is already deployed somewhere. When I google the general issue I keep seeing advice to add things to my Dockerrun.aws.json file, but I guess that's something sbt has been building for me, as I've never seen it before.
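From what I can tell from the docs, a minimal single-container Dockerrun.aws.json pointing at an ECR image would look something like this (the account ID, region, and repository name are placeholders):
{
  "AWSEBDockerrunVersion": "1",
  "Image": {
    "Name": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-repo:latest",
    "Update": "true"
  },
  "Ports": [
    {
      "ContainerPort": 80
    }
  ]
}
Apparently the environment's EC2 instance profile also needs ECR read permissions (for example the AmazonEC2ContainerRegistryReadOnly managed policy) for the pull to succeed.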
Any help would be greatly appreciated.

AWS CodeBuild buildspec.yml example for GitHub

I am trying to use AWS CodeBuild to build my code from GitHub. These are the steps I have followed so far:
1) Created a Windows Docker image with all the prerequisite software needed (git, npm, Node.js, etc.) and pushed it to Amazon ECR.
2) Created a project in AWS CodeBuild using
a) GitHub as the source (what to build)
b) the Docker image created in step 1 (how to build)
I set up buildspec.yml as below:
version: 0.2
#env:
#  variables:
#  parameter-store:
phases:
  #install:
  #pre_build:
  build:
    commands:
      - git clone https://github.com/OrgName/RepName.git "c:\www\localfolder"
  #post_build:
#artifacts:
#  files:
But it always fails during the DOWNLOAD_SOURCE step, saying "CodeBuild is experiencing issues".
Please suggest how to set up buildspec.yml to clone/fetch/checkout from GitHub.
Thanks.
The issue you encountered may not be related to a git clone/fetch/checkout failure. The build can also fail at the DOWNLOAD_SOURCE step if CodeBuild fails or times out while pulling the Windows Docker image, especially when the image is large.
Workarounds you can try:
1) Use the Windows image provided by CodeBuild and install the prerequisite software during the install phase (you will need to update your buildspec.yml); see the sketch after this list.
OR
2) Use a BUILD_GENERAL1_LARGE compute type; you may also need to increase the build timeout.
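A minimal sketch of the first workaround, assuming Chocolatey is available on the image (if it is not, it would need to be installed first; the packages shown are examples):
version: 0.2
phases:
  install:
    commands:
      - choco install -y git nodejs
  build:
    commands:
      - git clone https://github.com/OrgName/RepName.git "c:\www\localfolder"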

Automated deployment of Docker images for a PHP application

I am working on an automated Azure build for a Docker application.
I need to connect to the container registry, pull the images from it, and push them to a Docker Swarm resource deployed in Azure.
Can you please suggest the steps?
I need to automate this using a PowerShell script.
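For reference, the core commands such a PowerShell script would wrap look something like this (the registry, image, and service names are placeholders, and docker service update must run against a Swarm manager node):
# Authenticate against the registry (credentials come from environment variables here)
docker login myregistry.azurecr.io -u $env:ACR_USER -p $env:ACR_PASS
# Pull the freshly built image
docker pull myregistry.azurecr.io/my-php-app:latest
# Roll the Swarm service over to the new image
docker service update --with-registry-auth --image myregistry.azurecr.io/my-php-app:latest my-php-service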

Can you share Docker Images uploaded to Google Container Registry between different accounts?

We'd like to have separate test and prod projects on the Google Cloud Platform, but we want to reuse the same Docker images in both environments. Is it possible for the Kubernetes cluster running in the test project to use images pushed to the prod project? If so, how?
Looking at your question, I believe by account you mean project.
The command for pulling an image from the registry is:
$ gcloud docker pull gcr.io/your-project-id/example-image
This means that as long as your account is a member of the project that the image belongs to, you can pull the image from that project into any other project your account is a member of.
Yes, it's possible, since container images are scoped to a project's registry rather than to any particular cluster.
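One way to wire this up, assuming GCR's convention of backing each project's registry with a Cloud Storage bucket named artifacts.<project-id>.appspot.com (the project IDs and service-account email below are placeholders): grant the test cluster's node service account read access to the prod project's registry bucket:
# Allow the test project's nodes to read images from the prod project's registry
gsutil iam ch serviceAccount:test-nodes@test-project.iam.gserviceaccount.com:objectViewer gs://artifacts.prod-project.appspot.com
# Pods in the test cluster can then reference the prod image directly,
# e.g. image: gcr.io/prod-project/example-image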