Building a Dockerfile in DevOps pipeline - ibm-cloud

I've put a Dockerfile in the root directory of my project. I know the Dockerfile works because I've built it locally. But when I try to build it in the pipeline, it says Dockerfile not found in project and aborts.
The documentation indicates that it should work so long as the Dockerfile is in the root directory. So I'm really confused.
Do I need to supply some additional information in the build job to point it to the Dockerfile?
Thanks.
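One thing to try is pointing the build at the Dockerfile explicitly in the build job's script. A minimal sketch, assuming a Container Registry-type build job whose script you can edit; the registry, namespace, and image name below are placeholders:

# Build from the job's working directory, naming the Dockerfile explicitly.
# us.icr.io/mynamespace/myapp is a placeholder image reference.
ibmcloud cr build --file Dockerfile --tag us.icr.io/mynamespace/myapp:latest .

If the pipeline checks the code out into a subdirectory, cd into it first so the Dockerfile sits in the build context.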

Related

Deploy a test site with an Azure DevOps pipeline (LAMP)

I've been looking at getting a pipeline build going for automated E2E acceptance testing.
I've been able to get the pipeline running on an agent. The default path that the code is checked out to is
/home/myusername/myagent/_work/1/s
which is $(Pipeline.Workspace). It cannot be changed to a custom path like /var/www/html/ and accepts only relative paths, as described here.
If I try to point the Apache webroot to the above directory, I get a Forbidden error. I tried:
- running Apache as the directory owner, as described here
- giving full 777 permissions to the directory
Now, how do I get Apache to serve those files, or check out the files inside the Apache webroot /var/www/html in the first place so no additional work is required?
OS: Ubuntu 18
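One common approach is to let the checkout stay under the agent's work folder and copy the files into the webroot in a later step. A minimal sketch, assuming a self-hosted Linux agent whose user has passwordless sudo; the rsync destination is the stock Apache webroot:

steps:
  - checkout: self
    path: s   # must be relative; it is resolved under $(Agent.BuildDirectory)

  - script: |
      # Sync the checked-out sources into the Apache webroot so no further
      # work is needed inside /var/www/html (assumes passwordless sudo).
      sudo rsync -a --delete "$(Build.SourcesDirectory)/" /var/www/html/
    displayName: Sync site into Apache webroot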

Something went wrong building your image. COPY failed: stat /var/lib/docker/tmp/docker-builder982586077/*/MyFamilyManager.API

I am trying to build and deploy an ASP.NET Core container app to a Heroku app using GitHub Actions. For some reason I am getting a COPY failed error. The same Dockerfile works fine locally.
Please find my Dockerfile and GitHub Actions workflow below.
Dockerfile, Workflow File, Action Logs
Based on your GitHub Actions workflow, the path you specified for the .csproj is incorrect.
However, without making any further changes to the Dockerfile, if you change your GitHub Actions workflow to set the correct working-directory: src and dockerfile: 'Services/Core/MyFamilyManager.API/', it will fix your problem.
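Purely as illustration, the workflow change might look like the sketch below; the action reference is a placeholder, and only the working-directory and dockerfile values are taken from the answer above:

- name: Build and deploy container
  uses: some-org/heroku-docker-deploy@v1             # placeholder action reference
  with:
    working-directory: src                           # value from the answer above
    dockerfile: 'Services/Core/MyFamilyManager.API/'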
It looks like there is no way to make it work with my solution folder structure. Docker is using the Dockerfile's folder as the build context here, so it cannot access my other parent folders.
I found a way around this by creating a docker-compose file at the root of my solution to build the container image.
I have created and published a GitHub Action. I hope this will help others with a similar issue.
https://github.com/marketplace/actions/deploy-multiple-docker-images-to-heroku-apps
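As a rough illustration of that docker-compose approach (the service name and image reference are hypothetical; the Dockerfile path matches the one quoted above):

# docker-compose.yml at the solution root: the whole solution becomes the
# build context, so the Dockerfile can COPY projects from parent folders.
version: "3.8"
services:
  myfamilymanager-api:                          # hypothetical service name
    build:
      context: .                                # solution root as build context
      dockerfile: Services/Core/MyFamilyManager.API/Dockerfile
    image: registry.heroku.com/my-app/web       # hypothetical image reference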

Azure pipelines, how to use a Dockerfile from an other repository?

I'm trying to build an Azure DevOps pipeline that uses a separate repository for Dockerfiles / templates. What's the cleanest way to use Dockerfiles from another repository?
We have experimented with having the template refer to the Dockerfile, but the build server does not seem to have access to that file path.
In Dockerfile repository:
steps:
  - script: docker build -f pathTo/Dockerfile .
In build repository:
steps:
  - template: Dockerfilerepository.yml
We want this to create a Docker build process inside the building repository, but we instead get this error message:
unable to prepare context: unable to evaluate symlinks in Dockerfile path: lstat /home/vsts/work/1/s/pipelines: no such file or directory
You'd need to check out that repo separately, and then you can use files from it. You can use a script step for that, something like this:
git clone https://x-access-token:$(github-access-token)@github.com/ORG/OTHER_PRIVATE_REPO.git
But it's probably not the best idea to keep Dockerfiles in a separate repo.
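Put together, a hedged sketch of that script step in the build repository's pipeline (ORG/OTHER_PRIVATE_REPO and pathTo/Dockerfile are the placeholders from above):

steps:
  - script: |
      # Clone the Dockerfile repository next to the sources, then build with
      # the current repository as the build context.
      git clone https://x-access-token:$(github-access-token)@github.com/ORG/OTHER_PRIVATE_REPO.git dockerfiles
      docker build -f dockerfiles/pathTo/Dockerfile -t myimage:$(Build.BuildId) .
    displayName: Build with external Dockerfile

Azure Pipelines can also declare the second repository under resources.repositories and fetch it with a checkout step, which avoids handling the token by hand.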

Gitlab Runner - New folder for each build

I'm using GitLab CI for my project. When I push to the develop branch, it runs the tests and updates the code on my test environment (a remote server).
But the GitLab runner always uses the same build folder: builds/a3ac64e9/0/myproject/myproject
I would like it to create a new folder every time:
builds/a3ac64e9/1/myproject/myproject
builds/a3ac64e9/2/myproject/myproject
builds/a3ac64e9/3/myproject/myproject
and so on
Using this, I could just update my website by changing a symbolic link to point at the latest build directory.
Is there a way to configure GitLab Runner this way?
While it doesn't make sense to use your build directory as your deployment directory, you can set up a custom build directory:
Open config.toml in a text editor (more info on where to find it here).
Set enabled = true under [runners.custom_build_dir] (more info here):
[runners.custom_build_dir]
enabled = true
In your .gitlab-ci.yml file, set GIT_CLONE_PATH under variables. It must start with $CI_BUILDS_DIR/, e.g. $CI_BUILDS_DIR/$CI_JOB_ID/$CI_PROJECT_NAME, which will probably give you what you're looking for, although if you have multiple stages, they will have different job IDs. Alternatively, you could try $CI_BUILDS_DIR/$CI_COMMIT_SHA, which would give you a unique folder for each commit. (More info here)
variables:
  GIT_CLONE_PATH: '$CI_BUILDS_DIR/$CI_JOB_ID/$CI_PROJECT_NAME'
Unfortunately there is currently an issue with using CI_BUILDS_DIR in GIT_CLONE_PATH if you're using Windows and PowerShell, so you may have to use a workaround like this if all your runners have the same build directory: GIT_CLONE_PATH: 'C:\GitLab-Runner/builds/$CI_JOB_ID/$CI_PROJECT_NAME'
You may want to take a look at the variables available to you (predefined variables) to find the most suitable variables for your path.
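With the clone landing in a per-job folder, the symlink flip from the question could then be a deploy step on the shell runner. A minimal sketch, where /var/www/mysite is a hypothetical symlink your web server points at:

deploy:
  stage: deploy
  script:
    # $CI_PROJECT_DIR resolves to the per-job clone path set by GIT_CLONE_PATH.
    - ln -sfn "$CI_PROJECT_DIR" /var/www/mysite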
You might want to read the following answer: Changing the build intermediate paths for gitlab-runner.
I'll repost my answer here:
Conceptually, this approach is not the way to go; the build directory is not a deployment directory, it's a temporary directory to build from or to deploy from, even though on a shell executor it may stay fixed.
So what you need is to deploy from that directory to the correct deployment directory with a script, as in the .gitlab-ci.yml below.
stages:
  - deploy

variables:
  TARGET_DIR: /home/ab12/public_html/$CI_PROJECT_NAME

deploy:
  stage: deploy
  script:
    - mkdir -pv $TARGET_DIR
    - rsync -r --delete ./ $TARGET_DIR
  tags:
    - myrunner
This will sync your project files into /home/ab12/public_html/$CI_PROJECT_NAME.
With your projects named project1 .. projectn, all of them could use this same .gitlab-ci.yml file.
You cannot achieve this with GitLab CI runner configuration alone, but you can create two runners and assign each one exclusively to a branch by using a combination of the only and tags keywords.
Assuming your two branches are named master and develop, and the two runners have been tagged with master_runner and develop_runner, your .gitlab-ci.yml can look like this:
master_job:
  <<: *your_job
  only:
    - master
  tags:
    - master_runner

develop_job:
  <<: *your_job
  only:
    - develop
  tags:
    - develop_runner
(<<: *your_job merges in your actual job definition, which you can factor out as a YAML anchor; a sketch follows below)
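For completeness, a sketch of how that shared job could be factored out with a hidden job and a YAML anchor (the script body is a placeholder):

.your_job: &your_job
  stage: deploy
  script:
    - ./deploy.sh   # placeholder for your actual build/deploy commands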

Is there a way to "pull" and "up" with docker-compose without creating build folders in a testing environment?

So, I have a docker-compose file that has a build section in each service. In development, docker-compose up works OK. In the test environment, I want to docker-compose pull and docker-compose up the images, and it works OK, except that it needs the folders referenced by each build section to exist on the testing server.
Is that really necessary, or is there a way to pull and up the containers without creating the build folders on the testing server?
There is docker-compose up --no-build.
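So on the testing server, something like this should start the stack without the build contexts being present:

docker-compose pull               # fetch the prebuilt images from the registry
docker-compose up -d --no-build   # start containers without attempting a build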