How to synchronize version control system revisions with Docker image versions?

What is a common, conventional pattern for associating Docker image versions in a Docker registry with revisions in Git (for example), so that a user of the repository can switch to any revision and recreate the environment that corresponds to it?
For example, I have a repository with a project and an integration server which automatically builds Docker images for every commit in the repository. How can these images be tied to the corresponding repository commits?
With Docker you have two different repositories, one with the project and one with the Docker images, so they must be kept in sync. What are the recommended workflows for this?

Docker Hub supports two types of repos, and you haven't said which kind you are using. In both cases, I would suggest you put the Git revision or tag name in the version portion of the image name, e.g. account/repo:version.
Manual Repository (docker push)
For manual builds, you should docker tag each version with the git revision. This is the recommended method because it gives you the most control and can be automated on your build system.
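For example, a build script might tag each image with the commit hash it was built from (the account/repo image name below is a placeholder):

REV=$(git rev-parse --short HEAD)                  # short hash of the commit being built
docker build -t account/repo:latest .
docker tag account/repo:latest account/repo:$REV   # tie the image to the revision
docker push account/repo:$REV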
Automated Build Repository
For automated builds on the Docker Hub, you will need to manually create a new entry for each revision or tag you want built. This requires going to the Hub web UI and creating a new build configuration for your repo. There is no API at this time which allows you to change build settings on your repo, nor does the automated build system automatically add Docker Hub tags when you add tags to your GitHub repo.
Note that at this time (2014-10-07) there is a bug where automated builds do not actually build from tags in your source code repository; they build from the head of the master branch. The status is shown on https://status.docker.com
To add a build configuration to an automated build, open your repo's build settings in the Hub web UI and add a new row mapping a Git branch or tag name to the Docker tag it should produce.

Related

How to link folder / GitHub repository to Heroku?

I am hosting a Discord bot on Heroku so it stays live 24/7. I have the code locally on my computer and update it by running the commands below. My only question is: how can I access the code on another computer so I can work away from home?
git add .
git commit -am "making it better"
git push heroku master
There isn't a way to "make the folder a GitHub link". Heroku builds your application and its runtime into a slug and this slug is what runs on your dynos. There is no way to update the code you're running without building a new slug.
However, you can deploy directly from GitHub, either manually or automatically when new commits are added to a branch. I strongly recommend having a good test suite in either case, but this is especially important if you want to do automatic deployments.
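A minimal sketch of that workflow, assuming you create a GitHub repository for the bot (the URL below is a placeholder):

# On the home computer: add GitHub as a second remote and push
git remote add origin https://github.com/your-user/discord-bot.git
git push -u origin master

# On any other computer: clone the repo and keep working
git clone https://github.com/your-user/discord-bot.git
cd discord-bot
# ...edit, commit, and push; with automatic deploys enabled,
# Heroku rebuilds the slug from the connected branch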

How do I handle a large number of files as an input to a build when using VSTS?

To set expectations, I'm new to build tooling. We're currently using a hosted agent but we're open to other options.
We've got a local application that kicks off a build using the VSTS API. The hosted build tasks involve a Get sources step that clones a GitHub repo to the file system in VSO. The next steps are copying over a large number of files (upwards of 10,000), building the solution, and running the tests.
The problem is that the cloned GitHub repo is in the file system in Visual Studio Online, while my 10,000 input files are on a local machine. Copying them for every build seems like a bit much, especially since we plan on doing CI and may have many builds kicked off per day.
What is the best way to move the input files into the cloned repo so that we can build? Should we be using a hosted agent for this, or is it best to do this on our local system? I've looked in the VSO docs but haven't found an answer. I'm not sure if I'm asking the right questions here.
There are a few ways to handle this situation; follow the one that is closest to yours.
Option 1. Add the large files to the GitHub repo
If the local files are only related to the code in the GitHub repo, you should add them to the same repo so that all the required files are cloned in the Get sources step; then you can build directly without a copy-files step.
Option 2. Manage the large files in another Git repo, then add that repo as a submodule of the GitHub repo
If the local large files are also used by other code, you can manage them in a separate repo and attach it as a submodule of the GitHub repo with git submodule add <URL for the separate repo>. In your VSTS build definition, select Checkout submodules in the Get sources step. The large files can then be used directly when you build the GitHub code.
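For example (the submodule URL and path here are placeholders):

# In the main repo: attach the large-files repo as a submodule
git submodule add https://github.com/your-org/large-files.git large-files
git commit -m "Add large-files submodule"
git push
# A fresh clone (or an agent without the Checkout submodules option)
# fetches the contents with:
git submodule update --init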
Option 3. Use private agent on your local machine
If you don't want to add the large files to the GitHub repo or to a separate Git repo for some reason, you can use a private agent instead. But the build run time may not improve much, because the only difference is between copying the local files to a server and copying them to the same local machine.
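For reference, a rough sketch of registering a private agent on Linux (the account URL, token, pool, and agent name are placeholders; see the VSTS agent documentation for your OS):

# After downloading and extracting the VSTS agent package:
./config.sh --url https://youraccount.visualstudio.com \
    --auth pat --token <YOUR_PAT> \
    --pool default --agent local-build-agent
./run.sh    # start the agent so it can accept builds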

Isolate Configs from Build Box (VSTS) when working with Git

We are implementing an environment in Visual Studio Team Services (VSTS).
We have a Git repo tied to VSTS.
The problem is how to keep the config files separate so they retain their unique values in their local environments. But without uploading the configs, VSTS fails to build within its environment.
We don't want the config settings that are in VSTS to always sync to local environments, nor do we want to push our local configs to master. Obviously we can exclude them on push, but the question is how to configure VSTS so that it builds successfully without requiring the config files to be uploaded to the repo.
Reviewing this post, I'm not sure whether or not repo-based configs are required: How to handle multiple configurations in VSTS Release management?
And yes we will eventually have multiple configs to allow Staging and Production releases.
The direct answer is no. Usually Git only tracks source code for projects, and VSTS can usually build successfully without config files. I'm not sure what your project is, so here is how to deal with the situation where you need to push the config file to VSTS without affecting local settings on git pull (assume the config file is named project.config):
Option 1:
If it's OK for you to rename project.config when you build in VSTS, you can use project.config for your local environments and projectRemote.config for the remote repo: ignore project.config in .gitignore, and create a projectRemote.config file to push to the remote.
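A minimal sketch of that setup (file names follow the answer; the commit message is arbitrary):

echo 'project.config' >> .gitignore      # keep the local config untracked
cp project.config projectRemote.config   # tracked copy for the remote/build
git add .gitignore projectRemote.config
git commit -m "Track remote config, ignore local config"

Your VSTS build would then copy or rename projectRemote.config to project.config before compiling.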
Option 2:
Keep the local project.config version when pulling changes from the remote. You can do this as follows:
touch .gitattributes                                # create the attributes file if needed
echo 'project.config merge=ours' >> .gitattributes  # route merges of this file to the "ours" driver
git config --global merge.ours.driver true          # define "ours" as a no-op, keeping the local version
Note: this merge strategy only seems to work when the pull is not a fast-forward.

How to trigger Jenkins jobs with new tag, but not commits and vice versa?

Scenario:
Job 1 should be triggered on every new commit to a GitHub repository.
Job 2 should be triggered only when a tag is added to that same GitHub repository.
If I configure the GitHub Plugin to use webhooks, it seems that if I set 'Branches to build' to anything permissive (**/* or refs/heads/*), it will build any push to GitHub, which includes adding and removing tags.
Additionally, I can't seem to find a way to ignore all commits, and ONLY build tags.
I'm using Jenkins 2.32.3, Git Plugin 3.1.0, and Github Plugin 1.26.1
I'm going to start off by saying that this might vary depending on Jenkins and plugin versions.
But in the normal Git plugin, when you go to the job config you can select Additional Build Behaviors. Under Add you can choose to ignore commits with certain messages (i.e. comments) or to ignore commits from certain committers (e.g. the build user).
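For the tags-only job specifically, a configuration often suggested for the Git plugin (my own addition, not part of the answer above, so verify it against your plugin versions) is a refspec and branch specifier that match only tags:

# Job 2 (tags only), under Source Code Management > Git > Advanced:
#   Refspec:          +refs/tags/*:refs/remotes/origin/tags/*
#   Branch Specifier: */tags/*
# Job 1 (commits only) can then restrict its branch specifier to branch
# heads, e.g. refs/heads/master, so tag pushes produce nothing new to build.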

Want to autobuild a docker image when a third repository is updated

I have a GitHub repository containing a Dockerfile. Linked to this repository there is a Docker Hub repository for autobuilding. The autobuild works fine.
One of the steps of this Dockerfile is to download files from another (third) GitHub repository: the software to run inside the container.
The question is whether there is a known mechanism for triggering a rebuild of the Docker image when this third repository (containing the application source code, not the Dockerfile) is updated.
Thanks.
In Docker Hub, go to your repository and look at "Build Settings" > "Build Triggers". This will give you a URL that accepts POSTs and triggers a build.
Then go to the GitHub repository that should trigger the build and add that trigger URL under "Settings" > "Webhooks & Services" > "Webhooks", where you can choose when GitHub POSTs to it (triggering the build).
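You can also fire the trigger by hand to test it (the repository path and token below are placeholders):

# Manually trigger a Docker Hub build
curl --data 'build=true' -X POST \
    https://registry.hub.docker.com/u/account/repo/trigger/<TOKEN>/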