`docker compose up --build` in a `--parallel` mode - docker-compose

I can run:
docker-compose build --parallel
Is it possible to run:
docker-compose up --build
so that the --parallel flag is passed to the underlying docker-compose build?

Unfortunately, docker-compose up doesn't accept a --parallel flag.
What I usually do is:
docker-compose build --parallel
docker-compose up
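If you want this as a single shell command, the two steps can simply be chained (assuming a POSIX-style shell):
docker-compose build --parallel && docker-compose up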

Related

docker-compose entrypoint with multiple commands

How do I run multiple bash commands with the docker-compose entrypoint configuration option?
The commands:
yarn install
yarn build
sleep infinity
In docker-compose.yml, for service gvhservice:
gvhservice:
  entrypoint:
    - "/bin/sh"
    - -ecx  # -e: exit on first error, -c: run the command string, -x: echo each command
    - |
      yarn install
      yarn build
      sleep infinity
OR
optionally, put all these commands in a file, say entrypoint.sh, and in docker-compose.yml:
gvhservice:
  entrypoint: entrypoint.sh
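For this variant, entrypoint.sh would simply contain the commands themselves; a minimal sketch (assuming the script is copied into the image, is on the PATH, and is marked executable with chmod +x):
#!/bin/sh
set -ex
yarn install
yarn build
sleep infinity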
OR,
use entrypoint.sh together with the command configuration option in docker-compose.yml (suitable when a variable number of commands needs to be passed at runtime):
entrypoint.sh
#!/bin/sh
set -ex
exec "$#"
docker-compose.yml
gvhservice:
  command:
    - /bin/sh
    - -ecx
    - |
      yarn install
      yarn build
      sleep infinity
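For this last variant to work, the script has to be inside the image and set as the entrypoint. A minimal Dockerfile sketch (the destination path is an assumption):
COPY entrypoint.sh /usr/local/bin/entrypoint.sh
RUN chmod +x /usr/local/bin/entrypoint.sh
ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
With this in place, the command list from docker-compose.yml is passed to the script as arguments and executed by exec "$@".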

docker-compose flag to always rebuild docker container?

This is part of my docker-compose file:
super-nice-service:
  build: ./Path/To/Dockerfile/
Is there a flag I can add so this project always gets rebuilt whenever I run docker-compose up?
Thanks
There is no single command for a fully clean rebuild plus up. You can try the chain below, which does a clean build of the containers and brings the services up:
docker-compose rm --all &&
docker-compose pull &&
docker-compose build --no-cache &&
docker-compose up -d --force-recreate
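If you don't need to bypass the build cache, the --build flag alone is enough to rebuild the images on every up:
docker-compose up --build -d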

How can I run localstack in docker-compose with one command on a Mac?

Reading the docs for localstack
I don't understand this line:
"(Note that on MacOS you may have to run TMPDIR=/private$TMPDIR docker-compose up if $TMPDIR contains a symbolic link that cannot be mounted by Docker.)
"
Following these instructions works, but I'd like to be able to run my docker-compose with one command, and I now have to run docker-compose build then TMPDIR=/private$TMPDIR docker-compose up. Any way to combine the commands successfully?
You could try using the --build switch with the up command:
TMPDIR=/private$TMPDIR docker-compose up --build
--force-recreate might help too:
TMPDIR=/private$TMPDIR docker-compose up --force-recreate
You could even combine the two:
TMPDIR=/private$TMPDIR docker-compose up --force-recreate --build
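To keep it a true one-command workflow, you could also put the prefix into a small wrapper script (a sketch; the filename up.sh is arbitrary):
#!/bin/sh
# Build and start the stack with the macOS TMPDIR workaround applied.
TMPDIR=/private$TMPDIR docker-compose up --build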

Docker container for app tests with postgres database

I'm new to Docker.
I'm trying to run my node app tests in a Docker container.
I want to run the tests with a real postgres db.
I'm creating this container with the following Dockerfile:
# Set image
FROM postgres:alpine
# Install node latest
RUN apk add --update nodejs nodejs-npm
# Set working dir
WORKDIR .
# Copy the current directory contents into the container at .
ADD src src
ADD .env.testing .env
ADD package.json .
ADD package-lock.json .
# Run tests
CMD npm install && npm run coverage
From the image docs, when I run the container with:
$ docker run build-name -d postgres
I see that the container takes time to start the PostgreSQL service.
When I run the container without the "-d postgres" param:
$ docker run build-name
The service does not start and the tests fail due to "could not connect to server".
Questions:
A. How can I run the tests AFTER the PostgreSQL service starts?
B. I saw some examples using docker-compose, but can I do this without Compose?
Thanks
Thanks to @Bogdan I found the complete solution:
Dockerfile should be:
# Set image
FROM postgres:alpine
# Install node latest
RUN apk add --update nodejs nodejs-npm
# Set working dir
WORKDIR .
# Copy the current directory contents into the container at .
ADD src src
ADD .env.testing .env
ADD package.json .
ADD package-lock.json .
# Install
RUN npm install
# Init container
CMD psql -U postgres -c "SELECT 1;" postgres
Build container:
$ docker build -t test .
Run container:
$ docker run --name startedtest -d test -d postgres
Run tests after the container is running:
$ docker exec startedtest some_create_schema_script && npm run coverage
If the goal is just to run the tests in the Postgres container, one solution could be to install Node.js in your postgres:alpine-derived image and run the container normally. Once the database is up, you can run npm using docker exec like this:
docker exec <container_id> npm run coverage
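Since the database needs a moment before it accepts connections, the exec can be gated on readiness. A minimal sketch using pg_isready, which ships in the postgres image (container name startedtest as above):
until docker exec startedtest pg_isready -U postgres; do
  sleep 1
done
docker exec startedtest npm run coverage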

CircleCI 2.0 testing with docker-compose and code checkout

This is my circle.yml:
version: 2
jobs:
  build:
    working_directory: /app
    docker:
      - image: docker:stable-git
    steps:
      - checkout
      - setup_remote_docker
      - run:
          name: Install dependencies
          command: |
            apk add --no-cache py-pip bash
            pip install docker-compose
      - run:
          name: Start service containers and run tests
          command: |
            docker-compose -f docker-compose.test.yml up -d db es redis
            docker-compose run web bash -c "cd myDir && ./manage.py test"
This works fine in that it brings up my service containers (db, es, redis) and I build a new image for my web container. However, my working code is not inside the freshly built image (so "cd myDir" always fails).
I figure the following lines in my Dockerfile should make my code available when it's built but it appears that it doesn't work like that:
ENV APPLICATION_ROOT /app/
RUN mkdir -p $APPLICATION_ROOT
WORKDIR $APPLICATION_ROOT
ADD . $APPLICATION_ROOT
What am I doing wrong and how can I make my code available inside my test container?
Thanks,
Use COPY. Your Dockerfile should look something like this:
FROM image
COPY . /opt/app
WORKDIR /opt/app
# ...more commands...
ENTRYPOINT
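With the COPY in place, explicitly rebuilding the web image from the checked-out code before running the tests should make the source available inside the container (a sketch reusing the compose file and names from the question):
docker-compose -f docker-compose.test.yml build web
docker-compose -f docker-compose.test.yml up -d db es redis
docker-compose -f docker-compose.test.yml run web bash -c "cd myDir && ./manage.py test"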