Running Airflow locally with Docker - error after docker-compose up - docker-compose

I am trying to run Airflow locally with Docker, but I get this error message after executing docker-compose up:
ERROR - Triggerer's async thread was blocked for 0.26 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
The docker-compose version is v2.10.2, and I'm using an M1 chip.
Here's the full output of docker-compose up:
> docker-compose up
[+] Running 7/0
⠿ Container airflow-dags-redis-1 Running 0.0s
⠿ Container airflow-dags-postgres-1 Running 0.0s
⠿ Container airflow-dags-airflow-init-1 Created 0.0s
⠿ Container airflow-dags-airflow-worker-1 Running 0.0s
⠿ Container airflow-dags-airflow-triggerer-1 Running 0.0s
⠿ Container airflow-dags-airflow-scheduler-1 Running 0.0s
⠿ Container airflow-dags-airflow-webserver-1 Running 0.0s
Attaching to airflow-dags-airflow-init-1, airflow-dags-airflow-scheduler-1, airflow-dags-airflow-triggerer-1, airflow-dags-airflow-webserver-1, airflow-dags-airflow-worker-1, airflow-dags-postgres-1, airflow-dags-redis-1
airflow-dags-airflow-webserver-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-webserver-1 | warnings.warn(
airflow-dags-airflow-init-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-init-1 | warnings.warn(
airflow-dags-airflow-init-1 | The container is run as root user. For security, consider using a regular user account.
airflow-dags-airflow-triggerer-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-triggerer-1 | warnings.warn(
airflow-dags-airflow-worker-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-worker-1 | warnings.warn(
airflow-dags-airflow-scheduler-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-scheduler-1 | warnings.warn(
airflow-dags-airflow-init-1 |
airflow-dags-airflow-init-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-init-1 | warnings.warn(
airflow-dags-airflow-init-1 | DB: postgresql+psycopg2://airflow:***@postgres/airflow
airflow-dags-airflow-init-1 | Performing upgrade with database postgresql+psycopg2://airflow:***@postgres/airflow
airflow-dags-airflow-init-1 | [2022-11-23 07:47:45,036] {db.py:1462} INFO - Creating tables
airflow-dags-airflow-init-1 | INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
airflow-dags-airflow-init-1 | INFO [alembic.runtime.migration] Will assume transactional DDL.
airflow-dags-airflow-scheduler-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-scheduler-1 | warnings.warn(
airflow-dags-airflow-webserver-1 | [2022-11-23 07:47:52 +0000] [83] [INFO] Starting gunicorn 20.1.0
airflow-dags-airflow-init-1 | Upgrades done
airflow-dags-airflow-worker-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-worker-1 | warnings.warn(
airflow-dags-airflow-init-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-init-1 | warnings.warn(
airflow-dags-airflow-webserver-1 | [2022-11-23 07:48:20,076] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-scheduler-1 | BACKEND=redis
airflow-dags-airflow-scheduler-1 | DB_HOST=redis
airflow-dags-airflow-scheduler-1 | DB_PORT=6379
airflow-dags-airflow-webserver-1 | [2022-11-23 07:48:25 +0000] [83] [INFO] Listening at: http://0.0.0.0:8080 (83)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:48:25 +0000] [83] [INFO] Using worker: sync
airflow-dags-airflow-webserver-1 | [2022-11-23 07:48:25 +0000] [125] [INFO] Booting worker with pid: 125
airflow-dags-airflow-webserver-1 | [2022-11-23 07:48:25 +0000] [127] [INFO] Booting worker with pid: 127
airflow-dags-airflow-webserver-1 | [2022-11-23 07:48:25 +0000] [129] [INFO] Booting worker with pid: 129
airflow-dags-airflow-webserver-1 | [2022-11-23 07:48:25 +0000] [131] [INFO] Booting worker with pid: 131
airflow-dags-airflow-scheduler-1 |
airflow-dags-airflow-worker-1 | BACKEND=redis
airflow-dags-airflow-worker-1 | DB_HOST=redis
airflow-dags-airflow-worker-1 | DB_PORT=6379
airflow-dags-airflow-init-1 | [2022-11-23 07:49:15,765] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-init-1 | [2022-11-23 07:49:21,885] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-triggerer-1 | ____________ _____________
airflow-dags-airflow-triggerer-1 | ____ |__( )_________ __/__ /________ __
airflow-dags-airflow-triggerer-1 | ____ /| |_ /__ ___/_ /_ __ /_ __ \_ | /| / /
airflow-dags-airflow-triggerer-1 | ___ ___ | / _ / _ __/ _ / / /_/ /_ |/ |/ /
airflow-dags-airflow-triggerer-1 | _/_/ |_/_/ /_/ /_/ /_/ \____/____/|__/
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:49:26,287] {triggerer_job.py:100} INFO - Starting the triggerer
airflow-dags-airflow-worker-1 |
airflow-dags-airflow-scheduler-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-scheduler-1 | warnings.warn(
airflow-dags-airflow-init-1 | airflow already exist in the db
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:49:49,628] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.28 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-init-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-init-1 | warnings.warn(
airflow-dags-airflow-init-1 | 2.3.3
airflow-dags-airflow-init-1 exited with code 0
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:04,847] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:06,945] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:08,388] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:08,893] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:10,370] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:12,999] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:13,291] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:15,743] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:50:20,086] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.35 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:25 +0000] [83] [CRITICAL] WORKER TIMEOUT (pid:125)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:26 +0000] [83] [CRITICAL] WORKER TIMEOUT (pid:127)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:26 +0000] [83] [CRITICAL] WORKER TIMEOUT (pid:129)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:26 +0000] [83] [CRITICAL] WORKER TIMEOUT (pid:131)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:26 +0000] [125] [INFO] Worker exiting (pid: 125)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:26 +0000] [127] [INFO] Worker exiting (pid: 127)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:26 +0000] [131] [INFO] Worker exiting (pid: 131)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:27 +0000] [83] [WARNING] Worker with pid 131 was terminated due to signal 9
airflow-dags-postgres-1 | 2022-11-23 07:50:27.412 UTC [231] LOG: unexpected EOF on client connection with an open transaction
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:27 +0000] [83] [WARNING] Worker with pid 129 was terminated due to signal 9
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:27 +0000] [83] [WARNING] Worker with pid 127 was terminated due to signal 9
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:27 +0000] [180] [INFO] Booting worker with pid: 180
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:27 +0000] [182] [INFO] Booting worker with pid: 182
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:27 +0000] [83] [WARNING] Worker with pid 125 was terminated due to signal 9
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:27 +0000] [184] [INFO] Booting worker with pid: 184
airflow-dags-airflow-webserver-1 | [2022-11-23 07:50:27 +0000] [186] [INFO] Booting worker with pid: 186
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:50:42,521] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.26 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-worker-1 | /home/airflow/.local/lib/python3.8/site-packages/airflow/configuration.py:356: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
airflow-dags-airflow-worker-1 | warnings.warn(
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:51:14,861] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.20 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:51:23,311] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.35 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:51:24,691] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.38 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:51:25,637] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.36 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:51:31,709] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.40 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:51:41,635] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.24 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-webserver-1 | [2022-11-23 07:51:57,670] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:51:58,599] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:51:59,615] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:01,891] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:01,955] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:03,564] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:04,475] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:06,434] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:52:26,095] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.20 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:28 +0000] [83] [CRITICAL] WORKER TIMEOUT (pid:180)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:28 +0000] [83] [CRITICAL] WORKER TIMEOUT (pid:182)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:28 +0000] [83] [CRITICAL] WORKER TIMEOUT (pid:184)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:28 +0000] [83] [CRITICAL] WORKER TIMEOUT (pid:186)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:28 +0000] [182] [INFO] Worker exiting (pid: 182)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:28 +0000] [184] [INFO] Worker exiting (pid: 184)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:28 +0000] [180] [INFO] Worker exiting (pid: 180)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:28 +0000] [186] [INFO] Worker exiting (pid: 186)
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:30 +0000] [83] [WARNING] Worker with pid 184 was terminated due to signal 9
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:30 +0000] [83] [WARNING] Worker with pid 180 was terminated due to signal 9
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:30 +0000] [83] [WARNING] Worker with pid 186 was terminated due to signal 9
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:30 +0000] [236] [INFO] Booting worker with pid: 236
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:30 +0000] [238] [INFO] Booting worker with pid: 238
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:30 +0000] [83] [WARNING] Worker with pid 182 was terminated due to signal 9
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:30 +0000] [240] [INFO] Booting worker with pid: 240
airflow-dags-airflow-webserver-1 | [2022-11-23 07:52:30 +0000] [242] [INFO] Booting worker with pid: 242
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:53:38,534] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.27 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:53:43,427] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.21 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:53:50,683] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.21 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:53:53,144] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.20 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-webserver-1 | [2022-11-23 07:54:13,423] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:54:14,241] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:54:14,871] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.21 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
airflow-dags-airflow-webserver-1 | [2022-11-23 07:54:16,329] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:54:18,351] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:54:19,050] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:54:19,515] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:54:21,955] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-webserver-1 | [2022-11-23 07:54:23,718] {providers_manager.py:215} INFO - Optional provider feature disabled when importing 'airflow.providers.google.leveldb.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package
airflow-dags-airflow-triggerer-1 | [2022-11-23 07:54:26,018] {triggerer_job.py:343} ERROR - Triggerer's async thread was blocked for 0.21 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
To fix this, I allocated more resources to Docker (CPUs: 4, Memory: 10 GB, Swap: 2 GB), but it didn't work.
Any comments would be appreciated.
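For context: the triggerer messages are warnings that the asyncio event loop stalled (common when an amd64 image runs emulated on Apple Silicon), while the gunicorn WORKER TIMEOUT / signal 9 lines are what actually kill the webserver workers. One possible mitigation, a sketch only, assuming the official docker-compose.yaml layout where all services share an x-airflow-common environment block, is to raise the webserver worker timeout:

```
# Sketch, not the original compose file. The variable maps to
# [webserver] web_server_worker_timeout (default 120 seconds).
x-airflow-common:
  environment:
    # Give slow (e.g. emulated amd64-on-arm64) webserver workers more time
    # before gunicorn kills them with SIGKILL ("signal 9" in the logs above).
    AIRFLOW__WEBSERVER__WEB_SERVER_WORKER_TIMEOUT: "300"
```

Whether this helps depends on whether the timeouts are caused by slow startup (emulation) or by a genuinely hung worker.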

Related

Docker compose with NestJS PostgreSQL not connecting

I have tried to dockerize a NestJS app with PostgreSQL. Postgres is refusing the connection and also shows a log line saying database system was shut down at <<some timestamp>>. Here are the docker-compose.yml and the logs.
version: '3'
services:
  postgres:
    image: postgres
    restart: always
    volumes:
      - ./pgdata:/var/lib/postgresql/data
    ports:
      - '5432:5432'
    environment:
      - POSTGRES_DB=gm
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  pgadmin:
    image: dpage/pgadmin4
    environment:
      - PGADMIN_DEFAULT_EMAIL=admin@gmail.com
      - PGADMIN_DEFAULT_PASSWORD=admin
      - PGADMIN_LISTEN_PORT=5050
    ports:
      - "5050:5050"
  api:
    image: gm-server
    build:
      dockerfile: Dockerfile
      context: .
    volumes:
      - .:/home/node
    ports:
      - '8081:4001'
    depends_on:
      - postgres
    env_file: .env
    command: npm run start:prod
volumes:
  pgdata:
server-postgres-1 | PostgreSQL Database directory appears to contain a database; Skipping initialization
server-postgres-1 |
server-postgres-1 | 2023-01-04 09:36:45.249 UTC [1] LOG: starting PostgreSQL 15.0 (Debian 15.0-1.pgdg110+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 10.2.1-6) 10.2.1 20210110, 64-bit
server-postgres-1 | 2023-01-04 09:36:45.250 UTC [1] LOG: listening on IPv4 address "0.0.0.0", port 5432
server-postgres-1 | 2023-01-04 09:36:45.250 UTC [1] LOG: listening on IPv6 address "::", port 5432
server-postgres-1 | 2023-01-04 09:36:45.255 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
server-postgres-1 | 2023-01-04 09:36:45.261 UTC [29] LOG: database system was shut down at 2023-01-04 09:36:27 UTC
server-postgres-1 | 2023-01-04 09:36:45.274 UTC [1] LOG: database system is ready to accept connections
server-api-1 |
server-api-1 | > nestjs-app#0.0.1 start:prod
server-api-1 | > node dist/main
server-api-1 |
server-api-1 | [Nest] 19 - 01/04/2023, 9:36:47 AM LOG [NestFactory] Starting Nest application...
server-api-1 | [Nest] 19 - 01/04/2023, 9:36:47 AM LOG [InstanceLoader] MulterModule dependencies initialized +61ms
server-api-1 | [Nest] 19 - 01/04/2023, 9:36:47 AM LOG [InstanceLoader] MulterModule dependencies initialized +1ms
server-api-1 | [Nest] 19 - 01/04/2023, 9:36:47 AM LOG [InstanceLoader] ConfigHostModule dependencies initialized +0ms
server-api-1 | [Nest] 19 - 01/04/2023, 9:36:47 AM LOG [InstanceLoader] AppModule dependencies initialized +1ms
server-api-1 | [Nest] 19 - 01/04/2023, 9:36:47 AM LOG [InstanceLoader] ConfigModule dependencies initialized +0ms
server-api-1 | [Nest] 19 - 01/04/2023, 9:36:47 AM ERROR [ExceptionHandler] connect ECONNREFUSED 127.0.0.1:5432
server-api-1 | Error: connect ECONNREFUSED 127.0.0.1:5432
server-api-1 | at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1300:16)
server-api-1 exited with code 1
I have also tried most of the relevant answers (before Stack Overflow marks this as a duplicate), and they didn't work. Yes, I have tried changing the host to host.docker.internal as suggested in the previous answers.
For more clarity, here is my TypeORM data-source config in NestJS:
import { DataSource } from 'typeorm';

export const typeOrmConnectionDataSource = new DataSource({
  type: 'postgres',
  host: 'host.docker.internal',
  port: 5432,
  username: 'postgres',
  password: 'postgres',
  database: 'gm',
  entities: [__dirname + '/**/*.entity{.ts,.js}'],
  migrations: [__dirname + '/migrations/**/*{.ts,.js}'],
  logging: true,
  synchronize: false,
  migrationsRun: false,
});
Why is this problem different from the other "duplicate" questions?
This problem is different because:
the other threads don't solve the issue either.
even if we assume they do, their solutions didn't work for me.
More evidence?
I have tried all the solutions from this one.
Another solution which didn't work.
Another one, where I even followed the chat.
Apparently the problem lies in NestJS not compiling properly because of my customization of its scripts. Check whether that's an issue for you.
After fixing that, just follow the instructions to use "postgres" as the host, and it will work (if you are facing the same issue).
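To make that last point concrete, a sketch of the data-source change: inside the compose network, containers reach each other by service name, so the host should be the postgres service name from the docker-compose.yml above rather than host.docker.internal or 127.0.0.1. Only the host line differs from the config shown earlier:

```
export const typeOrmConnectionDataSource = new DataSource({
  type: 'postgres',
  host: 'postgres', // the compose service name, resolvable on the compose network
  port: 5432,       // the container-internal port, not the published host port
  username: 'postgres',
  password: 'postgres',
  database: 'gm',
  // ...rest unchanged from the config above
});
```

host.docker.internal only makes sense for processes on the host reaching into Docker, not for container-to-container traffic.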

Why are my postgres user and volume not created by docker-compose?

I am having trouble composing a Docker container running Postgres. I cannot get it to create the user postgres, nor do I see the volume I specify. I had this working before and didn't think I had changed anything, but obviously I did. In any case, here is what happens now:
Step 1: Clean environment of previous data:
docker stop $DOCKERID
docker rm $DOCKERID
sudo rm -rf .pgdata
Check that no volumes remain:
╰─ docker volume ls ─╯
DRIVER VOLUME NAME
Step 2: Create the data folder and then the container:
mkdir .pgdata
docker-compose -f docker-compose-pg-only.yml up &
docker-compose-pg-only.yml
version: '3.3'
services:
  postgres:
    image: postgres:13.2
    restart: unless-stopped
    environment:
      POSTGRES_DB: mydb
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: abc123
      PGDATA: /var/lib/postgresql/data/pgdata
    ports:
      - "5432:5432"
    volumes:
      - type: bind
        source: /path/to/postgres-docker-build/.pgdata
        target: /var/lib/postgresql/data
    networks:
      - reference
networks:
  reference:
The output of docker-compose:
[+] Running 1/1
⠿ Container postgres-docker-build-db-1 Create... 0.1s
Attaching to postgres-docker-build-db-1
postgres-docker-build-db-1 | The files belonging to this database system will be owned by user "postgres".
postgres-docker-build-db-1 | This user must also own the server process.
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | The database cluster will be initialized with locale "en_US.utf8".
postgres-docker-build-db-1 | The default database encoding has accordingly been set to "UTF8".
postgres-docker-build-db-1 | The default text search configuration will be set to "english".
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | Data page checksums are disabled.
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | fixing permissions on existing directory /var/lib/postgresql/data/pgdata ... ok
postgres-docker-build-db-1 | creating subdirectories ... ok
postgres-docker-build-db-1 | selecting dynamic shared memory implementation ... posix
postgres-docker-build-db-1 | selecting default max_connections ... 100
postgres-docker-build-db-1 | selecting default shared_buffers ... 128MB
postgres-docker-build-db-1 | selecting default time zone ... UTC
postgres-docker-build-db-1 | creating configuration files ... ok
postgres-docker-build-db-1 | running bootstrap script ... ok
postgres-docker-build-db-1 | performing post-bootstrap initialization ... sh: locale: not found
postgres-docker-build-db-1 | 2022-06-25 17:25:25.749 UTC [31] WARNING: no usable system locales were found
postgres-docker-build-db-1 | ok
postgres-docker-build-db-1 | syncing data to disk ... initdb: warning: enabling "trust" authentication for local connections
postgres-docker-build-db-1 | You can change this by editing pg_hba.conf or using the option -A, or
postgres-docker-build-db-1 | --auth-local and --auth-host, the next time you run initdb.
postgres-docker-build-db-1 | ok
postgres-docker-build-db-1 |
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | Success. You can now start the database server using:
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | pg_ctl -D /var/lib/postgresql/data/pgdata -l logfile start
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | waiting for server to start....2022-06-25 17:25:26.486 UTC [37] LOG: starting PostgreSQL 14.1 on x86_64-pc-linux-musl, compiled by gcc (Alpine 10.3.1_git20211027) 10.3.1 20211027, 64-bit
postgres-docker-build-db-1 | 2022-06-25 17:25:26.496 UTC [37] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres-docker-build-db-1 | 2022-06-25 17:25:26.511 UTC [38] LOG: database system was shut down at 2022-06-25 17:25:26 UTC
postgres-docker-build-db-1 | 2022-06-25 17:25:26.517 UTC [37] LOG: database system is ready to accept connections
postgres-docker-build-db-1 | done
postgres-docker-build-db-1 | server started
postgres-docker-build-db-1 | CREATE DATABASE
postgres-docker-build-db-1 |
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | /usr/local/bin/docker-entrypoint.sh: ignoring /docker-entrypoint-initdb.d/*
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | waiting for server to shut down....2022-06-25 17:25:26.709 UTC [37] LOG: received fast shutdown request
postgres-docker-build-db-1 | 2022-06-25 17:25:26.713 UTC [37] LOG: aborting any active transactions
postgres-docker-build-db-1 | 2022-06-25 17:25:26.714 UTC [37] LOG: background worker "logical replication launcher" (PID 44) exited with exit code 1
postgres-docker-build-db-1 | 2022-06-25 17:25:26.715 UTC [39] LOG: shutting down
postgres-docker-build-db-1 | 2022-06-25 17:25:26.740 UTC [37] LOG: database system is shut down
postgres-docker-build-db-1 | done
postgres-docker-build-db-1 | server stopped
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | PostgreSQL init process complete; ready for start up.
postgres-docker-build-db-1 |
postgres-docker-build-db-1 | 2022-06-25 17:25:26.840 UTC [1] LOG: starting PostgreSQL 14.1 on x86_64-pc-linux-musl, compiled by gcc (Alpine 10.3.1_git20211027) 10.3.1 20211027, 64-bit
postgres-docker-build-db-1 | 2022-06-25 17:25:26.840 UTC [1] LOG: listening on IPv4 address "0.0.0.0", port 5432
postgres-docker-build-db-1 | 2022-06-25 17:25:26.840 UTC [1] LOG: listening on IPv6 address "::", port 5432
postgres-docker-build-db-1 | 2022-06-25 17:25:26.848 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres-docker-build-db-1 | 2022-06-25 17:25:26.858 UTC [51] LOG: database system was shut down at 2022-06-25 17:25:26 UTC
postgres-docker-build-db-1 | 2022-06-25 17:25:26.864 UTC [1] LOG: database system is ready to accept connections
Step 3: Test the container
╰─ docker exec -it 66b79bef5769 /bin/sh ─╯
/ #
/ #
/ # su postgres
/ $ psql
postgres-docker-build-db-1 | 2022-06-25 17:30:23.626 UTC [70] FATAL: role "postgres" does not exist
psql: error: connection to server on socket "/var/run/postgresql/.s.PGSQL.5432" failed: FATAL: role "postgres" does not exist
/ $ psql -U postgres
postgres-docker-build-db-1 | 2022-06-25 17:30:53.226 UTC [73] FATAL: role "postgres" does not exist
psql: error: connection to server on socket "/var/run/postgresql/.s.PGSQL.5432" failed: FATAL: role "postgres" does not exist
/ $ psql -U postgres -h localhost
psql: error: connection to server at "localhost" (127.0.0.1), port 5432 failed: FATAL: role "postgres" does not exist
postgres-docker-build-db-1 | 2022-06-25 17:31:04.399 UTC [75] FATAL: role "postgres" does not exist
/ $ psql -U postgres -h localhost -p 5432
postgres-docker-build-db-1 | 2022-06-25 17:31:14.465 UTC [77] FATAL: role "postgres" does not exist
psql: error: connection to server at "localhost" (127.0.0.1), port 5432 failed: FATAL: role "postgres" does not exist
When I run docker volume ls, I do not see anything:
╰─ docker volume ls ─╯
DRIVER VOLUME NAME

Docker: Kong fails to get a connection to Postgres

My docker-compose.yml:
version: "3.7"
networks:
  kong-net:
volumes:
  kong_data: {}
  pghr:
    external: true
  pginv:
    external: true
services:
  #######################################
  # Postgres: The database used by Kong
  #######################################
  kong-database:
    image: postgres:11
    container_name: kong-postgres
    restart: on-failure
    networks:
      - kong-net
    volumes:
      - kong_data:/var/lib/postgresql/data
    environment:
      POSTGRES_USER: kong
      POSTGRES_PASSWORD: kong
      POSTGRES_DB: kong
    ports:
      - "5434:5434"
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "kong"]
      interval: 30s
      timeout: 30s
      retries: 3
  #######################################
  # Kong database migration
  #######################################
  kong-migration:
    image: kong:2.0.3-alpine
    command: kong migrations bootstrap
    networks:
      - kong-net
    restart: on-failure
    environment:
      KONG_DATABASE: postgres
      KONG_PG_HOST: kong-database
      KONG_PG_DATABASE: kong
      KONG_PG_USER: kong
      KONG_PG_PASSWORD: kong
    depends_on:
      - kong-database
  #######################################
  # Kong: The API Gateway
  #######################################
  kong:
    image: kong:2.0.3-alpine
    restart: on-failure
    container_name: kong
    networks:
      - kong-net
    environment:
      KONG_DATABASE: postgres
      KONG_PG_HOST: kong-database
      KONG_PG_DATABASE: kong
      KONG_PG_USER: kong
      KONG_PG_PASSWORD: kong
      KONG_PROXY_LISTEN: 0.0.0.0:8000
      KONG_PROXY_LISTEN_SSL: 0.0.0.0:8443
      KONG_ADMIN_LISTEN: 0.0.0.0:8001
    depends_on:
      - kong-database
    healthcheck:
      test: ["CMD", "kong", "health"]
      interval: 10s
      timeout: 10s
      retries: 10
    ports:
      - "8000:8000"
      - "8001:8001"
      - "8443:8443"
      - "8444:8444"
  #######################################
  # Konga database prepare
  #######################################
  konga-prepare:
    image: pantsel/konga:latest
    command: "-c prepare -a postgres -u postgresql://kong:kong@kong-database:5434/konga"
    networks:
      - kong-net
    restart: on-failure
    links:
      - kong-database
    depends_on:
      - kong-database
  #######################################
  # Konga: Kong GUI
  #######################################
  konga:
    image: pantsel/konga:latest
    container_name: konga
    restart: always
    networks:
      - kong-net
    environment:
      DB_ADAPTER: postgres
      DB_HOST: kong-database
      DB_USER: kong
      TOKEN_SECRET: FUEDASHFUAEHFEUAHFEU;
      DB_DATABASE: kong
      NODE_ENV: production
    depends_on:
      - kong-database
    ports:
      - "1337:1337"
But I get this in docker-compose logs for my kong container:
kong | 2020/09/02 21:51:04 [error] 1#0: init_by_lua error: /usr/local/share/lua/5.1/kong/cmd/utils/migrations.lua:20: New migrations available; run 'kong migrations up' to proceed
kong | stack traceback:
kong | [C]: in function 'error'
kong | /usr/local/share/lua/5.1/kong/cmd/utils/migrations.lua:20: in function 'check_state'
kong | /usr/local/share/lua/5.1/kong/init.lua:392: in function 'init'
kong | init_by_lua:3: in main chunk
kong | nginx: [error] init_by_lua error: /usr/local/share/lua/5.1/kong/cmd/utils/migrations.lua:20: New migrations available; run 'kong migrations up' to proceed
kong | stack traceback:
kong | [C]: in function 'error'
kong | /usr/local/share/lua/5.1/kong/cmd/utils/migrations.lua:20: in function 'check_state'
kong | /usr/local/share/lua/5.1/kong/init.lua:392: in function 'init'
kong | init_by_lua:3: in main chunk
kong | 2020/09/02 21:51:08 [notice] 1#0: using the "epoll" event method
kong | 2020/09/02 21:51:08 [notice] 1#0: openresty/1.15.8.3
kong | 2020/09/02 21:51:08 [notice] 1#0: built by gcc 9.2.0 (Alpine 9.2.0)
kong | 2020/09/02 21:51:08 [notice] 1#0: OS: Linux 5.4.0-45-generic
kong | 2020/09/02 21:51:08 [notice] 1#0: getrlimit(RLIMIT_NOFILE): 1048576:1048576
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker processes
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 22
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 23
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 24
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 25
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 26
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 27
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 28
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 29
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 30
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 31
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 32
kong | 2020/09/02 21:51:08 [notice] 1#0: start worker process 33
kong | 2020/09/02 21:51:08 [notice] 22#0: *5 [lua] cache_warmup.lua:46: cache_warmup_single_entity(): Preloading 'services' into the core_cache..., context: init_worker_by_lua*
kong | 2020/09/02 21:51:08 [notice] 22#0: *5 [lua] cache_warmup.lua:85: cache_warmup_single_entity(): finished preloading 'services' into the core_cache (in 0ms), context: init_worker_by_lua*
kong | 2020/09/02 21:51:08 [notice] 22#0: *5 [lua] cache_warmup.lua:46: cache_warmup_single_entity(): Preloading 'plugins' into the core_cache..., context: init_worker_by_lua*
kong | 2020/09/02 21:51:08 [notice] 22#0: *5 [lua] cache_warmup.lua:85: cache_warmup_single_entity(): finished preloading 'plugins' into the core_cache (in 0ms), context: init_worker_by_lua*
On my konga-prepare container:
konga-prepare_1 | debug: Preparing database...
konga-prepare_1 | Using postgres DB Adapter.
konga-prepare_1 | Failed to connect to DB Error: connect ECONNREFUSED 192.168.64.2:5434
konga-prepare_1 | at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1141:16) {
konga-prepare_1 | errno: 'ECONNREFUSED',
konga-prepare_1 | code: 'ECONNREFUSED',
konga-prepare_1 | syscall: 'connect',
konga-prepare_1 | address: '192.168.64.2',
konga-prepare_1 | port: 5434
konga-prepare_1 | }
On my konga container:
konga | _e: error: relation "public.konga_users" does not exist
konga | at Connection.parseE (/app/node_modules/sails-postgresql/node_modules/pg/lib/connection.js:539:11)
konga | at Connection.parseMessage (/app/node_modules/sails-postgresql/node_modules/pg/lib/connection.js:366:17)
konga | at Socket.<anonymous> (/app/node_modules/sails-postgresql/node_modules/pg/lib/connection.js:105:22)
konga | at Socket.emit (events.js:310:20)
konga | at Socket.EventEmitter.emit (domain.js:482:12)
konga | at addChunk (_stream_readable.js:286:12)
konga | at readableAddChunk (_stream_readable.js:268:9)
konga | at Socket.Readable.push (_stream_readable.js:209:10)
konga | at TCP.onStreamRead (internal/stream_base_commons.js:186:23) {
konga | length: 118,
konga | severity: 'ERROR',
konga | code: '42P01',
konga | detail: undefined,
konga | hint: undefined,
konga | position: '377',
konga | internalPosition: undefined,
konga | internalQuery: undefined,
konga | where: undefined,
konga | schema: undefined,
konga | table: undefined,
konga | column: undefined,
konga | dataType: undefined,
konga | constraint: undefined,
konga | file: 'parse_relation.c',
konga | line: '1159',
konga | routine: 'parserOpenTable'
konga | },
konga | rawStack: 'error: relation "public.konga_users" does not exist\n' +
konga | ' at Connection.parseE (/app/node_modules/sails-postgresql/node_modules/pg/lib/connection.js:539:11)\n' +
konga | ' at Connection.parseMessage (/app/node_modules/sails-postgresql/node_modules/pg/lib/connection.js:366:17)\n' +
konga | ' at Socket.<anonymous> (/app/node_modules/sails-postgresql/node_modules/pg/lib/connection.js:105:22)\n' +
konga | ' at Socket.emit (events.js:310:20)\n' +
konga | ' at Socket.EventEmitter.emit (domain.js:482:12)\n' +
konga | ' at addChunk (_stream_readable.js:286:12)\n' +
konga | ' at readableAddChunk (_stream_readable.js:268:9)\n' +
konga | ' at Socket.Readable.push (_stream_readable.js:209:10)\n' +
konga | ' at TCP.onStreamRead (internal/stream_base_commons.js:186:23)',
konga | details: 'Details: error: relation "public.konga_users" does not exist\n'
konga | }
I can't figure out the reason for the error; my docker-compose file seems to be configured correctly.
I configured the right ports, but I keep getting these errors and haven't been able to fix them.
The Konga documentation says:
Production
In case of the MySQL or PostgreSQL adapters, Konga will not perform db migrations when running in production mode. You can manually perform the migrations by calling $ node ./bin/konga.js prepare, passing the args needed for the database connectivity.
So on the first run you need to set NODE_ENV to development, not production.
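A minimal sketch of that change against the konga service above (only the NODE_ENV value differs from the question's file; switch it back to production once the migrations have run):

```yaml
konga:
  image: pantsel/konga:latest
  environment:
    DB_ADAPTER: postgres
    DB_HOST: kong-database
    DB_USER: kong
    DB_DATABASE: kong
    NODE_ENV: development  # first run only: lets Konga create its own tables
```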

Postgres not accepting connection in Docker

I am running the project with Docker Toolbox on Windows 7 SP1. The project does not report any error, but because Postgres is not working, the whole project is not functional.
The Docker Compose file is:
version: '3'
services:
  postgres:
    image: 'postgres:latest'
    restart: always
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: "db"
      POSTGRES_PASSWORD: postgres_password
      POSTGRES_HOST_AUTH_METHOD: "trust"
      DATABASE_URL: postgresql://postgres:p3wrd@postgres:5432/postgres
    deploy:
      restart_policy:
        condition: on-failure
        window: 15m
  redis:
    image: 'redis:latest'
  nginx:
    restart: always
    build:
      dockerfile: Dockerfile.dev
      context: ./nginx
    ports:
      - '3050:80'
  api:
    depends_on:
      - "postgres"
    build:
      dockerfile: Dockerfile.dev
      context: ./server
    volumes:
      - ./server/copy:/usr/src/app/data
    environment:
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - PGUSER=postgres
      - PGHOST=postgres
      - PGDATABASE=postgres
      - PGPASSWORD=postgres_password
      - PGPORT=5432
  client:
    depends_on:
      - "postgres"
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - ./client/copy:/usr/src/app/data
      - /usr/src/app/node_modules
  worker:
    build:
      dockerfile: Dockerfile.dev
      context: ./worker
    volumes:
      - ./worker/copy:/usr/src/app/data
      - /usr/src/app/node_modules
    depends_on:
      - "postgres"
But when I run the project I get this:
redis_1 | 1:C 29 May 2020 05:07:37.909 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
redis_1 | 1:C 29 May 2020 05:07:37.910 # Redis version=6.0.3, bits=64, commit=00000000, modified=0, pid=1, just started
redis_1 | 1:C 29 May 2020 05:07:37.911 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
redis_1 | 1:M 29 May 2020 05:07:37.922 * Running mode=standalone, port=6379.
redis_1 | 1:M 29 May 2020 05:07:37.928 # WARNING: The TCP backlog setting of 511 cannot be enforced because /proc/sys/net/core/somaxconn is set to the lower value of 128.
redis_1 | 1:M 29 May 2020 05:07:37.929 # Server initialized
redis_1 | 1:M 29 May 2020 05:07:37.929 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis_1 | 1:M 29 May 2020 05:07:37.933 * Loading RDB produced by version 6.0.3
redis_1 | 1:M 29 May 2020 05:07:37.934 * RDB age 8 seconds
redis_1 | 1:M 29 May 2020 05:07:37.934 * RDB memory usage when created 0.81 Mb
redis_1 | 1:M 29 May 2020 05:07:37.934 * DB loaded from disk: 0.001 seconds
postgres_1 |
postgres_1 | PostgreSQL Database directory appears to contain a database; Skipping initialization
postgres_1 |
postgres_1 | 2020-05-29 05:07:38.927 UTC [1] LOG: starting PostgreSQL 12.3 (Debian 12.3-1.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
postgres_1 | 2020-05-29 05:07:38.928 UTC [1] LOG: listening on IPv4 address "0.0.0.0", port 5432
postgres_1 | 2020-05-29 05:07:38.929 UTC [1] LOG: listening on IPv6 address "::", port 5432
postgres_1 | 2020-05-29 05:07:38.933 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres_1 | 2020-05-29 05:07:38.993 UTC [24] LOG: database system was shut down at 2020-05-29 05:07:29 UTC
api_1 |
api_1 | > # dev /usr/src/app
api_1 | > nodemon
api_1 |
api_1 | [nodemon] 1.18.3
api_1 | [nodemon] to restart at any time, enter `rs`
api_1 | [nodemon] watching: *.*
The same error occurs with or without data volumes, and because of it the project is not running. Please help.
I would assume that once the API starts, it attempts to connect to Postgres. If so, this is a typical error many Docker developers run into: the Postgres DB is not yet ready to accept connections while the apps are already trying to connect to it.
You can solve the problem by trying either of the following approaches:
Make your API layer wait long enough for the Postgres DB to boot up, e.g.:
Thread.Sleep(60000); // pseudocode: wait ~60 seconds so the Postgres DB can start
Implement a retry mechanism that waits for, say, 10 seconds every time the connection fails to establish.
If this doesn't work, I would recommend checking whether a Postgres instance installed outside the container already owns the port you are trying to access.
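The retry idea can be sketched like this in Python (the API layer here is Node, so treat this as an illustration; `connect` stands in for whatever connection call your driver provides, e.g. a psycopg2 or pg connect):

```python
import time

def wait_for(connect, attempts=10, delay=10):
    """Retry `connect` until it succeeds or the attempts run out.

    `connect` is any zero-argument callable that raises on failure,
    such as a database driver's connection attempt.
    """
    for attempt in range(1, attempts + 1):
        try:
            return connect()  # success: hand the connection back
        except Exception as exc:
            if attempt == attempts:
                raise  # out of attempts: surface the last error
            print(f"DB not ready ({exc}); retrying in {delay}s "
                  f"({attempt}/{attempts})")
            time.sleep(delay)
```

Usage would look like `conn = wait_for(lambda: psycopg2.connect(host="postgres", user="postgres"))`; the driver and connection parameters are assumptions, not taken from the question.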
Along with Allan Chua's answer, please add a startup dependency on the Postgres service in your docker-compose file:
depends_on:
  - postgres
Add this to your api service.
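Note that a plain depends_on only controls start order; it does not wait for Postgres to be ready. With the 2.1+ file format (and with recent docker-compose releases that follow the Compose Specification) you can combine it with a healthcheck. A sketch, showing only the lines that would be added to the question's services:

```yaml
services:
  postgres:
    image: 'postgres:latest'
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 5s
      retries: 10
  api:
    depends_on:
      postgres:
        condition: service_healthy
```

With this, the api container is not started until `pg_isready` reports that Postgres accepts connections.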

How to connect to postgres through docker-compose network?

I use docker-compose, and I'm trying to connect to the Postgres database from the web container.
I use this URI:
postgresql://hola:hola@postgres/holadb
I get this error:
Connection refused
Is the server running on host "postgres" (172.18.0.2) and accepting
TCP/IP connections on port 5432?
docker-compose.yml
version: '2'
services:
  web:
    restart: always
    build: ./web
    expose:
      - "8000"
    volumes:
      - /usr/src/app/project/static
    command: /usr/local/bin/gunicorn -w 2 -b :8000 project:app
    depends_on:
      - postgres
  postgres:
    image: postgres:9.6
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_USER=hola
      - POSTGRES_PASSWORD=hola
      - POSTGRES_DB=holadb
    volumes:
      - ./data/postgres:/var/lib/postgresql/data
I remove ./data/postgres before building and running.
Logs
postgres_1 | The files belonging to this database system will be owned by user "postgres".
postgres_1 | This user must also own the server process.
postgres_1 |
postgres_1 | The database cluster will be initialized with locale "en_US.utf8".
postgres_1 | The default database encoding has accordingly been set to "UTF8".
postgres_1 | The default text search configuration will be set to "english".
postgres_1 |
postgres_1 | Data page checksums are disabled.
postgres_1 |
postgres_1 | fixing permissions on existing directory /var/lib/postgresql/data ... ok
postgres_1 | creating subdirectories ... ok
postgres_1 | selecting default max_connections ... 100
postgres_1 | selecting default shared_buffers ... 128MB
postgres_1 | selecting dynamic shared memory implementation ... posix
postgres_1 | creating configuration files ... ok
postgres_1 | running bootstrap script ... ok
web_1 | [2017-06-03 16:54:14 +0000] [1] [INFO] Starting gunicorn 19.7.1
web_1 | [2017-06-03 16:54:14 +0000] [1] [INFO] Listening at: http://0.0.0.0:8000 (1)
web_1 | [2017-06-03 16:54:14 +0000] [1] [INFO] Using worker: sync
web_1 | [2017-06-03 16:54:14 +0000] [7] [INFO] Booting worker with pid: 7
web_1 | [2017-06-03 16:54:14 +0000] [8] [INFO] Booting worker with pid: 8
postgres_1 | performing post-bootstrap initialization ... ok
postgres_1 |
postgres_1 | WARNING: enabling "trust" authentication for local connections
postgres_1 | You can change this by editing pg_hba.conf or using the option -A, or
postgres_1 | --auth-local and --auth-host, the next time you run initdb.
postgres_1 | syncing data to disk ... ok
postgres_1 |
postgres_1 | Success. You can now start the database server using:
postgres_1 |
postgres_1 | pg_ctl -D /var/lib/postgresql/data -l logfile start
postgres_1 |
postgres_1 | waiting for server to start....LOG: database system was shut down at 2017-06-03 16:54:16 UTC
postgres_1 | LOG: MultiXact member wraparound protections are now enabled
postgres_1 | LOG: database system is ready to accept connections
postgres_1 | LOG: autovacuum launcher started
postgres_1 | done
postgres_1 | server started
postgres_1 | CREATE DATABASE
postgres_1 |
postgres_1 | CREATE ROLE
postgres_1 |
postgres_1 |
postgres_1 | /usr/local/bin/docker-entrypoint.sh: ignoring /docker-entrypoint-initdb.d/*
postgres_1 |
postgres_1 | LOG: received fast shutdown request
postgres_1 | LOG: aborting any active transactions
postgres_1 | LOG: autovacuum launcher shutting down
postgres_1 | LOG: shutting down
postgres_1 | waiting for server to shut down....LOG: database system is shut down
postgres_1 | done
postgres_1 | server stopped
postgres_1 |
postgres_1 | PostgreSQL init process complete; ready for start up.
postgres_1 |
postgres_1 | LOG: database system was shut down at 2017-06-03 16:54:18 UTC
postgres_1 | LOG: MultiXact member wraparound protections are now enabled
postgres_1 | LOG: database system is ready to accept connections
postgres_1 | LOG: autovacuum launcher started
I don't understand why it does not work. Thank you in advance for your help.
The web container tries to connect while Postgres is still initializing.
Waiting for some delay solved my issue.
EDIT: I now use a Docker Compose healthcheck to do this.
You need to set up a network to allow communication between containers. Something like this should work:
version: '2'
services:
  web:
    restart: always
    build: ./web
    expose:
      - "8000"
    volumes:
      - /usr/src/app/project/static
    command: /usr/local/bin/gunicorn -w 2 -b :8000 project:app
    depends_on:
      - postgres
    networks: ['mynetwork']
  postgres:
    restart: always
    build:
      context: ./postgresql
    volumes_from:
      - data
    ports:
      - "5432:5432"
    networks: ['mynetwork']
networks: {mynetwork: {}}