ERROR when installing subprocess cffi during docker-compose installation - docker-compose

I am currently writing my .gitlab-ci.yml to set up the CI/CD of my project, but when the pipeline runs I get an ERROR during the installation of docker-compose.
My .gitlab-ci.yml file
image: docker:stable

stages:
  - deploy

services:
  - docker:dind

variables:
  DOCKER_CLIENT_TIMEOUT: 240
  COMPOSE_HTTP_TIMEOUT: 240

before_script:
  - apk add python3 py3-pip
  - python3 -m pip install --upgrade pip setuptools wheel
  - pip3 install dxlmispservice
  - pip3 install setuptools-rust
  - pip3 install docker-compose
The Log of the Pipeline
Building wheel for PyYAML (pyproject.toml): started
Building wheel for PyYAML (pyproject.toml): finished with status 'done'
Created wheel for PyYAML: filename=PyYAML-5.4.1-cp38-cp38-linux_x86_64.whl size=45656 sha256=9de5adf1f8c62bc5c5b9da385015c331f5c6100035a7de96c1ffab4ad2ec2eb2
Stored in directory: /root/.cache/pip/wheels/dd/c5/1d/5d7436173d3efd4a14dcb510eb0b29525ecb6b0e41489e716e
Building wheel for cffi (setup.py): started
Building wheel for cffi (setup.py): finished with status 'error'
error: subprocess-exited-with-error
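For anyone hitting the same failure: cffi compiles a C extension during installation, so on Alpine the wheel build fails unless a C toolchain and the relevant headers are present. A minimal sketch of an amended before_script, assuming the Alpine base that docker:stable uses (these are the usual Alpine package names; untested in this exact pipeline):

before_script:
  # cffi builds a C extension: it needs a compiler plus the libffi,
  # OpenSSL and Python headers (usual Alpine package names)
  - apk add python3 py3-pip gcc musl-dev libffi-dev openssl-dev python3-dev
  - python3 -m pip install --upgrade pip setuptools wheel
  - pip3 install dxlmispservice
  - pip3 install setuptools-rust
  - pip3 install docker-compose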

Related

Unable to uninstall pyspark from Azure pipeline. Uninstallation is stuck

I have a test pipeline in Azure DevOps in which I want to uninstall the pyspark module. I do it with pip uninstall pyspark. Below is the pipeline:
trigger: none

jobs:
  - job: 'QA_Pipeline'
    timeoutInMinutes: 300
    pool: vmss-deep-dev-pool-002
    steps:
      - task: UsePythonVersion@0
        inputs:
          versionSpec: '3.8'
      - script: |
          pip install --upgrade --force-reinstall pip setuptools wheel
          pip uninstall pyarrow
          pip uninstall pyspark
          pip install -I azure-cli==2.18.0
          pip install -I databricks-connect==9.1.30
        displayName: 'Install modules'
...
...
When I execute this pipeline, the pyspark installation is found but the uninstallation never finishes.
Below is the output:
Collecting pip
Using cached pip-23.0-py3-none-any.whl (2.1 MB)
Collecting setuptools
Using cached setuptools-67.0.0-py3-none-any.whl (1.1 MB)
Collecting wheel
Using cached wheel-0.38.4-py3-none-any.whl (36 kB)
Installing collected packages: wheel, setuptools, pip
Attempting uninstall: wheel
Found existing installation: wheel 0.38.4
Uninstalling wheel-0.38.4:
Successfully uninstalled wheel-0.38.4
Attempting uninstall: setuptools
Found existing installation: setuptools 67.0.0
Uninstalling setuptools-67.0.0:
Successfully uninstalled setuptools-67.0.0
Attempting uninstall: pip
Found existing installation: pip 23.0
Uninstalling pip-23.0:
Successfully uninstalled pip-23.0
Successfully installed pip-23.0 setuptools-67.0.0 wheel-0.38.4
WARNING: Skipping pyarrow as it is not installed.
Found existing installation: pyspark 3.3.1
Uninstalling pyspark-3.3.1:
Would remove:
/opt/hostedtoolcache/Python/3.8.16/x64/bin/beeline
/opt/hostedtoolcache/Python/3.8.16/x64/bin/beeline.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/docker-image-tool.sh
/opt/hostedtoolcache/Python/3.8.16/x64/bin/find-spark-home
/opt/hostedtoolcache/Python/3.8.16/x64/bin/find-spark-home.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/find_spark_home.py
/opt/hostedtoolcache/Python/3.8.16/x64/bin/load-spark-env.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/load-spark-env.sh
/opt/hostedtoolcache/Python/3.8.16/x64/bin/pyspark
/opt/hostedtoolcache/Python/3.8.16/x64/bin/pyspark.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/pyspark2.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/run-example
/opt/hostedtoolcache/Python/3.8.16/x64/bin/run-example.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-class
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-class.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-class2.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-shell
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-shell.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-shell2.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-sql
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-sql.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-sql2.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-submit
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-submit.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/spark-submit2.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/sparkR
/opt/hostedtoolcache/Python/3.8.16/x64/bin/sparkR.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/bin/sparkR2.cmd
/opt/hostedtoolcache/Python/3.8.16/x64/lib/python3.8/site-packages/pyspark-3.3.1.dist-info/*
/opt/hostedtoolcache/Python/3.8.16/x64/lib/python3.8/site-packages/pyspark/*
It has been around 4 hours and I do not see any progress. I tried stopping the job and restarting but it does not help. Any advice will be appreciated. Thanks
pip uninstall -y pyspark seems to do the trick. Without -y, pip uninstall waits for an interactive Proceed (Y/n)? confirmation that never arrives on a non-interactive build agent, so the step hangs forever. It works now.
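A sketch of the fixed script step; the -y flag auto-confirms each uninstall, and everything else is unchanged from the pipeline above:

pip install --upgrade --force-reinstall pip setuptools wheel
pip uninstall -y pyarrow
pip uninstall -y pyspark
pip install -I azure-cli==2.18.0
pip install -I databricks-connect==9.1.30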

Can't find dependencies when deploying function in Google Cloud Build

So I'm trying to create a Google Cloud Function that imports a Python package called pdftotext. In order to pip install pdftotext you have to install some system dependencies, i.e.:
sudo apt install build-essential libpoppler-cpp-dev pkg-config python3-dev
My solution is to create a requirements.txt and a cloudbuild.yml file that I upload to Google Source Repositories, and then use a Cloud Build trigger that listens to the repo and deploys the function when something is pushed.
My cloudbuild.yml file looks like this:
steps:
  # Install OS Dependencies
  - name: "docker.io/library/python:3.9"
    id: "OS Dependencies"
    entrypoint: bash
    args:
      - '-c'
      - |
        apt-get update
        apt-get install -y build-essential libpoppler-cpp-dev pkg-config python3-dev
        apt-get install -y pip
        pip3 install -t /workspace/lib -r requirements.txt

  # Deploy Function
  - name: "gcr.io/cloud-builders/gcloud"
    id: "Deploy Function"
    args:
      [
        "functions",
        "deploy",
        "pdf_handler",
        "--entry-point",
        "main",
        "--source",
        ".",
        "--runtime",
        "python39",
        "--memory",
        "256MB",
        "--service-account",
        "my_service_account",
        "--trigger-http",
        "--timeout",
        "540",
        "--region",
        "europe-west1",
      ]

options:
  logging: CLOUD_LOGGING_ONLY
The trigger tries to deploy the function, but I keep getting this error even though I installed the OS dependencies:
"Deploy Function": pdftotext.cpp:3:10: fatal error: poppler/cpp/poppler-document.h: No such file or directory
It seems like the function deployment can't find the location where the dependencies are installed.
I've tried installing and deploying in the same step but still get the same error.
Any advice is appreciated.
Thanks in advance!
When you deploy with Cloud Functions, ONLY your code is taken and packaged (in a container) by the service.
During that packaging, another Cloud Build is invoked to build the container (with Buildpacks.io) and then deploy it. That build doesn't care which APT packages you installed in your own Cloud Build environment; however, the /workspace/lib directory you populated is uploaded along with your source.
You should update the requirements.txt of the Cloud Functions code you deploy to point at that lib directory, so that pip doesn't look for the external package (and its compilation requirements) again.
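One way that advice is sometimes put into practice; an untested sketch, and the wheel filename below is purely illustrative. Build wheels into lib/ in the first step instead of an installed tree, then have the deployed function's requirements.txt reference the wheel, so pip installs the prebuilt artifact rather than compiling from source:

# In the "OS Dependencies" step: build wheels instead of installing
pip3 wheel -w /workspace/lib -r requirements.txt

# requirements.txt of the deployed function (hypothetical filename):
./lib/pdftotext-2.2.2-cp39-cp39-linux_x86_64.whl

Note this only removes the build-time compilation; the poppler shared libraries still have to be available in the function's runtime for the import to succeed.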

Is it possible to install postgresql-dev in bullseye

I am using rust:1.54-bullseye as my base image. When I run my app it shows this error: ./reddwarf_music: error while loading shared libraries: libpq.so.5: cannot open shared object file: No such file or directory. Searching the internet, someone suggested that installing postgresql-devel would fix this problem, so I tried to install it in the base image using this command:
RUN apt install postgresql-devel
but it tells me:
Reading package lists...
Building dependency tree...
Reading state information...
E: Unable to locate package postgresql-devel
The command '/bin/sh -c apt install postgresql-devel' returned a non-zero code: 100
exit status 100
Error: exit status 100
What should I do to install this package in rust:1.54-bullseye? I have already tried this:
RUN apt-get update && apt-get install postgresql-devel
but installing postgresql-devel still failed; the package could not be found.
This command installs the libpq.so.5 dependency and works:
RUN apt-get update && apt-get install postgresql -y
but the full postgresql package is far larger than needed. Since the app only needs the shared library at runtime, the better way is to install just libpq5:
RUN apt-get update && apt-get install libpq5 -y
(postgresql-devel is a package name from RPM-based distros; the Debian build-time equivalent is libpq-dev, but for a runtime-only dependency libpq5 is enough.)
Perhaps the best option is to use docker-compose with a better segregation of responsibilities.
docker-compose.yml
version: '2'

services:
  app:
    image: rust:1.54-bullseye
    depends_on:
      - db
    environment:
      - DB_HOSTNAME=db
      - DB_PASSWORD=${POSTGRES_PASSWORD}
    volumes:
      - bullseye-data:/var/lib/bullseye/db
    networks:
      - bullseye
  db:
    image: postgres
    environment:
      - PGDATA=/var/lib/postgresql/data/pgdata
      - POSTGRES_DB=bullseye
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
    volumes:
      - bullseye-database:/var/lib/postgresql/data
    networks:
      - bullseye

volumes:
  bullseye-database:
  bullseye-data:

networks:
  bullseye:
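To try it out (a minimal sketch; the password value is a placeholder): export the variable the compose file interpolates, then bring both services up:

export POSTGRES_PASSWORD=changeme  # placeholder value
docker-compose up -d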

pip install Pillow on Alpine for iPad freezes

I am trying to install the Pillow pip package on Alpine for iPad via iSH, for light debugging of a Django project.
Since iSH already ships with apk, some dependencies can be installed as follows:
apk update
apk upgrade
apk add python3
apk add py3-pip bash curl python3-dev gcc zlib-dev libffi-dev postgresql-dev musl-dev jpeg-dev
pip3 install virtualenv
and once the virtualenv is activated:
pip3 install Pillow
The command freezes while building the wheel for the package and never finishes installing it. Is it a size limitation (42.5 MB for this package), or are there missing dependencies I did not see?
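One way to sidestep the wheel build entirely (an untested sketch under iSH): Alpine ships a prebuilt Pillow package, and a virtualenv created with --system-site-packages can see it:

# install Alpine's prebuilt Pillow instead of compiling it under
# iSH's slow x86 emulation
apk add py3-pillow
# recreate the virtualenv so it can see the system package
virtualenv --system-site-packages venv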

Ansible and docker-compose pull / up -d

I'm trying to run these commands:
docker-compose pull
docker-compose up -d
docker-compose -f other_file.yaml pull
docker-compose -f other_file.yaml up -d
Here's my Ansible code for this specific task (note the Jinja expression must be quoted, otherwise the YAML is invalid):
- name: Run docker-compose
  docker_compose:
    project_src: "{{ my_project_path }}"
    files:
      - docker-compose.yaml
      - other_file.yaml
I'm getting the error below:
Failed to import the required Python library (Docker SDK for Python: docker (Python >= 2.7) or docker-py (Python 2.6)) on managed's Python /usr/bin/python3.
Please read module documentation and install in the appropriate location.
If the required library is installed, but Ansible is using the wrong Python interpreter, please consult the documentation on ansible_python_interpreter, for example via `pip install docker` or `pip install docker-py` (Python 2.6).
The error was: No module named 'docker'
The thing is, the Python interpreter is set up in ansible.cfg as /usr/bin/python3, which is the correct one.
The version of python3 installed is 3.6.9 and the python module "docker" is installed.
Any idea where this error comes from? I've been reading documentation and other posts all day.
Thanks !
Finally understood why the problem occurred.
I was installing the Python library with pip3 install <lib>. The thing is, that will not work if you're using sudo to run some modules in Ansible, because sudo pip3 installs into a different environment than pip3 on its own.
So, quick solution: sudo pip3 install docker docker-compose
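A slightly more robust variant of that fix (a sketch): invoking pip through the exact interpreter configured in ansible.cfg guarantees the SDK lands where Ansible will look for it:

# install the Docker SDK for the same interpreter Ansible uses
sudo /usr/bin/python3 -m pip install docker docker-compose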