Bitbucket Pipelines: Upload from a subdirectory to staging / production server - deployment

I am trying to upload my project using Bitbucket's pipeline service, and it's working fine. However, I only need to upload the files from a specific subdirectory.
My directory structure is as follows:
Repository:
- Appz
  - Android
  - iOS
- Designs
  - Appz
  - Web
- Web
  - Html
  - Laravel
I need to upload the files from Repository/Web only (not from any other directory), but the pipeline service is uploading the entire repository to the server.
bitbucket-pipelines.yml
image: php:7.2
pipelines:
  branches:
    master:
      - step:
          script:
            - apt-get update && apt-get install -y unzip git-ftp
            - export PROJECT_NAME=Web/
            - git-ftp init --user $FTP_USERNAME --passwd $FTP_PASSWORD ftp://domain/path

I found the solution. The only command that requires modification is the git-ftp command. I also found that the export command has nothing to do here, so I removed it and the command still worked as I require.
Here is how it goes:
- apt-get update && apt-get install -y unzip git-ftp
- git-ftp init --syncroot Web --user $FTP_USERNAME --passwd $FTP_PASSWORD ftp://domain/path
The --syncroot <PATH/TO/DIRECTORY> parameter is all it takes to set the source directory from which the pipeline service fetches and uploads files.
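For completeness, the full bitbucket-pipelines.yml with the fix applied might look like this (a sketch based on the original file above; the FTP URL and credential variables are the same placeholders as in the question):

```yaml
image: php:7.2
pipelines:
  branches:
    master:
      - step:
          script:
            # unzip is no longer strictly needed here; git-ftp does the upload
            - apt-get update && apt-get install -y git-ftp
            # --syncroot restricts the upload to the Web/ subdirectory
            - git-ftp init --syncroot Web --user $FTP_USERNAME --passwd $FTP_PASSWORD ftp://domain/path
```

Note that subsequent deployments would typically use git-ftp push with the same --syncroot argument, since init is only for the first upload.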
I hope this helps.
Thank you.

Related

Can't find dependencies when deploying function in google cloud build

So I'm trying to create a Google Cloud Function that imports a Python package called pdftotext. In order to pip install pdftotext you have to install some system dependencies, i.e.:
sudo apt install build-essential libpoppler-cpp-dev pkg-config python3-dev
My solution is to create a requirements.txt and a cloudbuild.yml file that I upload to Google Source Repositories, and then use a Cloud Build trigger that listens to the repo and deploys the function when something is pushed to it.
My cloudbuild.yml file looks like this:
steps:
  # Install OS Dependencies
  - name: "docker.io/library/python:3.9"
    id: "OS Dependencies"
    entrypoint: bash
    args:
      - '-c'
      - |
        apt-get update
        apt-get install -y build-essential libpoppler-cpp-dev pkg-config python3-dev
        apt-get install -y pip
        pip3 install -t /workspace/lib -r requirements.txt

  # Deploy Function
  - name: "gcr.io/cloud-builders/gcloud"
    id: "Deploy Function"
    args:
      [
        "functions",
        "deploy",
        "pdf_handler",
        "--entry-point",
        "main",
        "--source",
        ".",
        "--runtime",
        "python39",
        "--memory",
        "256MB",
        "--service-account",
        "my_service_account",
        "--trigger-http",
        "--timeout",
        "540",
        "--region",
        "europe-west1",
      ]

options:
  logging: CLOUD_LOGGING_ONLY
The trigger tries to deploy the function, but I keep getting this error even though I installed the OS dependencies:
"Deploy Function": pdftotext.cpp:3:10: fatal error: poppler/cpp/poppler-document.h: No such file or directory
It seems like the function deployment can't find the location where the dependencies are installed.
I've tried installing and deploying in the same step, but I still get the same error.
Any advice is appreciated.
Thanks in advance!
When you deploy with Cloud Functions, ONLY your code is taken and packaged (in a container) by the service.
During the packaging, another Cloud Build is called to build that container (with Buildpacks.io) and then to deploy it. That deployment doesn't care that you installed some APT packages in your build environment, but your /lib directory is uploaded to that new Cloud Build.
You should update the requirements.txt of the Cloud Functions code that you deploy to point to the /lib directory, to prevent pip from looking for the external package (and its compilation requirements).
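A related common pattern for consuming a vendored lib/ directory like the one built by the first step (a sketch, not the answer's exact method: it assumes the built packages were shipped alongside the function source and that pdftotext is removed from the deployed requirements.txt so pip doesn't try to compile it again) is to put that directory on the import path at the top of main.py:

```python
import os
import sys

# Put the vendored dependencies first on the import path so they take
# precedence over anything in site-packages.
LIB_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "lib")
sys.path.insert(0, LIB_DIR)

# After this, `import pdftotext` resolves from lib/ rather than triggering
# a source build (which is what fails without the poppler headers).
```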

Angular CLI deployment on OpenShift 3

I'm trying to deploy an Angular CLI project to OpenShift 3. It continuously fails the build with "Generic Build failure" and no further info in the log. Can anyone please walk me through the process if I'm doing it wrong? Also, is there a way to deploy the compiled dist folder and avoid the build process, or what is the best practice? Thank you in advance.
here are my scripts:
package.json
server.js
The approach I use is to create a Jenkins pipeline whose build step does the following:
npm install -d
npm install --save classlist.js
$(npm bin)/ng build --prod --build-optimizer
rm -rf node_modules
oc start-build angular-5-example --from-dir=. --follow
You can see that the final step kicks off a binary build in OpenShift, passing the contents of the current directory (minus node_modules, which is not needed and rather large). This binary build simply copies the dist folder output of ng build into an nginx base image, plus some configuration files:
FROM nginx:1.13.3-alpine
## Copy our nginx config
COPY nginx/ /etc/nginx/conf.d/
## Remove default nginx website
RUN rm -rf /usr/share/nginx/html/*
## copy over the artifacts in dist folder to default nginx public folder
COPY dist/ /usr/share/nginx/html
EXPOSE 8080
CMD ["nginx", "-g", "daemon off;"]
A fully working example application that shows how an Angular CLI generated project can be deployed to OpenShift 3 can be found at
https://github.com/petenorth/angular-5-example
The application is an Angular 5 app.

Gitlab Pages + Doxygen + Graphviz creates graphs with corrupted characters

I'm using GitLab Pages to host a Doxygen-created API reference for my project. I also leverage Graphviz to create dependency graphs. I use this CI script to install the packages and build the documentation:
pages:
  stage: build
  image: alpine
  script:
    - apk update && apk add doxygen
    - apk add graphviz
    - doxygen doxy/dox_config
    - mv docs/html/ public/
  artifacts:
    paths:
      - public
  only:
    - master
  dependencies: []
The CI script runs without any errors, other than a Doxygen complaint that it can't find LaTeX and dvips, neither of which should affect the Graphviz pictures. My graphs look like the following:
I'm not really sure what the problem is or how to fix it. Why are all the characters wrong?
It turns out the issue is with the Docker image used. Alpine doesn't include the correct fonts, but Debian-based images have all the prerequisites. While there is almost certainly a way to install the fonts on Alpine, I just switched to a Debian-based Docker image. Here is a working YAML script:
pages:
  stage: build
  image: ubuntu:trusty
  script:
    - export DEBIAN_FRONTEND=noninteractive
    - apt-get -yq update
    - apt-get -yq install graphviz
    - apt-get -yq install doxygen
    - doxygen doxy/dox_config
    - mv docs/html/ public/
  artifacts:
    paths:
      - public
Installing either the ttf-freefont or the ttf-ubuntu-font-family package will fix the problem. Here is my Dockerfile:
FROM alpine:3.6
RUN apk --update add \
      doxygen \
      graphviz \
      ttf-freefont \
    && rm -rf /var/cache/apk/*
ttf-ubuntu-font-family is a narrower font, so your boxes will become a bit smaller.
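If you prefer to stay on Alpine in CI rather than building a custom image, the same font fix can be applied directly in the Pages job. This is a sketch combining the original Alpine job with the ttf-freefont package from the Dockerfile:

```yaml
pages:
  stage: build
  image: alpine:3.6
  script:
    # ttf-freefont provides the fonts Graphviz needs to render labels
    - apk update && apk add doxygen graphviz ttf-freefont
    - doxygen doxy/dox_config
    - mv docs/html/ public/
  artifacts:
    paths:
      - public
  only:
    - master
```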

TravisCI / Coverity: Warning - No files were emitted

I have a medium size github repository for which I configured Travis-CI/Coverity tools. About a month ago my setup had worked just fine: Travis compiled and built my application, and then performed the Coverity scan and I could see the results on my Coverity page.
However, lately, the Coverity analysis stopped working. I looked through the Travis log files and compared to the old logs when the builds were successful and that's what I found:
At the end of the log, the failing version contains the following warning:
[WARNING] No files were emitted. This may be due to a problem with your configuration or because no files were actually compiled by your build command.
Please make sure you have configured the compilers actually used in the compilation.
For more details, please look at: /home/travis/build/name/repo-name/build/cov-int/build-log.txt
Extracting SCM data for 0 files...
...
So, the Travis builds are passing, but nothing is generated for Coverity. I checked my Travis config file and it is identical to the commits when the Coverity builds were successful.
For the sake of experiment, I cloned my project repository, rolled back to the version when the builds were successful, and set up Travis/Coverity for it. And guess what? Same warning! So, the identical setup that worked in the past (about 35 days ago) does not work anymore. Therefore, I conclude that something has changed on Travis's side, since it no longer generates certain files.
I was wondering if anyone has encountered this issue, and what it could be about. Are there some Travis settings I need to change?
Some additional info: I use CMake to build my project, and it has two dependencies: Qt and OpenSceneGraph (which I have to install for Travis).
This is the approximate content of the .travis.yml on my coverity_scan branch:
language: cpp
os: linux
compiler: gcc
sudo: required
dist: trusty

addons:
  apt:
    packages:
      - cmake
      - g++-4.8
  coverity_scan:
    project:
      name: "name/project"
      description: "Build submitted via Travis CI"
    notification_email: email#domain.com
    build_command: "make -j2 VERBOSE=1"
    branch_pattern: coverity_scan

env:
  global:
    - PROJECT_SOURCE=${TRAVIS_BUILD_DIR}/src/
    - PROJECT_BUILD=${TRAVIS_BUILD_DIR}/build/
    # The next declaration is the encrypted COVERITY_SCAN_TOKEN, created
    # via the "travis encrypt" command using the project repo's public key
    - secure: "...secure..."

before_install:
  # download Qt
  # ...
  # download OpenSceneGraph
  # ...
  # imitate x server
  - export DISPLAY=:99.0
  - /sbin/start-stop-daemon --start --quiet --pidfile /tmp/custom_xvfb_99.pid --make-pidfile --background --exec /usr/bin/Xvfb -- :99 -ac -screen 0 1280x1024x16
  - sleep 3

install:
  # install Qt
  - sudo apt-get --yes install qt55base qt55imageformats qt55svg
  # compiler
  - export CXX="g++-4.8"
  - export CC="gcc-4.8"
  # install OpenSceneGraph
  # ...

before_script:
  # Qt location
  # ...
  # OpenSceneGraph variables
  # ...
  # create build folder
  - mkdir $PROJECT_BUILD
  - cd $PROJECT_BUILD
  # cmake command
  - cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_PREFIX_PATH=/opt/qt54/lib/cmake -DProject_BUILD_TEST=ON -DProject_VERSION=0.0.0 $PROJECT_SOURCE

script:
  - if [[ "${COVERITY_SCAN_BRANCH}" == 1 ]];
    then
      echo "Don't build on coverity_scan branch.";
      exit 0;
    fi
  # compile everything, if not coverity branch
  - make -j2
  # run unit tests
  # ...
After some research and looking through existing examples, I finally made it work. To fix the warning, and therefore make sure files are emitted for the analysis, it is necessary to explicitly specify the compiler binary (updated according to the comment). In my .travis.yml I had to add a build_command_prepend before the build_command of the coverity_scan add-on. The final look of that block is as follows:
# ...
coverity_scan:
  project:
    name: "name/project"
    description: "Build submitted via Travis CI"
  notification_email: name#domain.com
  # ! have to specify the binary (updated, thanks to Caleb)
  build_command_prepend: "cov-configure --comptype gcc --compiler gcc-4.8 --template"
  build_command: "make VERBOSE=1"
  branch_pattern: coverity_scan
# ...

Unauthorized response from GitHub API on AppVeyor

We just started with a new project and are trying to get CI working via AppVeyor.
It is an Aurelia web application, so we need jspm on the build server.
On my workstation I configured jspm manually, as suggested by @guybedford in his answer below, and configured my auth token in the appveyor.yml script:
- jspm config registries.github.auth %JSPM_GITHUB_AUTH_TOKEN%
Currently my appveyor.yml looks like this, based on the "Auto configuring" section of the jspm docs:
version: 1.0.{build}
os: Visual Studio 2015
build:
  verbosity: detailed
environment:
  JSPM_GITHUB_AUTH_TOKEN: #token from jspm registry export github (locally)#
install:
  - ps: Set-Culture nl-NL
  - ps: Install-Product node $env:nodejs_version
  - cd src\Web
  - npm uninstall jspm -g
  - npm install -g jspm
  - npm install -g gulp
  - npm install
  - jspm config registries.github.auth %JSPM_GITHUB_AUTH_TOKEN%
  - jspm config registries.github.maxRepoSize 0
  - jspm registry export github # output to see what the registry looks like
  - jspm install -y
  - gulp build
  - cd ..\..
nuget:
  account_feed: true
before_build:
  - dnvm install -r clr -arch x86 1.0.0-rc1-update1
  - dnu restore
  - nuget restore
The jspm install -y command fails with the error: Unauthorized response for GitHub API.
How do I configure GitHub credentials properly with jspm on AppVeyor?
It is best to take this token from jspm registry export github after configuring the credentials locally, in order to use the exact same algorithm as jspm instead of doing a manual encoding.
If you really want manual encoding, the auth token actually takes the value of new Buffer(encodeURIComponent(username) + ':' + encodeURIComponent(password)).toString('base64').
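That manual encoding can be sketched in Node as follows (using Buffer.from, the modern replacement for the deprecated new Buffer constructor; the credentials shown are placeholders):

```javascript
// Build the jspm GitHub auth token by hand: base64 of
// "encodeURIComponent(username) + ':' + encodeURIComponent(password)".
function jspmAuthToken(username, password) {
  const pair = encodeURIComponent(username) + ':' + encodeURIComponent(password);
  return Buffer.from(pair).toString('base64');
}

console.log(jspmAuthToken('user', 'pass')); // dXNlcjpwYXNz
```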
After contact with the AppVeyor team we figured out that the Node version was the problem.
Installing the stable version of Node works like a charm:
- ps: Install-Product node stable