I currently have TravisCI building on PRs in a public GitHub repo.
The instructions for Coveralls say to put this in a .coveralls.yml file:
service_name: travis-pro
repo_token: <my_token>
That doesn't work for me because the .coveralls.yml file would be public, since it's checked into GitHub. My Travis CI is integrated with my GitHub repo, wired to a branch, and fires on PRs.
So I tried this:
In TravisCI's site I set an environment var:
COVERALLS_REPO_TOKEN to my token's value.
Then modded my .travis.yml to look like this:
language: scala
scala:
  - 2.11.7
notifications:
  email:
    recipients:
      - me@my_email.com
jdk:
  - oraclejdk8
script: "sbt clean coverage test"
after_success: "sbt coverageReport coveralls"
script:
  - sbt clean coverage test coverageReport &&
    sbt coverageAggregate
after_success:
  - sbt coveralls
Now when I create a PR on the branch, this runs fine with no errors, and I see output in Travis's console showing that the coverage test ran and generated files. But when I go to Coveralls I see nothing: "There have been no builds for this repo."
How can I set this up?
EDIT: I also tried creating a .coveralls.yml with just service_name: travis-ci
No dice, sadly.
Step 1 - Enable Coveralls
The first thing to do is to enable Coveralls for your repository.
You can do that on their website http://coveralls.io:
go to http://coveralls.io
sign in with your GitHub credentials
click on "Repositories", then "Add Repo"
if the repo isn't listed yet, then "Sync GitHub Repos"
finally, flip the "enable coveralls" switch to "On"
Step 2 - Setup Travis-CI to push the coverage infos to Coveralls
Your .travis.yml file contains multiple entries for the script and after_success sections. So, let's clean that up a bit:
language: scala
scala: 2.11.7
jdk: oraclejdk8
script: "sbt clean coverage test"
after_success: "sbt coveralls"
notifications:
  email:
    recipients:
      - me@my_email.com
Now, when you push, the commands in the script section are executed.
This is where your coverage data is generated.
When the commands finish successfully, the after_success section is executed.
This is where the coverage data is pushed to Coveralls.
The .coveralls.yml config file
Two things to know about the .coveralls.yml file:
public Travis-CI repos do not need this config file, since Coveralls can get the information via their API (via access token exchange)
the repo_token (found on the repo page on Coveralls) is only needed for private repos and should be kept secret. If you publish it, anyone could submit coverage data for your repo.
It boils down to this: you need the file in only two cases:
to specify a custom location for the files containing the coverage data
or when you are using Travis-Pro and private repositories. Then you have to configure "travis-pro" and add the token:
service_name: travis-pro
repo_token: ...
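For the first case (a custom coverage-file location), the exact keys depend on the Coveralls client for your language, so treat the following as a hypothetical sketch rather than a reference (php-coveralls, for instance, reads a coverage_clover key pointing at the Clover XML report):

service_name: travis-ci
coverage_clover: build/logs/clover.xml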
I thought it might be helpful to explain how to set this up for PHP, given that the question applies to essentially any language that Coveralls supports (and not just Scala).
The process is particularly elusive for PHP because the PHP link on Travis-CI's website points to a password-protected page on Coveralls' site that provides no way to log in with GitHub, unlike the main Coveralls site.
Equally confusing is that the primary PHP page on Coveralls' site contains overly-complicated instructions that depend on yet another library called atoum/atoum (which looks to be defunct) and are anything but complete.
What ended up working perfectly for me is https://github.com/php-coveralls/php-coveralls/ . The documentation is very thorough, but it boils down to this:
Enable Coveralls for your repository (see Step 1 in the Accepted Answer).
Ensure that xdebug is installed and enabled in PHP within your Travis-CI build environment (it should be by default), which is required for code-coverage support in PHPUnit.
Add phpunit and the php-coveralls libraries to the project with Composer:
composer require phpunit/phpunit php-coveralls/php-coveralls
Update .travis.yml at the root of the project to include the following directives:
script:
  - mkdir -p build/logs
  - vendor/bin/phpunit tests --coverage-clover build/logs/clover.xml
after_success:
  - travis_retry php vendor/bin/php-coveralls
Create .coveralls.yml at the root of the project and populate it with:
service_name: travis-ci
I'm not positive that this step is necessary for public repositories (the Accepted Answer implies that it's not), but the php-coveralls documentation says of this directive (emphasis mine):
service_name: Allows you to specify where Coveralls should look to find additional information about your builds. This can be any string, but using travis-ci or travis-pro will allow Coveralls to fetch branch data, comment on pull requests, and more.
Push the above changes to the remote repository on GitHub and trigger a Travis-CI build (if you don't already have hooks to make it happen automatically).
Slap a Coveralls code-coverage badge in your README (or wherever else you'd like). The required markup may be found on the Coveralls page for the repository in question, in the Badge column.
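For reference, the badge markup Coveralls generates is a Markdown image link along these lines (with <user> and <repo> as placeholders for your own account and repository):

[![Coverage Status](https://coveralls.io/repos/github/<user>/<repo>/badge.svg?branch=master)](https://coveralls.io/github/<user>/<repo>?branch=master)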
Related
I'm trying to have GitHub trigger a GitLab pipeline on PRs.
In practice, when a developer creates a PR in GitHub, his/her code gets tested by a GitLab pipeline.
I'm trying to follow this user guide: https://docs.gitlab.com/ee/ci/ci_cd_for_external_repos/github_integration.html
and we have a Silver account, but it won't work. When the PR is created, the GitLab pipeline is not triggered.
Anyone with this kind of experience who can help?
Thanks
Joe
I've found the cause of the issue.
In order for GitHub to trigger GitLab CI/CD, particularly on pull requests, you need to have a Silver/Premium account AND, very important, be the root owner.
In any other case, you won't see GitHub in the integration list on GitLab. The people at GitLab had the brilliant idea to hide it instead of showing it disabled (which would have been a tip-off that you needed an upgraded license).
This isn't explained in the video in the documentation.
Firstly, you need to share the content of your .gitlab-ci.yml file. In your question you asked about GitHub, but you're following GitLab documentation, which is completely different. Both use git commands to commit and push repos, but GitHub and GitLab are different platforms.
For GitHub pipelines, you need to create a repository, then go to Actions. GitHub will propose that you configure a .github/workflows directory containing a .yaml file, and in that .yaml file you code your pipelines (a minimal sketch follows below). Depending on your project, GitHub will offer several Linux machines with the adequate configuration to run your files (if it's a Java project you'll be offered Maven machines, Python gets Python machines, React/Angular get machines with npm installed, plus Docker and Kubernetes for deployments...), and you're limited to 4 private projects as far as I know (check this last information).
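As a minimal sketch of such a workflow file (the file name, job name, and build command here are illustrative, not prescriptive):

name: CI
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: make test  # replace with your project's build/test command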
For GitLab you have two options. You can use preconfigured machines like GitHub's, which you call by adding, for example, a tag npm in your .gitlab-ci.yml file to get a machine with npm installed, but you need to pay for that. Or you can configure your own runners by following the GitLab documentation and the gitlab-runner commands (which is the best option; see the registration sketch below), but you'll need capable machines and servers to run npm, mvn, python3, and similar commands.
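If you go the self-hosted route, runners are registered with the gitlab-runner CLI; a rough sketch, where the URL, token, and tag list are placeholders you'd take from your project's CI/CD settings:

gitlab-runner register \
  --url https://gitlab.com/ \
  --registration-token <project-registration-token> \
  --executor shell \
  --tag-list "shell,linux"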
Finally, to answer your question, here is an example .gitlab-ci.yml file with two simple stages, build and test. The only keyword specifies that these pipelines run when there is a merge request (I use the preconfigured GitLab machines as a sample here). More details are in my Python GitHub project https://github.com/mehdimaaref7/Scrapping-Sentiment-Analysis and in the GitLab runner docs https://docs.gitlab.com/runner/
stages:
  - build
  - test

build:
  tags:
    - shell
    - linux
  stage: build
  script:
    - echo "Building"
    - mkdir build
    - touch build/info.txt
  artifacts:
    paths:
      - build/
  only:
    - merge_requests

test:
  tags:
    - shell
    - linux
  stage: test
  script:
    - echo "Testing"
    - test -f "build/info.txt"
  only:
    - merge_requests
I just got a GitHub account and am writing small scripts in Python, which I am learning.
While adding my code to GitHub I noticed there is an option to run tests/validation on my code, but mine is empty.
I googled around and found that lint and black are good checks.
I found this Action that I want to add - https://github.com/marketplace/actions/python-quality-and-format-checker
There is a "script" and a "config" that I think I need to add/update somewhere. Also, when I click "Use latest version" it tells me to add the code into some .yml file.
Can anyone assist me in installing this Action or point me in the right direction? Also, how can I use this Action on all my repositories/code?
=======================================
EDIT:
This link has the instructions - https://help.github.com/en/actions/configuring-and-managing-workflows/configuring-a-workflow
place the .yaml or .yml file in this directory -> .github/workflows
For this Action: https://github.com/marketplace/actions/python-quality-and-format-checker
the code inside the file will look like this:
on: [push, pull_request]
name: Python Linting
jobs:
  PythonLinting:
    name: Python linting
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - name: Konstruktoid Python linting
        uses: konstruktoid/action-pylint@master
thanks to: bertrand martel
pylint is part of the new GitHub Super Linter (github/super-linter):
Introducing GitHub Super Linter: one linter to rule them all
The Super Linter is a source code repository that is packaged into a Docker container and called by GitHub Actions. This allows for any repository on GitHub.com to call the Super Linter and start utilizing its benefits.
When you’ve set your repository to start running this action, any time you open a pull request, it will start linting the code base and return via the Status API.
It will let you know if any of your code changes passed successfully, or if any errors were detected, where they are, and what they are.
This then allows the developer to go back to their branch, fix any issues, and create a new push to the open pull request.
At that point, the Super Linter will run again and validate the updated code and repeat the process.
And you can set it up to only lint new files if you want.
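A minimal workflow for it, adapted from the github/super-linter README, might look like the sketch below; the VALIDATE_ALL_CODEBASE flag is what restricts linting to new/changed files, and the version tag is illustrative:

name: Lint Code Base
on: [push, pull_request]
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: github/super-linter@v3
        env:
          VALIDATE_ALL_CODEBASE: false
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}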
Update August 2020:
github/super-linter issue 226 has been closed with PR 593:
This PR will add:
Black python linting
Updated tests
I'm having some trouble setting up AppVeyor. I'd like to publish the generated Web Deploy packages to the AppVeyor artifact feed. I've selected to build Web Deploy packages in appveyor.yml:
build:
  project: Apps/MyProject.sln
  publish_wap: true
I can see from the logs that the two Web Deploy packages get produced:
[00:00:24] Package "Backend.zip" is successfully created as single file at the following location:
[00:00:24] file:///C:/Users/appveyor/AppData/Local/Temp/1/cul57h0ak9
I can push these packages to GitHub releases by simply referring to them by filename:
deploy:
  - provider: GitHub
    tag: v$(appveyor_build_version)
    auth_token:
      secure: stuff
    artifact: api.zip, backend.zip
    force_update: false
    on:
      DEPLOY: true
However, I'm unable to publish these packages to the AppVeyor artifact feed because, unlike "deployments", it seems I'm required to know the exact path of the artifact(s). AppVeyor seems to use a temp folder when it generates these, so it's pretty hopeless to know the path. I could traverse the build agent user's temp directory looking for them, but that seems a bit hacky to me.
So, my question is: how do I reliably tell AppVeyor to send my generated zips to the artifact feed?
(Note that I know I can configure a "publish target" in Visual Studio and use that instead, but as far as I can understand, the whole idea behind the "publish_wap" option is to not have to do that for every project. I'm trying to achieve a clean separation of concerns so that no build-specific config has to be included inside my MSBuild projects.)
Turns out AppVeyor auto-publishes any generated artifacts, and now I feel stupid.
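For anyone who does want explicit control, appveyor.yml also accepts an artifacts section with wildcard paths; a sketch, where the name is illustrative:

artifacts:
  - path: '**\*.zip'
    name: WebDeployPackages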
I have generated documentation for my project with cargo doc, and it is generated in the target/doc directory. I want to allow users to view this documentation without a local copy, but I cannot figure out how to push this documentation to the gh-pages branch of the repository. Travis CI would help me automatically do this, but I cannot get it to work either. I followed this guide, and set up a .travis.yml file and a deploy.sh script. According to the build logs, everything goes fine, but the gh-pages branch never gets updated. My operating system is Windows 7.
It is better to use travis-cargo, which is intended to simplify deploying docs and which also has other features. Its readme provides an example .travis.yml file, although in the simplest form it could look like this:
language: rust
sudo: false
rust:
  - nightly
  - beta
  - stable
before_script:
  - pip install 'travis-cargo<0.2' --user && export PATH=$HOME/.local/bin:$PATH
script:
  - |
    travis-cargo build &&
    travis-cargo test &&
    travis-cargo --only beta doc
after_success:
  - travis-cargo --only beta doc-upload
# needed to prevent travis-cargo from passing `--features nightly` when building with the nightly compiler
env:
  global:
    - TRAVIS_CARGO_NIGHTLY_FEATURE=""
It is very self-descriptive, so it is obvious, for example, what to do if you want to use another Rust release train for building docs.
In order for the above .travis.yml to work, you need to set your GH_TOKEN somehow. There are basically two ways to do it: inside .travis.yml via an encrypted string, or by configuring it in Travis itself, in the project options. I prefer the latter, so I don't need to install the travis command-line tool or pollute my .travis.yml (which is why the above config file does not contain a secure option), but you may choose otherwise.
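If you do choose the encrypted-string route, the travis CLI can generate and append the value for you; a sketch, with the token value as a placeholder:

gem install travis
travis encrypt GH_TOKEN=<your-github-token> --add env.global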
I want to automate a student assignment grading system as much as possible. Ideally these steps will be taken when an assignment is submitted.
Student forks my GitHub repository and modifies files
Student pushes the local code to his repository and creates a pull request
Travis CI detects the pull request and runs a Pull Request build
If the code builds successfully, Coverity runs static code analysis on the pull request
Student gets the build status from the GitHub pull request page
I've successfully set up Travis builds for every pull request in my repo. I have successfully run a Coverity scan via Travis for every commit to my repo. But I can't trigger Coverity scans for pull requests; only Travis builds are run. Can I fix this problem and maintain a Coverity scan report for every pull request?
This is my .travis.yml
language: c
compiler: gcc
env:
  global:
    # The next declaration is the encrypted COVERITY_SCAN_TOKEN, created
    # via the "travis encrypt" command using the project repo's public key
    - secure: "WHkT1bLbpz8VA8tl+qyZvWHLg7YvnMPhCNXCEAQQaklcDq8HQ7glIrrs35VnTDfs09tVgkPbgsAfwBuwxqkmmxWaquW0AHdb6cefNpQVj2ovUriQVNBFmjfte9Bbq0NWKoLp+4IY/3IDfLoUOekOIDXuQtkJhNvX1zkkt21lSeo="
addons:
  coverity_scan:
    project:
      name: "Freeuni-CN101-2014/midterm"
      description: "Build submitted via Travis CI"
      notification_email: example@mail.com
    build_command_prepend: ""
    build_command: "make"
    branch_pattern: "*"
script: make
Travis output of pull request here
Travis output after I merged the pull request with the main branch here
I asked Coverity support and they replied:
The trigger for Coverity Scan happens for the specific branch and not
for the pull request, and specifically for the branch that is mentioned in
.travis.yml
UPDATE
With user @Admaster's help I started playing with Jenkins and the cppcheck plugin. Jenkins scans pull requests successfully, but without setting the build status on GitHub commits (Travis does set it).
Example
So I continued experimenting with Travis and came across this repo. I changed my .travis.yml file so that it looks like this:
language: c
compiler: gcc
before_install:
  - sudo apt-get install -qq cppcheck
script:
  - cppcheck --error-exitcode=1 --quiet .
  - make
cppcheck may be less effective than Coverity, but it's sufficient for students' assignments.
I suggest not using Coverity, because the free account has a lot of limits.
It is better to use Jenkins.
I will try to make a configuration especially for you.
Jenkins supports pull requests on GitHub (see the sketch below).
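As a rough illustration of that direction (not a drop-in config), a declarative Jenkinsfile running the same cppcheck-plus-make combination from the update above could look like:

pipeline {
    agent any
    stages {
        stage('Static analysis') {
            steps {
                // fail the build on any cppcheck error
                sh 'cppcheck --error-exitcode=1 --quiet .'
            }
        }
        stage('Build') {
            steps {
                sh 'make'
            }
        }
    }
}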