Run Coverity scan for every Travis pull request build - github

I want to automate a student assignment grading system as much as possible. Ideally these steps would be taken when an assignment is submitted.
Student forks my GitHub repository and modifies files
Student pushes the local code to his repository and creates a pull request
Travis CI detects the pull request and runs a pull request build
If the code builds successfully, Coverity runs static code analysis for the pull request
Student gets the build status from the GitHub pull request page
I've successfully set up Travis builds for every pull request in my repo. I have successfully run a Coverity scan via Travis for every commit on my repo. But I can't trigger Coverity scans for pull requests; only the Travis builds run. Can I fix this problem and maintain a Coverity scan report for every pull request?
This is my .travis.yml
language: c
compiler: gcc
env:
  global:
    # The next declaration is the encrypted COVERITY_SCAN_TOKEN, created
    # via the "travis encrypt" command using the project repo's public key
    - secure: "WHkT1bLbpz8VA8tl+qyZvWHLg7YvnMPhCNXCEAQQaklcDq8HQ7glIrrs35VnTDfs09tVgkPbgsAfwBuwxqkmmxWaquW0AHdb6cefNpQVj2ovUriQVNBFmjfte9Bbq0NWKoLp+4IY/3IDfLoUOekOIDXuQtkJhNvX1zkkt21lSeo="
addons:
  coverity_scan:
    project:
      name: "Freeuni-CN101-2014/midterm"
      description: "Build submitted via Travis CI"
    notification_email: example@mail.com
    build_command_prepend: ""
    build_command: "make"
    branch_pattern: "*"
script: make
Travis output of pull request here
Travis output after I merged the pull request with the main branch here

I asked Coverity support and they replied
The trigger for Coverity Scan happens for the specific branch and not
for the pull request, and specially the branch that is mentioned in
.travis.yml
UPDATE
With user @Admaster's help I started playing with Jenkins and the cppcheck plugin. Jenkins scans pull requests successfully, but without setting the build status on GitHub commits (Travis does set it).
Example
So I continued experimenting with Travis and came across this repo. I changed my .travis.yml file to look like this:
language: c
compiler: gcc
before_install:
  - sudo apt-get install -qq cppcheck
script:
  - cppcheck --error-exitcode=1 --quiet .
  - make
cppcheck may be less effective than Coverity, but it's sufficient for students' assignments.

I suggest not using Coverity, because the free account has a lot of limits.
Better to use Jenkins.
I will try to make a configuration especially for you.
Jenkins supports pull requests on GitHub.

Related

GitHub PR doesn't trigger GitLab pipeline

I'm trying to use GitHub to trigger a GitLab pipeline on PR.
Practically, when a developer creates a PR in GitHub, his/her code gets tested against a GitLab pipeline.
I'm trying to follow this user guide: https://docs.gitlab.com/ee/ci/ci_cd_for_external_repos/github_integration.html
and we have a Silver account, but it won't work. When creating the PR, the GitLab pipeline is not triggered.
Anyone with this kind of experience who can help?
Thanks
Joe
I've found the cause of the issue.
In order for GitHub to trigger GitLab CI/CD, mostly on PRs, you need to have a Silver/Premium account AND, very importantly, be the root owner.
In any other case you won't be able to see GitHub in the integration list on GitLab. The people from GitLab had the brilliant idea to hide it instead of showing it disabled (which would have been a hint that you needed an upgraded license).
In the video above it's not explained.
Firstly, you need to give us the content of your .gitlab-ci.yaml file. In your question you asked about GitHub, but you're following GitLab documentation, which is completely different. Both use git commands to commit and push repos, but GitHub and GitLab are different.
For GitHub pipelines, you need to create a repository, then you go to Actions. GitHub will propose that you configure a .github/workflows directory which contains a file.yaml. In this .yaml file you can code your pipelines. According to your project, GitHub will propose several Linux machines with the adequate configuration to run your files (if it's a Java project you'll be proposed Maven machines, Python: Python machines, React/Angular: machines with npm installed, Docker, Kubernetes for deployments...) and you're limited to 4 private projects as far as I know (check this last piece of information).
For GitLab you have two options: you can use preconfigured machines like GitHub, and you call them by adding, for example, a tag: npm in your .gitlab-ci.yaml file to call a machine with npm installed, but you need to pay an amount of money. Or you can configure your own runners by following the GitLab documentation with gitlab commands (which is the best option), but you'll need good machines and servers to run npm, mvn, python3, ... commands.
Finally, to answer your question, this is an example of a .gitlab-ci.yaml file in your GitLab repository with two simple stages, build & test; the only statement specifies that these pipelines will run if there is a merge request (I use the preconfigured machines of GitLab as a sample here). More details in my Python GitHub project https://github.com/mehdimaaref7/Scrapping-Sentiment-Analysis and, for GitLab, https://docs.gitlab.com/runner/
stages:
  - build
  - test

build:
  tags:
    - shell
    - linux
  stage: build
  script:
    - echo "Building"
    - mkdir build
    - touch build/info.txt
  artifacts:
    paths:
      - build/
  only:
    - merge_requests

test:
  tags:
    - shell
    - linux
  stage: test
  script:
    - echo "Testing"
    - test -f "build/info.txt"
  only:
    - merge_requests

Not authorized to execute any SonarQube analysis when building a pull request from a forked repo on Travis CI

I'm setting up a project with Travis CI and SonarQube.com. Everything goes smoothly when a pull request comes out of a branch from the repository, but it fails when Travis runs a build off a pull request from a forked repository.
A build out of a PR from the repository: https://travis-ci.org/PistachoSoft/dummy-calculator/builds/162905730
A build out of a PR from a forked repository: https://travis-ci.org/PistachoSoft/dummy-calculator/builds/162892678
The repository: https://github.com/PistachoSoft/dummy-calculator
As can be seen in the build log, this is the error:
You're not authorized to execute any SonarQube analysis. Please contact your SonarQube administrator.
Things I've tried that didn't work:
Updating the sonar token.
Using an encrypted token granted by another person from the organization.
Granting 'sonar-users' and 'Anyone' the 'Execute Analysis' permission on the SonarQube project.
What can I do to fix this?
First, I draw your attention to one important point: you should not run a "standard" SonarQube analysis on PRs, otherwise your project on SonarQube.com will be "polluted" by intermediate analyses that have nothing to do with each other. Standard analyses must be executed only on the main development branch, which is usually the "master" branch. Please read the runSonarQubeAnalysis.sh file of our sample projects to see how to achieve that.
Now, why does your attempt not work? Simply because the SONAR_TOKEN environment variable (that you've set as "secure" in your YML file) will not be decoded by Travis when the PR comes "from the outside world" (i.e. when it's not a PR of your own). This is a security constraint to prevent anybody from forking your repo, updating the YML file with an echo $SONAR_TOKEN, submitting a PR and gently waiting for Travis to execute it to unveil the secured environment variable.
Analyzing "external" PRs is something that we'll soon be working on so that this is easy, straightforward and yet secure for OSS projects to benefit from this feature.
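As an added example, not part of the answer above and not SonarSource's own sample script, here is a small POSIX shell gate that encodes both points: the analysis step is skipped whenever Travis has not decrypted the secure variables (which is what happens for a PR from a fork), and the standard analysis is reserved for the main branch. The function name `sonar_gate`, the `SONAR_MAIN_BRANCH` override, the choice of `master` as the main branch and the `preview` mode for internal PRs are assumptions; the `TRAVIS_*` variables named in the comments are the ones Travis sets.

```shell
#!/bin/sh
# Added example, not from the answer above. Decides what an after_success
# step that needs SONAR_TOKEN may do in a Travis job.
#   1. Travis does not decrypt "secure:" variables for pull requests from
#      forks; it reports whether it did in TRAVIS_SECURE_ENV_VARS. Without
#      the token the step must be skipped, not run with an empty value.
#   2. A standard analysis only runs on the main branch; an internal PR
#      gets a preview-style analysis at most (assumption about the wiring).
#
# The inputs are passed as arguments so the function is testable. In a real
# job they come from Travis:
#   sonar_gate "$TRAVIS_PULL_REQUEST" "$TRAVIS_SECURE_ENV_VARS" \
#              "$TRAVIS_PULL_REQUEST_SLUG" "$TRAVIS_REPO_SLUG" "$TRAVIS_BRANCH"
# Prints one of: skip, preview, standard. The reason goes to stderr so it
# shows up in the build log without being parsed.
sonar_gate() {
    pr=$1; secure=$2; pr_slug=$3; repo_slug=$4; branch=$5
    main_branch=${SONAR_MAIN_BRANCH:-master}   # assumption: master is the main branch

    if [ -n "$pr" ] && [ "$pr" != "false" ]; then
        # pull request build
        if [ "$secure" != "true" ]; then
            echo "skip: secure variables unavailable, PR #$pr comes from outside the repo" >&2
            echo skip; return 0
        fi
        if [ -n "$pr_slug" ] && [ "$pr_slug" != "$repo_slug" ]; then
            # belt and braces: never analyse a fork's code with the project token,
            # even if a token happens to be present
            echo "skip: PR #$pr source $pr_slug is not $repo_slug" >&2
            echo skip; return 0
        fi
        echo "preview: internal PR #$pr, analysing without touching the main project state" >&2
        echo preview; return 0
    fi

    # push build
    if [ "$branch" != "$main_branch" ]; then
        echo "skip: branch $branch is not $main_branch" >&2
        echo skip; return 0
    fi
    if [ "$secure" != "true" ]; then
        echo "skip: no token available on $branch" >&2
        echo skip; return 0
    fi
    echo "standard: push build of $main_branch" >&2
    echo standard
}

# Example wiring for after_success (the scanner flags are illustrative):
#   case "$(sonar_gate "$TRAVIS_PULL_REQUEST" "$TRAVIS_SECURE_ENV_VARS" \
#                      "$TRAVIS_PULL_REQUEST_SLUG" "$TRAVIS_REPO_SLUG" "$TRAVIS_BRANCH")" in
#     standard) sonar-scanner ;;
#     preview)  sonar-scanner -Dsonar.analysis.mode=preview ;;
#     *)        echo "analysis skipped" ;;
#   esac
```

The same check belongs in front of any after_success step that needs a token (a coverage upload, a deployment): the step has to fail closed when Travis withheld the secret, because a fork's modified .travis.yml runs with the fork author's content and the only thing standing between that content and your token is Travis refusing to decrypt it.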

How can I connect Coveralls and Travis in GitHub?

I currently have TravisCI building on PRs in a public GitHub repo.
The instructions for Coveralls say to put this in a .coveralls.yml file:
service_name: travis-pro
repo_token: <my_token>
That doesn't work for me because the .coveralls.yml file would be public, checked into GitHub. My TravisCI is integrated into my GitHub repo, wired to a branch, and fires on PR.
So I tried this:
In TravisCI's site I set an environment var:
COVERALLS_REPO_TOKEN to my token's value.
Then modified my .travis.yml to look like this:
language: scala
scala:
  - 2.11.7
notifications:
  email:
    recipients:
      - me@my_email.com
jdk:
  - oraclejdk8
script: "sbt clean coverage test"
after_success: "sbt coverageReport coveralls"
script:
  - sbt clean coverage test coverageReport &&
    sbt coverageAggregate
after_success:
  - sbt coveralls
Now when I create a PR on the branch this runs ok, with no errors, and I see output in Travis' console that the coverage test ran and generated files. But when I go to Coveralls I see nothing: "There have been no builds for this repo."
How can I set this up?
EDIT: I also tried creating a .coveralls.yml with just service_name: travis-ci
No dice, sadly.
How can I set this up?
Step 1 - Enable Coveralls
The first thing to do is to enable Coveralls for your repository.
You can do that on their website http://coveralls.io:
go to http://coveralls.io
sign in with your GitHub credentials
click on "Repositories", then "Add Repo"
if the repo isn't listed, yet, then "Sync GitHub Repos"
finally, flip the "enable coveralls" switch to "On"
Step 2 - Setup Travis-CI to push the coverage infos to Coveralls
Your .travis.yml file contains multiple entries of the script and after_success sections. So, let's clean that up a bit:
language: scala
scala: 2.11.7
jdk: oraclejdk8
script: "sbt clean coverage test"
after_success: "sbt coveralls"
notifications:
  email:
    recipients:
      - me@my_email.com
Now, when you push, the commands in the script section are executed.
This is where your coverage data is generated.
When the commands finish successfully, the after_success section is executed.
This is where the coverage data is pushed to Coveralls.
The .coveralls config file
The .coveralls file is only needed in some cases:
public Travis-CI repos do not need this config file since Coveralls can get the information via their API (via access token exchange)
the repo_token (found on the repo page on Coveralls) is only needed for private repos and should be kept secret. If you publish it, then anyone could submit some coverage data for your repo.
It boils down to: you need the file only in two cases:
to specify a custom location for the files containing the coverage data
or when you are using Travis-Pro and private repositories. Then you have to configure "travis-pro" and add the token:
service_name: travis-pro
repo_token: ...
I thought it might be helpful to explain how to set this up for PHP, given that the question applies essentially to any language that Coveralls supports (and not just Lua).
The process is particularly elusive for PHP because the PHP link on Travis-CI's website points to a password-protected page on Coveralls' site that provides no means by which to login using GitHub, unlike the main Coveralls site.
Equally confusing is that the primary PHP page on Coveralls' site seems to contain overly-complicated instructions that require yet another library called atoum/atoum (which looks to be defunct) and are anything but complete.
What ended-up working perfectly for me is https://github.com/php-coveralls/php-coveralls/ . The documentation is very thorough, but it boils-down to this:
Enable Coveralls for your repository (see Step 1 in the Accepted Answer).
Ensure that xdebug is installed and enabled in PHP within your Travis-CI build environment (it should be by default), which is required for code-coverage support in PHPUnit.
Add phpunit and the php-coveralls libraries to the project with Composer:
composer require phpunit/phpunit php-coveralls/php-coveralls
Update .travis.yml at the root of the project to include the following directives:
script:
  - mkdir -p build/logs
  - vendor/bin/phpunit tests --coverage-clover build/logs/clover.xml
after_success:
  - travis_retry php vendor/bin/php-coveralls
Create .coveralls.yml at the root of the project and populate it with:
service_name: travis-ci
I'm not positive that this step is necessary for public repositories (the Accepted Answer implies that it's not), but the php-coveralls documentation says of this directive (emphasis mine):
service_name: Allows you to specify where Coveralls should look to find additional information about your builds. This can be any string, but using travis-ci or travis-pro will allow Coveralls to fetch branch data, comment on pull requests, and more.
Push the above changes to the remote repository on GitHub and trigger a Travis-CI build (if you don't already have hooks to make it happen automatically).
Slap a Coveralls code-coverage badge in your README (or wherever else you'd like). The required markup may be found on the Coveralls page for the repository in question, in the Badge column.

github plugin for SonarQube not working

Almost breaking my head over this for the last few days, but the GitHub plugin for SonarQube (v5.3) just does not seem to work.
I have my java app code in github, and have configured Jenkins to run mvn sonar:sonar goal on pull request.
The maven settings are:
clean site sonar:sonar
-Dsonar.analysis.mode=preview
-Dsonar.github.oauth=<OAUTH_TOKEN>
-Dsonar.github.repository=<ORG>/<REPO>
-Dsonar.github.pullRequest=${ghprbPullId}
-Dsonar.github.endpoint=<ENT_GITHUB_API_BASE__URI>
For sonar.analysis.mode, I tried 'issues' too.
Now I perform the following:
make change to a fork (introduce a violation as per configured quality gate)
commit and push to fork repo
Create a pull request
run the jenkins job using above configuration
The analysis is successful, and the plugin always reports that all checks have passed and the changes can be merged. I am just not able to understand why the GitHub plugin in Sonar is not able to show that a violation occurred and the checks have failed.
Now if I merge the pull request and run the Sonar analysis in publish mode on the master repo, it says the quality gate failed, and I am able to see this in the SonarQube dashboard for the project with the statement that the quality gate has failed.
What am I doing wrong here? My guess is that the GitHub plugin is not able to compare the changes in the pull request with those in the master repo and hence is not able to report the violation. How do I fix that?
Update:
If at the end, I merge the pull request to master repo and re-run the sonar analysis on the original pull request (the one that got merged), it does report the violation as comments in the Pull Request conversation. (But what is the point if sonar is going to report the violations after the pull request is merged???)

Run CI build on pull request merge in TeamCity

I have a CI build that is setup in TeamCity that will trigger when a pull request is made in BitBucket (git). It currently builds against the source branch of the pull request but it would be more meaningful if it could build the merged pull request.
My research has left me with the following possible solutions:
Script run as part of build - rather not do it this way if possible
Server/agent plugin - not found enough documentation to figure out if this is possible
Has anyone done this before in TeamCity or have suggestions on how I can achieve it?
Update: (based on John Hoerr answer)
Alternate solution: forget about TeamCity doing the merge; use BitBucket web hooks to create a merged branch like GitHub does, and follow John Hoerr's answer.
Add a Branch Specification refs/pull-requests/*/merge to the project's VCS Root. This will cause TeamCity to monitor the merged output of pull requests for the default branch.
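An added example, not part of any answer above, that reproduces the situation with a local bare repository standing in for Bitbucket: the pull request's source branch builds, the published merge ref refs/pull-requests/1/merge does not. The repository contents, the shell functions `make_server` and `build_ref`, and the toy "build" (running `sh use.sh`) are constructed for the example; the ref names are Bitbucket Server's, and the fetch refspec is what a VCS root with branch specification refs/pull-requests/*/merge amounts to on the agent.

```shell
#!/bin/sh
# Added example, not from the answers above. A bare repository stands in for
# Bitbucket; "building" a ref means running `sh use.sh` in a checkout of it.
# Constructed scenario: main renames a shell function after the PR was
# opened, while the PR adds a caller of the old name. Each side builds on
# its own; the merge of the two does not.

GIT="git -c user.name=ci -c user.email=ci@example.invalid"

# Create $1/server.git and publish main, feature and refs/pull-requests/1/merge,
# the ref Bitbucket Server keeps up to date for an open pull request.
make_server() {
    root=$1
    mkdir -p "$root" || return 1
    $GIT init -q --bare "$root/server.git" || return 1
    $GIT --git-dir="$root/server.git" symbolic-ref HEAD refs/heads/main
    $GIT clone -q "$root/server.git" "$root/work" 2>/dev/null || return 1
    (
        cd "$root/work" || exit 1
        $GIT checkout -q -B main &&
        printf 'limit() { echo 10; }\n' > lib.sh &&
        $GIT add lib.sh && $GIT commit -q -m "lib: add limit()" &&
        $GIT push -q origin main &&
        # the pull request: a new file that calls limit()
        $GIT checkout -q -b feature &&
        printf '. ./lib.sh\nlimit\n' > use.sh &&
        $GIT add use.sh && $GIT commit -q -m "feature: call limit()" &&
        $GIT push -q origin feature &&
        # main moves on after the PR was opened: the function is renamed
        $GIT checkout -q main &&
        printf 'max_items() { echo 10; }\n' > lib.sh &&
        $GIT commit -q -am "main: rename limit() to max_items()" &&
        $GIT push -q origin main &&
        # what the server publishes for PR #1: feature merged into current main
        $GIT checkout -q -b pr1-merge main &&
        $GIT merge -q --no-ff -m "Merge pull request #1" feature &&
        $GIT push -q origin HEAD:refs/pull-requests/1/merge
    )
}

# Fetch the way an agent with branch specification refs/pull-requests/*/merge
# does, then build the requested ref. $1 = server repo, $2 = ref as seen
# from the agent (origin/feature or origin/pull-requests/1/merge).
# Returns the build's exit status.
build_ref() {
    server=$1; ref=$2
    agent=$(mktemp -d) || return 1
    (
        cd "$agent" || exit 1
        $GIT clone -q "$server" checkout || exit 1
        cd checkout || exit 1
        # a plain clone only brings refs/heads/*; the merge refs need a refspec
        $GIT fetch -q origin '+refs/pull-requests/*/merge:refs/remotes/origin/pull-requests/*/merge' || exit 1
        $GIT checkout -q --detach "$ref" || exit 1
        sh use.sh >/dev/null 2>&1
    )
    status=$?
    rm -rf "$agent"
    return $status
}

# Stand-alone run: the source branch passes, the merge ref fails.
demo_root=$(mktemp -d)
make_server "$demo_root" || exit 1
if build_ref "$demo_root/server.git" origin/feature; then
    echo "refs/heads/feature builds"
fi
if build_ref "$demo_root/server.git" origin/pull-requests/1/merge; then
    echo "refs/pull-requests/1/merge builds"
else
    echo "refs/pull-requests/1/merge fails: this is what merging PR #1 would do to main"
fi
rm -rf "$demo_root"
```

The point of building the merge ref is that it is the only thing that changes when main moves: a green build of the source branch says nothing about the commit that will actually land, and the server already maintains that commit for you, so there is no need to script the merge on the build agent.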
It sounds to me like the functionality you're looking for is provided via the 'Remote Run' feature of TeamCity. This is basically a personal build with the merged sources and the target merge branch.
https://confluence.jetbrains.com/display/TCD8/Branch+Remote+Run+Trigger
"These branches are regular version control branches and TeamCity does not manage them (i.e. if you no longer need the branch you would need to delete the branch using regular version control means).
By default TeamCity triggers a personal build for the user detected in the last commit of the branch. You might also specify TeamCity user in the name of the branch. To do that use a placeholder TEAMCITY_USERNAME in the pattern and your TeamCity username in the name of the branch, for example pattern remote-run/TEAMCITY_USERNAME/* will match a branch remote-run/joe/my_feature and start a personal build for the TeamCity user joe (if such user exists)."
Then setup a custom "Pull Request Created" Webhook in Bitbucket.
https://confluence.atlassian.com/display/BITBUCKET/Tutorial%3A+Create+and+Trigger+a+Webhook
So for your particular use case with BitBucket integration, you could utilize the WebHook you create, and then have a shell / bash script (depending on your TeamCity Server OS) that runs the remote run git commands automatically, which will in turn automatically trigger the TeamCity Remote Run CI build on your server. You'll then be able to go to the TeamCity UI, +HEAD:remote-run/my_feature branch, and view the Remote Run results on a per-feature basis, and be confident in the build results of the code you merge to your main line of code.
It seems that BitBucket/Stash creates branches for pull requests under:
refs/pull-requests/*/from
You should be able to setup a remote run for that location, either by the Teamcity run-from-branch feature, or by a http post receive hook in BitBucket/Stash.
You can also use this plugin : https://github.com/ArcBees/teamcity-plugins/wiki/Configuring-Bitbucket-Pull-Requests-Plugin
(Full disclosure : I'm the main contributor :P, and I use it every day)