Sonar + GitHub integration

I want to enable Sonar with Git, but is it necessary to first pull the project from the Git repository using Hudson (or something else) and then have Sonar analyse the code periodically on Hudson? In other words, are these the right steps?
1. Pull the project from Git using Hudson.
2. Sonar on Hudson analyses the code and sends the updates?
Or can we use Git + Sonar directly? How does that work? Can anybody guide me to get this working?

Yes, you first need to pull your project from GitHub and then launch a Sonar analysis on your local copy (Sonar needs the files to exist on the file system to be able to analyse them).
So you can pull your project manually or, obviously, using a CI server like Jenkins/Hudson.
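For example, once the repository has been cloned locally (manually or by a Jenkins/Hudson job), a minimal analysis could look like the sketch below. The project key, name and source directory are placeholder values, and this assumes the standalone SonarQube Scanner (sonar-scanner, formerly sonar-runner) is installed and the server runs on the default http://localhost:9000:

    # sonar-project.properties at the root of the local clone (values are examples)
    sonar.projectKey=org.example:myproject
    sonar.projectName=My Project
    sonar.projectVersion=1.0
    sonar.sources=src
    sonar.host.url=http://localhost:9000

Then run sonar-scanner from that directory; in Jenkins/Hudson this would typically be a build step that runs right after the checkout.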

The good news: yesterday (2015-07-08) SonarQube launched a GitHub plugin. Every time a pull request is submitted, the CI system launches a SonarQube preview analysis.
Reference:
http://www.sonarqube.org/github-pull-request-analysis-helps-fix-the-leak/
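For reference, a preview analysis of a pull request with that GitHub plugin is driven by analysis properties roughly like the following (property names come from the plugin's documentation; the repository, pull request number and token values are placeholders, and exact names may vary by plugin version):

    sonar-runner \
      -Dsonar.analysis.mode=preview \
      -Dsonar.github.pullRequest=$PULL_REQUEST_NUMBER \
      -Dsonar.github.repository=myorg/myproject \
      -Dsonar.github.oauth=$GITHUB_OAUTH_TOKEN

In preview mode nothing is pushed to the SonarQube database; the plugin only comments on the pull request.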

Related

SonarQube GitHub project analysis

I have a repository in my GitHub account and I want to analyse it with SonarQube after each commit.
I put the repository URL in my sonar-scanner properties:
sonar.sources=https://github.com/rahma/JavaTest, but it does not work.
Any idea about this, please?
Depending on the nature (confidential or public) of your project, you could use a GitHub Action like SonarSource/sonarcloud-github-action.
That way, on each push, you would scan your code with SonarCloud.io.
But if you have a local SonarQube instance running, then you need the Developer Edition, and you should check that your GitHub credentials are correct.
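As a rough sketch of the SonarCloud route (file name and secrets are assumptions; note that sonar.sources must point at local paths inside the checked-out repository, e.g. sonar.sources=., never at a GitHub URL), a workflow in .github/workflows/sonar.yml could look like:

    name: SonarCloud analysis
    on:
      push:
        branches: [master]
      pull_request:
    jobs:
      sonarcloud:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v2
            with:
              fetch-depth: 0   # full history gives better blame/new-code data
          - uses: SonarSource/sonarcloud-github-action@master
            env:
              GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
              SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

The action reads the project key, organization and sources from a sonar-project.properties at the repository root, and SONAR_TOKEN is a token generated on SonarCloud.io and stored as a repository secret.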

VSTS/Azure DevOps: enabling Continuous Integration on a pipeline with source from Bitbucket fails with an error

I have created a pipeline in VSTS/Azure DevOps. It gets its sources from a repository in Bitbucket. Queueing a build works fine; it builds and the tests succeed.
Now I want a build to run on every commit to the repository on Bitbucket. However, when I edit the pipeline, enable 'Continuous Integration' in the Triggers tab, and click 'Save', I get the following error:
Unable to configure a service on the selected Bitbucket repository. Bitbucket returned the error 'Forbidden: '.
I am confused that I get 'Forbidden' while getting the source code already works.
What am I doing wrong? Is there something I must configure in VSTS/Azure DevOps or in Bitbucket?
Answering my own question:
It turned out that in Bitbucket I only had 'Writer' rights on the repository. When we changed this to 'Administrator', enabling Continuous Integration worked, and we verified that committing a code change triggered the build.
Good news / bad news.
It looks like, for now, you can configure a pipeline without being a Bitbucket admin on the repo... but not using the templates.
So you can build an empty pipeline based on a Bitbucket repo (no admin access) and manually add each of the tasks.
Based on further tests: what you cannot do is set the Continuous Integration trigger, because that requires admin access to set up the webhooks.
I know this is not what you want... but at least there is a way to end up with a working pipeline.
Regards,
Jose

Publish try code in Jenkins before committing

I have this development environment with Eclipse as IDE, SVN as SCM and Jenkins as CI server.
Is there a way I could start a Jenkins job from Eclipse and tell Jenkins somehow to take some code from my Eclipse workspace instead of the SVN? Without committing that code into SVN?
I know how to do the first part (start a job via Mylyn / Builds), but not the second one...
Maybe something like the way TeamCity is integrated into IntelliJ IDEA, and the way they have facilitated gated commits...
In my opinion, using a local workspace with a continuous integration tool is not a good idea. In a standard configuration, Jenkins runs on a server machine, not on your local machine. I think the best practice for your scenario is to use an SVN branch for test commits: configure a Jenkins job that checks out the code from that branch, add an SVN hook so Jenkins builds after each commit (see the hook sketch below), and merge the branch into trunk once Jenkins has built it successfully.
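As a rough sketch of that hook (the Jenkins URL is a placeholder; this assumes the Jenkins Subversion plugin, whose notifyCommit endpoint tells Jenkins to poll every job watching this repository), a post-commit hook on the SVN server could look like:

    #!/bin/sh
    # hooks/post-commit on the SVN server
    REPOS="$1"
    REV="$2"
    UUID=$(svnlook uuid "$REPOS")
    # Notify Jenkins so jobs watching this repository (e.g. the try branch) poll and build
    wget \
      --header "Content-Type:text/plain;charset=UTF-8" \
      --post-data "$(svnlook changed --revision "$REV" "$REPOS")" \
      --output-document "-" \
      --timeout=2 \
      "http://jenkins.example.com/subversion/${UUID}/notifyCommit?rev=$REV"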
You can do almost anything with Jenkins.
Building code from your local machine in Jenkins is not a good idea, though.
If you want to achieve this anyway, you can have Jenkins watch a specific folder for changes and start a build from there (a rough sketch follows).
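One rough workaround along those lines (all names here are hypothetical: a shared folder the Jenkins machine can read, a job called try-build with 'Trigger builds remotely' enabled, and a user API token) is to copy the workspace over and then kick the job through the Jenkins remote build API:

    # Copy the Eclipse workspace to the location the Jenkins job builds from
    rsync -az --delete ~/workspace/myproject/ build-host:/srv/try-area/myproject/
    # Trigger the job via its remote build trigger
    curl -X POST --user alice:API_TOKEN \
      "http://jenkins.example.com/job/try-build/build?token=TRY_TOKEN"

This side-steps SVN entirely, which is exactly why the branch-based approach above is usually the better option.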

Build an open source project from GitHub (not mine) with a CI

There is an open source project (https://github.com/firebase/firebase-jobdispatcher-android) which I would like to build using Travis/CircleCI or another cloud CI. However, those CIs don't let you access repos that are not yours.
I didn't try it, but I have a hunch that I also won't be able to set up a webhook to get notified when that repo's 'master' branch is updated.
Why not fork? Because then I would somehow need to update my forked repo manually or with a cron server! That defeats the point of building an open source repo...
Why do I want to build it continually? Because they do not upload their .aar output to Maven Central or JCenter, and I don't want to put the .aars in my project and keep updating them all the time; it bloats the repo...
In any case, I don't get it: there's an open source project, the repo exists and is open to everyone, and pulling the data and getting webhooks doesn't compromise that repo in any way. Why isn't this possible?
If I'm mistaken and a webhook is possible, how can I set up a build that ends with uploading to Maven Central (probably via a Gradle plugin; I have an account and would be happy to have a public copy there)?
(I also thought of some kind of free microservice plus a Docker-based CI that I can pull and build from; I don't mind if a build takes time.)
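If nothing better turns up, the cron-style approach dismissed above can at least be sketched without a fork, e.g. with a scheduled GitHub Actions workflow in a small repository of your own (the schedule, Gradle task and Android SDK availability are assumptions; publishing to Maven Central would still need your own signing and credentials on top of this):

    name: nightly-upstream-build
    on:
      schedule:
        - cron: "0 3 * * *"   # nightly at 03:00 UTC
    jobs:
      build:
        runs-on: ubuntu-latest   # hosted runners include the Android SDK
        steps:
          - name: Clone upstream master
            run: git clone --depth 1 https://github.com/firebase/firebase-jobdispatcher-android.git upstream
          - name: Build the library (task name is a guess; use whatever the project defines)
            run: cd upstream && ./gradlew assemble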

${CHANGES} does not work in the email-ext plugin if the Jenkins job is driven by a bash script

I have set up a Jenkins job to build a project. I'm using the email-ext plugin to send out build notifications, with the intent of showing who did what and the paths to the files changed. But unfortunately I'm not getting anything. I believe the reason is that under "Source Code Management" I'm setting it to "None". The shell script I'm using to drive the build is responsible for checking out a copy of the code based on a CVS tag and running Maven to do the build. In email-ext I'm using the following syntax:
${CHANGES_SINCE_LAST_SUCCESS, reverse=true, showPaths=true,
format="\n====\nChanges for Build # %n\n%c\n",
changesFormat="\n[%r] %d %a %m %p\n"}
Same thing with CHANGES: ${CHANGES, showPaths=true}
Is there a way of getting CHANGES and CHANGES_SINCE_LAST_SUCCESS to work if None option is used under Source Code Management?
Thanks for your help folks.
The email-ext plugin gets that information from Jenkins. Since Jenkins has access to that information only via its SCM plugins, the answer is "no": you can't do it without specifying the SCM option.
There are two things you can do:
(1) Do it by hand, which with CVS, if I remember correctly, means having a working copy checked out anyway.
(2) Use the SCM checkout/update option, but store the working copy on the side without using it in the build. You'll use twice as much disk space, but nowadays disk space is not a problem.
By the way, why are you using CVS? SVN, Git, and Mercurial are all free.