CloudBees Folder Plugin: jobs in copied folders must be saved before they can be run

I'm having a look at the CloudBees Jenkins Folder Plugin. It looks like it will serve my purposes in allowing me to easily copy groups of jobs, but...
When I copy a folder containing some jobs I have to click through Configure / Save for each job in the new folder before the Build option is available. Is this expected behavior?
thanks
Glenn

Yes, when a job is copied it is marked temporarily unbuildable until you save its configuration. This is essential in case the origin job was configured with a build trigger (like building on a schedule): you would not want the copied job to start running until you had a chance to check its configuration and perhaps make changes.

Related

Copying files and deploying to Azure without building using Visual Studio Team Services

I'm attempting to deploy a web site to Azure using VSTS. Basically, I commit code to the Git repo and have it set up to run CI, so it begins building as soon as I commit. However, once it hits the release section, it never copies the code to the Azure web app; rather, it gives me this line:
Info: Updating file ({projectname}\error.txt).
It doesn't copy the files I changed, but rather always just copies this file. I checked and there is indeed an error.txt file in my website directory in Azure, but it is always blank.
This build/deploy process isn't "standard", because the build step only downloads from source control; it doesn't build anything, since the website is a "web site" rather than a "web application", meaning it doesn't need to be built.
So my build step is as follows:
Get Sources
Run on Agent - this step is empty
So the idea is that it just downloads everything from source control; that's it.
Then, my release step is as follows:
Artifacts are from the build step above
deploy to environment 1 (dev)
Azure app service deploy, using "package or folder" as $(System.DefaultWorkingDirectory)/
Any idea what I might be doing wrong here?
So I actually figured this out and will leave this here in case anyone else needs it.
I admit I'm pretty new to the Azure/VSTS world, so maybe someone else is making my mistake as well.
If you don't need to "build" your project, then don't. I resolved it by simply skipping the build step altogether. What I was really after was to just download the files from source control and deploy them as-is.
In your release editor, you can specify which "artifact" you want to use to release, and one of the options is source control, which is what I did.
This would be useful for websites like mine where you don't need to build them (mine is DNN/DotNetNuke, so you don't build it before deploying).

Building multiple Gradle projects in Jenkins with AWS CodePipeline

I have a Gradle project that consists of a master project and two others that are included using the includeFlat directive. Each of these three projects has its own repo on GitHub. To build, I check out all three projects into a common top folder, then cd into the master project and run gradle build. And it works great!
Now I need to deploy the resulting app to AWS EB (Elastic Beanstalk), which also works great when I produce the artifact locally and then deploy it manually. I want to automate the process, so I'm trying to set it up using CodePipeline + Jenkins as described in this document, adjusted for Gradle.
The problem is that if I specify 3 Sources in the pipeline, I end up with my projects extracted on top of each other, creating a mess in the Jenkins workspace. I need to somehow configure each project to be output to its own directory within the Jenkins workspace, and I just don't see a way to do it (at least in the UI).
Then, of course, even if I achieve what I want, I still need to somehow cd into the master directory to run gradle build, and again I'm not sure how to do that.
P.S. Great suggestions from @Phil, but unfortunately it seems that CodePipeline does not currently support Git submodules or subtrees.
I would start a common build whenever changes happen on any of the 3 repos, with, say, a 5-minute delay, so you get a single build even if changes are introduced to more than one repo.
I can't see a good way to deal with deployment other than using eb deploy... the old way. Please install the AWS tools on your Jenkins machine, create a deployment job triggered on a successful build, and put a bash script doing the deployment there. Please share more details about your deployment; that way I can help with the deployment script.
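To make that concrete, here is a minimal sketch of what such a checkout-build-deploy job could run. It is written as a Windows "Execute Windows batch command" step to match the other scripts on this page (on a Linux agent the same commands, with # comments, would go in the bash script mentioned above). The repository URLs, folder names, and EB environment name are placeholders, not from the thread, and it assumes git, Gradle, and the AWS EB CLI are installed, with eb init already run once in the master project:
:: Recreate the local layout from the question: all three projects checked
:: out side by side under a common top folder (here, the job workspace).
:: Assumes a clean workspace; on repeated runs, update instead of cloning.
cd /d "%WORKSPACE%"
git clone https://github.com/your-org/master-project.git
git clone https://github.com/your-org/included-project-a.git
git clone https://github.com/your-org/included-project-b.git
:: Build from the master project so its includeFlat directives can resolve
:: the two sibling projects.
cd master-project
call gradle build
:: Deploy the built artifact to the Elastic Beanstalk environment.
eb deploy your-eb-environment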

Deploy build files from continuous integration

I am working on a project with multiple people: a website application which requires webpack to build, uglify, and concatenate assets into a few files, e.g. app.min.js, style.min.css, etc. As a result, in an effort to prevent merge conflicts, we recently added the build folder to .gitignore, under the assumption that we would be able to build during deployment.
When pushing to the Master branch, we automatically "deploy" through Semaphore CI (similar to Travis), which runs composer install, npm install, and finally npm run build, which triggers the webpack build. This is all built and then tested on the CI side of things, and then Semaphore automatically deploys to Amazon's Elastic Beanstalk, where our application is hosted.
The problem with this is that it seems Semaphore doesn't upload the build it has just tested, but rather the Master branch itself, which has no built JS or CSS. I'm wondering if there's a way to push these built files to deployment as well, or if running the entire build process AGAIN on Elastic Beanstalk is the only route. It seems unnecessary to have to run that process essentially three times: locally, on CI, and then on deployment. Every time a step like this is needed on EB, the actual re-instantiation time gets longer, which I'd like to keep as short as possible.
Obviously if building it a 3rd time on EB is the only way to go about this then I'll have to, just wondering if there are better solutions for this whole workflow.
I haven't worked with Semaphore CI, but you might be able to use an .ebignore file.
If you create one, the EB CLI will use it instead of your .gitignore file.
I find in some deployment situations you want the inverse of your .gitignore (all compiled, no src). It essentially lets you pick the files from your project directory that you want to deploy, in the same way as the .gitignore file.
Edit: I just noticed the documentation on AWS is lacking. It only mentions file exclusion, but you can include files too.
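For illustration, a hypothetical .ebignore (it follows the same syntax as .gitignore) that inverts the usual pattern and deploys only the compiled output; the folder names are placeholders:
# Ignore everything in the project directory...
*
# ...then re-include the compiled output and the EB configuration.
!build
!build/**
!.ebextensions
!.ebextensions/**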
Edit 2: I don't think Semaphore supports the use of .ebignore, so right now this solution isn't of any use. :(
I just had a great first experience with https://deploybot.com/. They can deploy directly to Elastic Beanstalk. It might be interesting for you.

Execute SVN Update in Jenkins - Copy a Folder to Web root Explicitly from SVN as a Build Step

I'm new to Jenkins CI. I'm trying to run an SVN update of myFolder inside a job as a build step. I want to explicitly copy some files to the web root, as I can't have them inside my solution.
Build Steps I need to perform.
Build Solution
Publish
Copy myFolder to web root
Sync
Up to Publish it works fine. The problem is when trying to copy/update myFolder to the web root.
myFolder is located outside the project solution folder, as I can't have it inside the solution folder.
Note: myFolder has serialized items/objects that I need to sync in the next step. It should be copied to the web root in order to sync.
And this folder is committed to SVN.
In my local CMD the following batch file works fine, but when I run it in a Jenkins "Execute Windows batch command" step it stops at:
-- Updating source from SVN
-- Running update...
#echo off
cls
echo -- Initiating system instance variables...
echo. -- Setting the variables...
:: Here you need to make some changes to suit your system.
set SOURCE=C:\inetpub\wwwroot\Test\Website\App_Data\myFolder\
set SVN=C:\Program Files\TortoiseSVN\bin
:: Unless you want to modify the script, this is enough.
echo. %SOURCE%
echo. %SVN%
echo. ++ Done setting variables.
echo.
echo -- Updating source from SVN
echo. -- Running update...
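:: Note: TortoiseProc is a GUI client; when Jenkins runs as a service with no
:: interactive desktop session, its progress dialog cannot appear, which is
:: likely why the step hangs at this point.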
"%SVN%\TortoiseProc.exe" /command:update /path:"%SOURCE%" /closeonend:1
echo. ++ Done.
echo. -- Cleaning up...
set SOURCE=
set SVN=
echo. ++ Done.
I have the Subversion plugin installed. Is there any solution for this problem?
I also tried using the PowerShell script below:
#Get checkout folder
TortoiseProc.exe /command:"update" /path:"C:\inetpub\wwwroot\Test\Website\App_Data\myFolder\"
It works in my local Windows PowerShell but not in the Jenkins Windows PowerShell step.
In an effort to help answer your question, I will explain the configuration of a job which should accommodate what you are trying to achieve: building a project under version control after an svn update has been performed and moving the generated files to a separate directory.
Set up the Source Code Management section
Within this section in your job's configuration page, choose the appropriate version control system (i.e., Subversion) and point the job to your project's repository URL. Also be mindful to select the appropriate check-out strategy. This is what Jenkins will use when your job runs (i.e., svn update), as Jenkins will store a copy of your repository on the build server in the job's workspace.
Without proceeding any further, this job will, when it runs, only pull down changes from your repository using the check-out strategy configured above.
However, you would like the Jenkins job to actually do something meaningful when it runs, such as build/publish your project. This is achieved through build steps, so let's configure build steps.
Configure the appropriate build step(s)
Build/Publish Website Locally
Assuming you have a script already written to build/publish the website under version control (let's call it !Publish Website.bat as an example), which builds the project and publishes it locally, you can configure the step underneath the Build section as follows.
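As a rough sketch, the batch command for that build step might be nothing more than the following (the script name is the placeholder from above):
call "%WORKSPACE%\!Publish Website.bat"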
Note: %WORKSPACE% is a built-in environment variable which resolves to the current workspace of the job. There is a link under the build step that lists all the environment variables available for use.
Without proceeding any further, the job will now pull down any changes and execute the batch file to publish/build a website locally within your workspace when this job runs.
We're not quite done, considering you wish these newly generated files to reside within your website's webroot folder so the changes are reflected on your website. For simplicity's sake, we can go ahead and add another build step to perform the copy.
Copy Contents to Webroot
Assuming you have a script already written to copy the contents of the website under version control (let's call it !Copy Website.bat), which takes the published files and copies them to the appropriate directory on your webserver, you can configure the step underneath the Build section as follows.
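Again as a sketch, a hypothetical !Copy Website.bat might look like this (the publish and webroot paths are placeholders):
:: Mirror the published output from the workspace into the site's webroot.
robocopy "%WORKSPACE%\publish" "C:\inetpub\wwwroot\Test\Website" /E
:: robocopy exit codes below 8 indicate success; reset the exit code so the
:: Jenkins batch step is not marked as failed.
if %ERRORLEVEL% LSS 8 (exit /b 0)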
Now when the job runs, it will perform an svn update against the repository in its local workspace and execute the preceding build steps (i.e., build/publish the solution and copy the contents to your webroot).

Build Workspace mapping

I have two solutions sitting at the same location. These two solutions share some of the projects, along with some dedicated ones.
I have created two separate build definitions with a gated check-in trigger, but the issue is that when I make any change in one solution it triggers both build definitions.
Can I somehow control the triggering of the build definitions based on the solution that I am checking in?
You need to configure your workspace correctly for this to work. Any change in a build definition's mapped workspace will cause a build to trigger. Because of this, it depends entirely on your source control layout whether it's possible to set up a build that only triggers when something changes that belongs to one solution or the other.
This setup will become very hard to manage quite quickly; as such, I recommend you put each set of projects in its own subfolder, which makes this a lot easier.
To ensure that your build definitions won't both trigger, open the Source Settings panel of your build definition and apply a cloak rule to each file or folder by changing "Active" in the first column to "Cloaked".
To cloak a file you need to enter its full path in TFS; the UI will only offer you a folder picker, but entering a path to a file will work.
These files should:
not be needed to build the solution, and
changes to them should not trigger the build.
Do note that the cloak will cause Team Build not to fetch these files onto the build agent, so it's not possible to have files your build depends on without also triggering the build when those files change.
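For example, the Source Settings of the build definition for the first solution might end up looking like this (the server paths are hypothetical, including a cloaked file path entered by hand as described above):
Active    $/TeamProject
Cloaked   $/TeamProject/ProjectOnlyUsedBySolutionB
Cloaked   $/TeamProject/SolutionB.sln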
You should create gated check-in build definitions per project, not per solution.