Can I preserve a local Ivy repository in Bluemix BUILD & DEPLOY?

Bluemix BUILD & DEPLOY seems to provide a new virtual machine for every build. The local Ivy repository (~/.ivy2) is cleared, so every build downloads the dependency jars all over again, which takes too long. Is there any way to avoid this?

You're right: the pipeline spins up a new virtual machine for each build. Your only real option is likely to include those dependencies in the build input.
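If you go that route, you can point Ivy at a dependency cache checked in alongside the project instead of ~/.ivy2. A minimal sketch, assuming an Ant + Ivy build where a populated ivy-cache directory has been committed with the sources (the directory name and build target are placeholders, and this relies on Ivy picking up the property override from the JVM system properties):

    # Redirect Ivy's user dir into the build input so jars committed
    # under ivy-cache/ are reused instead of re-downloaded on each VM.
    ant -Divy.default.ivy.user.dir="$PWD/ivy-cache" build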

Related

Skip plugin downloading in Terraform

I'm using a self-hosted agent in Azure Pipelines, and I installed Terraform 0.13 on it. When I use the Terraform tasks in Azure DevOps, I set commandOptions to '-plugin-dir=/usr/local/bin/.terraform.d/plugins' to skip plugin downloading. Unfortunately, Terraform still downloads the plugins into the artifact, making it much heavier than it should be. The next stage (the deployment stage) also uses only the plugins from the artifact, not the ones on our agent.
We do not have much space on our virtual machine, which is why we want to avoid unnecessary downloads.
In addition, we defined a .terraformrc in the home directory with the plugin directory, and we added the environment variable described here:
https://www.terraform.io/docs/commands/cli-config.html#provider-installation
Thank you in advance!
You can try setting the -get-plugins=false option.
-get-plugins=false — Skips plugin installation. Terraform will use plugins installed in the user plugins directory, and any plugins already installed for the current working directory. If the installed plugins aren't sufficient for the configuration, init fails.
This is stated in this document.
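For example, a sketch of an init step that relies only on pre-installed plugins (the directory is the one from the question):

    # Skip plugin installation and resolve providers only from the given
    # directory; init fails if the installed plugins aren't sufficient.
    terraform init -get-plugins=false -plugin-dir=/usr/local/bin/.terraform.d/plugins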
Eventually I did it another way: I used the Cache task from Azure Pipelines. Here's a solution from ITNext:
https://itnext.io/infrastructure-as-code-iac-with-terraform-azure-devops-f8cd022a3341
This is how I configured the Cache task: describe the keys well and choose the right path.
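As a rough sketch of the wiring (the directory name is a placeholder, and exposing the cached path to Terraform via TF_PLUGIN_CACHE_DIR is my assumption about how the restored directory gets reused):

    # Point Terraform's provider cache at the directory the Cache task
    # restores and saves; init then reuses providers across builds.
    export TF_PLUGIN_CACHE_DIR="$PIPELINE_WORKSPACE/.terraform-plugin-cache"
    mkdir -p "$TF_PLUGIN_CACHE_DIR"
    terraform init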

Only build the project that has changed

I have a single .NET solution with multiple class library projects, each of which is published as a NuGet package using Azure DevOps.
Then I have an Azure DevOps build pipeline with steps to Restore, Build, Publish, Pack, and Push.
The first 4 steps are set up for **/*.csproj, and the last pushes $(Build.ArtifactStagingDirectory)/*.nupkg to the target feed.
I have everything set up and working, except that if you make a change to just one project, it builds ALL the projects because of the **/*.csproj pattern.
This is no good for NuGet packages, as it increments every project's version number, and they all appear as having an update available in the NuGet package manager.
My Question: Is there a way to do this so that only the project(s) with changes go through the process?
Is there a way to do this so that only the project(s) with changes go through the process?
The answer is yes.
The solution is to use a private agent to build your solution instead of the hosted agent.
That's because the hosted agent assigned to each build is a clean machine, so VS/MSBuild rebuilds all the projects matched by **/*.csproj. To get incremental builds, we must preserve the outputs of the previous build.
So, to resolve this issue, set up a private agent and do not clean its working directory before the build runs:
Set the Clean option to false under Get sources.
Note: since you also set a wildcard pattern for the NuGet push task, an unmodified project will push the same version to the feed and the command will fail with a conflict error, so you need to enable Allow duplicates to be skipped on the NuGet push task.
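If you drive the push from a script instead of the task, the CLI equivalent is the --skip-duplicate flag (available in newer NuGet/dotnet tooling; the feed URL and key variable are placeholders):

    # Push every produced package; versions that already exist on the
    # feed are skipped instead of failing with a conflict error.
    dotnet nuget push "$BUILD_ARTIFACTSTAGINGDIRECTORY/*.nupkg" \
      --source "https://pkgs.dev.azure.com/yourorg/_packaging/yourfeed/nuget/v3/index.json" \
      --api-key "$NUGET_API_KEY" \
      --skip-duplicate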
Hope this helps.

Building multiple Gradle projects in Jenkins with AWS CodePipeline

I have a Gradle project that consists of a master project and 2 others that are included using the includeFlat directive. Each of these 3 projects has its own repo on GitHub. To build, I check out all 3 projects into a common top folder, then cd into the master project and run gradle build. And it works great!
Now I need to deploy the resulting app to AWS EB (Elastic Beanstalk), which also works great when I produce the artifact locally and then deploy it manually. I want to automate the process, so I'm trying to set it up using CodePipeline + Jenkins as described in this document, adjusted for Gradle.
The problem is that if I specify 3 Sources in the pipeline, I end up with my projects extracted on top of each other, creating a mess in the Jenkins workspace. I need to somehow configure each project to be output to its own directory within the Jenkins workspace, and I just don't see a way to do it (at least in the UI).
Then, of course, even if I achieve what I want, I still need to cd into the master directory to run gradle build, and again I'm not sure how to do that.
P.S. Great suggestions from @Phil, but unfortunately it seems that CodePipeline does not currently support Git submodules or subtrees.
I would start a common build whenever changes happen in any of the 3 repos, with, say, a 5-minute delay, so you get a single build even if changes are introduced to more than one repo.
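A sketch of what that single build job could run, assuming each repo is cloned into its own subdirectory of the workspace (all names and URLs are placeholders):

    # Recreate the local layout: three sibling checkouts, then build
    # from the master project so includeFlat can find the siblings.
    git clone https://github.com/yourorg/master-project.git
    git clone https://github.com/yourorg/sub-project-a.git
    git clone https://github.com/yourorg/sub-project-b.git
    cd master-project && gradle build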
I can't see a better way to deal with deployment than using eb deploy, the old way. Install the AWS tools on your Jenkins machine, create a deployment job triggered on a successful build, and put a bash script that does the deployment there. Please share more details about your deployment so I can help with the deployment script.
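A minimal sketch of such a script, assuming the EB CLI is installed on the Jenkins machine and the environment name is a placeholder:

    # Deploy the freshly built artifact to Elastic Beanstalk, labeling
    # the application version with the Jenkins build number.
    cd master-project
    eb deploy my-app-env --label "build-$BUILD_NUMBER"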

Built-in Octopus Deploy repository

We're using TeamCity for CI builds and Octopus Deploy for deployment.
We want to use the built-in Octopus Deploy repository for storing artifacts instead of the TeamCity repo. What are the differences between them?
Can you help me evaluate the built-in Octopus repository: pros/cons, and any complications you might face?
Thanks.
One of the key differences is that TeamCity can be used as an externally accessible NuGet server, but Octopus Deploy can't expose any packages it knows about. If you're building components in TeamCity that are exposed as NuGet packages and reused within applications then Octopus Deploy won't be able to handle that scenario.
If you're just building applications and exposing them for Octopus Deploy, then my advice would be to push them to Octopus Deploy to manage; otherwise you end up duplicating disk space, as there'll be a copy of the package in TeamCity and a copy in Octopus Deploy once it has been downloaded from the TeamCity NuGet feed.
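Pushing a package into the built-in repository from the build is a one-liner with the Octopus CLI; a sketch, with the server URL and package name as placeholders:

    # Upload the packaged application to the Octopus built-in feed.
    octo push --package MyApp.1.2.3.nupkg \
      --server https://octopus.example.com \
      --apiKey "$OCTOPUS_API_KEY"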
Hope this helps.
The built-in Octopus Deploy repository allows you to automatically create and deploy a release as soon as the application is packaged and published (usually during a server build). This is great if you want to schedule nightly builds so that your development/test/integration environment is always up to date.
External package repositories cannot be used to automatically create releases; only the built-in package repository is supported.
It also maintains packages through a retention policy so you don't have to worry about running out of disk space.
We use two NuGet repositories: one for application packages deployed through Octopus Deploy, and one for shared reusable components, served with NuGet.Server.

Bamboo Deployment project to Artifactory

I've been researching this for a while but can't find an answer.
I use Bamboo 5.3 with the Artifactory plugin 1.6.2. I have a build project that generates a .war and two .zips. I also have a Bamboo Deployment project that creates releases with these three files and deploys to DEV, QA, and so on.
For the build project I am able to use the Artifactory plugin; that's fine. The problem is that I end up with a lot of artifacts if I publish all the builds. I would like to publish to Artifactory only the files from the releases, so that it happens less often and people see only the 3-4 release attempts, not the 150 builds.
My issue is that when creating my Deployment tasks (like download, copy, call SSH script...), there is no 'Artifactory Generic Deploy' task like there is among the build project tasks.
I see there is a new Bamboo 5.4 with some improvements around the deployment process; maybe this could help?
Support for deployment tasks from Bamboo to Artifactory will be available starting with version 1.8.0 of the Artifactory plugin.
Here is the Jira issue.
I faced a similar issue. Hopefully the next release of the Artifactory plugin will integrate with deployment projects.
If you are willing to use Maven to broker the deployment, deploy-file can get the job done.
In the deployment project, after your artifact download task, add a Maven 3.x task for each artifact you want to send.
You'll need to specify a build JDK, and for the environment variables I'm using MAVEN_OPTS="-DskipTests=true -XX:MaxPermSize=4096m"
For the actual Maven command:

    deploy:deploy-file
    -Durl=http://${bamboo.artifactory_username}:${bamboo.artifactory_password}@${bamboo.artifactory_url}/artifactory/${bamboo.destinationRepo}
    -DrepositoryId=localhost
    -Dfile=${bamboo.pathToArtifact}/${bamboo.artifactName}-${bamboo.majorVersion}.${bamboo.minorVersion}.${bamboo.artifactExtension}
    -DgroupId=${bamboo.artifactGroup}
    -DartifactId=${bamboo.artifactName}
    -Dversion=${bamboo.majorVersion}.${bamboo.minorVersion}
    -Dpackaging=${bamboo.artifactExtension}
    -DgeneratePom=true
Hope this helps!
The Artifactory REST API is quite usable for this purpose. You can deploy directly using curl in a shell script.
See https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API for the details.
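For example, a deployment task could upload each release file with an HTTP PUT; a sketch, with the repository path and credentials as placeholders:

    # PUT the release artifact into an Artifactory repository path.
    curl -u "$ARTIFACTORY_USER:$ARTIFACTORY_PASSWORD" \
      -T myapp-1.2.0.war \
      "https://artifactory.example.com/artifactory/releases-local/com/example/myapp/1.2.0/myapp-1.2.0.war"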