My team uses VSTS for CI and CD. We have over 40 repositories, each a separate project, but all of them have to run the same PowerShell script in one of their build steps.
The script is too big to keep as an inline script, so it has to live in a file. Right now I keep a copy of that file in every repository.
Problem:
Whenever I need to update the script, I end up updating it in every repository, which means over 40 edits at the moment.
There must be a better approach. Is there any way to keep the script in one single repo (a repo dedicated to holding the script) and use it within each build, so that when I need to update it I only have to update it once?
There are a few options.
My general recommendation is to publish the script as a package (NuGet or otherwise) and restore it during your application builds. This allows consumers to stay "pinned" to a known-good, known-working version, and update on a schedule that works for them.
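As a minimal sketch of the package route, assuming the script has been published to a private feed as a package called MyTeam.BuildScripts (the feed URL, package name, version, and script name are all placeholders):

steps:
# Restore a pinned version of the script package from the team feed.
- task: NuGetCommand@2
  inputs:
    command: custom
    arguments: >-
      install MyTeam.BuildScripts -Version 1.2.0
      -Source https://pkgs.dev.azure.com/myorg/_packaging/myfeed/nuget/v3/index.json
      -OutputDirectory $(Agent.TempDirectory)/packages

# Run the script shipped inside the package (nuget install drops it
# into a folder named <PackageId>.<Version>).
- powershell: >-
    & "$(Agent.TempDirectory)/packages/MyTeam.BuildScripts.1.2.0/tools/shared-build.ps1"

Bumping the pinned version in each pipeline is then a deliberate, per-team decision rather than an all-at-once change.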
Another option is to add a submodule to each repository that requires the script dependency, then initialize the submodule during the build process.
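A sketch of the submodule route, assuming the scripts repo has been added to each application repo as a submodule named build-scripts (both names are hypothetical):

steps:
# Check out the application repo and initialize its submodules so the
# shared script is available in the submodule folder.
- checkout: self
  submodules: true

# Run the shared script from the submodule folder.
- powershell: ./build-scripts/shared-build.ps1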
A third option is to turn the shared script into a VSTS build task or extension. This is extensively documented and easily located so I won't belabor the point by including instructions for doing that here.
You can add a Git repository to store your PowerShell file.
Then add a build step that fetches the file from that repository during the build and uses it.
Related
I have a PowerShell script that I want to re-use across multiple build pipelines. My question is, is there a way I can "store" or "save" my PowerShell script at the project or organization scope so that I can use it in my other build pipelines? If so, how? I can't seem to find a way to do this. It would be super handy though.
It is now possible to check out multiple repositories in one YAML pipeline. You could place your script in one repository and check it out in a pipeline of any other repository. You could then reference the script directly on the pipeline workspace.
More info here.
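A sketch of the multi-repository checkout, assuming the script lives in a repo called build-scripts in the same project (project, repo, and script names are placeholders):

resources:
  repositories:
  # Declare the repository that holds the shared script.
  - repository: scripts            # alias used by the checkout step below
    type: git                      # Azure Repos Git
    name: MyProject/build-scripts  # <project>/<repository>

steps:
# With multiple checkout steps, each repo lands in a folder named
# after the repository under $(Build.SourcesDirectory).
- checkout: self
- checkout: scripts
- powershell: $(Build.SourcesDirectory)/build-scripts/shared-build.ps1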
Depending on how big these scripts are, you can create task groups that contain PowerShell tasks with the script inline. But this only works at project scope.
Another approach I'd try is to create a repo containing your PowerShell scripts, add that repo as a submodule to the repository you are building, and then call the scripts from the submodule folder. But this only works with Git repos.
Or you could create a custom build task that contains your script.
From what I have seen, no.
A few different options I have explored are:
If using a non-hosted agent, saving the file onto the build server. Admittedly this doesn't scale well, but it is better than copy/pasting the script all over. I was able to put these scripts into version control and deploy them via their own pipeline, so that might be a solution for scaling (if necessary).
Cloning another repository that has these shared scripts during the process.
I've been asking for this feature for a bit, but it seems the Azure DevOps team has higher priorities.
How about putting the PowerShell script in a NuGet package and installing that in the dependent projects?
I just discovered YAML templates (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azdevops#step-re-use).
I think it may help in this case (depending on how large your file is): you can put an inline PowerShell script in the template YAML and reuse it from your main YAML.
The documentation is pretty straightforward; a sketch follows.
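A minimal sketch, assuming the shared steps live in a template file in a separate pipeline-templates repo and that your organization supports referencing templates from another repository (all names are placeholders). The template:

# shared-script.yml (in the pipeline-templates repo)
steps:
- powershell: |
    # the shared logic lives inline in the template
    Write-Host "running the shared build logic"

And the consuming pipeline in each application repo:

# azure-pipelines.yml (in each application repo)
resources:
  repositories:
  - repository: templates              # alias for the @templates reference
    type: git
    name: MyProject/pipeline-templates # <project>/<repository>

steps:
- template: shared-script.yml@templates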
I am using VSTS for my OPA5 tests, and everything works for a single project. For this I created a build for each project I wanted to test.
But if I want to test all projects, do I need to create a build for every project, or is there a way to build all projects with one build definition?
The build should always do the same things, defined in a YAML file.
I have seen that it is possible to build from different branches, but not from different repositories.
Does anyone have a solution for this, or is it impossible at the moment?
Yes, it's possible.
You just need to clone the other Git repositories at the beginning of the build.
You can add a PowerShell task as the first task and have it execute a git clone command.
If you are using a YAML file, just add a script step that runs the clone, as sketched below.
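A sketch in YAML, where the organization, project, repo, and script names are all placeholders; the $(System.AccessToken) trick assumes the pipeline is allowed to access the OAuth token and that the token has read access to the other repo:

steps:
# Clone the repo that holds the shared script before anything else.
- powershell: |
    git clone "https://$(System.AccessToken)@dev.azure.com/myorg/MyProject/_git/build-scripts" "$(Agent.TempDirectory)/build-scripts"

# Run the shared script from the cloned folder.
- powershell: |
    & "$(Agent.TempDirectory)/build-scripts/shared-build.ps1"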
Besides, you can also refer to the post VSTS build from multiple repositories.
Tools: Team Foundation Server 2012
NuGet: 3.0
PowerShell: 3.0
I would like to know how to put build related files onto the build server during the build process.
We have a CI build that runs on check-in, so unless we want the build firing every time we update our nuget.config or a PowerShell script, we need to store them somewhere else.
I can put them in a separate team project and add them to our working folders, but then again TFS will launch a build whenever we make a change to our build files. We don't want that.
Is there a way to host build files in TFS and download them to the build server during the build that does NOT cause a new build to fire when those files change?
If I change a PS1 file, we don't want the build running because of that.
When checking them in, you can add ***NO_CI*** to the check-in comment and CI builds will not trigger, e.g. tf checkin /comment:"Updated build script ***NO_CI***".
Other than that, you can fire off a PowerShell script by customizing the build XAML and adding an InvokeProcess activity that calls powershell.exe, passing the PowerShell script you do keep in source control.
From there you can either use OneGet or Download-File to fetch and install anything you need.
I'm new to Jenkins CI. I'm trying to run an SVN update of myFolder inside a job as a build step. I want to explicitly copy some files to the web root, as I can't have them inside my solution.
Build steps I need to perform:
Build Solution
Publish
Copy myFolder to web root
Sync
Everything up to Publish works fine. The problem comes when trying to copy/update myFolder to the web root.
myFolder is located outside the project solution folder, as I can't have it inside the solution folder.
Note: myFolder holds serialized items/objects that I need to sync in the next step. It has to be copied to the web root in order to sync.
And this folder is committed to SVN.
In my local CMD the following batch file works fine, but when I run it in a Jenkins Execute Windows batch command step it stops at:
-- Updating source from SVN
-- Running update...
#echo off
cls
echo -- Initiating system instance variables...
echo. -- Setting the variables...
:: Here you need to make some changes to suit your system.
set SOURCE=C:\inetpub\wwwroot\Test\Website\App_Data\myFolder\
set SVN=C:\Program Files\TortoiseSVN\bin
:: Unless you want to modify the script, this is enough.
echo. %SOURCE%
echo. %SVN%
echo. ++ Done setting variables.
echo.
echo -- Updating source from SVN
echo. -- Running update...
"%SVN%\TortoiseProc.exe" /command:update /path:"%SOURCE%" /closeonend:1
echo. ++ Done.
echo. -- Cleaning up...
set SOURCE=
set SVN=
echo. ++ Done.
I have the Subversion plugin installed. Any solution for this problem?
I also tried the PowerShell script below:
#Get checkout folder
TortoiseProc.exe /command:"update" /path:"C:\inetpub\wwwroot\Test\Website\App_Data\myFolder\"
It works in my local Windows PowerShell but not in the Jenkins Windows PowerShell step.
In an effort to help answer your question, I will explain the configuration of a job which should accommodate what you are trying to achieve: building a project under version control after an svn update has been performed and moving the generated files to a separate directory.
Set up the Source Code Management section
Within this section of your job's configuration page, choose the appropriate version control system (i.e., Subversion) and point the job to your project's repository URL. Also be mindful to select the appropriate check-out strategy. This is what Jenkins will use when your job runs (i.e., svn update), as Jenkins stores a copy of your repository on the build server in the job's workspace.
At this point, the job will only pull down changes from your repository, via the check-out strategy configured above, when it runs.
However, you would like the Jenkins job to actually do something meaningful when it runs, such as building/publishing your project. This is achieved through build steps, so let's configure them.
Configure the appropriate build step(s)
Build/Publish Website Locally
Assuming you already have a script under version control that builds the project and publishes the website locally (let's call it !Publish Website.bat as an example), you can add an Execute Windows batch command step under the Build section that invokes it, e.g. call "%WORKSPACE%\!Publish Website.bat".
Note: %WORKSPACE% is a built-in environment variable which resolves to the current workspace of the job. There is a link under the build-step to list all the different environment variables exposed which can be used.
Without proceeding any further, the job will now pull down any changes and execute the batch file to publish/build a website locally within your workspace when this job runs.
We're not quite done, since you want these newly generated files to reside within your website's webroot folder so the changes are reflected on your website. For simplicity's sake, we can add another build step to perform the copy.
Copy Contents to Webroot
Assuming you also have a script under version control that takes the published files and copies them to the appropriate directory on your web server (let's call it !Copy Website.bat), you can add another Execute Windows batch command step under the Build section that invokes it, e.g. call "%WORKSPACE%\!Copy Website.bat".
Now when the job runs, it will perform an svn update against the repository in its local workspace and execute the preceding build steps (i.e., build/publish the solution and copy the contents to your webroot).
I'd like to check in assemblies to TFS source control after a successful project build on TeamCity. Is there an elegant and easy way to do that?
I can create a command-line step and run tf.exe with parameters, but then I need to provide credentials to connect to TFS, map the directories, and finally do the check-in.
The second option is to set up a PowerShell step and use one of the TFS cmdlets, but that requires installing the cmdlets on the build machine, which I don't want to do.
Does anyone have experience with this scenario? Maybe I can use the credentials TeamCity uses to get the sources, and skip mapping the directories by reusing the downloaded structure/sources?
This answers your question, but it is normally not a good idea to commit binaries to source control. You have a couple of choices:
Create a NuGet package manually: NuGet packages can be stored in a shared folder, and you can manually create a package in 5 minutes.
If your other projects are built using TeamCity, check out artifact dependencies in TeamCity.
The tf.exe command-line tool is the most feasible option for this scenario.