PowerShell script not working when included in a Jenkins Pipeline script

What I am trying to achieve with PowerShell is as follows:
Increment the build number in the AssemblyInfo.cs file on the build server. My script looks like the one below; after over 100 iterations of different variations I am still unable to get it to work. The script works well in the PowerShell console, but when included in the Jenkins Pipeline script I get various errors that are proving hard to fix...
def getVersion (file) {
    def result = powershell(script:"""Get-Content '${file}' |
        Select-String '[0-9]+\\.[0-9]+\\.[0-9]+\\.' |
        foreach-object{$_.Matches.Value}.${BUILD_NUMBER}""", returnStdout: true)
    echo result
    return result
}
...
powershell "(Get-Content ${files[0].path}).replace('[0-9]+\\.[0-9]+\\.[0-9]+\\.[0-9]+',
${getVersion(files[0].path)})) } | Set-Content ${files[0].path}"
...

How about a Groovy approach (with Jenkins pipeline steps) instead of PowerShell? (A likely culprit in the original script, for what it's worth: inside a Groovy triple-double-quoted string, $_ is interpolated by Groovy before PowerShell ever sees it, so it would need to be escaped as \$_.)
def updateAssemblyVersion() {
    def files = findFiles(glob: '**/AssemblyInfo.cs')
    files.each {
        def content = readFile file: it.path
        def modifiedContent = content.replaceAll(/([0-9]+\.[0-9]+\.[0-9]+\.)([0-9]+)/, "\$1${BUILD_NUMBER}")
        writeFile file: it.path, text: modifiedContent
    }
}
}
It will read all relevant files and replace only the build section of the version for every occurrence that matches the version regex.

Related

Output file not created when running an R command in a Nextflow file?

I am trying to run a nextflow pipeline but the output file is not created.
The main.nf file looks like this:
#!/usr/bin/env nextflow

nextflow.enable.dsl=2

process my_script {
    """
    Rscript script.R
    """
}

workflow {
    my_script
}
In my nextflow.config I have:
process {
    executor = 'k8s'
    container = 'rocker/r-ver:4.1.3'
}
The script.R looks like this:
FUN <- readRDS("function.rds");
input = readRDS("input.rds");
output = FUN(
singleCell_data_input = input[[1]], savePath = input[[2]], tmpDirGC = input[[3]]
);
saveRDS(output, "output.rds")
After running nextflow run main.nf, the output.rds file is not created.
Nextflow processes are run independently and isolated from each other, each inside its own working directory. For your script to be able to find the required input files, these must be localized inside the process working directory. This should be done by defining an input block and declaring the files using the path qualifier, for example:
params.function_rds = './function.rds'
params.input_rds = './input.rds'

process my_script {

    input:
    path my_function_rds
    path my_input_rds

    output:
    path "output.rds"

    """
    #!/usr/bin/env Rscript

    FUN <- readRDS("${my_function_rds}");
    input = readRDS("${my_input_rds}");
    output = FUN(
        singleCell_data_input=input[[1]], savePath=input[[2]], tmpDirGC=input[[3]]
    );
    saveRDS(output, "output.rds")
    """
}

workflow {
    function_rds = file( params.function_rds )
    input_rds = file( params.input_rds )

    my_script( function_rds, input_rds )
    my_script.out.view()
}
workflow {
function_rds = file( params.function_rds )
input_rds = file( params.input_rds )
my_script( function_rds, input_rds )
my_script.out.view()
}
In the same way, the script itself would need to be localized inside the process working directory. To avoid specifying an absolute path to your R script (which would not make your workflow portable at all), it's possible to simply embed your code, making sure to specify the Rscript shebang. This works because process scripts are not limited to Bash.
Another way would be to make your Rscript executable (e.g. with chmod +x) and move it into a directory called bin in the root directory of your project repository (i.e. the same directory as your 'main.nf' Nextflow script). Nextflow automatically adds this folder to the $PATH environment variable, and your script becomes automatically accessible to each of your pipeline processes. For this to work, you'd need some way to pass in the input files as command line arguments. For example:
params.function_rds = './function.rds'
params.input_rds = './input.rds'

process my_script {

    input:
    path my_function_rds
    path my_input_rds

    output:
    path "output.rds"

    """
    script.R "${my_function_rds}" "${my_input_rds}" output.rds
    """
}

workflow {
    function_rds = file( params.function_rds )
    input_rds = file( params.input_rds )

    my_script( function_rds, input_rds )
    my_script.out.view()
}
And your R script might look like:
#!/usr/bin/env Rscript
args <- commandArgs(trailingOnly = TRUE)
FUN <- readRDS(args[1]);
input = readRDS(args[2]);
output = FUN(
singleCell_data_input=input[[1]], savePath=input[[2]], tmpDirGC=input[[3]]
);
saveRDS(output, args[3])

Setting output variable using Newman / Postman is getting cut off

I have output variables siteToDeploy and siteToStop. I am using Postman to run a test script against the IIS Administration API. In the test portion of one of the requests I am trying to set the Azure DevOps output variables. It's sort of working, but the variable value is getting cut off for some reason.
Here is the test script in postman:
console.log(pm.globals.get("siteName"))
var response = pm.response.json();
var startedSite = _.find(response.websites, function(o) { return o.name.indexOf(pm.globals.get("siteName")) > -1 && pm.globals.get("siteName") && o.status == 'started'});
var stoppedSite = _.find(response.websites, function(o) { return o.name.indexOf(pm.globals.get("siteName")) > -1 && o.status == 'stopped'});
if (stoppedSite && startedSite) {
    console.log('sites found');
    console.log(stoppedSite.id);
    console.log('##vso[task.setvariable variable=siteToDeploy;]' + stoppedSite.id);
    console.log('##vso[task.setvariable variable=siteToStop;]' + startedSite.id);
}
Here is the output from Newman:
Here is the output from a command line task echoing the $(siteToDeploy) variable. It's getting set, but not the entire value.
I've tried escaping it, but that had no effect. I also created a static command line echo where the variable is set, and that worked fine. So I am not sure if it is a Newman issue or Azure having trouble picking up the variable.
The issue turned out to be how Azure is trying to parse the Newman console log output. I had to add an extra PowerShell task to strip the ' characters coming back from the Newman output.
This is what it looks like:
## This task is only here because of how Newman is writing out the console.log
Param(
    [string]$_siteToDeploy = (("$(siteToDeploy)") -replace "'",""),
    [string]$_siteToStop = (("$(siteToStop)") -replace "'","")
)
Write-Host ("##vso[task.setvariable variable=siteToDeploy;]{0}" -f ($_siteToDeploy))
Write-Host ("##vso[task.setvariable variable=siteToStop;]{0}" -f ($_siteToStop))

Cake Task output log to file

I have a set of Tasks inside a build.cake file and I would like to capture the log output from the console into a log file. I know it's possible to use the OnError() function to output errors to a file, but I would like to output everything to a log file, not just errors.
Below is an example of the build.cake file.
#load "SomeTask.cake"
#load "SomeOtherTask.cake"
var target = Argument("target", "Default");
var someTask = Task("SomeTask")
.Does(() =>
{
SomeMethodInsideSomeTask();
});
var someOtherTask = Task("SomeOtherTask")
.Does(() =>
{
SomeOtherMethodInsideSomeOtherTask();
});
Task("Default")
.IsDependentOn(someTask)
.IsDependentOn(someOtherTask);
RunTarget(target);
N.B. The Tasks are not running any sort of MSBuild commands so it's not possible to use MSBuildFileLogger.
How about piping stdout to a file, i.e.:
./build.ps1 > log.txt
Have you heard about tee?
It reads standard input and writes it to both standard output and one or more files.
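In PowerShell the equivalent is Tee-Object, which writes the build output to the console and to a file at the same time (a minimal sketch; build.ps1 and log.txt are placeholder names):
# run the Cake bootstrapper and mirror everything it prints into log.txt
./build.ps1 | Tee-Object -FilePath log.txt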

How to get console output of downstream job in upstream job?

I'm trying to find a workaround because my first question is still unanswered:
can't run Start-Job with credentials from Jenkins
I have job A. Job A starts a PowerShell script on a server and shows some output.
I also have a pipeline B that runs multiple copies of job A against different servers.
Here is the Groovy code:
stage 'Copy sources from Git'
build job: 'DeploymentJobs/1_CopySourcesFromGit'

stage 'Deploy to servers'
def servers = env.SERVERLIST.split('\n')
def steps = [:]
for (int i = 0; i < servers.size(); i++) {
    def server = servers[i]
    def stepName = "running ${server}"
    steps[stepName] = {->
        echo server
        build job: 'DeploymentJobs/2_DeployToServer', parameters:
            [booleanParam(name: 'REBOOTAFTER', value: Boolean.valueOf(REBOOTAFTER)),
             string(name: 'SERVERNAME', value: server)]
    }
}
parallel steps
In the output of the pipeline I see only the info that N copies of job A started, but not their output.
I want to see only the PowerShell output from all instances of job A in the console output of pipeline B.
I have no idea how to do this. Is it even possible?
A shorter alternative:
def result = build job: 'job_name', wait: true
println result.getRawBuild().getLog()
It will be necessary to whitelist both methods (the script security sandbox blocks them by default).
Edit: since you don't want to wait for the builds to run, you could add this at the end of your job (or at some point after all triggered builds should be finished), where number_of_builds in your case will be servers.size().
def job = Jenkins.getInstance().getItemByFullName('job_A_name')
def last_build = job.getLastBuild().getNumber()
def first_build = last_build - number_of_builds + 1

(first_build..last_build).each {
    println "Log of build $it"
    def build = job.getBuildByNumber(it)
    println build.log
}
If you really want to be sure the builds you're getting are the ones triggered by your job, you can get the build cause from the build object.
I solved my problem by adding this into the loop. I could get the logs of the downstream jobs into the upstream job, display them and work with them.
def checkjob = build job: 'job name', parameters: [ any params here]
checklog = Jenkins.getInstance().getItemByFullName('job name').getBuildByNumber(checkjob.getNumber()).log
println checklog
Here's a silly example:
$myJobs = @()
$myJobs += Start-Job -ScriptBlock { while (1) { Get-Item 'c:\*'; sleep 5 } }
$myJobs += Start-Job -ScriptBlock { while (1) { Get-Item 'c:\windows\*'; sleep 5 } }
try {
    while (1)
    {
        $myJobs | Get-Job -HasMoreData $true | Receive-Job
    }
} finally {
    $myJobs | Stop-Job
    $myJobs | Remove-Job
}
Anything the job writes to the pipeline is queued. The -HasMoreData state indicates that the job has output available to read. The parent receives that output using Receive-Job. By default it displays in the console, but you can receive the output in the parent process and do further processing.
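For instance, to collect everything the jobs have emitted into a variable for further processing instead of letting it print (a minimal sketch building on the jobs above; the Where-Object filter is only an illustration):
# drain the pending output of all jobs into a variable
$output = $myJobs | Receive-Job
# the received objects can then be filtered or transformed like any other pipeline input
$output | Where-Object { $_.Name -like 'Win*' }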
If this isn't what you're going for you'll have to be more specific in your question. Provide a little of the code you've tried so far.
def jobLog = Jenkins.getInstance().getItemByFullName('job-path/testjob').getLastSuccessfulBuild().log
println(jobLog)

How to get list of TFS builds running at the moment from command line?

I'm trying to automate the deployment process, and as part of it, I need to run my release build from the command line. I can do it using a command like
.\TFSBuild start http://server-name:8080/tfs/project-collection project-name build-name priority:High /queue
It even returns some code for the queued build: Build queued. Queue position: 2, Queue ID: 11057.
What I don't know is how to get info about currently running builds, or about the state of my running build, from the PowerShell command line. The final aim is to start publishing after that build completes.
I've already got all the necessary PowerShell scripts to create the deployment package from the build results, zip it, copy it to production and install it there. All I need now is to know when my build succeeds.
This function will wait for a build with the Queue ID given by TFSBuild.exe:
function Wait-QueuedBuild {
    param(
        $QueueID
    )

    [void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Build.Client')
    [void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Client')

    $uri = [URI]"http://server-name:8080/tfs/project-collection"
    $projectCollection = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($uri)
    $buildServer = $projectCollection.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])
    $spec = $buildServer.CreateBuildQueueSpec('*','*')

    do {
        # poll the queue until the build with the given ID is no longer in it
        $build = $buildServer.QueryQueuedBuilds($spec).QueuedBuilds | where {$_.Id -eq $QueueID}
        sleep 1
    } while ($build)
}
You can get the id returned by TFSBuild.exe, then call the function.
$tfsBuild = .\TFSBuild start http://server-name:8080/tfs/project-collection project-name build-name priority:High /queue
Wait-QueuedBuild [regex]::Match($tfsBuild[-1],'Queue ID: (?<id>\d+)').Groups['id'].Value
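Putting the pieces together for the original goal of publishing once the build completes, a rough sketch could look like the following (Publish-Package.ps1 is a hypothetical placeholder for your own deployment script):
$tfsBuild = .\TFSBuild start http://server-name:8080/tfs/project-collection project-name build-name priority:High /queue
$queueId = [regex]::Match($tfsBuild[-1],'Queue ID: (?<id>\d+)').Groups['id'].Value
# returns once the queued build has left the queue
Wait-QueuedBuild $queueId
# hand off to your existing packaging and deployment scripts (hypothetical placeholder)
.\Publish-Package.ps1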
Using the work by E.Hofman available here it is possible to write a C# console app that uses TFS SDK and reveals if any build agent is currently running as follows:
using System;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Client;

namespace ListAgentStatus
{
    class Program
    {
        static void Main()
        {
            TfsTeamProjectCollection teamProjectCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://TFSServer:8080"));
            var buildServer = teamProjectCollection.GetService<IBuildServer>();

            foreach (IBuildController controller in buildServer.QueryBuildControllers(true))
            {
                foreach (IBuildAgent agent in controller.Agents)
                {
                    Console.WriteLine(agent.Name + " is " + agent.IsReserved);
                }
            }
        }
    }
}
The .IsReserved property is what toggles to 'True' during execution of a build.
I'm sorry, my PowerShell skills are not good enough to provide a PS variant of the above. Please take a look here, where the work by bwerks might help you do that.
# load classes for execution
[Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Client") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client") | Out-Null

# declare working variables
$Uri = New-Object System.Uri "http://example:8080/tfs"

# get reference to project collection
$ProjectCollection = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($Uri)

# get reference to build server
$BuildServer = $ProjectCollection.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])

# loop through the build controllers
foreach ($Controller in $BuildServer.QueryBuildControllers($true))
{
    # loop through agents
    foreach ($BuildAgent in $Controller.Agents)
    {
        Write-Host "$($BuildAgent.Name) is $($BuildAgent.IsReserved)"
    }
}
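To list only the agents that are currently busy, you could filter on IsReserved; a small variation on the loop above (same $BuildServer reference, untested sketch):
# show only agents whose IsReserved flag is set, i.e. agents currently running a build
foreach ($Controller in $BuildServer.QueryBuildControllers($true))
{
    $Controller.Agents | Where-Object { $_.IsReserved } |
        ForEach-Object { Write-Host "$($_.Name) is busy" }
}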