Jenkinsfile pipeline running PS1 script from git - powershell

I'm really struggling with this, and the only examples I can find are for running cmdlets, not for scripts cloned from a Git repo. I have the following:
pipeline {
    environment {
        JOB_BASE_NAME="org-onprem-2016"
        VERSION="$BUILD_NUMBER"
        teamsColorRed="C50E2E"
        teamsColorGreen="7FBA00"
    }
    // Helper function to run PowerShell Commands
    agent {
        node {
            label 'Windows'
        }
    }
    stages {
        stage('Checkout SCM') {
            steps {
                step([$class: 'WsCleanup'])
                dir ('inv_onprem_2016') {
                    git url: 'https://github.local/org-cit/org_onprem_2016.git', credentialsId: 'svc-tokenauth'
                }
            }
        }
        stage('Initiate Build Variables') {
            steps {
                powerShell('scripts\\2012\\initiatevars.ps1')
            }
        }
    }
}
I have tried multiple iterations of the steps block but always get some form of error about the path not being found. I think I am missing how the workspace should be addressed; I tried prefixing the path with $ENV:WORKSPACE and got a similar result (a different error about "ENV" not being defined). Can anyone steer me in the right direction?
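For reference, the usual fix in setups like this is the built-in lowercase powershell step, with the script path given relative to the workspace root, which means including the folder used in the dir() block. A minimal sketch, assuming the repository layout implied by the question:
stage('Initiate Build Variables') {
    steps {
        // 'powershell' (all lowercase) is the built-in pipeline step; the path
        // is relative to the workspace root, so the dir() folder from the
        // checkout stage must be part of it
        powershell '.\\inv_onprem_2016\\scripts\\2012\\initiatevars.ps1'
    }
}
If the absolute path is wanted instead, note that $ENV:WORKSPACE is PowerShell syntax: inside a Groovy double-quoted string it gets parsed by Groovy (hence the error about "ENV" not being defined), so either use ${env.WORKSPACE} in a double-quoted Groovy string, or keep the Groovy string single-quoted and let PowerShell expand $env:WORKSPACE itself.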

Related

How to pass GitHub action event hook from a javascript to a bash file?

I want to create a GitHub action that is simple and only runs a bash script file; see my previous question: How to execute a bash script from JavaScript.
With this JavaScript action, I want to pass values to the bash script from the JSON payload given by GitHub.
Can this be done with something as simple as an exec command?
...
exec.exec(`export FILEPATH=${filepath}`)
...
I wanted to do something like this, but found there was much more code needed than I originally expected. So while this is not simple, it does work, and it blocks the action script while the bash script runs:
const core = require('@actions/core');

function run() {
    try {
        // This is just a thin wrapper around bash, and runs a file called "script.sh"
        //
        // TODO: Change this to run your script instead
        //
        const script = require('path').resolve(__dirname, 'script.sh');
        const child = require('child_process').execFile(script);
        child.stdout.on('data', (data) => {
            console.log(data.toString());
        });
        child.on('close', (code) => {
            console.log(`child process exited with code ${code}`);
            process.exit(code);
        });
    }
    catch (error) {
        core.setFailed(error.message);
    }
}

run();
Much of the complication is handling output and error conditions.
You can see my debugger-action repo for an example.
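To pass values from the event payload on to the script, one option (a sketch, not from the original answer; FILEPATH is the hypothetical variable name from the question) is to hand them to the child process as environment variables. GitHub writes the full JSON payload to the file named by GITHUB_EVENT_PATH:
const path = require('path');
const { execFile } = require('child_process');

const script = path.resolve(__dirname, 'script.sh');

// Forward the payload location to the bash script via its environment;
// the script can then parse the JSON itself (e.g. with jq)
const child = execFile(script, [], {
    env: { ...process.env, FILEPATH: process.env.GITHUB_EVENT_PATH }
});
child.stdout.on('data', (data) => console.log(data.toString()));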

How to pass BUILD_USER from Jenkins to PowerShell

I'm having difficulty passing BUILD_USER from Jenkins to PowerShell. I'm using the Build User Vars plugin in Jenkins. I can output "${BUILD_USER}" just fine, but when I try to pass it to PowerShell, I get nothing.
Here's my current Groovy file:
#!groovy
pipeline {
    agent {
        label 'WS'
    }
    stages {
        stage ('Pass build user') {
            steps {
                wrap([$class: 'BuildUser']) {
                    script {
                        def BUILDUSER = "${BUILD_USER}"
                        echo "(Jenkins) BUILDUSER = ${BUILD_USER}"
                        powershell returnStatus: true, script: '.\\Jenkins\\Testing\\Output-BuildUser.ps1'
                    }
                }
            }
        }
    }
    post {
        always {
            cleanWs(deleteDirs: true)
        }
    }
}
Here's my PowerShell file:
"(PS) Build user is $($env:BUILDUSER)"
Not sure what else to try.
def BUILDUSER = ... does not create an environment variable, so PowerShell (which invariably runs in a child process) won't see it.
To create an environment variable named BUILDUSER that child processes see, use env.BUILDUSER = ...
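Applied to the stage above, the fix would look something like this (a sketch; the names follow the question's code):
wrap([$class: 'BuildUser']) {
    script {
        // env.* assignments create real environment variables, which the
        // PowerShell child process inherits
        env.BUILDUSER = "${BUILD_USER}"
        powershell returnStatus: true, script: '.\\Jenkins\\Testing\\Output-BuildUser.ps1'
    }
}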

Jenkins Powershell write to console

I have a Jenkins job that calls a PowerShell file. When I use it from a freestyle project, it shows the PowerShell execution in the console output. Having switched it to a pipeline job, I no longer see the output.
Currently my pipeline looks like this:
pipeline
{
    stages
    {
        stage ('Deploy To Dev')
        {
            steps
            {
                powershell '"%WORKSPACE%\\SpearsLoad\\Scripts\\CIDeployToDev.Ps1"'
            }
        }
    }
}
but I get no logging of the PowerShell steps.
Following the documentation, I tried changing the stage to:
pipeline
{
    stages
    {
        stage ('Deploy To Dev')
        {
            steps
            {
                node('Deploy the SSIS load')
                {
                    //Deploy the SSIS load
                    def msg = powershell(returnStdout: true, script: '"%WORKSPACE%\\SpearsLoad\\Scripts\\CIDeployToDev.Ps1"')
                    println msg
                }
            }
        }
    }
}
but that gives:
Expected a step # line 123, column 6.
    def msg = powershell(returnStdout: true, script: '"%WORKSPACE%\SpearsLoad\Scripts\CIDeployToDev.Ps1"')
I feel like I am missing something quite fundamental. What am I doing wrong?
You need to wrap that code in a script block, because you are trying to use scripted syntax in a declarative pipeline:
script {
    //Deploy the SSIS load
    def msg = powershell(returnStdout: true, script: '"%WORKSPACE%\\SpearsLoad\\Scripts\\CIDeployToDev.Ps1"')
    println msg
}
For anyone who comes here looking for answers, I need to point out that there were actually three separate problems with the code:

1. The lack of a script section, as per the accepted answer.
2. The PowerShell script was not actually running because it was not being invoked correctly (a quoted path on its own is just a string expression in PowerShell; it needs the call operator &).
3. The script was not running because of spaces in the %WORKSPACE% value (and %WORKSPACE% is batch-style syntax that PowerShell does not expand in any case).
In the end, I went with:
script {
    def msg = powershell(returnStdout: true, script: " & './SpearsLoad\\Scripts\\CIDeployToDev.Ps1'")
    println msg
}
which actually works!
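If the absolute workspace path is needed, a variation of the same idea should also work (a sketch, not from the original answers): keep the Groovy string single-quoted so that PowerShell, not Groovy, expands the variable, and let the PowerShell quoting take care of the spaces:
script {
    // $env:WORKSPACE is expanded by PowerShell because the outer Groovy
    // string is single-quoted; & invokes the quoted path even with spaces
    def msg = powershell(returnStdout: true, script: '& "$env:WORKSPACE\\SpearsLoad\\Scripts\\CIDeployToDev.Ps1"')
    println msg
}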
I use
Write-Host "My message"
in PowerShell. This will show up in the log.

When to 'inline' tasks and when to extract a separate task

I am trying to figure out what the criteria should be for deciding whether to 'inline' some work as a set of calls directly in, let's say, a Does clause (using aliases), or to have a set of separate tasks with proper dependencies. It seems like it can be done either way.
For example
var target = Argument ("target", "build");

Task ("build")
    .Does (() =>
    {
        NuGetRestore ("./source/solution.sln");
        DotNetBuild ("./source/solution.sln", c => c.Configuration = "Release");
        CopyFiles ("./**/*.dll", "./output/");
    });

Task ("pack")
    .IsDependentOn ("build")
    .Does (() =>
    {
        NuGetPack ("./solution.nuspec");
    });

RunTarget (target);
I could 'inline' all of this right into the 'pack' task, or I could have a separate task for each of the NuGet restore, DotNetBuild, and CopyFiles actions.
Unfortunately, the main answer to this is: it depends. It depends on your own preferences and how you want to work.
Personally, I break Tasks into a concrete piece of functionality, or unit of work. So, in the above example, I would have a Task for:
NuGetRestore
DotNetBuild
CopyFiles
NuGetPack
The thought process here is that, depending on what I wanted to do, I might want to run only one of those tasks, without being forced to run everything else. Breaking the work into individual Tasks gives me the option to piece those Tasks together as required.
If you put all the aliases into a single Task, you no longer have the option of doing that.
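With separate Tasks, a single unit can then be run on its own. For example, using the standard PowerShell bootstrapper from the Cake example repository (an assumption; the question does not show how the script is invoked):
# Runs only the "build" task (and any tasks it depends on), skipping "pack"
.\build.ps1 -Target build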
Best practice is to have one task per step in your build process; an example flow could be:
Clean
Restore
Build
Test
Pack
Publish
Then it will be much clearer what takes time and what caused any failure.
Cake will abort on any failure so the flow will be the same, but it'll give you more granular control and insight.
There's a simple example solution at github.com/cake-build/example
Converting your script according to that example would look something like this:
var target = Argument("target", "Pack");
var configuration = Argument("configuration", "Release");

FilePath solution = File("./source/solution.sln");

Task("Clean")
    .Does(() =>
    {
        CleanDirectories(new [] {
            "./source/**/bin/" + configuration,
            "./source/**/obj/" + configuration
        });
    });

Task("Restore")
    .IsDependentOn("Clean")
    .Does(() =>
    {
        NuGetRestore(solution);
    });

Task("Build")
    .IsDependentOn("Restore")
    .Does(() =>
    {
        if(IsRunningOnWindows())
        {
            // Use MSBuild
            MSBuild(solution, settings =>
                settings.SetConfiguration(configuration));
        }
        else
        {
            // Use XBuild
            XBuild(solution, settings =>
                settings.SetConfiguration(configuration));
        }
    });

Task("Pack")
    .IsDependentOn("Build")
    .Does(() =>
    {
        NuGetPack("./solution.nuspec", new NuGetPackSettings {});
    });

RunTarget(target);
This will give you a nice step-by-step summary report like this:
Task                  Duration
--------------------------------------------------
Clean                 00:00:00.3885631
Restore               00:00:00.3742046
Build                 00:00:00.3837149
Pack                  00:00:00.3851542
--------------------------------------------------
Total:                00:00:01.5316368
If any step fails, it will be much clearer which one.

Nothing happens when using a spawn gulp task to execute commands in a subfolder

Yesterday, I was pointed to a reference suggesting child_process.spawn for my need. I'd like gulp to execute commands for me, to avoid typing the commands for my dependency whenever I need to compile my main project.
I have something that seems OK, with no errors in the logs, but nothing happens, as if my commands were never executed.
Does anyone have feedback on my code? I have another task like this one to compile the dependency.
var spawn = require("child_process").spawn;

gulp.task("my-dependency-install", function(done) {
    spawn("ft", ["install"], {
        cwd: "node_modules/app/my-dependency/"
    })
    .on("error", function (err) {
        throw err;
    })
    .on("close", done);
});
Thanks
Here is how I fixed it:
var spawn = require("child_process").spawn;

spawn("ft.cmd", ["install"], {
    cwd: "node_modules/app/my-dependency/"
})
.on("error", function (err) {
    throw err;
});
Can anyone explain why I had to add .cmd? It's because of the Windows OS, isn't it?
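A likely explanation, added here since the thread leaves the question open: on Windows, npm installs command-line tools as .cmd shim scripts, and spawn does not go through a shell by default, so the bare name ft cannot be resolved. An alternative to hard-coding the extension is the shell option of child_process.spawn:
var spawn = require("child_process").spawn;

// shell: true runs the command through the system shell, which resolves
// "ft" to "ft.cmd" on Windows and to plain "ft" elsewhere
spawn("ft", ["install"], {
    cwd: "node_modules/app/my-dependency/",
    shell: true
})
.on("error", function (err) {
    throw err;
});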