I have a Jenkins job that calls a PowerShell file. When I use it from a freestyle project, the PowerShell execution shows up in the console output. Having switched it to a pipeline job, I no longer see the output.
Currently my pipeline looks like this:
pipeline
{
stages
{
stage ('Deploy To Dev')
{
steps
{
powershell '"%WORKSPACE%\\SpearsLoad\\Scripts\\CIDeployToDev.Ps1"'
}
}
}
}
but I get no logging of the PowerShell steps.
Following the documentation I tried changing the stage to:
pipeline
{
stages
{
stage ('Deploy To Dev')
{
steps
{
node('Deploy the SSIS load')
{
//Deploy the SSIS load
def msg = powershell(returnStdout: true, script: '"%WORKSPACE%\\SpearsLoad\\Scripts\\CIDeployToDev.Ps1"')
println msg
}
}
}
}
}
but that gives:
Expected a step @ line 123, column 6. def msg =
powershell(returnStdout: true, script:
'"%WORKSPACE%\SpearsLoad\Scripts\CIDeployToDev.Ps1"')
I feel like I am missing something quite fundamental. What am I doing wrong ?
You need to wrap that code in a script block, because you're trying to use scripted syntax inside a declarative pipeline:
script {
//Deploy the SSIS load
def msg = powershell(returnStdout: true, script: '"%WORKSPACE%\\SpearsLoad\\Scripts\\CIDeployToDev.Ps1"')
println msg
}
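Putting it together, a minimal sketch of where that script block sits in the declarative pipeline from the question, using the same powershell call as above (the agent line is my assumption; swap in whatever label your PowerShell agent uses):
pipeline {
    agent any    // assumption: any node with PowerShell available; use your own label
    stages {
        stage ('Deploy To Dev') {
            steps {
                script {
                    // Deploy the SSIS load
                    def msg = powershell(returnStdout: true, script: '"%WORKSPACE%\\SpearsLoad\\Scripts\\CIDeployToDev.Ps1"')
                    println msg
                }
            }
        }
    }
}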
For anyone who comes here looking for answers, I need to point out that there were actually three separate problems with the code:
The lack of a script block, as per the accepted answer.
The PowerShell script was not actually running because it was not being invoked correctly.
The script was also not running because of spaces in the %WORKSPACE% path.
In the end, I went with:
script {
def msg = powershell(returnStdout: true, script: " & './SpearsLoad\\Scripts\\CIDeployToDev.Ps1'")
println msg
}
which actually works!
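If you want to keep the absolute workspace path instead of the relative one, here is a sketch that addresses the same two problems (the powershell step does not expand batch-style %WORKSPACE%, and letting PowerShell expand $env:WORKSPACE inside double quotes also copes with spaces in the path):
script {
    def msg = powershell(returnStdout: true, script: '& "$env:WORKSPACE\\SpearsLoad\\Scripts\\CIDeployToDev.Ps1"')
    println msg
}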
I use
Write-Host "My message"
in PowerShell. This will show up in the log.
Related
In my VS Code extension, I'd like to invoke a Python file and append its standard output to VS Code's output channel. I'm aware that I can create an output channel via vscode.window.createOutputChannel("..."); and I'm also aware that I can execute such a Python (or, for that matter, any local bash) script via these APIs: https://nodejs.org/api/child_process.html. Would it be possible to append standard output to this output channel in real time (i.e., as the script runs)?
What I currently have is the following:
import * as vscode from 'vscode';
import { spawn } from 'child_process';

export function execute(cmd: string, callback: any, logging?: vscode.OutputChannel) {
    const spawnedProcess = spawn(cmd, [], {shell: true, detached: true});
    console.log(`spawned pid ${spawnedProcess.pid} with command ${cmd}`);
    spawnedProcess.stdout.on('data', (data: any) => {
        console.log(data);
        logging?.appendLine("stdout:" + data);
    });
    spawnedProcess.stderr.on('data', (data: any) => {
        console.error(`spawned pid ${spawnedProcess.pid} pushed something to stderr`);
        logging?.appendLine(data);
    });
    spawnedProcess.on('exit', function(code: any) {
        if (code !== 0) {
            console.log('Failed: ' + code);
        }
        else {
            console.log(`pid ${spawnedProcess.pid} finished`);
        }
        callback();
    });
}
where callback() is something to be executed after the execution is done. I got this structure from here https://stackoverflow.com/a/32872753/14264786.
However, when I run a simple Python script that sleeps for 3 seconds, prints something, and then does this again, the stdout information is still not displayed in the output channel in real time. Instead, it all appears together after the script finishes.
Any idea on what's a potential solution?
After some more digging, I found out that one solution is to invoke the script with python -u, which unbuffers Python's output.
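For example, a usage sketch with the execute helper from the question (the output-channel name, script path, and callback are placeholders):
// -u disables Python's stdout buffering, so each print() reaches the
// 'data' handler as it happens instead of all at once on exit.
const channel = vscode.window.createOutputChannel("My Extension");   // placeholder name
const scriptPath = "/path/to/script.py";                             // placeholder path
execute(`python -u ${scriptPath}`, () => channel.appendLine("done"), channel);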
I am really struggling with this, and the only examples I can seem to find are for running cmdlets, not scripts cloned from a Git repo. I have the following:
pipeline {
environment {
JOB_BASE_NAME="org-onprem-2016"
VERSION="$BUILD_NUMBER"
teamsColorRed="C50E2E"
teamsColorGreen="7FBA00"
}
// Helper function to run PowerShell Commands
agent {
node {
label 'Windows'
}
}
stages {
stage('Checkout SCM') {
steps {
step([$class: 'WsCleanup'])
dir ('inv_onprem_2016') {
git url: 'https://github.local/org-cit/org_onprem_2016.git', credentialsId: 'svc-tokenauth'
}
}
}
stage('Initiate Build Variables') {
steps {
powerShell('scripts\\2012\\initiatevars.ps1')
}
}
I have tried multiple iterations of the steps section, but I always get some form of error about the path not being found, etc. I think I am missing how the workspace should be addressed; I tried putting $ENV:WORKSPACE in front and got the same result (a different error about "ENV" not being defined). Can anyone help steer me in the right direction?
I want to create a GitHub Action that is simple and only runs a bash script file; see my previous
question: How to execute a bash-script from a java script
With this JavaScript action, I want to pass values to the bash script from the JSON payload given by GitHub.
Can this be done with something as simple as an exec command?
...
exec.exec(`export FILEPATH=${filepath}`)
...
I wanted to do something like this, but found there to be much more code needed than I originally expected. So while this is not simple, it does work and will block the action script while the bash script runs:
const core = require('@actions/core');
function run() {
try {
// This is just a thin wrapper around bash, and runs a file called "script.sh"
//
// TODO: Change this to run your script instead
//
const script = require('path').resolve(__dirname, 'script.sh');
var child = require('child_process').execFile(script);
child.stdout.on('data', (data) => {
console.log(data.toString());
});
child.on('close', (code) => {
console.log(`child process exited with code ${code}`);
process.exit(code);
});
}
catch (error) {
core.setFailed(error.message);
}
}
run()
Much of the complication is handling output and error conditions.
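For instance, here is a sketch of also surfacing stderr and spawn failures, reusing the child and core variables from the block above (this part is my addition, not something taken from the repo below):
// Forward the script's stderr so error text shows up in the action log.
child.stderr.on('data', (data) => {
    console.error(data.toString());
});
// If the file cannot be started at all (e.g. missing execute bit), fail the step.
child.on('error', (err) => {
    core.setFailed(`failed to start script: ${err.message}`);
});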
You can see my debugger-action repo for an example.
I'm having difficulty passing BUILD_USER from Jenkins to PowerShell. I'm using the Build User Vars plugin in Jenkins. I can output "${BUILD_USER}" just fine, but when I try to pass it to PowerShell, I get nothing.
Here's my current groovy file:
#!groovy
pipeline {
agent {
label 'WS'
}
stages {
stage ('Pass build user'){
steps {
wrap([$class: 'BuildUser']) {
script {
def BUILDUSER = "${BUILD_USER}"
echo "(Jenkins) BUILDUSER = ${BUILD_USER}"
powershell returnStatus: true, script: '.\\Jenkins\\Testing\\Output-BuildUser.ps1'
}
}
}
}
}
post {
always {
cleanWs(deleteDirs: true)
}
}
}
Here's my PowerShell file:
"(PS) Build user is $($env:BUILDUSER)"
Not sure what else to try.
def BUILDUSER = ... does not create an environment variable, so PowerShell (which invariably runs in a child process) won't see it.
To create an environment variable named BUILDUSER that child processes see, use env.BUILDUSER = ...
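A minimal sketch of the stage with that change, based on the pipeline in the question (script path unchanged):
wrap([$class: 'BuildUser']) {
    script {
        // env.* assignments become real environment variables for child processes
        env.BUILDUSER = "${BUILD_USER}"
        powershell returnStatus: true, script: '.\\Jenkins\\Testing\\Output-BuildUser.ps1'
    }
}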
How can I call a function in a workflow from a nested InlineScript?
The following throws an exception because the function is out of scope in the InlineScript:
Workflow test
{
function func1
{
Write-Verbose "test verbose" -verbose
}
InlineScript
{
func1
}
}
test
"The inlinescript activity runs commands in a standard, non-workflow Windows PowerShell session and then returns the output to the workflow."
Read more here.
Each InlineScript is executed in a new PowerShell session, so it has no visibility of any functions defined in the parent workflow. You can pass a variable from the workflow into the InlineScript using the $Using: scope modifier,
workflow Test
{
$a = 1
# Change the value in workflow scope using $Using:, and return the new value.
$a = InlineScript {$a = $Using:a+1; $a}
"New value of a = $a"
}
Test
PS> New value of a = 2
but not a function, or a module for that matter.
In the past I used the technique of putting all the common stuff in a PowerShell module file and doing:
workflow Hey
{
PrepareMachine
ConfigureIIS
}
function PrepareMachine() {
Import-Module "MyCommonStuff"
CallSomethingBlahBlah()
}
function ConfigureIIS {
Import-Module "MyCommonStuff"
CallSomethingBlahBlah2()
}
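The same idea works inside an InlineScript, since the fresh session can simply re-import the module; a sketch using the placeholder names from above:
workflow Hey
{
    InlineScript
    {
        # The InlineScript session starts empty, so import the shared module here
        Import-Module "MyCommonStuff"
        CallSomethingBlahBlah
    }
}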
You don't even have to wrap it in a module; you could just define the function outside the workflow, and it would still work:
workflow Hey
{
InlineScript {
func1
}
}
function func1 {
Write-Output "Boom!"
}
That said, I was not impressed by workflows at all. It seems like quite a pointless feature, if you ask me. The most useful thing about workflows is the ability to run things in parallel, but jobs can do that too. The point of the rant above is: make sure you really do need workflows :)