# test.ps1
function foo {
    echo "bar"
}
I have a file named test.ps1 that contains some frequently called functions, and I want to share it between my Jenkins master and slave nodes.
I've tried creating two copies of test.ps1 and putting one on the master and one on the slave, but that is not convenient because I'd have to maintain two copies of test.ps1.
Another approach I've tried is keeping a single test.ps1 on the master node and copying it to the slave with the Publish Over SSH Plugin whenever I need it on the slave. That is not convenient either.
How can I share test.ps1 between master and slave nodes?
I found that the Config File Provider Plugin solves my problem:
First, create a custom file using this plugin:
Write the name and content of this shared file. Then save it.
Inside your project, check Provide Configuration files
Jenkins will create test.ps1 inside the Target folder (in my case I set it to the project's workspace) whenever you build the project. Note that this folder must exist before the build runs.
Variable defines an environment variable through which you can refer to the file test.ps1.
Inside your build step you can dot-source the file with . $env:util; after that you can call the function foo.
All of the above works whether the build runs on the master or on a slave node.
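For example, assuming the Variable field was set to util, a PowerShell build step could look like this minimal sketch:
# Dot-source the shared file that the Config File Provider Plugin placed in the workspace.
# $env:util resolves to the path of test.ps1 because the Variable field was set to "util".
. $env:util

# The functions defined in test.ps1 are now available in this session.
foo    # prints "bar"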
My agent is installed in C:\azagent, the default location after running the PowerShell script that was generated for me when I created a new deployment group.
My build artifact "drop" gets placed in C:\azagent\A1\_work\r2\a\_TransactionImportTurkey-Test\drop
This is where I run into issues. I want to copy the files over to V:\Program\TransactionImportTurkey\TransactionImportApp but get this error:
(screenshot of the error message)
I have no issues copying the files when I use a file path within the C:\ root directory.
This is the YAML file:
(screenshot of the pipeline YAML)
How can I make a release deploy to a root directory other than C:\?
I would do four things here:
First, ensure that your Azure Pipelines Agent is running under a domain account and not a local one.
Second, ensure that the target folder grants write privileges to that account.
Third, run net use to get the target mapping for V:
Fourth, change the target path for your copy to reference the path by the UNC mapping for V:, so instead of V:\Program\TransactionImportTurkey\TransactionImportApp, it would be \\TargetServerNameOrIP\PathToTargetShare\Program\TransactionImportTurkey\TransactionImportApp
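As a rough sketch (the server and share names below are the same placeholders used above; substitute whatever net use actually reports for V: in your environment):
# Run this on the deployment target to see which \\server\share each drive letter is mapped to.
net use

# Then copy using the UNC path instead of the mapped drive letter.
Copy-Item -Recurse `
    -Path "C:\azagent\A1\_work\r2\a\_TransactionImportTurkey-Test\drop\*" `
    -Destination "\\TargetServerNameOrIP\PathToTargetShare\Program\TransactionImportTurkey\TransactionImportApp"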
I want a Rundeck job to download a file over HTTP on the Rundeck server, copy that file over to other nodes, do work on that file on the nodes, and then delete the file from the Rundeck server.
So far, I've got three jobs:
Get File: has "url" and "localfile" options
Delete File: has "localfile" option
Main Job: has "url" option.
I have Main Job doing these steps:
Workflow step: Call "Get File" job with -url ${option.url} -localfile /tmp/tempfile.${job.execid}
Node step: Copy file to node with SourcePath=/tmp/tempfile.${job.execid} and DestinationPath=/tmp/tempfile.${job.execid}
Node step: Run inline script on node
Workflow step: Call "Delete File" job with -localfile /tmp/tempfile.${job.execid}
Is there some way I can define a variable or an option for "localfile" for reuse in all my steps, rather than having to put /tmp/tempfile.${job.execid} in three or four places? If I want to change where this temp file lives later, it would be much easier to have one place to change it. I have tried defining an option built from other options in "Main Job", but it didn't work.
You can create an environment variable for it, but you still need to pass that variable to the next job.
See Context Variable Usage in the Rundeck documentation.
Also make sure the remote machine is configured for SSH (see Configuring remote machine for SSH).
For a number of reasons, it would be really useful if I could create a file from a Jenkins pipeline and put it in my workspace. That would let me avoid pulling in repositories that I currently clone for just one or two files, keep those files in a maintainable place, and also create temporary PowerShell scripts, working around a limitation of the solution described in https://stackoverflow.com/a/42576572
This might be possible through a pipeline utility, although https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/ doesn't list any such utility; or it might be possible using a batch script, as long as the file content can be passed in as a string.
You can do something like this:
node('') {
    stage('test') {
        // create the file from a batch step
        bat """
        echo something > file.txt
        """
        // read the file back into a Groovy variable
        String out = readFile('file.txt').trim()
        print out            // prints the variable, Groovy-style
        bat "echo ${out}"    // the Groovy variable can be interpolated into a batch command
        // if the file contained a Groovy script, you could load it and call its functions:
        // def lib = load 'file.txt'; lib.someFunction()
    }
}
I would like to capture the output of some variables to be used elsewhere in the job using Jenkins Powershell plugin.
Is this possible?
My goal is to build the latest tag somehow, and the PowerShell script was meant to determine it. Outputting to a text file would not help, and environment variables can't be used because the process is seemingly forked, unfortunately.
Besides EnvInject, another common approach for sharing data between build steps is to store results in files in the job workspace.
The idea is to skip using environment variables altogether and just write/read files.
It seems that the only solution is to combine this with the EnvInject plugin: create a text file with key=value pairs from PowerShell, then inject them into the build using EnvInject.
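A minimal sketch of such a PowerShell build step (the property names and the use of git describe to find the latest tag are assumptions for illustration):
# Write key=value pairs to a properties file in the workspace.
$latestTag = git describe --tags --abbrev=0
"LATEST_TAG=$latestTag" | Out-File -Encoding ASCII build.props
"BUILD_STAMP=$(Get-Date -Format yyyyMMddHHmmss)" | Add-Content build.props

# Add an "Inject environment variables" (EnvInject) build step pointing at build.props;
# later build steps can then read $env:LATEST_TAG.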
You should make the workspace persistent for this job; then you can save the data you need to a file. Other jobs can access this persistent workspace, or use it as their own, as long as they run on the same node.
Another option would be to use Jenkins' built-in artifact retention. At the end of the job's configure page there is an option to retain files matching a pattern (e.g. *.xml or last_build_number). These are then given a specific address that can be used by other jobs regardless of which node they are on; the address can be on the master or the node, IIRC.
For the simple case of wanting to read a single object from PowerShell, you can convert it to a JSON string in PowerShell and then convert it back in Groovy. Here's an example:
// readJSON is provided by the Pipeline Utility Steps plugin
def pathsJSON = powershell(returnStdout: true, script: "ConvertTo-Json ((Get-ChildItem -Path *.txt) | select -Property Name)")
def paths = []
if (pathsJSON != '') {
    paths = readJSON text: pathsJSON
}
I'm new to Chef and seeking help here. I'm looking into using Chef to deploy our builds to Chef node servers (Windows Server 2012 machines). I have a cookbook called copy_builds that goes out to a central repository, selects the build we want to deploy, and copies it out to the node server. The recipe I have contains the basic steps that perform the copy, and it could be used for every build we want to deploy except for one thing: the build name.
Here is an example of the recipe:
powershell_script 'Copy build files' do
  code '
    $Project = "Dev3_SomeCoolBuild"
    net use "\\\\server\\build_share\\drop\\$Project"
    $BuildNum = GC "\\\\server\\build_share\\drop\\$Project\\buildlabel.txt"
    robocopy \\\\server\\build_share\\drop\\$Project\\bin W:\\binroot\\$BuildNum'
end
As you can see, the variable $Project contains the name of the build in this recipe. If we have 100 different builds, all with different names, then what is the best way to handle this without creating 100 different recipes for my copy_builds cookbook?
BTW: this is how I'm currently calling Chef to deploy, which is in a PowerShell script that's external to Chef:
knife node run_list set $Node "recipe[copy_builds::$ProjectName],recipe[install_build]"
This command (from the external PowerShell script) contains the project/build name in its own $ProjectName variable. In this case $ProjectName contains 'Dev3_SomeCoolBuild', which references the recipe Dev3_SomeCoolBuild.rb.
What I'd like is to have just one default recipe in the copy_builds cookbook and pass in the build/project name. Is this possible? And what is the best way to do it? I've read about data bags, attributes, and providers, but I'm not sure whether they would work for what I want.
Please advise.
Thanks,
Keith
The best approach for you is likely to use a single recipe that gets a list of projects to deploy from a databag or node attributes (or both). So basically take what you have now and put it in a loop, and then use either roles to set node attributes or put the project mapping into a databag item.
I ended up using attributes to solve my problem. I updated my script to write the build name to the attributes/default.rb file of the copy_builds cookbook and to upload the cookbook to Chef each time a deployment is run (a sketch of that script is below).
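A rough sketch of what that external PowerShell script does (the cookbook path is a placeholder, and $Node is assumed to hold the target node name, as in the question):
$ProjectName = 'Dev3_SomeCoolBuild'

# Write the build name into the cookbook's default attributes file.
"default['copy_builds']['build'] = '$ProjectName'" |
    Set-Content .\cookbooks\copy_builds\attributes\default.rb

# Upload the updated cookbook, then set the node's run list.
knife cookbook upload copy_builds
knife node run_list set $Node "recipe[copy_builds],recipe[install_build]"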
My recipe now includes a call to the attributes file to get the build name, like so:
powershell_script 'Copy build files' do
  code <<-EOH
    $BuildNum = GC \\\\hqfas302002c\\build_share\\drop\\"#{node['copy_builds']['build']}"\\buildlabel.txt
    robocopy \\\\hqfas302002c\\build_share\\drop\\"#{node['copy_builds']['build']}"\\webbin W:\\binroot\\$BuildNum /E
  EOH
end
And now my call to Chef looks like this:
knife node run_list set $Node "recipe[copy_builds],recipe[install_build]"