CloudBees Jenkins Folders Plugin: Folder Name as Environment Variable

Is the Folder Name available as an environment variable similar to JOB_NAME?
For a job inside a folder, JOB_NAME contains the full path including parent folders. I want just the immediate parent folder as an environment variable.

I used:
FOLDER_NAME=${JOB_NAME%/*}
STAGE=${JOB_NAME##*/}
See http://www.tldp.org/LDP/abs/html/parameter-substitution.html
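For example (the JOB_NAME value here is hypothetical), for a job one folder deep:
JOB_NAME='MyFolder/my-job'     # hypothetical value Jenkins would set
FOLDER_NAME=${JOB_NAME%/*}     # -> MyFolder (everything before the last slash)
STAGE=${JOB_NAME##*/}          # -> my-job (everything after the last slash)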

No such variable exists, but $(basename $(dirname $JOB_NAME)) would give you what you are asking for.
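For instance, with a hypothetical nested job path:
JOB_NAME='Team/Project/build-job'        # hypothetical value
echo $(basename $(dirname $JOB_NAME))    # prints: Project (the immediate parent folder)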

Using the suggestion of $(basename $(dirname $JOB_NAME)) combined with the Environment Script Plugin, you should be able to set such an environment variable by selecting Generate environment variables from script and then providing the Script Content of:
echo FOLDER_NAME=$(basename $(dirname $JOB_NAME))


File path from within Azure CLI task

I have an Azure CLI task which references a PowerShell script (via build artifact) running az commands. Most of these commands work successfully, but when attempting to execute the following command:
az appconfig kv import --name $resourceName -s file --path appconfig.json --format json
I've noticed that the information was not present on the Azure resource, and the log file shows "File is not available".
I must be referencing the file incorrectly from the build artifact, but if anyone could provide some clarity around this, that would be great.
I must be referencing the file incorrectly from the build artifact
You can try to add $(System.ArtifactsDirectory) to the json file path. For example: --path $(System.ArtifactsDirectory)/appconfig.json.
System.ArtifactsDirectory: The directory to which artifacts are downloaded during deployment of a release. Example: C:\agent\_work\r1\a
For details, please refer to predefined variables.
This can be a little tricky to figure out.
System.ArtifactsDirectory is the default variable that indicates the directory to which artifacts are downloaded during deployment of a release.
However, to use a default variable in your script, you must first replace the . in the default variable names with _. For example, to print the value of artifact variable System.ArtifactsDirectory in a PowerShell script, you would have to use $env:SYSTEM_ARTIFACTSDIRECTORY.
I have a similar setup and do it this way within my PowerShell script:
# Define the path to the file
$appSettingsFile="$env:SYSTEM_ARTIFACTSDIRECTORY\<rest_of_the_path>\appconfig.json"
# Pass it to the Azure CLI command
az appconfig kv import -n $appConfigName -s file --path $appSettingsFile --format json --separator . --yes
It is also helpful to view the current values of all variables to see what they contain before using them.
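For example, one simple way to do that from within the PowerShell script (just a sketch, it dumps everything the agent exposes) is:
# Print every environment variable available to the script, sorted by name
Get-ChildItem Env: | Sort-Object Name | Format-Table -AutoSize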
References:
Default variables - System
Using default variables

Is there a way to specify the output path of .coverage?

I'm looking for a way to specify the output path of the generated .coverage file. I've checked coverage help and did some research, but still no luck so far. The reason is that I would like it to be written to our tmp directory.
You can create a .coveragerc and specify the path for .coverage using the data_file attribute under the [run] section.
The data_file setting is described in the current documentation: https://coverage.readthedocs.io/en/latest/config.html
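A minimal sketch of such a .coveragerc, assuming your temporary directory is tmp/ under the project root:
[run]
# Write the coverage data file into the project's tmp directory
data_file = tmp/.coverage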
You can use the [run] data_file setting (https://coverage.readthedocs.io/en/latest/config.html#run), or the COVERAGE_FILE environment variable (https://coverage.readthedocs.io/en/latest/cmd.html#data-file).
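For example, with the environment variable approach (the tmp/ path and the pytest test runner are just assumptions):
COVERAGE_FILE=tmp/.coverage coverage run -m pytest
COVERAGE_FILE=tmp/.coverage coverage report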

Why is fish not using my abbreviations.fish file?

I have the following abbreviations.fish file located in ~/.config/fish/abbreviations.fish
abbr -a gco 'git checkout'
But when I am in the terminal, I can't use gco. Can I just create any .fish files in the fish config folder and they should be automatically loaded?
Can I just create any .fish files in the fish config folder and they should be automatically loaded?
No.
You can create function files in ~/.config/fish/functions (these should contain the function they are named after, and are only loaded when that function is about to be executed).
In fish >= 2.3.0, you can put arbitrary files that will be sourced on startup into ~/.config/fish/conf.d/. The only restriction is that the name has to end in ".fish".
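So, assuming fish >= 2.3.0, one way to fix your setup is to move the file into conf.d:
mkdir -p ~/.config/fish/conf.d
mv ~/.config/fish/abbreviations.fish ~/.config/fish/conf.d/abbreviations.fish
New shells started after that will pick up the gco abbreviation automatically.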

Creating files at PSModulePath in batch

I am currently trying to write a batch program that installs a module named SetConsolePath.psm1 at the correct location. I am a beginner with batch and I have absolutely no PowerShell experience.
Through the internet, I have learned how to display PSModulePath with powershell -command "echo $env:PSModulePath".
How can I, via .bat file, move SetConsolePath.psm1 from the desktop to the location displayed by powershell -command "echo $env:PSModulePath"?
Thank you in advance, and I apologize for my lack of experience.
Before I answer, I must point out that you do not want to copy PowerShell module files directly to the path pointed to by PSModulePath. You really want to create a folder inside PSModulePath and copy the files there instead.
The env prefix in a PowerShell variable indicates an environment variable. $env:PSModulePath is actually referring to the PSMODULEPATH environment variable. On the command line, and in batch files, environment variables can be displayed by placing the name between percent symbols. (In fact, you could have displayed this value by typing echo %PSMODULEPATH% instead.)
To reference the desktop folder, have a look at this answer, which shows you how to use another environment variable, USERPROFILE.
Therefore, to copy the file from the desktop directory to the path specified in PSModulePath, you would do this:
COPY "%USERPROFILE%\Desktop\SetConsolePath.psm1" "%PSMODULEPATH%"
And, as I warned earlier, you really should copy the file to a folder underneath PSModulePath. So what you really want is:
IF NOT EXIST "%PSMODULEPATH%\MyNewFolder" MKDIR "%PSMODULEPATH%\MyNewFolder"
COPY "%USERPROFILE%\Desktop\SetConsolePath.psm1" "%PSMODULEPATH%\MyNewFolder"

Capistrano - How to put files in the shared folder?

I am new to Capistrano and I saw there is a shared folder and also the option :linked_files. I think the shared folder is used to keep files between releases. But my question is, how do files end up in the shared folder?
Also, if I want to symlink another directory into the current directory, e.g. a static folder at some path, how do I add it to linked_dirs?
Lastly, how do I set chmod 755 on the linked_files and linked_dirs?
Thank you.
Folders inside your app are symlinks to folders in the shared directory. If your app writes to log/production.log, it will actually write to ../shared/log/production.log. That's how the files end up being in the shared folder.
You can see how this works by looking at the feature specs or tests in Capistrano.
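To make this concrete, a typical Capistrano 3 deploy tree looks roughly like this (the deploy path and linked entries are illustrative):
/var/www/myapp/
  current -> releases/20240101120000/
  releases/
  shared/
    log/
    config/database.yml
Each linked entry inside a release (for example log/ or config/database.yml) is a symlink back into shared/, so every release reads and writes the same files.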
If you want to chmod these shared files, you can just do it once directly over ssh since they won't ever be modified by Capistrano after they've been created.
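For example (hostname and paths are placeholders):
ssh deploy@example.com 'chmod 755 /var/www/myapp/shared/public/uploads /var/www/myapp/shared/config/database.yml'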
To add a linked directory, in your deploy.rb:
set :linked_dirs, %w{bin log tmp/backup tmp/pids tmp/cache tmp/sockets vendor/bundle}
or
set :linked_dirs, fetch(:linked_dirs) + %w{public/system}
Capistrano 3.5+
Capistrano 3.5 introduced append for array fields. From the official docs, you should use these:
For Shared Files:
append :linked_files, %w{config/database.yml}
For Shared Directories:
append :linked_dirs, %w{bin log public/uploads vendor/bundle}
I've written a task for Capistrano 3 to upload your config files to the shared folder of each of your servers. It checks these directories in order:
config/deploy/config/:stage/*.yml
config/deploy/config/*.yml
It will upload all config files found, but only if they have changed. Note also that if you have the same file in both directories, the second one will be ignored.
Here's the code: https://gist.github.com/Jesus/448d618c83fb0445ebbf
One last thing: this task just uploads the config files to your remote shared folder; you still need to set linked_files in config/deploy.rb, e.g.:
set :linked_files, %w{config/database.yml config/aws.yml}
UPDATE:
If you're using Git, you'll probably want to ignore these files:
echo "config/deploy/config/*" >> .gitignore
There are 3 simple steps you can follow to share a file that you don't want to change across consecutive releases. First, add your file to the linked_files list:
set :linked_files, fetch(:linked_files, []).push('config.php')
Do this for all the files that you want to share. Then put the file on the remote server, in the shared folder, through scp:
scp config.php deployer@amazon:~/capistrano/shared/config.php
Now, deploy through the command given below:
bundle exec cap staging deploy
Of course, staging can be changed as per your requirements; it may be production, sandbox, etc.
One more thing: you probably don't want your team members to commit such files, so put this file in your .gitignore and push that to your remote Git repo.
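For example:
echo "config.php" >> .gitignore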
For Capistrano 3.5+, as specified in the official docs:
append :linked_dirs, ".bundle", "tmp"
For me, none of the above worked, so I ended up adding two tasks to the end of the deployment process:
namespace :your_company do
  desc "remove index.php"
  task :rm_files do
    on roles(:all) do
      execute "rm -rf #{release_path}/index.php"
    end
  end
end

namespace :your_company do
  desc "add symlink to index.php"
  task :add_files do
    on roles(:all) do
      execute "ln -sf #{shared_path}/index.php #{release_path}/index.php"
    end
  end
end
after "deploy:finished", "your_company:rm_files"
after "deploy:finished", "your_company:add_files"