How to load environment variables on a remote AIX machine through ssh while running a script from a Jenkins pipeline? - perl

We are using some custom modules in our Perl automation framework, which runs through a Jenkins pipeline. Recently we got a "package not found" error for all custom modules while executing test cases on the AIX servers, since the latest Perl version is installed there. So we tried to add "PERL5LIB" to the path as described in this document:
https://perlmaven.com/how-to-change-inc-to-find-perl-modules-in-non-standard-locations
We added "export PERL5LIB=/home/foobar/code" to /etc/profile on the AIX server, and the script executes without any issue when run directly on the AIX machine.
Issue:
But we have a Jenkins pipeline that executes the scripts on the AIX server using ssh. When we SSH to the AIX server from the pipeline script, the variables we set in /etc/profile are not loaded and we get the "package not found" error.
Question: How can I load the profile on the AIX server while running from the pipeline? Or is there another way to handle this? Before executing the script I want to export PERL5LIB on the remote AIX server through the pipeline (only once), after which I should not get the "package not found" error.
Solutions I have tried so far:
Loading /etc/profile: ssh <AIX server> . /etc/profile (using the dot since source does not work on AIX)
Adding the line "export PERL5LIB=/home/foobar/code" to .ssh/environment on the AIX server and setting PermitUserEnvironment yes
Appreciate any help on this.

Assign values to variables the usual way:
ssh user@host 'export PERL5LIB=/somepath; echo $PERL5LIB'
user@host's password:
/somepath
or
ssh user@host '. /etc/profile.local; echo $PERL5LIB'
user@host's password:
/somepath/from/profile
Edit:
If you have to execute multiple commands, create a script and upload it to the target computer, for example:
SCRIPTNAME=/tmp/$$.$RANDOM.script
scp myscript.sh user@host:"$SCRIPTNAME"
ssh user@host "$SCRIPTNAME"

This is solved with the changes below.
Step 1: Edit ~/.ssh/environment. Add the variable PERL5LIB="/path/to/the/modules".
Step 2: Edit /etc/ssh/sshd_config. Change PermitUserEnvironment from no to yes (uncomment it if it is commented out). This lets SSH sessions pick up the environment variables set in ~/.ssh/environment.
Step 3: Restart the sshd service. (This is important. I had tried steps 1 and 2 before but had not restarted the service, so the solution was not working.)
We can create a script with these steps and run it once from the pipeline before executing the automation tests.
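For reference, a minimal sketch of those three steps as shell commands on the AIX server, assuming /home/foobar/code is the module path and that sshd is managed by the AIX System Resource Controller (adjust paths and the restart command to your setup):
# Step 1: make PERL5LIB available to non-interactive SSH sessions (no "export" keyword here)
echo 'PERL5LIB=/home/foobar/code' >> ~/.ssh/environment
# Step 2: as root, set "PermitUserEnvironment yes" in /etc/ssh/sshd_config
# Step 3: restart sshd so the change takes effect
stopsrc -s sshd && startsrc -s sshd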

Related

Execute PowerShell script with gitlab-runner on a local Windows machine

I have the following setup:
a Windows PC with gitlab-runner installed (working)
a PowerShell script on the same PC that starts an application
a GitLab server that connects to this local PC and starts the PowerShell script
Now, when starting the PowerShell script directly on the local PC, the application starts and terminates after it is done, working as expected. When starting the same PowerShell script via the GitLab server (yml file), I can see that the application has been started (new process in Task Manager), but it does not run properly and it never terminates.
When I manually end the task, I see that the GitLab job terminates as well.
Questions:
What could be the root cause?
Is it possible to run the PowerShell script with gitlab-runner? I think there is a way with the "exec" command. What does the command look like when calling the PowerShell script?
Is it possible to run the application not in the background, in order to see what is going on?
Anything else?
Thanks in advance.
I think there is a bug in the GitLab runner on Windows: no matter which shell you configure in config.toml, the runner will always use cmd.exe for a local exec run.
Specify the --shell argument to override the default cmd.exe shell:
> gitlab-runner exec shell your_job --shell pwsh
If you run this locally in your project, it outputs to .builds/, so add that directory to your .gitignore, because git will see it and think you might want to add a submodule.
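For example, the ignore entry could simply be (using the output directory mentioned above):
# .gitignore
.builds/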

Run perl script on remote server

Is it possible to run a Perl script that is located on a remote server, on that server, from Windows? There is a job on the remote server that I want to run every time I do something on Windows.
You have to have something listening for an instruction to run the script, and then you have to send the instruction.
There are lots of approaches you could take to that, including:
Running an SSH server and then connecting to it from an ssh client on the Windows machine
Running an HTTP server, running the script through FastCGI, and then requesting the URL for it from curl or a browser on the Windows machine
Writing a custom protocol, listening on a socket, and then writing a custom client that you run on the Windows machine
Absolutely.
You can use plink to run commands on the server from Windows, assuming the server is running sshd.
plink user@a.domain.ext echo hi
This will print "hi\n" to the standard output.
Substitute /path/to/perl/script for echo above and substitute hi with any command line argument that the script needs.
plink is available here: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
One cautionary personal note from doing this many times: the environment in which the perl script runs is much less complete than what you get when logging in via a full SSH session and running the command interactively. Many environment variables you would normally expect are unset.
For instance, running "set | wc -l" in the command above shows only 39 environment variables defined, while an interactive SSH session has 57. You have to make sure your perl script isn't depending on an environment variable that hasn't been set. For instance, you may need to use full paths for any modules that it uses, or use the -I flag in the shebang line, because @INC may not be what you expect it to be.
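As a sketch of that idea, you can make the module directory explicit on the command line instead of relying on the remote login environment (the paths below are placeholders):
# add the module directory to @INC explicitly so the script does not depend on PERL5LIB being set
plink user@a.domain.ext perl -I/path/to/modules /path/to/perl/script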

Run a perl script on remote machine from local machine using Telnet or SSH with Perl

I want to run a Perl script on a remote machine using telnet or ssh. The script is on my local host. How can I do this? Can anyone please help me with this?
If you for some reason don't want to copy the script to the remote host and then run it, you can send the script to the Perl interpreter over stdin. If perl doesn't get a script file name on the command line, it tries to read the script from stdin. So this would work:
ssh user#remote perl < my_script.pl
Of course this requires that all necessary modules are already installed on the remote host. If your script only has pure-Perl dependencies, you can work around this restriction by using App::FatPacker to make your script (more) self-contained.
But if this is a recurring task, I would recommend getting the script deployed properly to your remote host.
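A minimal sketch of the App::FatPacker route, assuming the fatpack command-line tool it ships with is installed locally and the dependencies really are pure Perl:
# bundle the pure-Perl dependencies into one self-contained file
fatpack pack my_script.pl > my_script.packed.pl
# then run it on the remote host over stdin as above
ssh user@remote perl < my_script.packed.pl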
scp your script to the remote machine, then:
ssh user@remote 'perl /path/to/remote/script.pl'
Using a HERE document across SSH might also do the trick you are after. You can run at least a BASH script this way without first copying it to the remote host. I have not verified anything other than BASH, but there is no reason to doubt it either. Please see:
ssh + here document + interactive mode
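A minimal sketch of that idea applied to Perl, assuming perl is in the remote PATH (the quoted EOF delimiter keeps the local shell from expanding anything inside the script):
ssh user@remote perl <<'EOF'
use strict;
use warnings;
print "running on the remote host\n";
EOF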

Why does my perl script require exporting the packages path into @INC to run remotely via ssh

The server-side perl script - with its required packages - works locally when run by the user "my_user".
But if I run the script remotely (over ssh), I need to export PERL5LIB=/usr/local/share/perl/5.10.0/my_modules before calling the perl script to get it working.
Why is this, and how can I work around it so that I don't have to export PERL5LIB each time I call a remote perl script?
WORKING:
ssh my_user@remote_server "export PERL5LIB=/usr/local/share/perl/5.10.0/my_modules; /cgi-bin/my_perl_script.pl --option1 foo --option2 '*';"
NOT WORKING:
ssh my_user@remote_server "/cgi-bin/my_perl_script.pl --option1 foo --option2 '*';"
returns:
Can't locate my_package1.pm in @INC
This might be more of an ssh question than strictly a perl one: why does the remote user running the perl script not inherit its local ENV data?
Thx
As suggested by @mu_is_too_short (no friction is good as well), and linking to a more detailed explanation here, there are different types of shells: "the SSH command execution shell is a non-interactive shell, whereas your normal shell is either a login shell or an interactive shell".
So the solution is what I was already doing as a workaround (i.e. adding "export PERL5LIB" before running the script), or better, source the whole environment of the remote user so that the remote shell behaves as expected.
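A sketch of the second option, assuming the remote user's PERL5LIB is set in ~/.profile (adjust this to whichever file your remote login shell actually reads):
ssh my_user@remote_server ". ~/.profile; /cgi-bin/my_perl_script.pl --option1 foo --option2 '*';"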

Is it possible to have Perl run shell script aliases?

Is it possible to have a Perl script run shell aliases? I am running into a situation where we've got a Perl module I don't have access to modify, and one of the things it does is log into multiple servers via SSH to run some commands remotely. Sadly, some of the systems (which I also don't have access to modify) have a buggy SSH server that will disconnect as soon as my system tries to send an SSH public key. I have the SSH agent running because I need it to connect to some other servers.
My initial solution was to set up an alias that turned ssh into ssh -o PubkeyAuthentication=no, but Perl runs the ssh binary it finds in the PATH instead of trying to use the alias.
It looks like the only solutions are to disable the SSH agent while connecting to the problem servers, or to override the Perl module that does the actual connection.
Perhaps you could put a command called ssh in your PATH, ahead of the real ssh, that runs ssh the way you want it to be run.
Alter the PATH before you run the perl script, or use this in your .ssh/config
Host *
PubkeyAuthentication no
Why don't you skip the alias and just create a shell script called ssh in a directory somewhere, then change the path to put that directory before the one containing the real ssh?
I had to do this recently with iostat because the new version output a different format that a third-party product couldn't handle (it scanned the output to generate a report).
I just created an iostat shell script which called the real iostat (with hardcoded path, but you could be more sophisticated), passing the output through an awk script to massage it into the original format. Then, I changed the path for the third-party program and it started working fine.
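A sketch of that wrapper approach applied to the ssh case in this question (the ~/bin location and the /usr/bin/ssh path are assumptions, adjust them to your system):
#!/bin/sh
# save as ~/bin/ssh and run: chmod +x ~/bin/ssh
# make sure ~/bin comes before the real ssh's directory in PATH,
# e.g. export PATH="$HOME/bin:$PATH", before the Perl module spawns ssh
exec /usr/bin/ssh -o PubkeyAuthentication=no "$@"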
You could declare a function in .bashrc (or .profile or whatever) with that name. It could look like this (might break):
function ssh {
/usr/bin/ssh -o PubkeyAuthentication=no "$@"
}
But using a config file might be the best solution in your case.