Capture short host name using built-in Perl modules?

Is there a cleaner way to do
use Sys::Hostname qw(hostname);
my $hostname = hostname();
$hostname =~ s/\.domain//;
Basically, is it possible to strip the hostname down to its short name without running two $hostname assignments and without additional modules?

You may use Net::Domain's hostname instead. Its documentation says:
Returns the smallest part of the FQDN which can be used to identify the host.
use Net::Domain qw(hostname);
my $hostname = hostname();
Without additional modules, call the external command hostname -s:
-s, --short
Display the short host name. This is the host name cut at the
first dot.
chomp(my $hostname = `hostname -s`);
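Or, staying with Sys::Hostname and no external command, the /r substitution modifier (Perl 5.14+) lets the strip happen in the same statement as the assignment; a minimal sketch of that variant:
use Sys::Hostname qw(hostname);
# /r returns a modified copy instead of editing in place (Perl 5.14+),
# so the dot and everything after it can be dropped in one statement.
my $hostname = hostname() =~ s/\..*//r;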

Using Sys::Hostname:
use Sys::Hostname;
my ($short_hostname) = split /\./, hostname(); # Split by '.', keep the first part
Using the system hostname command:
chomp(my ($short_hostname) = `hostname | cut -f 1 -d.`);


Quotes and slashes surviving multiple layers

Goal
I need to effectively run a copy (cp) command but have explicit quote symbols preserved. This is needed so that the z/OS Unix System Services Korn shell properly recognizes the target of the copy as a traditional MVS dataset.
The complexity is that this step is part of an automated process. The command is generated by Perl. That Perl is executed on a separate Docker container via ssh. This adds another layer of escaping that needs to be addressed, in addition to the escaping needed by Perl.
Basically, docker is doing something like
perl myprogram.perl
which generates the necessary SSH commands, sending them to the mainframe which tries to run them. When I run the Perl script, it generates the command
sshpass -p passwd ssh woodsmn@bldbmsb.boulder.mycompany.com export _UNIX03=NO;cp -P "RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=\(TRACK,\(1,1\)\)" /u/woodsmn/SSC.D051721.T200335.S90.CP037 "//'WOODSMN.SSC.D051721.T200335.S90.CP037'"
and the mainframe returns an error:
cp: target "//'WOODSMN.SSC.D051721.T200335.S90.CP037'" is not a directory
The sshpass is needed because my sysadmin refuses to turn on authorized users, so my only option is to run sshpass and shove a password in. The password exposure is contained and we're not worried about this.
The first command
export _UNIX03=NO
tells z/OS to treat the -P option as an indicator for MVS dataset control blocks. That is, this is where we tell the system, hey this is a fixed length of 287 characters, allocate in tracks, etc. The dataset will be assumed to be new.
For the copy command, I want z/OS to copy the HFS file (basically a normal UNIX file)
/u/woodsmn/SSC.D051721.T200335.S90.CP037
into the fully qualified MVS dataset
WOODSMN.SSC.D051721.T200335.S90.CP037
Sometimes MVS commands assume a high-level qualifier of basically the user's userid and allow the user to omit it. In this case, I've explicitly specified it.
To get z/OS to treat the target as a dataset, one needs to prefix it with two slashes, so //.
To use a fully qualified dataset, the name needs to be surrounded by apostrophes (').
But, to avoid confusion within the Korn shell, the target needs to be surrounded by double quotes (").
So, somehow between Perl, the shell running my SSH command inside the Docker container (likely bash) and the receiving Korn shell on z/OS, it's not being properly interpreted.
My scaled down Perl looks like:
use strict;
use warnings;
sub putMvsFileByHfs;
use IO::Socket::IP;
use IO::Socket::SSL;
use IPC::Run3;
use Net::SCP;
my $SSCJCL_SOURCE_DIRECTORY = "/home/bluecost/";
my $SSCJCL_STORAGE_UNIT = "TRACK";
my $SSCJCL_PRIMARY_EXTENTS = "1";
my $SSCJCL_SECONDARY_EXTENTS = "1";
my $SSCJCL_HFS_LOCATION="/u/woodsmn";
my $SSCJCL_STAGING_HLQ = "WOODSMN";
my $COST_FILE="SSC.D051721.T200335.S90.CP037";
my $SSCJCL_USER_PW="mypass";
my $SCJCL_USER_ID="woodsmn";
my $SSCJCL_HOST_NAME="bldbmsb.boulder.mycompany.com";
my $MVS_FORMAT_OPTIONS="-P ".qq(")."RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=\\("
.${SSCJCL_STORAGE_UNIT}
.",\\("
.${SSCJCL_PRIMARY_EXTENTS}
.","
.${SSCJCL_SECONDARY_EXTENTS}
."\\)\\)".qq(");
putMvsFileByHfs(${MVS_FORMAT_OPTIONS}." ",
$SSCJCL_SOURCE_DIRECTORY.'/'.$COST_FILE,
${SSCJCL_HFS_LOCATION}.'/'.$COST_FILE,
${SSCJCL_STAGING_HLQ}.'.'.$COST_FILE);
# This function copies the file first from my local volume mounted to the Docker container
# to my mainframe ZFS volume. Then it attempts to copy it from ZFS to a traditional MVS
# dataset. This second part is the failing part.
sub putMvsFileByHfs
{
#
# First copy the file from the local file system to the mainframe in HFS form (copy to USS).
# This part works.
#
my $OPTIONS = shift;
my $FULLY_QUALIFIED_LOCAL_FILE = shift;
my $FULLY_QUALIFIED_HFS_FILE = shift;
my $FULLY_QUALIFIED_MVS_FILE = shift;
RunScpCommand($FULLY_QUALIFIED_LOCAL_FILE,$FULLY_QUALIFIED_HFS_FILE);
#
# I am doing something wrong here
# Attempt to build the target dataset name.
#
my $dsnPrefix = qq(\"//');
my $dsnSuffix = qq('\");
my $FULLY_QUALIFIED_MVS_ARGUMENT = ${dsnPrefix}.${FULLY_QUALIFIED_MVS_FILE}.${dsnSuffix};
RunSshCommand("export _UNIX03=NO;cp ${OPTIONS}".${FULLY_QUALIFIED_HFS_FILE}." ".${FULLY_QUALIFIED_MVS_ARGUMENT});
}
# This function marshals whatever command I want to run and mostly does it. I'm not having
# any connectivity issues. My command at least reaches the server and SSH will try to run it.
sub RunScpCommand()
{
my $ssh_source= $_[0];
my $ssh_target= $_[1];
my ($out,$err);
my $in = "${SSCJCL_USER_PW}\n";
my $full_command = "sshpass -p ".${SSCJCL_USER_PW}." scp ".${ssh_source}." ".${SSCJCL_USER_ID}."#".${SSCJCL_HOST_NAME}.":".${ssh_target};
print ($full_command."\n");
run3 $full_command,\$in,\$out,\$err;
print ($out."\n");
print ($err."\n");
return ($out,$err);
}
# This function marshals whatever command I want to run and mostly does it. I'm not having
# any connectivity issues. My command at least reaches the server and SSH will try to run it.
sub RunSshCommand
{
my $ssh_command = $_[0];
my $in = "${SSCJCL_USER_PW}\n";
my ($out,$err);
my $full_command = "sshpass -p ".${SSCJCL_USER_PW}." ssh ".${SSCJCL_USER_ID}."#".${SSCJCL_HOST_NAME}." ".${ssh_command};
print ($full_command."\n");
run3 $full_command,\$in,\$out,\$err;
print ($out."\n");
print ($err."\n");
return ($out,$err);
}
Please forgive any Perl malpractices above as I'm new to Perl, though kind constructive pointers are appreciated.
First, let's build the values we want to pass to the program. We'll worry about building shell commands later.
my @OPTIONS = (
-P => join(',',
"RECFM=FB",
"LRECL=287",
"BLKSIZE=6027",
"SPACE=($SSCJCL_STORAGE_UNIT,($SSCJCL_PRIMARY_EXTENTS,$SSCJCL_SECONDARY_EXTENTS))",
),
);
my $FULLY_QUALIFIED_LOCAL_FILE = "$SSCJCL_SOURCE_DIRECTORY/$COST_FILE";
my $FULLY_QUALIFIED_HFS_FILE = "$SSCJCL_HFS_LOCATION/$COST_FILE";
my $FULLY_QUALIFIED_MVS_FILE = "$SSCJCL_STAGING_HLQ.$COST_FILE";
my $FULLY_QUALIFIED_MVS_ARGUMENT = "//'$FULLY_QUALIFIED_MVS_FILE'";
Easy peasy.
Now it's time to build the commands to execute. The key is to avoid trying to do multiple levels of escaping at once. First build the remote command, and then build the local command.
use String::ShellQuote qw( shell_quote );
my $scp_cmd = shell_quote(
"sshpass",
-p => $SSCJCL_USER_PW,
"scp",
$FULLY_QUALIFIED_LOCAL_FILE,
"$SSCJCL_USER_ID\#$SSCJCL_HOST_NAME:$FULLY_QUALIFIED_HFS_FILE",
);
run3 $scp_cmd, ...;
my $remote_cmd =
'_UNIX03=NO ' .
shell_quote(
"cp",
@OPTIONS,
$FULLY_QUALIFIED_HFS_FILE,
$FULLY_QUALIFIED_MVS_ARGUMENT,
);
my $ssh_cmd = shell_quote(
"sshpass",
-p => $SSCJCL_USER_PW,
"ssh", $remote_cmd,
);
run3 $ssh_cmd, ...;
But there's a much better solution since you're using run3. You can entirely avoid creating a shell on the local host, and thus entirely avoid having to create a command for it! This is done by passing a reference to an array containing the program and its args instead of passing a shell command.
use String::ShellQuote qw( shell_quote );
my @scp_cmd = (
"sshpass",
-p => $SSCJCL_USER_PW,
"scp",
$FULLY_QUALIFIED_LOCAL_FILE,
"$SSCJCL_USER_ID\#$SSCJCL_HOST_NAME:$FULLY_QUALIFIED_HFS_FILE",
);
run3 \@scp_cmd, ...;
my $remote_cmd =
'_UNIX03=NO ' .
shell_quote(
"cp",
@OPTIONS,
$FULLY_QUALIFIED_HFS_FILE,
$FULLY_QUALIFIED_MVS_ARGUMENT,
);
my @ssh_cmd = (
"sshpass",
-p => $SSCJCL_USER_PW,
"ssh", $remote_cmd,
);
run3 \@ssh_cmd, ...;
By the way, it's insecure to pass passwords on the command line; other users on the machine can see them.
By the way, VAR=VAL cmd (as a single command) sets the env var for cmd. I used that shorthand above.
The parameter specifying the disk units is "TRK", not "TRACK", so this has to be
-P "RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=\(TRK,\(1,1\)\)"
Also, I never had to escape the parentheses when running such a command interactively from an SSH session. So this works for me:
-P "RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=(TRK,(1,1))"
Then, the error
cp: target "//'WOODSMN.SSC.D051721.T200335.S90.CP037'" is not a directory
indicates that cp understood it has to copy more than one source file, thus it requires the final pathname to be a directory. This seems to confirm that the cp did not run on the remote mainframe but in your local shell (as someone pointed out, caused by not escaping the semicolon), and your local UNIX does not understand the z/OS-specific MVS data set notation //'your.mvs.data.set'.
Instead of exporting _UNIX03=NO, you could replace
-P "RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=(TRK,(1,1))"
with
-W "seqparms='RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=(TRK,(1,1))'"
Then only one command needs to be run.
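For illustration only, here is a minimal sketch of that single-command variant, reusing String::ShellQuote and IPC::Run3 from the earlier answer and the question's variable names (not tested against a real z/OS host):
use IPC::Run3 qw( run3 );
use String::ShellQuote qw( shell_quote );
# With -W "seqparms=..." there is no "export _UNIX03=NO;" step, so the
# remote side runs a single cp. shell_quote quotes for the remote Korn
# shell; the array-ref form of run3 avoids a local shell entirely.
my $remote_cmd = shell_quote(
    'cp',
    '-W', "seqparms='RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=(TRK,(1,1))'",
    $FULLY_QUALIFIED_HFS_FILE,
    "//'$FULLY_QUALIFIED_MVS_FILE'",
);
my @ssh_cmd = (
    'sshpass', '-p', $SSCJCL_USER_PW,
    'ssh', "$SSCJCL_USER_ID\@$SSCJCL_HOST_NAME",
    $remote_cmd,
);
run3 \@ssh_cmd, \undef, \my $out, \my $err;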

Perl run subroutine on different host

I am new to Perl. I have a script where I need to jump to different hosts and compare filesystems, environments, etc.
I have one main jump server (MAIN_JUMP) and 5 jump servers to different clusters (CLUSTER_JUMP_1-5). I run my script on MAIN_JUMP, but I need to run some subroutines on CLUSTER_JUMP_*. In a subroutine I jump to a specific host in the cluster.
Is it possible to run a subroutine via ssh or some Perl module directly on CLUSTER_JUMP_*? For now I use a double ssh to CLUSTER_JUMP_* and then to the specific host. It works in some cases, but, for example, selects against Oracle databases do not work because of quote marks.
Object::Remote will do this for you in a really easy way...
use strict;
use warnings;
use feature 'say';
use Object::Remote;
####################################################################
# Note that My::File must be installed on the machines you want to
# run this on!
####################################################################
# package My::File;
# use Moo;
# has path => ( is => 'ro', required => 1 );
# sub size {
# my $self = shift;
# -s $self->path;
# }
# 1;
####################################################################
use My::File;
## find the size of a local file
my $file1 = My::File->new( path => '/etc/hostname' );
say $file1->size;
## find the size of a file on a remote host
my $conn = Object::Remote->connect('host.example.net'); # ssh
my $file2 = My::File->new::on( $conn, path => '/etc/hostname' );
say $file2->size;
Update: for clarity, there's nothing special about "My::File". That's just an example of a module that you would write and ensure is installed properly on all the machines that you will be remotely accessing, plus the "client" machine. It can be any module written in an OO style.

Perl issue with storing url in variable

I am relatively new to Perl. I am trying to store a URL in a variable. Here's my code:
my $port = qx{/usr/bin/perl get_port.pl};
print $port;
my $url = "http://localhost:$port/cds/ws/CDS";
print "\n$url\n";
This gives me the below output:
4578
/cds/ws/CDS
So the get_port.pl script is giving me the port correctly but the URL isn't getting stored properly. I believe there's some issue with the slash / but I am not sure how to get around it. I have tried escaping it with backslash and I have also tried qq{} but it keeps giving the same output.
Please advise.
Output for perl get_port.pl | od -a
0000000 nl 4 5 7 8 nl
0000006
There is nothing wrong with your $url string. The problem is almost certainly that the $port string contains carriage-return characters. Presumably you are working on Windows?
Try this code instead, which extracts the first string of digits it finds in the value returned by get_port.pl and discards everything else.
my ($port) = qx{/usr/bin/perl get_port.pl} =~ /(\d+)/;
print $port, "\n";
my $url = "http://localhost:$port/cds/ws/CDS";
print $url, "\n";
As @Сухой27 was, I think, trying to point out, you can use other characters besides '/' with qx to simplify the syntax, and then you don't have to escape the slashes.
I also added a default port 8080 in case get_port.pl does not exist.
This seems to work properly.
#!/usr/bin/perl -w
# make 8080 the default port
my $port = qx{/usr/bin/perl get_port.pl} || 8080;
print $port;
my $url = "http://localhost:$port/cds/ws/CDS";
print "\n$url\n";
output
paul@ki6cq:~/SO$ ./so1.pl
Can't open perl script "get_port.pl": No such file or directory
8080
http://localhost:8080/cds/ws/CDS

scp with special characters programmatically

I have been searching for this for a while, and can't find a satisfactory answer.
I have a perl script that needs to copy a file from one host to another, essentially
sub copy_file{
my($from_server, $from_path, $to_server, $to_path, $filename) = @_;
my $from_location = "$from_server:\"\\\"${from_path}${filename}\\\"\"";
my $to_location = $to_path . $filename;
$to_location =~ s/\s/\\\\ /g;
$to_location = "${to_server}:\"\\\"${to_location}\\\"\"";
return system("scp -p $from_location $to_location >/dev/null 2>&1"");
}
The problem is, some of my filenames look like this:
BLAH;BLAH;BLAH.TXT
Some really nicely named file( With spaces, parentheses, &, etc...).xlx
I am already handling whitespaces, and the code for that is quite ugly since on each side, the files could be local or remote, and the escaping is different for the from and to part of the scp call.
What I am really looking for is either a way to escape all possible special characters or a way to bypass the shell expansion entirely by using POSIX system calls. I am OK with writing an XS module if need be.
I have the correct keys set up in the .ssh directory.
Also I am not honestly sure which special characters do and don't cause problems. I would like to support all legal filename characters.
Say you want to copy file foo(s) using scp.
As shown below, scp treats the source and target as shell literals, so you pass the following arguments to scp:
scp
-p
--
host1.com:foo\(s\) or host1.com:'foo(s)'
host2.com:foo\(s\) or host2.com:'foo(s)'
You can do that using the multi-argument syntax of system plus an escaping function.
use String::ShellQuote qw( shell_quote );
my $source = $from_server . ":" . shell_quote("$from_path/$filename");
my $target = $to_server . ":" . shell_quote("$to_path/$filename");
system('scp', '-p', '--', $source, $target);
If you really wanted to build a shell command, use shell_quote as usual.
my $cmd = shell_quote('scp', '-p', '--', $source, $target);
$ ssh ikegami@host.com 'mkdir foo ; touch foo/a foo/b foo/"*" ; ls -1 foo'
*
a
b
$ mkdir foo ; ls -1 foo
$ scp 'ikegami@host.com:foo/*' foo
* 100% 0 0.0KB/s 00:00
a 100% 0 0.0KB/s 00:00
b 100% 0 0.0KB/s 00:00
$ ls -1 foo
*
a
b
$ rm foo/* ; ls -1 foo
$ scp 'ikegami@host.com:foo/\*' foo
* 100% 0 0.0KB/s 00:00
$ ls -1 foo
*
There are three ways to handle this:
Use the multi-argument form of system which will completely avoid the shell:
system 'scp', '-p', $from_location, $to_location;
Disadvantage: you can't use shell features like redirection.
Use String::ShellQuote to escape the characters.
$from_location = shell_quote $from_location;
$to_location = shell_quote $to_location;
Disadvantage: certain strings can exist which can't be quoted safely. Furthermore, this solution is not portable as it assumes Bourne shell syntax.
Use IPC::Run, which is essentially a supercharged system command that allows redirections.
use IPC::Run qw( run );
run ['scp', '-p', $from_location, $to_location],
'>', '/dev/null', # yes, I know /dev/null isn't portable
'2>', '/dev/null'; # we could probably use IO::Null instead
Disadvantage: a complex module like this has certain limitations (e.g. Windows support is experimental), but I doubt you'll run into any issues here.
I strongly suggest you use IPC::Run.
A few suggestions:
They're not part of the standard Perl distribution, but Net::SSH2 and Net::SSH2::SFTP are two highly ranked CPAN modules that will do what you want. ssh, scp, and sftp all use the same protocol and the sshd daemon. If you can use scp, you can use sftp. This gives you a very nice Perlish way to copy files from one system to another.
The quotemeta function will take a string and backslash-escape every character other than letters, digits, and underscores for you. It's better than attempting to do the substitution yourself with s/../../.
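For instance, a minimal illustration of that suggestion (hypothetical file name; note that the remote half of an scp argument is parsed again by the remote shell, so it may still need a second layer of escaping, or the list form of system):
my $name = q{Some really nicely named file( With spaces, &, etc...).xlx};
my $safe = quotemeta($name);   # backslash-escapes every char that is not A-Za-z0-9_
my $status = system("cp -p $safe /tmp/");   # the local shell now sees a single argument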
You can use qq(..) and q(..) instead of quotation marks. This allows you to use quotation marks in your string without having to escape them over and over again. This is especially useful in the system command.
For example:
my $error = system qq(scp $user\@host:"$from_location" "$to_location");
One more little trick: if system is passed a single parameter, and that parameter has shell metacharacters in it, the command will be passed to the default system shell. However, if you pass system a list of items, those items are passed to execvp directly without going through the shell.
Passing a list of arguments to system via an array is a great way to avoid problems with file names. Spaces, shell metacharacters, and other shell issues are avoided.
my @command;
push @command, 'scp';
push @command, "$from_user\@$from_host:$from_location";
push @command, "$to_user\@$to_host:$to_location";
my $error = system @command;
use Net::OpenSSH:
my $ssh = Net::OpenSSH->new($host);
$ssh->scp_get($remote_path, $local_path);
The way arguments have to be quoted varies depending on the shell running on the remote side. The stable version of the module has support for Bourne-compatible shells. The development version available from GitHub also has support for csh and several Windows flavors (Windows quoting is, err, interesting).
For instance:
my $ssh = Net::OpenSSH->new($host, remote_shell => 'MSWin', ...);
Note that on Windows, there are strings that just cannot be properly quoted!
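For the host-to-host copy in the question, a rough sketch with Net::OpenSSH (staging through a local temporary file, assuming key-based auth and that both hosts are reachable from the machine running the script) could look like this:
use Net::OpenSSH;
use File::Temp qw( tempfile );
# Sketch only: minimal error handling; remote file names are passed to
# Net::OpenSSH as real arguments, so no manual shell escaping is needed.
sub copy_file {
    my ($from_server, $from_path, $to_server, $to_path, $filename) = @_;
    my (undef, $tmp) = tempfile( UNLINK => 1 );
    my $from = Net::OpenSSH->new($from_server);
    $from->error and die "Can't connect to $from_server: " . $from->error;
    $from->scp_get("$from_path$filename", $tmp)
        or die "scp_get failed: " . $from->error;
    my $to = Net::OpenSSH->new($to_server);
    $to->error and die "Can't connect to $to_server: " . $to->error;
    $to->scp_put($tmp, "$to_path$filename")
        or die "scp_put failed: " . $to->error;
    return 0;
}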

Script to change hostname on CentOS

I am using curl to pull a hostname from a server via an HTTP GET request, and that is working fine. I need my script to modify the /etc/sysconfig/network file so I don't have to restart the system to apply the hostname.
Here is my code thus far:
#!/usr/local/bin/perl
use strict;
use warnings;
use Sys::Hostname;
my $curl = `curl http://100.10.10.10/hostname`; # Get correct hostname
my $host = hostname;
if ($curl ne $host) {
# Need to modify the /etc/sysconfig/network file to replace hostname or add it.
}
EDIT:
My Actual Question: What is the best way for me to modify that file with the new hostname?
I guess what you want is
my $host = qx{hostname};
instead of
my $host = hostname;
Also, why don't you just make the changes yourself (i.e., open /etc/hosts or whatever file you want to edit, and edit it); just make sure $> is 0, i.e. the script is running as root.
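If you do want the script itself to rewrite the file, here is a minimal sketch, assuming the stock HOSTNAME=value format of /etc/sysconfig/network on CentOS 6 and that the script runs as root ($curl is the value fetched above):
chomp(my $new_host = $curl);   # drop any trailing newline from the curl output
my $file = '/etc/sysconfig/network';
open my $in, '<', $file or die "Can't read $file: $!";
my @lines = <$in>;
close $in;
# Replace the HOSTNAME= line, or append one if it is missing.
my $found = 0;
for (@lines) {
    $found = 1 if s/^HOSTNAME=.*/HOSTNAME=$new_host/;
}
push @lines, "HOSTNAME=$new_host\n" unless $found;
open my $out, '>', $file or die "Can't write $file: $!";
print {$out} @lines;
close $out or die "Can't close $file: $!";
# Apply it to the running system without a reboot.
system('hostname', $new_host) == 0 or warn "hostname command failed\n";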