Create scripts that run in different servers - perl

I have three servers that are used to manage a bunch of other client servers. One of the managing servers runs Nagios, another runs a web proxy, and another runs an LDAP and a MySQL server.
Whenever I need to add a new client server, I have to log into server A and create the SQL entry, go to Nagios and create the entry, go to the web server and add the proxy. You get the picture. What I would like is for all servers to share a scripts directory, say /opt/boxes/scripts, and in there have a bunch of scripts that know where they can run. Say I'm on server A and run script X, which should run on server B; it will actually run on server B.
Is there a simple way to do this? Preferably Perl-based, since that is something I know a little bit about.

One easy way may be to make a directory for each script on each of the machines.
In one directory, the actual script runs.
In all the other directories, the script does an ssh to the appropriate server and runs the actual script.
E.g., a script to add a client:
ssh servera servera-addclient-to-sql
ssh serverb serverb-addclient-to-nagios
ssh serverc serverc-addclient-to-webproxy
Adding scripts that know where they run is fairly easy too.
In loose form (as runnable Perl):
my $run_where = "XXXX";    # the host this script must run on
my $script    = "YYYY";    # path to this script on that host

chomp(my $host = `hostname`);
if ($host ne $run_where) {
    exec 'ssh', $run_where, $script, @ARGV;   # re-run remotely; exec never returns
}
# do the actual stuff here
So, run from anywhere, it runs locally if that's the right place to be, and otherwise opens a secure shell to the right host and runs the same command there.
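Putting the wrapper idea together, a top-level driver could fan the work out from any box. A minimal sketch; the host names and sub-script names are assumptions based on the examples above:

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical driver: run each sub-task on the host that owns it.
# Host names and script names follow the examples above; adjust to taste.
my %tasks = (
    servera => 'servera-addclient-to-sql',
    serverb => 'serverb-addclient-to-nagios',
    serverc => 'serverc-addclient-to-webproxy',
);

for my $host (sort keys %tasks) {
    system('ssh', $host, $tasks{$host}, @ARGV) == 0
        or warn "failed on $host: $?\n";
}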


make server backup, and keep owner with rsync

I recently configured a little server to test some services. Now, before an upgrade or installing new software, I want to make an exact copy of my files, with owners, groups, and permissions, and also the symlinks.
I tried rsync to keep the owner and group, but on the machine that receives the copy I lose them:
rsync -azp -H /directorySource/ myUser@192.168.0.30:/home/myUser/myBackupDirectory
My intention is to do this with the / folder, to keep all my configurations just in case; I have 3 services that have their own users and may make modifications in folders outside their homes.
In the destination folder the files appear owned by my destination user, whether I run the copy from the server or from the destination; it doesn't keep the users and groups! I created the same user, tried with sudo, and a friend even tried with a 777 folder :)
cp theoretically does the same but doesn't work over ssh; anyway, I tried it on the server but got many errors. As I recall, tar also keeps the permissions and owners, but it gives errors because the server is in use, and restoring from it isn't fast. I also remember the magic dd command, but I made a big partition. rsync looked like the best option, both for the copy and for keeping the backup synchronized. I read that newer versions of rsync handle owners well, but my package is already up to date.
Does anybody have an idea how to do this, or what the normal process is to keep a server properly backed up, so that a restore only requires recreating the partition?
The services are Taiga (a project management platform), a Git repository, a code reviewer, and so on; all are working well with nginx on Ubuntu Server. I haven't looked at other backup methods because I thought rsync with a cron job would do the work.
Your command would be fine, but you need to run it as the root user on the remote end (only root has permission to set file owners):
rsync -az -H /directorySource/ root@192.168.0.30:/home/myUser/myBackupDirectory
You also need to ensure that you use rsync's -o option to preserve owners, and -g to preserve groups, but as these are implied by -a your command is OK. I removed -p because that's also implied by -a.
You'll also need root access, on the local end, to do the reverse transfer (if you want to restore your files).
If that doesn't work for you (no root access), then you might consider doing this using tar. A proper archive is probably the correct tool for the job, and will contain all the correct user data. Again, root access will be needed to write that back to the file-system.
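If the root-to-root rsync works for you, a small wrapper makes it easy to run from cron, which matches the rsync-plus-cron setup you had in mind. A minimal Perl sketch reusing the command from above:

#!/usr/bin/perl
use strict;
use warnings;

# Minimal sketch: the same rsync invocation as above, ready for cron.
# Assumes passwordless ssh for root to the backup host.
my @cmd = ('rsync', '-az', '-H',
           '/directorySource/',
           'root@192.168.0.30:/home/myUser/myBackupDirectory');
system(@cmd) == 0 or die "rsync failed: $?\n";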

how to create jboss fuse admin user

I am trying to install jboss-fuse-6.1.0.redhat-379.
I am able to run esb:create-admin-user via the console, but now I am trying to automate the installation via a shell script.
I am able to start the Fuse server, but not able to create a user with
esb:create-admin-user.
Below is a sample script for creating the user:
#!/bin/ksh
cd $HOME/jboss-fuse-6.1.0.redhat-379/bin
./fuse esb:create-admin-user --new-user admin1 --new-user-password admin12
But it is not creating the user.
Please let me know how I can do this.
This could be an issue with write access to the etc directory.
Here is another solution:
Go to the jboss-fuse-6.1.0.redhat-379/etc folder.
Open the users.properties file and add username=password,admin
The last word, admin, is the "admin" role.
You are probably looking for the bin/client script to execute commands against a running Fuse instance. But for that you will need a username/password ;)
If you want to automate it, it is indeed probably easier just to use @mahesh-biradar's approach:
echo "admin1=admin12,admin" >> etc/users.properties

Login to the multiple UNIX servers one by one and then execute the command using perl

I am trying to log in to multiple UNIX servers over SSH, one by one, and then execute commands without being prompted for a username and password.
Is there any way to do this from Perl?
Yes, this can be done, and if you do, it's best to use a module for it, e.g. Net::OpenSSH.
But this is probably a bad idea. You will need to store the username and password in plaintext in a place where your Perl script can read them (e.g. in its source code). Anyone who can read your script can also read the username and password, and execute any remote command that they permit.
So you might as well create a different piece of information that can be read by anyone who can read the Perl script, and use that to authenticate: a passwordless SSH key.
Create one and add its public key to the authorized keys on the host(s) where you want to execute your remote command. Use that key to execute the remote command. This is at least as safe as hardcoding the username and password in a place where the Perl script can read them.
In fact, it is safer: you can restrict the use of that key to the command you want to execute.
And you won't need a Perl script to execute the remote command.
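That said, if you do end up driving it from Perl, Net::OpenSSH (mentioned above) combines nicely with such a key. A minimal sketch; the key path, host, and command are assumptions for illustration:

use strict;
use warnings;
use Net::OpenSSH;

# Connect with a passwordless key instead of a stored password.
my $ssh = Net::OpenSSH->new(
    'user@remotehost',
    key_path => "$ENV{HOME}/.ssh/id_automation",   # hypothetical key
);
$ssh->error and die "ssh connection failed: " . $ssh->error;

my $output = $ssh->capture('uptime');   # any remote command
$ssh->error and die "remote command failed: " . $ssh->error;
print $output;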

Remotely restarting services on several servers

I have around 1000 servers on which I need to restart the SNMP service; is there an easy method to do this via a script or a batch file?
Do you have any sort of collection of the IPs and the root users and passwords (or SSH keys)?
If so, you could use a for loop to cycle through them (the implementation depends on how they're stored), select the username and password with regular-expression filtering or by field, and use expect to supply the password.
If you don't have a collection like that, it seems you'll have to build a database of them. It may just be easier to do the restarts manually, but it may be worth creating the database anyway in case you ever need to do this again.
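As a sketch of that loop, assuming SSH keys are already in place and a plain-text host list (the file name and service name are assumptions):

#!/usr/bin/perl
use strict;
use warnings;

# hosts.txt: one hostname or IP per line (hypothetical file).
open my $fh, '<', 'hosts.txt' or die "cannot open hosts.txt: $!";
while (my $host = <$fh>) {
    chomp $host;
    next unless length $host;
    # 'snmpd' is a guess at the service name; adjust for your distro.
    system('ssh', "root\@$host", 'service snmpd restart') == 0
        or warn "restart failed on $host\n";
}
close $fh;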
You should have a look at the Ansible provisioning tool.
The steps should be something like this:
Install Ansible: sudo apt-get install ansible (on Ubuntu)
Define your server groups in /etc/ansible/hosts:
[snmpservers]
myhostnames[01:10000].example.com
Restart the service on all servers:
ansible snmpservers -m service -a "name=snmp state=restarted"

How do I copy data from a remote system without using ssh or FTP Perl modules?

I have to write a Perl script to automatically copy data from a remote server to my local system. The directory structure on the remote systems is:
../log/D1/<date>.tar.gz
../log/D2/<date>.gz
../log/D3/<date>.tar.gz
../log/D4/<date>
and the same on the other server. I want to copy the data to the local system in the format below:
../log/S1/D1/<date>.tar.gz
../log/S1/D2/<date>.gz
../log/S1/D3/<date>.tar.gz
../log/S1/D4/<date>
and the same for the other servers, i.e. S2, S3, etc.
Also, no SSH-capable Perl modules are available on the remote server or on the local server, and I don't have permission to install any Perl modules. The only good thing is that connectivity is through passwordless SSH keys.
Can anyone suggest Perl code to get this done?
I believe you can access shell commands from Perl.
So you can do this:
my $cmd = "/usr/bin/scp remotehost:remotefile localfile";
system($cmd) == 0 or die "scp failed: $?";
NOTE: scp is secure copy -- a buddy of ssh.
This does not require a Perl SSH module, but it does require ssh support on both ends (which you have).
Hope this helps.
I started to suggest the scp command line program, but it seems that there's a CPAN module for that (no surprise). Check out Net::SCP.
By using scp on your client (where you can install new Perl modules) you can copy files without having to install any new software on the remote system. It just needs to have the ssh server running - which you've said it does.
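A minimal Net::SCP sketch; the host name and the date in the file names are made up for illustration:

use strict;
use warnings;
use Net::SCP;

# Fetch one archive from a remote host into the local S1 tree.
# 'remotehost' and the file names are hypothetical examples.
my $scp = Net::SCP->new('remotehost') or die "cannot connect";
$scp->get('log/D1/20150101.tar.gz', 'log/S1/D1/20150101.tar.gz')
    or die $scp->{errstr};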
I'd say stop trying to make life difficult for yourself and get the system to support the features you require.
Trying to develop for such a limited/locked-down platform is not going to be cost-effective in the long run - you'll develop more slowly and the result will have more bugs.
A little developer time is way more expensive than a decent hosted VM / hardware box.
Get a proper host, it will definitely save money (talk to your manager about this).
From your query above, I understand that you don't have permission to install Perl modules or make changes that require administrative privileges. I love Perl, but to automate things like this you should use bash instead. Below is a sample script using passwordless SSH keys.
#!/bin/bash
# The date format must match the <date> part of the remote file names.
DATE=$(date +%Y%m%d)
BASEDIR="/basedir"
cd "$BASEDIR" || exit 1
for HOST in S1 S2 S3
do
    scp -q "$HOST:$BASEDIR/D1/$DATE.tar.gz" "$HOST/D1/"
    echo "Data copy from $HOST done"
done
exit 0
Adjust the date format as needed; for example, date +%Y%m%d gives the current date as YYYYMMDD. See the date man page for other formats.
Hope this helps.
You may not be able to install anything in system-wide lib directories, but there is nothing preventing you from installing modules in a location to which you have write access. See How do I keep my own module/library directory?
This creates no more of a security issue than allowing you to write scripts on this system in the first place.
So, go forth and install Net::SCP.
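For example, if the modules land under ~/perl5 (the usual layout for cpanm --local-lib or local::lib), the script just needs to know about that path. A minimal sketch, path assumed:

# Pick up modules installed in the home directory (assumes a ~/perl5 layout).
use lib "$ENV{HOME}/perl5/lib/perl5";
use Net::SCP;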
It sounds like you want rsync. You shouldn't have to do any programming at all.
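For example, something along these lines would mirror the layout from the question (the host name is assumed, and the paths are relative as in the question):
rsync -az server1:log/ log/S1/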