I need a package with the following features:
- a client-based secure file transfer mechanism for fetching files from multiple directories on the remote machine;
- the ability to do an ls on the remote machine;
- functions to query file permissions on the remote machine;
- a single connection used for all file transfers;
- low resource usage and fast transfers;
- the ability to zip files on the remote machine.
Is this homework? And have you tried searching CPAN? There are several modules that do what you need, such as Net::SCP and IO::Compress::Gzip.
You can get most of this done with Net::SFTP. How well the remote server complies with the compression part is mainly a function of that machine. If it supports it, you can probably issue a site command to do it.
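Here is a minimal sketch of that approach with Net::SFTP; it keeps one connection open for the listing, the permission check, and the transfer. The host, credentials, and paths are all placeholders:

use strict;
use warnings;
use Net::SFTP;

# A single SFTP connection is reused for every operation below.
my $sftp = Net::SFTP->new('remote.example.com',
                          user     => 'user',
                          password => 'secret');

# "ls" on a remote directory.
for my $entry ($sftp->ls('/remote/dir')) {
    print $entry->{longname}, "\n";   # an "ls -l"-style line
}

# Query file permissions on the remote machine.
my $attrs = $sftp->do_stat('/remote/dir/file.txt');
printf "permissions: %04o\n", $attrs->perm & 07777;

# Fetch a file over the same connection.
$sftp->get('/remote/dir/file.txt', 'file.txt');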
I am trying to install a few Perl modules, one of them being Time::Format. My corporate group policy does not allow me to use CPAN to install modules (since it uses FTP). I tried running Makefile.PL, but nmake.exe seems to be missing too (I am using Cygwin on Windows), and the policy does not allow downloading the nmake.exe executable either. PPM isn't available either.
How do I go about installing a module manually? Is there a way I could manually copy this module's files into the individual folders of my Perl directory? If so, which files go where?
You might be able to get around the FTP restriction by using a minicpan. Basically this lets you build your own local CPAN mirror, for example on a USB key drive. It's intended to let you carry a snapshot of CPAN with you, e.g. on transatlantic flights when you really need that module you didn't install.
However, those restrictions are probably there for a reason. Downloading and installing/copying stuff from the internet might be against the corporate policy just the same, since a minicpan is still an external source. You should check that. On the other hand, such policies are often aimed at the average office user, not at developers, so you might be able to talk to the policy owners and explain why you and your team should be exempted, and how reusing code from CPAN would save your company a lot of time, which equals money.
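If the minicpan route is open to you, the mirror itself can be built with the CPAN::Mini module; a minimal sketch (the local path is made up, and you would then point your CPAN client's urllist at it):

use strict;
use warnings;
use CPAN::Mini;

# Mirror the latest version of every distribution into a local
# directory (e.g. on a USB key); note this fetches over HTTP, not FTP.
CPAN::Mini->update_mirror(
    remote => 'http://www.cpan.org/',
    local  => 'C:/minicpan',   # hypothetical target directory
    trace  => 1,
);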
This might be a simple query, but I'm still not clear on it.
In my script I'm connecting to an SFTP server using the Net::SSH::Perl module. Previously I was using Net::SFTP, but I removed it from the script after it suddenly stopped working and started throwing an error.
With Net::SFTP, I would most often use the command below to put/get files from the remote server.
$sftp->put("/home/ftpford/ftpcon/conout/$file","/Uti_Integrator/READYFORPICKUP/PENDING/$file");
But I'm not sure how to get/put files using Net::SSH::Perl.
Can anyone suggest a way? I have tried many approaches and searched Google, but nothing has become clear to me.
Please note that I don't have privileges to install new modules on my server.
I want to get/put files using the above module.
I don't think that's a standard feature of Net::SSH::Perl.
One option is to use other modules that do (you don't necessarily need to install them in the system directories; you can keep them in your own directories if you include them via PERL5LIB or use lib).
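For example (the directory is hypothetical):

# Make a private module directory visible to this script;
# setting PERL5LIB in the environment has the same effect.
use lib '/home/myuser/perl5/lib';
use Net::SFTP;   # now found under the private directory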
The other option (which answers your question stricto sensu) is to emulate it: you can simply run cat > $destination_file on the remote box via Net::SSH::Perl, and then send the contents of the file over the ssh connection.
Of course, error handling and the like might not be very straightforward...
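A minimal sketch of that emulation; the host, credentials, and paths are assumptions, and the remote path must not contain shell metacharacters:

use strict;
use warnings;
use Net::SSH::Perl;

my $ssh = Net::SSH::Perl->new('remote.example.com');
$ssh->login('user', 'password');

# Slurp the local file...
open my $fh, '<', '/home/ftpford/ftpcon/conout/file.txt' or die "open: $!";
my $contents = do { local $/; <$fh> };
close $fh;

# ...and feed it to "cat" on the remote side: cmd() takes an optional
# second argument that is written to the remote command's stdin.
my ($out, $err, $exit) = $ssh->cmd("cat > '/tmp/file.txt'", $contents);
die "remote cat failed: $err" if $exit;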
BTW, you tell us you have tried many things, but you don't tell us which, and what problems you encountered.
I need help: is there a way to fetch a file from a remote server using only core modules of Perl 5.8.8? File::Fetch became a core module only in 5.9.
This comes up all the time. Take a look at the classic yes, even you can use CPAN. If you have the ability to create and run a Perl script, then you also have the ability to put a module in your local directory and use it. The requirement to use only core modules is entirely artificial.
In your case, LWP::Simple's getstore() function will do what you want. While it is technically not core, LWP::Simple is included by default with many Perl distributions. You may well already have it.
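A minimal sketch (the URL and filename are placeholders):

use strict;
use warnings;
use LWP::Simple qw(getstore is_success);

# Fetch the URL and write the response body straight to a local file.
my $status = getstore('http://example.com/data.txt', 'data.txt');
die "Download failed with HTTP status $status\n" unless is_success($status);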
Update: so, you want to do this on 1000 servers? No need to manually install the module on each server. Use CPAN programmatically to download and install the module(s) you need (some tweaking will be needed to get CPAN to install it locally rather than in the root module library). Also Leon Timmermans's suggestion of fatpacking the module is another option.
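Programmatic use of CPAN can be as simple as the sketch below; redirecting the install target to a local directory is the tweak mentioned above (e.g. setting makepl_arg to something like INSTALL_BASE=/home/me/perl5, which is an assumption about your setup):

use strict;
use warnings;
use CPAN;

# Download, build, and install the module and its dependencies.
CPAN::Shell->install('LWP::Simple');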
If you really don't want to do it this way, then basically the answer is no: there is no simple way to fetch a remote file via HTTP without either the appropriate modules or a system command (I don't consider writing your own HTTP client a simple method, but that's fine if it works for you).
The only other potential solution I see would be a different approach to your problem, such as:
- Using a script in a single location to get the file, then distributing it to all 1000 servers via FTP.
- Or, putting the file on an FTP server, then using a simple Perl script on each server to fetch it via FTP (see the Net::FTP sketch below).
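For the second approach, Net::FTP ships with the Perl core (as part of libnet), so it fits the core-modules-only constraint on 5.8.8. A minimal sketch, with host, credentials, and paths as placeholders:

use strict;
use warnings;
use Net::FTP;   # core module, part of libnet

my $ftp = Net::FTP->new('ftp.example.com') or die "Cannot connect: $@";
$ftp->login('user', 'password') or die "Login failed: ", $ftp->message;
$ftp->binary;                                  # avoid newline translation
$ftp->get('remote/data.txt', 'data.txt') or die "Get failed: ", $ftp->message;
$ftp->quit;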
As Dan already said, yes, even you can use CPAN. One approach his link doesn't mention is writing your code as a normal CPAN-style distribution and then fatpacking it. App::FatPacker combines a script with all of its (pure-Perl) dependencies into a single, easy-to-distribute file.
You could use:
my $wgetoutput = `wget "$myFileToGet"`;  # wget saves the file to the current directory
Stuff in backticks (`) is passed to the default shell, so you can call whatever you want (and are allowed to) there.
Caveat: $myFileToGet could contain something like "&& rm -rf *", so don't forget to sanitize it!
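A safer variant is sketched below: the list form of system bypasses the shell entirely, so metacharacters in the URL are never interpreted (URL and filename are placeholders):

use strict;
use warnings;

my $myFileToGet = 'http://example.com/data.txt';
# List form of system(): no shell is involved, so a malicious
# "&& rm -rf *" would be treated as part of the URL, not executed.
system('wget', '-q', '-O', 'data.txt', $myFileToGet) == 0
    or die "wget failed: $?";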
Is it possible to set up a remote NetBeans C++ project where the source files are only accessible via SSH?
My project needs to build on a Linux box, but I'd like to develop it on a Windows machine.
Checking out the code via SVN to my Windows machine is not an option, since there are a few files whose names differ only by case, and NTFS is not case-sensitive (unfortunately, I cannot change them).
I'm well aware that Windows can be more or less forced to be case-aware, and that the ideal solution is to just rename those files to something sane.
However, I'm just trying to solve this using NetBeans. Since it's a remote project anyway, why bother keeping any files locally?
Thanks
Currently, no. In general, having source files whose names differ only by case is bad practice.
You can enable case sensitivity in Windows - you may need to have a Professional version or better.
For Windows XP: http://support.microsoft.com/kb/817921
For Windows 7: http://technet.microsoft.com/en-us/library/cc732389.aspx
See also: Windows Services for Unix
Another solution would be to set up VNC/RDP on the remote Unix system. But the better long-term fix is to conform to a saner file-naming convention:
Programmer 1: "Hey man, take a look at noCamelCase.cs - I just rewrote it."
Programmer 2: "Um, nocamelcase.cs is blank."
There are two ways of doing remote builds with NetBeans. In the first, the project is stored locally: you create a regular project, and on the second page of the wizard you specify the network directory containing the source and the remote build host. I've used this from a Solaris client to a Linux server, but not from Windows, as we don't have the mounts exported over SMB. This approach uses ssh and some shared-library interposers to capture the build information.
The second way is to create a remote project. In this case the project is created on the remote host and data is copied to the client on demand. I've only done a few tests with this, as I preferred the first method for its much better latency.
Lastly, you could use VNC, or install an X server on your Windows machine and do everything on the Linux machine.
Does anyone have suggestions for methods of deploying Perl modules to a shared-nothing cluster?
Our current method is very manual.
1. Take down half the cluster.
2. Copy the Perl modules (CPAN-style modules) to the downed cluster members.
3. ssh to each member and run perl Makefile.PL; make; make install for each module to be installed.
4. Confirm the deployment.
5. Put the newly deployed cluster members into service, take the old members out of service, and repeat steps 2-4.
This is obviously far from optimal. Does anyone have, or know of, a good tool chain for deploying Perl modules to a shared-nothing cluster?
Take one node offline, install Perl, and then use it to reimage the other nodes.
At least, that's how I imagine you'd want to install software in a shared-nothing cluster. Perl is just the application you happen to be installing.
Assuming all the machines are identical, you should be able to keep one canonical installation and use rsync or something similar to keep the others updated.
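For instance, a sketch of pushing a canonical Perl tree from the master to the other nodes (hostnames and paths are made up):

use strict;
use warnings;

my @nodes = qw(node2 node3 node4);   # hypothetical cluster members
for my $node (@nodes) {
    # -a preserves permissions and times, -z compresses, and
    # --delete removes files absent from the canonical copy.
    system('rsync', '-az', '--delete',
           '/opt/perl/', "$node:/opt/perl/") == 0
        or warn "rsync to $node failed: $?";
}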
I have, in the past, developed a Perl program which used the Expect module (from CPAN) to automate basically the process you described, automatically sshing to each host, copying any necessary files, and performing the installations. Unfortunately, this was developed on-site for a client, so I do not have access to the code to share. If you're familiar with Expect, it shouldn't be too difficult to set up, though.
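The general shape of such an Expect script looks something like the sketch below; the host, prompt patterns, and install commands are all assumptions:

use strict;
use warnings;
use Expect;

# Spawn an interactive ssh session to one cluster member.
my $exp = Expect->spawn('ssh', 'user@node1') or die "Cannot spawn ssh: $!";

$exp->expect(30,
    [ qr/password:/i => sub { my $e = shift; $e->send("secret\n"); exp_continue; } ],
    [ qr/\$\s*$/     => sub { } ],   # stop waiting once a shell prompt appears
);

# Run the build and install, then leave.
$exp->send("cd /tmp/My-Module && perl Makefile.PL && make && make install\n");
$exp->expect(300, [ qr/\$\s*$/ => sub { } ]);
$exp->send("exit\n");
$exp->soft_close;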
We currently have a clustered Perl application that does data processing. We also have numerous CPAN modules, and modules we've developed ourselves, that the software depends on. When you say 'shared nothing', I'm assuming you're referring to things like NFS mounts.
If the machines have identical configurations, you may be able to build your entire application into a single directory structure (e.g. /opt/my-app), tar it up, and make that the only thing you need to push to the boxes.
As for deploying it to the boxes, you might be able to use Capistrano. We developed a couple of our own cluster utilities that piggyback on ssh; I've released one form of that utility: parallel-jobs. Its README shows an example of executing multiple parallel ssh commands. It's a small step to extend that program to know about your cluster and execute the same command across it (as opposed to a series of different commands).
If you are using a Debian or Ubuntu OS, you could package your Perl modules. I have open-sourced some code to help with this: Perl module builder. It's still very rough, but it does work, and it can be made to work on your own code as well as CPAN modules; this makes deployment much easier.
There is also a project to build Red Hat RPMs for all of CPAN; Dave Cross gave a talk, Perl in RPM-Land, which may be of use.
If you are on some other system without packaging, then the rsync option (install on one machine, then rsync to the others) should work as well. Note that you can mount a Windows share and rsync to it from Unix if needed.
Using a central manager like Puppet makes creating and maintaining machines in a cluster a lot easier, from installing code to managing users and email configuration. There is also a Perl project in the pipeline to do something similar, but it has not been made public yet.
Capistrano is a tool that allows you to run commands on a group of servers; it is perfectly suited to making your task considerably easier.
Further down the line of automation, but also of complexity, is Puppet, which lets you define a group of servers, assign them roles, and then push out sets of code to every machine subscribing to a certain role.
I am not sure exactly what a shared-nothing cluster is, but if it runs a common *nix system like Fedora, Mandriva, or Ubuntu, many Perl modules are precompiled for specific architectures, and you can easily use those.
If the systems are of the same architecture, you can do as someone else said and just copy the compiled modules from system to system; just make sure you have all of the dependencies on the recipient system as well.