PowerShell script to copy files to remote server - powershell

I have files on a local machine that I'd like to copy to a remote machine on the internet, so I can't use UNC file shares. I'd also like to avoid MSDeploy or FTP if possible. Does PowerShell have an easy way to copy a bunch of files to a remote server?

Take a look at the BitsTransfer module; it might help you - http://technet.microsoft.com/en-us/library/dd819420.aspx
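A minimal sketch of what that looks like, assuming the remote server exposes an IIS virtual directory with the BITS server extension enabled for uploads (the URL and path below are placeholders; a BITS upload job carries one file at a time):

Import-Module BitsTransfer
# upload a single file to a BITS-enabled IIS endpoint (hypothetical URL)
Start-BitsTransfer -Source 'C:\local\report.zip' `
                   -Destination 'https://remote.example.com/uploads/report.zip' `
                   -TransferType Upload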

When the remote server is a Linux or UNIX system, I use PSCP.EXE, the Windows SCP client created by the developer of PuTTY. I also use PuTTYgen to create a key pair that can be used instead of interactive password authentication.
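For example, called from PowerShell with a key generated by PuTTYgen (the paths, user name, and host below are placeholders):

# -i selects the PuTTY-format private key; -r copies a directory tree recursively
& 'C:\Program Files\PuTTY\pscp.exe' -i 'C:\keys\deploy.ppk' -r 'C:\site\*' 'deploy@remote.example.com:/var/www/site/'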

Related

Is there a way to keep Windows EFS encryption metadata in place when uploading a file to Linux?

I am trying to copy an EFS-encrypted zip file from Windows to a Linux server (through OpenSSH scp). It was encrypted using the PowerShell .Encrypt() method. Unfortunately, for whatever reason, when I download the file from the Linux server to a Windows machine, it can't be opened, because the Windows machine does not detect that it is EFS-encrypted and just regards it as an unreadable zip file.
I have exported the EFS key from the first computer and installed it on the computer that opens the file. When I move the file around on a USB key instead, it is successfully detected as an EFS-encrypted file and can be opened properly.
The PowerShell script that I'm trying to create should be invisible to the user. Another question is: could creating and mounting a VHDX file still be part of a script that doesn't interrupt the normal workflow of the user?
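A minimal sketch of the two pieces the question mentions, with hypothetical paths (.Encrypt() is the same call the question refers to; Mount-DiskImage/Dismount-DiskImage attach an existing, NTFS-formatted VHDX without user interaction, though they may require elevation). The underlying issue is that scp does not carry NTFS/EFS metadata with the file, so the copy on the Linux server stops being an EFS file; transferring a container such as a VHDX should keep the encrypted file and its metadata intact inside it:

(Get-Item 'C:\data\backup.zip').Encrypt()              # EFS-encrypt the file in place
Mount-DiskImage -ImagePath 'C:\data\container.vhdx'    # silently attach an existing VHDX
Copy-Item 'C:\data\backup.zip' 'E:\'                   # E: assumed to be the mounted volume's drive letter
Dismount-DiskImage -ImagePath 'C:\data\container.vhdx' # detach before transferring container.vhdx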

Command line for Dropbox connection in Duplicati backup

I'm trying to back up folders from a local drive to Dropbox using the Duplicati command line in Command Prompt. (The backup should be incremental.)
C:\Users\Desktop\Office_Works\Duplicati\Duplicati 1.3.4\Duplicati>Duplicati.CommandLine.exe backup a https://www.dropbox.com/
Enter passphrase: **
Confirm passphrase: **
Unable to find backend for: https://www.dropbox.com/
"a" is my folder in local drive. Now I want to know how to make a connection with Dropbox using command lines. Is any particular way to connect Dropbox using duplicati commands?
There is no direct support for Dropbox in Duplicati.
Others have reported using a local folder under the Dropbox folder as the destination, so that the Dropbox client synchronizes the backup for you.
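For example, keeping the source folder "a" from the question and pointing the target at a folder inside the local Dropbox directory (the Dropbox path is a placeholder; the target should just be a local folder path):

Duplicati.CommandLine.exe backup a "C:\Users\<user>\Dropbox\Backups"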
There's a project that can upload files to Dropbox from the command prompt. It's very lightweight, and both installation and usage couldn't be easier!
PneumaticTube
The first time you use it, it will open your browser to ask for permission to access your Dropbox account, but from then on it's smooth sailing.

How to send a file using scp from a perl script using only ftp hostname, login and password, WITHOUT ppk file

How to send a file using scp from a Perl script using only the ftp hostname, login and password, WITHOUT a ppk file?
This is what I use when I do have a ppk file:
open scp://username@hostname -privatekey="\path\to\ppkfile\ppkfile.ppk"
put filename.csv /home/destination_flder
exit
Thank you!
One possible solution is to configure the destination machine for passwordless SSH and then use the following command to transfer or copy the file:
scp $source_filepath username@machinename:$destination_filepath
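Setting up the passwordless login is usually just a matter of generating a key pair and copying the public key across (the user name and host are placeholders):

ssh-keygen -t rsa                  # accept the defaults; leave the passphrase empty
ssh-copy-id username@machinename   # appends your public key to the destination's authorized_keys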
Suic's recommendation is very good; it looks like exactly what you're looking for. Personally, I haven't used that particular module, though.
I've had good success with Net::SFTP::Foreign, which is a Perl wrapper around an external SSH/SFTP client. It supports password-based login in most situations (see the documentation for details). In my experience, SFTP is usually available wherever SCP is, and it gives a greater level of control.
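A minimal sketch of a password-based upload with Net::SFTP::Foreign (the host, credentials, and paths below are placeholders; password authentication needs IO::Pty, as the module's documentation explains):

use Net::SFTP::Foreign;

my $sftp = Net::SFTP::Foreign->new('hostname',
    user     => 'username',
    password => 'secret');
$sftp->die_on_error("Unable to connect to remote host");
# upload the local file to the remote directory
$sftp->put('filename.csv', '/home/destination_folder/filename.csv')
    or die "put failed: " . $sftp->error;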

How to deploy mysql database to server via ssh

I'm trying to use Phing to deploy a site to the server.
This is the command which should create the database or apply changes:
<pdosqlexec url="mysql:host=${db.host}; dbname=${db.name}"
userid="${db.user}"
password="${db.pass}"
src="${project.basedir}/deploy/mysqlbuiltscripts/create_database.sql"/>
It works fine on the local machine, but I need to apply the changes on the server too.
The main problem: I have access to the server database via SSH only.
Question: how can I execute this command through an SSH tunnel?
P.S. I tried to use <ssh username="${username}" password="${password}" host="${host}" command="${myMysqlCommand}">, but it does not suit me because it does not write changes to the Phing "changelog" table.
Are you using the dbdeploy task? If you are generating a delta for the remote server, then your file should have the changelog present.
If you don't have direct database access to your remote server, you may need to do the dbdeploy work on the remote server directly or tunnel your requests through SSH.
My dbdeploy steps are:
Run phing -> dbdeploy task
Get a delta sql
With mysql, run the delta SQL script on the remote server (for example, through an SSH tunnel, as sketched below)
Enjoy
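One way to do the tunneling (the port number, host, and credentials are placeholders): forward a local port to the remote MySQL instance over SSH, then run mysql, or point Phing's db.host at 127.0.0.1 with that port, so everything talks to the remote database through the tunnel:

ssh -f -N -L 13306:127.0.0.1:3306 user@your-server     # local port 13306 -> MySQL on the server
mysql -h 127.0.0.1 -P 13306 -u dbuser -p dbname < deploy/mysqlbuiltscripts/create_database.sql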

How do I copy data from a remote system without using ssh or FTP Perl modules?

I have to write a Perl script to automatically copy data from remote servers to my local system. The directory structure on the remote systems is:
../log/D1/<date>.tar.gz
../log/D2/<date>.gz
../log/D3/<date>.tar.gz
../log/D4/<date>
and the same on the other servers. I want to copy the data to my local system in the format below:
../log/S1/D1/<date>.tar.gz
../log/S1/D2/<date>.gz
../log/S1/D3/<date>.tar.gz
../log/S1/D4/<date>
and the same for the other servers, i.e. S2, S3, etc.
Also, no SSH-related Perl modules are available on the remote servers or on the local server, and I don't have permission to install any Perl modules. The only good thing is that connectivity is through password-less SSH keys.
Can anyone suggest Perl code to get this done?
I believe you can run shell commands from Perl.
So you can do this:
$cmd = "/usr/bin/scp remotehost:remotefile localfile";
system($cmd) == 0 or die "scp failed: $?";
NOTE: scp is secure copy -- a buddy of ssh.
This does not require an SSH Perl module, but it does require ssh support on both ends (which I have).
Hope this helps.
I started to suggest the scp command line program, but it seems that there's a CPAN module for that (no surprise). Check out Net::SCP.
By using scp on your client (where you can install new Perl modules) you can copy files without having to install any new software on the remote system. It just needs to have an SSH server running - which you've said it does.
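A minimal sketch with Net::SCP (the host, user, and paths are placeholders; the date stamp is hypothetical and would come from however you build the file names):

use Net::SCP;

my $date = '20130101';                        # hypothetical date stamp
my $scp  = Net::SCP->new('server1', 'username');
$scp->get("/log/D1/$date.tar.gz", "log/S1/D1/")
    or die "scp failed: " . $scp->{errstr};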
I'd say stop trying to make life difficult for yourself and get the system to support the features you require.
Trying to develop for such a limited, locked-down platform is not going to be cost-effective in the long run - you'll develop more slowly and your code will have more bugs.
A little developer time is way more expensive than a decent hosted VM / hardware box.
Get a proper host, it will definitely save money (talk to your manager about this).
From your query above, I understand that you don't have permission to install Perl modules or make any changes that require administrative privileges. I love Perl, but to automate things like this you can use bash instead. Below is sample code I use with password-less SSH keys.
#!/bin/bash
DATE=$(date +%Y%m%d)    # date stamp in YYYYMMDD format; adjust to match the remote file names
BASEDIR="/basedir"
cd "$BASEDIR" || exit 1
for HOST in S1 S2 S3
do
    scp -q "$HOST:$BASEDIR/D1/$DATE.tar.gz" "$HOST/D1/"
    echo "Data copy from $HOST done"
done
exit 0
You can use different date formats, such as date +%Y%m%d for the current date in YYYYMMDD format. Also, you can use this link to learn about different date formats.
Hope this helps.
You may not be able to install anything in system-wide lib directories, but there is nothing preventing you from installing modules in a location to which you have write-access. See How do I keep my own module/library directory?
This creates no more of a security issue than allowing you to write scripts on this system in the first place.
So, go forth and install Net::SCP.
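A minimal sketch of installing a module under your home directory and using it from a script (the prefix is a placeholder; any directory you can write to works):

# build and install the module into ~/perl5 instead of the system directories
perl Makefile.PL INSTALL_BASE=$HOME/perl5
make && make install

# then, at the top of your script, add the private directory to @INC:
use lib "$ENV{HOME}/perl5/lib/perl5";
use Net::SCP;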
It sounds like you want rsync. You shouldn't have to do any programming at all.
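For example (the host names and paths are placeholders), one rsync invocation per server over the existing password-less SSH keys would reproduce the layout described above:

rsync -az server1:/log/ /local/log/S1/
rsync -az server2:/log/ /local/log/S2/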