Run a script via FTP connection from PowerShell

I have made a script that does a really basic task: it connects to a remote FTP site, retrieves XML files, and deletes them afterward.
The only problem is that in the past we lost files, because new files arrived between the mget and the mdel, so they were deleted without ever having been downloaded.
open ftp.site.com
username
password
cd Out
lcd "E:\FTP\Site"
mget *.XML
mdel *.XML
bye
To prevent this from happening, we want to put a script on the FTP server (rename-files.ps1) that renames the *.xml files to *.xml.copy.
The only thing is, I have no clue how to run that script through my FTP connection.

Some, but very few, FTP servers support the SITE EXEC command. In the rare case that your FTP server does support it, you can use:
quote SITE EXEC powershell rename-files.ps1
Though in most cases, if FTP is the only way you can access the server, you cannot execute anything on it. You would have to use another method to run the script, such as SSH or PowerShell Remoting.
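For example, if PowerShell Remoting happened to be enabled on that machine (an assumption; plain FTP gives you no such thing), running the rename script could be as simple as this sketch, where ftpserver and the script location are placeholders:
# Run the rename script that sits on the FTP server itself.
Invoke-Command -ComputerName ftpserver -ScriptBlock { & 'C:\scripts\rename-files.ps1' }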
But your problem has other solutions:
rename files using FTP; or
delete only the files that were downloaded.
Both are doable, but you will need a better FTP client than the built-in Windows ftp.exe.
See for example A WinSCP script to download, rename, and move files.
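If WinSCP is an option, its .NET assembly can also be driven directly from PowerShell, and Session.GetFiles takes a remove flag that deletes each remote file only after it has been downloaded successfully, which avoids the race entirely. A rough sketch (the WinSCPnet.dll path is the default install location; host, credentials and folders are the ones from the question):
# Load the WinSCP .NET assembly and open an FTP session.
Add-Type -Path 'C:\Program Files (x86)\WinSCP\WinSCPnet.dll'

$options = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = 'ftp.site.com'
    UserName = 'username'
    Password = 'password'
}

$session = New-Object WinSCP.Session
try {
    $session.Open($options)
    # The third argument ($True) removes each source file after a successful download.
    $session.GetFiles('/Out/*.XML', 'E:\FTP\Site\*', $True).Check()
}
finally {
    $session.Dispose()
}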
Or you can do it like this:
Run ftp.exe once to retrieve a list of files;
Process the list in PowerShell to generate an ftp script with get and del commands for specific files (without wildcard);
Run ftp.exe again with generated script.
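Putting those three steps together, a rough PowerShell sketch might look like this (untested; ftp.site.com, the Out folder and E:\FTP\Site come from the question, while the .ftp and .txt file names are made up):
# 1. First pass: have ftp.exe write the remote directory listing to a local file.
@"
open ftp.site.com
username
password
cd Out
ls *.XML filelist.txt
bye
"@ | Set-Content list.ftp
& ftp -s:list.ftp

# 2. Turn the listing into a second script with an explicit get/del pair per file.
$files = Get-Content filelist.txt | Where-Object { $_ -match '\.xml$' }
$script = @('open ftp.site.com', 'username', 'password', 'cd Out', 'lcd "E:\FTP\Site"')
foreach ($f in $files) {
    $script += "get $f"
    $script += "del $f"
}
$script += 'bye'
$script | Set-Content transfer.ftp

# 3. Second pass: download and delete exactly the files seen in step 1.
& ftp -s:transfer.ftp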
Actually, you do not need to use ftp.exe from PowerShell at all. The .NET Framework has its own FTP implementation: FtpWebRequest.
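A minimal FtpWebRequest sketch along those lines (again untested; the URL, credentials and local folder are placeholders based on the question, and the directory listing is assumed to return bare file names):
# List, download and delete the XML files one by one.
$base = 'ftp://ftp.site.com/Out/'
$cred = New-Object System.Net.NetworkCredential 'username', 'password'

# List the XML files currently in the remote folder.
$req = [System.Net.FtpWebRequest]::Create($base)
$req.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$req.Credentials = $cred
$reader = New-Object System.IO.StreamReader ($req.GetResponse().GetResponseStream())
$files = $reader.ReadToEnd() -split '\r?\n' | Where-Object { $_ -match '\.xml$' }
$reader.Close()

foreach ($f in $files) {
    # Download the file...
    $req = [System.Net.FtpWebRequest]::Create("$base$f")
    $req.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
    $req.Credentials = $cred
    $stream = $req.GetResponse().GetResponseStream()
    $target = [System.IO.File]::Create("E:\FTP\Site\$f")
    $stream.CopyTo($target)
    $target.Close()
    $stream.Close()

    # ...and delete only the file that was just downloaded.
    $req = [System.Net.FtpWebRequest]::Create("$base$f")
    $req.Method = [System.Net.WebRequestMethods+Ftp]::DeleteFile
    $req.Credentials = $cred
    $req.GetResponse().Close()
}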

Related

Why is my file retrieved from FTP in a remote PowerShell session encrypted?

Yet another works-fine-locally-but-loses-its-mind-in-a-remote-session problem.
I have a PS script that runs the standard Windows command-line ftp.exe to get a file. It works a treat when run directly; however, when run remotely via Invoke-Command, it suddenly leaves the files with the encrypted bit set.
If I then, in the same PS session (in the same script), run cipher /d on the file, I get Access Denied. However if I log onto the remote machine using the same account, I can decrypt it.
So, question the first: is this a "feature" of ftp.exe? I can't find anything suggesting as such, but no other method of creating a file seems to result in it being encrypted, so I'm left thinking it is an intentional act by the application, like it checks the logon type and encrypts if it is a network logon.
Second, why can I not immediately decrypt it? Same account, same session.
The essential bits of the script in question:
#the ftp script is just open, user, binary, get, quit
& ftp -n -v -s:"$script"
& cipher /d "$file_path"
I realize this is probably a pretty obscure edge case, but I'll leave an answer just in case anyone runs into anything similar.
As usual, ProcMon has all the answers...
At my company %HOMESHARE% is set to a network file server (by some GPO I believe).
As ftp.exe is retrieving a file, it writes to a temp file and then once finished, copies it over to the specified location. Even after knowing this, one might expect %TEMP% to be used for such a purpose, but no.
I'm not quite sure exactly how ftp.exe determines the temp file location, but when I'm in a PSSession it chooses my Documents folder (%USERPROFILE%, I suppose), whereas in an RDP session it uses %HOMEPATH%. My Documents folder is set to encrypt new files, so the temp file is encrypted and the encryption carries over to the final copy; the file share is not, so from an RDP session the file comes over clean.
Also, while I have found nothing official stating this, it does seem that cipher.exe is completely ineffective for a network logon. If, after entering a PSSession, I create a new file with Set-Content and attempt to encrypt it using cipher /e <file>, I get the same access denied error. The same account over RDP encrypts without a problem.
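For anyone hitting the same thing, a trivial check right after the ftp call (reusing the $file_path variable from the snippet above) makes the symptom obvious:
# If the EFS bit is set, ftp.exe staged the download in an encrypting folder.
if ((Get-Item $file_path).Attributes -band [System.IO.FileAttributes]::Encrypted) {
    Write-Warning "$file_path came back encrypted - check where ftp.exe writes its temp file."
}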

Startup Task not running on Azure Cloud Service role

I'm having difficulties trying to set up a startup task in an Azure role.
The ultimate goal is to disable RC4 cipher, along with other SSL configurations. In my (VS2012Express) project (solution partially achieved following another answer here in SO that led me to https://gist.github.com/sidshetye/29d6d48dfa0c2f5488a4 ) I created a Startup.cmd file like this:
REM Execute PowerShell command to disable RC4 and improve SSL security settings
ECHO Batch started >> "StartupLog.txt" 2>&1
PowerShell -ExecutionPolicy Unrestricted .\HardenSSL.ps1 >> log-HardenSSL.txt 2>&1
EXIT /B 0
HardenSSL.ps1 is the PowerShell script from the previous link. Both the .cmd and .ps1 scripts are placed in the application root directory, marked as "Content" with properties set to "CopyLocal=Always".
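For context, the core of a HardenSSL-style script boils down to flipping SCHANNEL registry values. A minimal sketch of just the RC4 part might look like this; it is not the gist's actual content, and a reboot is still needed for the change to take effect:
# Disable the RC4 cipher suites via SCHANNEL. The .NET registry API is used here
# because the key names contain forward slashes, which the HKLM: drive provider
# would treat as path separators.
foreach ($cipher in 'RC4 128/128', 'RC4 56/128', 'RC4 40/128') {
    $key = [Microsoft.Win32.Registry]::LocalMachine.CreateSubKey(
        "SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\$cipher")
    $key.SetValue('Enabled', 0, [Microsoft.Win32.RegistryValueKind]::DWord)
    $key.Close()
}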
In my service definition, I put this:
<Startup>
<Task commandLine="Startup.cmd" executionContext="elevated" taskType="background"></Task>
</Startup>
Now, when I deploy the application to Azure, "nothing" happens. I configured the role instance to allow remote desktop and connected to the machine. I verified the scripts were published, but there were no log files and RC4 was still enabled. I tried to run the .cmd manually, and the machine ran the scripts to completion, disabled RC4 and restarted. So the scripts themselves are actually "correct".
The problem is that the scripts are not getting fired at startup. I may be wrong, but I don't see anything related in the Windows event logs. The server now keeps all the configurations, but I have to be sure the scripts get executed in case I have to publish to new instances/cloud services.
I also tried to:
1. place the scripts in a child directory;
2. create two other "simpler" .cmd files that just write a log file with "script started", to rule out problems related to the .cmd calling the PowerShell script.
None of those scripts got executed.
Hope I've been sufficiently clear, any help would be greatly appreciated.
Thank you in advance,
Alberto
UPDATE 1
Reading through various discussions, I missed one very important thing: the script files are actually published in 2 distinct places, one being inside the /bin folder.
Ex: I placed my scripts in a /StartupScripts folder in my project, and when I connect via Remote Desktop to the Azure server I find the scripts both in "approot/StartupScripts" and in "approot/bin/StartupScripts".
The scripts that are actually executed are those placed inside the "bin" folder. The real problem is probably a path problem inside the .cmd, since I have now found the execution logs with an error.
Now I will try to change it up and update the question here on SO.
OK. In the end it was indeed a problem with a path in my Startup.cmd file: .\HardenSSL.ps1 could not be found when the startup task pointed to a subfolder.
The solution was to place both the Startup.cmd and HardenSSL.ps1 files in the application root and remove the ".\" prefix when calling the PowerShell script; after that, everything worked.
Anyway, I would suggest picking this other solution I found on Stack Exchange:
https://security.stackexchange.com/a/79957
It links to a NuGet package that does the same thing as the script from the GitHub link in the original post, just "better"; mainly:
Better configuration of cipher suites, with support for ForwardSecrecy for all reference browsers on SSLLabs
Retain SSL support for Internet Explorer 8 on Windows XP (unfortunately still a necessity for us)
Alberto.

Pausing a perl script while SFTP transfers files

FYI, I'm a complete newbie with Perl (as in, I can spell it and only a little more), so I'm trying to learn. What I'm trying to accomplish is using SFTP to transfer files from a Windows machine to a Linux machine.
I've noticed that Perl issues the SFTP get command but doesn't wait for the transfer to finish, so when the Perl script tries to use a file, it can't find it. I know there is the sleep command, but the number and size of the files vary on a weekly basis, so using sleep(600) seems a little silly.
Is there a standard way to pause a Perl script until SFTP finishes transferring all necessary files?
TIA.
Using Net::SFTP might have solved this dilemma, but my workplace won't allow me to download and install stuff, especially in production. So rather than waiting on the typical bureaucracy I did some more digging around and discovered this:
By calling SFTP in batch mode, with a separate file that contains the SFTP commands, the Perl script has to wait for SFTP to finish executing the commands in that command file. In other words, with the batch-mode option the Perl script is paused for as long as it takes SFTP to finish transferring the files.
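Something along these lines (the file name, host and paths below are made up). The command file, say get_files.txt, holds the SFTP commands:
get /outbound/*.xml
quit
and the Perl script then runs the client synchronously, e.g. sftp -b get_files.txt user@linuxhost (psftp -b works the same way), so it only continues once every transfer in the file has completed.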

Copy a non exe File to a Remote Machine

I used PsExec to copy and run an exe file on a remote machine. I also want to copy an XML file to the remote machine. I am able to do it this way:
PsExec.exe -d -c \\someserver c:\somefile.xml
The above command throws an error saying the system cannot find the file specified, but it does add the XML file to the remote server.
Do you know any better way of copying files to a remote server?
Is there any PsTool available for that?
Or an option in PsExec?
Edit: (Answer)
I found out that using PowerShell we can copy files to remote machines, and it worked.
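For reference, the PowerShell approach can be as simple as copying to an administrative share, or to a remoting session on PowerShell 5 and later (the server name and paths below are placeholders):
# Copy over the C$ administrative share...
Copy-Item -Path C:\somefile.xml -Destination \\someserver\c$\target\

# ...or, with PowerShell 5+ remoting:
$session = New-PSSession -ComputerName someserver
Copy-Item -Path C:\somefile.xml -Destination C:\target\ -ToSession $session
Remove-PSSession $session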
As you can read in the PsExec help:
-c: Copy the specified program to the remote system for execution. If you omit this option the application must be in the system path on the remote system.
So your XML file is copied to the remote system and "executed", which gives you the error.
If your XML file is part of an application you have to run on the remote computer, one solution is to compress the app with all necessary files into a self-extracting EXE that runs the main command when extracted.
If you just have to copy a file, why don't you use a simple script that maps the remote folder and then copies the file? Something like:
NET USE \\computername\sharename password /USER:[domainname\]username
xcopy .....
NET USE \\computername\sharename /DELETE
PsExec is not designed to copy files across machines. It can only copy the program it is going to run remotely.
If you have access to the remote machine, the copy could be done by running copy c:\somefile.xml \\remote-machine\Admin$ before running PsExec.
You can use this pattern with PsExec to copy a file with any extension:
psexec -d -i 2 \\PCName -u domain\username -p password cmd /c copy \\server\location\filename c:\xx\xx\xx
PS: Refer to the PsExec switches if you're unsure of what -d and -i do. Note that "2" is the session ID of the remote desktop user, which may change every time a new remote desktop session is created.
This helped me copy my exe file into the c:\windows directory (a one-to-one copy within the same domain):
PsExec.exe -d -c \\remoteserver -u administrator -p password c:\executable.exe

Send data to putty in powershell

I have a PowerShell script that opens putty.exe in a process, and I want to send data to this process. How can I do that?
The process:
$solExe = [diagnostics.process]::start("putty.exe", "-raw -P 2000 127.0.0.1")
The command-line interface for PuTTY is plink.exe. You can use plink to send commands over SSH.
For example:
PS C:\> c:\progra~2\putty\plink.exe -i C:\credentials\mykeyfile.ppk root@myserver.com "ls";
Things to remember:
The first time you connect to a server, you'll have to add it to your registry, so this won't work in a non-interactive mode for brand new servers. There isn't a way to disable this.
The key file has to be in ppk format for plink.exe to recognize it. If yours is in pem format, use puttygen.exe to create a ppk file.
The path to the key file cannot contain any spaces, or the command above won't work.
If you want to send multiple commands at once, write them to a file and use the -m switch with plink.exe (see the sketch after this list).
If you need to transfer files, you can use pscp.exe in a similar fashion.
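A small sketch of the -m approach driven from PowerShell (the plink.exe path, key file and host are the placeholders used above; -batch simply suppresses interactive prompts):
# Write the remote commands to a file and hand it to plink with -m.
@'
uname -a
df -h
'@ | Set-Content commands.txt

& 'C:\progra~2\putty\plink.exe' -batch -i C:\credentials\mykeyfile.ppk -m commands.txt root@myserver.com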