compare file size after ftp get with the original file on server - command-line

In SQL Server I'm using xp_cmdshell to run FTP commands. I have no problem getting the list of files or copying files to the local server, but I want to compare the copied file's size to the original to make sure the get was successful.
Any ideas on how to compare file sizes?

From a command prompt you can use the DOS file compare command (fc). In your case you probably want a binary compare (there is no file-size compare); that should work for you.
Most DOS commands return a code that lets you know the status.
http://www.computerhope.com/fchlp.htm
EDIT
Sorry, I re-read your question and realized you want to compare against the file on the FTP server. I think this is a moot point, since if FTP reports a successful transfer there is no reason to compare (unless your source of comparison is not the FTP site). Does that make sense?
What you could do is use the FTP ls command:
ftp> ls <filename>
where ftp> is the ftp prompt and not part of the command. This command gives you the file size in bytes. Then you need a DOS command for the local file. Here is a Stack Overflow question (and answer) about that:
Windows command for file size only?
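As a rough sketch of how the two sizes could be pulled together from a batch script (the server, path and file names below are placeholders, not from the question):
rem Local size in bytes via the ~z modifier (batch-file syntax).
for %%A in ("C:\local\copy.dat") do set LOCAL_SIZE=%%~zA
echo Local size: %LOCAL_SIZE%
rem Remote size via a scripted ftp session; commands.txt would hold the login,
rem "ls copy.dat" and "bye", and the byte count is then parsed from listing.txt.
ftp -s:commands.txt myftpserver > listing.txt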


PowerShell call a script from another script as if in a different folder

I use the video transcoding tools made by Don Melton over on GitHub to compress self-filmed videos. Now I would like to automate this task with a PowerShell script that loops over the contents of a folder, passes each file as an input argument to the tool, and puts the output into a separate folder. My problem is that the tool has no option to specify an output location; it always places the output files in the directory from which it is called. So when I cd into an "output" directory "next to" the one where my input files are, I can then call
other-transcode ../input/file.mp4
and the output file of the same name as the input file will be placed in the output directory.
Now that I want to use the command in a script, how do I tell PowerShell to run it as if it had been typed manually into a shell whose current directory was the output directory?
For context, this is my end goal, but I think it is easier to split the complicated question into multiple ones.
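What I am effectively trying to automate looks roughly like this (a sketch with placeholder folder names):
# Run other-transcode for each input file while the current directory is the
# output folder, so that is where the tool drops its results.
Push-Location "C:\videos\output"
Get-ChildItem "..\input" -Filter *.mp4 | ForEach-Object {
    other-transcode ("..\input\" + $_.Name)
}
Pop-Location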

How does pgAdmin encode the file path in backups?

I'm trying to restore dump files from locations that contain characters from languages other than English.
So here is what I did:
From inside pgAdmin I used the backup tool, and in the Filename input I provided a path through a real folder named "א":
C:\א\toc.dump
The actual file argument (-f file) was automatically converted into:
pg_dump.exe --file "C:\\0F04~1\\TOC~1.DUM"
My question is: what encoding scheme does pgAdmin use to convert the file path argument?
How did it come up with 0F04~1 from א?
I'm asking because pg_restore does not support file paths that contain non-English characters (from cmd):
pg_dump.exe --file "C:\\0F04~1\\TOC1.DUMP" .... WORKS OK!
pg_dump.exe --file "C:\\א\\TOC1.DUMP" ... Not Working!
pg_restore: [custom archiver] could not open input file "..."
As in this question: if I can find the encoding scheme pgAdmin uses, I'll apply it from code.
My goal is to encode a path that contains non-English characters from a batch script so that it works.
This is not something weird pgAdmin does; it is something Windows itself does when it needs to represent such file names in a DOS-like setting, for example when the name is longer than 8 characters or the extension longer than 3.
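If these are indeed the legacy 8.3 "short names", you can inspect them yourself from cmd; a sketch (the drive and folder come from the question, the rest is illustrative):
dir /x C:\
rem In a batch file, the ~s modifier expands a path to its short form:
for %%A in ("C:\א") do echo %%~sA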
In my hands the weird presentation is only there in the logs and status messages. If I use the GUI file chooser, the file names look normal, and replay successfully.
If you really want to know what Windows is doing, I think that is a better question for superuser with a Windows tag. I don't know why you can't restore these files. Are you using the pgAdmin GUI file chooser or trying to type the names in directly to something?

PostgreSQL COPY command error

Hello everyone once again,
I did various searches but couldn't find a suitable/applicable answer to the simple problem below:
On pgAdminIII (Windows 7 64-bit) I am running the following command using SQL editor:
COPY public.Raw20120113 FROM 'D:\my\path\to\Raw CSV Data\13_01_2012.csv';
I tried many different variations for the path name and verified the path, but I keep getting:
ERROR: could not open file "D:\my\path\to\Raw CSV Data\13_01_2012.csv" for reading: No such file or directory
Any suggestions why this happens?
Thank you all in advance
Petros
UPDATE!!
After some tests I came to the following conclusion: the reason I am getting this error is that the path includes some Greek characters. So, while Windows uses codepage 1253, the console is using 727, and this whole thing is causing the confusion. So, some questions arise; you may answer them if you like, or point me to other questions:
1) How can I permanently change the codepage of the console?
2) How can I set the codepage in the SQL editor?
Thank you again, and sorry if the place to post the question was inappropriate!
Try DIR "D:\my\path\to\Raw CSV Data\13_01_2012.csv" from the command line and see if it works, just to ensure that you got the directory, file name, extension etc. correct.
The problem is that the COPY command runs on the server, so it resolves the path on the server's side.
To import a local file you need to use the \copy command instead. It takes a path on the client machine and loads the file correctly.
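For example, run from psql on the client machine rather than the pgAdmin SQL editor (a sketch; add format options such as CSV if the file needs them):
-- \copy reads the file on the client and streams it to the server
\copy public.Raw20120113 FROM 'D:\my\path\to\Raw CSV Data\13_01_2012.csv'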

Zipped data getting lost during copying from FTP to Windows

I am trying to copy some zipped files from FTP to my local system (Windows). The transfer mode is the default mode (ASCII). The file gets copied and I see no errors during the transfer.
The problem is that the size of the file on the FTP server differs from the size of the copy on my local system.
FTP_file_size -> 12,812,085
Copied_file_size->12,551
The two files should be the same.
I am not able to figure out what is going wrong with the transfer.
For the script I am using, please refer to:
Why am I getting "File not found" errors with this Perl script using Net::FTP?
You have to use the binary (type "I") mode to transfer. Otherwise the FTP client translates line-ending characters to the local convention (on Windows: CR-LF) which would corrupt the ZIP format.
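Since the linked script uses Net::FTP, the fix is essentially one call; a sketch with placeholder host, credentials and file names:
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;
# The important call is binary(), which switches the session to type "I"
# before the get, so the ZIP bytes are transferred unchanged.
my $ftp = Net::FTP->new('ftp.example.com') or die "Cannot connect: $@";
$ftp->login('user', 'password') or die 'Login failed: ', $ftp->message;
$ftp->binary                    or die 'TYPE I failed: ', $ftp->message;
$ftp->get('file.zip')           or die 'Get failed: ', $ftp->message;
$ftp->quit;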

how to check for activity or lack thereof on a unix file directory using perl or unix commands

Scenario:
I have a process where many files are being copied (scp'd) to a DestinationServer by Host1, Host2, Host3 and Host4, for example, all going into the same common directory: DestinationServer:/home/target. All the files are unique, so no files will be overwritten. Host1-Host4 each have a cronjob that launches their scp script to DestinationServer. The caveat is that the hosts are in different time zones and locations, so they will finish at different times.
Need:
Since the files are being scp'd to DestinationServer:/home/target, what is the best way to programmatically check when those scp's from the other hosts are done?
Options:
My options are to do this programmatically, either in Perl or shell if possible.
What should I look for, and what Unix commands or Perl modules could I use to help determine when the processes have finished? Any ideas or examples would be great! Thanks.
Use a Maildir kind of approach: copy all files to a temporary directory, then after the transfer is complete have the originating host perform a rename into the target directory via ssh. That way when a file appears in the target directory, you know that it is complete.
I suggest this because if you just scp files into the target directory and monitor the directory in whatever way, you cannot distinguish a complete transfer from an interrupted scp command or a network failure.
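A sketch of the sending side under that scheme (host and file names are placeholders; the staging directory must sit on the same filesystem as /home/target so the final mv is an atomic rename):
scp /data/report_host1.csv DestinationServer:/home/target/.incoming/ \
  && ssh DestinationServer 'mv /home/target/.incoming/report_host1.csv /home/target/'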
SGI::FAM, Sys::Gamin (Perl interfaces to the File Alteration Monitor) can watch the directory for changes.
A similar but alternative approach to Jouni's is to use semaphore files: before scp-ing its files, the originating host puts up a semaphore file and removes it when finished. That way you know it's time.
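A sketch of that variant (names are placeholders): each host drops a marker on DestinationServer before its batch and removes it afterwards, and the watcher waits until no markers remain.
# On Host1, around its scp batch:
ssh DestinationServer 'touch /home/target/host1.uploading'
scp /data/*.csv DestinationServer:/home/target/
ssh DestinationServer 'rm /home/target/host1.uploading'
# On DestinationServer, proceed only once every host's marker is gone:
while ls /home/target/*.uploading >/dev/null 2>&1; do sleep 60; done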