WinSCP timestamp retrieval from command line/scripting

How do I get the last-modified timestamp of a .txt file from WinSCP when driving it with a .txt script and a .bat file?

Well, you should state what you want to do with the timestamp; you might get a better answer then.
Anyway:
WinSCP has the stat command:
stat /home/martin/index.html
Outputs something like:
-rwxr--r-- 0 20480 Jan 5 14:09:33 2009 index.html
You can redirect the output of the WinSCP script to a file and parse it.
Or even better use XML logging and parse the XML log.
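For example, here is a rough sketch of that redirect-and-parse idea driven from Python; the session URL, credentials, and remote path are placeholders, and the regex assumes the output format shown above:
import re
import subprocess

# Run WinSCP with an inline script and capture its console output.
# Session URL, credentials, and the remote path are placeholders;
# a real SFTP session will typically also need -hostkey=... on the open command.
output = subprocess.check_output([
    'winscp.com', '/ini=nul', '/command',
    'open sftp://user:password@example.com/',
    'stat /home/martin/index.html',
    'exit',
]).decode('ascii', 'replace')

# Pull the "Jan 5 14:09:33 2009" part out of the stat output line.
match = re.search(r'([A-Z][a-z]{2}\s+\d+\s+\d{2}:\d{2}:\d{2}\s+\d{4})', output)
if match:
    print('Last modified:', match.group(1))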
An easier solution may be to use the WinSCP .NET assembly method Session.GetFileInfo.
An additional example (beyond those linked in the method documentation) is here:
https://winscp.net/eng/docs/scriptcommand_stat#net

Related

postgreSQL COPY command error

Hello everyone once again,
I did various searches but couldn't find a suitable/applicable answer to the simple problem below:
On pgAdminIII (Windows 7 64-bit) I am running the following command using SQL editor:
COPY public.Raw20120113 FROM 'D:\my\path\to\Raw CSV Data\13_01_2012.csv';
I tried many different variations for the path name and verified the path, but I keep getting:
ERROR: could not open file "D:\my\path\to\Raw CSV Data\13_01_2012.csv" for reading: No such file or directory
Any suggestions why this happens?
Thank you all in advance
Petros
UPDATE!!
After some tests I came to the following conclusion: the reason I am getting this error is that the path includes some Greek characters. While Windows uses codepage 1253, the console is using the OEM Greek codepage 737, and this mismatch is causing the confusion. So some questions arise; you may answer them if you like, or point me to other questions:
1) How can I permanently change the codepage of the console?
2) How can I define the codepage in the SQL editor?
Thank you again, and sorry if this was not the appropriate place to post the question!
Try DIR "D:\my\path\to\Raw CSV Data\13_01_2012.csv" from the command line and see if it works, just to ensure that you got the directory, file name, extension, etc. correct.
The problem is that the COPY command runs on the server, so it resolves the path against the server's filesystem.
To import a local file you need to use the \COPY command (in psql). This takes the local path to the file into account and loads it correctly.
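If you end up scripting the load instead of using pgAdmin, the same client-side idea works from Python with psycopg2. This is only a sketch; the connection details are placeholders, the table and path come from the question, and WITH CSV is assumed because the source is a .csv file:
import psycopg2

# copy_expert streams the local file to the server over the connection,
# so the path is resolved on the client, just like psql's \COPY.
conn = psycopg2.connect(host='localhost', dbname='mydb', user='postgres', password='secret')
cur = conn.cursor()

with open(r'D:\my\path\to\Raw CSV Data\13_01_2012.csv') as f:
    cur.copy_expert('COPY public.Raw20120113 FROM STDIN WITH CSV', f)

conn.commit()
cur.close()
conn.close()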

Looking for command line ftp client (linux)

I am looking to batch-download a large number of files (>800). I have a text file with the list of all the filenames. These filenames are then used to derive the URLs from which they can be downloaded. I had been parsing through the file with a Python script, using subprocess to call wget on each file.
wget ftp://ftp.name.of.site/filename-prefix/filename/filename+suffix
However, for reasons unknown to me, wget is failing to connect properly. I wanted to know if I could essentially use an FTP program that would work in a similar manner, i.e. no login and staying within the command line.
Edit:
What's in my text file:
ERS032033
ERS032214
ERS032234
ERS032223
ERS032218
The ERS### act as the prefix. The whole thing is the filename. The final file (i.e. filename+suffix) would look something like: ERS032033_1.fastq.gz
Submitting the correct url is not the problem.
Since you are using Python, I suggest dropping the subprocess approach and using the urllib module instead:
import urllib
handle = urllib.urlopen('ftp://ftp.name.of.site/filename-prefix/filename/filename+suffix')
print handle.read()
handle.close()
This assumes you are using Python 2 (use urllib.request for Python 3).
If you simply need batch downloads, urllib.urlretrieve is a cleaner approach.
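For the batch case, here is a rough sketch (Python 2, to match the example above); the host, directory layout, and the _1.fastq.gz suffix are taken from the question and are otherwise assumptions:
import urllib

# One identifier per line, e.g. ERS032033; the list file name is a placeholder.
with open('filenames.txt') as listing:
    for name in listing:
        name = name.strip()
        if not name:
            continue
        # Build the URL the same way the wget command did and save locally.
        url = 'ftp://ftp.name.of.site/filename-prefix/%s/%s_1.fastq.gz' % (name, name)
        urllib.urlretrieve(url, '%s_1.fastq.gz' % name)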

compare file size after ftp get with the original file on server

In SQL Server I'm using xp_cmdshell to run FTP commands. I have no problem getting the list of files or copying files to the local server, but I want to compare the copied file's size to the original to make sure the get was successful.
Any ideas on how to compare file sizes?
From a command prompt you can use the DOS file compare command (fc). In your case you probably want to do a binary compare (there is no file-size compare); a binary compare should work in your case.
Most DOS commands will return an exit code that lets you know the status.
http://www.computerhope.com/fchlp.htm
EDIT
Sorry, I re-read your question and realized you want to compare it against a file on the FTP server. I think this is a moot point, since if FTP reports a successful file transfer there is no reason to compare (unless your source of comparison is not the FTP site). Does that make sense?
What you could do is use the FTP ls command:
ftp> ls <filename>
where ftp> is the FTP prompt and not part of the command. This command gives you the file size in bytes. Then you need to use the DOS command for the local file. Here is a Stack Overflow question (and answer) about that:
Windows command for file size only?
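If you can call a script from xp_cmdshell instead of the raw ftp commands, the comparison itself is straightforward; here is a rough Python sketch using ftplib (the host, credentials, and paths are placeholders):
import os
from ftplib import FTP

ftp = FTP('ftp.example.com')
ftp.login('user', 'password')
ftp.voidcmd('TYPE I')  # binary mode, so SIZE reports the raw byte count

remote_size = ftp.size('remote/path/data.csv')
local_size = os.path.getsize(r'C:\local\path\data.csv')
print('OK' if remote_size == local_size else 'size mismatch')

ftp.quit()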

Shell Script to update the contents of a folder

I'm a beginner in Unix Shell Scripting and Perl Scripting.
I would like an example program that teaches me how to update the contents of files in a directory.
The scenario is: there is a directory which has some n number of files.
Among those n files, m files have been modified.
I need to update the contents of the modified files in the directory.
Please give me a simple shell script to do this.
Thanks and Regards,
Vijay
I would do it with find like this:
find your_directory -newermt time_of_last_check -exec modify_script.sh {} \;
where:
your_directory is the directory where you have the files.
time_of_last_check is the time when you last ran this command.
modify_script.sh is the program that you will run to modify the files; it should take one argument, the filename to modify.
In Perl:
To update a file's contents, see perlfaq5; it has a lot of information and examples on file manipulation.
To get file or directory statistics, see Perl's built-in stat function.
To traverse a directory tree, see File::Find.
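As a rough Python sketch of the same idea as the find command above (the directory, the reference time, and modify_script.sh are all placeholders):
import os
import subprocess
import time

DIRECTORY = 'your_directory'
LAST_CHECK = time.time() - 24 * 60 * 60  # placeholder: "last ran a day ago"

for root, dirs, files in os.walk(DIRECTORY):
    for name in files:
        path = os.path.join(root, name)
        if os.path.getmtime(path) > LAST_CHECK:
            # Hand each file modified since the last check to the modify script.
            subprocess.call(['./modify_script.sh', path])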

How can I login to an FTP site and remove files that are more than 7 days old?

I need a shell or Perl script which would connect to the FTP server and delete all the files which are more than 7 days old.
cheers
Use the Net::FTP module to connect to the FTP server as outlined in the CPAN documentation. To browse through the site listings you may have to combine cwd/cdup in order to handle directories (unless, of course, all the files are in the root directory).
To get a file's modification time, use the mdtm(FILE) method; just make sure to check whether it is supported on the current server by calling:
if( $ftp->feature( 'MDTM' ) ) {
...check modification time for file...
}
If not, then you might try calling the dir method, which gets you the listing in long format; you can then extract the date information from the individual file listings in order to compare and delete.
To compare the two dates use the Date::Calc module. The 'Delta_Days' method should give you the number of days between two dates; this can be used just as easily for either of the methods specified above.
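If Perl is not a hard requirement, the same flow (list, check MDTM, compare ages, delete) can be sketched with Python's ftplib; the host, credentials, directory, and MDTM support are assumptions:
from ftplib import FTP
from datetime import datetime, timedelta

CUTOFF = datetime.utcnow() - timedelta(days=7)

ftp = FTP('ftp.example.com')
ftp.login('user', 'password')
ftp.cwd('/path/to/files')

for name in ftp.nlst():
    # MDTM replies like '213 20120113140933'; directories would need extra handling.
    resp = ftp.sendcmd('MDTM ' + name)
    mtime = datetime.strptime(resp[4:18], '%Y%m%d%H%M%S')
    if mtime < CUTOFF:
        ftp.delete(name)

ftp.quit()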
In Perl, you'd want to use Net::FTP's ls, mdtm, and delete commands.
If it's a shell script you're after, you might be better off running a script in a crontab.
find /tmp -type f -mtime +7 -exec rm {} \;