How to overwrite existing files with ncftpget?

When getting a remote FTP file that already exists in the local destination, ncftpget says:
local file appears to be the same as the remote file, download is not necessary.
What does "appears" mean? How does ncftpget check whether it is the same file?
It seems to compare the file's timestamp and size, but does it compare the content, or at least a checksum?
Is there a way to force it to overwrite the existing file, other than removing it first?
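For reference, here is a rough Perl sketch (using Net::FTP; host, login and file names are made up) of the kind of size-plus-timestamp comparison ncftpget appears to be doing. Note that such a check says nothing about the actual content:
#!/usr/bin/perl
# Rough sketch (not ncftpget itself): compare remote size and mtime with the
# local copy via Net::FTP. Host, login and paths are hypothetical.
use strict;
use warnings;
use Net::FTP;

my $remote = 'pub/data.csv';    # hypothetical remote path
my $local  = 'data.csv';        # hypothetical local path

my $ftp = Net::FTP->new('ftp.example.com') or die "connect: $@";
$ftp->login('anonymous', 'me@example.com') or die "login: ", $ftp->message;
$ftp->binary;                   # SIZE is only byte-accurate in binary mode

my $rsize = $ftp->size($remote);
my $rtime = $ftp->mdtm($remote);             # remote mtime, epoch seconds
my ($lsize, $ltime) = (stat $local)[7, 9];   # local size and mtime

if (defined $lsize && defined $rsize && $lsize == $rsize && $ltime >= $rtime) {
    print "same size and no newer remote copy; presumably this is when ncftpget skips\n";
} else {
    $ftp->get($remote, $local) or die "get: ", $ftp->message;
}
$ftp->quit;
Verifying the content would still mean downloading the file and checksumming it yourself.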

Related

Skipping the errors and recording downloads in kdb

I am trying to use a kdb+/q script to download files from a remote source.
How can I make the download keep going if there is an error?
Also, how can I record which files have been downloaded on Linux when there are other files in the same directory?
Here is my code:
file:("abc.csv";"def.csv");
dbdir:"/home/terry/";
dlFunc:{
system "download.sh abc.com user /"get /remote/path/",x /",dbdir};
dlFunc each file;
If you're asking how to continue downloading other files if one file fails then you can put a protected eval around your dlFunc each file, e.g.
@[dlFunc;;()]each file;
You could capture the list of failed files using something like:
badfiles:();
{@[dlFunc;x;{y;badfiles,:enlist x}x]}each file;
Then inspect the badfiles list afterwards. The ones that succeeded would be:
file except badfiles

Youtube-dl: how to get a direct download link to the merged file without creating a temp file on the server

Is there any way to create a direct download link to the merged file without creating a temp file on the server in youtube-dl?
youtube-dl -f 255+160 https://youtu.be/p-flvm1szbI
The above command downloads the selected video and audio formats and outputs the merged file.
I want to allow users to directly download the merged file to their computers -- without creating any temp file on my server. Is this possible?
(Creating a temp file and then letting the user download it is already possible.)

How to extract a .gz file that contains a folder with a .txt extension?

I'm currently stuck on a problem where my .gz file is "some_name.txt.gz" (the .gz is not visible, but the file can be recognized with File::Type functions),
and inside the .gz file there is a FOLDER named "some_name.txt", which contains other files and folders.
However, when I call the extract function from Archive::Extract, I cannot extract the archive the way a manual extraction would (producing the folder named "some_name.txt" along with its contents); it just extracts "some_name.txt" as a single .txt file.
I've been searching the web for answers, but none of them solve this. Is there a way around it?
From the official Archive::Extract docs:
"Since .gz files never hold a directory, but only a single file;"
I would recommend tarring the folder first and then gzipping it.
That way you can use Archive::Tar to easily extract a specific file:
Example from official docs:
$tar->extract_file( $file, [$extract_path] )
Write an entry, whose name is equivalent to the file name provided to disk. Optionally takes a second parameter, which is the full native path (including filename) the entry will be written to.
For example:
$tar->extract_file( 'name/in/archive', 'name/i/want/to/give/it' );
$tar->extract_file( $at_file_object, 'name/i/want/to/give/it' );
Returns true on success, false on failure.
Hope this helps.
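For example, a minimal sketch of that approach, assuming the data is repackaged as some_name.tar.gz (the archive and member names are hypothetical):
#!/usr/bin/perl
# Hypothetical sketch: extract one file from a .tar.gz with Archive::Tar.
# Archive and member names are made up for illustration.
use strict;
use warnings;
use Archive::Tar;

# Archive::Tar reads gzip-compressed tarballs directly (needs IO::Zlib).
my $tar = Archive::Tar->new('some_name.tar.gz') or die Archive::Tar->error;

print "$_\n" for $tar->list_files;   # see what is actually inside

# Pull out one entry and choose where it lands on disk.
$tar->extract_file('some_name/data_01.txt', '/tmp/data_01.txt')
    or die $tar->error;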
Maybe you can identify these files with File::Type, rename them with a .gz extension instead of .txt, and then try Archive::Extract on them?
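A rough sketch of that idea; instead of renaming, you can also hand Archive::Extract an explicit type once File::Type has identified the file (names are hypothetical, and this only helps if the gzip actually wraps a tar):
#!/usr/bin/perl
# Hypothetical sketch: check the real type with File::Type, then tell
# Archive::Extract what it is instead of relying on the file name.
use strict;
use warnings;
use File::Type;
use Archive::Extract;

my $file = 'some_name.txt.gz';                        # hypothetical input
my $mime = File::Type->new->checktype_filename($file);
print "detected: $mime\n";                            # e.g. application/x-gzip

# If the gzip actually wraps a tar (a folder cannot live in a bare .gz),
# passing an explicit type avoids renaming the file.
my $ae = Archive::Extract->new(archive => $file, type => 'tgz');
$ae->extract(to => '/tmp/out') or die $ae->error;
print "extracted to ", $ae->extract_path, "\n";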
A gzip file can only contain a single file. If you have an archive file that contains a folder plus multiple other files and folders, then you may have a gzip file that contains a tar file. Alternatively you may have a zip file.
Can you give more details on how the archive file was created and a listing of its contents?

Zipped data getting lost when copying from FTP to Windows

I am trying to copy some zipped files from an FTP server to my local system (Windows). The transfer mode is the default (ASCII). The file gets copied, and I see no problem during the transfer.
The problem is that the size of the file on the FTP server differs from the size of the copy on my local system.
FTP_file_size -> 12,812,085
Copied_file_size -> 12,551
The two should be the same.
I am not able to figure out what is going wrong with the transfer.
For the script I am using, please refer to:
Why am I getting "File not found" errors with this Perl script using Net::FTP?
You have to transfer in binary (type "I") mode. Otherwise the FTP client translates line-ending characters to the local convention (CR-LF on Windows), which corrupts the ZIP format.
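With Perl's Net::FTP, which the linked script uses, that just means calling binary before the get. A minimal sketch (host, credentials and file name are hypothetical):
#!/usr/bin/perl
# Minimal sketch: fetch a ZIP over FTP in binary mode with Net::FTP.
# Host, credentials and file name are hypothetical.
use strict;
use warnings;
use Net::FTP;

my $ftp = Net::FTP->new('ftp.example.com') or die "connect: $@";
$ftp->login('user', 'password') or die "login: ", $ftp->message;
$ftp->binary;                                   # type "I": no CR-LF translation
$ftp->get('archive.zip', 'archive.zip') or die "get: ", $ftp->message;
$ftp->quit;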

Compare file size after FTP get with the original file on the server

In SQL Server I'm using xp_cmdshell to run FTP commands. I have no problem getting the list of files or copying files to the local server, but I want to compare the copied file's size to the original to make sure the get was successful.
Any ideas on how to compare file sizes?
From a command prompt you can use the DOS file compare command (fc). There is no file-size-only compare, so in your case you probably want a binary compare, which should work.
Most DOS commands will return a code that lets you know the status.
http://www.computerhope.com/fchlp.htm
EDIT
Sorry, I re-read your question and realized you want to compare against the file on the FTP server. I think this is a moot point, since if FTP reports a successful transfer there is no reason to compare (unless your source of comparison is not the FTP site). Does that make sense?
What you could do is use the FTP ls command:
ftp> ls <filename>
where ftp> is the FTP prompt and not part of the command. This command gives you the file size in bytes. Then you need to use a DOS command to get the size of the local file. Here is a Stack Overflow question (and answer) about that:
Windows command for file size only?
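If you script the transfer itself rather than shelling out to ftp.exe, the same size check is only a few lines with Perl's Net::FTP. A sketch, with hypothetical host and file names:
#!/usr/bin/perl
# Hypothetical sketch: compare a remote file's size (FTP SIZE command)
# with the size of the local copy after a get.
use strict;
use warnings;
use Net::FTP;

my ($remote, $local) = ('reports/data.zip', 'C:/ftp/data.zip');  # made-up names

my $ftp = Net::FTP->new('ftp.example.com') or die "connect: $@";
$ftp->login('user', 'password') or die "login: ", $ftp->message;
$ftp->binary;                        # SIZE is byte-accurate only in binary mode

my $remote_size = $ftp->size($remote);
my $local_size  = -s $local;         # local size in bytes, undef if missing
$ftp->quit;

die "local file not found: $local\n" unless defined $local_size;
print(($remote_size == $local_size) ? "sizes match\n" : "size mismatch\n");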