Zipped data getting lost when copying from FTP to Windows - Perl

I am trying to copy a zipped file from FTP to my local system (Windows). The transfer mode is the default mode (ASCII). The file gets copied, and I see no problems during the transfer.
The problem is that the size of the file on the FTP server differs from the size of the copy on my local system.
FTP_file_size -> 12,812,085
Copied_file_size -> 12,551
These two files should be the same size.
I am not able to figure out what is going wrong with the transfer.
For the script I am using, please refer to:
Why am I getting "File not found" errors with this Perl script using Net::FTP?

You have to use binary mode (type "I") for the transfer. Otherwise the FTP client translates line-ending characters to the local convention (on Windows: CR-LF), which corrupts the ZIP format.
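For reference, a minimal Net::FTP sketch (the host, credentials, and file name here are placeholders, not taken from your script):

use strict;
use warnings;
use Net::FTP;

# Hypothetical host, credentials, and file name for illustration.
my $ftp = Net::FTP->new('ftp.example.com') or die "Cannot connect: $@";
$ftp->login('user', 'password') or die "Login failed: ", $ftp->message;

# The crucial step: switch to binary (type "I") mode so the client
# does not rewrite line endings inside the ZIP data.
$ftp->binary or die "Cannot set binary mode: ", $ftp->message;

$ftp->get('archive.zip') or die "Get failed: ", $ftp->message;
$ftp->quit;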

Related

PostgreSQL 14.5 pg_read_binary_file could not open file for reading: Invalid argument

Yesterday I installed PostgreSQL 14.5 on a Windows 10 laptop.
I then ran an old script to load images into a table.
The script uses the pg_read_binary_file function.
Some of the images are .jpg files and some are .png files.
Of the 34 files, only 5 were successfully processed (1 .jpg and 4 .png). The other 29 failed with the following error:
[Exception, Error code 0, SQLState XX000] ERROR: could not open file "file absolute path" for reading: Invalid argument
For instance, the following statement executes without errors
select pg_read_binary_file('C:\Users\Jorge\OneDrive\Documents\000\020-logos\adalid.png') as adalid_png;
... and the following statement fails
select pg_read_binary_file('C:\Users\Jorge\OneDrive\Documents\000\020-logos\oper.png') as oper_png;
... with the following error message
[Exception, Error code 0, SQLState XX000] ERROR: could not open file "C:/Users/Jorge/OneDrive/Documents/000/020-logos/oper.png" for reading: Invalid argument
So far, I have not been able to identify any difference in the files that could be the cause of the error. Also, I'm pretty sure the script works on earlier releases of version 14. Unfortunately I have not been able to find a website to download any of those earlier releases to test it again.
Has anyone else found this problem, and its solution?
I think the issue is somehow caused by OneDrive. This laptop is new; when I logged in with my Microsoft account, the OneDrive directory was automatically created and updated. Apparently this operation only updates the directory entries, leaving the contents of the files in the cloud until they are opened. When I zipped the directory that contains all my images, a message from OneDrive appeared saying that it was restoring some files at that moment. After that, all the commands in my scripts worked.
My theory is that pg_read_binary_file finds the file's directory entry, so it doesn't give the "No such file or directory" message, but then fails while reading the contents, giving the "Invalid argument" message instead.
The unanswered question would be: why does 7-Zip make OneDrive restore the files but pg_read_binary_file does not?
UPDATE
After more testing, and after reading Save disk space with OneDrive Files On-Demand for Windows, I am now sure that pg_read_binary_file can fail with the "Invalid argument" message when the OneDrive file is not locally available. In Windows File Explorer such a file has a blue cloud icon next to it.
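A workaround sketch in Perl, assuming all the images live in the directory from the example above: reading a byte from each file should force OneDrive to restore ("hydrate") it locally, just as zipping the directory did, so the SQL script can then read it.

use strict;
use warnings;

# Directory from the example above; adjust as needed.
my $dir = 'C:/Users/Jorge/OneDrive/Documents/000/020-logos';

opendir(my $dh, $dir) or die "Cannot open $dir: $!";
for my $name (grep { /\.(?:jpg|png)$/i } readdir $dh) {
    # Opening the file and reading one byte makes OneDrive
    # download the contents from the cloud.
    open(my $fh, '<:raw', "$dir/$name") or die "Cannot open $name: $!";
    read($fh, my $byte, 1);
    close $fh;
}
closedir $dh;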

Copy file from Remote Server using PgAdmin4

I am trying to copy a file from my office's remote server as CSV output on my local machine (Windows). I cannot use the Import/Export dialog; it shows the following error:
Utility file not found. Please correct the Binary Path in the Preferences dialog
The same operation works fine for files on the local server, which means I have already corrected the binary path in the preferences.
The COPY command gives the following error
ERROR: relative path not allowed for COPY to file
SQL state: 42602
\Copy doesn't work either.
Can anyone suggest a solution for this?
Good afternoon,
You can generate the CSV file like this:
COPY (SELECT * FROM schema.table) TO '/tmp/file.csv' WITH CSV DELIMITER '|';
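Note that a server-side COPY like the one above writes the file on the database server, which is also why a relative path is rejected. To land the CSV on your local machine you need a client-side copy, for example psql's \copy, or a small script. A sketch using Perl's DBD::Pg (the connection parameters are placeholders):

use strict;
use warnings;
use DBI;

# Hypothetical connection details; point them at the remote server.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb;host=remote.example.com',
                       'user', 'password', { RaiseError => 1 });

# COPY ... TO STDOUT streams rows to the client, so the output
# file below is created on the local machine, not on the server.
open(my $out, '>', 'file.csv') or die "Cannot open file.csv: $!";
$dbh->do(q{COPY (SELECT * FROM schema.table) TO STDOUT WITH CSV DELIMITER '|'});
my $row;
while ($dbh->pg_getcopydata($row) >= 0) {
    print {$out} $row;
}
close $out;
$dbh->disconnect;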

How to overwrite existing files with ncftpget?

When getting a remote FTP file that already exists at the local destination,
ncftpget says:
local file appears to be the same as the remote file, download is not necessary.
What does "appears" mean? How does ncftpget check whether this is the same file?
It seems to compare the time and size of the file. But does it compare the content, or at least a checksum?
Is there a way to force it to overwrite the existing file, other than removing it first?

Copying large files using Remote Desktop

I have a 4 GB text file, compressed to a 1.4 GB zip file. I need to copy it over to a secure Windows server using RDP. I am able to copy small files, but not this one: the transfer runs for 15 minutes and then shows an error. Any tips?
You can try to copy it by using Drive Redirection. Here's a tutorial.
BTW, RDP cannot copy files larger than 2 GB via the clipboard, as stated in Microsoft support.
The Windows RDP clipboard has a limit of about 2 GB. If you want to copy a file larger than 2 GB, you can try any of these options:
Split the file into parts, say 1 GB each, with the help of WinRAR or other software.
Use any FTP software.
Map a local PC drive into the Remote Desktop session (to move or copy the data).
File size doesn't matter - I have copied folders of 30 GB and more through a Remote Desktop connection. While doing this I received an "Unspecified error". The problem is that you aren't allowed to use the clipboard again while a copy is running, no matter whether you use the clipboard on the local machine or the remote one. In short: don't use Ctrl+C during the copy.
The maddening part is that the error is delayed, so you don't quickly recognize that the two things are related.
Format a USB drive as NTFS, connect the drive as a local resource in the Remote Desktop session, then run:
NET USE X: \\TSCLIENT\F
robocopy c:\source x:\
net use X: /delete
If you are an administrator, you can copy files of any size over the network using administrative shares, assuming they have not been purposely disabled.
Enter the following path in File Explorer and you will see all the files and folders on that computer's C drive, with read and write access:
\\computername\c$
Right-click the zip file >> Properties >> Advanced >> Encrypt contents.
Open your OneDrive or Google Drive (if the secure server allows it) and park the encrypted file there.
(You might have to get more OneDrive space by signing up for a month of Microsoft 365.)
https://support.microsoft.com/en-us/office/manage-your-onedrive-storage-and-limits-989fce19-ade6-4e2f-81fb-941eabefee28
I guess it might also be possible to use Google Datastore or something similarly cloud-based.

Compare file size after FTP get with the original file on the server

In SQL Server I'm using xp_cmdShell to run FTP commands. I have no problem getting the list of files or copying files to the local server, but I want to compare the copied file's size to the original's to make sure the get was successful.
Any ideas on how to compare file sizes?
From a command prompt you can use the DOS file compare command (fc). In your case you probably want to do a binary compare (there is no size-only compare); a binary compare should work in your case.
Most DOS commands return a code that lets you know the status.
http://www.computerhope.com/fchlp.htm
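If you would rather stay in Perl (as in the first question above), the core File::Compare module does a byte-by-byte compare; the file names here are placeholders:

use strict;
use warnings;
use File::Compare;

# Hypothetical file names for illustration.
my $result = compare('copy_from_ftp.zip', 'original.zip');

# compare() returns 0 if equal, 1 if different, -1 on error.
if ($result == 0) {
    print "Files are identical\n";
} elsif ($result == 1) {
    print "Files differ\n";
} else {
    die "Comparison failed: $!";
}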
EDIT
Sorry, I re-read your question and realized you want to compare against a file on the FTP server. I think this is a moot point, since if FTP reports a successful transfer there is no reason to compare (unless your source of comparison is not the FTP site). Does that make sense?
What you could do is use the FTP ls command:
ftp> ls <filename>
where ftp> is the FTP prompt and not part of the command. This command gives you the file size in bytes. Then you need to use a DOS command for the local file. Here is a StackOverflow question (and answer) about that:
Windows command for file size only?
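If calling a script from xp_cmdShell is an option, here is a sketch in Perl (server, credentials, and file name are placeholders) that compares the remote size reported by Net::FTP's size method against the local file's size:

use strict;
use warnings;
use Net::FTP;

# Hypothetical server, credentials, and file name for illustration.
my ($host, $file) = ('ftp.example.com', 'archive.zip');

my $ftp = Net::FTP->new($host) or die "Cannot connect: $@";
$ftp->login('user', 'password') or die "Login failed: ", $ftp->message;
$ftp->binary;

my $remote_size = $ftp->size($file);   # size on the server, in bytes
my $local_size  = -s $file;            # size of the local copy, in bytes
$ftp->quit;

if (defined $remote_size and defined $local_size
        and $remote_size == $local_size) {
    print "Sizes match: $remote_size bytes\n";
} else {
    print "Size mismatch or missing file (remote: ",
          $remote_size // 'unknown', ", local: ",
          $local_size // 'missing', ")\n";
}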