FTP transfer server to server using SSH/command line

I have a bunch of vendors that make their FTPs available so I can download images of their products. Some of them like to split the images into multiple subfolders, named by collection or style and then by SKU. For example, they will create a folder structure like:
Main folder
---> Collection A
------> Sku A
----------> SKUApicture1.jpg, SKUApicture2.jpg
------> sku B
----------> SKUBpicture1.jpg, SKUBpicture2.jpg
---> Collection B
------> Sku C
----------> SKUCpicture1.jpg, SKUCpicture2.jpg
------> sku D
----------> SKUDpicture1.jpg, SKUDpicture2.jpg
Until now, I have found it easiest to log onto my server via SSH, navigate to the folder I want, and then log on to my vendor's FTP, enter the username and password, navigate to the folder I want, and grab all the images using mget. If all (or most) of the files are in one folder, this is simple.
The problem is that mget won't take any folders or subfolders; it only takes files within the current folder. In the example above, my vendor has over 10 folders and each one has 100+ subfolders, so navigating into each one isn't an option.
Also, the industry I deal in isn't tech savvy, so asking their "tech people" to enable/allow SCP, SFTP, rsync, etc., is likely not an option.
Downloading all the images locally and re-uploading them to my server also isn't practical, as this folder is over 10GB.
I'm looking for a command (mget or other) that will enable me to take ALL files and subfolders, as is, and copy straight to my server (via SSH).
Thanks
NOTE: For this particular server I tried rsync, but got an error telling me it wasn't compatible with that command. I doubt I have the command wrong, but if you want to post the proper way to rsync I'll be more than happy to try it again and provide the exact error.

Have you tried something like
wget -r ftp://username:password@ftp.example.com/
It should recursively get all the files from the remote FTP server.
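If putting the password in the URL is a concern (it ends up in shell history and in the process list), the credentials can also be passed as options. A sketch only, with ftp.example.com and somedir standing in for the vendor's real host and top-level directory:
wget -m -nH --ftp-user=username --ftp-password=password ftp://ftp.example.com/somedir/
Here -m mirrors recursively and -nH drops the host name from the local directory layout, so the vendor's folder structure is reproduced as-is under the current directory.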

You can use lftp:
lftp -e 'mirror <remote download dir> <local download dir>' -u <username>,<pass> <host>
Taken from Copying Folder Contents with Subdirectories Over FTP.
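For example, a concrete invocation (a sketch only; the host, credentials, and paths are placeholders) that resumes interrupted transfers and fetches several files in parallel:
lftp -u username,password -e 'mirror --continue --parallel=4 /somedir /var/www/images; quit' ftp.example.com
mirror also accepts --only-newer, which is handy when re-running the copy later to pick up new images without re-downloading everything.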

Have you considered using SFTP? You said that FTP works the way you want, and SFTP behaves the same way from your point of view: an FTP client with SFTP support works just like before, it simply uses SSH to connect.
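If the vendor's server does happen to have SFTP enabled (many SSH servers ship with it on by default), a recursive pull from your server could look like the sketch below; the host and paths are placeholders:
sftp username@ftp.example.com
sftp> get -r somedir /var/www/images
OpenSSH's sftp supports get -r for pulling a whole directory tree in one command.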

Related

Firebase hosting: The remote web server hosts what may be a publicly accessible .bash_history file

We host our website on firebase. We fail a security check due to the following reason:
The remote web server hosts publicly available files whose contents may be indicative of a typical bash history. Such files may contain sensitive information that should not be disclosed to the public.
The following .bash_history files are available on the remote server:
- /.bash_history
- /cgi-bin/.bash_history
- /scripts/.bash_history
Note, these files are being flagged because you have set your scan to 'Paranoid'. The contents of the detected files have not been inspected to see if they contain any of the common Linux commands one might expect to see in a typical .bash_history file.
The problem is that we don't have an easy way to get access to the hosting machine and delete these files.
Does anybody know how this can be solved?
If you are using Firebase Hosting, you should check the directory (usually public) that you are uploading via the firebase deploy command. Hosting serves only those files (plus a couple of auto-generated ones under the reserved __/ path for auto-configuration).
If you have a .bash_history, cgi-bin/.bash_history or scripts/.bash_history in that public directory, then it will be uploaded to and served by Hosting. There are no automatically served files with those names.
You can check your public directory, and update the list of files to ignore on the next deploy using the firebase.json file (see this doc). You can also download all the files that Firebase Hosting is serving for you using this script.
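For reference, a minimal hosting block in firebase.json might look like the sketch below (the public directory name is an assumption); the default ignore patterns generated by firebase init already exclude dotfiles such as .bash_history via the **/.* glob:
{
  "hosting": {
    "public": "public",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ]
  }
}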

How to turn a folder hierarchy into a share on one server and copy permissions from a different server?

Server 1, which is currently our file sharing server, is going away. Server 2 is going to replace Server 1. Right now Server 2 has the same folder structure and files already added to it, but none of it is set up as a share. Is there a way I can turn all the folders on Server 2 into shares, copy the permissions from the corresponding folders on Server 1, and apply them to the shares on Server 2? For example:
Server 1 has file structure dir1/dir2/dir3/file.txt.
Server 2 will have the same exact file structure dir1/dir2/dir3/file.txt
I need to turn the Server 2 folders into shares, then copy the permissions from each directory/file on Server 1 and apply them to the counterparts on Server 2. I would prefer to do this programmatically, as we have a lot of different shares/directories. I tried using Get-Acl and Set-Acl, but that only handles one folder per call. Can anyone point me in the right direction?
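One way around the one-folder-per-call limitation is to loop Get-Acl/Set-Acl over the whole tree. A rough sketch only, assuming both trees are reachable over UNC paths (the paths below are made up):
$src = '\\Server1\D$\Shares'
$dst = '\\Server2\D$\Shares'
Get-ChildItem -Path $src -Recurse | ForEach-Object {
    # Build the matching path on the new server
    $relative = $_.FullName.Substring($src.Length).TrimStart('\')
    $target   = Join-Path $dst $relative
    if (Test-Path $target) {
        # Read the ACL from the old server and stamp it onto the counterpart
        Get-Acl -Path $_.FullName | Set-Acl -Path $target
    }
}
Alternatively, icacls with /save and /restore, or robocopy with the /SEC switch, can copy NTFS permissions in bulk.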

mutt + offlineimap and only a few folders offline

I have been trying to understand offlineimap with a mutt configuration, but I probably don't fully get it. In the end, I realised that what I need is to have only, e.g., Inbox and Sent offline. That configuration one can find on the internet, but I also need to be able to access the other folders in mutt without having to download them offline.
E.g.
I want all mail in Inbox to be downloaded to the computer and mutt to access it from the local repository. But I also need to access the folder Inbox/SomeMore, without having to reconfigure mutt and offlineimap and, most importantly, without downloading the whole content of that folder to the computer.
Is this doable? And exactly how?
offlineimap's job is to download mail and make it available in offline situations. There is no way to temporarily download the content of some mail folders. It might be possible to go for a hacky solution: restrict which folders get synced with the folderfilter option and additionally set up mutt to access the other IMAP folders directly.
The folder filter lists the folders to be synchronized, so anything not listed is skipped. Instead of adding just the subfolder's name to the list, it might even be possible to refer to it as INBOX/foo (in case of having multiple folders with the same name):
folderfilter = lambda folder: folder in ['INBOX', 'Sent', 'Drafts', 'Junk', 'foo', ...]
PS: If folderfilter is not specified at all, ALL remote folders will be synchronized.
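Putting that together, a minimal sketch of the two configs (the account name, mail host, user, and paths are all placeholders, not taken from the question):
# ~/.offlineimaprc
[general]
accounts = Example

[Account Example]
localrepository = ExampleLocal
remoterepository = ExampleRemote

[Repository ExampleLocal]
type = Maildir
localfolders = ~/Mail

[Repository ExampleRemote]
type = IMAP
remotehost = mail.example.com
remoteuser = me
# sync only these folders to the local maildir
folderfilter = lambda folder: folder in ['INBOX', 'Sent']

# ~/.muttrc
set folder = ~/Mail
set spoolfile = +INBOX
# non-synced folders can still be opened directly over IMAP, e.g. with the
# change-folder command: imaps://me@mail.example.com/INBOX/SomeMore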

Is it possible to keep a local mirror of an FTP site with aria2?

I have a website to which I have FTP access only (otherwise I'd use rsync for this) and I'd like to keep a local copy of it. At the moment I run the following wget command every once in a while
wget -m --ftp-user=me --ftp-password=secret ftp://my.server.com
When there are many updates it does get tedious, with wget only having one connection at a time. I read about aria2 but couldn't find any hints as to whether it could be used as a replacement for this purpose.
No. According to the aria2 docs, the option for downloading only newer files works with HTTP(S) only:
--conditional-get[=true|false]
Download file only when the local file is older than remote file. This function only works with HTTP(S) downloads only. It does not work if file size is specified in Metalink.

What is the best way to transfer files between servers constantly?

I need to transfer files between servers ... not one specific move, but continuous moves. What is the best way?
scp, ftp, rsync ... something else?
Of course, if it's a "common" method (like ftp) I would lock it down so it only works between the servers' IPs.
I also need a SECURE way to transfer files ... I mean, to be totally sure that the files have been transferred successfully.
Does rsync have a way to know the files were transferred successfully? Maybe an option to check size or checksum or whatever?
Note: The servers are in different locations.
Try rsync, a utility that keeps copies of files synchronized between two computer systems.
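A sketch of what that might look like over SSH (the host name and paths are placeholders): rsync exits with a non-zero status if anything fails, and --checksum forces a full checksum comparison instead of relying on size and modification time alone:
rsync -avz --checksum -e ssh /data/outgoing/ user@server2.example.com:/data/incoming/
Running it over SSH also covers the security requirement, since the transfer is encrypted and only the SSH port needs to be reachable between the servers.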