mutt + offlineimap and only a few folders offline - email

I have been trying to understand how to configure offlineimap with mutt, but I probably don't fully understand it yet. In the end, I realised that what I need is to have only e.g. Inbox and Sent available offline. That configuration can be found on the internet, but I also need to be able to access the other folders in mutt without having to download them.
E.g.
I want all mail in Inbox to be downloaded to the computer, with mutt accessing it from the local store. But I also need to access the folder Inbox/SomeMore, without having to reconfigure mutt and offlineimap and, most importantly, without downloading the whole content of that folder to the computer.
Is this doable? And exactly how?

offlineimap's job is to download mail and make it available when you are offline. There is no way to temporarily download the content of some mail folders. It might be possible to go for a hacky solution: restrict which folders are synced with the folderfilter option, and additionally set up mutt to access the other IMAP folders directly.
You can specify a folderfilter that syncs only the listed folders, which effectively excludes everything else. Instead of adding just the subfolder's name to the list, it might even be possible to reference it by its full path, like INBOX/foo (useful if you have multiple folders with the same name):
folderfilter = lambda folder: folder in ['INBOX', 'Sent', 'Drafts', 'Junk', 'foo', ...]
PS: If folderfilter is not specified at all, ALL remote folders will be synchronized.
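For illustration, here is a minimal sketch of what that split setup could look like; the account name, mail host, user and paths are made-up placeholders, and details such as SSL and password handling are left out. offlineimap syncs only the listed folders into a local Maildir, and mutt reads that Maildir while also listing one IMAP URL that it opens on demand:

# ~/.offlineimaprc (sketch)
[general]
accounts = work

[Account work]
localrepository = work-local
remoterepository = work-remote

[Repository work-local]
type = Maildir
localfolders = ~/Mail/work

[Repository work-remote]
type = IMAP
remotehost = imap.example.com
remoteuser = me
# only these folders are downloaded; everything else stays on the server
folderfilter = lambda folder: folder in ['INBOX', 'Sent']

# ~/.muttrc (sketch)
set folder = ~/Mail/work
set spoolfile = +INBOX
mailboxes +INBOX +Sent
# not synced locally; mutt speaks IMAP directly when you open this mailbox
set imap_user = me
mailboxes imaps://imap.example.com/INBOX/SomeMore

With something like this, INBOX and Sent are read from disk, while INBOX/SomeMore is fetched from the server only when you actually change into it.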

Related

import of mail folders by Thunderbird

My company is no longer supporting our Linux mail server (everything will be handled by Gmail).
Over the years I've run many mail clients on the Linux server: elm, alpine, SquirrelMail, Roundcube. My most recent client has been Roundcube.
Ideally I'd like Thunderbird to import the most current folders from Roundcube; these appear to be inside Maildir/ (with deeper directories like .saved-mailed, etc.). But I also have Mail/ (which alpine appears to reference).
But upon adding this account to Thunderbird, some mix of folders is presented to me: not all from Maildir/ and not all from Mail/... in fact, no 'new' Roundcube folders are presented at all.
Where does Thunderbird look on a Linux mail server for folders to 'subscribe' to? And how can I access that location to force the subscription of the folders I actually want?
I gave up trying to determine where Thunderbird (Tb) searches for mail.
Instead I copied all Roundcube email in Maildir/ to my local machine and then used the code here
https://gist.github.com/lftbrts/249f034a439d3eb2e008f73506cacc2d
to convert that email to mbox format.
Then I copied all that converted email to Tb's 'Local Folders' directory; Tb was able to load all the converted folders and then I 'dragged' them (using Tb) to the synced Gmail account.
So the above-named code saved the day!
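For reference, the same Maildir-to-mbox conversion can be roughly sketched with Python's standard mailbox module (the paths below are hypothetical; the linked gist is what was actually used here). Subfolders such as .saved-mail can be listed with list_folders() and converted in the same way:

# maildir2mbox.py -- rough sketch: copy every message from a Maildir into an mbox file
import mailbox

src = mailbox.Maildir('/home/me/Maildir', factory=None)   # hypothetical source path
dst = mailbox.mbox('/home/me/converted/INBOX.mbox')       # hypothetical target file

dst.lock()
for key, msg in src.iteritems():
    dst.add(msg)    # the mailbox module converts the message to mbox format
dst.flush()
dst.unlock()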

Talend: Using tFileList to access files from a shared network path

I have a Talend job that searches a directory and then uploads what it finds to our database.
It's something like this: dbconnection > tWaitForFile > tFileList > file schema > tMap > db
I have an OnSubjobOk trigger that then commits the data into the table, iterates through the directory, and moves the files to another folder.
Recently I was instructed to change the directory to a shared network path, using the same components as before (I originally thought of changing the components to tFTPFileList, etc.).
My question is how to point it at the shared network path. I was able to get it to go through using double backslashes (\\), but it won't read any of the new files arriving.
Thanks!
I suppose that if you use tWaitForFile on the local filesystem, Talend/Java hooks into the folder somehow and gets notified when a new file is put into it.
Now that you are on a network drive, this is out of reach of the component in the first place; second, the OS behind the network drive could be different.
I understand your job runs all the time, listening. You could change the behaviour by putting a tLoop first, which would poll the file system for new files and then proceed. There needs to be some delta check for how the new files get recognized.

FTP transfer server to server using SSH/command line

I have a bunch of vendors that make their FTP servers available for downloading images of their products. Some of these guys like to put the images into multiple subfolders, using the collection or style name and then the SKU. For example, they will make a folder structure like:
Main folder
---> Collection A
------> Sku A
----------> SKUApicture1.jpg, SKUApicture2.jpg
------> sku B
----------> SKUBpicture1.jpg, SKUBpicture2.jpg
---> Collection B
------> Sku C
----------> SKUCpicture1.jpg, SKUCpicture2.jpg
------> sku D
----------> SKUDpicture1.jpg, SKUDpicture2.jpg
Until now, I have found it easiest to log onto my server via SSH, navigate to the folder I want, then log on to my vendor's FTP, put in the username and password, navigate to the folder I want, and take all the images using mget. If all (or most) of the files are in one folder, this is simple.
The problem is that mget won't take any folders or subfolders; it will only take files within the given folder. In the above example, my vendor has over 10 folders and each one has 100+ subfolders, so navigating to each one isn't an option.
Also, the industry I deal in isn't tech savvy, so asking their "tech people" to enable/allow SCP, SFTP, rsync, etc. is likely not an option.
Downloading all the images locally and re-uploading them to my server also isn't practical, as this folder is over 10GB.
I'm looking for a command (mget or other) that will enable me to take ALL files and subfolders, as is, and copy straight to my server (via SSH).
Thanks
NOTE: For this particular server I tried rsync, but got an error telling me it wasn't compatible with that command. I doubt I have the command wrong, but if you want to post the proper way to rsync I'll be more than happy to try it again and provide the exact error.
Have you tried something like
wget -r ftp://username:password@ftp.example.com/
It should recursively get all the files from the remote FTP server.
You can use lftp:
lftp -e 'mirror <remote download dir> <local download dir>' -u <username>,<pass> <host>
Taken from Copying Folder Contents with Subdirectories Over FTP.
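For example, run on the destination server over SSH, an invocation along these lines would pull the whole tree in one go; the user, password, host and paths are made-up placeholders:

lftp -e 'mirror --verbose /images /var/www/vendor-images; quit' -u vendoruser,vendorpass ftp.vendor.example.com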
Have you considered using SFTP? You said that FTP works the way you want it to, and SFTP behaves the same way from the client's point of view: an FTP client with SFTP support works just the same, but it connects over SSH.

Dump an IMAP Mailbox to a local folder

I need to dump IMAP mailboxes from an IMAP server.
I would like to replicate the IMAP folder structure under a specific path.
I would like to dump all emails and sub-folders (and the emails within them), but onto the filesystem:
Emails in EML format, one file per message (MAIL1.EML, MAIL2.EML, etc.)
IMAP folders as... folders!
I tried getmail but it does not work as expected (I don't want the qmail-style Maildir...).
Any solution?
The solution I found myself is to use the imaputils library (hosted on Google Code):
https://code.google.com/p/imaputils/
This library has an "iu-dump" utility that does exactly what I was looking for.
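If you would rather avoid an extra tool, roughly the same dump can be sketched with Python's standard imaplib, writing one .eml file per message and one directory per folder. The host, credentials and output directory below are placeholders, and the folder-name parsing assumes the server uses "/" as its hierarchy delimiter:

# imap_dump.py -- rough sketch: mirror IMAP folders to directories of .eml files
import imaplib, os

imap = imaplib.IMAP4_SSL('imap.example.com')   # hypothetical server
imap.login('user', 'password')

status, folders = imap.list()
for raw in folders:
    # the folder name is the last quoted part of each LIST response line
    name = raw.decode().split(' "/" ')[-1].strip('"')
    imap.select('"%s"' % name, readonly=True)
    outdir = os.path.join('dump', name.replace('/', os.sep))
    os.makedirs(outdir, exist_ok=True)
    status, data = imap.search(None, 'ALL')
    for i, num in enumerate(data[0].split(), start=1):
        status, msg = imap.fetch(num, '(RFC822)')
        with open(os.path.join(outdir, 'MAIL%d.EML' % i), 'wb') as f:
            f.write(msg[0][1])

imap.logout()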

What is the best way to transfer files between servers constantly?

I need to transfer files between servers... not one specific move, but continuous moves. What is the best way?
scp, ftp, rsync... other?
Of course, if it's a "common" method (like FTP), I would lock it down so it only works between the IPs of the servers.
Also, I need a SECURE way to transfer files... I mean, to be totally sure that the files have been moved successfully.
Does rsync have a way to know that the files were moved successfully? Maybe an option to check size or checksum or whatever.
Note: The servers are in different locations.
Try rsync, a utility that keeps copies of files synchronized between two computer systems.
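For instance, an invocation along these lines (the paths, user and host are placeholders) runs rsync over SSH; the -c/--checksum option makes rsync compare files by checksum instead of size and modification time, and rsync exits with a non-zero status if anything goes wrong, which a cron job or wrapper script can check:

rsync -avz --checksum -e ssh /srv/outgoing/ backupuser@remote.example.com:/srv/incoming/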