I am using imapcopy on an Ubuntu 14.04 server, together with telnet, to migrate and edit heavy IMAP mailboxes. I would like to flatten the folder structure, meaning that I want to select and copy all emails from all child folders into one "Import" folder.
I can't seem to be able to do just that. Any suggestions for an alternative way to do this?
Are you using this? https://launchpad.net/ubuntu/+source/imapcopy/1.04-1. If so, edit imaptools.pas and change the following line as shown to copy all messages to the IMPORT folder on the destination.
Original:
Result := Command ('APPEND '+Mailbox + Flags + ' {' + IntToStr (Length(Msg)) + '}',TRUE);
New:
Result := Command ('APPEND IMPORT' + Flags + ' {' + IntToStr (Length(Msg)) + '}',TRUE);
If the IMPORT folder does not already exist you'll need to create it.
Err := Dst.CreateMailbox ('IMPORT');
Then recompile it.
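On Ubuntu, one way to rebuild the patched binary is via the source package (a sketch only; it assumes deb-src entries are enabled in your sources.list, and the unpacked directory name may differ slightly from the 1.04-1 version linked above):
# Fetch the Ubuntu source package, apply the patch, rebuild and install the .deb.
apt-get source imapcopy
sudo apt-get build-dep imapcopy
cd imapcopy-1.04
$EDITOR imaptools.pas            # apply the APPEND / CreateMailbox changes above
dpkg-buildpackage -us -uc
sudo dpkg -i ../imapcopy_*.deb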
You can use https://github.com/Schluggi/pymap-copy as well. I think this is more convenient, because you don't have to compile anything.
If you want the behaviour described above (copy every mail of every subfolder into the import folder):
./pymap-copy.py \
--source-user=user1 \
--source-server=server1.example.org \
--source-pass=2345678 \
--destination-user=user2 \
--destination-server=server2.example.info \
--destination-pass=abcdef \
--redirect '*:INBOX.import'
And in case you want to maintain your folder structure:
./pymap-copy.py \
--source-user=user1 \
--source-server=server1.example.org \
--source-pass=2345678 \
--destination-user=user2 \
--destination-server=server2.example.info \
--destination-pass=abcdef \
--destination-root INBOX.import
I am trying to set the root user password in a custom recipe for Yocto Dunfell.
The recipe looks like this (I have also tried EXTRA_USERS_PARAMS_append as shown in some other Stack Overflow posts, and it did not work either):
SUMMARY = "Test"
LICENSE = "CLOSED"
# Remove debugging tweaks
IMAGE_FEATURES_remove += " \
debug-tweaks \
"
# Add root password, and add the 'test' user
inherit extrausers
EXTRA_USERS_PARAMS = " \
usermod -P testpasswd root; \
useradd -p '' test \
"
FILES_${PN} = " /test/temp \
"
do_install () {
install -d ${D}/test/tmp
}
If I build an image with this recipe, I can log in as root with no password, and when I check /etc/shadow the test user has not been created.
I have verified that my desired directory /test/temp is created.
You should also remove the allow-empty-password and empty-root-password features from IMAGE_FEATURES if they are set anywhere.
Also, you didn't end useradd -p '' test with a semicolon; this can cause an error.
And you should make sure that debug-tweaks is not added in other configuration files such as local.conf.
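Putting those points together, a minimal sketch of the corrected assignments (shown as a shell heredoc so it stays copy-pasteable; the recipe file name is a placeholder for your actual image recipe, and the underscore-style overrides are the ones Dunfell uses):
# Sketch only: append the corrected settings to your image recipe.
cat >> recipes-core/images/my-test-image.bb <<'EOF'
# debug-tweaks and the empty-password features must not be in IMAGE_FEATURES,
# otherwise the root password is blanked again when the rootfs is assembled.
IMAGE_FEATURES_remove = " debug-tweaks allow-empty-password empty-root-password"

inherit extrausers
# Every command ends with a semicolon, including the last one.
EXTRA_USERS_PARAMS = " \
    usermod -P testpasswd root; \
    useradd -p '' test; \
    "
EOF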
I want to mirror a site using wget:
wget --mirror \
--convert-links \
--adjust-extension \
--page-requisites \
--no-parent \
--wait=2 \
--progress=bar \
--show-progress \
--output-file=$LOG_FILE \
--directory-prefix=$DIR_PATH \
$URL
Now, it has been working well, but I have come across a website where the main page I want to start from is under https://www.website.org/unique_path/here.html, yet it contains references to files or links like https://www2.website.org/unique_path/there.pdf. However, --no-parent prevents the download of the content under the www2... URL. Is there a way to circumvent this? (Or some option that would work like --no-parent but let me specify, with a wildcard expression, where it is OK to go and download?)
Is there a way to circumvent this?
You are apparently looking for the Spanning Hosts options: you must provide the -H option and then you can supply a comma-separated list of acceptable domains via -D. Using your example:
wget <your current options here> -H -D www.website.org,www2.website.org <your URL here>
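Spelled out against the command from the question (a sketch; the host names are the placeholders from your post):
# -H allows wget to span to other hosts; -D limits which hosts it may span to.
wget --mirror \
--convert-links \
--adjust-extension \
--page-requisites \
--no-parent \
--wait=2 \
--progress=bar \
--show-progress \
--output-file=$LOG_FILE \
--directory-prefix=$DIR_PATH \
-H \
-D www.website.org,www2.website.org \
$URL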
I have problems fetching MariaDB. Because I don't need this package, I'm trying to remove it. First, I tried to understand what includes it:
$ grep -nrw ../layers/ -e mariadb
Binary file ../layers/meta-openembedded/.git/index matches
../layers/meta-openembedded/meta-oe/recipes-core/packagegroups/packagegroup-meta-oe.bb:99: leveldb libdbi mariadb mariadb-native \
Looking into packagegroup-meta-oe.bb I found:
RDEPENDS_packagegroup-meta-oe-dbs ="\
leveldb libdbi mariadb mariadb-native \
mysql-python postgresql psqlodbc rocksdb soci \
sqlite \
${@bb.utils.contains("DISTRO_FEATURES", "bluez4", "mongodb", "", d)} \
"
hence I tried to remove packagegroup-meta-oe-dbs in my <image>.bb:
IMAGE_INSTALL_remove = "packagegroup-meta-oe-dbs"
But it still insists on building it.
Where is my mistake?
Since mariadb is a runtime dependency of packagegroup-meta-oe-dbs, you cannot remove it from the image without removing packagegroup-meta-oe-dbs itself.
What you need to do instead is create a bbappend for the packagegroup-meta-oe recipe and add the following line to it:
RDEPENDS_packagegroup-meta-oe-dbs_remove = "mariadb"
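A minimal sketch of where such a bbappend could live (the layer name is hypothetical; mirror the recipe path used by meta-openembedded and make sure your layer's BBFILES picks up .bbappend files):
# Create a bbappend in your own layer that strips mariadb from the packagegroup.
mkdir -p meta-mylayer/recipes-core/packagegroups
cat > meta-mylayer/recipes-core/packagegroups/packagegroup-meta-oe.bbappend <<'EOF'
# Add mariadb-native here as well if it still gets pulled in.
RDEPENDS_packagegroup-meta-oe-dbs_remove = "mariadb"
EOF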
I have downloaded 1000G dataset in the vcf format. Using Plink 2.0 I have converted them into binary format.
Now I need to merge the 1-22 chromosomes.
I am using this script:
${BIN}plink2 \
--bfile /mnt/jw01-aruk-home01/projects/jia_mtx_gwas_2016/common_files/data/clean/thousand_genomes/from_1000G_web/chr1_1000Gv3 \
--make-bed \
--merge-list /mnt/jw01-aruk-home01/projects/jia_mtx_gwas_2016/common_files/data/clean/thousand_genomes/from_1000G_web/chromosomes_1000Gv3.txt \
--out /mnt/jw01-aruk-home01/projects/jia_mtx_gwas_2016/common_files/data/clean/thousand_genomes/from_1000G_web/all_chrs_1000G_v3 \
--noweb
But, I get this error
Error: --merge-list only accepts 1 parameter.
The chromosomes_1000Gv3.txt has files related to chromosomes 2-22 in this format:
chr2_1000Gv3.bed chr2_1000Gv3.bim chr2_1000Gv3.fam
chr3_1000Gv3.bed chr3_1000Gv3.bim chr3_1000Gv3.fam
....
Any suggestions as to what might be the issue?
Thanks
The --merge-list flag cannot be used in combination with --bfile. You can have either --bfile/--bmerge or --merge-list, but only one of them, in a single plink command.
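As a hedged sketch of what that leaves (it assumes the same ${BIN}plink2 binary accepts --merge-list as in your script, and that chr1_1000Gv3 has been added as an extra line to chromosomes_1000Gv3.txt so that all 22 filesets are listed there; the long paths are shortened to placeholders):
# Merge every fileset named in the list in one step; no --bfile is given,
# so plink takes all of its inputs from the merge list itself.
${BIN}plink2 \
--merge-list /path/to/from_1000G_web/chromosomes_1000Gv3.txt \
--make-bed \
--out /path/to/from_1000G_web/all_chrs_1000G_v3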
I was wondering if there was a command to download the contents of a remote folder, i.e. all the files contained within that specific folder.
For instance, if we take the URL http://plugins.svn.wordpress.org/hello-dolly/trunk/ - How would it be possible to download the two files contained within the trunk onto my local machine without having to download each file manually?
Also, if there is a way to download all contents including both files AND any listed subdirectories that would be great.
If you ever need to download an entire Web site, perhaps for off-line viewing, wget can do the job.
For example:
$ wget \
--recursive \
--no-clobber \
--page-requisites \
--html-extension \
--convert-links \
--restrict-file-names=windows \
--domains wordpress.org \
--no-parent \
http://plugins.svn.wordpress.org/hello-dolly/trunk/
This command downloads the Web site http://plugins.svn.wordpress.org/hello-dolly/trunk/
The options are:
--recursive: download the entire Web site.
--domains wordpress.org: don't follow links outside wordpress.org.
--no-parent: don't ascend to the parent directory, i.e. don't follow links outside hello-dolly/trunk/.
--page-requisites: get all the elements that compose the page (images, CSS and so on).
--html-extension: save files with the .html extension.
--convert-links: convert links so that they work locally, off-line.
--restrict-file-names=windows: modify filenames so that they will work in Windows as well.
--no-clobber: don't overwrite any existing files (used in case the download is interrupted and resumed).