Migrate Postfix accounts into Google Apps

I have a Postfix server (Linux) hosting a large amount of email (120 GB across 70 accounts) that needs to be migrated to Google Apps. Only 30 accounts remain active; the rest are archives.
What is an efficient way to migrate the active accounts into Google Apps while minimizing disruption? Are there scripts that read directly from the server disk and upload? What about folders and message statuses (read/flagged)?

Depending on the needs of your users, it's usually easiest to cut over the MX records first and send all new mail to Apps, then migrate their old data. That way downtime is very limited, and they can always access the old server if they need something there.
The alternative would be to do multiple waves of migration that look something like:
Migrate all data from date X to Y
Change MX records for all users and send them to Apps
Run a second migration from date Y to Z to pick up any missed data (there won't be duplicate emails if the ranges overlap)
Regarding your last point, I'm guessing your users access the mail server via at least IMAP, since Postfix on its own is only an MTA and doesn't serve mailboxes (that's typically handled by something like Dovecot or Courier). If that's the case, you'll want to do an IMAP migration, for which Google provides a tool (GAMME). An IMAP migration will bring over the folder structure as well as read/unread status, but I don't believe any 'flagged' status will translate.
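If you want a quick sanity check of what an IMAP migration has to work with, you can inspect a mailbox yourself. A minimal Python sketch with imaplib (the host and credentials are placeholders):

import imaplib

HOST, USER, PASSWORD = "mail.example.com", "user", "secret"  # placeholders

imap = imaplib.IMAP4_SSL(HOST)
imap.login(USER, PASSWORD)

# The folder hierarchy, which an IMAP migration can recreate
status, folders = imap.list()
for f in folders:
    print(f.decode())

# Per-message flags: \Seen maps to read/unread after migration,
# while \Flagged has no guaranteed equivalent
imap.select("INBOX")
status, data = imap.search(None, "ALL")
for num in data[0].split():
    status, flags = imap.fetch(num, "(FLAGS)")
    print(num.decode(), flags[0].decode())

imap.logout()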

Dropbox app with tiered users

Preface:
I'm hoping to upgrade an existing application by adding cloud backup and syncing of the customers' data. We want this to be as seamless as possible, and we want the application's front end to be the customer's only interface to the data.
Our application connects to the oil pipe of a machine and collects data on the oil condition. When a test has completed, we want to push the results to the cloud. Because of the distinct per-test nature of the data (as opposed to one big trend), most IoT platforms don't suit it very well, so we're aiming to release a slightly modified version of the application without the sensor connection; this will be our remote front end.
Since the existing application uses a relatively simple file structure to store its data, if we simply replicate those files in the cloud, the remote front-end version can just download them to the same location and it will work fine. This has led us to Dropbox (or any more appropriate cloud storage system you might recommend).
We hope to use the Dropbox API directly in our application to push and pull files as necessary. We believe everything so far is perfectly achievable.
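For what it's worth, the push/pull part looks straightforward with the official Python SDK. A rough sketch (the access token and paths are placeholders; in the final design the token would belong to the vendor's account, never the customer's personal Dropbox):

import dropbox

dbx = dropbox.Dropbox("ACCESS_TOKEN")  # placeholder vendor/app token

def push(local_path, remote_path):
    # Upload one data file, overwriting any previous version
    with open(local_path, "rb") as f:
        dbx.files_upload(f.read(), remote_path,
                         mode=dropbox.files.WriteMode.overwrite)

def pull(remote_path, local_path):
    # Download a file to the location the front end expects
    dbx.files_download_to_file(local_path, remote_path)

push("tests/run-042.dat", "/customer-123/tests/run-042.dat")
pull("/customer-123/tests/run-042.dat", "tests/run-042.dat")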
Question: Is it possible to set up a user system with the requirements below, and if so, how would we go about it?
The user's personal Dropbox is not used.
Dropbox is completely hidden from the user.
The application vendor has a top-level user with access to all data (for analytics; we do not want to store confidential or sensitive data).
When users log in, they only have access to their own folder, and an attacker could not disrupt the overall structure. (We understand that if an attacker got the master account then all is lost, but that is an internal issue of keeping it secure. As long as the user accounts are isolated, this is okay.)
Alternative question: Is anyone aware of a storage system or IoT platform which would better suit this use case? We will still require backups/loss prevention as part of the service.

Recommendations for multi-user Ionic/CouchDB app

I need to add multi-user capability to my single-page mobile app developed with Ionic 1, PouchDB and CouchDB. After reading many docs I am getting confused about what the best choice would be.
About my app:
it should be able to work offline, and then sync with the server when online (this is why I am using PouchDB and CouchDB, which are working great so far)
it should let the user create an account with a username and password, which would then be stored within the app so that he does not have to log in again whenever he launches the app. This account ensures his data is synced to the server in a secure place that other users cannot access.
currently there is no need to share information between users
Based on what I have read I am considering the following:
on the server, have one database per user, storing his own data
on the server, have a master database storing all the data of all users, plus the design docs. This makes it easy to change the design docs in a single place and have them replicated to each user database (and then into the PouchDB database in the app). The synchronization between the master and the user DBs is done through a filter, so that only the docs belonging to one user (through some userId field) are replicated to that user's database
use another module/plugin (SuperLogin? nolanlawson/pouchdb-authentication?) to manage the users from the app (user creation, login, logout, password reset, email notification for lost passwords, ...)
My questions:
do you think this architecture is appropriate, or do you have something better to recommend?
which software would you recommend for user management? SuperLogin looks great but needs to run on a separate HTTP server, making the architecture more complex. Does it automatically create a new database for each new user (I don't think so)? nolanlawson/pouchdb-authentication is client-only, but does it fit well with Ionic 1? Isn't there a LOT to develop around it that comes out of the box with SuperLogin? Do you have any other module in mind?
Many thanks in advance for your help!
This is an appropriate approach. The local PouchDBs will provide the data on the client side even if a client goes offline, and the combination with a central CouchDB server is a great way to keep data synchronized between server and clients.
Since you want to store the user's credentials, you will have to save this data somehow on the client side, which could be done in a separate PouchDB.
If you keep all of a user's data in a local PouchDB database and have one CouchDB database per user on the server, you can even omit the filter you mentioned, because the synchronization will only happen between these two user databases.
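To make the one-database-per-user setup concrete, here is roughly what it looks like against CouchDB's plain HTTP API. This is only a sketch (the server URL, credentials and the userdb- naming convention are placeholders), and SuperLogin, recommended below, automates all of it:

import requests

COUCH = "http://admin:secret@localhost:5984"  # placeholder admin URL

def create_user_db(username, password):
    # 1. Register the user in CouchDB's _users database
    requests.put(
        f"{COUCH}/_users/org.couchdb.user:{username}",
        json={"name": username, "password": password,
              "roles": [], "type": "user"},
    ).raise_for_status()

    # 2. Create that user's private database
    db = f"userdb-{username}"
    requests.put(f"{COUCH}/{db}").raise_for_status()

    # 3. Restrict the database to this user, so no replication
    #    filter is needed: the client just syncs the whole database
    requests.put(
        f"{COUCH}/{db}/_security",
        json={"admins": {"names": [], "roles": []},
              "members": {"names": [username], "roles": []}},
    ).raise_for_status()

create_user_db("alice", "hunter2")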
I recommend SuperLogin. Yes, you have to install NodeJS and some extra libraries (namely morgan, express, http, body-parser and cors), and you will have to open at least one new port on your server to provide this service. But SuperLogin is really powerful for managing user accounts and user databases on a CouchDB server.
For example, when a user registers, you just make a call to SuperLogin via http://server_address:port/auth/register, passing the username, password etc., and SuperLogin not only adds the new user to the user database, it also automatically creates a new database just for this user. Each user can have multiple databases (private or shared) and SuperLogin manages the access rights to all of them. Moreover, SuperLogin can also send confirmation emails or reset forgotten passwords (via an access token).
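From the app's point of view, that registration is a single HTTP POST. A sketch in Python for clarity (the port and the exact field names reflect SuperLogin's defaults as I remember them, so verify them against its docs; in the Ionic app you would make the same call with $http or fetch):

import requests

BASE = "http://server_address:3000/auth"  # placeholder host and port

resp = requests.post(BASE + "/register", json={
    "name": "Alice Example",
    "username": "alice",
    "email": "alice@example.com",
    "password": "hunter2",
    "confirmPassword": "hunter2",
})
resp.raise_for_status()
# On success, SuperLogin has created the user plus their private
# database(s); a follow-up POST to /auth/login returns the session
# token and the URLs of the databases the user may sync with.
print(resp.json())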
Sure, you will have to configure a lot (but hey, at least you have all these options), and maybe you will even have to write some additional API for functionality not covered by SuperLogin. But in general, SuperLogin saves a lot of pain in developing custom user management.
If you are unsure about the server configuration, though, a service such as Couchbase, Firebase etc. may be a better solution. These services also have some user-management capabilities, and you have to worry less about server security.

Need advice: How to share a potentially large report with remote users?

I am asking for advice on possibly better solutions for the part of the project I'm working on. I'll first give some background and then my current thoughts.
Background
Our clients can use my company's products to generate potentially large data sets for use in their industry. When the data sets are generated, the clients file a processing request with us.
We want to send the clients a summary email containing some statistical charts as well as sampling points from the data sets, so they can do some initial quality-control work. If the data sets are of bad quality, they don't need to file any request.
One problem is that the charts and sampling points can be too large to send in an email. The charts and sampling points we want to include are pictures. Although we can use a compressed format such as JPEG to save space, we cannot control how many data sets will be included in a summary email, so the total size could still exceed the normal email size limit.
In terms of technologies, we are mainly developing in Python on Ubuntu 14.04.
Goals of the Solution
In general, we want to present a report-like document that the clients can use for initial QA. The report may contain external links but does not need to be very interactive; in other words, a static report should be fine.
We want to reduce the steps our clients must take to read the report. For example, if the report can just be an email, the user only needs to 1) log in and 2) open the email. If they use a mail client, they may skip 1) and just open and read it.
We also want to minimize the burden of maintaining extra user accounts, for both us and our clients. For example, a solution that requires registering a new user account is still acceptable but not ranked very high.
Security is important, because our clients don't want their reports to be read by unauthorized third parties.
We want the process automated. The solution should provide a programming interface so that we can automate the report sending/sharing.
Performance is NOT a critical issue. Our user base is not large, at most in the hundreds, and they don't generate data frequently, at most once a week. We don't need real-time responses; even a delay of a few hours is acceptable.
My Current Thoughts of Solution
Possible solution #1: In-house web service. I can set up a server machine and develop our own web service. We put the reports into our database and the clients can then query them over the Internet.
Possible solution #2: Amazon Web Services. AWS is quite mature, but I'm not sure whether it would be expensive, since all we want is to share a report with our remote clients, which doesn't seem like a big enough job for AWS.
Possible solution #3: Google Drive. I know Google Drive provides an API for uploading and sharing programmatically, but I think we would need to register a dedicated Google account to use it.
Any better solutions?
You could use AWS S3 and CloudFront. Files can easily be loaded into S3 using the AWS SDKs and API. You can then use the API to generate secure links to the files that can only be opened for a specific time and, optionally, from a specific IP.
Files on S3 can also be cleaned up automatically after a specific time, if needed, using lifecycle rules.
Storage and transfer prices are fairly cheap with AWS, and remember that the S3 storage cost quoted is per month, so if you only keep an object for a few days, you only pay for those days.
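As a sketch of how little code this takes with boto3 (the bucket and key names below are made up): upload a report, hand out a link that expires after 72 hours, and let a lifecycle rule delete everything after a week.

import boto3

s3 = boto3.client("s3")
BUCKET = "example-client-reports"  # placeholder bucket

# Upload the generated report
s3.upload_file("report.pdf", BUCKET, "client-42/report.pdf")

# Generate a link that stops working after 72 hours
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": BUCKET, "Key": "client-42/report.pdf"},
    ExpiresIn=72 * 3600,
)
print(url)  # email this link to the client

# One-time bucket setup: expire all objects after 7 days
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={"Rules": [{
        "ID": "expire-reports",
        "Status": "Enabled",
        "Filter": {"Prefix": ""},
        "Expiration": {"Days": 7},
    }]},
)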
S3: http://aws.amazon.com/s3/pricing
Cloudfront: https://aws.amazon.com/cloudfront/pricing/
Here's a list of the SDKs for AWS:
https://aws.amazon.com/tools/#sdk
Or you can use their command-line tools for Windows batch or PowerShell scripting:
https://aws.amazon.com/tools/#cli
Here's some info on how the private content URLs are created:
http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/PrivateContent.html
I suggest building this service using a mix of your #1 and #2 options. You can do the processing yourself and leverage AWS S3, which is quite cheap, for transferring the data.
Example: 100 GB costs approximately $3 per month.
AWS S3 is also beneficial for disaster recovery: whatever happens to your local environment, your data will be safe in S3.
For security you can leverage data encryption and signed URLs in AWS S3.
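For instance, with boto3 you can have S3 encrypt the object at rest at upload time (the bucket and key below are placeholders):

import boto3

s3 = boto3.client("s3")

# Server-side encryption with S3-managed keys (SSE-S3);
# pass "aws:kms" instead to encrypt with a KMS key
with open("report.pdf", "rb") as f:
    s3.put_object(
        Bucket="example-client-reports",  # placeholder
        Key="client-42/report.pdf",
        Body=f,
        ServerSideEncryption="AES256",
    )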

XMPP server and roster issue

I am working on a Jabber chat application that uses an XMPP server.
I want to make two users friends, so I have to add roster entries using MySQL queries.
I have made entries in two tables: (1) ofRoster and (2) ofRosterGroups.
I inserted rows into both tables, but it's not working.
Is there anything I am missing?
I can do this from the admin panel, but I don't want to do that.
I think you are using Openfire (those tables look like the Openfire schema). If so, the table you have to edit is ofGroupUser. To add a user to a group, you insert a row where groupName is the group you want to add the user to, username is the user you are adding, and administrator is a flag for that user's authority (just use 0). An example insert would look like this:
INSERT INTO ofGroupUser (groupName, username, administrator) VALUES ('group name', 'user', 0);
However, as mentioned in the post above, this is not a good method, because it will not immediately affect the server. You must restart the server for these changes to take effect, because Openfire (or whatever server you are using) probably only reads the database on startup. Once it has cached everything, it will write to the database in response to requests (like adding users or groups through the admin console), but it will not read from it again, so your additions will not be seen until a server restart occurs.
Basically, manual SQL inserts will produce the desired results and, if you are just testing some functionality, will work just fine as long as you restart the server. If you are using Openfire and need to do group administration outside the web UI, I would look into using a different server; as far as I know, Openfire isn't great for administration outside of its web UI. Here is a list of many open-source XMPP servers. I'd recommend ejabberd (as mentioned in the post above): it has a very nice control tool called ejabberdctl, with an available expansion module called mod_ctlextra (here is the man page listing its commands) that will let you do what I assume you are trying to do. Then you don't have to worry about SQL and restarting; just use their tool, which is how it should be.
Also, on a side note, ejabberd is extremely efficient thanks to the language it is written in: Erlang. Great stuff.
Hope that helps!
Presumably you are using the ODBC modules with ejabberd. The SQL schema, though, defines two tables, rostergroups and rosterusers, not the ones you mention in the question. In any case, you should not update the tables directly: ejabberd keeps internal state and does not get notified of your changes.
The way to go is to actually have the users send mutual subscriptions and accept them, as per the RFC. Roster Item Exchange (XEP-0144) might also be useful.
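To make the subscription flow concrete, here is a minimal sketch using the slixmpp library (the JIDs and password are placeholders). This side sends the subscribe request; accepting the contact's reciprocal request works the same way with ptype="subscribed":

import slixmpp

class FriendBot(slixmpp.ClientXMPP):
    # Logs in as one user and requests a presence subscription

    def __init__(self, jid, password, friend_jid):
        super().__init__(jid, password)
        self.friend_jid = friend_jid
        self.add_event_handler("session_start", self.start)

    async def start(self, event):
        self.send_presence()
        await self.get_roster()
        # Ask the contact to share presence (RFC 6121 subscriptions);
        # the server maintains the roster tables itself
        self.send_presence(pto=self.friend_jid, ptype="subscribe")

bot = FriendBot("alice@example.com", "secret", "bob@example.com")
bot.connect()
bot.process(forever=False)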

Making a fax accessible from a ColdFusion Web App

We're programming a testing web application for a university, in ColdFusion with an MS SQL backend.
Right now we have to manually take faxes sent to our fax machine, find the account they relate to, and input the info (and the actual fax has to be found in a filing cabinet if we ever need to reference it again). What I would like to do is create a way for someone to fax a certain number and have the fax sent to an email account we specify.
If that worked properly, we would need a way to fetch the email, store it somewhere on our servers, and link it to an account. The linking process would probably have to be manual, and we are OK with that. What we are mainly looking for is an easy way to view all the faxes sent to that email address, in PDF form and searchable by the name we assign, within our ColdFusion application, so that we don't have to receive the faxes on paper and file them by hand.
Is there a way to accomplish this? Preferably not through a paid service, as we can program almost anything we need ourselves.
Hmm... have you tried services like eFax?
Why reinvent the wheel? Services like eFax and jConnect (there are several others; just Google "electronic fax service") are affordable and do half of what you are trying to do. Save yourself the effort and just spend a few bucks. You'll probably also find that paying for the service costs less than paying a developer to write the software.
So after you bite the bullet and sign up for an electronic faxing service, you just need an email account for it to send to, and CFPOP to check the inbox and download the attachments. The rest is a piece of cake.
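Purely for illustration (in CF this would be a CFPOP loop), the poll-and-save logic looks like this in Python, with a placeholder mailbox:

import poplib
from email import message_from_bytes

pop = poplib.POP3_SSL("mail.example.edu")  # placeholder mailbox
pop.user("faxes@example.edu")
pop.pass_("secret")

num_messages = len(pop.list()[1])
for i in range(1, num_messages + 1):
    raw = b"\r\n".join(pop.retr(i)[1])
    msg = message_from_bytes(raw)
    # Save every PDF attachment; the fax service sends one per fax
    for part in msg.walk():
        filename = part.get_filename()
        if filename and filename.lower().endswith(".pdf"):
            with open(filename, "wb") as f:
                f.write(part.get_payload(decode=True))
    pop.dele(i)  # mark the processed message for deletion

pop.quit()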
From the sounds of it, I have built something nearly identical to this faxing setup in ColdFusion.
After a few trials and errors, I found the best way to go is:
1) DIGITIZE INCOMING FAXES: Have all faxes sent either to an email address you can check via CF, or to a network folder, which you can also check with CF. You can absolutely keep your existing fax number and simply call-forward incoming calls to your digital fax number.
2) PROCESS INCOMING FAXES: When you find a new fax, process it and make a record of it. I store things like the file name, dig up the fax number it came from, check it against a list of known numbers, and consult a routing table (in case it needs to go to someone).
3) PRINT AND ROUTE FAX: Auto-printing a document is possible via CF as well.
As for tables, I keep one to store each fax, with the fax itself in a blob; it's easy to replicate and move around, with no big performance hit. I keep another table of incoming-number profiles (like a caller-ID table) to relate each number to a customer, and a table of routing rules (if an email comes from here, send it there). Last but not least, if you have to manage multiple phone numbers, you can create multiple incoming profiles and file the faxes accordingly.
Once you have each fax stored in the DB, you can do a lot with it and file/index/store it digitally however you like. CFDOCUMENT will display disk-based PDFs.
I ended up having to program something like this for custom routing options. With CF it is also possible to auto-link items to certain files/folders/projects if you like.
If you need to know anything else, ask, or we can discuss it offline if you need to keep some details private.
Agree with Adam. Don't create a bunch of problems for yourself; you'll save a lot of money and nerves by just using an existing service.
On the topic: I use Popfax and I kind of like it. It's comfortable, and it gives you discounts, contests and a lot of other stuff you might like. It's cheap (certainly far cheaper than writing your own software) and you can use it not only on a PC but also from a mobile phone.