I have to deploy Odoo with a PostgreSQL database on the Amazon cloud. That I can do by simply setting up an EC2 server and installing Odoo on it. In case the internet is down, I want to be able to access the same services and the already saved data offline as well. For that I plan to install Odoo with a PostgreSQL database on my local machine (in my office) too. Then I can access Odoo services from anywhere (via the cloud) when internet is available, but if the internet is down, I must still be able to use the locally installed Odoo to get the same services. For that purpose I need two things:
1. Both databases should be exact replicas of each other.
2. On recovering from an internet outage, the changes made in my local (office) database should be reflected in the database on the Amazon cloud.
I am new to this stuff, so kindly suggest the best possible approach (architecture) for this scenario.
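To illustrate the failover I have in mind, here is a rough sketch of the application-side behavior (hostnames and credentials are placeholders, not a real setup): try the cloud database first, fall back to the office machine.

```python
import psycopg2

# Hypothetical connection targets: the cloud instance is preferred,
# the office machine is the offline fallback.
DB_TARGETS = [
    {"host": "cloud-db.example.com", "dbname": "odoo", "user": "odoo", "password": "secret"},
    {"host": "192.168.1.10",         "dbname": "odoo", "user": "odoo", "password": "secret"},
]

def connect_with_fallback():
    """Try the cloud database first; fall back to the local replica."""
    last_error = None
    for target in DB_TARGETS:
        try:
            return psycopg2.connect(connect_timeout=3, **target)
        except psycopg2.OperationalError as err:
            last_error = err
    raise last_error

conn = connect_with_fallback()
```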
Related
I need to set up a real-time sync process between an on-premises PostgreSQL instance and a cloud PostgreSQL instance. Please let me know all the options available through which I can achieve this.
Do I have to use any specific tool, or can it be managed through replication?
Please advise.
Use PgPool
http://www.pgpool.net/mediawiki/index.php/Main_Page
from their web page:
pgpool-II can manage multiple PostgreSQL servers. Using the replication function enables creating a realtime backup on 2 or more physical disks, so that the service can continue without stopping servers in case of a disk failure.
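For what it's worth, the application side does not change: pgpool-II speaks the PostgreSQL protocol, so you just point your connection at pgpool (port 9999 by default) instead of at a single database server. A minimal sketch with psycopg2 and a hypothetical pgpool host:

```python
import psycopg2

# pgpool-II listens like an ordinary PostgreSQL server (port 9999 by default),
# so application code stays the same; in replication mode pgpool forwards
# the statements to every configured backend.
conn = psycopg2.connect(
    host="pgpool.example.com",  # hypothetical pgpool host
    port=9999,
    dbname="odoo",
    user="odoo",
    password="secret",
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone())
```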
We have set up a community PostgreSQL service on Cloud Foundry (IBM Bluemix). This is a free service and no automated backup and recovery is supported out of the box.
Is there a way to set up a standby server or a regular backup in case there is any data corruption/failure?
IBM Compose and ElephantSQL can provide this service at a cost, but we are not ready for that yet.
PostgreSQL is an experimental service here, and there is no dashboard or other advanced features (daily backup, for example) that you can find in the other services you mentioned. If you want a backup you could write an ad-hoc script that saves/exports all tables as you want, and run it every day.
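A minimal sketch of such an ad-hoc script, assuming pg_dump is installed and using a hypothetical connection URI (on Bluemix you would read the real credentials from VCAP_SERVICES); schedule it with cron to run daily:

```python
import datetime
import subprocess

# Hypothetical connection URI; in practice take the credentials
# from the bound service (VCAP_SERVICES).
DB_URI = "postgres://user:secret@host:5432/mydb"

def nightly_dump():
    """Export the whole database to a timestamped custom-format archive."""
    outfile = datetime.date.today().strftime("backup-%Y-%m-%d.dump")
    subprocess.run(
        ["pg_dump", "--format=custom", "--file", outfile, DB_URI],
        check=True,
    )
    return outfile

if __name__ == "__main__":
    print("wrote", nightly_dump())
```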
If you need PostgreSQL you can create a PostgreSQL by Compose service ($17.50/mo for the first GB and $12 per extra GB).
We used PostgreSQL Studio and deployed it on IBM Bluemix. The database service was connected to the pgstudio interface (this restricts access to only the connected databases). We also had to make minor changes to pgstudio so that we could use pg_dump with the interface.
The result: we could manually dump the data. This solution works well, as we could take regular dumps (though manually).
In the free tier you are right in saying that you can't get backups. Those features are available only in the Compose for PostgreSQL service, but that's a paid service.
We're about to dive into Odoo (OpenERP). We're planning on using Amazon EC2 for the actual installation and putting the PostgreSQL database server on Amazon RDS (like this guide: http://toolkt.com/site/install-openerp-server-and-postgresql-on-separate-servers/ ).
If the RDS instance is only allowed to talk to the EC2 server, does this mitigate any security issues compared to a regular Odoo installation (where the database and the front-facing webserver are on the same machine)? Is this an advisable setup?
The input data in your post is too vague to give you an exact answer, but you may consider the following:
RDS can talk to EC2 or any other clients and application servers; connectivity only depends on your configuration. You can configure a VPC and configure/restrict access to your database and application servers there (see the sketch below).
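As an example, a hedged boto3 sketch of the kind of restriction meant here: the RDS security group accepts PostgreSQL connections only from instances carrying the application server's security group (both group IDs are placeholders):

```python
import boto3

ec2 = boto3.client("ec2")

# Hypothetical security group IDs: one guards the RDS instance,
# the other is attached to the EC2 application server.
RDS_SG = "sg-0123456789abcdef0"
APP_SG = "sg-0fedcba9876543210"

# Allow PostgreSQL traffic (port 5432) into the RDS security group
# only from instances that carry the application security group.
ec2.authorize_security_group_ingress(
    GroupId=RDS_SG,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 5432,
        "ToPort": 5432,
        "UserIdGroupPairs": [{"GroupId": APP_SG}],
    }],
)
```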
Depending on the size of your system (in terms of I/O, number of users, etc.), you may want to configure separate database instances and application servers. At scale this separation is important.
In short: neither any disadvantages nor any security issues.
In detail, for Odoo with AWS EC2: we, the SnippetBucket.com team, have already implemented RDS and know Odoo security well.
RDS is quite expensive.
Make RDS private instead of public in AWS; that makes it completely secured. In addition, AWS security groups give extra protection with inbound and outbound port rules. Totally safe.
Note: AWS "RDS Aurora-Postgresql" is 4X faster than official postgresql. AWS RDS support specific versions by AWS.
I have read that you can replicate a Cloud SQL database to MySQL. Instead, I want to replicate from a MySQL database (that the business uses to keep inventory) to Cloud SQL so it can have up-to-date inventory levels for use on a web site.
Is it possible to replicate MySQL to Cloud SQL? If so, how do I configure that?
This is something that is not yet possible in CloudSQL.
I'm using DBSync to do it, and it's working fine.
http://dbconvert.com/mysql.php
The Sync version does what you want.
It works well with App Engine and Cloud SQL. You must authorize external connections first.
This is a rather old question, but it might be worth noting that this seems now possible by Configuring External Masters.
The high level steps are:
1. Create a dump of the data from the master and upload the file to a storage bucket.
2. Create a master instance in Cloud SQL.
3. Set up a replica of that instance, using the external master's IP, username and password. Also provide the dump file location.
4. Set up additional replicas if needed.
Voilà!
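A hedged sketch of step 1 (dump and upload), using mysqldump via Python and the google-cloud-storage client; host, credentials, database and bucket names are all placeholders:

```python
import subprocess
from google.cloud import storage  # pip install google-cloud-storage

DUMP_FILE = "inventory.sql"

# Step 1a: dump the external (on-premises) MySQL master.
# --master-data records the binlog coordinates the replica will start from.
subprocess.run(
    ["mysqldump", "--host=10.0.0.5", "--user=repl", "--password=secret",
     "--single-transaction", "--master-data=2", "--databases", "inventory",
     f"--result-file={DUMP_FILE}"],
    check=True,
)

# Step 1b: upload the dump to a Cloud Storage bucket so Cloud SQL can read it.
client = storage.Client()
bucket = client.bucket("my-sql-dumps")
bucket.blob(DUMP_FILE).upload_from_filename(DUMP_FILE)
```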
Is it possible to have an app running on AWS EC2 and have its database running on Heroku's Postgres?
In case it is, what are the downsides I should consider?
Since Heroku is hosted on AWS, is there a way to know the location of the machine running my database?
Would hosting my app in the same region as the database help to keep the performance up?
I would like to hear some opinions about this, I've been searching the topic without much success.
You can determine the public-facing location of your Heroku DB at any given time with a traceroute ... but there's no guarantee that it'll stay at that location, or that there isn't any internal re-routing going on. You'd probably want to speak directly with Heroku support about ways to make sure your Heroku DB instances are local to your AWS application instances, as that certainly would benefit performance. See if you can find out which availability zone, or at least which major region, they run the DB in, and whether you can "pin" your database instance to a given region/zone.
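One way to at least guess the region: AWS publishes its public IP ranges together with the region for each prefix, so you can resolve the DB hostname and look it up. A sketch (the hostname is a placeholder taken from a DATABASE_URL):

```python
import ipaddress
import json
import socket
import urllib.request

# Hypothetical Heroku Postgres hostname taken from the DATABASE_URL.
DB_HOST = "ec2-12-34-56-78.compute-1.amazonaws.com"

# AWS publishes its public IP ranges with the region for each prefix.
RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

ip = ipaddress.ip_address(socket.gethostbyname(DB_HOST))
with urllib.request.urlopen(RANGES_URL) as resp:
    prefixes = json.load(resp)["prefixes"]

for entry in prefixes:
    if ip in ipaddress.ip_network(entry["ip_prefix"]):
        print(DB_HOST, "->", entry["region"], entry["service"])
        break
```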
Amazon's RDS looks OK, but doesn't support PostgreSQL. Please keep nagging them to.
I'd probably just run the DB on AWS if performance wasn't particularly important. Use a RAID 10 of provisioned-IOPS EBS volumes on an EBS-optimized instance and you'll get kind-of-OK performance (but at a really big price); alternatively, you can use non-crash-safe SSD-based instance-store servers and rely on replication and backups to keep your data safe.
I don't have any experience with Heroku PostgreSQL.
Generally of course you can run your own service on Amazon EC2 and use the managed database services of Heroku.
Downsides might be:
nobody guarantees that Heroku exclusively uses AWS, and you probably can't determine the physical Heroku service location within the cloud, so you will have to deal with network latencies (a quick way to measure them follows below)
in addition to your external traffic fees, you'll have to pay for the database traffic unless you talk to a server in the same availability zone in the same region
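If you want to quantify the latency point before committing, a quick probe like this (the DSN is a placeholder) measures the round trip of a trivial query with psycopg2:

```python
import time
import psycopg2

# Hypothetical DSN copied from the Heroku DATABASE_URL config var.
DSN = "postgres://user:secret@ec2-12-34-56-78.compute-1.amazonaws.com:5432/app"

conn = psycopg2.connect(DSN)
with conn.cursor() as cur:
    samples = []
    for _ in range(10):
        start = time.perf_counter()
        cur.execute("SELECT 1;")
        cur.fetchone()
        samples.append((time.perf_counter() - start) * 1000)
print(f"round-trip latency: min {min(samples):.1f} ms, max {max(samples):.1f} ms")
```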
My suggestion (without knowing any details about the pros of Heroku):
Have a look at Amazon RDS if you don't want to run a database server on your own.
http://aws.amazon.com/de/rds/
I have been operating around 70 server instances on AWS, both RDS and EC2, for more than a year now, and I can't imagine any simpler way to keep your stuff running.
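If you go that route, spinning up an instance is a one-call affair; a hedged boto3 sketch with placeholder parameters (MySQL engine, since RDS does not offer PostgreSQL as of this writing):

```python
import boto3

rds = boto3.client("rds")

# Hypothetical parameters for a small managed database instance.
rds.create_db_instance(
    DBInstanceIdentifier="app-db",
    DBInstanceClass="db.t3.micro",
    Engine="mysql",
    MasterUsername="admin",
    MasterUserPassword="change-me",
    AllocatedStorage=20,
)
```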