I have been following the Flask book by Miguel Grinberg and I am thinking about how to deploy my app and use the Postgres DB.
In my local production config I have to manually go in and run Role.insert_roles() before any roles can be assigned.
How do I do this in Heroku with postgres? In fact, how do you connect to the postgres db? It is not really clear where in the code postgres takes over using the environment variable:
https://github.com/miguelgrinberg/flasky/blob/master/config.py
I have a feeling my app is just running sqlite and the book isn't really clear on how to switch over.
SOLUTION:
If you have deployed to Heroku and you have not set these environment variables:
DATABASE_URL (which the config maps to SQLALCHEMY_DATABASE_URI)
FLASK_CONFIG = heroku
FLASKY_ADMIN = your email
and then ran in your shell:
heroku run python manage.py shell
db.create_all()
db.session.commit()
Role.insert_roles()
then you are probably still running the development config with the SQLite database!
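To see where Postgres actually takes over, look at how the linked config.py builds SQLALCHEMY_DATABASE_URI: the production/Heroku configs read Heroku's DATABASE_URL environment variable and only fall back to SQLite when it is missing. Roughly (a simplified sketch of the repo's layout, not a verbatim copy):

import os

basedir = os.path.abspath(os.path.dirname(__file__))

class Config:
    # common settings (secret key, mail, FLASKY_ADMIN, ...) live here
    pass

class ProductionConfig(Config):
    # Heroku injects DATABASE_URL when the Postgres add-on is attached;
    # if it is missing, the app silently falls back to a local SQLite file.
    SQLALCHEMY_DATABASE_URI = os.environ.get('DATABASE_URL') or \
        'sqlite:///' + os.path.join(basedir, 'data.sqlite')

class HerokuConfig(ProductionConfig):
    # Heroku-specific tweaks (logging to stderr, proxy headers) go here
    pass

config = {
    'production': ProductionConfig,
    'heroku': HerokuConfig,
    'default': ProductionConfig,  # the stock repo defaults to the development config
}

So setting FLASK_CONFIG=heroku (or production) plus a provisioned Postgres add-on is what flips the app off SQLite.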
If you want to connect manually, you could use the psycopg library (http://initd.org/psycopg/) directly. Flask-SQLAlchemy itself uses psycopg under the hood, so in your case it may be easier to keep using SQLAlchemy.
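For completeness, a direct psycopg2 connection would look roughly like this (a sketch only, assuming Heroku's DATABASE_URL config var is set; psycopg2 accepts the URL as a DSN):

import os
import psycopg2

# Connect straight to Heroku Postgres using the URL Heroku provides;
# sslmode='require' because Heroku Postgres enforces SSL connections.
conn = psycopg2.connect(os.environ['DATABASE_URL'], sslmode='require')
with conn.cursor() as cur:
    cur.execute('SELECT version();')
    print(cur.fetchone())
conn.close()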
I had the same problem and solved it as follows:
1. provision a db:
$: heroku addons:create heroku-postgresql:hobby-dev
......
add DATABASE_URL:
$: heroku config -s | grep HEROKU_POSTGRESQL
(this then shows HEROKU_POSTGRESQL_RED_URL=.....)
$: heroku pg:promote HEROKU_POSTGRESQL_RED_URL
2. install the psycopg2 package
3. change config.py:
config = {
    .....
    'default': ProductionConfig
}
Related
I have recently started working on an existing Heroku environment.
How can I tell if there are database backups scheduled?
Assuming you are using Heroku Postgres, you can view backup schedules with the following command:
heroku pg:backups:schedules
You might have to provide the --app argument so Heroku knows which app you're interested in.
I developed a FastAPI app in a virtual environment using an SQLite database, but deployed it on Heroku with a PostgreSQL database as suggested in the tutorial. Although it worked on my PC, adding the PostgreSQL add-on and replacing the value of SQLALCHEMY_DATABASE_URL in database.py broke everything. Note that I've properly frozen the dependencies in the requirements.txt file, yet I can't figure out what went wrong.
For further clarification, I've pushed my code to GitHub & it can be accessed at this repository - Self_calculation.
If you are using the Postgres add-on on Heroku, your solution is probably simple.
Use os.environ to get the connection parameters; don't try to hard-code the connection details. This is Heroku's recommended approach for Heroku Postgres:
import os
DATABASE_URL = os.environ.get('DATABASE_URL')
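One common gotcha worth checking (an assumption about what broke here, not something the question confirms): Heroku's DATABASE_URL still uses the legacy postgres:// scheme, which SQLAlchemy 1.4+ rejects, so many apps rewrite it before creating the engine:

import os

# Hypothetical fallback for local development when DATABASE_URL is unset.
DATABASE_URL = os.environ.get('DATABASE_URL', 'sqlite:///./app.db')

# SQLAlchemy 1.4+ only accepts 'postgresql://', while Heroku still hands
# out 'postgres://', so rewrite the scheme before creating the engine.
if DATABASE_URL.startswith('postgres://'):
    DATABASE_URL = DATABASE_URL.replace('postgres://', 'postgresql://', 1)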
I just needed to run this:
1) heroku git:remote -a my_heroku_app_name
2) heroku logs --tail
After that I could see my problem.
In my case I had forgotten to change the Postgres URL in the alembic.ini file.
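If you would rather not hard-code the URL in alembic.ini at all, one option (a sketch of a common pattern, not part of the original answer) is to have Alembic's env.py read it from the same DATABASE_URL environment variable:

import os
from alembic import context

# In migrations/env.py: override sqlalchemy.url from alembic.ini with the
# environment value, so local and Heroku runs pick up the right database.
config = context.config
db_url = os.environ.get('DATABASE_URL')
if db_url:
    config.set_main_option('sqlalchemy.url', db_url)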
I have a Django project that I was initially running on PythonAnywhere using a Postgres database. However, PythonAnywhere doesn't support ASGI, so I am migrating the project over to Heroku. Since Heroku either strongly prefers or mandates use of its own Postgres offering, I need to migrate the data from ElephantSQL. However, I'm really coming up short on how: I tried running pg_dumpall, but it didn't seem to work and/or the file disappeared into the ether. I'm at a loss for how to make this migration. If anyone could help, I'd appreciate it so much. T-T
After hours of searching and scouring Heroku's documentation, I found it by running heroku pg:push --help.
For a locally running server, run
heroku pg:push '<db_name>' <heroku_db_name> --app <app_name>
For a hosted one, run
heroku pg:push <postgres_link> <heroku_db_name> --app <app_name>
Currently we have everything in one single Docker container for our production GitLab, where we are using the bundled Postgres and Redis. We want to use an external Postgres DB and a separate container for Redis as well, to follow production standards.
How can I migrate from the internal Postgres DB to an external Postgres DB? If anyone can provide the process and steps, that would be really helpful. We are new to this process.
Thank you everyone for your inputs,
PRS
You can follow the article "Migrating GitLab from internal to external PostgreSQL", which involves:
a database dump/reload, using pg_dumpall
sudo -u gitlab-psql /opt/gitlab/embedded/bin/pg_dumpall \
--username=gitlab-psql --host=/var/opt/gitlab/postgresql > /var/lib/pgsql/database.sql
sudo -u postgres psql -f /var/lib/pgsql/database.sql
Note: you can also use a backup of the database, but only if the external PostgreSQL version matches the embedded one exactly.
setting its password
sudo -u postgres psql -c "ALTER USER gitlab ENCRYPTED PASSWORD '***' VALID UNTIL 'infinity';"
and modifying the GitLab configuration, that is:
# Disable the built-in Postgres
postgresql['enable'] = false
# Fill in the connection details
gitlab_rails['db_adapter'] = 'postgresql'
gitlab_rails['db_encoding'] = 'utf8'
gitlab_rails['db_host'] = '127.0.0.1'
gitlab_rails['db_port'] = 5432
gitlab_rails['db_database'] = "gitlabhq_production"
gitlab_rails['db_username'] = 'gitlab'
gitlab_rails['db_password'] = '***'
apply your changes:
gitlab-ctl reconfigure && gitlab-ctl restart
@VonC
Hi, let me know about the process I have done below.
We currently have a single all-in-one Docker GitLab container which uses the bundled Postgres and Redis. To follow production standards we are looking to maintain separate Postgres and Redis instances for our prod GitLab. We already have data in the bundled DB, so we took a backup of the current GitLab with the bundled Postgres, which generated a .tar file. Next we changed gitlab.rb to point to the external Postgres DB (same version); after that we were able to connect to GitLab but didn't see any data, because it is a fresh DB. Later we did the restore using the external Postgres DB, and now we can see all the data.
Can we do it this way? Now our GitLab is attached to the external Postgres and I can see all the restored data. Will this process work? Any downsides?
How is this process different from pg_dump and import?
I have an app which is hosted on Heroku as well as on a VPS. I am using MongoDB on this app with Mongoid. I want to know the correct way to define the database connection in mongoid.yml so that the same file works on both Heroku and the VPS.
Heroku expects MONGOHQ_URL in the production environment, while my VPS needs the default DB mapping:
Correct for heroku:
production:
uri: <%= ENV['MONGOHQ_URL'] %>
Correct for VPS:
production:
<<: *defaults
host: localhost
database: grbr_production
Pushing mongoid.yml separately for Heroku and the VPS is a real pain. Is there a way I can create one unified entry that works for both?
It might be easiest to just add a MONGOHQ_URL environment variable to the VPS that points to the mongodb instance on the localhost:
mongodb://localhost/grbr_production
What the Heroku instance expects is for the environment variable MONGOHQ_URL to be defined. The variable name could be anything, and if you are not using MongoHQ, it makes sense to rename it.
To answer your question, you could have the following config file for both:
# mongoid config file
production:
uri: <%= ENV['MONGODB_URI'] %>
Then on your VPS, assuming a bash environment:
export MONGODB_URI="mongodb://username:password@localhost:10010/db-name"
Make sure to change all values to the appropriate ones.
You can either run that in the console, or better yet, add it to the ~/.bashrc file of the user running the mongodb instance so that it persists on restarts.
Then on Heroku you define it using the Heroku Toolbelt command:
heroku config:set MONGODB_URI="mongodb://username:password@VPS-IP-ADDRESS:10010/db-name"