sqlalchemy NoSuchTableError when table exists in database - postgresql

I am using PostgreSQL + Psycopg2 with SQLAlchemy. I've already created my database "new_db" using the pgAdmin III tool, and a new schema in it called "new_db_schema". Under this schema I have all the tables I need. My code looks like this:
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
from sqlalchemy import Column, String, Integer, Boolean

engine_text = 'postgresql+psycopg2://root:12345@localhost:5432/db_name'
database_engine = create_engine(engine_text, echo=True)

Base = declarative_base(database_engine)

class Some_table(Base):
    __tablename__ = 'some_table'  # This table exists in the database.
    __table_args__ = {'autoload': True}

    col1 = Column(String, primary_key=True)
    col2 = Column(Boolean)

    def __init__(self, col1_name, col2_name):
        self.col1 = col1_name
        self.col2 = col2_name

if __name__ == "__main__":
    a = Some_table('blah', 'blah')
Now when I try to run this code, I get sqlalchemy.exc.NoSuchTableError: some_table.
Since I already have the database set up with all the tables, I would like to autoload them while creating the classes. Am I missing something here? I need to write such classes for all the tables present in the database. Any help would be greatly appreciated.
Thanks

You can either:
Use schema-qualified table names: __tablename__ = 'new_db_schema.some_table'. You have to use them everywhere: in string arguments to ForeignKey, etc.
Alter the search_path in the database: SET search_path TO new_db_schema;. This SQL command has session scope, so you have to issue it at the start of every connection, using the SQLAlchemy event system.
Like this:
from sqlalchemy import event

def init_search_path(connection, conn_record):
    cursor = connection.cursor()
    try:
        cursor.execute('SET search_path TO new_db_schema;')
    finally:
        cursor.close()

engine = create_engine('postgresql+psycopg2://...')
event.listen(engine, 'connect', init_search_path)
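For the first option, you can alternatively keep __tablename__ unqualified and pass the schema through __table_args__, so that reflection looks in the right schema. A minimal sketch along the lines of the question's code (the URL is illustrative, and autoload relies on the engine the Base is bound to):

from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('postgresql+psycopg2://root:12345@localhost:5432/new_db')
Base = declarative_base(engine)

class SomeTable(Base):
    __tablename__ = 'some_table'
    # 'schema' qualifies the table; 'autoload' reflects its columns.
    __table_args__ = {'schema': 'new_db_schema', 'autoload': True}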


How can I listen for the creation of a specific model and create a new one (on a different table) based on this?

I have a User model with a referral_key attribute. I'd like to create a ReferralKeyRecord upon creation of a user. I've read tons of documentation and StackExchange to no avail.
This answer uses after_insert(), but I am not trying to alter or validate the class being inserted; I am trying to add a new object from a completely different model, and session.add() isn't supported there.
This answer is closer to what I want, but the accepted answer (ultimately) uses after_flush(), which is far too general. I don't want to listen to events thrown whenever the DB is updated somehow. I want it to fire off when a specific model is created.
And something like the following...
@event.listens_for(User, 'after_flush')
def create_referral_record(mapper, connection, target):
    session.add(ReferralRecord(key=target.referral_key))
    session.commit()
... results in No such event 'after_flush' for target '<class 'models.User'>'. I've looked through SQLAlchemy's documentation (the core events and the ORM events) and see no event that indicates a specific model has been created. The closest thing in the Mapper events is after_insert, and the closest thing in the Session events is after_flush(). I imagine this is a pretty common thing to need, so I'd be surprised if there weren't an easy event to listen to. I assume it'd be something like:
@event.listens_for(User, 'on_creation')
def create_referral_record(session, instance):
    record = ReferralRecord(key=instance.referral_key)
    session.add(record)
    session.commit()
Does anyone know better than I?
Or why not create the Referral inside the User constructor?
from sqlalchemy.orm import Session, relationship
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, ForeignKey, create_engine

Base = declarative_base()

class User(Base):
    __tablename__ = 'user'

    id = Column(Integer(), primary_key=True)
    referral = relationship('Referral', uselist=False)

    def __init__(self):
        self.referral = Referral()

class Referral(Base):
    __tablename__ = 'referral'

    id = Column(Integer(), primary_key=True)
    user_id = Column(Integer(), ForeignKey('user.id'), nullable=False)

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)

session = Session(bind=engine)
session.add(User())
session.commit()

print(session.query(User).all())
print(session.query(Referral).all())
You can use the after_flush session event, and inside the event handler you can access the session's new objects (using session.new).
Example:
from sqlalchemy.orm import Session
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, ForeignKey, create_engine, event

Base = declarative_base()

class User(Base):
    __tablename__ = 'user'
    id = Column(Integer(), primary_key=True)

class Referral(Base):
    __tablename__ = 'referral'
    id = Column(Integer(), primary_key=True)
    user_id = Column(Integer(), ForeignKey('user.id'), nullable=False)

engine = create_engine('sqlite:///:memory:')
Base.metadata.create_all(engine)
session = Session(bind=engine)

@event.listens_for(session, 'after_flush')
def session_after_flush(session, flush_context):
    # session.new holds the objects created in this flush; at this point
    # they have been inserted, so their primary keys are populated.
    for obj in session.new:
        if isinstance(obj, User):
            session.add(Referral(user_id=obj.id))

session.add(User())
session.commit()

print(session.query(User).all())
print(session.query(Referral).all())
Running this outputs:
[<__main__.User object at 0x00000203ABDF5400>]
[<__main__.Referral object at 0x00000203ABDF5710>]
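If you want the hook tied to the User model itself rather than to the whole session, the mapper-level after_insert event the question mentions can also work; the catch is that you must not call session.add() inside it, but you can issue a Core insert on the connection. A sketch reusing the models above:

@event.listens_for(User, 'after_insert')
def create_referral(mapper, connection, target):
    # Mapper events run mid-flush, so don't touch the Session here;
    # emit the INSERT directly on the connection the flush is using.
    connection.execute(
        Referral.__table__.insert().values(user_id=target.id)
    )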

How to add django-mptt rebuild to migration?

I have added django-mptt to an existing database and created a new migration.
The migration process asked for default values for the level, left, right and similar fields, but it didn't add the model rebuild operation to the migration file.
How can I add the rebuild operation to the migration file manually?
Try the following:
from __future__ import unicode_literals

from django.db import migrations
from mptt import register, managers

def rebuild_tree(apps, schema_editor):
    YourMPTTModel = apps.get_model('your_app', 'YourMPTTModel')
    # Historical models from apps.get_model() don't carry the MPTT
    # machinery, so re-register the model and attach a TreeManager.
    manager = managers.TreeManager()
    manager.model = YourMPTTModel
    register(YourMPTTModel)
    manager.contribute_to_class(YourMPTTModel, 'objects')
    manager.rebuild()

class Migration(migrations.Migration):

    operations = [
        migrations.RunPython(rebuild_tree),
    ]
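If you also want the migration to be reversible, RunPython accepts a second, reverse function; a no-op is enough here, since the rebuild doesn't need undoing:

operations = [
    # forwards: rebuild the tree; backwards: nothing to undo
    migrations.RunPython(rebuild_tree, migrations.RunPython.noop),
]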

Unknown data type "JSONB" when running tests in play slick with H2 Database

I get an evolutions error, Unknown data type: "JSONB", when running tests in Play Framework, using:
playframework v2.6.6 for scala
play-slick v3.0.2
play-slick-evolutions v3.0.2
postgresql - 42.0.0
h2database - 1.4.194
My H2DbConnector looks like this:
import entities.StubData._
import org.scalatest.{BeforeAndAfterAll, FunSuite}
import play.api.db.DBApi
import play.api.db.evolutions.Evolutions
import play.api.inject.guice.GuiceApplicationBuilder

trait H2DbConnector extends FunSuite with BeforeAndAfterAll {

  val appBuilder = new GuiceApplicationBuilder().configure(configuration)
  val injector = appBuilder.injector

  lazy val databaseApi = injector.instanceOf[DBApi]

  override def beforeAll() = {
    Evolutions.applyEvolutions(databaseApi.database("default"))
  }

  override def afterAll() = {
    Evolutions.cleanupEvolutions(databaseApi.database("default"))
  }
}
In application.test.conf:
slick.dbs.default.driver = "slick.driver.H2Driver$"
slick.dbs.default.db.driver = "org.h2.Driver"
slick.dbs.default.db.url = "jdbc:h2:mem:play;MODE=PostgreSQL;DB_CLOSE_DELAY=-1;DATABASE_TO_UPPER=FALSE"
I've got one problematic line in the evolutions 2.sql file:
ALTER TABLE "Messages" ADD COLUMN "metaJson" JSONB NULL;
When I run the DAO tests I get an error like:
2017-12-21 16:08:40,409 [error] p.a.d.e.DefaultEvolutionsApi - Unknown data type: "JSONB"; SQL statement:
ALTER TABLE "Messages" ADD COLUMN "metaJson" JSONB NULL [50004-194] [ERROR:50004, SQLSTATE:HY004]
[info] OptoutsDaoTest *** ABORTED ***
[info] play.api.db.evolutions.InconsistentDatabase: Database 'default' is in an inconsistent state![An evolution has not been applied properly. Please check the problem and resolve it manually before marking it as resolved.]
[info] at play.api.db.evolutions.DatabaseEvolutions.$anonfun$checkEvolutionsState$3(EvolutionsApi.scala:285)
[info] at play.api.db.evolutions.DatabaseEvolutions.$anonfun$checkEvolutionsState$3$adapted(EvolutionsApi.scala:270)
[info] at play.api.db.evolutions.DatabaseEvolutions.executeQuery(EvolutionsApi.scala:317)
[info] at play.api.db.evolutions.DatabaseEvolutions.checkEvolutionsState(EvolutionsApi.scala:270)
[info] at play.api.db.evolutions.DatabaseEvolutions.evolve(EvolutionsApi.scala:239)
[info] at play.api.db.evolutions.Evolutions$.applyEvolutions(Evolutions.scala:193)
[info] at H2DbConnector.beforeAll(H2DbConnector.scala:15)
[info] at H2DbConnector.beforeAll$(H2DbConnector.scala:14)
[info] at OptoutsDaoTest.beforeAll(OptoutsDaoTest.scala:5)
[info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
[info] ...
Could you please help me fix this issue?
I recently had this problem with JSONB and H2 too. I solved it by creating an alias from JSONB to JSON and having it run only during the test profile on H2.
CREATE TYPE "JSONB" AS json;
It's not real JSONB, but the difference between JSONB and JSON (at least in Postgres) is essentially read performance, which doesn't really matter (much) for test purposes.
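Since the question uses Play evolutions rather than Flyway, one way to get the alias in place before the evolutions run is H2's INIT connection parameter, which executes the given statement on every connection. An untested sketch, extending the test URL from the question (JSON exists as a base type only on newer H2 builds, so on 1.4.194 a TEXT alias is the safer bet):

slick.dbs.default.db.url = "jdbc:h2:mem:play;MODE=PostgreSQL;DB_CLOSE_DELAY=-1;DATABASE_TO_UPPER=FALSE;INIT=CREATE DOMAIN IF NOT EXISTS JSONB AS TEXT"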
Maybe this example helps too. It uses Flyway: create an SQL migration that creates an alias type for jsonb under /resources/db/tests, and run it only on the test profile.
We were using Spring, so here's the entry in application.yml:
spring:
  profiles: mytest
  datasource:
    continueOnError: false
    url: jdbc:h2:mem:myapp-db;DB_CLOSE_ON_EXIT=FALSE;MODE=PostgreSQL;DATABASE_TO_LOWER=TRUE
  flyway:
    enabled: true
    locations: classpath:db/migration, classpath:db/tests
[......]
Here's the magic:
In that migration file I create a type called JSONB that is basically an alias for the JSON type. Note: from what I've understood, the uppercase name is necessary (especially when referring to it in the table creation), because H2 seems to automatically upper-case type names:
CREATE TYPE "JSONB" AS json;
Here is an example of creating a table with this type:

CREATE TABLE "XXX" (
    id BIGSERIAL PRIMARY KEY,
    my_json_column_name JSONB NOT NULL
);
On the Hibernate side I use the JsonBinaryType from hibernate-types52; see the hibernate-types documentation for more.
@Data
@TypeDef(name = "jsonb", typeClass = com.vladmihalcea.hibernate.type.json.JsonBinaryType.class)
@Entity(name = "XXX")
@Table(name = "XXX")
public class XXX {

    @Type(type = "jsonb")
    @Column(name = "my_json_column_name", nullable = false)
    private String myJsonColumnName;

    // OR:

    @Type(type = "jsonb")
    @Column(name = "my_json_column_name", nullable = false)
    private List<MYCustomTypeThatMatchesJsonObject> myJsonColumnName;
}
I hope it helps someone; it worked for me.
Update (2020-07-13): I stopped using H2 in my projects and started using Testcontainers. It's very easy to set up, and you can test against your real DB environment.
H2 does not support the JSONB column type; see the list of supported data types of H2.
Use Postgres in your tests as well, or write standard SQL statements that both databases understand.
You cannot use PostgreSQL for unit tests, as that means connecting to something external; unit tests should rely only on in-memory state, since they are triggered by your build, and it's very unlikely that the build server has access to any physical DB. You may need another way to mock your data and avoid DB access, or change your data type to string[] and encapsulate it to produce JSON.
For those who still have this problem in either an H2 or a PostgreSQL database, even after defining a TypeDef etc., check out my answer here.

How to customize the table name in peewee?

I want to define a table whose name is gobang_server. I wrote the following code:
class BaseModel(Model):
    class Meta:
        database = database

class GobangServer(BaseModel):
    time = DateField(default=datetime.datetime.now)  # pass the callable, not a fixed value
    name = CharField(max_length=64)
    host = CharField(max_length=30)
    port = IntegerField()
    pid = IntegerField()
But when I look in PostgreSQL, the table name is "gobangserver".
How can I make the table name gobang_server without modifying the class name?
class GobangServer(BaseModel):
    ...

    class Meta:
        db_table = 'gobang_server'
In peewee 3.0 this changed from db_table to table_name.
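On peewee 3.x the whole model would then look roughly like this (a sketch reusing the fields from the question; the database name is hypothetical):

import datetime

from peewee import (CharField, DateField, IntegerField, Model,
                    PostgresqlDatabase)

database = PostgresqlDatabase('gobang_db')  # hypothetical database name

class GobangServer(Model):
    time = DateField(default=datetime.datetime.now)
    name = CharField(max_length=64)
    host = CharField(max_length=30)
    port = IntegerField()
    pid = IntegerField()

    class Meta:
        database = database
        table_name = 'gobang_server'  # peewee 3.x spelling of db_table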

Using restframework mongoengine, how to create queries?

models.py
from mongoengine import Document, fields

class Tool(Document):
    Fruit = fields.StringField(required=True)
    District = fields.StringField(required=True)
    Area = fields.StringField(required=True)
    Farmer = fields.StringField(required=True)
Serializers.py file
from rest_framework import serializers
from rest_framework_mongoengine.serializers import DocumentSerializer

from models import Tool

class ToolSerializer(DocumentSerializer):
    class Meta:
        model = Tool
views.py file
from django.template.response import TemplateResponse
from rest_framework_mongoengine.viewsets import ModelViewSet as MongoModelViewSet

from app.serializers import *

def index_view(request):
    context = {}
    return TemplateResponse(request, 'index.html', context)

class ToolViewSet(MongoModelViewSet):
    lookup_field = 'Fruit'
    serializer_class = ToolSerializer

    def get_queryset(self):
        return Tool.objects.all()
I want to create queries so that http://127.0.0.1:8000/api/tool/?Fruit=Banana gives me the data for fruit Banana only, and http://127.0.0.1:8000/api/tool/?District=Pune gives me the data for the Pune district only.
Unfortunately, I haven't tried this solution myself yet, but AFAIK, in pure DRF with an SQL database you'd use the django-filter package for this.
There's an analogue of it for DRF-ME called drf-mongo-filters, written by Maxim Vasiliev, co-author of DRF-ME. It contains a decent set of tests you can use for inspiration.
Basically, you say something like:
from unittest import mock

from rest_framework.generics import ListAPIView
from rest_framework.test import APIRequestFactory
from drf_mongo_filters.filtersets import filters, Filterset
from drf_mongo_filters.backend import MongoFilterBackend

class TestFilter(Filterset):
    foo = filters.CharFilter()

class TestView(ListAPIView):
    filter_backends = (MongoFilterBackend,)
    filter_class = TestFilter
    serializer_class = mock.Mock()
    queryset = mock.Mock()

TestView.as_view()(APIRequestFactory().get("/?foo=Foo"))
TestView.queryset.filter.assert_called_once_with(foo="Foo")
I haven't tried doing the same with ViewSets, but since they inherit from GenericView, I guess they should respect the filter_class and filter_backends parameters too.
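Applied to the question's code, that would look roughly like this; an untested sketch, assuming drf-mongo-filters behaves the same under a viewset as under a generic view:

from drf_mongo_filters.filtersets import filters, Filterset
from drf_mongo_filters.backend import MongoFilterBackend
from rest_framework_mongoengine.viewsets import ModelViewSet as MongoModelViewSet

from app.serializers import ToolSerializer
from models import Tool

class ToolFilter(Filterset):
    # one filter per query parameter to support
    Fruit = filters.CharFilter()
    District = filters.CharFilter()

class ToolViewSet(MongoModelViewSet):
    lookup_field = 'Fruit'
    serializer_class = ToolSerializer
    filter_backends = (MongoFilterBackend,)
    filter_class = ToolFilter

    def get_queryset(self):
        return Tool.objects.all()

With that in place, /api/tool/?Fruit=Banana should end up calling Tool.objects.filter(Fruit='Banana').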