I keep getting a warning on the console, and despite everything I've read I haven't been able to resolve it:
SAWarning: relationship 'Book.users' will copy column user.uid to column user_book.uid, which conflicts with relationship(s): 'User.books' (copies user.uid to user_book.uid). If this is not the intention, consider if these relationships should be linked with back_populates, or if viewonly=True should be applied to one or more if they are read-only. For the less common case that foreign key constraints are partially overlapping, the orm.foreign() annotation can be used to isolate the columns that should be written towards. The 'overlaps' parameter may be used to remove this warning.
These are the tables the warning refers to:
user_book = db.Table('user_book',
    db.Column('uid', db.Integer, db.ForeignKey('user.uid'), primary_key=True),
    db.Column('bid', db.Text, db.ForeignKey('book.bid'), primary_key=True),
    db.Column('date_added', db.DateTime(timezone=True), server_default=db.func.now())
)
class User(db.Model):
    __tablename__ = 'user'
    uid = db.Column(db.Integer, primary_key=True)
    email = db.Column(db.String(25), nullable=False)
    hash = db.Column(db.String(), nullable=False)
    first_name = db.Column(db.String(30), nullable=True)
    last_name = db.Column(db.String(80), nullable=True)
    books = db.relationship('Book', secondary=user_book)

class Book(db.Model):
    __tablename__ = 'book'
    bid = db.Column(db.Text, primary_key=True)
    title = db.Column(db.Text, nullable=False)
    authors = db.Column(db.Text, nullable=False)
    thumbnail = db.Column(db.Text, nullable=True)
    users = db.relationship('User', secondary=user_book)
I use the user_book table to show users the books they have added. What am I missing? And while I'm at it: are the relationships and foreign keys between these tables modeled correctly, semantically speaking?
As the warning message suggests, you are missing the back_populates= attributes in your relationships:
class User(db.Model):
    # …
    books = db.relationship('Book', secondary=user_book, back_populates="users")
    # …

class Book(db.Model):
    # …
    users = db.relationship('User', secondary=user_book, back_populates="books")
    # …
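To see the effect, here is a minimal, self-contained sketch of the same idea (plain SQLAlchemy rather than Flask-SQLAlchemy, with the non-essential columns dropped): with back_populates, appending on one side of the many-to-many is immediately reflected on the other side.

```python
from sqlalchemy import Column, ForeignKey, Integer, String, Table
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

user_book = Table(
    "user_book", Base.metadata,
    Column("uid", Integer, ForeignKey("user.uid"), primary_key=True),
    Column("bid", String, ForeignKey("book.bid"), primary_key=True),
)

class User(Base):
    __tablename__ = "user"
    uid = Column(Integer, primary_key=True)
    books = relationship("Book", secondary=user_book, back_populates="users")

class Book(Base):
    __tablename__ = "book"
    bid = Column(String, primary_key=True)
    users = relationship("User", secondary=user_book, back_populates="books")

u, b = User(uid=1), Book(bid="abc")
u.books.append(b)    # appending on one side...
print(u in b.users)  # ...is mirrored on the other side: True
```

Because each relationship now names its counterpart, SQLAlchemy knows the two share the same association table on purpose, and the SAWarning goes away.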
I sort of figured this out. Here is the code from the official tutorial:
from sqlalchemy import Column, ForeignKey, Integer, String, Table
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class User(Base):
    __tablename__ = "user"
    id = Column(Integer, primary_key=True)
    name = Column(String(64))
    kw = relationship("Keyword", secondary=lambda: user_keyword_table)

    def __init__(self, name):
        self.name = name

class Keyword(Base):
    __tablename__ = "keyword"
    id = Column(Integer, primary_key=True)
    keyword = Column("keyword", String(64))

    def __init__(self, keyword):
        self.keyword = keyword

user_keyword_table = Table(
    "user_keyword",
    Base.metadata,
    Column("user_id", Integer, ForeignKey("user.id"), primary_key=True),
    Column("keyword_id", Integer, ForeignKey("keyword.id"), primary_key=True),
)
Doesn't it make you wonder why the relationship is defined only on the User class rather than on both? The thing is, SQLAlchemy can create the reverse relationship on the Keyword class automatically; I suppose a backref='users'-style parameter is required for that?
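To illustrate, here is the tutorial model with a hypothetical backref="users" added (the attribute name users is my choice, not the tutorial's); SQLAlchemy then generates the reverse Keyword.users relationship automatically:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, Table
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class User(Base):
    __tablename__ = "user"
    id = Column(Integer, primary_key=True)
    name = Column(String(64))
    # backref="users" tells SQLAlchemy to generate a Keyword.users
    # relationship automatically; without it, the link is one-directional
    kw = relationship("Keyword", secondary=lambda: user_keyword_table,
                      backref="users")

class Keyword(Base):
    __tablename__ = "keyword"
    id = Column(Integer, primary_key=True)
    keyword = Column(String(64))

user_keyword_table = Table(
    "user_keyword", Base.metadata,
    Column("user_id", Integer, ForeignKey("user.id"), primary_key=True),
    Column("keyword_id", Integer, ForeignKey("keyword.id"), primary_key=True),
)

u = User(name="alice")
k = Keyword(keyword="python")
u.kw.append(k)
print(u in k.users)  # True: the generated reverse side is kept in sync
```

Without the backref (as in the tutorial code above), only User.kw exists and Keyword has no users attribute at all.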
Related
I have 2 tables with a one-to-many relationship. A parent can have many children; a child can only have one parent.
import uuid

from sqlalchemy import Column, ForeignKey, Text
from sqlalchemy.dialects import postgresql
from sqlalchemy.dialects.postgresql import UUID
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import backref, relationship

Base = declarative_base()

class Parent(Base):
    __tablename__ = "parent"
    id = Column(UUID(as_uuid=True), primary_key=True, index=True, nullable=False, default=uuid.uuid4)
    children_ids = Column(postgresql.ARRAY(Text), server_default="{}")

class Child(Base):
    __tablename__ = "child"
    id = Column(UUID(as_uuid=True), primary_key=True, index=True, default=uuid.uuid4)
    parent_id = Column(UUID(as_uuid=True), ForeignKey('parent.id', ondelete='CASCADE'), nullable=False)
    parent = relationship("Parent", backref=backref("child", passive_deletes=True, passive_updates=True))
I currently create a child entity with the session maker and use add() and commit(). I then have to manually update the parent entity and append the Child.id to the children_ids column. Is there a way for SQLAlchemy to trigger that update automatically, or do I always have to do it manually?
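One possibility, sketched below under assumptions: since Child.parent_id already links each child to its parent, the relationship's backref gives Parent a collection derived from the foreign key, so a children_ids array would not need to be maintained by hand at all. This minimal sketch substitutes SQLite and plain Integer keys for the PostgreSQL UUID/ARRAY types, and names the backref children (both are my choices for illustration):

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import backref, declarative_base, relationship, sessionmaker

Base = declarative_base()

class Parent(Base):
    __tablename__ = "parent"
    id = Column(Integer, primary_key=True)

class Child(Base):
    __tablename__ = "child"
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey("parent.id", ondelete="CASCADE"),
                       nullable=False)
    parent = relationship("Parent",
                          backref=backref("children", passive_deletes=True))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

p = Parent(id=1)
session.add(p)
session.add(Child(id=10, parent=p))
session.commit()

# The collection is derived from the foreign key, so there is
# no array column to keep in sync manually.
print([c.id for c in p.children])  # [10]
```

Whether this fits depends on why the array column exists in the first place; if it must stay, the update has to be done explicitly (or via an ORM event).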
I have three Model classes, representing three tables in my PostgreSQL database: Project, Label, ProjectLabel. Many projects can have multiple labels:
class Project(db.Model):
    __tablename__ = 'projects'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String())
    labels = db.relationship('ProjectLabel')

class Label(db.Model):
    __tablename__ = 'labels'
    label_id = db.Column(db.Integer, primary_key=True)
    label_name = db.Column(db.String())

class ProjectLabel(db.Model):
    __tablename__ = 'projects_labels'
    projectlabel_id = db.Column(db.Integer, primary_key=True)
    projectlabel_projectid = db.Column(db.Integer, db.ForeignKey('projects.id'))
    projectlabel_labelid = db.Column(db.Integer, db.ForeignKey('labels.label_id'))
How can I query the Project model so that I can get objects from the labels table? Specifically, how can I get the label_name of the labels assigned to a Project? I somehow need to connect the Project(labels) -> ProjectLabel -> Label classes.
This will get the related labels in long form:

db.session.query(Project.id,
                 Label.label_name)\
    .filter(ProjectLabel.projectlabel_projectid == Project.id)\
    .filter(Label.label_id == ProjectLabel.projectlabel_labelid)\
    .order_by(Project.id.asc()).all()
If you want the labels in comma-delimited lists, use an aggregate function. func.group_concat() does this on MySQL; since your database is PostgreSQL, the equivalent is func.string_agg():

db.session.query(Project.id,
                 func.string_agg(Label.label_name, ', ').label('related_labels'))\
    .filter(ProjectLabel.projectlabel_projectid == Project.id)\
    .filter(Label.label_id == ProjectLabel.projectlabel_labelid)\
    .group_by(Project.id)\
    .order_by(Project.id.asc()).all()
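As an aside, the same three-table traversal can be written with explicit join()s, which some find easier to read than chained filter()s. A self-contained sketch (plain SQLAlchemy plus SQLite, with the models trimmed to the relevant columns):

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Project(Base):
    __tablename__ = "projects"
    id = Column(Integer, primary_key=True)
    name = Column(String)

class Label(Base):
    __tablename__ = "labels"
    label_id = Column(Integer, primary_key=True)
    label_name = Column(String)

class ProjectLabel(Base):
    __tablename__ = "projects_labels"
    projectlabel_id = Column(Integer, primary_key=True)
    projectlabel_projectid = Column(Integer, ForeignKey("projects.id"))
    projectlabel_labelid = Column(Integer, ForeignKey("labels.label_id"))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add_all([
    Project(id=1, name="demo"),
    Label(label_id=1, label_name="bug"),
    ProjectLabel(projectlabel_id=1, projectlabel_projectid=1,
                 projectlabel_labelid=1),
])
session.commit()

# Walk Project -> ProjectLabel -> Label with explicit join conditions.
rows = (session.query(Project.id, Label.label_name)
        .join(ProjectLabel, ProjectLabel.projectlabel_projectid == Project.id)
        .join(Label, Label.label_id == ProjectLabel.projectlabel_labelid)
        .all())
print(rows)  # [(1, 'bug')]
```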
I am new to Flask and SQLAlchemy, so any help will be great. I have my models as:
class ReferentialDatum(db.Model):
    __tablename__ = 'referential_data'
    sid = db.Column(db.Integer, primary_key=True, nullable=False)
    venues = db.Column(db.JSON, primary_key=True, nullable=False)
    exchange_tickers = db.Column(db.JSON)
    trading_date = db.Column(db.Date, primary_key=True, nullable=False, index=True)
    total_volume = db.Column(db.BigInteger)
    last_trade_price = db.Column(db.Numeric)
    last_reference_price = db.Column(db.Numeric)
    exchange_close_price = db.Column(db.Numeric)
    adjusted_close_price = db.Column(db.Numeric)
    close_price = db.Column(db.Numeric)
    close_price_source = db.Column(db.Text)
    insertion_time_utc = db.Column(db.DateTime(True), server_default=db.FetchedValue())
    last_updated_time = db.Column(db.BigInteger)
Note that I am using venues as a JSON field in my model, and it is also part of the primary key. In my function I have:
def ListProtobuf(Reference_data):
    venues = {"Canada1": 1, "Canada2": 1}
    exchange_tickers = {"AAPL": 1, "APL": 1}
    trading_date = datetime.date.today()
    item = ReferentialDatum.query.filter_by(
        sid=Reference_data.security_id, trading_date=trading_date, venues=venues)
    if item.count() > 0:
        item.update({
            "total_volume": sum(Reference_data.total_volume),
            "last_trade_price": Reference_data.last_trade_price,
            "last_reference_price": Reference_data.reference_price,
            "last_updated_time": Reference_data.update_time,
        })
        db.session.commit()
    else:
        entry = ReferentialDatum(
            sid=Reference_data.security_id,
            exchange_tickers=exchange_tickers,
            venues=venues,
            trading_date=trading_date,
            total_volume=sum(Reference_data.total_volume),
            last_trade_price=Reference_data.last_trade_price,
            last_reference_price=Reference_data.reference_price,
            last_updated_time=Reference_data.update_time,
        )
        db.session.add(entry)
        db.session.commit()
When I try to insert a new row(which does not exist earlier) I get the following error:
File "mini_conda/envs/env/lib/python2.7/site-packages/sqlalchemy/orm/identity.py", line 97, in __contains__
    if key in self._dict:
TypeError: unhashable type: 'dict'
However, if I remove primary_key=True from the venues field in my model, I do not get this error. Any idea why this is happening? Does the ORM use the primary-key fields as keys of some dict, which would produce this kind of error? How do I correct it, given that I require primary_key=True?
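Yes, that is essentially what happens: the ORM's identity map stores each loaded instance in a dict keyed by its primary-key tuple, and a tuple that contains a dict (the JSON value) is unhashable. The underlying Python limitation in a few lines:

```python
# The identity map keys instances by a tuple of their primary-key values.
# A dict (such as a JSON column value) inside that tuple is unhashable.
key = ("ReferentialDatum", (1, {"Canada1": 1}, "2024-01-01"))
try:
    {key: "instance"}
    hashable = True
except TypeError as exc:
    hashable = False
    print(exc)  # unhashable type: 'dict'
```

One workaround would be to keep the JSON column for the data but make the primary key a stable scalar derived from it (for example, a sorted, serialized text column), though whether that fits depends on your schema.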
I'm trying to delete a child object from a many-to-many relationship in SQLAlchemy.
I keep getting the following error:
StaleDataError: DELETE statement on table 'headings_locations' expected to delete 1 row(s); Only 2 were matched.
I have looked at a number of the existing stackexchange questions
(SQLAlchemy DELETE Error caused by having a both lazy-load AND a dynamic version of the same relationship, SQLAlchemy StaleDataError on deleting items inserted via ORM sqlalchemy.orm.exc.StaleDataError, SQLAlchemy Attempting to Twice Delete Many to Many Secondary Relationship, Delete from Many to Many Relationship in MySQL)
regarding this as well as read the documentation and can't figure out why it isn't working.
My code defining the relationships is as follows:
headings_locations = db.Table('headings_locations',
    db.Column('id', db.Integer, primary_key=True),
    db.Column('location_id', db.Integer(), db.ForeignKey('location.id')),
    db.Column('headings_id', db.Integer(), db.ForeignKey('headings.id')))
class Headings(db.Model):
    __tablename__ = "headings"
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80))
    version = db.Column(db.Integer, default=1)
    special = db.Column(db.Boolean(), default=False)
    content = db.relationship('Content', backref=db.backref('heading'), cascade="all, delete-orphan")
    created_date = db.Column(db.Date, default=datetime.datetime.utcnow())
    modified_date = db.Column(db.Date, default=datetime.datetime.utcnow(), onupdate=datetime.datetime.utcnow())

    def __init__(self, name):
        self.name = name

class Location(db.Model):
    __tablename__ = "location"
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), unique=True)
    account_id = db.Column(db.Integer, db.ForeignKey('account.id'))
    version = db.Column(db.Integer, default=1)
    created_date = db.Column(db.Date, default=datetime.datetime.utcnow())
    modified_date = db.Column(db.Date, default=datetime.datetime.utcnow())
    location_prefix = db.Column(db.Integer)
    numbers = db.relationship('Numbers', backref=db.backref('location'), cascade="all, delete-orphan")
    headings = db.relationship('Headings', secondary=headings_locations,
                               backref=db.backref('locations', lazy='dynamic', cascade="all"))

    def __init__(self, name):
        self.name = name
And my delete code is as follows:
@content_blueprint.route('/delete_content/<int:location_id>/<int:heading_id>')
@login_required
def delete_content(location_id, heading_id):
    import pdb
    pdb.set_trace()
    location = db.session.query(Location).filter_by(id=location_id).first()
    heading = db.session.query(Headings).filter_by(id=heading_id).first()
    location.headings.remove(heading)
    # db.session.delete(heading)
    db.session.commit()
    flash('Data Updated, thank-you')
    return redirect(url_for('content.add_heading', location_id=location_id))
Whichever way I try to remove the child object (db.session.delete(heading) or location.headings.remove(heading)), I still get the same error.
Any help is much appreciated.
My database is postgresql.
Edit:
My code which adds the relationship:
new_heading = Headings(form.new_heading.data)
db.session.add(new_heading)
location.headings.append(new_heading)
db.session.commit()
I would assume that the error message is correct: your database really does contain 2 rows linking those Location and Heading instances. In that case you should find out where and why that happened in the first place, and prevent it from happening again.
First, to confirm this assumption, you could run the following query against your database:
import sqlalchemy as sa

q = session.query(
    headings_locations.c.location_id,
    headings_locations.c.headings_id,
    sa.func.count().label("# connections"),
).group_by(
    headings_locations.c.location_id,
    headings_locations.c.headings_id,
).having(
    sa.func.count() > 1
)
Assuming that confirms it, fix the data by manually deleting all the duplicates in your database (leaving just one row for each pair).
After that, add a UniqueConstraint to your headings_locations table:
headings_locations = db.Table('headings_locations',
    db.Column('id', db.Integer, primary_key=True),
    db.Column('location_id', db.Integer(), db.ForeignKey('location.id')),
    db.Column('headings_id', db.Integer(), db.ForeignKey('headings.id')),
    db.UniqueConstraint('location_id', 'headings_id', name='UC_location_id_headings_id'),
)
Note that you need to add the constraint to the database itself; it is not enough to add it to the SQLAlchemy model.
Now the code where the duplicates are inserted by mistake will fail with the unique constraint violation exception, and you can fix the root of the problem.
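To see the constraint doing its job, here is a minimal sketch with an in-memory SQLite database (the table trimmed to the link columns): inserting the same (location_id, headings_id) pair twice raises IntegrityError instead of silently creating the duplicate that later triggers the StaleDataError.

```python
from sqlalchemy import (Column, Integer, MetaData, Table,
                        UniqueConstraint, create_engine)
from sqlalchemy.exc import IntegrityError

metadata = MetaData()
headings_locations = Table(
    "headings_locations", metadata,
    Column("id", Integer, primary_key=True),
    Column("location_id", Integer),
    Column("headings_id", Integer),
    UniqueConstraint("location_id", "headings_id",
                     name="UC_location_id_headings_id"),
)

engine = create_engine("sqlite://")
metadata.create_all(engine)

rejected = False
with engine.connect() as conn:
    conn.execute(headings_locations.insert().values(location_id=1, headings_id=1))
    try:
        # Second, identical link row: refused by the unique constraint.
        conn.execute(headings_locations.insert().values(location_id=1, headings_id=1))
    except IntegrityError:
        rejected = True

print(rejected)  # True
```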
I am new to SQLAlchemy. I want to create a class which has two foreign keys to different tables. Why do I get the following error?
sqlalchemy.exc.IntegrityError: (IntegrityError) insert or update on table "event" violates foreign key constraint "event_user_fkey"
DETAIL: Key (user)=(U) is not present in table "user".
'INSERT INTO event (id, "user", item) VALUES (%(id)s, %(user)s, %(item)s)' {'item': 'I', 'user': 'U', 'id': 'E'}
My code is next:
class User(Base):
    __tablename__ = 'user'
    id = Column(String, primary_key=True)

    def __init__(self, id):
        self.id = id

class Item(Base):
    __tablename__ = 'item'
    id = Column(String, primary_key=True)

    def __init__(self, id):
        self.id = id

class Event(Base):
    __tablename__ = 'event'
    id = Column(String, primary_key=True)
    user = Column(String, ForeignKey('user.id'))
    item = Column(String, ForeignKey('item.id'))

    def __init__(self, id, user_id, item_id):
        self.id = id
        self.user = user_id
        self.item = item_id
I use PostgreSQL as the backend.
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)
session = Session()
usr = User('U')
it = Item('I')
event = Event('E', usr.id, it.id)
session.add(usr)
session.add(it)
session.add(event)
The error seems pretty clear:
Key (user)=(U) is not present in table "user".
So the Event row is being inserted before the User row has been committed to the database, which breaks the foreign key constraint and causes this error. Try committing the User and Item to the database before adding the Event that depends on them, and the problem should evaporate.
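A minimal sketch of that fix (with in-memory SQLite standing in for PostgreSQL, so the commit ordering is illustrative rather than enforced, since SQLite does not check foreign keys by default):

```python
from sqlalchemy import Column, ForeignKey, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "user"
    id = Column(String, primary_key=True)

class Item(Base):
    __tablename__ = "item"
    id = Column(String, primary_key=True)

class Event(Base):
    __tablename__ = "event"
    id = Column(String, primary_key=True)
    user = Column(String, ForeignKey("user.id"))
    item = Column(String, ForeignKey("item.id"))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add_all([User(id="U"), Item(id="I")])
session.commit()  # the referenced rows now exist in the database

session.add(Event(id="E", user="U", item="I"))
session.commit()  # the foreign key constraints are satisfied
print(session.query(Event).count())  # 1
```

Alternatively, defining relationship() attributes between Event and User/Item lets the unit of work infer the dependency and order the inserts within a single flush.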