Reputation: 99
I have a table with a partial unique constraint where a customer_id can only have one is_primary flag set to True.
The behavior I would like to achieve is: when I insert a row with is_primary=True for a customer_id that already has a row with is_primary=True in the db, the old entry should be set to is_primary=False and the new row inserted with is_primary=True, without raising an IntegrityError.
I'm using Flask-SQLAlchemy with a PostgreSQL db. Is there any mechanism to do this easily, or do I need to do it manually, first updating the old entry and then inserting the new row?
class Customer_address(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    customer_id = db.Column(None, db.ForeignKey('customer.id'), nullable=False)
    is_primary = db.Column(db.Boolean, nullable=False)

    __table_args__ = (
        db.Index(
            'only_one_primary_address',  # Index name
            'customer_id',               # Columns which are part of the index
            unique=True,
            postgresql_where=(is_primary),  # The condition
        ),
    )
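For context, compiling the index against the PostgreSQL dialect (just a quick check, not part of the application code) shows the partial unique index it emits:

from sqlalchemy.schema import CreateIndex
from sqlalchemy.dialects import postgresql

# Prints roughly:
#   CREATE UNIQUE INDEX only_one_primary_address
#   ON customer_address (customer_id)
#   WHERE is_primary
index = next(iter(Customer_address.__table__.indexes))
print(CreateIndex(index).compile(dialect=postgresql.dialect()))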
Upvotes: 0
Views: 1274
Reputation: 55933
While there isn't a single database operation that will insert a new record and update the existing one, we can create an event listener in SQLAlchemy to listen for new address insertions and perform the update. This is conceptually similar to using a database trigger, as discussed in the comments, except that it happens in SQLAlchemy ORM sessions.
import sqlalchemy as sa
from sqlalchemy import orm


@sa.event.listens_for(Customer_address, 'before_insert')
def receive_init(mapper, connection, target):
    """Listen for the insertion of new customer addresses."""
    # Get the active session
    session = orm.session.object_session(target)
    cls = target.__class__
    # Update the existing is_primary == True record for this customer
    session.query(cls).filter(
        sa.and_(cls.customer_id == target.customer_id, cls.is_primary)
    ).update({'is_primary': False}, synchronize_session='fetch')
A Query.update, as performed in the listener, bypasses ORM machinery, so it's worth reading the warnings associated with it in the docs. In this case, setting synchronize_session to 'fetch' keeps the database and session in sync at the cost of issuing an extra SELECT, so we get a sequence of events like this:
2020-11-22 09:04:26,565 INFO sqlalchemy.engine.base.Engine SELECT customer_address.id AS customer_address_id
FROM customer_address
WHERE customer_address.customer_id = %(customer_id_1)s AND customer_address.is_primary
2020-11-22 09:04:26,565 INFO sqlalchemy.engine.base.Engine {'customer_id_1': 1}
2020-11-22 09:04:26,566 INFO sqlalchemy.engine.base.Engine UPDATE customer_address SET is_primary=%(is_primary)s WHERE customer_address.customer_id = %(customer_id_1)s AND customer_address.is_primary
2020-11-22 09:04:26,566 INFO sqlalchemy.engine.base.Engine {'is_primary': False, 'customer_id_1': 1}
2020-11-22 09:04:26,567 INFO sqlalchemy.engine.base.Engine INSERT INTO customer_address (customer_id, is_primary) VALUES (%(customer_id)s, %(is_primary)s) RETURNING customer_address.id
2020-11-22 09:04:26,567 INFO sqlalchemy.engine.base.Engine {'customer_id': 1, 'is_primary': True}
2020-11-22 09:04:26,568 INFO sqlalchemy.engine.base.Engine COMMIT
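That trace comes from simply adding the new primary address in a session where customer 1 already has a primary address on file, for example:

session = db.session  # the Flask-SQLAlchemy session

# The before_insert listener demotes the customer's existing primary
# address before the new row is inserted.
session.add(Customer_address(customer_id=1, is_primary=True))
session.commit()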
Note also that this approach doesn't work if you create multiple primary addresses for the customer at once:

# Will violate the constraint
for _ in range(5):
    session.add(Customer_address(customer_id=1, is_primary=True))
session.commit()
If this is something that is likely to happen in your code, you could consider listening for events on the customer's address collection or on the address instance instead.
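For example, a rough sketch of the collection approach, assuming a Customer model with an addresses relationship to Customer_address (which isn't shown in the question), could demote siblings in Python as they are appended:

import sqlalchemy as sa

# Sketch only: assumes Customer defines something like
#   addresses = db.relationship('Customer_address', backref='customer')
@sa.event.listens_for(Customer.addresses, 'append')
def demote_other_primaries(target, value, initiator):
    """When a primary address is appended, demote the customer's other addresses."""
    if value.is_primary:
        for address in target.addresses:
            if address is not value and address.is_primary:
                address.is_primary = False

With this in place, repeated customer.addresses.append(Customer_address(is_primary=True)) calls leave only the last address marked as primary before the flush.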
Upvotes: 1