Reputation: 701
I'm working on implementing a soft deletion system for Django models in one of my projects, and I've run into a problem where new data conflicts with soft-deleted data. Ideally I'd like to keep both the deleted record and the new one, but still enforce unique constraints on existing models.
So essentially I'd like fields that are unique unless deleted=True, at which point you can have as many duplicates as you'd like. Is there a way to do this that doesn't involve manually overriding the save method for each model I'd like to soft delete?
Upvotes: 0
Views: 801
Reputation: 48952
One solution is to create a partial unique index; that is, an index that only enforces the unique constraint when some expression is true.
As of Django 2.2 you can do this declaratively. It would look something like:
from django.db.models import Model, Q, UniqueConstraint

class MyModel(Model):
    ...

    class Meta:
        constraints = [
            UniqueConstraint(
                fields=["field"],
                condition=Q(is_deleted=False),
                name="unique_undeleted_field",  # a name is required
            )
        ]
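To see the effect, here's a minimal sketch using SQLite (which also supports partial indexes); the table and column names are illustrative stand-ins for what Django would generate:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mymodel (field TEXT, is_deleted BOOLEAN)")
# Unique only among rows where is_deleted = 0, i.e. live rows.
conn.execute(
    "CREATE UNIQUE INDEX my_constraint ON mymodel (field) "
    "WHERE is_deleted = 0"
)

conn.execute("INSERT INTO mymodel VALUES ('x', 1)")  # soft-deleted row
conn.execute("INSERT INTO mymodel VALUES ('x', 1)")  # another deleted copy: allowed
conn.execute("INSERT INTO mymodel VALUES ('x', 0)")  # one live row: allowed

try:
    conn.execute("INSERT INTO mymodel VALUES ('x', 0)")  # second live row
except sqlite3.IntegrityError:
    print("duplicate live row rejected")
```

Duplicates among soft-deleted rows go through freely; only a second live row trips the constraint.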
In older versions you'd need to create the partial unique index yourself in a migration using RunSQL. (See my answer here for more details.) It would look something like:
class Migration(migrations.Migration):

    dependencies = [ ... ]

    operations = [
        migrations.RunSQL(
            """
            CREATE UNIQUE INDEX my_constraint
            ON appname_mymodel (field)
            WHERE is_deleted = false
            """
        )
    ]
In this case, Django doesn't know anything about the constraint, so it can't do any validation for you. If you want your admin users to get a nice error message when they violate the constraint, you'll need to provide your own validation.
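The check such validation has to perform is just "no other live row with this value". A minimal plain-Python sketch of that logic (outside Django; names are illustrative):

```python
def violates_soft_delete_unique(existing_rows, new_value):
    """Return True if new_value collides with a live (non-deleted) row.

    existing_rows is an iterable of (field_value, is_deleted) pairs;
    soft-deleted rows are ignored, mirroring the index's WHERE clause.
    """
    return any(value == new_value and not is_deleted
               for value, is_deleted in existing_rows)

rows = [("alice", True), ("alice", True), ("bob", False)]
print(violates_soft_delete_unique(rows, "alice"))  # False: only deleted copies
print(violates_soft_delete_unique(rows, "bob"))    # True: a live row exists
```

In Django, the equivalent check would typically live in the model's clean() method as a queryset existence test filtered on the unique field with is_deleted=False.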
Upvotes: 3