Reputation: 2718
I have an object which has an attribute value. The value can be updated on a weekly basis, e.g. every Monday, by a user.
I want to be able to display the object's value by week number, e.g. wk35 value, wk36 value, ..., wk40 value.
I also want to be able to compare historical values and get the percentage change.
What I do not know is:
Should I create attributes for my object, e.g.
datecreated, dateupdated (so it contains the final update on a specific day), dateedited (so the user can edit the value multiple times during a day)?
I understand the user can edit the value a few times during a day, but only the final edit on that day should count as the update.
I am struggling with the concept behind this idea. I have started to read about simple-history, which can help me track changes, but I do not know how to achieve what's written above.
My model class is:
class ZoneSubStage(models.Model):
    zone = models.ForeignKey(Zone)
    substage = models.ForeignKey(SubStage)
    value = models.PositiveSmallIntegerField(default=0)
    slug = models.SlugField(unique=True)
    history = HistoricalRecords()
    created = models.DateTimeField(auto_now_add=True)
    date = models.DateField(null=True)
Upvotes: 1
Views: 743
Reputation: 2214
There might be some third-party modules that do this for you, but my best home-brew suggestion is a logging table. If you need the log entry to be unique by date, you would just need to modify the ZoneSubStage.save() method to look up a ZoneSubStageLog by the current date before simply creating a new one (which I think you need, but will leave to you).
from django.db import models, transaction
from simple_history.models import HistoricalRecords

class ZoneSubStage(models.Model):
    zone = models.ForeignKey(Zone)
    substage = models.ForeignKey(SubStage)
    value = models.PositiveSmallIntegerField(default=0)
    slug = models.SlugField(unique=True)
    history = HistoricalRecords()
    created = models.DateTimeField(auto_now_add=True)
    date = models.DateField(null=True)

    def __init__(self, *args, **kwargs):
        super(ZoneSubStage, self).__init__(*args, **kwargs)
        # Remember the value as loaded from the database so save()
        # can detect whether it has changed.
        self.value_original = self.value

    def save(self, **kwargs):
        with transaction.atomic():
            response = super(ZoneSubStage, self).save(**kwargs)
            if self.value_original != self.value:
                zone_log = ZoneSubStageLog()
                zone_log.zone_sub_stage = self
                zone_log.value = self.value_original  # was self.original_value, which doesn't exist
                zone_log.save()
                # Reset the baseline so a second save() without a change
                # doesn't log again.
                self.value_original = self.value
        return response

class ZoneSubStageLog(models.Model):
    zone_sub_stage = models.ForeignKey(ZoneSubStage)
    value = models.PositiveSmallIntegerField(default=0)
    date = models.DateField(auto_now_add=True)
Upvotes: 1
Reputation: 3419
This is a very broad/high-level question, but basically you should store historical data/changes separately from your model.
Think of it this way: your model is a living entity, and it has a current state. At regular intervals, you can take snapshots of that state (the data). Later, you can use those snapshots you made to piece together the model's history (visualize how it has changed over time, for instance). This is basically the Memento pattern.
So, consider your needs. What "resolution" should your model snapshot have? That is, should it store every data point or just a few? Should it store every single change by every single user, or just take a snapshot once a week? If it's just once a week, as you suggested, just run a cron job that copies the model data to another model... or another database, or you could even write it down to a flat JSON or CSV file!
Or... if you want to be able to reconstruct the model exactly as it was at a certain point in time, consider a package like django-reversion.
On the other hand, if there are just one or two Really Important Numbers to track over time, you could just create a model that stores the id of the originating model as a foreign key, the important number values, and a timestamp.
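Once such a log exists, the week-by-week display and the percentage change the question asks for are plain Python over (date, value) pairs. A sketch (`week_label` and `percent_change` are illustrative names, not from any library):

```python
import datetime


def week_label(d):
    # ISO week number, formatted like the question's "wk36".
    return "wk%d" % d.isocalendar()[1]


def percent_change(old, new):
    # Percentage change from old to new; undefined when old is 0.
    if old == 0:
        return None
    return (new - old) / old * 100.0


# (date, value) pairs as they might come out of a log table,
# one entry per week.
history = [
    (datetime.date(2017, 9, 4), 40),   # Monday of ISO week 36
    (datetime.date(2017, 9, 11), 50),  # Monday of ISO week 37
]

for (d1, v1), (d2, v2) in zip(history, history[1:]):
    print("%s -> %s: %+.1f%%" % (week_label(d1), week_label(d2),
                                 percent_change(v1, v2)))
# wk36 -> wk37: +25.0%
```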
EDIT: Dotcomly's answer provides a good implementation if all you want to keep is the one ZoneSubStage.value. To better decouple the log from the model, I'd suggest using a post_save signal to create the log entry instead of overriding the model's save method.
Upvotes: 0