Reputation: 16037
I work for a K-12 school and I am developing a gradebook system. The existing gradebook system we use, which I also developed, is based on Excel with VBA. It's always a nightmare for me to consolidate 400+ Excel workbooks at the end of every term. During the summer break, I'm planning to put all the data in a database for easier management.
My problem is this:
For a computation-intensive application like a gradebook, is it better to store the computed values in table fields, or to store ONLY the raw data and do the computations on the frontend?
The way the Excel gradebook system works is like this...
Everything above is very easy to make in a spreadsheet but I don't know how to implement it in a database.
As of the moment, I'm thinking something like having a table where all assessments are recorded like this:
```
tbl_scores
    id
    student_id
    term_id
    subject_id
    component_id
    assessment_id
    raw_score
    highest_possible_score  # not sure about this, because it can be inferred from assessment_id
```
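As a rough sketch, the normalized version of this design might look like the following in PostgreSQL. The `assessment` table and all column names here are assumptions for illustration; the point is that `highest_possible_score` lives once per assessment, not on every score row:

```sql
-- Hypothetical normalized schema; names are illustrative only.
CREATE TABLE assessment (
    id                     serial PRIMARY KEY,
    component_id           integer NOT NULL,
    name                   text    NOT NULL,
    highest_possible_score numeric NOT NULL  -- stored once here, not per score
);

CREATE TABLE tbl_scores (
    id            serial  PRIMARY KEY,
    student_id    integer NOT NULL,
    term_id       integer NOT NULL,
    subject_id    integer NOT NULL,
    component_id  integer NOT NULL,
    assessment_id integer NOT NULL REFERENCES assessment (id),
    raw_score     numeric NOT NULL
);
```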
Would it be useful to store the computed values (percentage for each score entry, component average, subject average, general average, etc.) in the database and use stored procedures and triggers to update them when a new score comes in?
Or is it better to store just the raw scores and calculate everything ONLY on the frontend? Would that option be faster than the first one, given that the SELECT queries will be more complex due to subqueries?
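For context, a "calculate from raw data" query need not even involve subqueries; a subject average per student can come from a single aggregate over a join. A sketch, assuming a linked `assessment` table holds `highest_possible_score`:

```sql
-- Sketch: compute each student's subject average from raw scores alone.
SELECT s.student_id,
       s.subject_id,
       avg(s.raw_score / a.highest_possible_score) * 100 AS subject_average_pct
FROM   tbl_scores s
JOIN   assessment a ON a.id = s.assessment_id
WHERE  s.term_id = 1                -- hypothetical term filter
GROUP  BY s.student_id, s.subject_id;
```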
By the way, I'm planning to use PostgreSQL for this.
Thanks in advance for any advice.
DashMug
Upvotes: 0
Views: 3234
Reputation: 656391
400 students is peanuts for an RDBMS that is designed to handle millions of records. You need not worry about performance.
I have seen nothing in your description that PostgreSQL could not do with simple queries. Store only the necessary information and calculate the rest. Also, do not add a redundant column `highest_possible_score` to your table `tbl_scores` if that information is already in the linked table `assessment`. That would do more harm than good.
If (and only if) you later find that a query is too slow, you can always use materialized views (write the output of a query to a table and recalculate that table after changes to the base data).
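PostgreSQL supports this natively (since version 9.3). A minimal sketch, with assumed table and column names:

```sql
-- Cache per-student subject averages in a materialized view.
CREATE MATERIALIZED VIEW subject_averages AS
SELECT s.student_id,
       s.subject_id,
       s.term_id,
       avg(s.raw_score / a.highest_possible_score) * 100 AS average_pct
FROM   tbl_scores s
JOIN   assessment a ON a.id = s.assessment_id
GROUP  BY s.student_id, s.subject_id, s.term_id;

-- Re-run after the base data changes:
REFRESH MATERIALIZED VIEW subject_averages;
```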
Upvotes: 2
Reputation: 345
Denormalization depends on the total amount of records. For a school, the data set would not be big enough to justify it; the better decision is to consolidate the data with queries (yes, they will be more complex and some joins will appear; that is normal).
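For instance, the per-entry percentage you would otherwise store is just one join away (a sketch with assumed names):

```sql
-- Per-score percentage computed at query time, not stored.
SELECT s.id,
       s.raw_score,
       a.highest_possible_score,
       round(s.raw_score / a.highest_possible_score * 100, 2) AS pct
FROM   tbl_scores s
JOIN   assessment a ON a.id = s.assessment_id;
```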
Upvotes: 0