Reputation: 535
We want to create a scoring algorithm that awards more points for shorter times and fewer points for longer times. The one caveat is that there is no fixed range: the time can be anywhere from 100 milliseconds to 10 minutes or more, while the points must fall within 0 to 50.
Thanks for any help.
Upvotes: 5
Views: 2382
Reputation: 19621
You could transform the delay in some way before applying a linear mapping, so that points change quickly for short times and more slowly for long times. One option would be to take the logarithm of the time so far. Another option would be the function

f(x) = Ax/(A + x)

which increases quickly from 0 at x = 0 but slows down as it approaches A for larger x, never quite reaching it. (This function increases and you want one that decreases, but you can fix that with a linear transformation; an obvious example is points = A - Ax/(A + x).)
For example, if you set A = 10, then

f(0) = 10 - 10*0/(10 + 0) = 10
f(1) = 10 - 10/11 = 9 1/11
f(2) = 10 - 20/12 = 8 1/3
f(100) = 10 - 1000/110 = 10/11

and so on.
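As a sketch, here is the decreasing version of that function in Python, rescaled to the question's 0-50 point range; splitting A into a separate half-score constant and the choice of 10 seconds for it are assumptions, not part of the answer above.

def points_for_time(time_ms, half_ms=10_000, max_points=50):
    # Score a time using the decreasing curve max_points * (1 - t/(h + t)).
    # This is points = A - A*t/(A + t) with the output rescaled to
    # [0, max_points]. half_ms (assumed: 10 seconds) is the time at which
    # exactly half the points are awarded; tune it to shape the curve.
    return max_points * (1 - time_ms / (half_ms + time_ms))

for t in (100, 1_000, 10_000, 600_000):      # 100 ms up to 10 minutes
    print(t, round(points_for_time(t), 2))   # 49.5, 45.45, 25.0, 0.82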
Upvotes: 3
Reputation: 8106
Anything like this has to be done with some delay if the record set is huge.
Assume you are continuously receiving the time taken per user as a stream: you keep track of the maximum value over a period, and you also have another system that grades points dynamically.
Trigger the point calculation at a fixed, repeating interval over the existing record set of times in milliseconds (which might be stored in a linked list, or perhaps a dynamically growing two-dimensional array).
The logic for the point calculation:
This is clearly an inverse linear relation with the time taken, so a record whose time equals the midpoint of the range (half the tracked maximum) gets 25 on a scale of 0 to 50.
Hence, for a given record of 100 ms where the maximum value is 500 ms, the score would be

50 * (500 - 100) / 500 = 40
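A minimal sketch of this batch scoring in Python, assuming the tracked maximum and the linear formula above (all names are illustrative):

def linear_points(time_ms, max_ms, max_points=50):
    # Score linearly against the largest time seen: max_ms or slower
    # scores 0, and a hypothetical time of 0 would score max_points.
    time_ms = min(time_ms, max_ms)  # clamp so late outliers score 0
    return max_points * (max_ms - time_ms) / max_ms

# When the repeating interval fires, rescore the current record set:
times = [100, 250, 500, 320]   # times in ms from the stream
max_ms = max(times)            # tracked maximum for the period
print([linear_points(t, max_ms) for t in times])  # [40.0, 25.0, 0.0, 18.0]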
Upvotes: 0
Reputation: 50717
You can simply map time to points with the following reciprocal equation:
points = 50 * 100/time_in_ms
This will give you:

time_in_ms = 100 ms  =>  50 points
time_in_ms = 10 min  =>  0.0083 points
time_in_ms = +∞      =>  0 points

You can easily adjust the above equation if the ranges of time and points change.
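A sketch of the same mapping in Python; the cap at the maximum is an addition of mine, since the bare formula would award more than 50 points for times under 100 ms:

def reciprocal_points(time_ms, max_points=50, base_ms=100):
    # points = max_points * base_ms / time_ms, capped at max_points
    return min(max_points, max_points * base_ms / time_ms)

print(reciprocal_points(100))      # 50
print(reciprocal_points(600_000))  # ~0.0083 (10 minutes)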
Upvotes: 3
Reputation: 1519
I think you have two options:
1. If you really want to be able to assign points for any length of time, pick a formula that divides the number of points by the amount of time, e.g.:

points = [max_number_of_points]/[time]

Here, time should be expressed in your smallest unit of measurement so that it is never less than one (which also keeps you from dividing by zero). If you don't want the points to fall off at that speed, divide or multiply time by some constant until the points distribution looks how you want it.

2. Decide that, really, there is a limit where the time is so great that anything above it just deserves zero points. For example, I doubt you care about the points difference between a time of 100 million years and one of 100 billion years. So, artificially pick a maximum time in this way, assign all larger times zero points, and then continue with your scoring algorithm within the finite range. (Both options are sketched below.)
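A minimal sketch of both options in Python, assuming millisecond granularity, a 50-point maximum, and an illustrative 10-minute cutoff:

def open_ended_points(time_ms, max_points=50):
    # Option 1: points = max_points / time, defined for any time >= 1 ms.
    return max_points / max(time_ms, 1)

def capped_points(time_ms, cutoff_ms=600_000, max_points=50):
    # Option 2: zero points beyond a chosen cutoff, linear within it.
    if time_ms >= cutoff_ms:
        return 0.0
    return max_points * (cutoff_ms - time_ms) / cutoff_ms

print(open_ended_points(100))    # 0.5 -- scale time_ms by a constant to taste
print(capped_points(100))        # ~49.99
print(capped_points(1_000_000))  # 0.0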
Upvotes: 0