Reputation: 5918
I want to read the IRStatisticsImpl
data, but I have some problems.
My result is:
IRStatisticsImpl[precision:0.04285714285714287,recall:0.04275534441805227,fallOut:0.0018668022652391654,nDCG:0.04447353132522083,reach:0.997624703087886]
Does this mean that I got only 4% good recommendations (precision) and about the same rate of bad recommendations (recall)?
What should the numbers look like at best: precision at 1.0 and recall at 0.0?
Upvotes: 4
Views: 346
Reputation: 452
Well, by definition:
Precision is the fraction of the items in your result set that are correct (relevant). Recall is the fraction of the correct (relevant) items in the test set that actually end up in your result set.
In a perfect result, precision and recall are both at 100%. What counts as a good value, however, depends on your domain.
For example, if you have a bucket of good and bad mushrooms, you should aim for 100% precision no matter how low your recall is, because precision is critical for your health: you can afford to leave a lot of good mushrooms behind, as long as you never eat a bad one. If you pick one good mushroom, your precision is 100%, but if there were four good mushrooms available, your recall is only 25% (the sketch below works through these numbers). Precision and recall both at 100% would mean that every mushroom in your bucket is good and that every good mushroom ended up in your bucket, with none left behind in the test set.
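To make the arithmetic concrete, here is a tiny sketch of that mushroom scenario; the counts are the hypothetical ones from the example above:

    public class MushroomPrecisionRecall {
        public static void main(String[] args) {
            // Hypothetical counts: we picked 1 mushroom, it was good,
            // but 4 good mushrooms existed in total.
            int truePositives = 1;   // good mushrooms we picked
            int falsePositives = 0;  // bad mushrooms we picked
            int falseNegatives = 3;  // good mushrooms we left behind

            double precision = (double) truePositives / (truePositives + falsePositives);
            double recall = (double) truePositives / (truePositives + falseNegatives);

            System.out.printf("precision = %.2f, recall = %.2f%n", precision, recall);
            // prints: precision = 1.00, recall = 0.25
        }
    }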
So the same values can mean different things depending on the context.
Sadly, your results look quite poor: you have many false positives and too many false negatives.
Take a look here.
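For context, numbers like the ones you posted typically come from Mahout's RecommenderIRStatsEvaluator. A minimal sketch of how such an IRStatisticsImpl is usually produced and read back; the user-based recommender, the "ratings.csv" file, the neighborhood size of 10 and the "at 5" cutoff are illustrative assumptions, not taken from your question:

    import java.io.File;

    import org.apache.mahout.cf.taste.eval.IRStatistics;
    import org.apache.mahout.cf.taste.eval.RecommenderBuilder;
    import org.apache.mahout.cf.taste.eval.RecommenderIRStatsEvaluator;
    import org.apache.mahout.cf.taste.impl.eval.GenericRecommenderIRStatsEvaluator;
    import org.apache.mahout.cf.taste.impl.model.file.FileDataModel;
    import org.apache.mahout.cf.taste.impl.neighborhood.NearestNUserNeighborhood;
    import org.apache.mahout.cf.taste.impl.recommender.GenericUserBasedRecommender;
    import org.apache.mahout.cf.taste.impl.similarity.PearsonCorrelationSimilarity;
    import org.apache.mahout.cf.taste.model.DataModel;

    public class IRStatsExample {
        public static void main(String[] args) throws Exception {
            // "ratings.csv" is a placeholder path: one userID,itemID,preference per line
            DataModel model = new FileDataModel(new File("ratings.csv"));

            // Recommender under evaluation; the user-based setup and the
            // neighborhood size of 10 are assumptions for the sake of the example.
            RecommenderBuilder builder = dataModel -> {
                PearsonCorrelationSimilarity similarity = new PearsonCorrelationSimilarity(dataModel);
                NearestNUserNeighborhood neighborhood = new NearestNUserNeighborhood(10, similarity, dataModel);
                return new GenericUserBasedRecommender(dataModel, neighborhood, similarity);
            };

            RecommenderIRStatsEvaluator evaluator = new GenericRecommenderIRStatsEvaluator();

            // Precision/recall "at 5": the top 5 recommendations per user are
            // compared against the items held out as relevant for that user.
            IRStatistics stats = evaluator.evaluate(
                    builder, null, model, null, 5,
                    GenericRecommenderIRStatsEvaluator.CHOOSE_THRESHOLD, 1.0);

            System.out.println("precision = " + stats.getPrecision());
            System.out.println("recall    = " + stats.getRecall());
            System.out.println("fallOut   = " + stats.getFallOut());
            System.out.println("nDCG      = " + stats.getNormalizedDiscountedCumulativeGain());
            System.out.println("reach     = " + stats.getReach());
        }
    }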
Upvotes: 2