Reputation: 2016
I'm working on a big-data optimization job. It's a very time-consuming process, so I'd like to save operations wherever possible.
I remember reading something like "division takes much more time than addition",
but is there a chart that could give me a general idea of how many units of time each of these operations takes?
such as:
Addition: 2 units
Subtraction: 5 units
Multiplication: 20 units
Division: 30 units
Greater than: 10 units
Equals: 1 unit
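
(In case a general chart doesn't exist, a crude way to feel out the ratios on one particular machine is a dependent-chain micro-benchmark. The sketch below is only an illustration, not part of the actual workload: the iteration count, operand values, and `volatile` trick are all assumptions, and the `volatile` memory traffic inflates the apparent cost of the cheap operations, so treat the results as rough ratios at best.)

```c
#include <stdio.h>
#include <time.h>

#define ITERS 100000000LL  /* arbitrary; large enough to time reliably */

/* Time one operation repeated in a dependent chain, so latency rather
   than throughput dominates. volatile forces real loads/stores each
   iteration and stops the compiler from folding the loop away. */
static double time_op(char op) {
    volatile unsigned long long x = 1, y = 3;
    clock_t t0 = clock();
    for (long long i = 0; i < ITERS; i++) {
        switch (op) {
            case '+': x = x + y; break;
            case '*': x = x * y; break;          /* unsigned: wraps, no UB */
            case '/': x = x / y + 1; break;      /* +1 keeps x nonzero     */
        }
    }
    return (double)(clock() - t0) / CLOCKS_PER_SEC;
}

int main(void) {
    printf("add: %.2fs  mul: %.2fs  div: %.2fs\n",
           time_op('+'), time_op('*'), time_op('/'));
    return 0;
}
```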
Upvotes: 2
Views: 1164
Reputation: 1755
If you don't know for sure that your workload is execution-bound (i.e. that the bottleneck is the operations you're suggesting here), the first thing to do would be to establish that using a profiling tool like VTune, oprofile, gprof or perfmon.

To elaborate on what Paul A. Clayton is saying, if your workload is "big data", then it's probably more sensitive to the effects of the memory hierarchy than to arithmetic performance. A profiling tool could tell you if this is the case, and whether there's a specific part of the memory hierarchy where you should target your optimization efforts.
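
To make the memory-hierarchy point concrete, here is a minimal C sketch (the array size and stride are arbitrary assumptions, chosen so the array exceeds typical cache sizes): both passes execute exactly the same number of additions over exactly the same elements, yet the strided pass is typically several times slower purely because of cache misses.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)   /* ~16M ints (64 MB), well beyond typical caches */
#define STRIDE 16     /* 64-byte jumps defeat spatial locality          */

static double seconds(void) {
    return (double)clock() / CLOCKS_PER_SEC;
}

int main(void) {
    int *a = malloc(N * sizeof *a);
    if (!a) return 1;
    for (int i = 0; i < N; i++) a[i] = i;

    /* Sequential pass: cache-friendly and prefetcher-friendly. */
    double t0 = seconds();
    long long sum1 = 0;
    for (int i = 0; i < N; i++) sum1 += a[i];
    double t1 = seconds();

    /* Strided pass: same N additions over the same elements, but the
       access order causes far more cache misses. */
    long long sum2 = 0;
    for (int s = 0; s < STRIDE; s++)
        for (int i = s; i < N; i += STRIDE) sum2 += a[i];
    double t2 = seconds();

    printf("sequential: %.3fs  strided: %.3fs  (sums: %lld %lld)\n",
           t1 - t0, t2 - t1, sum1, sum2);
    free(a);
    return 0;
}
```

If the strided version is dramatically slower on your machine, that gap is time spent waiting on memory, not on arithmetic, which is exactly the kind of thing a profiler would surface in a real workload.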
Upvotes: 1