small_potato

Reputation: 3197

CUDA floating point precision

Can someone comment on this?

I want to do a vector dot product. My float vectors are [2080:2131] and [2112:2163]; each contains 52 elements.

a[52] = {2080, 2081, 2082, ... ..., 2129, 2130, 2131};
b[52] = {2112, 2113, 2114, ... ..., 2161, 2162, 2163};

float sum = 0.0f;
for (int i = 0; i < 52; i++)
{
    sum += a[i]*b[i];
}

The sum for the whole length (52 elements) was 234038032 from my kernel, while MATLAB gave 234038038. For sums of 1 to 9 products my kernel agrees with MATLAB; at 10 elements it is off by 1, and the discrepancy gradually grows from there. The results are reproducible. I checked all the elements and found no problem.

Upvotes: 1

Views: 2849

Answers (1)

Tom

Reputation: 21108

Since the vectors are float you are experiencing rounding errors. MATLAB stores everything at much higher precision (double) and hence does not hit the rounding errors as early.

You may want to check out What Every Computer Scientist Should Know About Floating-Point Arithmetic by David Goldberg - invaluable reading.

Simple demo in C++ (i.e. nothing to do with CUDA):

#include <iostream>

int main(void)
{
  float a[52];
  float b[52];
  double c[52];
  double d[52];

  // Fill single- and double-precision copies of the same data.
  for (int i = 0 ; i < 52 ; i++)
  {
    a[i] = (float)(2080 + i);
    b[i] = (float)(2112 + i);
    c[i] = (double)(2080 + i);
    d[i] = (double)(2112 + i);
  }

  // Accumulate the dot product in both precisions.
  float fsum = 0.0f;
  double dsum = 0.0;
  for (int i = 0 ; i < 52 ; i++)
  {
    fsum += a[i]*b[i];
    dsum += c[i]*d[i];
  }

  std::cout.precision(20);
  std::cout << fsum << " " << dsum << std::endl;
}

Run this and you get:

234038032 234038038

So what can you do about this? There are several directions you could go in...

  • Use higher precision: this will affect performance, and not all devices support double precision. It also just postpones the problem rather than fixing it, so I would not recommend it!
  • Do a tree-based reduction: you could combine the techniques in the vectorAdd and reduction SDK samples.
  • Use Thrust: very straightforward.

Upvotes: 11
