Reputation: 387
I am having memory corruption problems when I explicitly dispose IMem objects created by OpenCL.Net (v2.2.9). If I don't call IMem.Dispose(), however, the video card memory is never released.
Unfortunately I have found very few examples on the subject; in the example linked here, only a single call to env.Dispose() is performed.
My code is something like this:
IMem<int> idxsBuffer = context.CreateBuffer(idxs, MemFlags.ReadOnly);
for(int i=0; i<n; i++){
    IMem<float4>[] a = CreateA(); // c_.Length times context.CreateBuffer(a_[...], MemFlags.ReadOnly)
    for(int j=0; j<m; j++){
        b_ = CreateB(i, j);
        c_ = CreateC(i, j);
        for(int k=0; k<o; k++){
            IMem<float4> b = context.CreateBuffer(b_[k], MemFlags.ReadOnly);
            IMem<float> c = context.CreateBuffer<float>(c_[k].Length, MemFlags.WriteOnly);
            kernel.Run(cmdQueue, idxsBuffer, b, a[k], c, [...]);
            float[] u = new float[c_[k].Length];
            cmdQueue.ReadFromBuffer(c, u);
            b.Dispose(); // Causes the problem
            c.Dispose(); // Causes the problem
        }
    }
    foreach(IMem<float4> a_ in a){
        a_.Dispose(); // Causes the problem
    }
}
To avoid the memory corruption I have to comment out the lines marked above, but then the video card memory consumption grows steadily.
Edit: I partially solved the problem by avoiding disposal of the objects during the iterations and reusing the same buffers instead. Still, I can't understand what I was doing wrong.
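For reference, the reuse workaround is roughly the sketch below. It assumes the buffer sizes stay constant (or are bounded) across iterations; maxBLength and maxCLength are hypothetical placeholders, not values from my code, and the WriteToBuffer call and its argument order are an assumption based on the ReadFromBuffer call above, so they may differ in this version of the bindings:

// Allocate the buffers once, before the loops, sized for the largest use.
// maxBLength / maxCLength are placeholders, not values from the original code.
IMem<int> idxsBuffer = context.CreateBuffer(idxs, MemFlags.ReadOnly);
IMem<float4>[] a = CreateA(); // created once and reused across all iterations
IMem<float4> b = context.CreateBuffer<float4>(maxBLength, MemFlags.ReadOnly);
IMem<float> c = context.CreateBuffer<float>(maxCLength, MemFlags.WriteOnly);

for(int i=0; i<n; i++){
    for(int j=0; j<m; j++){
        b_ = CreateB(i, j);
        c_ = CreateC(i, j);
        for(int k=0; k<o; k++){
            // Refill the existing input buffer instead of creating a new one.
            // WriteToBuffer (and its argument order) is an assumption; check your bindings.
            cmdQueue.WriteToBuffer(b_[k], b);
            kernel.Run(cmdQueue, idxsBuffer, b, a[k], c, [...]);
            float[] u = new float[c_[k].Length];
            cmdQueue.ReadFromBuffer(c, u);
            // no per-iteration Dispose()
        }
    }
}

// Release everything once, after all iterations are done.
idxsBuffer.Dispose();
foreach(IMem<float4> a_ in a) a_.Dispose();
b.Dispose();
c.Dispose();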
Upvotes: 0
Views: 128