Reputation: 945
I am working with a very computationally expensive code in Matlab. It requires the use of optimisation techniques and long computations on very large matrices.
I am having the following issue: even though the code runs correctly, at the end of the iterations Matlab is not storing the biggest cell arrays that I have. I guess it is due to some memory inefficiency in my code or to my computer (which is probably not powerful enough). However, I followed all the general suggestions in the Matlab documentation and it is still not working.
Using evalc, I managed to save a separate variable for each iteration of the code, in order to re-create the original matrix at the end of the loop. However, using:
.. done this way it works, but it is still slow and not very "clean and tidy".
Is there a better way to do the same thing (consider that I have to do this for several variables with different names and dimensions), e.g. to update a cell array in a .mat file by adding a column (row or whatever) without loading it?
Thanks
Upvotes: 0
Views: 121
Reputation: 36720
Use matfile
which allows you to write and read parts of a mat file without loading it into memory. A small demonstration:
%initialize matfile
data = matfile('example.mat', 'Writable', true);
n = 10;
%preallocate cell
data.list = cell(n, 1);
for ix = 1:n
    %do some stuff
    result = foo(ix);
    %store the result of this iteration
    data.list(ix, 1) = {result};
end
The line data.list(ix,1)={result}
looks a little odd because matfile
has some limitations when indexing, but its "meaning" is data.list{ix}=result
.
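The same ()-indexing restriction applies when reading a stored element back, and, as far as I know, a matfile-backed variable can also be grown by assigning one index past its current end, which is one way to append a row without loading the whole array. A rough sketch (newResult is a placeholder for your own data):

```matlab
% read one element back: matfile does not support {}-indexing,
% so fetch a 1x1 cell with () and then unwrap it
tmp = data.list(3, 1);   % 1x1 cell array
value = tmp{1};          % the stored content itself

% append a new row without loading the variable:
% query the current size directly from the file, then write past the end
nrows = size(data, 'list', 1);
data.list(nrows + 1, 1) = {newResult};
```

Note that growing a variable stored in a .mat file repeatedly can be slow, so preallocating to the final size, as in the demonstration above, is preferable when the number of iterations is known.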
Upvotes: 3