Reputation: 964
I have a large archive of images from an outdoor camera: close to 200,000 items, each 1280x960 color pixels. I would like to index this database by computing an SVD (eigen-images) of the data and storing a reduced vector for every picture (say, a 100-dimensional vector per image).
Loading all of this data into RAM at once would require about 200 GB. Firstly, I don't have that much RAM; secondly, it won't scale. So I am looking for an implementation of incremental singular value decomposition, which probably exists for libraries like OpenCV or Eigen.
I don't want to reduce the resolution before computing the SVD, because I believe that small details (distant objects resolved at full resolution) may be important to me, and by reducing the resolution I would lose all of the high-frequency features.
Upd:
I found that the neural-network algorithms GHA (Generalized Hebbian Algorithm) or APEX could help here; a minimal sketch of the GHA update is given below.
Yet another algorithm: http://www.cs.technion.ac.il/~mic/doc/skl-ip.pdf
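For reference, the core GHA update (Sanger's rule) fits in a few lines of NumPy. This is only a minimal sketch of the update rule on a stream of mean-centred, flattened images; the learning rate and the `stream_of_centred_images()` generator are hypothetical placeholders, not part of any library.

```python
import numpy as np

def gha_step(W, x, lr=1e-3):
    """One GHA (Sanger's rule) update.

    W : (k, n_pixels) current estimates of the leading eigen-images.
    x : (n_pixels,) one flattened, mean-centred image.
    """
    y = W @ x                                             # project sample onto current components
    W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Usage sketch: stream images one at a time, never holding the archive in RAM.
# W = 0.01 * np.random.randn(100, 1280 * 960 * 3).astype(np.float32)
# for x in stream_of_centred_images():   # hypothetical generator over the archive
#     W = gha_step(W, x)
```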
Upvotes: 1
Views: 367
Reputation: 14530
I haven't seen an implementation using Eigen, but it doesn't seem that difficult to code the same method that scikit-learn uses for incremental PCA.
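If Python is an option, scikit-learn's IncrementalPCA can also be used directly on batches of flattened images. Below is a minimal sketch under the assumption that the archive is a directory of JPEG frames; the directory name, batch size and file pattern are illustrative, not taken from the question.

```python
import numpy as np
from pathlib import Path
from PIL import Image
from sklearn.decomposition import IncrementalPCA

n_components = 100            # target dimensionality per image
batch_size = 128              # images held in RAM at once (must be >= n_components)
image_dir = Path("frames")    # hypothetical location of the archive

ipca = IncrementalPCA(n_components=n_components)
paths = sorted(image_dir.glob("*.jpg"))

def load_flat(path):
    """Load one image as a flattened float32 row (1280*960*3 values in [0, 1])."""
    return np.asarray(Image.open(path), dtype=np.float32).ravel() / 255.0

# Pass 1: learn the eigen-images incrementally, one batch at a time.
for start in range(0, len(paths), batch_size):
    chunk = paths[start:start + batch_size]
    if len(chunk) < n_components:
        break                 # IncrementalPCA needs >= n_components samples per batch
    ipca.partial_fit(np.stack([load_flat(p) for p in chunk]))

# Pass 2: project every image onto the learned components (the 100-d index vectors).
codes = np.vstack([ipca.transform(load_flat(p)[None, :]) for p in paths])
# codes has shape (n_images, 100): small enough to keep in RAM and search directly.
```

The batch size trades RAM for speed: each partial_fit only needs one batch of flattened images plus the current components in memory, so the full 200 GB archive is never loaded at once.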
Upvotes: 1