MaddoScientisto

Reputation: 189

Processing a large number of files, slowed down by I/O: how to make it more efficient?

I've been maintaining an application for a few years, ever since its creator stopped working on it. This application performs batch operations on a large number of pictures, such as resizing, applying text, and adding a logo.

Lately I've been rewriting this application to improve its efficiency and to fully use the system's capabilities, by using all the available memory and splitting the work across multiple threads.

The previous maintainer implemented multithreading in somewhat awful ways, such as running long operations on the UI thread (which rendered the stop button he added inoperable) and using a very small thread pool.

So I've been optimizing the code and experimenting with a larger thread pool, but nothing improves at all: the bottleneck seems to be reading from and writing to disk.

Is there a way to improve this situation?

Upvotes: 1

Views: 351

Answers (1)

usr

Reputation: 171216

Determine exactly what IO the application does at the moment, then try to convert it to sequential access patterns. Random IO can be roughly 100x slower than sequential IO on magnetic disks and 10x slower on SSDs.
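
A quick way to check where the time goes is to time the reads separately from the processing. A minimal sketch (Java here, since the question doesn't name a language; `process` is a placeholder for the real per-image work):

    import java.nio.file.*;

    public class IoTiming {
        public static void main(String[] args) throws Exception {
            long readNanos = 0, processNanos = 0;
            try (DirectoryStream<Path> dir =
                     Files.newDirectoryStream(Paths.get(args[0]), "*.jpg")) {
                for (Path p : dir) {
                    long t0 = System.nanoTime();
                    byte[] data = Files.readAllBytes(p);  // one sequential read per file
                    long t1 = System.nanoTime();
                    process(data);                        // stand-in for resize/text/logo work
                    long t2 = System.nanoTime();
                    readNanos += t1 - t0;
                    processNanos += t2 - t1;
                }
            }
            System.out.printf("read: %d ms, process: %d ms%n",
                    readNanos / 1_000_000, processNanos / 1_000_000);
        }

        private static void process(byte[] data) {
            // placeholder for the actual per-image work
        }
    }

If the read column dominates, more worker threads won't help; the access pattern will.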

If it is a magnetic disk, make sure that only one thread writes, sequentially, at a time. If you're forced to do random IO, tune the queue depth to find the optimal length.
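
One way to structure that is a pipeline: a single thread reads files sequentially, a pool does the CPU-bound image work in memory, and a single dedicated thread writes the results. A sketch under the same assumptions as above (`transform` is a placeholder for the actual resize/text/logo step):

    import java.nio.file.*;
    import java.util.concurrent.*;

    public class Pipeline {
        record Job(Path target, byte[] data) {}

        public static void main(String[] args) throws Exception {
            Path in = Paths.get(args[0]), out = Paths.get(args[1]);
            BlockingQueue<Job> toWrite = new ArrayBlockingQueue<>(64); // bounded: caps memory use
            ExecutorService workers = Executors.newFixedThreadPool(
                    Runtime.getRuntime().availableProcessors());

            Thread writer = new Thread(() -> {
                try {
                    while (true) {
                        Job job = toWrite.take();
                        if (job.data() == null) break;         // poison pill ends the writer
                        Files.write(job.target(), job.data()); // only this thread writes to disk
                    }
                } catch (Exception e) { throw new RuntimeException(e); }
            });
            writer.start();

            try (DirectoryStream<Path> dir = Files.newDirectoryStream(in, "*.jpg")) {
                for (Path p : dir) {
                    byte[] data = Files.readAllBytes(p);       // sequential reads, one file at a time
                    workers.submit(() -> {
                        byte[] result = transform(data);       // CPU-bound work, fully in memory
                        try { toWrite.put(new Job(out.resolve(p.getFileName()), result)); }
                        catch (InterruptedException e) { Thread.currentThread().interrupt(); }
                    });
                }
            }
            workers.shutdown();
            workers.awaitTermination(1, TimeUnit.HOURS);
            toWrite.put(new Job(null, null));                  // signal the writer to finish
            writer.join();
        }

        private static byte[] transform(byte[] data) {
            return data; // placeholder: real code would decode, edit, re-encode
        }
    }

The bounded queue matters: it stops the reader from loading the entire input set into memory whenever the writer falls behind.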

Upvotes: 1
