Anton Stafeyev

Reputation: 831

Node.js CPU-intensive tasks

I recently started developing with Node and ran into a problem. I have a web service that is a raw bank: basically a collection of raw files (photography stuff). Users just upload them and download them, nothing fancy. Recently I came up with the idea of adding a sorting feature based on camera settings: shutter speed, geolocation, f-stop, colors, etc. So upon upload I need to process each raw file, and these are very heavy files, roughly 60-150 MB each, and a user usually uploads 3-4 of them at a time. What would be the best way to process such heavy files without harming server performance?

Upvotes: 4

Views: 1256

Answers (2)

tadman

Reputation: 211540

There's a number of things to consider here:

  • Do you have a good work-queue solution to prioritize jobs and fan them out across multiple worker processes? (A minimal sketch of this idea follows the list.)
  • Do you make use of things like WebWorkers to make each process much more productive on multi-core systems?
  • Do you use compiled libraries to help process the images faster? As KolCrooks says, GPU-accelerated libraries are a huge asset, as they can cut processing time down from minutes to fractions of a second. This is only relevant if your server has adequate GPU resources; "built-in" GPUs rarely suffice.
  • How are you storing and exchanging these images? What network topology can you use? 10 Gbit vs. 1 Gbit could make a huge difference here.
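A minimal sketch of the work-queue idea using Node's built-in worker_threads module. The `./metadata-worker.js` script and the shape of the metadata it posts back are hypothetical placeholders for whatever raw-decoding library you end up using; the point is just to cap concurrency so a few 150 MB uploads can't saturate every core.

```js
// main.js -- small in-process queue that fans jobs out to worker threads.
// './metadata-worker.js' is a hypothetical script: it would decode one raw
// file and postMessage() its metadata back to this thread.
const { Worker } = require('worker_threads');
const os = require('os');

const MAX_WORKERS = Math.max(1, os.cpus().length - 1); // leave a core for the event loop
const queue = [];   // pending raw-file paths
let active = 0;

function enqueue(filePath) {
  queue.push(filePath);
  drain();
}

function drain() {
  while (active < MAX_WORKERS && queue.length > 0) {
    const filePath = queue.shift();
    active++;
    const worker = new Worker('./metadata-worker.js', { workerData: { filePath } });
    worker.once('message', (meta) => {
      // meta would carry shutter speed, f-stop, geolocation, etc. for sorting
      console.log(filePath, meta);
    });
    worker.once('error', (err) => console.error(filePath, err));
    worker.once('exit', () => {
      active--;
      drain(); // pull the next queued file
    });
  }
}

module.exports = { enqueue };
```

For anything beyond a single box, the same pattern maps onto an external queue (Redis-backed queues, RabbitMQ, etc.) with separate worker processes, which also keeps the upload-handling web process responsive.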

Upvotes: 2

KolCrooks

Reputation: 534

If you are doing the raw calculations yourself, you could look into GPU-accelerating them. The best library currently out there for that is https://gpu.rocks/. If you haven't already, also make your server work asynchronously, and even try building it with Node's cluster feature (the closest you can get to multi-threading in JS).
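A minimal sketch of the cluster suggestion, assuming a plain HTTP server on port 3000: one worker process per core shares the listening socket, so a CPU-heavy request only ties up the process handling it.

```js
// cluster-sketch.js -- one worker process per CPU core, all sharing port 3000
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {            // cluster.isPrimary in Node 16+
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();                // spawn a worker process
  }
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died, restarting`);
    cluster.fork();
  });
} else {
  http.createServer((req, res) => {
    // heavy raw-file processing here blocks only this one process;
    // the other workers keep serving requests
    res.end(`handled by pid ${process.pid}\n`);
  }).listen(3000);
}
```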

Upvotes: 2
