Hanuma

Reputation: 41

What is the most efficient way to read heavy files (>~10 GB) using Node.js (LTS)?

Essentially, I need to read the file content, parse each line into a known data structure, perform certain validations, and push the data structure into a database (SQL Server). Today I do this in C# with memory-mapped files, and it works pretty well because I can read the file in chunks, in parallel.

I am planning to migrate the solution to Node.js (and MongoDB) for a business use case.

Any leads/suggestions?

Environment:

I am using 64-bit Windows, an x64-based processor, and 8 GB of RAM.

Upvotes: 3

Views: 4815

Answers (1)

Dinesh Pandiyan

Reputation: 6289

What you're looking for is usually referred to as streams in Node.js.

You can read or write very large files with streams by processing them in small chunks instead of loading the whole file into memory.
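As a minimal sketch (assuming a line-delimited text file; parseLine, validate, and the file path below are placeholders for your own logic), Node's built-in fs and readline modules let you process the file line by line while keeping memory usage flat:

    const fs = require('fs');
    const readline = require('readline');

    // Placeholder parsing and validation. Swap in your own logic.
    const parseLine = (line) => line.split(',');
    const validate = (record) => record.length > 0;

    // Stream the file instead of loading all ~10 GB into memory.
    // highWaterMark controls how many bytes each read pulls in.
    const input = fs.createReadStream('/path/to/huge-file.txt', {
      encoding: 'utf8',
      highWaterMark: 1024 * 1024, // 1 MB per chunk
    });

    const rl = readline.createInterface({ input, crlfDelay: Infinity });

    rl.on('line', (line) => {
      const record = parseLine(line);
      if (validate(record)) {
        // push the record to your database here, ideally in batches
      }
    });

    rl.on('close', () => {
      console.log('Finished reading the file.');
    });

Because the stream only ever buffers highWaterMark bytes at a time, this approach works the same for a 10 GB file as for a 10 MB one.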

Here are a few links that could help you to get started.

Parsing huge logfiles in Node.js - read in line-by-line

Using Node.js to Read Really, Really Large Datasets & Files

Read large text files in nodejs
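Since you mention MongoDB as the target, here is a second sketch (assuming the official mongodb npm driver and a Node LTS release that supports for await...of over readline; the connection string, database/collection names, and parseLine are placeholders) showing how to batch inserts so the read stream is effectively paused while each write is in flight:

    const fs = require('fs');
    const readline = require('readline');
    const { MongoClient } = require('mongodb'); // npm install mongodb

    // Placeholder parser. Swap in your own logic.
    const parseLine = (line) => ({ raw: line });

    async function loadFile(path) {
      const client = await MongoClient.connect('mongodb://localhost:27017');
      const collection = client.db('mydb').collection('records');

      const rl = readline.createInterface({
        input: fs.createReadStream(path, { encoding: 'utf8' }),
        crlfDelay: Infinity,
      });

      let batch = [];
      // for await...of pulls one line at a time, so reading waits
      // (backpressure) while each insertMany is in flight.
      for await (const line of rl) {
        batch.push(parseLine(line));
        if (batch.length >= 1000) {
          await collection.insertMany(batch);
          batch = [];
        }
      }
      if (batch.length > 0) await collection.insertMany(batch);
      await client.close();
    }

    loadFile('/path/to/huge-file.txt').catch(console.error);

Batching like this keeps both Node's memory and the database round-trips bounded, which is the streaming analogue of the chunked reads you describe doing in C#.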

Upvotes: 1
