Reputation: 113
I have to read a large text file and parse it line by line using C#. This is easy with StreamReader
for small files, but with large files it throws an OutOfMemoryException. How can I adapt it for large files?
The following code throws OutOfMemoryException:
string line;
using (StreamReader reader = new StreamReader(FileNameWithPath))
{
    while ((line = reader.ReadLine()) != null)
    {
        // Do something here...
    }
}
Upvotes: 3
Views: 1030
Reputation: 76
How about specifying the buffer size? Like this (note: StreamReader, not StreamWriter, since you are reading):
using (var reader = new StreamReader(path, Encoding.UTF8, true, 1000))
{
    .....
}
Upvotes: 0
Reputation: 4223
using (var inputFile = new System.IO.StreamReader(sourceFilePath))
{
    while (inputFile.Peek() >= 0)
    {
        string lineData = inputFile.ReadLine();
        // Do something with lineData
    }
}
Upvotes: 0
Reputation: 176936
I am not sure about this, but try the MemoryMappedFile class from the .NET Framework:
MemoryMappedFile Class - A memory-mapped file maps the contents of a file to an application's logical address space. Memory-mapped files enable programmers to work with extremely large files because memory can be managed concurrently, and they allow complete, random access to a file without the need for seeking. Memory-mapped files can also be shared across multiple processes.
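A minimal sketch of reading a file through a memory-mapped view, assuming the file fits in the address space and is UTF-8 encoded (the file name here is a placeholder):

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Text;

class MmfLineReader
{
    static void Main()
    {
        // Hypothetical path; substitute your own large file.
        string path = "large.txt";

        using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
        using (var viewStream = mmf.CreateViewStream())
        using (var reader = new StreamReader(viewStream, Encoding.UTF8))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // Do something with line...
            }
        }
    }
}
```

Note that mapping the file does not by itself avoid an OutOfMemoryException if a single line is enormous, since ReadLine still has to materialize that line as a string.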
Upvotes: 1
Reputation: 1063338
That is pretty much the standard code for a lazy line reader, and shouldn't cause an OutOfMemoryException
unless there are some really big single lines. You could also try:
foreach(var line in File.ReadLines(FileNameWithPath)) {
// Do something here...
}
which just makes it cleaner, but does the same thing. So there are two options: either the "Do something here" code is retaining lots of data, or the file has some really big single lines.
I expect the latter is more likely.
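If the file really does contain enormous single lines, ReadLine has to buffer the entire line as one string, which is what runs out of memory. A sketch of an alternative, assuming you can process the text in fixed-size chunks rather than whole lines (file name is a placeholder):

```csharp
using System;
using System.IO;

class ChunkedReader
{
    static void Main()
    {
        char[] buffer = new char[8192];

        using (var reader = new StreamReader("large.txt"))
        {
            int read;
            // Read() fills at most buffer.Length chars per call, so memory
            // use stays bounded regardless of how long each "line" is.
            while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Process buffer[0..read) here, e.g. scan for delimiters,
                // without ever materializing a whole line in memory.
            }
        }
    }
}
```

The trade-off is that your processing code must handle records that span chunk boundaries itself.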
Upvotes: 10