Reputation: 6792
I am new to reading file data in C#. I have a large text file with data separated by commas. I would like to load this file into a DataTable
in my application (or into domain objects), but I get an OutOfMemoryException
when doing so. I have tried a few different approaches, but I always hit the same error. It seems that in order to load the data I need to read it in chunks: process a chunk, store it, fetch the next chunk, process it, and so on.
How can I use StreamReader
to do this? How would I tell StreamReader
where to read up to in the file, and how does it know where to continue from to get the next chunk?
Upvotes: 1
Views: 4193
Reputation: 27009
You can use a paging mechanism in which you read a specified number of lines at a time. Use the File.ReadLines
method, as it only reads lines into memory as you enumerate them. Here is some code:
private static int pageNumber = 0;
private const int PageSize = 10;

public static void Main()
{
    IEnumerable<string> page;
    while ((page = NextPage()).Any())
    {
        // Here the page is simply copied into a List,
        // but you could add its rows to your DataTable instead.
        List<string> lines = page.ToList();
        // Do processing
    }
    Console.Read();
}

private static IEnumerable<string> NextPage()
{
    // Skip/Take re-enumerates the file lazily, so at most PageSize
    // lines are materialized in memory at any one time.
    IEnumerable<string> page = File.ReadLines("Path")
        .Skip(pageNumber * PageSize).Take(PageSize);
    pageNumber++;
    return page;
}
Upvotes: 1
Reputation: 2700
You can use a BufferedStream to read the data in chunks. Here is the code:
private void ReadFile(string filePath)
{
    const int MAX_BUFFER = 20971520; // 20MB - the chunk size read from the file
    byte[] buffer = new byte[MAX_BUFFER];
    int bytesRead;

    using (FileStream fs = File.Open(filePath, FileMode.Open, FileAccess.Read))
    using (BufferedStream bs = new BufferedStream(fs))
    {
        // Read up to 20MB at a time; Read may return fewer bytes than requested.
        while ((bytesRead = bs.Read(buffer, 0, MAX_BUFFER)) != 0)
        {
            // The first bytesRead bytes of buffer contain the chunk data.
            // Change MAX_BUFFER above to adjust the chunk size.
        }
    }
}
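One caveat: a fixed-size byte chunk can end in the middle of a line, so you would have to stitch the partial line back together yourself. Since the file is comma-separated text, a StreamReader that batches whole lines avoids that problem and directly answers the original question: StreamReader tracks its own position in the file, so each ReadLine call continues exactly where the previous one stopped. A sketch, where chunkSize and the processing steps are placeholders:

```csharp
using System.Collections.Generic;
using System.IO;

static void ProcessInChunks(string filePath, int chunkSize)
{
    using (var reader = new StreamReader(filePath))
    {
        var chunk = new List<string>(chunkSize);
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            chunk.Add(line);
            if (chunk.Count == chunkSize)
            {
                // Process and store this chunk, then discard it
                // so memory use stays bounded.
                chunk.Clear();
            }
        }
        if (chunk.Count > 0)
        {
            // Process the final partial chunk.
        }
    }
}
```

There is no need to tell StreamReader where to read up to; you simply stop pulling lines when your chunk is full and resume on the next iteration.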
Upvotes: 2