Reputation: 3679
I want to read the text of multiple files into a string using a StreamReader, but right now my code creates a new StreamReader every time I want to read a different file. This works, but when repeated hundreds of times it seems highly inefficient. I am sure this is an easy fix, but I am still new to programming, so a bit of knowledge would be much appreciated.
An example of the inefficient code I would like to avoid:
someloop
{
    using (StreamReader sR = new StreamReader(changingfilename))
    {
        textString = sR.ReadToEnd();
    }
}
Upvotes: 2
Views: 4013
Reputation: 4517
Reading from disk is comparatively very slow; the instantiation of a StreamReader is negligible in comparison. The slow part is ReadToEnd(), not new StreamReader(). Try to avoid reading from disk if your code is too slow.
Some versions of Visual Studio ship with a profiler that will tell you exactly which methods are consuming CPU/clock time in your program.
Upvotes: 2
Reputation: 16812
A better way would be to simply let .NET handle this for you and use the File.ReadAllText(yourFileNameAndPath) static method.
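For example, assuming the file paths are collected in a list (the variable names here are illustrative, not from your code), the whole loop collapses to:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text;

// fileNames is assumed to hold the paths you were looping over.
var fileNames = new List<string> { "File1.txt", "File2.txt" };

var contents = new StringBuilder();
foreach (string fileName in fileNames)
{
    // File.ReadAllText opens, reads, and closes the file for you,
    // so no explicit StreamReader is needed.
    contents.Append(File.ReadAllText(fileName));
}
string textString = contents.ToString();
```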
Upvotes: 0
Reputation: 1563
Well, you do have to create a StreamReader for each file. What you can do is put the file names in a list and read them like this:
List<string> lstFileNames = new List<string>();
StringBuilder stringBuffer = new StringBuilder();
lstFileNames.Add("File1");
lstFileNames.Add("File2");
for (int i = 0; i < lstFileNames.Count; i++)
{
    using (StreamReader sR = new StreamReader(lstFileNames[i]))
    {
        stringBuffer.Append(sR.ReadToEnd());
    }
}
Use a StringBuilder for appending strings.
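If the files are small, the same idea can be sketched in one line with LINQ and File.ReadAllText (again assuming the lstFileNames list above):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;

var lstFileNames = new List<string> { "File1", "File2" };

// Each File.ReadAllText call opens, reads, and closes a file;
// string.Concat joins all of the results into one string.
string allText = string.Concat(lstFileNames.Select(File.ReadAllText));
```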
Upvotes: 0