BillyJean

Reputation: 1587

proper use of streams in a computationally intensive program

I have a program that may take up to 3-4 hours to finish. Along the way I need to write various pieces of information to a general file, "info.txt". Here is how I do it currently:

#include <fstream>
#include <time.h>   // for _strdate/_strtime (MSVC-specific)

using namespace std;

char dateStr[9];
char timeStr[9];
_strdate(dateStr);   // current date as "MM/DD/YY"
_strtime(timeStr);   // current time as "HH:MM:SS"

ofstream infoFile("info.txt", ios::out);
infoFile << "foo @ " << timeStr << " , " << dateStr << endl;
infoFile.close();

I do this five times during a single run. My question is the following: which is more proper (efficiency-wise and standards-wise)?

  1. Closing infoFile after each output (and, consequently, using five ofstreams infoFile1, infoFile2, ..., infoFile5, one for each time I output), or
  2. using only one infoFile and, consequently, keeping it open during the entire run?

EDIT: By "a single run" I mean a single run of the program. So by "five times during a single run" I mean that I output something to info.txt when running the program once (which takes 3-4 hours).
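For concreteness, here is a rough sketch of the two options (the log function names are just illustrative). Note that reopening with plain ios::out truncates the file each time, so the sketch of option 1 uses ios::app to keep the earlier lines:

#include <fstream>

// Option 1: reopen (in append mode) for every message.
void logOption1(const char* msg) {
    std::ofstream infoFile("info.txt", std::ios::app);
    infoFile << msg << std::endl;
}   // the destructor flushes and closes the stream here

// Option 2: one stream, kept open for the entire run.
std::ofstream infoFile("info.txt", std::ios::out);

void logOption2(const char* msg) {
    infoFile << msg << std::endl;   // std::endl flushes after each write
}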

Upvotes: 0

Views: 94

Answers (3)

eladidan

Reputation: 2644

This is a clear case of premature optimization.

It makes no measurable difference to the performance of your application which approach you take, since this is something that happens only 5 times over the course of several hours.

Profile your application, as the previous answer suggested, and use that to identify the REAL bottlenecks in your code.

The only case I can think of where it would matter is if you wanted to prevent info.txt from being deleted or edited while your application is running; in that case you'd want to keep the stream alive. Otherwise it doesn't matter.

Upvotes: 0

James Kanze

Reputation: 153919

It's not really clear what you're trying to do. If the code you post does what you want, it's certainly the best solution. If you want the values appended, then you might want to keep the file open.

Some other considerations:

  • Unless you close the file or flush the data, external programs may not see the data immediately.

  • When you open the file, any existing file with that name will be truncated: an external program which tries to read the file at precisely this moment won't see anything.

  • Flushing after each output (automatic if you use std::endl), and seeking to the start before each output, will solve the previous problem (and if the data is as small as it seems, the write will be atomic), but could result in misleading data if the values written have different lengths: the file length will not be shortened. (Probably not the case here, but something to be considered; see the sketch after this list.)
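A minimal sketch of that last point: keep one stream open, seek back to the start, and flush on every write (writeStatus is just an illustrative name):

#include <fstream>

std::ofstream infoFile("info.txt");    // opened once; truncates any old file

void writeStatus(const char* status) {
    infoFile.seekp(0);                 // rewind to the start before writing
    infoFile << status << std::endl;   // std::endl flushes the buffer
    // Caveat: if a previous status line was longer, its tail remains in
    // the file, because the file is never shortened.
}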

With regards to performance: you're talking about an operation which lasts at most a couple of milliseconds, and takes place once or twice an hour. Whether it takes one millisecond, or ten, is totally irrelevant.

Upvotes: 2

dutt

Reputation: 8209

First: get numbers before optimizing; use a profiler. Then you know which parts take the most time. If you don't have a profiler, think a bit before doing anything. How many runs will you do during those 3-4 hours? If it's only a few, things that happen once per run are probably not good targets for optimization; if it's lots and lots of runs, those parts can be worth considering as well, since disk access can be rather slow.

With that said, I've saved a bit of time in previous projects by reusing streams instead of repeatedly opening and closing them.
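As a rough sketch of what that reuse can look like (the InfoLog class is just an illustration, not code from the question):

#include <fstream>

class InfoLog {
    std::ofstream out_;
public:
    InfoLog() : out_("info.txt") {}   // open once for the whole run

    void write(const char* msg) {
        out_ << msg << '\n';
        out_.flush();                 // make the line visible right away
    }
};                                    // the stream is closed once, in ~InfoLog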

Upvotes: 3
