Reputation: 1
I started out just running some tests to see the speed difference between writing to a file and printing to the console, and how much of a difference there was between an SSD and an HDD. My program just prints the numbers 0-10,000,000, and I got these times:
Console: 6.089
File: 4.269
I also ran this test up to a hundred million and consistently saw the same ratio of times. Changing the order of the two tests made no difference either.
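For reference, the test looks roughly like the sketch below (the output file name, the PrintWriter setup, and the nanoTime-based timing are illustrative assumptions rather than my exact code):

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    public class PrintBenchmark {
        public static void main(String[] args) throws IOException {
            final int limit = 10_000_000;

            // Time printing 0..limit to the console.
            long start = System.nanoTime();
            for (int i = 0; i < limit; i++) {
                System.out.println(i);
            }
            long consoleNanos = System.nanoTime() - start;

            // Time printing 0..limit to a file ("numbers.txt" is a placeholder path).
            start = System.nanoTime();
            try (PrintWriter file = new PrintWriter(new FileWriter("numbers.txt"))) {
                for (int i = 0; i < limit; i++) {
                    file.println(i);
                }
            }
            long fileNanos = System.nanoTime() - start;

            // Report on stderr so the measurements don't mix with the test output.
            System.err.println("Console: " + consoleNanos / 1e9);
            System.err.println("File:    " + fileNanos / 1e9);
        }
    }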
Here's where it gets weird. I changed both printlns to .println(i*i+42/7*9-89*2%400/2);
After doing this I got:
Console: 8.586
File: 4.475
The console time increased significantly, but the file time did not. As a final oddity, I changed it to .println((i*i+42/7*9-89*2)%400/2) and in this case I actually saw a speed-up in console output:
Console: 4.352
File: 4.66
Can anyone explain these oddities? I can't seem to find any reason for the drastic speed changes. I'm thinking perhaps it's just a change in the number of bits that have to be written, but I cannot explain why it only affects the console's speed.
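To put a rough number on that hunch, here is a small sketch (not my actual benchmark) that only counts how many characters each variant would print. Because of operator precedence, the un-parenthesized expression works out to i*i - 35 (42/7*9 is 54 and 89*2%400/2 is 89), which quickly grows to ten or eleven characters per line, while the parenthesized version stays in the range -199 to 199 and prints only a few characters per line:

    public class OutputLengthCheck {
        public static void main(String[] args) {
            final int limit = 10_000_000;
            long withoutParens = 0;
            long withParens = 0;
            for (int i = 0; i < limit; i++) {
                // Equivalent to i*i - 35; note i*i overflows int for i > 46340,
                // just as it does in the original loop.
                withoutParens += String.valueOf(i*i+42/7*9-89*2%400/2).length();
                // Equivalent to (i*i - 124) % 400 / 2, always a small value.
                withParens += String.valueOf((i*i+42/7*9-89*2)%400/2).length();
            }
            System.out.println("Characters without parentheses: " + withoutParens);
            System.out.println("Characters with parentheses:    " + withParens);
        }
    }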
Any help or answers are very much appreciated! This problem has been bothering me for a while, so I thought I would ask the experts!
Upvotes: 0
Views: 169
Reputation: 938
Here is an explanation of why printing to a console is slower than printing to a file (taken from "Why is System.out.println so slow?").
println itself is not slow; it is the underlying PrintStream connected to the console, provided by the hosting operating system, that is slow: console output is flushed line by line and has to be rendered by the terminal, while file output simply lands in the operating system's write cache.
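One quick experiment you could try (just a sketch, with the loop bound picked to match your test) is to put your own buffer in front of System.out, so that each println no longer pushes a line straight at the console:

    import java.io.BufferedWriter;
    import java.io.OutputStreamWriter;
    import java.io.PrintWriter;

    public class BufferedConsole {
        public static void main(String[] args) {
            // Wrap System.out in a buffered writer; println then fills the buffer
            // instead of forcing a write to the console on every line.
            PrintWriter out = new PrintWriter(
                    new BufferedWriter(new OutputStreamWriter(System.out)), false);
            for (int i = 0; i < 10_000_000; i++) {
                out.println(i);
            }
            out.flush(); // push out whatever is still sitting in the buffer
        }
    }

If the console time drops a lot with this change, the cost is mostly in the per-line hand-off to the console rather than in converting the numbers to text.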
I can't really explain the sudden increase in speed caused by the different calculations. I was initially thinking of the casting from int to double, etc., but that should apply to both printlns.
Is the result that is printed longer than the width of your console? If it is, the longer lines wrap, which gives the terminal more rendering work per println.
Upvotes: 1