Reputation: 4808
Example 1:
console.log('Starting app');
setTimeout(() => {
  console.log('callback 1');
}, 2000);
sleep(4000);
setTimeout(() => {
  console.log('callback 2');
}, 1000);
console.log('Finishing up');

function sleep(milliseconds) {
  var start = new Date().getTime();
  for (var i = 0; i < 1e7; i++) {
    if ((new Date().getTime() - start) > milliseconds) {
      break;
    }
  }
}
// Its output:
Starting app
Finishing up
callback 1
callback 2
Example 2:
console.log('Starting app');
setTimeout(() => {
  console.log('callback 1');
}, 2000);
sleep(4000);
setTimeout(() => {
  console.log('Callback 2');
}, 0);
console.log('Finishing up');

function sleep(milliseconds) {
  var start = new Date().getTime();
  for (var i = 0; i < 1e7; i++) {
    if ((new Date().getTime() - start) > milliseconds) {
      break;
    }
  }
}
// Its output:
Starting app
Finishing up
Callback 2
callback 1
I am trying to understand whether the timers of two async functions start running simultaneously or one after the other.
Example 1
This works as I expected. Callback 1 reaches the queue first because it has a timeout of 2 seconds, and there is a 4-second delay before Callback 2 is scheduled.
Example 2
This does not work as I expected. Callback 1 should still reach the queue first, since it has a timeout of 2 seconds and there is a 4-second delay before Callback 2 is scheduled.
Upvotes: 0
Views: 71
Reputation: 10899
It appears the issue is the assumption that the loop in the sleep function won't finish before the timeout.
console.log('Starting app');
sleep(4000);
console.log('Finishing app');

function sleep(milliseconds) {
  var start = new Date().getTime();
  for (var i = 0; i < 1e7; i++) {
    if ((new Date().getTime() - start) > milliseconds) {
      console.log('breaking');
      break;
    }
  }
  console.log(`Sleep time: ${new Date().getTime() - start}ms`);
}
The time the loop takes to complete depends on the computer it runs on, but on most modern machines the break is never reached: the loop is capped at 1e7 iterations, and those finish faster than the time allocated. This creates a range of "sleep" times, which can lead to different outputs on different machines (and even on the same machine between runs, if the deltas are small enough).
For instance, on one of my machines this code takes ~800ms to complete (a range of 750ms to 830ms), much shorter than the 4000ms specified.
So in the first test example the code would run like so on my computer:
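~0ms: "Starting app" is logged; callback 1's 2000ms timer starts (due at ~2000ms)
~0ms to ~800ms: the sleep loop runs and exits early
~800ms: callback 2's 1000ms timer starts (due at ~1800ms); "Finishing up" is logged
~1800ms: "callback 2" is printed
~2000ms: "callback 1" is printed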
This results in "callback 2" being printed at ~1800ms and "callback 1" being printed at ~2000ms. Given how close those are, I would venture that the machine you are using executes the loop a bit more slowly, so you observed the output the way you have indicated.
With the second test example, the code would execute as such on my computer:
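~0ms: "Starting app" is logged; callback 1's 2000ms timer starts (due at ~2000ms)
~0ms to ~800ms: the sleep loop runs and exits early
~800ms: callback 2's 0ms timer starts (due immediately); "Finishing up" is logged
~800ms: "Callback 2" is printed
~2000ms: "callback 1" is printed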
This results in "callback 2" being printed at ~800ms and "callback 1" being printed at ~2000ms. Since your computer is fast enough to finish the loop in under 4s, you now observe the "unexpected" output as stated in your question.
So, basically, the loop you are using to "sleep" is not providing the wait time you are basing your expectations on.
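If you want a busy-wait that actually blocks for the full duration (for experiments like this only; blocking the event loop is a bad idea in real code), a minimal sketch is to loop on the clock itself rather than capping the loop at 1e7 iterations:

function sleep(milliseconds) {
  var start = Date.now();
  // Keep spinning until the requested time has actually elapsed,
  // instead of giving up after a fixed number of iterations.
  while (Date.now() - start < milliseconds) {
    // intentionally empty: this busy-wait blocks the event loop
  }
}

With this version, Example 2 prints "callback 1" before "Callback 2": by the time sleep(4000) returns, callback 1's 2000ms timer has already expired and is waiting in the queue, while callback 2's 0ms timer is only scheduled afterwards.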
Upvotes: 1