tria1312

Reputation: 173

Difference between latency and jitter in operating systems

When discussing criteria for operating systems, I keep hearing the terms interrupt latency and OS jitter, and now I ask myself: what is the difference between the two?

In my opinion, interrupt latency is the delay from the occurrence of an interrupt until the interrupt service routine (ISR) is entered. Jitter, on the other hand, is the amount by which the moment of entering the ISR varies over time.

Do you see it the same way?

Upvotes: 8

Views: 12234

Answers (2)

slebetman

Reputation: 114094

Your understanding is basically correct.

Latency = Delay between an event happening in the real world and code responding to the event.

Jitter = Differences in Latencies between two or more events.
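As a small illustration of these two definitions (the latency values below are made up, not measured on any real system), latency is a per-event quantity, while jitter is a property of a set of latencies, e.g. their peak-to-peak spread:

```python
# Hypothetical per-event interrupt latencies, in microseconds:
# the delay from the hardware signal to entry into the ISR.
latencies_us = [12.0, 15.5, 11.8, 14.2]

# Each element above is a latency. Jitter is the variation between
# them; one common way to express it is the peak-to-peak spread.
jitter_us = max(latencies_us) - min(latencies_us)

print(f"worst-case latency: {max(latencies_us)} us")
print(f"peak-to-peak jitter: {jitter_us:.1f} us")
```

Other definitions of jitter (standard deviation, deviation from an expected period) are also used; peak-to-peak is simply the easiest to reason about for worst-case analysis.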

Upvotes: 18

prathmesh.kallurkar

Reputation: 5696

In the realm of clustered computing, especially when dealing with massive scale-out solutions, there are cases where work distributed across many systems (and many, many processor cores) needs to complete in fairly predictable time frames. An operating system, and the software stack being leveraged, can introduce some variability in the run-times of these "chunks" of work. This variability is often referred to as "OS Jitter". link

Interrupt latency, as you said, is the time between the interrupt signal and entry into the interrupt handler.

The two concepts are orthogonal to each other. In practice, however, more interrupts generally implies more OS jitter.
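A rough way to see OS jitter in the sense described above is to time repeated runs of an identical chunk of work and look at the spread of the run-times. This is only a sketch (the loop size and repetition count are arbitrary choices, and the numbers you see will depend heavily on your machine and OS load):

```python
import time
import statistics

def chunk():
    # A fixed piece of work; in an ideal, jitter-free system its
    # run-time would be the same on every execution.
    total = 0
    for i in range(100_000):
        total += i
    return total

# Time the same chunk several times with a monotonic clock.
runtimes = []
for _ in range(10):
    start = time.perf_counter()
    chunk()
    runtimes.append(time.perf_counter() - start)

# The spread between identical runs is a crude proxy for OS jitter:
# scheduling, interrupts, and other OS activity cause the variation.
print(f"mean run-time:  {statistics.mean(runtimes):.6f} s")
print(f"spread (jitter): {max(runtimes) - min(runtimes):.6f} s")
```

Dedicated tools such as cyclictest measure this far more carefully (pinned CPUs, real-time priorities, high-resolution timers), but the idea is the same: identical work, variable completion times.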

Upvotes: 1
