Just a viewer

Reputation: 11

How do operating systems handle multiple devices at once?

What are some of the challenges that are faced by an operating system when there are multiple devices working simultaneously and what can be used to help determine the order in which these devices are handled?

Upvotes: 0

Views: 263

Answers (1)

Brendan

Reputation: 37252

For device drivers and devices, typically (as a basic/simplified model; a rough sketch in C follows the list):

  • something (using a CPU) asks the device driver to ask its device to do some kind of job, causing the device driver's code to run
  • the device driver's code checks if the device is currently busy; if it is busy, the driver puts the new job on some kind of "queue of pending jobs" and returns. If the device is not busy, the device driver tells its device what to do and then returns. Either way the device driver returns very quickly (the CPU goes back to running normal code, etc.)
  • then (later) the device sends an IRQ back to say that it (successfully or unsuccessfully) completed doing what it was told, causing a CPU to run the device driver's code again
  • then the device driver's code checks the status of the previous job and does whatever actions that requires (informing whatever requested the job of its status, transferring any received data, etc.)
  • then the device driver checks if there's another job on the "queue of pending jobs". If there is one it's removed and the driver asks the device to do that job before returning from the IRQ. Either way, the driver returns from the IRQ (the CPU goes back to running normal code again)
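As a rough sketch in C (the names like driver_submit, hw_start and the job struct are made up for illustration, not taken from any real driver API, and the locking/IRQ masking a real driver needs around the queue is omitted), the pattern above might look something like this:

```c
#include <stddef.h>

struct job {
    struct job *next;
    void (*on_done)(struct job *job, int status);   /* notify whoever asked for the job */
    /* ... command, buffers, etc. ... */
};

static struct job *active_job;                /* job currently running on the device */
static struct job *queue_head, *queue_tail;   /* the "queue of pending jobs" (FIFO)  */

/* Placeholders for real hardware access (programming the device's registers). */
static void hw_start(struct job *job) { (void)job; /* write command registers here  */ }
static int  hw_read_status(void)      { return 0;  /* read the status register here */ }

static void start_job(struct job *job)
{
    active_job = job;
    hw_start(job);
}

/* "Something asks the device driver to ask its device to do a job." */
void driver_submit(struct job *job)
{
    job->next = NULL;
    if (active_job) {                          /* device busy: queue the job and return  */
        if (queue_tail) queue_tail->next = job; else queue_head = job;
        queue_tail = job;
    } else {                                   /* device idle: start the job immediately */
        start_job(job);
    }
    /* Either way we return quickly; the CPU goes back to running normal code. */
}

/* Called when the device raises its IRQ to say the current job finished. */
void driver_irq_handler(void)
{
    int status = hw_read_status();             /* did it succeed or fail?           */
    struct job *done = active_job;
    active_job = NULL;
    done->on_done(done, status);               /* inform whatever requested the job */

    if (queue_head) {                          /* more work queued? keep device busy */
        struct job *next = queue_head;
        queue_head = next->next;
        if (!queue_head) queue_tail = NULL;
        start_job(next);
    }
    /* Return from the IRQ; the CPU resumes normal code. */
}
```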

The important part here is that the device driver only uses a tiny amount of CPU time to manage the device (while the device spends most of its time doing what it's told to do without using any CPU); and almost all of the CPU's time can be spent doing other things (including spending a little time to manage many other devices and a lot of time running normal user-space code). This means that many different devices can be doing useful things simultaneously, while the CPU/s spend most of their time executing normal code.

Typically there are only 2 things that manage the order. The first is how the device driver's "queue of pending jobs" is designed. It can be a simple "first come first served" FIFO queue; but often it's something more complex involving "IO priorities" (e.g. to ensure that reading urgently needed data from swap space occurs before prefetching data that isn't actually needed yet from the same device).
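For example (purely illustrative, with made-up priority names), a priority-ordered version of the "queue of pending jobs" might insert new jobs like this instead of always appending to the tail:

```c
/* Lower number = more urgent; equal priorities keep first-come-first-served order. */
enum io_priority { IO_PRIO_URGENT = 0, IO_PRIO_NORMAL = 1, IO_PRIO_PREFETCH = 2 };

struct prio_job {
    struct prio_job *next;
    enum io_priority prio;
    /* ... command, buffers, completion callback, etc. ... */
};

static struct prio_job *prio_queue_head;

/* Insert so the queue stays sorted by priority; an urgent swap-in read queued
 * now will be started before a prefetch that was queued earlier. */
void queue_insert(struct prio_job *job)
{
    struct prio_job **p = &prio_queue_head;
    while (*p && (*p)->prio <= job->prio)
        p = &(*p)->next;
    job->next = *p;
    *p = job;
}
```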

The second thing used to manage the order is IRQ priorities (for when multiple devices generate an IRQ at the same/similar time). This can be either "first come first served" (nothing managing the order that IRQs are handled), or can involve hardware support and OS cooperation.
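To illustrate the idea only (in real systems the interrupt controller's hardware priority scheme, plus how the OS configures it, does most of this work), a purely software dispatcher that always services the highest-priority pending IRQ first might look like:

```c
#include <stdint.h>

#define NUM_IRQS 32

typedef void (*irq_handler_t)(int irq);

static irq_handler_t irq_handlers[NUM_IRQS];   /* per-device driver handlers   */
static uint32_t pending_irqs;                  /* bit N set = IRQ N is pending */

/* Assumed scheme: lower IRQ number = higher priority. */
void dispatch_pending_irqs(void)
{
    while (pending_irqs) {
        int irq = __builtin_ctz(pending_irqs); /* lowest set bit = highest priority */
        pending_irqs &= ~(1u << irq);
        if (irq_handlers[irq])
            irq_handlers[irq](irq);            /* e.g. driver_irq_handler() above   */
    }
}
```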

WARNING 1: This was a basic/simplified model. In real systems there are more complications (e.g. jobs that are already on a device driver's "queue of pending jobs" can be cancelled, the device driver has to do power management, the device may be unpluggable, etc.); and some devices have their own internal queues and/or are able to perform multiple jobs simultaneously themselves.

WARNING 2: This was "typical for a modern OS". Operating systems are different, and some (e.g. MS-DOS from 30 years ago) might do none of the above (and might not allow multiple devices to be used simultaneously at all).

Upvotes: 2
