Reputation: 3445
How much hardware understanding does one need to fully comprehend "Operating System" and "Computer Architecture" courses one takes as a computer science student?
Upvotes: 4
Views: 7219
Reputation: 115
A good example is the BIOS of the IBM PC 5150, arguably the first PC. The BIOS is itself a program (and in a sense a primitive operating system) hardcoded into one of the ROMs on the motherboard; its roughly 5,000 lines of assembly test the hardware and implement basic services, famously exposed as interrupts. You can only really understand that code if you understand the underlying hardware, and not just deeply but in context.
For instance, you could learn the 8237 DMA controller from its datasheet, which seems like the best place to understand a chip. But when you look at it on the motherboard, you see that some of its behaviour has been changed for more control: the AEN pin, even though it is normally compulsory, isn't used as documented, because the IBM PC 5150 designers decided to implement that pin's functionality another way.
This means that to understand any OS, it's better to first understand the lowest-level program, the BIOS; to understand the BIOS you need to understand the underlying hardware; and that hardware has to be understood in context.
Upvotes: 0
Reputation: 19765
Two thoughts:
First, everything is going parallel. Multi-threading is one thing, multi-core is another. There are oodles of issues around caching, memory architecture, resource allocation, etc. Many of these are 'handled' for you, but the more you know about the metal, the better.
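To make the shared-state point concrete, here's a minimal Python sketch (my own illustration, not from the answer) of why parallel updates need coordination: several threads bump one counter, guarded by a lock.

```python
import threading

counter = 0
lock = threading.Lock()

def bump(n):
    global counter
    for _ in range(n):
        # Without the lock, `counter += 1` is a load/add/store sequence,
        # and a thread switch in the middle can lose an update.
        with lock:
            counter += 1

threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000
```

Delete the `with lock:` line and, depending on interpreter and timing, the final count can come up short.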
Second, number representation in hardware. This is as old as computer science itself, but it still trips everyone up. Not sure who said this, but it's perfect: "Mapping an infinity of numbers onto a finite number of bits involves approximations." Understanding this, and numerical analysis in general, will save your bacon time and again. Serialization and endianness, too.
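Both points are easy to see from Python (a quick illustration of my own, not from the answer): floating-point approximation, and byte order when serializing.

```python
import math
import struct

# 0.1 and 0.2 have no exact binary representation, so the sum drifts.
print(0.1 + 0.2 == 0.3)              # False
print(0.1 + 0.2)                     # 0.30000000000000004
print(math.isclose(0.1 + 0.2, 0.3))  # True -- compare with a tolerance instead

# Endianness: the same 32-bit integer, serialized two ways.
print(struct.pack('<I', 1))  # b'\x01\x00\x00\x00' (little-endian)
print(struct.pack('>I', 1))  # b'\x00\x00\x00\x01' (big-endian)
```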
Besides, it's fun!
Upvotes: 2
Reputation: 7168
It helps when you are trying to optimize for the hardware you are targeting. Take a hard drive, for example: it helps to write software that takes advantage of locality to minimize seek time. If you just treat a hard drive as 'it works' and stick files and data all over the place, you will run into severe fragmentation and end up with lower performance.
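The same locality idea applies to memory access order. Here's a rough sketch of my own (not from the answer): summing a 2D structure along its storage order versus against it. In Python the gap is muted by pointer indirection; with flat arrays in C the difference is dramatic, but the access-order idea is the same.

```python
import time

N = 1000
grid = [[1] * N for _ in range(N)]  # rows are contiguous lists

def sum_row_major():
    # Walks each row in order -- friendly to caching.
    return sum(grid[i][j] for i in range(N) for j in range(N))

def sum_col_major():
    # Jumps between rows on every step -- poor locality.
    return sum(grid[i][j] for j in range(N) for i in range(N))

for fn in (sum_row_major, sum_col_major):
    t0 = time.perf_counter()
    total = fn()
    print(f"{fn.__name__}: {total} in {time.perf_counter() - t0:.3f}s")
```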
A lot of this is taken into consideration when designing an operating system since you are trying to maximize performance. So in short, learning something about it can help, and certainly can't hurt in any way.
Upvotes: 2
Reputation: 5784
A good way to determine a baseline knowledge set for hardware knowledge generally needed for Comp Sci studies is to visit the curriculum websites of a wide range of prestigious universities. For me, I'd check the Comp Sci curriculum at MIT, Stanford, University of Illinois at Urbana/Champaign (UIUC), Georgia Tech, etc. Then I'd get an average understanding from that.
Furthermore, you could also personally phone up guidance counselors at Universities to which you are either attending or applying in order to get a personalized view of your needs. They would be available to guide you based on your desires. Professors even more. They are surprisingly accessible and very willing to give feedback on things like this.
Recently, I looked at pursuing my master's degree. As an alum of UIUC, I emailed a few old professors there and told them of my interest. I asked them several questions aimed at understanding grad school and their perspective. They shared, and most invited me to call and chat.
Personally, I'd agree with @CookieOfFortune. The more you know about how a computer works internally, the more you can use that to your advantage while writing software. That said, it's not as if you really need to understand the physics of electronics to a high degree. It's interesting, sure, but your focus should be on circuitry, logic, etc. Much of this should be presented in a good Operating Systems course or at least provide you with springboards to learn more on your own.
Upvotes: 1
Reputation: 46098
At the very basic level, you should know about the von Neumann architecture and how it maps onto real-life computers. Above that, the more the better. And not just the OS: in garbage-collected and VM languages, know how the heap, stack, and instructions work and are executed, so you know what will perform badly and how to improve it to get the best out of the architecture.
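One cheap way to see "how instructions work and are executed" in a VM language is to disassemble a function. A small sketch of my own using CPython's standard `dis` module:

```python
import dis

def add(a, b):
    return a + b

# List the VM instructions CPython will actually execute for add().
ops = [ins.opname for ins in dis.Bytecode(add)]
print(ops)

# Locals load via LOAD_FAST-style opcodes (an array index into the frame),
# which is one reason local variables are faster than globals in CPython.
```

The exact opcode names vary between CPython versions, but the load/operate/return shape is always visible.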
Upvotes: 4
Reputation: 391820
"Computer science is no more about computers than astronomy is about telescopes."
Upvotes: 2
Reputation: 13974
At that level, the more you know the better, but the bare necessity for computer architecture is boolean logic design. Understand how to design registers, adders, multiplexers, flip-flops, etc. from basic logic units (AND, OR, clocks). You can probably understand operating systems starting from a basic understanding of assembly, memory-mapped I/O, and interrupts.
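As a toy illustration of that kind of logic design (my own sketch, not from the answer), here's a ripple-carry adder built from simulated gates in Python, with bits as 0/1 integers:

```python
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Two bits in, (sum, carry) out."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Chain two half adders and OR their carries."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def ripple_carry_add(x, y, width=4):
    """Add two integers bit by bit, the way a ripple-carry adder does."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(ripple_carry_add(5, 6))  # 11
```

The carry "rippling" from bit to bit is also why real ALUs use tricks like carry-lookahead: the naive chain is slow for wide words.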
EDIT: I'm not certain what you mean by "hardware", do you consider logic design to be hardware? Or were you talking about transistors? I suppose it wouldn't hurt to understand the basics of semiconductors, but architecture is abstracted above the real hardware level. I would also say that operating systems are abstracted above the architecture.
Upvotes: 5