Hardware


What to watch for

After completing this lesson, you’ll be able to:

  • Define essential terms related to computer hardware
  • Describe in broad strokes what different pieces of computer hardware do and how they do it
  • Construct a mental model of how a computer stores data

We’re going to start our discussion of the building blocks of new media with hardware for one simple reason: hardware is tangible. Unlike so many things we’ll discuss in this course, you can see and touch hardware.

Of course, hardware without software doesn’t make much sense, so we’ll occasionally have to peek around the corner at the next lesson and talk about some basic software concepts in this lesson. If anything we talk about with software here doesn’t make complete sense yet, do your best to trust that things will get disambiguated in the next lesson.

To keep things simple, we’ll talk about the hardware that makes computers tick by sorting things into three broad categories: data, processing, and inputs and outputs.

Let’s dive right in.

Data

We’re going to start with data because, as we’ll see, it’s really the whole point of all of the rest of a computer.

Required reading:
Stanford CS 101 on Bits and Bytes

(674 words / 4-6 minutes)

Digital data is what we call information that computers can understand. It seems both obvious and crazy to say it this plainly, but computers are just clever groupings of billions of very simple, very tiny machines that can perform operations really, really quickly. That speed is what makes them so powerful, but for us to make any use of that power, we need to get our information (data) into a format that computers (machines) can work with.

The basic unit of digital data is the bit, or binary digit1—a switch that is either flipped on or off, represented as a 0 or a 1. (In computer science, you start counting at 0. If you asked a computer to count three apples, it would count “0, 1, 2.” It understands that there are three things; it just starts at 0.)

There are a bunch of different ways you can store these bits: we used to use punch cards (there is / isn’t a hole). Then, we moved on to magnetic storage (magnet’s polarity is north / south). This was the basis for most computer storage for a long time, including mechanical hard disk drives (HDDs), sometimes called spinning disks because the platters in them spun around very quickly and the data on them was read by a floating head, not unlike a record player.2

Thankfully, we’ve mostly moved on to solid state drives (SSDs or flash memory), which use integrated circuits to store data electrically. That makes them much faster, quieter, and more durable than traditional HDDs, though they’re still a fair bit more expensive3.

Anyway, regardless of the medium in which you store it, digital data at the end of the day is just a bunch of 0s and 1s. However, it turns out that a single bit all alone isn’t really all that useful. For a variety of historical reasons, we typically group eight bits together into a byte. Let’s pause for a bit4 before we go on and use this new knowledge to resolve that pesky old “megabits / megabytes” confusion:

1 Byte = 8 bits
Kilobyte = KB = 1,000 bytes (8,000 bits)
Megabyte = MB = 1,000,000 bytes (8,000,000 bits)
Gigabyte = GB = 1,000,000,000 bytes (8,000,000,000 bits)
Terabyte = TB = 1,000,000,000,000 bytes (8,000,000,000,000 bits)
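
If it helps to see that conversion spelled out, here’s a tiny sketch in Python (my own illustration, not part of the required reading) of the classic gotcha: an advertised “100 megabit” connection versus a 100 megabyte file:

    # Megabits vs. megabytes: divide by 8 to go from bits to bytes.
    BITS_PER_BYTE = 8

    megabits = 100                        # e.g., an advertised "100 Mbps" connection
    megabytes = megabits / BITS_PER_BYTE  # 100 megabits is only 12.5 megabytes

    print(megabytes)  # 12.5

In other words, a “100 megabit per second” connection moves, at best, about 12.5 megabytes of data per second.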

Got it? If not, you can play around more here or here.

Okay, so how do you take groups of eight bits that are either 0s or 1s and turn them into meaningful information? You make up some rules!5 These rules, made up by human beings just like you and me, are called formats or standards.

Here’s an example6:

Say your name is Scott, and you wanted to store your name in a computer. You could use the ASCII (American Standard Code for Information Interchange) standard to do so:

Scott
 S = 83, c = 99, o = 111, t = 116
 83 99 111 116 116
 01010011:01100011:01101111:01110100:01110100
 5 bytes = 40 bits

Those 40 bits would be stored as a series of positive or negative magnetic charges (on a traditional HDD) or electrical charges (on an SSD) and would take up 5 bytes of space on your disk.
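
If you’d like to check that math yourself, here’s a short Python sketch (mine, not Dr. Shamp’s) that does the same encoding:

    name = "Scott"

    # ord() gives each character's ASCII code; format(..., "08b") writes it as 8 binary digits.
    codes = [ord(ch) for ch in name]              # [83, 99, 111, 116, 116]
    bits = [format(code, "08b") for code in codes]

    print(codes)
    print(":".join(bits))  # 01010011:01100011:01101111:01110100:01110100
    print(len(name) * 8)   # 40 bits in total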

With just a little bit of imagination, you can see how you could store any other type of information in a computer. Want to store a picture? Create a standard to define a grid of a given size, and then a way to define the combination of red, green, and blue color values in each square on that grid, and congratulations—you’ve invented a bitmap file (or a JPEG, or a GIF, or whatever—all twists on the same idea)!
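
To make that a little more concrete, here’s a toy sketch (my own illustration, not any real file format) of a 2×2 “image” stored as red, green, and blue values:

    # Each pixel is (red, green, blue), with each value fitting in one byte (0-255).
    image = [
        [(255, 0, 0), (0, 255, 0)],      # top row: a red pixel, then a green pixel
        [(0, 0, 255), (255, 255, 255)],  # bottom row: a blue pixel, then a white pixel
    ]

    # 4 pixels x 3 bytes per pixel = 12 bytes of raw pixel data
    total_bytes = sum(len(pixel) for row in image for pixel in row)
    print(total_bytes)  # 12

A real format adds a header that records the grid’s dimensions (and, for something like JPEG, clever compression), but the core idea is exactly this.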

Of course, it’s all a bit more complicated than that, but hopefully you’re now starting to conceptualize the general principle of how to make data machine readable.

Required reading (just the first part)

The links in the following sections (CPU, GPU, etc.) are a slightly different type of required reading. You are required to click through to the linked pages, but you’re only required to read the main description on any of them. You’ll also probably want to at least look at the pictures to understand what these things look like, too.

If you’re a moderately curious person, you’ll probably at least want to scan the rest of each linked page to get a general sense of the structure of the topic, and you’re of course welcome (and encouraged) to read some or all of the rest of it. But, if you’re pressed for time / not that interested in the topic at hand, again, all that’s required is the main description of the topic.

Also, now’s probably the time to get used to reading this course by using multiple tabs all open at the same time. I’d suggest leaving this tab open the whole time you’re reading this lesson, opening all the new tabs as you get to each section, and then closing each of said newly-opened tabs as you’re done with it.7

Processing

Okay, so once you have all that wonderful digital data, you want to be able to do stuff with it—edit your pictures, input new data to your spreadsheets, etc. Of course, all of this requires software, too, but for now, let’s focus on the hardware components that process data.

CPU (Central Processing Unit)

Reading:

(276 words / 2-4 minutes)

To use an imprecise-but-still-helpful metaphor, the CPU is the brain of the computer—it’s what does most of a computer’s general-purpose computing. Read this excerpt from “What is Code?”8:

A computer is a clock with benefits. They all work the same, doing second-grade math, one step at a time: Tick, take a number and put it in box one. Tick, take another number, put it in box two. Tick, operate (an operation might be addition or subtraction) on those two numbers and put the resulting number in box one. Tick, check if the result is zero, and if it is, go to some other box and follow a new set of instructions.

You, using a pen and paper, can do anything a computer can; you just can’t do those things billions of times per second. And those billions of tiny operations add up. They can cause a phone to boop, elevate an elevator, or redirect a missile. That raw speed makes it possible to pull off not one but multiple sleights of hand, card tricks on top of card tricks.

In the above example, the CPU is the clock. That’s why you’ll sometimes hear people talk about “clock speed” when they talk about CPUs. CPU speed is currently measured in gigahertz (GHz), meaning that modern CPUs can perform billions of operations every second.
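
Here’s a deliberately silly Python sketch (mine, not from “What is Code?”) of that “clock with benefits”: two “boxes,” one tiny operation per tick, and a check for zero:

    # Two "boxes" (a real CPU would call these registers).
    box_one = 12
    box_two = 3
    ticks = 0

    # Tick, tick, tick: subtract box two from box one until the result is zero.
    while box_one != 0:
        box_one = box_one - box_two  # operate on the two numbers; result goes in box one
        ticks += 1                   # one more tick of the clock

    print(ticks)  # 4 -- a real CPU runs through billions of ticks like this every second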

Not only that, but most modern CPUs are multi-core, meaning that each CPU is actually two—or three, or four, or six or eight, or more—CPUs on a single chip. Why? The actual answers are pretty complex, but there are two main ones.

First, as CPUs run faster and faster, they create more and more heat. Most modern CPUs top out somewhere around 3 GHz in speed even though we’ve had the technology to make much faster chips for a while now. It just turns out that doing so turns your laptop (or tablet, or phone, or whatever) into a small griddle, and that’s not terribly pleasant.9

Second, we’ve figured out how to manufacture transistors at smaller and smaller scales, now measured in nanometers. This has allowed us to actually fit those multiple cores on a single chip, and it turns out that doing so leads to two major benefits. First, we can continue to increase computing power with much lower increases in heat output. Second, these increases in computing power also come at a lower power consumption cost, which is especially relevant in the age of battery-powered devices like phones and laptops.1011
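
Incidentally, if you want a head start on one of the discussion questions below, Python will happily report how many cores your machine has:

    import os

    # Reports the number of logical CPU cores the operating system exposes
    # (on chips with hyper-threading, this can be double the physical core count).
    print(os.cpu_count())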

There’s a lot more we could say about CPUs, but we’ll leave it at that for now.

GPU

Readings:

(417 words / 3-5 minutes)

GPUs were originally designed just for improving computer systems’ graphics (especially 3D) performance. It turns out that the kind of math you need to do to create 3D graphics is really tough on CPUs and can be handled much more efficiently by very parallel, purpose-built tools. Hence, GPUs.

For a while, the only folks who cared about GPUs were 3D artists and gamers. But, as we’ve gotten better at designing software that can take advantage of GPUs’ parallel computing powers (and as 3D effects have found their way into more everyday computing), we’ve figured out how to leverage GPUs for non-graphics related tasks12, leading to the rise of GPGPU, or general purpose computing on graphics processing units.
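
To get a feel for why graphics work parallelizes so nicely, here’s a small Python sketch (my illustration; a real GPU would run thousands of these little calculations at once in hardware) that applies the same simple operation to every pixel independently:

    # Brightening an "image": each pixel's math doesn't depend on any other pixel,
    # which is exactly the kind of work a GPU can spread across thousands of tiny cores.
    pixels = [(10, 20, 30), (200, 40, 90), (5, 5, 5)]

    def brighten(pixel, amount=50):
        r, g, b = pixel
        return (min(r + amount, 255), min(g + amount, 255), min(b + amount, 255))

    # A CPU works through these one (or a few) at a time; a GPU does them all at once.
    print([brighten(p) for p in pixels])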

SoC (System on a Chip)

Reading:

(340 words / 2-4 minutes)

The rise of SoCs has coincided with the rise of mobile electronics. As the devices we use have gotten smaller and smaller, integrating entire computing systems on a single chip has enabled the necessary reductions in size and has also led to efficiency gains.

RAM (Random-access memory)

Reading:

(262 words / 2-4 minutes)

If a CPU is a computer’s brain, RAM is its working memory. RAM stores information only while a device is powered on. It’s the fastest possible tool a computer has at its disposal for reading and writing data. The amount of RAM in a device affects its ability to multi-task, as well as a few other performance metrics.
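
Here’s a rough Python sketch of the division of labor between storage and working memory (my own example, with a made-up filename):

    # Reading a file copies its contents from the (slower, permanent) disk
    # into a variable that lives in (much faster, temporary) RAM.
    with open("notes.txt") as f:  # "notes.txt" is a hypothetical file
        text = f.read()           # 'text' now lives in RAM

    text = text.upper()           # working on it in RAM is quick

    # Until you write it back to disk, the change exists only in RAM --
    # cut the power and it's gone.
    with open("notes.txt", "w") as f:
        f.write(text)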

Input / Output

We now know about data, and we know, more or less, how computers work with that data. Now let’s talk about how we interact with computers—how we tell them to do things, how we see what they’re doing, and how we connect things to them.

Keyboards, mice, touch, etc.

Good news—you don’t have to read anything about these topics! (Though if you want to, there’s plenty to say—each of these input methods has its own fascinating history and design considerations.) Instead, I want to just share one more sneak peek at “What is Code?” to help you think about how keyboards (and, with a little imagination on your part, other input devices) interact with the software that runs a computer:

Consider what happens when you strike a key on your keyboard. Say a lowercase “a.” The keyboard is waiting for you to press a key, or release one; it’s constantly scanning to see what keys are pressed down. Hitting the key sends a scancode.
… Every key makes a code. The computer interprets these codes. There are many steps between pressing the “a” key and seeing an “a” on the screen.
Just as the keyboard is waiting for a key to be pressed, the computer is waiting for a signal from the keyboard. When one comes down the pike, the computer interprets it and passes it farther into its own interior. “Here’s what the keyboard just received—do with this what you will.”
It’s simple now, right? The computer just goes to some table, figures out that the signal corresponds to the letter “a,” and puts it on screen. Of course not—too easy. Computers are machines. They don’t know what a screen or an “a” are. To put the “a” on the screen, your computer has to pull the image of the “a” out of its memory as part of a font, an “a” made up of lines and circles. It has to take these lines and circles and render them in a little box of pixels in the part of its memory that manages the screen. So far we have at least three representations of one letter: the signal from the keyboard; the version in memory; and the lines-and-circles version sketched on the screen. We haven’t even considered how to store it, or what happens to the letters to the left and the right when you insert an “a” in the middle of a sentence. Or what “lines and circles” mean when reduced to binary data. There are surprisingly many ways to represent a simple “a.” It’s amazing any of it works at all.
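
Here’s a highly simplified Python sketch (mine, with a made-up table; real scancodes and keyboard layouts are more involved) of that “go to some table” step:

    # A made-up mapping from keyboard scancodes to characters.
    SCANCODE_TABLE = {
        30: "a",
        31: "s",
        32: "d",
    }

    def handle_keypress(scancode):
        # "Here's what the keyboard just received -- do with this what you will."
        character = SCANCODE_TABLE.get(scancode, "?")
        print(character)  # and then the genuinely hard part: drawing it on screen

    handle_keypress(30)   # prints "a"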

Video + audio

Readings:

(All Wikipedia)
(3419 words / 17-24 minutes)

Don’t skip the readings on these, even if you think you know what they are—you’ll definitely learn something from each one.

Computers send output to displays by targeting individual pixels. Apple has popularized very high resolution displays which it has marketed as Retina displays. We used to use CRTs as our computer monitors and TVs, but they’ve recently been replaced by LCDs (which are now primarily LED back-lit). Sadly, the era of plasma displays has passed, but that will eventually be okay since we’re rapidly approaching the age of affordable OLED displays.
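
As a back-of-the-envelope sketch (my own arithmetic, not from the readings), here’s what “targeting individual pixels” adds up to on an ordinary 1080p display:

    width, height = 1920, 1080  # a standard 1080p display
    bytes_per_pixel = 3         # one byte each for red, green, and blue

    pixels = width * height
    frame_bytes = pixels * bytes_per_pixel

    print(pixels)                   # 2073600 pixels
    print(frame_bytes / 1_000_000)  # ~6.2 MB of raw pixel data per frame

Redraw all of that 60 or more times every second (and multiply it for Retina-class resolutions) and you can see why displays keep the rest of the hardware busy.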

Thankfully, we now have HDMI as a single cable to transmit video and audio from a device to a display / speakers. Sometimes we use Mini DisplayPort to VGA adapters (which plug into Thunderbolt ports13) to connect to old projectors. We still use 3.5mm audio cables to connect headphones and other audio devices to our computers14.

(Displays (and A/V in general) are so important and so fascinating! I could spend an entire lesson on displays. For the sake of brevity, I’ve limited myself to this short description, but you should totally go read all the links, because they’re awesome.)

Network interfaces

These are super-important, but we’re going to save the details for our lesson on networks. For now, just think about the fact that computers really become interesting, fun, and powerful if you have a way to connect them to each other. If you’re really curious, you can read articles on Ethernet, Wifi, cellular data connections like LTE, and the differences between a router and a modem.

Data transfer

Readings:

(All Wikipedia)
(1321 words / 7-12 minutes)

Thankfully, data transfer has gotten a lot better in recent years. We use Bluetooth for connecting devices wirelessly over fairly short (generally less than 30 feet) distances. Right now, we use USB 3 to connect most peripherals (external hard drives, scanners, etc.) to our laptops and desktops, but the move to USB Type-C is underway, which, while a bit of a pain at the moment, will ultimately be a very good thing. Thunderbolt is a very powerful, but fairly expensive, alternative to USB, and it’s mostly found on Macs15. iOS devices generally feature Lightning ports. Other mobile devices feature an array of mini- and micro-USB ports, though thankfully they’re moving toward USB Type-C, too.
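
This is also a nice place to put the megabits / megabytes lesson back to work. A quick back-of-the-envelope sketch (my own arithmetic; real-world transfers run below the advertised maximums):

    # USB 3.0's advertised maximum signaling rate is 5 gigabits per second.
    usb3_gigabits_per_second = 5
    usb3_megabytes_per_second = usb3_gigabits_per_second * 1000 / 8  # 625 MB/s

    file_megabytes = 1000  # a 1 GB file
    print(file_megabytes / usb3_megabytes_per_second)  # 1.6 seconds, in theory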

(Yes, you need to know what all of those things are and what they’re for.)

Three rules of hardware

Over time, computer hardware becomes:

  1. Smaller and smaller, to the point where a device’s size is dictated by human factors (the size of our hands, etc.), not hardware requirements
  2. Mechanically simpler, containing fewer moving parts. The fewer things that move, the more reliable the devices.
  3. Faster and/or more efficient. Though it theoretically has a maximum limit, this rule of hardware has been described as Moore’s law (named after a co-founder of Intel), which observes that the number of transistors on a chip (and, roughly, its computing power) doubles every 18-24 months (see the quick sketch just after this list).
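
To see what that doubling actually adds up to, here’s a quick Python sketch (my own back-of-the-envelope math, assuming one doubling every two years):

    relative_power = 1.0       # today's chip, as a baseline
    years = 10
    doubling_period_years = 2  # assume one doubling every two years

    doublings = years // doubling_period_years
    print(relative_power * 2 ** doublings)  # 32.0 -- a 32x increase in a decade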

Discussion questions

  • What are some of your earliest memories of electronics hardware? Devices / things you loved? Hated?
  • Does your current computer use a traditional HDD (spinning disk) or SSD?
  • Real talk: does the concept of digital data really, truly make sense to you? If not (and that’s okay—this is complicated stuff!), talk about what you’re confused about, and see if anyone in your discussion group can help clear things up. (If you’re all still confused, just bring me into the conversation with a mention! (You can do this any time, by the way—not just on this question / topic!))
  • Did you read past the first part of any of the “Required reading (just the first part)” pages? If so, what did you learn?
  • Do you remember the speed of the CPU in your first computer? If so, how does that compare to your current computer?
  • How many cores does your current computer have?
  • One of our excerpts from “What is code?” concludes with “It’s amazing any of it works at all.” After reading this lesson, do you agree?

Words on / reading time for this page: 2,952 words / 15-20 minutes

Words in / reading time for required readings: 5,917 words / 37-59 minutes

Total words in / reading time for this lesson: 8,631 words / 52-79 minutes

 


  1. It can be helpful to remember this when distinguishing bits from bytes, which we’ll get to in a second

  2. You have one of these if your computer makes noise when it launches an application or opens a file. If you’re really old like me, you actually used to get concerned when your computer stopped making noise when it was on.

  3. Though that price gap is shrinking rapidly

  4. (Ha!)

  5. Seriously!

  6. Cribbed from the NMI’s founder and the instructor for the previous version of this course, Dr. Scott Shamp

  7. Admittedly, this is easier done on a laptop/desktop/tablet than on mobile, but even on mobile, it works pretty well.

  8. With which we’ll be spending much more time next lesson

  9. It’s for this reason that high-end desktop gaming computers, which want to use the fastest possible chips no matter what, are sometimes water-cooled. It’s also why data centers are often built in cooler climates near water sources.

  10. PPW—performance per watt—is now an important metric.

  11. Some mobile CPUs are even designed to have both high-powered cores that can be switched on for processor-intensive tasks (and switched off when they’re finished) and low-powered cores that take care of simpler / background operations.

  12. Like machine learning, for example

  13. Which, confusingly, in version 3 will look like USB-C ports—more on those in a bit

  14. Though increasingly less so for our phones

  15. Though this may change with Thunderbolt 3 and changes to the licensing program for Thunderbolt