CS 100 Module 01


What Does "Digital" Mean?


NOTE: If your internet access is restricted and you do not have access to YouTube, we have provided alternate video links.

TRANSCRIPT

You've probably heard the phrase "digital" used in many different contexts: digital computer, digital thermometer, digital camera, or you may have heard someone say that we live in the "digital age". In this video we will explore what it means for something to be digital.

Digital comes from the same Latin roots as the word digit -- which effectively means "finger". Your fingers (and your toes) are also known as digits.

So digital means "related to your fingers". If your doctor gives you a digital exam, they are examining you with their finger (which may not be a pleasant experience).

If you think about it, if someone wants your phone number and asks you for your "digits", it is actually really creepy.

Counting is related to fingers because most of us learn to count on our fingers. Numbers are composed of digits, there are ten different digits (zero through nine), and we have ten fingers, so it all starts to make sense.

From another perspective, something is digital if you can count it on your fingers; in other words, if it is countable.

As it turns out, discrete quantities are countable and continuous quantities are not: you can list whole numbers one by one, but there is no way to list every possible value of a continuous quantity like temperature.

To put it all together, if data is "digital", it simply means that it is discrete data, and not continuous. In fact, in most contexts, the words "discrete" and "digital" are interchangeable.

Just as the converse of discrete is continuous, the converse of digital is analog. If something is analog, it means that it is continuous.

In the early days of computing there were some analog computers. For an appreciation of what an analog computer would be like, you can look up what a "slide rule" is -- perhaps your parents (or your grandparents) once used a slide rule.

Since the invention of the transistor and discrete electronics, pretty much every computer has been digital. Soon there was a desire to process more and more data from the real world. This data had to be discretized so it could be stored and represented on a digital computer. In this context, discretization became known as digitization.
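As a rough sketch of what digitization involves (this example is not from the video; the function names, the temperature curve, and the step sizes are all made up for illustration), the following Python snippet samples a continuous quantity at fixed time steps and rounds each sample to a fixed resolution, turning it into a finite list of discrete values:

    import math

    def continuous_temperature(t):
        # A made-up continuous quantity standing in for real-world data.
        return 20.0 + 5.0 * math.sin(t)

    def digitize(signal, duration, sample_step, resolution):
        # Sample the continuous signal at fixed time steps and round each
        # sample onto a discrete grid, producing a finite list of values.
        samples = []
        t = 0.0
        while t <= duration:
            samples.append(round(signal(t) / resolution) * resolution)
            t += sample_step
        return samples

    print(digitize(continuous_temperature, duration=6.0, sample_step=1.0, resolution=0.5))
    # [20.0, 24.0, 24.5, 20.5, 16.0, 15.0, 18.5]

Only finitely many of these rounded samples are kept, which is exactly what makes the result storable on a digital computer.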

Eventually, the word "digital" was used to describe anything that could be stored on a computer. Because computers use binary and represent data with ones and zeros, the word "digital" also came to mean anything that can be represented by ones and zeros. Well, it just so happens that all discrete data can be represented in binary, but this is a topic for a different video.
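To make the "ones and zeros" idea concrete, here is a small illustrative Python example (not from the video) showing that discrete data, such as whole numbers and text characters, can be written out in binary:

    # Whole numbers are discrete, so each one has an exact binary form.
    for n in [0, 1, 2, 5, 13]:
        print(n, "->", format(n, "b"))
    # 0 -> 0
    # 1 -> 1
    # 2 -> 10
    # 5 -> 101
    # 13 -> 1101

    # Text is discrete too: each character maps to a number (its code point),
    # and that number can be written in binary.
    for ch in "Hi":
        print(ch, "->", ord(ch), "->", format(ord(ch), "08b"))
    # H -> 72 -> 01001000
    # i -> 105 -> 01101001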

The word "digital" evolved further and now means anything related to computers. So when people say we live in a "digital age", they mean we live in an age where computers are ubiquitous.

Despite how the terminology is used today, it's good to remember where it came from: whenever you are working with something digital, you are working with discretized data. That does not necessarily mean the data is stored on a computer, but that is usually what is meant.

As a final example, consider two thermometers. An analog mercury thermometer is continuous and, in principle, has infinite precision, whereas a digital thermometer is discrete and can only represent the temperature to a few decimal places of precision.
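As a rough sketch of the thermometer example (the readings below are made up), a digital thermometer effectively rounds the underlying continuous temperature to the fixed number of decimal places its display can show:

    def digital_readout(true_temperature, decimal_places=1):
        # Round a (conceptually continuous) temperature to the fixed number
        # of decimal places a digital display can show.
        return round(true_temperature, decimal_places)

    # The underlying temperature could be any real number, but the display
    # can only show one of finitely many values.
    for true_temp in [36.61804179, 36.64999, 36.6500001]:
        print(true_temp, "-> displays", digital_readout(true_temp))
    # 36.61804179 -> displays 36.6
    # 36.64999 -> displays 36.6
    # 36.6500001 -> displays 36.7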