Digital vs. Analog

So what is "digital" and what is "analog", anyway?

These days, it seems the word "digital" is printed on everything. Sometimes when you see the word "digital" on something, it is not really "digital" at all. For example, a "digital AM/FM tuner" is "digital" only in the sense that the numbers that show the station are digits; the electronics inside are NOT digital, but analog!

[Image: The Victor Monarch Special, an analog machine]

To get it straight: analog recording can be considered the "old school" or the "original way" of recording. All "old" audio formats are analog, including LP records and cassette tapes.

Truly "digital" audio formats include:

[Image: A DCC (Digital Compact Cassette), a digital tape deck]

To understand what is "analog" and what is "digital", it is useful to compare the two. The following has been written from the perspective of recording something to tape; however, any other recording format can be substituted for "tape" in this example:

In the "analog" world, all sound can be represented by what is called a "sine wave". This is the measurement of movement caused by the sound as it passes through air. High sounds or low sounds will move air (or the diaphragm of a speaker or microphone) at different rates; therefore this movement is referred to as the sound's "waveform".

Most forms of analog recording, where the recording medium can be used over and over again, are magnetic. LP records are NOT magnetic, and they cannot be recorded over. Cassette tapes, on the other hand, ARE magnetic. They can be recorded over and over again (albeit with some loss in quality), and cassettes can be completely erased by passing a very strong magnet over them.

In the case of analog tape, the tape is passed over a recording head during the recording process. The motors turn the tape reels and pass the tape over the head at a constant speed. The head changes magnetic polarity according to the signal it receives, swinging from "North" to "South" and back again to represent all the sounds sent to it. There is a "true North" and a "true South", but mostly the recording is made with a combination of "North" and "South" to represent the waveform. This is why it is called "analog" recording: the signal which is recorded is "analogous" to the original waveform.

[Image: A sine wave]

The illustration shown to the right demonstrates a waveform. This waveform can represent a particular sound. If the sound is a certain note, it can be represented by the waveform shown as the dotted line. If the note is a higher note (a higher "frequency"), the waveform will look different, as represented by the solid line. If complete silence were being recorded, there would be no line at all.

This is where the difference begins between "digital" and "analog". For example, with an analog recording, over a particular space in time (think one second of time), the recorder may have recorded a particular sound as 3,000 Hz. This means that the actual sound at that moment in time was a note somewhere in what is known as the "upper midrange". Common sounds in the upper midrange include spoken words: if someone were speaking to you, most likely that person's voice would fall somewhere between 2,560 and 5,120 Hz. In this example we are assuming that a particular sound, when measured, is 3,000 Hz. To record this sound on an analog machine, the magnetic field must swing from north to south and back to north 3,000 times in one second. Now suppose another sound were recorded, and this time it was 11,000 Hz. This is a much higher sound than the first one; imagine, for example, that you were recording the song of a small bird. To record this "high" sound, the magnetic field in the recorder would have to shift from north to south and back to north 11,000 times in that same period of time to represent the waveform (the sound). In analog formats, then, the measurement varies according to the sound being recorded.
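
If it helps to see this in code, here is a small Python sketch (purely an illustration, not anything a real recorder does) of how a 3,000 Hz tone and an 11,000 Hz tone trace out their waveforms at different rates:

    import math

    def waveform_value(frequency_hz, time_s):
        """Amplitude of a pure sine tone of the given frequency at a given moment."""
        return math.sin(2 * math.pi * frequency_hz * time_s)

    # A 3,000 Hz tone swings through 3,000 full north/south cycles per second;
    # an 11,000 Hz tone swings through 11,000 cycles in that same second.
    for freq in (3_000, 11_000):
        cycle_time = 1.0 / freq                      # how long one full cycle lasts
        print(f"{freq} Hz: one cycle lasts {cycle_time * 1000:.3f} ms")
        for step in range(5):                        # trace a few points of one cycle
            t = step * cycle_time / 4
            print(f"  t = {t * 1e6:7.1f} us   amplitude = {waveform_value(freq, t):+.2f}")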

However, digital formats take a constant measurement regardless of the sound that is being produced. It makes no difference whether a rock concert is being recorded, or a symphony orchestra, or no sound at all; the same amount of data is recorded (unlike in an analog format, where if silence were recorded, there would be NO movement of the magnetic field). This is where the term "sample rate" comes in. Almost all digital formats use a sample rate, whereas analog formats do not use a specified sample rate. A sample rate means that within a particular space in time, the recorder will "sample" the waveform a certain number of times.

This is equivalent to someone asking you to close your eyes and not peek. You are curious about what that person is doing, so you take a quick peek whenever you think it is safe to do so. Each time you open your eyes and quickly close them again, you have taken one "sample" of what is happening. If you kept your eyes closed for 5 minutes and "peeked" 3 times during those 5 minutes, your "sampling rate" over that period could be described as 3.

In practice, digital recording equipment is rated according to the number of samples that are taken of the waveform (i.e., the music) in one second. All audio CD's are recorded at a rate of 44.1 kHz, which means that in one second, 44,100 samples of the waveform are taken. Again, it makes no difference whether a high note is being recorded, or a low one, or no sound at all; a constant number of samples is taken each second. If you were to isolate one of these samples, decode the digital information from it, and play only that sample, you would hear only one constant sound. But because there are many samples, playing them one after another creates the music that you hear.
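
Here is a small Python sketch of that idea (the "sources" below are made up for illustration): whether the "recorder" is fed silence or a 3,000 Hz tone, one second of recording always consists of exactly 44,100 samples:

    import math

    SAMPLE_RATE = 44_100      # the audio-CD rate: 44,100 samples every second

    def record_one_second(source):
        """Take SAMPLE_RATE evenly spaced "peeks" at the source waveform."""
        return [source(n / SAMPLE_RATE) for n in range(SAMPLE_RATE)]

    silence = lambda t: 0.0
    tone_3khz = lambda t: math.sin(2 * math.pi * 3_000 * t)

    # The amount of data is identical no matter what is being recorded.
    print(len(record_one_second(silence)))      # 44100
    print(len(record_one_second(tone_3khz)))    # 44100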

Different digital recorders can use different sampling rates. DCC, for example, can record at sampling rates of 32, 44.1, and 48 kHz; that is, in one second it can take 32,000, 44,100, or 48,000 samples of the waveform (the music being recorded). Obviously, the more samples taken, the better the sound will be; however, as indicated before, some formats such as the lowly audio CD have a fixed sampling rate and will not play back other sampling rates. So, for example, if someone were to create a recording at 48 kHz, it would have to be "resampled" at 44.1 kHz before it could be made into a CD.
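
As a rough sketch of what "resampling" involves, here is a naive linear-interpolation version in Python (real studio converters use much more careful filtering; this only shows the idea of turning one second of 48 kHz material into 44,100 samples):

    def resample(samples, from_rate, to_rate):
        """Naive linear-interpolation resampling, e.g. 48 kHz -> 44.1 kHz."""
        out_length = int(len(samples) * to_rate / from_rate)
        out = []
        for n in range(out_length):
            pos = n * from_rate / to_rate            # where this output sample falls in the source
            i = int(pos)
            frac = pos - i
            nxt = samples[min(i + 1, len(samples) - 1)]
            out.append(samples[i] * (1 - frac) + nxt * frac)
        return out

    one_second_at_48k = [0.0] * 48_000               # 48,000 samples of silence, for illustration
    print(len(resample(one_second_at_48k, 48_000, 44_100)))   # 44100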

Another factor in digital recording is the number of bits used, or the "wordlength". In a digital recording, all data is represented by 0's and 1's. However, the quality of the recording is determined by how MANY 0's and 1's are used to represent each sample. For example, consider a wordlength of 1. This would mean that any sound must be represented by only 1 digit. Thus, a digital recorder with a wordlength of 1 could only represent 2 distinct sounds, "0" or "1". Any other sound would be recorded as one of these two sounds, or not recorded at all. Therefore a digital audio recorder with a wordlength of 1 is not very useful. In practice, audio CD's use a 16-bit wordlength; that means each sample is represented by a combination of 0's and 1's sixteen digits long. A particular sample (ONE of the 44,100 samples taken in a particular second) on an audio CD may look like this:

1001101100110100

You can see here that digital recording does not allow the value to be anything other than "0" or "1". This is true of all digital formats. But not all digital audio formats are equal in the amount of detail they use to describe a sound. As mentioned before, the audio CD uses a combination of 16 "1"'s and "0"'s to describe a sample, whereas other digital audio formats, such as DAT, can describe a particular wave in much more detail. Professional DAT recorders have a 24-bit wordlength, and therefore describe each sample with 24 0's and 1's. Obviously, the "longer" the wordlength, the more detail there is to what is being recorded. To imagine this, think what you would say if someone asked you to describe what you did last week, in one word. How could you describe what you did last week with any accuracy if you can only use one word to do it? Now suppose that someone asks you to describe what you did last week using 3,000 words or less. Obviously, if you are allowed 3,000 words, you will be able to describe the week's events in much greater detail. Therefore, the greater the wordlength, the more detail with which the digital signal can describe the sound, and the better the overall result will be.
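
To make the wordlength idea concrete, here is a small Python sketch (again, only an illustration, not how any particular recorder is built) that describes one made-up waveform value as a 16-digit string of 0's and 1's, the way a CD describes a sample, and as a 24-digit string, the way a professional DAT machine would:

    def quantize(value, bits):
        """Describe a waveform value between -1.0 and 1.0 as a string of 0's and 1's."""
        levels = 2 ** bits                   # 16 bits: 65,536 levels; 24 bits: 16,777,216 levels
        max_code = levels // 2 - 1
        code = round(value * max_code)
        return format(code & (levels - 1), f"0{bits}b")

    sample_value = 0.42                      # one made-up sample of the waveform
    print(quantize(sample_value, 16))        # sixteen digits, as on an audio CD
    print(quantize(sample_value, 24))        # twenty-four digits, as on a professional DAT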

You can see that the analog recording is "analogous" to the original sound, since the actual waveform heard is exactly what is captured by the analog recorder. A 3,000 Hz sine wave is recorded as a 3,000 Hz sine wave. In other words, the recording on an analog machine is an "analog", or "picture", of the original sound. By contrast, the digital recording is merely a description of the original sound. In an analog system such as a tape deck, a coil of wire with a gap between north and south poles may be used to play back the recording. It follows the same path of north and south poles on the tape and sends the result to an amplifier, which amplifies it and sends it to a speaker.

With a digital system, however, it is not possible to simply pick up and amplify the digital bit stream in order to retrieve anything meaningful, because the digital bit stream doesn't "record" the waveform, but "describes" it in 1's and 0's. Each of the samples taken each second (44,100 in the case of an audio CD) carries information about only two things: what was the polarity of the sound, and how loud was it? Knowing the answer to these two questions 44,100 times per second allows the digital recorder to recreate a sound which sounds like the original one, using what is called a "digital/analog converter" (usually referred to as a "D/A converter" or a "DAC"). The D/A converter takes the information from the digital bit stream and creates a waveform from the data that is on the tape or disc, and it is this waveform which it sends out to an amplifier, which then amplifies it. In this way, a waveform which sounds like the original waveform is created.

However, it is important to remember that the sound you hear is actually being generated by the D/A converter inside the CD player (or other digital audio device) at the time you play it back. The waveform itself doesn't actually exist on the disc or tape. Each time you play back the CD (or tape), the waveform is created again according to the digital information.
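
Here is the same idea in a few lines of Python (only the arithmetic, not a real D/A converter chip): given one 16-digit sample such as the one shown earlier, the sign of the decoded number answers the "polarity" question and its size answers the "how loud" question:

    def decode_16bit(sample_bits):
        """Turn one 16-digit sample back into an amplitude between -1.0 and 1.0."""
        code = int(sample_bits, 2)
        if code >= 2 ** 15:                  # top bit set: the value has the opposite polarity
            code -= 2 ** 16
        return code / (2 ** 15 - 1)

    print(decode_16bit("1001101100110100"))  # one point of the recreated waveform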

Because the waveform is being created by the D/A converter every time the CD (or other digital device) is played back, this means that the D/A converter itself then becomes a very important factor in digital audio equipment. Professional digital audio devices actually allow the user to upgrade or change D/A converters for this reason. For a consumer-grade CD player, the D/A converter is built into the CD player, and different CD players (especially CD players at different price levels) have different D/A converters. This difference in D/A converters will mean that some CD players will sound better than others.

[Image: A sine wave]

So if the actual waveform is recorded in an analog system, and is only "represented" in a digital one, why is it that digital is "better"? The quick answer to this is, digital ISN'T necessarily always "better"!

The main weakness of analog formats is their inability to capture the waveform in as much detail as is needed. Consumer analog formats are not of adequate quality to accomplish this. The chart on the left compares the range of various recording formats with the range of sounds, and also with the range of human hearing. As you can see, some formats are a better match than others for the range of human hearing. Some professional analog tape formats can capture analog information in so much detail that they can be used as masters from which to make digital recordings, but the same is not true of consumer equipment. Remember that the audio cassette was first invented as a tape format for dictation devices; it was not intended to be used for music. The fact that its capability has been stretched to the point where it is considered an audio format demonstrates the innovativeness of many Japanese companies, but nevertheless the audio cassette was not designed to be a truly high-quality audio format.

In addition, in an analog system every link in the "chain" is a potential weakness. When recording live, the type, quality, and placement of the microphones are of critical importance (this is true for BOTH digital and analog systems), but from the microphones to the actual tape there are many "links" that the sound must pass through. Each one of these links can add noise and distortion. With a digital system, noise and distortion are not added at any point between the mic and the tape (or disc): either the connection is good or it isn't, and it is not possible for noise to change some 0's to 1's or vice-versa. For this reason a digital recording may sound better than an analog one.

In the final analysis, both digital and analog recordings can sound good, if you know what you are doing and you are careful about it. In any case, your recordings on Minidisc should sound better than the same recordings made on cassette tape.

Enjoy your recordings!