This blog post is the first in a series presenting an overview of the theories and practices involved in converting analog signals into their digital counterparts. This is called analog-to-digital conversion, or ADC. ADC is normally used in the front ends of systems that digitally process and/or analyze analog signals. Even though the basic principles of ADC are relatively simple to grasp, implementing them correctly can require special techniques, and a proper understanding of the characteristics and limitations of ADCs is important to make the best use of them.

The purpose of this series of blogs is to:

  • Introduce the concepts of sampling and digitizing for those who are unfamiliar with them
  • Present the different techniques for preparing signals prior to analog-to-digital conversion
  • Present an overview of ADCs and their characteristics
  • Discuss ADC specifications and inherent limitations

The Analog World We Used To Know

The world around us has fundamentally changed over the last thirty years. Before the advent of the first microprocessor in the ’70s, almost all the technology surrounding us was “analog”. From sound vibrations imprinted in the groove of a vinyl record, to the light intensity and color captured by the silver crystals on a 35 mm film, to television programs magnetically stored in the iron oxide particles of videotapes, everything was based on technology that worked in an analog fashion. “Analog” means that the information processed or stored was represented in a manner that was “similar” or “comparable” to its original counterpart.

The fact that technology for most of the 20th century evolved along analog lines is perfectly understandable. The world around us is a continuous flow of infinitely varying information, and it was only natural that the technology needed to handle all that information operated in a similar manner. Besides, to be frank, there wasn’t really any other option.

Limitations of Analog Technologies

As was just mentioned, working with analog information involves working with a continuous flow of infinitely varying information. Processing and storing all that information usually requires bulky, cumbersome and expensive equipment and media. Not to mention how susceptible analog signals can be to noise and degradation (remember the bad quality of long-distance calls, the scratches ruining your favorite vinyl record, or the fading colors in the family photo album?). If we had continued to pursue the analog route, none of the technology we now take for granted in the 21st century would ever have existed. It was all made possible by advances in digital technologies.

The Brave New Digital World

The invention of the transistor revolutionized technology in the early ’50s. The transistor was initially used in the development of analog technologies, but two subsequent improvements changed everything. The first was the integration of more and more transistors on the same piece of silicon (the integrated circuit, or IC). The second was the use of the transistor as a signal switch. By turning transistors on and off, it was now possible to easily create circuits that operated using large numbers of binary “0” and “1” states. These states can be combined to sequentially and logically process combinations of “true” and “false” conditions and to perform computations on numbers expressed as combinations of binary digits. These new transistor-based ICs eventually replaced the vacuum tubes that had performed the same type of functions in large mainframe computers. This major achievement eventually led to the invention of the digital microprocessor by Intel, which in turn led to the development of the personal computer. The rest, as they say, is history.

Many of the major technological transformations that followed the advent of the personal computer were direct consequences of the new software capabilities that the microprocessor brought to the masses. This, combined with improvements in analog-to-digital conversion technologies, led to the fundamental transformations that came later and that are now evident in the technologies we use every day. Some examples of this transition are vinyl records to CDs, videotape to DVD, “brick” cellphones to smartphones, analog TV transmission to digital, and CRT TVs to flat-screens, to name just a few.

Bridging the Gap between Analog and Digital

The transition from analog to digital seems most prevalent in technologies related to media but is, in fact, now present in just about every other aspect of the reality surrounding us. There isn’t a technology in existence today that hasn’t been affected in one way or another by this transition. Technologies used to perform the actual transformation from “the analog domain” to “the digital domain” are now readily available, but they require a certain understanding of their basic principles to be implemented correctly. The purpose of this series of blog posts is to introduce these principles one by one, so that a reader unfamiliar with them can acquire a fuller understanding of the theoretical foundation of the conversion process as well as an awareness of its implementation and limitations.

Conclusion

We’ve seen that up until recently the most prevalent technologies were based on analog electronics. Over the last thirty years, most of these technologies have given way to more sophisticated and efficient digital substitutes. A fundamental technique required by digital devices is the analog-to-digital conversion (ADC) of the analog signals they must operate on. We’ll be discussing the principles and techniques surrounding the use of ADC in a series of upcoming blog posts.