Introduction to Analog to Digital Converters (ADCs)
Analog to Digital Converters (ADCs) are circuits that convert continuous (analog) voltage values into binary (digital) values that digital devices can process. Put another way, an ADC allows digital circuits to interact with the real world by encoding analog signals as digital signals in binary form.
An ADC is generally built as an IC and is often integrated into a microcontroller.
In the real world, analog signals coming from sources and sensors that measure sound, light, motion, and temperature change continuously, and can therefore take an infinite number of different values. Digital circuits, on the other hand, work with binary signals that have only two discrete states: logic 0 (low) and logic 1 (high). We therefore need an electronic circuit that bridges these two domains by converting a continuous analog signal into a discrete digital signal. This circuit is the Analog to Digital Converter (ADC), a device that acts as an intermediary, converting analog signals into digital signals that a microcontroller or microprocessor can understand.
How Analog to Digital Converters (ADCs) Works
Analog signals in our daily lives can take the form of sound, light, temperature, or movement. Digital signals, in contrast, are represented as sequences of discrete values, with the signal broken into samples taken at time intervals determined by the sampling rate.
To convert an analog signal into a digital one, the ADC takes samples of the analog signal, measures each sample, and quantizes it into a binary value. The ADC thus converts the analog signal it receives into a stream of digital output values.
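The sample-measure-quantize sequence described above can be sketched in a few lines of Python. This is a hypothetical illustration, not a real ADC driver: the 1 V reference, the 3-bit depth, the sine-wave input, and the function name `adc_convert` are all assumptions chosen to match the examples later in this article.

```python
import math

def adc_convert(voltage, v_ref=1.0, bits=3):
    """Quantize an analog voltage (0..v_ref) into an n-bit digital code."""
    levels = 2 ** bits                    # number of quantization levels
    code = int(voltage / v_ref * levels)  # measure and quantize the sample
    return min(code, levels - 1)          # clamp a full-scale input to the top code

# Sample a 1 Hz sine wave (swinging between 0 V and 1 V) 8 times per second
sample_rate = 8
codes = []
for n in range(sample_rate):
    t = n / sample_rate                          # sample instant in seconds
    v = 0.5 + 0.5 * math.sin(2 * math.pi * t)    # analog value at that instant
    codes.append(adc_convert(v))

print(codes)  # one 3-bit code (0..7) per sample
```

Each printed code is one discrete step of the continuous input; raising `bits` or `sample_rate` makes the digital sequence track the analog waveform more closely.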
Two main factors determine the accuracy of the digital values an ADC produces: its Resolution and its Sample Rate.
Resolution
For example, if a 1V signal is converted with a 3-bit ADC, the input range is divided into 2^3 = 8 levels (binary codes 000 through 111). Each level is 1V / 8 = 0.125V wide, so the smallest change this 3-bit ADC can detect across the 1V range is 0.125V, or 125mV per level.
Increasing the resolution (the number of bits) produces a more precise digital representation of the signal. For example, if the same 1V range is converted by a 6-bit ADC, it is divided into 2^6 = 64 levels, and each level is 1V / 64 = 0.015625V, or about 15.6mV.
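The step-size arithmetic in the two examples above can be checked with a short Python sketch (the function name `lsb_volts` is ours; the formula is simply the reference voltage divided by the number of levels):

```python
def lsb_volts(v_ref, bits):
    """Smallest voltage step an ideal ADC can resolve (one level, or LSB)."""
    return v_ref / (2 ** bits)

print(lsb_volts(1.0, 3))  # 0.125 V per level, as in the 3-bit example
print(lsb_volts(1.0, 6))  # 0.015625 V, about 15.6 mV, as in the 6-bit example
```

Every extra bit halves the step size, which is why even a few additional bits of resolution noticeably sharpen the converted signal.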
For more details, please see the image below:
Sample Speed or Sample Rate
The number of analog-to-digital conversions a converter can perform per second is called its Sample Speed or Sample Rate. It is measured in S/s or SPS (samples per second). For example, a fast ADC can have a sample rate of up to 300 MS/s, that is, 300 million samples per second.
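To get a feel for what such a rate means, a quick calculation using the 300 MS/s figure mentioned above (the 1-millisecond capture window is an assumed value for illustration):

```python
# 300 MS/s converter, as in the example above
sample_rate_sps = 300_000_000

# hypothetical capture window of 1 millisecond
duration_s = 0.001

samples = int(sample_rate_sps * duration_s)
print(samples)  # 300,000 samples captured in just 1 ms
```

Even a single millisecond of capture at this rate produces hundreds of thousands of samples, which is why high-speed ADCs are paired with correspondingly fast memory and buses.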