Analog vs. Digital Microphone Sensitivity: Definition & Equations
This article explains the difference between analog and digital microphone sensitivity. It also provides the equations used to define each type of microphone sensitivity.
Sensitivity Definition: Sensitivity is defined as the ratio of the analog output voltage (for analog microphones) or the digital output value (for digital microphones) to the input pressure. It’s a crucial parameter for any microphone.
The general equation for microphone sensitivity is:

\text{Sensitivity} = \frac{\text{Output}}{\text{Input Pressure}}
Typically, microphone sensitivity is measured using a 1 kHz sine wave at 94 dB SPL (Sound Pressure Level) or 1 Pa (Pascal) pressure. As mentioned, sensitivity is determined by measuring the magnitude of the analog or digital output signal from the microphone in response to a specific input sound stimulus.
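As a quick sanity check on that test condition, the relationship between dB SPL and pressure in pascals can be sketched in a few lines of Python (the function name is mine; the 20 µPa reference pressure is the standard threshold-of-hearing reference for SPL):

```python
P_REF = 20e-6  # standard SPL reference pressure: 20 micropascals

def spl_to_pascals(spl_db: float) -> float:
    """Convert a sound pressure level in dB SPL to pressure in pascals."""
    return P_REF * 10 ** (spl_db / 20)

# The standard 94 dB SPL test tone corresponds to roughly 1 Pa:
print(round(spl_to_pascals(94.0), 3))  # → 1.002
```

This confirms why 94 dB SPL and 1 Pa are used interchangeably as the reference stimulus: 94 dB SPL works out to about 1.002 Pa.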
Analog Microphone Sensitivity
The sensitivity of an analog microphone is calculated using the following equation:

\text{Sensitivity}_{\text{mV/Pa}} = \frac{\text{Output Voltage (mV)}}{\text{Input Pressure (Pa)}}
The unit for analog microphone sensitivity is mV/Pa in linear units.
This can also be expressed in dBV (decibels relative to 1 volt) using the following equation:

\text{Sensitivity}_{\text{dBV}} = 20 \cdot \log_{10}\left(\frac{\text{Sensitivity}_{\text{mV/Pa}}}{\text{Output}_{\text{AREF}}}\right)
where Output AREF is the reference output ratio of 1000 mV/Pa (i.e., 1 V/Pa).
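The dBV conversion above can be sketched directly in Python (the function name and the 10 mV/Pa example value are mine, chosen only to illustrate the formula):

```python
import math

OUTPUT_AREF_MV_PER_PA = 1000.0  # reference output ratio: 1 V/Pa = 1000 mV/Pa

def analog_sensitivity_dbv(sensitivity_mv_per_pa: float) -> float:
    """Convert analog microphone sensitivity from mV/Pa to dBV."""
    return 20 * math.log10(sensitivity_mv_per_pa / OUTPUT_AREF_MV_PER_PA)

# Example: a hypothetical microphone with 10 mV/Pa sensitivity
print(analog_sensitivity_dbv(10.0))  # → -40.0
```

Because the reference is 1 V/Pa, real analog microphones (whose sensitivity is well below 1000 mV/Pa) always come out with a negative dBV figure.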
Digital Microphone Sensitivity
The sensitivity of a digital microphone is calculated using the following equation:
\text{Sensitivity (\%)} = \frac{\text{Digital Output Value}}{\text{Full-Scale Output}} \cdot 100\%

It is measured as a percentage of the full-scale output generated by a 94 dB SPL input.
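A minimal sketch of this percentage-of-full-scale calculation, assuming a hypothetical 24-bit digital microphone (the function name, bit depth, and example output value are my own illustration, not from the article):

```python
def digital_sensitivity_percent_fs(digital_output: int, full_scale: int) -> float:
    """Sensitivity as a percentage of the digital full-scale output."""
    return digital_output / full_scale * 100.0

# Hypothetical 24-bit microphone whose 94 dB SPL output peaks
# at one quarter of full scale:
full_scale = 2 ** 23  # signed 24-bit full-scale magnitude
print(digital_sensitivity_percent_fs(full_scale // 4, full_scale))  # → 25.0
```

Note that the result is a dimensionless percentage, in contrast to the mV/Pa figure for analog microphones.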
Because the output units of analog and digital microphones differ, directly comparing the performance of the two types can be challenging.