Decreasing the quantization interval of an ADC will decrease the average quantization error. True or false?
"Decreasing the quantization interval of an ADC, will decrease
the average quantization error"
The above statement is TRUE.
We know that whenever a signal is converted into a number, that number can only take one of a discrete set of values. For example, when a signal is converted into an 8-bit number, it can take only 2^8 = 256 discrete values. If the analogue signal amplitude ranges from 0.0 V to 5.0 V, the quantization interval is 5/256, or about 0.0195 V. From this example we can see that the quantization interval is inversely proportional to the number of quantization levels (2^N), so using more bits gives a smaller interval.
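To make the arithmetic concrete, here is a minimal Python sketch (the function name and the unipolar 0-5 V assumption are mine, not part of the question) that computes the interval for any bit depth:

    # Sketch: quantization interval (step size) of an ideal N-bit ADC.
    # Assumes a unipolar input range; the names here are illustrative only.
    def quantization_interval(full_scale_volts: float, num_bits: int) -> float:
        """Step size = full-scale range divided by the number of levels (2**N)."""
        levels = 2 ** num_bits
        return full_scale_volts / levels

    # The 8-bit, 0.0-5.0 V example from above: 5 / 256
    print(quantization_interval(5.0, 8))   # 0.01953125
    # One extra bit doubles the number of levels and halves the interval:
    print(quantization_interval(5.0, 9))   # 0.009765625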
The majority of the quantized samples will differ from the original samples by a small amount; this small difference is called the quantization error, i.e. the quantization error is the difference between the sampled signal and the quantized signal. Because each sample is rounded to the nearest level, the error can never exceed half of the quantization interval, so the average error shrinks along with the interval. This error can therefore be reduced by increasing the number of quantization levels (the number of bits), and we have already explained that increasing the number of bits decreases the quantization interval. Therefore we reach the conclusion: decreasing the quantization interval of an ADC will decrease the average quantization error.
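As a rough illustration of this conclusion, the sketch below (a hypothetical simulation; numpy and the sine-wave test signal are my own choices, not part of the question) quantizes the same signal at several bit depths and prints the average absolute error, which falls as the interval shrinks:

    # Sketch: average quantization error shrinks as the interval shrinks (more bits).
    import numpy as np

    def average_quantization_error(signal, full_scale, num_bits):
        """Quantize to num_bits over [0, full_scale] and return mean absolute error."""
        step = full_scale / (2 ** num_bits)            # quantization interval
        quantized = np.round(signal / step) * step     # round to nearest level
        return np.mean(np.abs(signal - quantized))

    t = np.linspace(0, 1, 10_000)
    x = 2.5 + 2.4 * np.sin(2 * np.pi * 5 * t)          # 0-5 V test signal

    for bits in (4, 8, 12):
        err = average_quantization_error(x, 5.0, bits)
        print(f"{bits:2d} bits -> average |error| ~ {err:.6f} V")
    # Each extra bit halves the interval, and the average error falls with it.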
Thank you, Sir/Madam. Hope this helped you.