In: Chemistry
Suppose you prepared a solution with an absorbance that was too high and off the scale of your calibration curve (Assume the Spec-20 instrument was properly calibrated). What would be the most likely problem? What would you do to reduce the absorbance? Explain why it should be reduced.
Ans. Beer-Lambert law: A = e·C·L   (equation 1)
where,
A = absorbance (unitless)
e = molar absorptivity at the specified wavelength (M^-1 cm^-1)
C = molar concentration of the solute (M)
L = path length (cm)
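As a quick numerical illustration of equation 1, here is a minimal Python sketch; the molar absorptivity and concentration are hypothetical values chosen only to show how an off-scale absorbance can arise, not data from the experiment.

```python
# Minimal sketch of the Beer-Lambert law (equation 1).
# epsilon and C below are assumed example values, not experimental data.
epsilon = 1.2e4   # molar absorptivity, M^-1 cm^-1 (assumed)
C = 2.0e-4        # molar concentration of the solute, M (assumed)
L = 1.0           # path length, cm (typical fixed cell)

A = epsilon * C * L            # A = e * C * L
print(f"Absorbance A = {A:.2f}")   # 2.40 -- too high, likely off the calibration curve
```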
#a. The absorbance of a solution is proportional to its concentration (Beer-Lambert law), so an absorbance that is too high most likely means the solution is too concentrated.
# Solutions obey the Beer-Lambert law and give a linear calibration graph only within a certain concentration range. If the solute concentration exceeds that threshold, the absorbance no longer increases proportionally with concentration, which leads to deviation from linearity.
#b. Theoretically, either decreasing the path length (say, from 1.0 cm to 0.5 cm) or decreasing the concentration would decrease the absorbance.
However, changing the path length is usually not practical because most spectrophotometers use a fixed 1.0 cm cell.
So the best way to decrease the absorbance is to dilute the solution by a known factor until the reading falls within the range of the calibration curve (a worked dilution is sketched below).
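A minimal sketch of the dilution arithmetic, assuming an off-scale absorbance of about 2.4, a target absorbance near the middle of the linear range, and a 10.00 mL volumetric flask; all of these numbers are illustrative, not from the experiment.

```python
# Dilution arithmetic sketch (C1*V1 = C2*V2); all values are assumed for illustration.
A_current = 2.4     # off-scale absorbance of the concentrated solution (assumed)
A_target = 0.5      # absorbance aimed for, within the linear range (assumed)

# Since A is proportional to C, the required dilution factor is A_current / A_target.
dilution_factor = A_current / A_target
print(f"Dilute by a factor of about {dilution_factor:.0f}x")

# Volume of stock needed to prepare 10.00 mL of diluted solution (C1*V1 = C2*V2).
V_final_mL = 10.00
V_stock_mL = V_final_mL / dilution_factor
print(f"Pipette {V_stock_mL:.2f} mL of stock and dilute to {V_final_mL:.2f} mL")
```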
#c. The absorbance should be reduced to bring it into the range where the Beer-Lambert law is obeyed. Once the absorbance is within that range, the reading falls on the linear portion of the calibration curve, which is essential for an accurate determination of the concentration; the concentration read from the curve is then multiplied by the dilution factor to recover the original concentration (see the sketch below).
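To show how the on-scale reading is used, here is a hedged sketch of fitting a straight-line calibration curve and back-calculating the unknown; the standard concentrations, absorbances, and dilution factor are invented for illustration only.

```python
import numpy as np

# Sketch: fit a linear calibration curve within the Beer-Lambert range and
# back-calculate the original concentration. Standards, readings, and the
# dilution factor below are assumed example values.
std_conc = np.array([1e-5, 2e-5, 4e-5, 6e-5, 8e-5])   # standard concentrations, M (assumed)
std_abs  = np.array([0.12, 0.24, 0.49, 0.73, 0.97])   # their absorbances (assumed)

slope, intercept = np.polyfit(std_conc, std_abs, 1)    # A = slope*C + intercept

A_diluted = 0.48          # absorbance of the diluted unknown, now on-scale (assumed)
dilution_factor = 5       # dilution used to bring it on-scale (assumed)

C_diluted = (A_diluted - intercept) / slope   # concentration of the diluted solution
C_original = C_diluted * dilution_factor      # undo the dilution
print(f"Original concentration ~ {C_original:.2e} M")
```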