Why could the second measurement of the absorbance of the most concentrated sample differ from the first?
The Beer-Lambert law that underlies absorbance measurements rests on the following assumptions:
1) Each absorbing molecule (chromophore) is independent of the others - they do not interact with each other - that is, they behave as if in an ideal solution.
2) Each molecule in the solution has an equal probability of absorbing a photon when placed in a beam of light.
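Under these assumptions the familiar linear relation holds (symbols are the conventional ones: $A$ absorbance, $\varepsilon$ molar absorptivity, $c$ concentration, $\ell$ path length):

$$A = \varepsilon \, c \, \ell$$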
These assumptions break down at high concentrations - for example, one chromophore molecule can shade another from the incident beam. The absorbance observed at high concentrations is therefore lower than it should be, which shows up as the asymptotic flattening of a standard curve at high concentrations.

Older spectrophotometers did not correct for this effect, so we were taught to work strictly in the range of about 0.1 to 0.6 absorbance units. If a sample read higher than that, one diluted it to a lower concentration, or used a lower concentration of the chromogenic substrate in the reaction being monitored.
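As a rough illustration of that dilute-and-remeasure workflow, here is a minimal sketch in Python (the function name and target value are hypothetical; the 0.1-0.6 window is the one quoted above) of choosing a dilution factor that brings an over-range reading back into the usable window:

```python
import math

# Minimal sketch, not from the original answer: suggest a dilution factor that
# brings a reading back into the roughly linear 0.1-0.6 Abs working window.
LINEAR_MIN, LINEAR_MAX = 0.1, 0.6  # usable range in absorbance units

def suggested_dilution(measured_abs: float, target_abs: float = 0.4) -> int:
    """Return a whole-number dilution factor aiming near the middle of the window.

    Absorbance is (ideally) proportional to concentration, so diluting n-fold
    should reduce the reading roughly n-fold. At very high concentrations the
    measured value underestimates the true one, so the diluted sample still
    needs to be re-measured rather than simply back-calculated.
    """
    if measured_abs <= LINEAR_MAX:
        return 1  # already within (or below) the usable range
    return math.ceil(measured_abs / target_abs)

# Example: a sample reading 1.8 Abs would be diluted about 5-fold and re-measured.
print(suggested_dilution(1.8))  # -> 5
```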