In a GC calibration curve, which concentration remains constant and which one changes (analyte vs. internal standard)?
An internal standard is a known amount of a compound, different from the analyte, that is added to the unknown. The signal from the analyte is compared with the signal from the internal standard to find out how much analyte is present. To answer the question directly: in the calibration standards, the internal standard is added at the same, constant concentration throughout, while the analyte concentration is varied from standard to standard.
Internal standards are especially useful for analyses in which the
quantity of sample analysed or the instrument response varies
slightly from run to run for reasons that are difficult to
control.
For example, gas or liquid flow rates that vary by a few percent in a chromatography experiment can change the detector response. A calibration curve is accurate only for the one set of conditions under which it is obtained.
However, the relative response of the detector to the analyte and the standard is usually constant over a wide range of conditions. If the signal from the standard increases by 8.4% because of a change in solvent flow rate, the signal from the analyte usually increases by 8.4% as well. As long as the concentration of the standard is known, the correct concentration of analyte can be derived from the ratio of the two signals.
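As a sketch of that arithmetic (all peak areas and concentrations below are hypothetical, not from the original answer), the constant relative response is usually expressed as a response factor F defined by A_x/[X] = F · (A_s/[S]). Once F is measured from a mixture of known concentrations, the unknown [X] follows from the measured area ratio:

```python
def response_factor(area_x, conc_x, area_s, conc_s):
    # F from a standard mixture where both [X] and [S] are known:
    # A_x / [X] = F * (A_s / [S])  =>  F = (A_x * [S]) / (A_s * [X])
    return (area_x * conc_s) / (area_s * conc_x)

def analyte_concentration(area_x, area_s, conc_s, f):
    # Solve the same relation for the unknown [X] in a sample
    # spiked with a known concentration of internal standard.
    return (area_x * conc_s) / (f * area_s)

# Hypothetical numbers for illustration only:
f = response_factor(area_x=423.0, conc_x=0.0837, area_s=347.0, conc_s=0.0666)
unknown = analyte_concentration(area_x=553.0, area_s=582.0, conc_s=0.0715, f=f)
print(f"F = {f:.3f}, [X] = {unknown:.4f} M")
```

Because only the area ratio enters the calculation, a flow-rate drift that scales both peaks by the same few percent cancels out.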
Internal standards are widely used in chromatography because the
small quantity of sample solution injected into the chromatograph
is not very reproducible in some experiments.
Internal standards are also desirable when sample loss can occur
during sample preparation steps prior to analysis. If a known
quantity of standard is added to the unknown prior to any
manipulations, the ratio of standard to analyte remains constant
because the same fraction of each is lost in any operation.
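A quick numeric check of that last point (the concentrations and recovery fraction are made up for illustration): if the same fraction of analyte and standard survives a workup step, their ratio is unchanged, so quantitation against the standard is unaffected by the loss:

```python
conc_x, conc_s = 0.100, 0.0500  # concentrations before workup (hypothetical)
recovery = 0.62                 # fraction of each surviving extraction (hypothetical)

ratio_before = conc_x / conc_s
ratio_after = (conc_x * recovery) / (conc_s * recovery)
print(ratio_before, ratio_after)  # both 2.0: the ratio survives the loss
```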