Atomic absorption spectroscopy

In analytical chemistry, atomic absorption spectroscopy is a technique for determining the concentration of a particular metal element in a sample. It can be used to measure the concentrations of over 62 different metals in a solution.

Although atomic absorption spectroscopy dates to the nineteenth century, the modern form was largely developed during the 1950s by a team of Australian chemists. They were led by Alan Walsh and worked at the CSIRO (Commonwealth Scientific and Industrial Research Organisation) Division of Chemical Physics in Melbourne, Australia.

Principles
The technique makes use of absorption spectrometry to assess the concentration of an analyte in a sample. It therefore relies heavily on the Beer-Lambert law.

In short, the electrons of the atoms in the atomizer can be promoted to higher orbitals for an instant by absorbing a set quantity of energy (i.e. light of a given wavelength). This amount of energy (or wavelength) is specific to a particular electron transition in a particular element, and in general, each wavelength corresponds to only one element. This gives the technique its elemental selectivity.
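The link between the absorbed wavelength and the transition energy is the photon-energy relation E = hc/λ. As a minimal sketch of that conversion (the 324.8 nm value used below is copper's well-known resonance line; the constants are standard physical values):

```python
PLANCK_H = 6.626e-34   # Planck constant, J*s
LIGHT_C = 2.998e8      # speed of light, m/s
J_PER_EV = 1.602e-19   # joules per electronvolt

def transition_energy_ev(wavelength_nm: float) -> float:
    """Photon energy E = h*c/lambda for a given absorption wavelength,
    returned in electronvolts."""
    wavelength_m = wavelength_nm * 1e-9
    return PLANCK_H * LIGHT_C / wavelength_m / J_PER_EV

# Copper's principal resonance line at 324.8 nm corresponds to roughly 3.8 eV,
# an energy specific to that electron transition in copper.
energy = transition_energy_ev(324.8)
```

Because each element's transitions sit at different energies, tuning the instrument to one such wavelength selects one element.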

As the quantity of energy (the power) put into the flame is known, and the quantity remaining at the other side (at the detector) can be measured, the Beer-Lambert law makes it possible to calculate how many of these transitions took place, and thus to obtain a signal that is proportional to the concentration of the element being measured.
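In practice the instrument reports an absorbance, and the concentration is read off a calibration curve built from standards of known concentration. A minimal sketch of that arithmetic, assuming a hypothetical linear calibration slope (the numbers are illustrative, not from any real instrument):

```python
import math

def absorbance(incident_power: float, transmitted_power: float) -> float:
    """Absorbance A = log10(P0 / P), per the Beer-Lambert law."""
    return math.log10(incident_power / transmitted_power)

def concentration(absorbance_value: float, calibration_slope: float) -> float:
    """Concentration from a linear calibration A = slope * c, where the
    slope is determined beforehand from standards of known concentration."""
    return absorbance_value / calibration_slope

# Half the light is absorbed, so A = log10(2) ~ 0.301;
# with a hypothetical slope of 0.15 L/mg this gives ~2.0 mg/L.
A = absorbance(100.0, 50.0)
c = concentration(A, 0.15)
```

The proportionality only holds over the linear range of the calibration curve, which is why standards bracketing the expected concentration are normally run first.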

Instrumentation
In order to analyze a sample for its atomic constituents, it must first be atomized. The sample is then illuminated by light, and the transmitted light is measured by a detector. To reduce the effect of emission from the atomizer (e.g. black-body radiation) or from the environment, a spectrometer is normally placed between the atomizer and the detector.

Types of Atomizer
The technique typically makes use of a flame to atomize the sample, but other atomizers such as a graphite furnace or plasmas, primarily inductively coupled plasmas, are also used.

When a flame is used, it is arranged so that it is laterally long (usually 10 cm) and not deep. The height of the flame above the burner head can be controlled by adjusting the flow of the fuel mixture. A beam of light passes through this flame at its longest axis (the lateral axis) and hits a detector.

Analysis of liquids
A liquid sample is normally turned into an atomic gas in three steps:

 1. Desolvation – the liquid solvent is evaporated, leaving the dry sample.
 2. Vaporization – the solid sample vaporizes into a gas.
 3. Atomization – the compounds making up the sample are broken into free atoms.

Light Sources
The light source chosen should have a spectral width narrower than that of the atomic transitions.

Hollow cathode lamps
In its conventional mode of operation, the light is produced by a hollow cathode lamp. Inside the lamp is a cylindrical metal cathode containing the metal for excitation, and an anode. When a high voltage is applied across the anode and cathode, the metal atoms in the cathode are excited into producing light with a specific wavelength. The type of hollow cathode tube depends on the metal being analyzed. For analyzing the concentration of copper in an ore, a copper cathode tube would be used, and likewise for any other metal being analyzed.

Diode lasers
Atomic absorption spectroscopy can also be performed by lasers, primarily diode lasers because of their good properties for laser absorption spectrometry. The technique is then either referred to as diode laser atomic absorption spectrometry (DLAAS or DLAS), or, since wavelength modulation most often is employed, wavelength modulation absorption spectrometry.

Background Correction methods
The narrow bandwidth of hollow cathode lamps makes spectral overlap rare. That is, it is unlikely that an absorption line from one element will overlap with that of another. Molecular emission is much broader, however, so it is more likely that a molecular absorption band will overlap with an atomic line. This can result in artificially high absorption and an erroneously high value for the concentration in the solution. Three methods are typically used to correct for this:
 * Zeeman correction - A magnetic field is used to split the atomic line into two sidebands (see Zeeman effect). These sidebands are close enough to the original wavelength to still overlap with molecular bands, but are far enough not to overlap with the atomic line. The absorption in the presence and absence of the magnetic field can be compared, the difference being the atomic absorption of interest.

 * Smith-Hieftje correction (invented by Stanley B. Smith and Gary M. Hieftje) - The hollow cathode lamp is pulsed with high current, causing a larger atom population and self-absorption during the pulses. This self-absorption causes a broadening of the line and a reduction of the line intensity at the original wavelength. The absorption measured during the high-current pulse therefore approximates the background alone, and comparing it with the normal low-current measurement isolates the atomic absorption.

 * Deuterium lamp correction - In this case, a separate source (a deuterium lamp) with broad emission is used to measure the background emission. The use of a separate lamp makes this method the least accurate, but its relative simplicity (and the fact that it is the oldest of the three) makes it the most commonly used method.
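All three methods reduce to the same arithmetic: a second measurement estimates the broad background absorption, which is subtracted from the total absorbance at the analytical line. A minimal sketch with hypothetical readings (the values are illustrative only):

```python
def corrected_absorbance(total_absorbance: float,
                         background_absorbance: float) -> float:
    """Subtract the broad background (e.g. molecular) absorption, estimated
    by a second reading (Zeeman-split, high-current pulse, or deuterium
    lamp), from the total absorbance at the atomic line."""
    return total_absorbance - background_absorbance

# Hypothetical readings: 0.45 total at the atomic line, of which 0.12
# is background measured by the correction channel.
atomic_A = corrected_absorbance(0.45, 0.12)
```

Only the corrected absorbance is then fed into the Beer-Lambert calibration, so the molecular overlap no longer inflates the reported concentration.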