Basic concepts of quantum mechanics

This article is intended as an accessible, non-technical introduction to the subject. For the main encyclopedia article, see Quantum mechanics. For a somewhat more technical introduction that requires some algebra, see Introduction to quantum mechanics.
Quantum mechanics explains the behaviour of matter and energy on the scale of atoms and subatomic particles.
Classical physics explains matter and energy at the macroscopic scale familiar to human experience, including the behaviour of astronomical bodies. It remains the key to measurement for much of modern science and technology, but at the end of the 19th century observers discovered phenomena, in both the large (macro) and the small (micro) worlds, that classical physics could not explain.
This article describes the limitations of classical physics and explains the main concepts of the quantum theory that supplanted it in the early decades of the 20th century. These concepts are described in roughly the order in which they were first discovered.
Origins in black body radiation
Thermal radiation is electromagnetic radiation emitted from the surface of an object due to the object's temperature. If the object is heated sufficiently, it starts to emit light at the red end of the spectrum: it is red hot. Heating it further causes the colour to change, as light of shorter wavelengths (higher frequencies) becomes stronger. A good emitter is also a good absorber: when cold, such an object looks black, because it emits practically no visible light and absorbs all light that falls on it. Consequently, its perfect absorbing properties make it an ideal emitter; such an object is known as a black body, and the radiation it emits is called black body radiation.
By the late 19th century, thermal radiation had been fairly well characterised experimentally. The wavelength at which the radiation is strongest is given by Wien's displacement law, and the overall power emitted per unit area is given by the Stefan-Boltzmann law. As temperature increases, the glow colour changes from red to yellow to white to blue. Even as the peak wavelength moves into the ultraviolet, enough radiation is still emitted at blue wavelengths that the body continues to appear blue. It never becomes invisible: the radiation of visible light increases monotonically with temperature.
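As an illustrative sketch of the two laws just named (the constants are standard values; the function names and the example temperature are ours, not part of the historical account), both can be evaluated directly:

```python
# Illustrative sketch of the two classical radiation laws named above.

WIEN_B = 2.897771955e-3  # Wien's displacement constant, metre-kelvins
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def peak_wavelength_m(temperature_k):
    """Wien's displacement law: the wavelength of strongest emission."""
    return WIEN_B / temperature_k

def emitted_power_w_per_m2(temperature_k):
    """Stefan-Boltzmann law: total power radiated per unit surface area."""
    return SIGMA * temperature_k ** 4

# The Sun's surface (about 5772 K) peaks near 502 nm, in the visible range.
print(peak_wavelength_m(5772) * 1e9)   # about 502 nm
print(emitted_power_w_per_m2(5772))    # about 6.3e7 W per square metre
```

Doubling the temperature halves the peak wavelength and multiplies the emitted power by sixteen, which is why a heated body both brightens and shifts towards blue.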
Physicists were searching for a theoretical explanation of these experimental results.
The answer from classical physics is called the Rayleigh-Jeans law. This law agrees with experimental results at long wavelengths. At short wavelengths, however, it predicts that energy is emitted by a hot body at an infinite rate. This result, which is clearly wrong, is known as the ultraviolet catastrophe.
The first model that was able to explain the full spectrum of thermal radiation was put forward by Max Planck in 1900. He modelled the thermal radiation as being in equilibrium with a set of harmonic oscillators. But to reproduce the experimental results, each oscillator had to hold a whole number of units of energy, rather than being able to have any arbitrary amount. In other words, the energy of each oscillator was quantised. He determined that the size of each energy unit was proportional to the oscillator's frequency, so that the oscillator's energy was always an exact multiple of the frequency times a constant now known as the Planck constant.
Planck's law was the first quantum theory in physics, and Planck won the Nobel Prize in 1918 "in recognition of the services he rendered to the advancement of Physics by his discovery of energy quanta". At the time, however, Planck's own view was that quantisation was purely a mathematical trick to explain the unexpected experimental results, rather than (as we now know) a fundamental change in our understanding of the world.
Photons: the quantisation of light
In 1905, Albert Einstein took an extra step. He suggested that quantisation wasn't just a mathematical trick, but that the energy in a beam of light occurs in individual packets now called photons. The energy of a single photon is Planck's constant multiplied by the photon's frequency.
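As a quick numerical sketch of this relation (the example wavelength is our own choice, for illustration):

```python
# Photon energy E = h * f, with frequency f = c / wavelength.
H = 6.62607015e-34  # Planck constant, joule-seconds
C = 2.99792458e8    # speed of light, metres per second

def photon_energy_j(wavelength_m):
    """Energy of a single photon of the given wavelength."""
    return H * C / wavelength_m

# A single photon of green light (550 nm) carries about 3.6e-19 joules,
# or roughly 2.25 electronvolts.
energy = photon_energy_j(550e-9)
print(energy)                     # about 3.6e-19 J
print(energy / 1.602176634e-19)  # about 2.25 eV
```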
Einstein's proposal was able to explain several puzzling properties of the photoelectric effect, which is the way certain metals give off electrons when light falls on them. For centuries, scientists had debated between two possible theories of light: was it a wave, or did it instead consist of a stream of tiny particles? By the 19th century, the debate was generally considered to have been settled in favour of the wave theory, because it was able to explain observed effects such as refraction, diffraction and polarisation. Because of this, Einstein's proposal was met with great scepticism. Eventually, however, his particle analogy became accepted, as it helped explain how light delivers energy in multiples of certain set values, called quanta of energy. Nevertheless, the wave analogy remained indispensable for explaining other phenomena of light, such as diffraction.
Thus, for the first time, an object exhibiting both wave-like and particle-like characteristics had been modelled in terms of discrete energy levels.
Bohr model of the atom
By the early 20th century, it was known that atoms consisted of a diffuse cloud of negatively charged electrons surrounding a small, dense, positively charged nucleus. This suggested a model in which the electrons circled around the nucleus like planets orbiting the sun. However, it was also known that the atom in this model would be unstable: the orbiting electrons should give off electromagnetic radiation, causing them to lose energy and spiral towards the nucleus, colliding with it in a fraction of a second.
A second, related, puzzle was the emission spectrum of atoms. When a gas is heated, it gives off light at certain discrete frequencies; for example, the visible light given off by hydrogen consists of four different colours, as shown in the picture below. In contrast, white light contains light at the whole range of visible frequencies.
In 1913 Niels Bohr proposed a new model of the atom that included quantised electron orbits. This solution became known as the Bohr model of the atom. In Bohr's model, electrons could inhabit only particular orbits around the atomic nucleus. When an atom emits or absorbs energy, the electron does not move in a continuous trajectory from one orbit around the nucleus to another, as might be expected in classical theory. Instead, the electron jumps instantaneously from one orbit to another, giving off the difference in energy as light in the form of a photon. The Bohr model was able to explain the emission spectrum of hydrogen, but wasn't able to make accurate predictions for multi-electron atoms, or to explain why some spectral lines are brighter than others.
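The hydrogen spectrum that the Bohr model explains can be sketched numerically: the four visible lines follow the Rydberg formula, 1/λ = R(1/4 − 1/n²), which the model reproduces. (The code below is an illustration; the constant is the standard Rydberg value.)

```python
# Illustrative sketch: the four visible hydrogen lines (the Balmer series)
# correspond to electron jumps from orbits n = 3, 4, 5, 6 down to orbit 2.
RYDBERG = 1.0973731568e7  # Rydberg constant, per metre

def balmer_wavelength_nm(n):
    """Wavelength of the light emitted when an electron drops from orbit n to orbit 2."""
    inverse_wavelength = RYDBERG * (1 / 2**2 - 1 / n**2)
    return 1e9 / inverse_wavelength

for n in range(3, 7):
    print(n, round(balmer_wavelength_nm(n), 1))
# n=3: ~656 nm (red); n=4: ~486 nm (blue-green); n=5: ~434 nm; n=6: ~410 nm (violet)
```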
Wave-particle duality
Quantum mechanics is based upon the concept that subatomic particles can have both wave-like and particle-like properties. This phenomenon is known as wave-particle duality. The explanation stems from a theory proposed by French physicist Louis de Broglie in 1924, that subatomic particles such as electrons are associated with waves. Experiments later showed that he was correct: electrons can bend around objects and can display wave shapes.
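De Broglie's relation, wavelength = h / momentum, can also be sketched numerically. (The masses and speeds below are our own illustrative choices.)

```python
# De Broglie wavelength: lambda = h / (m * v).
H = 6.62607015e-34                # Planck constant, joule-seconds
ELECTRON_MASS = 9.1093837015e-31  # kilograms

def de_broglie_wavelength_m(mass_kg, speed_m_s):
    """Wavelength associated with a particle of the given mass and speed."""
    return H / (mass_kg * speed_m_s)

# An electron at one percent of light speed has a wavelength comparable to
# the spacing of atoms in a crystal, so its diffraction is observable.
print(de_broglie_wavelength_m(ELECTRON_MASS, 3e6))  # about 2.4e-10 m

# A 0.1 kg ball at 10 m/s has a wavelength far too small ever to observe,
# which is why everyday objects show no wave behaviour.
print(de_broglie_wavelength_m(0.1, 10))             # about 6.6e-34 m
```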
Consequently, neither wave nor particle is an entirely satisfactory model to use in understanding light. Indeed, astrophysicist A.S. Eddington proposed in 1927 that "We can scarcely describe such an entity as a wave or as a particle; perhaps as a compromise we had better call it a 'wavicle'". This term was later popularised by mathematician Banesh Hoffmann.
The concepts of wave and particle, and the analogies that use them, are tools of classical physics. Quantum mechanics, which seeks to explain nature at a level underlying that of the atoms which comprise matter, cannot be understood in such terms. The classical concepts presuppose an artificial division between matter (as particles) and energy (as waves) that has no objective validity on the sub-atomic level. If that distinction does not hold, it is not surprising that quantum objects can exhibit the characteristics of either.
Uncertainty principle
Suppose that we want to measure the position and speed of an object, for example a car going through a radar speed trap. Naively, we assume that at a particular moment in time the car has a definite position and speed, and that how accurately we can measure these values depends only on the quality of our measuring equipment: if we improve the precision of our measuring equipment, we will get a result that is closer to the true value. In particular, we would assume that how precisely we measure the speed of the car does not affect its position, and vice versa.
In 1927 German physicist Werner Heisenberg proved that in the sub-atomic world such assumptions are not correct. Quantum mechanics shows that certain pairs of physical properties, such as position and speed, cannot both be known to arbitrary precision: the more precisely one of them is known, the less precisely the other can be known. This statement is known as the uncertainty principle (or Heisenberg's uncertainty principle). It is not a statement about the accuracy of our measuring equipment, but about the nature of the system itself: our naive assumption that an object has a definite position and speed is incorrect. On the scale of cars and people these uncertainties are still present but are far too small to be noticed; when dealing with individual atoms and electrons, however, they become critical.
Heisenberg gave, as an illustration, the measurement of the position and momentum of an electron using a photon of light. The higher the frequency of the photon, the more accurate the measurement of the position of the impact, but the greater the disturbance of the electron, which absorbs a random amount of energy. This makes the measurement of its momentum (momentum is mass multiplied by velocity) increasingly uncertain, for one necessarily measures its post-impact, disturbed momentum from the collision products, not its original momentum. With a photon of lower frequency the disturbance, and hence the uncertainty, in the momentum is smaller, but so is the accuracy of the measurement of the position of the impact.
The uncertainty principle shows mathematically that the product of the uncertainty in the position and momentum of a particle can never be less than a certain value, and that this value is related to Planck's constant: up to a small numerical factor, the minimum product equals Planck's constant.
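As a numerical sketch of this bound (in its modern form the minimum product is h divided by 4π; the confinement size below is our own illustrative choice):

```python
# Minimum momentum spread implied by confining a particle to a region of
# size delta_x, using the bound delta_x * delta_p >= h / (4 * pi).
import math

H = 6.62607015e-34                # Planck constant, joule-seconds
ELECTRON_MASS = 9.1093837015e-31  # kilograms

def min_momentum_uncertainty(delta_x_m):
    """Smallest momentum uncertainty allowed for a position uncertainty delta_x."""
    return H / (4 * math.pi * delta_x_m)

# An electron confined to an atom-sized region (about 1e-10 m) must have a
# speed uncertainty of several hundred kilometres per second.
dp = min_momentum_uncertainty(1e-10)
print(dp / ELECTRON_MASS)  # about 5.8e5 m/s
```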
Schrödinger's wave equation
Although Heisenberg had no problem with the existence of discontinuous quantum jumps, Austrian physicist Erwin Schrödinger hoped that a theory based on continuous wave-like properties could avoid what he called (in the reported words of Wilhelm Wien) "this nonsense about quantum jumps."
Building on De Broglie's theoretical model of particles as waves, Schrödinger accordingly brought forth in 1926 what has been called "the fundamental equation" of quantum mechanics.
Schrödinger's wave equation is as central to quantum mechanics as Einstein's equation <math>E = mc^2</math> is to relativity.
The equation describes the probability waves that govern the motion of sub-atomic particles, and it specifies how these waves are altered by external influences. Schrödinger established the correctness of the equation by applying it to the hydrogen atom, predicting many of its properties with remarkable accuracy. The equation is used extensively in atomic, nuclear, and solid-state physics.
The uncertainty principle states that an electron cannot be viewed as having an exact location at any given time. The concepts of exact position and exact velocity (distance traveled per unit of time) taken together really have no meaning in nature. An orbital, then, is a "cloud" of possible locations in which an electron might be found, a distribution of probabilities rather than a precise location.
Quantum field theory
The idea of quantum field theory began in the late 1920s with British physicist Paul Dirac, when he attempted to quantise the electromagnetic field, that is, to construct a quantum theory of the field starting from the classical theory.
A field in physics is "a region or space in which a given effect (such as magnetism) exists." Other effects that manifest themselves as fields are gravitation and static electricity. In 2008, physicist Richard Hammond wrote that
Sometimes we distinguish between quantum mechanics (QM) and quantum field theory (QFT). QM refers to a system in which the number of particles is fixed, and the fields (such as the electromagnetic field) are continuous classical entities. QFT . . . goes a step further and allows for the creation and annihilation of particles . . . .
He added, however, that quantum mechanics is often used to refer to "the entire notion of quantum view."
In 1931, Dirac proposed the existence of particles that later became known as antimatter. Dirac shared the Nobel Prize in Physics for 1933 with Schrödinger, "for the discovery of new productive forms of atomic theory."
Practical use
Much of the value of quantum mechanics lies in its practical applications. Examples include the laser, the transistor, the electron microscope, and magnetic resonance imaging. The study of semiconductors led to the invention of the diode and the transistor, which are indispensable for modern electronics.
Even in the simple light switch, quantum tunnelling is vital: without it, the electrons in the electric current could not penetrate the potential barrier formed by a layer of oxide. Flash memory chips found in USB drives also use quantum tunnelling to erase their memory cells.
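For a rough sense of why barrier thickness matters (the barrier heights and widths below are illustrative choices, not data about real oxide layers), the tunnelling probability through a simple rectangular barrier falls off roughly exponentially with thickness:

```python
# Approximate tunnelling probability through a rectangular barrier,
# T ~ exp(-2 * kappa * d), with kappa = sqrt(2 m (U - E)) / hbar.
# All numerical values here are illustrative.
import math

HBAR = 1.054571817e-34            # reduced Planck constant, joule-seconds
ELECTRON_MASS = 9.1093837015e-31  # kilograms
EV = 1.602176634e-19              # joules per electronvolt

def tunnel_probability(barrier_ev, energy_ev, width_m):
    """Rough transmission probability for an electron striking the barrier."""
    kappa = math.sqrt(2 * ELECTRON_MASS * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# A nanometre-scale barrier passes a small but significant fraction of the
# enormous number of electrons in a current; a barrier ten times thicker
# passes essentially none.
print(tunnel_probability(5.0, 1.0, 1e-9))
print(tunnel_probability(5.0, 1.0, 10e-9))
```

The exponential dependence is the design point: making an oxide layer a few times thinner can change the tunnelling current by many orders of magnitude.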
 