


Chapter 1
Introduction




1.1 Scope and aims of this book

Quantum optics is one of the liveliest fields in physics at present. It has been a major research field for at least two decades, with much graduate-level activity, and in the past few years it has begun to make an impact on the undergraduate curriculum. This book developed from courses we have taught to final-year undergraduates and beginning graduate students at Imperial College London and City University of New York. There are plenty of good research monographs in this field, but we felt that there was a genuine need for a straightforward account, stressing basic concepts, written for senior undergraduates and beginning postgraduates. The field currently attracts some of the brightest students, in part because of its extraordinary recent progress (e.g. the implementation of teleportation, quantum cryptography, Schrödinger cat states, Bell violations of local realism and the like). We hope that this book provides an accessible introduction to this exciting subject.

   Our aim was to write an elementary book on the essentials of quantum optics, directed at upper-level undergraduates who are assumed to have suffered through a course in quantum mechanics, and at first- or second-year graduate students interested in eventually pursuing research in this area. The material we introduce is not simple, and will be a challenge for undergraduates and beginning graduate students, but we have tried to use the most straightforward approaches. Nevertheless, there are parts of the text that the reader will find more challenging than others. The problems at the end of each chapter similarly have a range of difficulty. The presentation is almost entirely concerned with the quantized electromagnetic field, its effects on atoms, and the behaviour of nonclassical light. One aim of this book is to connect quantum optics with the newly developing subject of quantum information processing.

   Topics covered are: single-mode field quantization in a cavity, quantization of multimode fields, the issue of the quantum phase, coherent states, quasi-probability distributions in phase space, atom–field interactions, the Jaynes–Cummings model, quantum coherence theory, beam splitters and interferometers, nonclassical field states with squeezing, etc., tests of local realism with entangled photons from down-conversion, experimental realizations of cavity quantum electrodynamics, trapped ions, etc., issues regarding decoherence, and some applications to quantum information processing, particularly quantum cryptography. The book includes many homework problems for each chapter and bibliographies for further reading. Many of the problems involve computational work, some more extensively than others.


1.2 History

In this chapter we briefly survey the historical development of our ideas of optics and photons. A detailed account can be found, for example, in the “Historical Introduction” to the 6th edition of Born and Wolf. A most readable account of the development of quantum ideas can be found in a recent book by Whitaker [1], and a recent article by A. Muthukrishnan, M. O. Scully and M. S. Zubairy [2] ably surveys the historical development of our ideas on light and photons.

   The ancient world was already wrestling with the nature of light as rays. By the seventeenth century the two rival concepts of waves and corpuscles were well established. Maxwell, in the second half of the nineteenth century, laid the foundations of modern field theory with a detailed account of light as electromagnetic waves; at that point classical physics seemed triumphant, with “minor” worries about the nature of black-body radiation and of the photoelectric effect. These, of course, were the seeds of the quantum revolution. Planck, an inherently conservative theorist, was led rather reluctantly, it seems, to propose that thermal radiation was emitted and absorbed in discrete quanta in order to explain the spectra of thermal bodies. It was Einstein who generalized this idea so that these new quanta represented the light itself rather than the processes of absorption and emission, and who was able to describe how matter and radiation could come into equilibrium (introducing on the way the idea of stimulated emission), and how the photoelectric effect could be explained. By 1913, Bohr had applied the basic idea of quantization to atomic dynamics and was able to predict the positions of atomic spectral lines.

   Gilbert Lewis, a chemist, coined the word “photon” well after the idea of light quanta itself was introduced. In 1926 Lewis wrote:

It would seem inappropriate to speak of one of these hypothetical entities as a particle of light, a corpuscle of light, a light quantum, or light quant, if we are to assume that it spends only a minute fraction of its existence as a carrier of radiant energy, while the rest of the time it remains as an important structural element within the atom . . . I therefore take the liberty of proposing for this hypothetical new atom, which is not light but plays an important part in every process of radiation, the name photon [3].

Clearly Lewis’s idea and ours are rather distantly connected!

   De Broglie, in a remarkable leap of imagination, generalized to matter itself what we knew about light quanta: that they exhibit both wave and particle properties. Heisenberg, Schrödinger and Dirac laid the foundations of quantum mechanics in an amazingly short period from 1925 to 1926. They gave us the whole machinery we still use: representations, quantum-state evolution, unitary transformations, perturbation theory and more. The intrinsic probabilistic nature of quantum mechanics was uncovered by Max Born, who proposed the idea of probability amplitudes, which allowed a fully quantum treatment of interference.

   Fermi and Dirac, pioneers of quantum mechanics, were also among the first to address the question of how quantized light interacts with atomic sources and propagates. Fermi’s Reviews of Modern Physics article in the 1930s, based on lectures he gave in Ann Arbor, summarizes what was known at that time within the context of nonrelativistic quantum electrodynamics in the Coulomb gauge. His treatment of interference (especially Lippmann fringes) still repays reading today. It is useful to quote Willis Lamb in this context:

Begin by deciding how much of the universe needs to be brought into the discussion. Decide what normal modes are needed for an adequate treatment. Decide how to model the light sources and work out how they drive the system [4].

This statement sums up the approach we will take throughout this book.

   Weisskopf and Wigner applied the newly developed ideas of non-relativistic quantum mechanics to the dynamics of spontaneous emission and resonance fluorescence, predicting the exponential law for excited-state decay. This work already exhibited the self-energy problems which were to plague quantum electrodynamics for the next 20 years, until the development of the renormalization programme by Schwinger, Feynman, Tomonaga, and Dyson. The observations of the anomalous magnetic moment of the electron by Kusch, and of radiative level shifts of atoms by Lamb and Retherford, were the highlights of this era. The interested reader will find the history of this period very ably described by Schweber in his magisterial account of QED [5]. This period of research demonstrated the importance of considering the vacuum as a field with observable consequences. In a remarkable development in the late 1940s, triggered by the observation that colloids were more stable than expected from considerations of van der Waals interactions, Casimir showed that long-range intermolecular forces were intrinsically quantum electrodynamic. He linked them to the idea of zero-point motion of the field and showed that metal plates in vacuum attract as a consequence of such zero-point motion.

   Einstein had continued his study of the basic nature of quantum mechanics and in 1935 in a remarkable paper with Podolsky and Rosen was able to show how peculiar quantum correlations were. The ideas in this paper were to explode into one of the most active parts of modern physics with the development by Bohm and Bell of concrete predictions of the nature of these correlations; this laid the foundations of what was to become the new subject of quantum information processing.

   Optical coherence had been investigated for many years using amplitude interference: a first-order correlation. Hanbury Brown and Twiss in the 1950s worked on intensity correlations as a tool in stellar interferometry, and showed how thermal photon detection events were “bunched.” This led to the development of the theory of photon statistics and photon counting and to the beginnings of quantum optics as a separate subject. At the same time as ideas of photon statistics were being developed, researchers had begun to investigate coherence in light–matter interactions. Radio-frequency spectroscopy of atomic beams had already been initiated by the work of Rabi, Ramsey and others. Sensitive optical pumping probes of light interaction with atoms were developed in the 1950s and 1960s by Kastler, Brossel, Series, Dodd and others.

   By the early 1950s, Townes and his group, and Basov and Prokhorov, had developed molecular microwave sources of radiation: the new masers, based on precise initial state preparation, population inversion and stimulated emission. Ed Jaynes in the 1950s played a major role in studies of whether field quantization was necessary to describe maser operation (and this set the stage for much later work on fully quantized atom–field coupling in what became known as the Jaynes–Cummings model). Extending the maser idea to the optical regime and the development of lasers of course revolutionized modern physics and technology.

   Glauber, Wolf, Sudarshan, Mandel, Klauder and many others developed a quantum theory of coherence based on coherent states and photodetection. Coherent states allowed us to describe the behaviour of light in phase space, using the quasi-probabilities developed much earlier by Wigner and others.

   For several years after the development of the laser there were no tuneable sources: researchers interested in the details of atom–light or molecule–light interactions had to rely on chance molecular resonances. Nevertheless, this led to the beginning of the study of coherent interactions and coherent transients such as photon echoes, self-induced transparency, optical nutation and so on (well described in the standard monograph by Allen and Eberly). Tuneable lasers became available in the early 1970s, and the dye laser in particular transformed precision studies in quantum optics and laser spectroscopy. Resonant interactions, coherent transients and the like became much more straightforward to study and led to the beginnings of quantum optics proper as we now understand it: for the first time we were able to study the dynamics of single atoms interacting with light in a non-perturbative manner. Stroud and his group initiated studies of resonance fluorescence with the observation of the splitting of resonance fluorescence spectral lines into component parts by the coherent driving, predicted earlier by Mollow. Mandel, Kimble and others demonstrated how the resonance fluorescence light was antibunched, a feature studied by a number of theorists including Walls, Carmichael, Cohen-Tannoudji, Mandel and Kimble. The observation of antibunching, and of the associated (but inequivalent) sub-Poissonian photon statistics, laid the foundation of the study of “non-classical light”. During the 1970s, several experiments explored the nature of photons: their indivisibility and the build-up of interference at the single-photon level. Laser cooling developed rapidly in the 1980s and 1990s and allowed the preparation of states of matter under precise control. Indeed, this has become a major subject in its own right, and we have taken the decision here to exclude laser cooling from this text.
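
For orientation (the formal definitions are developed later in the book, in Chapters 5 and 7), these properties are usually expressed through the normalized second-order correlation function and the Mandel Q parameter,
\[
g^{(2)}(0) = \frac{\langle \hat{a}^{\dagger}\hat{a}^{\dagger}\hat{a}\hat{a}\rangle}{\langle \hat{a}^{\dagger}\hat{a}\rangle^{2}}, \qquad
Q = \frac{\langle(\Delta \hat{n})^{2}\rangle - \langle \hat{n}\rangle}{\langle \hat{n}\rangle}.
\]
Thermal (bunched) light has \(g^{(2)}(0) > 1\) and coherent light has \(g^{(2)}(0) = 1\), whereas antibunched light satisfies \(g^{(2)}(0) < g^{(2)}(\tau)\) and sub-Poissonian light has \(Q < 0\); the two conditions are related but, as noted above, not equivalent.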

   Following the development of high-intensity pulses of light from lasers, a whole range of nonlinear optical phenomena was investigated, starting with the pioneering work in Ann Arbor by Franken and co-workers. Harmonic generation, parametric down-conversion and other phenomena were demonstrated. For the most part, this early work on nonlinear optics did not require field quantization, or quantum optics proper, for its description. But there were early signs that some of it could well do so: quantum nonlinear optics was really initiated by the study by Burnham and Weinberg (see Chapter 9) of unusual nonclassical correlations in down-conversion. In the hands of Mandel and many others, these correlations in down-conversion became the basic tool used to uncover fundamental insights into quantum optics.
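
As a reminder of why down-converted photon pairs are so strongly correlated: in spontaneous parametric down-conversion a pump photon is converted into a signal photon and an idler photon subject to energy conservation and phase matching,
\[
\omega_p = \omega_s + \omega_i, \qquad \mathbf{k}_p \approx \mathbf{k}_s + \mathbf{k}_i ,
\]
so the two photons of a pair are emitted essentially simultaneously, with correlated frequencies and directions.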

   Until the 1980s, essentially all light fields investigated had phase-independent noise; this changed with the production of squeezed light sources with phase-sensitive noise. These squeezed light sources enabled us to investigate Heisenberg uncertainty relations for light fields. Again, parametric down-conversion proved to be the most effective tool to generate such unusual light fields.
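
The phrase “phase-sensitive noise” can be made precise in terms of the quadrature operators introduced later in the book; with one common convention (adopted here as an assumption),
\[
\hat{X}_1 = \tfrac{1}{2}\left(\hat{a} + \hat{a}^{\dagger}\right), \qquad
\hat{X}_2 = \tfrac{1}{2i}\left(\hat{a} - \hat{a}^{\dagger}\right), \qquad
\Delta \hat{X}_1\, \Delta \hat{X}_2 \ge \tfrac{1}{4},
\]
and a squeezed state is one for which the variance of one quadrature falls below the symmetric value, \((\Delta \hat{X}_i)^2 < \tfrac{1}{4}\), at the expense of increased noise in the other.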

   Quantum opticians realized quite early that were atoms to be confined in resonators, then atomic radiative transition dynamics could be dramatically changed. Purcell, in a remarkable paper in 1946 within the context of magnetic resonance, had already predicted that spontaneous emission rates, previously thought of as pretty much immutable, were in fact modified by enclosing the source atom within a cavity whose mode structure and density are significantly different from those of free space. Putting atoms within resonators or close to mirrors became possible at the end of the 1960s. By the 1980s the theorists’ dream of studying single atoms interacting with single modes of the electromagnetic field could be realized. At this point the transition dynamics becomes wholly reversible, as the atom coherently exchanges excitation with the field, until coherence is eventually lost through a dissipative “decoherence” process. This idealized system of a single atom coupled to a single field mode is described by the Jaynes–Cummings model, named after its proposers, which forms a basic building block of quantum optics (and is discussed in detail in this book).
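
In the rotating-wave approximation this reversible exchange is governed by the Jaynes–Cummings Hamiltonian, written here in a standard notation (the symbol \(\lambda\) for the atom–field coupling constant is our choice),
\[
\hat{H} = \hbar\omega\, \hat{a}^{\dagger}\hat{a} + \tfrac{1}{2}\hbar\omega_0\, \hat{\sigma}_z
+ \hbar\lambda\left(\hat{\sigma}_{+}\hat{a} + \hat{\sigma}_{-}\hat{a}^{\dagger}\right),
\]
where \(\omega\) is the cavity mode frequency, \(\omega_0\) the atomic transition frequency, and \(\hat{\sigma}_{\pm}\) the atomic raising and lowering operators. On resonance, an excited atom in an n-photon field exchanges excitation with the mode at the quantum Rabi frequency \(2\lambda\sqrt{n+1}\).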

   New fundamental concepts in information processing, leading to quantum cryptography and quantum computation, have been developed in recent years by Feynman, Benioff, Deutsch, Jozsa, Bennett, Ekert and others. Whereas a classical bit represents either the value 0 or the value 1, the basic unit of a quantum computer is a quantum mechanical two-level system (a qubit) that can exist in coherent superpositions of the logical values 0 and 1. A set of n qubits can then be in a superposition of up to 2^n different states, each representing a binary number. Were we able to control and manipulate, say, 1500 qubits, we could access more states than there are particles in the visible universe. Computations are implemented by unitary transformations, which act on all states of a superposition simultaneously. Quantum gates form the basic units from which these unitary transformations are built up. In related developments, absolutely secure encryption can be guaranteed by using quantum sources of light.
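
In Dirac notation, a single qubit and an n-qubit register take the form (the amplitude labels below are purely illustrative)
\[
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \quad |\alpha|^2 + |\beta|^2 = 1,
\qquad
|\Psi\rangle = \sum_{x \in \{0,1\}^n} c_x\, |x\rangle,
\]
so the register is described by up to 2^n complex amplitudes c_x. For n = 1500, 2^1500 is of order 10^451, vastly exceeding the roughly 10^80 particles estimated to make up the visible universe.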

   The use of quantum mechanical superpositions and entanglement results in a high degree of parallelism, which can increase the speed of computation exponentially. A number of problems which cannot feasibly be tackled on a classical computer can be solved efficiently on a quantum computer. In 1994 a quantum algorithm was discovered by Peter Shor that allows the solution of a practically important problem, namely factorization, with such an exponential increase in speed. Subsequently, possible experimental realizations of a quantum computer have been proposed, for example in linear ion traps and nuclear magnetic resonance schemes. Presently we are at a stage where quantum gates have been demonstrated in these two implementations. Quantum computation is closely related to quantum cryptography and quantum communication. Basic experiments demonstrating the in-principle possibility of these ideas have been carried out in various laboratories.

   The linear ion trap is one of the most promising systems for quantum computation and is one we study in this book in detail. The quantum state preparation (laser cooling and optical pumping) in this system is a well-established technique, as is the state measurement by electron shelving and fluorescence. Singly charged ions of an atom such as calcium or beryllium are trapped and laser cooled to microkelvin temperatures, where they form a string lying along the axis of a linear radio-frequency (r.f.) Paul trap. The internal state of any one ion can be exchanged with the quantum state of motion of the whole string. This can be achieved by illuminating the ion with a pulse of laser radiation at a frequency tuned below the ion’s internal resonance by the vibrational frequency of one of the normal modes of oscillation of the string. This couples single phonons into and out of the vibrational mode. The motional state can then be coupled to the internal state of another ion by directing the laser onto the second ion and applying a similar laser pulse. In this way general transformations of the quantum state of all the ions can be generated. The ion trap has several features to recommend it. It can achieve processing on quantum bits without the need for any new technological breakthroughs, such as micro-fabrication techniques or new cooling methods. The state of any ion can be measured and re-prepared many times without difficulty, which is an important feature for implementing quantum error correction protocols.
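
A minimal sketch of this coupling, assuming the Lamb–Dicke regime and the rotating-wave approximation (the symbols below are standard but are our choice, not taken from a particular experiment): tuning the laser one vibrational quantum below the internal resonance produces the red-sideband interaction
\[
\hat{H}_{\mathrm{int}} \approx \frac{\hbar\, \eta\, \Omega}{2}
\left(\hat{\sigma}_{+}\hat{a}\, e^{i\phi} + \hat{\sigma}_{-}\hat{a}^{\dagger}\, e^{-i\phi}\right),
\]
where \(\eta\) is the Lamb–Dicke parameter, \(\Omega\) the Rabi frequency of the driving laser, \(\phi\) its phase, and \(\hat{a}\) annihilates a phonon of the addressed normal mode. Each internal excitation of the ion is thus accompanied by the removal of a single phonon from the string, and vice versa, which is precisely the exchange described above.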

   Trapped atoms or ions can be strongly coupled to an electromagnetic field mode in a cavity, which permits the powerful combination of quantum processing and long-distance quantum communication. This suggests ways in which we may construct quantum memories. These systems can in principle realize a quantum processor larger than any which could be thoroughly simulated by classical computing, but the decoherence generated by dephasing and spontaneous emission is a formidable obstacle.

   Entangled states are the key ingredient for certain forms of quantum cryptography and for quantum teleportation. Entanglement is also responsible for the power of quantum computing, which, under ideal conditions, can accomplish certain tasks exponentially faster than any classical computer. A deeper understanding of the role of quantum entanglement in quantum information theory will allow us to improve existing applications and to develop new methods of quantum information manipulation. These are all described in later chapters.

   What then is the future of quantum optics? It underpins a great deal of laser science and novel atomic physics. It may even be the vehicle by which we can realize a whole new technology whereby quantum mechanics permits the processing and transmission of information in wholly novel ways. But of course, whatever we may predict now will be confounded by the unexpected: the field remains an adventure, repeatedly throwing up surprises.


1.3 The contents of this book

The layout of this book is as follows. In Chapter 2, we show how the electromagnetic field can be quantized in terms of harmonic oscillators representing modes of the electromagnetic field, with states describing how many excitations (photons) are present in each normal mode. In Chapter 3 we introduce the coherent states, superposition states carrying phase information. In Chapter 4 we describe how light and matter interact. Chapter 5 quantifies our notions of coherence in terms of optical field correlation functions. Chapter 6 introduces simple optical elements such as beam splitters and interferometers, which manipulate the states of light. Chapter 7 describes those nonclassical states whose basic properties are dictated by their fundamental quantum nature. Spontaneous emission and decay in an open environment are discussed in Chapter 8. Chapter 9 describes how quantum optical sources of radiation can be used to provide tests of fundamental quantum mechanics, including tests of nonlocality and Bell inequalities. Chapter 10 discusses how atoms confined in cavities and trapped laser-cooled ions can be used to study basic interaction phenomena. Chapter 11 applies what we have learnt to the newly emerging problems of quantum information processing. Appendices set out some mathematical ideas needed within the main body of the text. Throughout, we have tried to illustrate the ideas being developed by means of homework problems.


References

[1] A. Whitaker, Einstein, Bohr and the Quantum Dilemma (Cambridge: Cambridge University Press, 1996).
[2] A. Muthukrishnan, M. O. Scully and M. S. Zubairy, Optics and Photonics News Trends, 3, No. 1 (October 2003).
[3] G. N. Lewis, Nature, 118 (1926), 874.
[4] W. E. Lamb, Jr., Appl. Phys. B, 60 (1995), 77.
[5] S. S. Schweber, QED and the Men Who Made It: Dyson, Feynman, Schwinger and Tomonaga (Princeton: Princeton University Press, 1994).

Suggestions for further reading

Many books on quantum optics exist, most of them more specialized monographs that take the story much further than we do here.

L. Allen and J. H. Eberly, Optical Resonance and Two Level Atoms (New York: Wiley, 1975 and Mineola: Dover, 1987).

H. Bachor, A Guide to Experiments in Quantum Optics (Berlin & Weinheim: Wiley-VCH, 1998).

S. M. Barnett and P. M. Radmore, Methods in Theoretical Quantum Optics (Oxford: Oxford University Press, 1997).

C. Cohen-Tannoudji, J. Dupont-Roc and G. Grynberg, Photons and Atoms (New York: Wiley-Interscience, 1989).

C. Cohen-Tannoudji, J. Dupont-Roc and G. Grynberg, Atom–Photon Interactions (New York: Wiley-Interscience, 1992).

V. V. Dodonov and V. I. Man’ko (editors), Theory of Nonclassical States of Light (London: Taylor and Francis, 2003).

P. Ghosh, Testing Quantum Mechanics on New Ground (Cambridge: Cambridge University Press, 1999).

H. Haken, Light, Volume I: Waves, Photons, and Atoms (Amsterdam: North Holland, 1981).

J. R. Klauder and E. C. G. Sudarshan, Fundamentals of Quantum Optics (New York: W. A. Benjamin, 1968).

U. Leonhardt, Measuring the Quantum State of Light (Cambridge: Cambridge University Press, 1997).

W. H. Louisell, Quantum Statistical Properties of Radiation (New York: Wiley, 1973).

R. Loudon, The Quantum Theory of Light, 3rd edition (Oxford: Oxford University Press, 2000).

L. Mandel and E. Wolf, Optical Coherence and Quantum Optics (Cambridge: Cambridge University Press, 1995).

P. Meystre and M. Sargent III, Elements of Quantum Optics, 2nd edition (Berlin: Springer-Verlag, 1991).

G. J. Milburn and D. F. Walls, Quantum Optics (Berlin: Springer-Verlag, 1994).

H. M. Nussenzveig, Introduction to Quantum Optics (London: Gordon and Breach, 1973).

M. Orszag, Quantum Optics: Including Noise, Trapped Ions, Quantum Trajectories, and Decoherence (Berlin: Springer, 2000).

J. Peřina, Quantum Statistics of Linear and Nonlinear Optical Phenomena, 2nd edition (Dordrecht: Kluwer, 1991).

V. Peřinová, A. Lukš, and J. Peřina, Phase in Optics (Singapore: World Scientific, 1998).

R. R. Puri, Mathematical Methods of Quantum Optics (Berlin: Springer, 2001).

M. Sargent III, M. O. Scully and W. E. Lamb, Jr., Laser Physics (Reading: Addison-Wesley, 1974).

M. O. Scully and M. S. Zubairy, Quantum Optics (Cambridge: Cambridge University Press, 1997).

W. P. Schleich, Quantum Optics in Phase Space (Berlin: Wiley-VCH, 2001).

B. W. Shore, The Theory of Coherent Atomic Excitation (New York: Wiley-Interscience, 1990).

W. Vogel and D.-G. Welsch, Lectures in Quantum Optics (Berlin: Akademie Verlag, 1994).

M. Weissbluth, Photon–Atom Interactions (New York: Academic Press, 1989).

Y. Yamamoto and A. İmamoğlu, Mesoscopic Quantum Optics (New York: Wiley-Interscience, 1999).

A useful reprint collection of papers on coherent states, including the early work by Glauber, Klauder, and others, is the following.

J. R. Klauder and B.-S. Skagerstam (editors), Coherent States (Singapore: World Scientific, 1985).

The history of quantum optics and of fundamental tests of quantum theory can be found in a number of places. We have found the following invaluable.

R. Baierlein, Newton to Einstein (Cambridge: Cambridge University Press, 1992).

M. Born and E. Wolf, Principles of Optics (Cambridge: Cambridge University Press, 1998).

A. Whitaker, Einstein, Bohr and the Quantum Dilemma (Cambridge: Cambridge University Press, 1996).




Chapter 2
Field quantization




In this chapter we present the quantization of the electromagnetic field and discuss some of its properties, with particular regard to the interpretation of the photon as an elementary excitation of a normal mode of the field. We start with the case of a single-mode field confined by conducting walls in a one-dimensional cavity and later generalize to multimode fields in free space. The photon number states are introduced and we discuss the fluctuations of the field observables with respect to these states. Finally, we discuss the problem of the quantum description of the phase of the quantized electromagnetic field.


2.1 Quantization of a single-mode field

We begin with the rather simple but very important case of a radiation field confined to a one-dimensional cavity along the z-axis with perfectly conducting walls at z = 0 and z = L as shown in Fig. 2.1.

   The electric field must vanish on the boundaries and will take the form of a standing wave. We assume there are no sources of radiation, i.e. no currents or charges nor any dielectric media in the cavity. The field is assumed to be polarized along the x-direction, \(\mathbf{E}(\mathbf{r}, t) = \mathbf{e}_x E_x(z, t)\), where \(\mathbf{e}_x\) is a unit polarization vector. Maxwell’s equations without sources are, in SI units,
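
\[
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t},
\]
\[
\nabla \cdot \mathbf{E} = 0, \qquad
\nabla \cdot \mathbf{B} = 0 .
\]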

A single-mode field satisfying Maxwell’s equations and the boundary conditions is given by
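
\[
E_x(z, t) = \left(\frac{2\omega^2}{V \varepsilon_0}\right)^{1/2} q(t)\, \sin(kz),
\]
where \(k = m\pi/L\) for integer m, \(\omega = ck\) is the mode frequency, \(V\) is the effective volume of the cavity, and \(q(t)\) is a time-dependent amplitude with the dimensions of length. (The normalization factor is one common convention, chosen so that \(q(t)\) can later be identified with the canonical position of an equivalent harmonic oscillator.)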




