[EAS] Top Physics Stories

Peter J. Kindlmann peter.kindlmann at yale.edu
Thu Jan 4 22:59:40 EST 2001


Dear Colleagues -

The very informative e-publication "Physics News Update" has its
annual top physics stories item. Progress in work at the quantum
limits has been amazing. I've appended the earlier items that are
being cited.

  --Peter Kindlmann

===================================================================
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 519  January 4, 2001   by Phillip F. Schewe and Ben Stein

PHYSICS NEWS STORIES OF THE YEAR.  Our top three
stories represent one definite sighting and two near misses: the
discovery of the tau neutrino (Update 495), and the reports of
statistically poor but fascinating evidence for the quark-gluon
plasma (Update 470) and the Higgs boson (502).  Other top physics
events for the year 2000 include (in roughly chronological order
through the months) the resolution of the astrophysical x-ray
background into discrete sources (467); the ability to guide atoms
around "atom chips" (469, 486); all-optical NMR (472); quantum
entanglement of 4 ions in a trap (475); the fabrication of a "left-
handed" composite material, one possessing both a negative
electrical permittivity and a negative magnetic permeability (476);
the best map yet of the cosmic microwave background, showing
that the curvature of the universe is zero (479, 481); the
observation of quantum heat, particles of thermal energy moving
down wires (481); the best measurement, by a factor of 10, of the
gravitational constant G (482), with a corresponding adjustment in
the mass of the Earth; the first-time measurement of gravity at the
micron distance scale as part of the search for extra dimensions
(483); the quantum superposition of superfluid currents flowing in
both directions through a SQUID (492); a record number of
daughter particles made in heavy-ion collisions at RHIC (505);
numerous advances  in quantum cryptography (480); light slowed
to 1 mph (472); advances in delivering drugs and genes with
ultrasound-activated bubbles (487); and the discovery that
entangled photons can defeat the diffraction limit (503).

===================================================================
Below are the items referred to above:

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 495  July 20, 2000   by Phillip F. Schewe and Ben Stein

DIRECT EVIDENCE FOR TAU NEUTRINOS will be reported
tomorrow in a seminar at Fermilab.  While the existence of neutrinos
associated with the tau lepton was not in doubt, actually observing
the particle interact had not occurred until now.   This rounds out the
program of experimental sightings of the truly fundamental building
blocks prescribed by the standard model of particle physics.  This
official alphabet consists of six quarks, known as up, down, strange,
charm, top, and bottom, and six leptons: electron, electron neutrino,
muon, muon neutrino, tau, and tau neutrino.  All matter, according to
the theory, should be made up from these most basic of constituents. 
Other particles, such as the anti-matter counterparts of the quarks and
leptons, the force-carrying bosons (e.g., photons, gluons, etc.), and
the Higgs boson (which confers mass upon some of the other
particles) also appear in the theory.  (Still other candidates, such as
the "supersymmetric" particles, are not part of, but are expected to be
compatible with, the standard model.)  The evidence for the tau
neutrino is slim but impressive: five scattering events are being
exhibited at the seminar by Fermilab physicist Byron Lundberg,
leader of Experiment 872, the Direct Observation of Nu Tau (or
DONUT) collaboration (http://fn872.fnal.gov/).  Their experiment
proceeds in the following manner.  Fermilab's 800-GeV proton beam
(the highest beam energy in the world) is steered onto a tungsten
target, where some of the prodigious incoming energy is turned into
new particles.  Some of these quickly decay into taus and tau
neutrinos.  Next comes an obstacle course of magnets (meant to
deflect charged particles away) and shielding material (meant to
absorb most of the other particles except for rarely interacting
neutrinos).  Beyond this lies a sequence of emulsion targets in which
the neutrinos can interact, leaving a characteristic signature. 
Evidence for a tau neutrino in the emulsion is the creation of a tau
lepton, which itself quickly decays (after traveling about 1 mm) into
other particles.  The E872 physicists estimate that about 10^14 tau
neutrinos entered the emulsion, of which perhaps 100 interacted
therein.  It is a carefully analyzed handful of such events that is now
being presented to the public in evidence.  The tau neutrino is the
third neutrino type to be detected.  The detection of the electron
neutrino by Clyde Cowan and Frederick Reines garnered Reines the
1995 Nobel Prize for physics (Cowan had died some years before). 
For discovering the muon neutrino, Leon Lederman, Melvin
Schwartz, and Jack Steinberger won the Nobel Prize in 1988.   

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 470  February 10, 2000   by Phillip F. Schewe and Ben Stein

A NEW FORM OF NUCLEAR MATTER has been detected at the CERN lab in
Geneva. Results from seven different experiments, conducted at CERN
over a span of several years, were announced at a series of
seminars today.  In the experiments a high energy beam of lead ions
(160 GeV/nucleon, times 208 nucleons, for a total energy of about
33 TeV) smashes into fixed  targets of lead or gold atoms.  The
center-of-mass energy of these collisions, the true energy
available for producing new matter, is about 3.5 TeV.  From the
debris that flies out of the smashups, the CERN scientists estimate
that the "temperature" of the ensuing nuclear fireball might have
been as high as 240 MeV (under these extreme conditions energy
units are substituted for degrees kelvin), well above the
temperature where new nuclear effects are expected to occur.  In
the CERN collisions the effective, momentary, nuclear matter
density was calculated to be 20 times normal nuclear density.  It
is not quite certain whether the novel nuclear state is some kind
of denser arrangement of  known nuclear matter or a manifestation
of the much-sought quark-gluon plasma (QGP), in which quarks, and
the gluons which normally bind the quarks into clumps of two quarks
(mesons, properly a quark and an antiquark) or three quarks
(baryons), spill together in a seething
soup analogous to the condition of ionized atoms in a plasma. Such
a  nuclear plasma might have existed in the very early universe
only microseconds after the big bang.    Evidence for the
transition from a hadron phase (baryons and mesons) into a QGP
phase was expected to consist of (1) an enhanced production of
strange mesons, (2) a decrease in the production of heavy psi
mesons (each consisting of a charm and an anticharm quark), and (3)
an increase in the creation of energetic photons and
lepton-antilepton pairs.  Just this sort of (indirect) evidence (at
least of types 1 and 2) has now turned up in the CERN data. (CERN
press release, www.cern.ch)  To demonstrate the existence of QGP
more directly, one would like the plasma state to last longer, and
one should observe the sorts of particle jets and gamma rays that
come with still higher-energy fireballs.  That energy (about 40
TeV, center-of-mass) will be available in the next few months at
the Relativistic Heavy Ion Collider undergoing final preparations
at Brookhaven.  
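
As a quick back-of-envelope check of the energies quoted above (not
from the bulletin; an approximate nucleon mass is assumed), the
center-of-mass energy of a fixed-target collision follows from the
relativistic invariant s = 2*E_lab*m_target per nucleon pair:

  # Rough sketch of the SPS lead-beam kinematics; illustrative only.
  E_lab = 160.0          # GeV per nucleon, lead beam energy
  A = 208                # nucleons per lead ion
  m_nucleon = 0.938      # GeV/c^2, approximate nucleon mass

  total_beam_energy = E_lab * A                       # ~33 TeV
  sqrt_s_per_pair = (2 * E_lab * m_nucleon) ** 0.5    # ~17 GeV
  total_cm_energy = sqrt_s_per_pair * A               # ~3.6 TeV
  print(total_beam_energy / 1000, total_cm_energy / 1000)

The result reproduces the ~33 TeV beam energy and ~3.5 TeV
center-of-mass energy quoted above.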

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 502  September 14, 2000   by Phillip F. Schewe and Ben Stein

AN INTRIGUING HINT OF THE HIGGS BOSON in collider data
at the LEP accelerator at CERN has prompted officials there to
extend the running period of the Large Electron Positron (LEP)
collider by at least a month, instead of turning it off now to make way
for the building of the Large Hadron Collider (or LHC, a proton-
colliding machine to be housed in the same deep tunnel as LEP). 
CERN decided today that the high energy electron-positron collisions
at LEP will continue, the better to supplement the meager, but
potentially crucial, evidence for the Higgs boson, the particle widely
thought to be responsible for endowing other known particles with
mass.  What happens at LEP, in effect, is that a lot of energy
squeezed into a very tiny volume almost instantly rematerializes in
the form of new particles.  Theorists have said that in some collisions
a Higgs boson (h) might be produced back to back with a Z boson,
one of the carriers of the weak force and itself the object of a
dramatic particle hunt at CERN 20 years ago.  In these rare events,
both h and Z are expected to decay quickly into two sprays, or jets, of
particles.  One tactic then is to search 4-jet events for signs that the
combined mass of two of the jets clusters at a particular value,
standing out above pedestrian "background" events in which no true
exotic particle had been produced.  What has caught LEP physicists'
attention is just such an enhancement, at a mass around 114 GeV/c^2. 
The enhancement is not statistically significant enough for CERN to
claim a discovery yet, even when all four detector groups combine
their data, but sufficient to cause excitement since the Higgs is
perhaps the most sought after particle in all of high energy physics. 
The LEP extension is not expected to cause much of a delay in LHC
construction.  (Some websites: press.web.cern.ch;
opal.web.cern.ch/Opal/; alephwww.cern.ch/WWW/)
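
The "combined mass of two jets" mentioned above is the relativistic
invariant mass.  A minimal Python sketch of that calculation, with
made-up four-momenta standing in for measured jets:

  import math

  def invariant_mass(jets):
      """Invariant mass of four-momenta (E, px, py, pz), in GeV."""
      E, px, py, pz = (sum(j[i] for j in jets) for i in range(4))
      return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

  # Hypothetical back-to-back jets (GeV); values are illustrative only.
  jet1 = (60.0,  55.0,  10.0,  15.0)
  jet2 = (58.0, -54.0, -12.0, -10.0)
  print(invariant_mass([jet1, jet2]))   # ~117.9 GeV/c^2

A real Higgs search histograms this quantity over many events and
looks for a bump standing above the smooth background.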

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 467  January 18, 2000   by Phillip F. Schewe and Ben Stein

THE X-RAY BACKGROUND, the glow of x rays seen in all directions in
space, has now largely been resolved into emissions from discrete
sources by the Chandra X-Ray Telescope, ending the notion that the
x rays come from distant hot gas.  Previously only about 20-30% of
the x-ray background had been ascribed to point sources (by such
telescopes as ASCA).  Chandra was launched in July 1999 and put in
an elliptical orbit.  With its high angular resolution and acute
sensitivity it could tell apart x-ray objects (many of them
thought to be accretion disks around black holes) that before had
been blurred into a continuous x-ray curtain.  (Of course, now that
the background has  been resolved into points it ceases to be a
background.)  Richard Mushotzky of Goddard Space Flight Center
reported these Chandra results at last week's meeting in Atlanta of
the American Astronomical Society (AAS).  Resolving the x-ray
background was not all.  Mushotzky added that the Chandra survey
had revealed the existence of two categories of energetic galaxies
that had been imaged only poorly or not at all by optical
telescopes.  He referred to one category as "veiled galactic
nuclei," objects (with a redshift of about 1) bright in x rays but
obscured by dust at optical wavelengths.  The other category was
"ultra-faint galaxies."  One interpretation of these galaxies is
that optical emission is suppressed owing to absorption over what
could be a very long pathway to Earth.  Mushotzky speculated that
such high redshift (z greater than 5) galaxies could be the most
distant, and hence earliest, objects ever identified.   The XMM
x-ray telescope, just launched, should provide complementary
information in the form of high-precision spectra (from which
redshifts are derived) of the distant objects.

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 469  February 2, 2000   by Phillip F. Schewe and Ben Stein

GUIDING NEUTRAL ATOMS AROUND CURVES can be performed with tiny
current-carrying wires which deflect the atoms through a 
lithographically patterned "atom waveguide."  Physicists at the
University of Colorado and from NIST-Boulder send laser-cooled (42
micro-kelvin) atoms into a 10-cm guide where they undergo three
curves (with a 15-cm radius of curvature).  Three million atoms per
second can be sent through the course; at the far end the atoms are
ionized and then counted.   A possible use for the new waveguide,
part of a growing toolbox of atom optics components, will be in
atom interferometry and other forms of high-precision metrology. 
The researchers hope to send atoms (or should we say atom?) from a
Bose-Einstein condensate into the waveguide.  (Muller et al.,
Physical Review Letters, 20 December 1999; Select Article.)
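
The guiding force comes from the magnetic field of the wires,
B = mu0*I/(2*pi*r), which gives a weak-field-seeking atom a potential
of roughly one Bohr magneton times B.  A sketch with illustrative
numbers (not the actual experimental parameters):

  import math

  MU0 = 4e-7 * math.pi     # T*m/A, vacuum permeability
  MU_B = 9.274e-24         # J/T, Bohr magneton
  K_B = 1.381e-23          # J/K, Boltzmann constant

  def wire_field(current, r):
      """Field magnitude a distance r from a long straight wire."""
      return MU0 * current / (2 * math.pi * r)

  B = wire_field(1.0, 100e-6)        # 1 A wire, atom 100 microns away
  depth_mK = MU_B * B / K_B * 1e3    # trap depth as a temperature
  print(B * 1e4, depth_mK)           # ~20 G, ~1.3 mK

A millikelvin-deep guide comfortably holds the 42-microkelvin atoms
described above.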

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 486   May 26, 2000   by Phillip F. Schewe and Ben Stein

ATOM CHIPS.  Last year scientists at the University of Innsbruck
in Austria succeeded in guiding neutral atoms along the outside of
current-carrying wires (Update 416); the atoms were trapped and
manipulated by magnetic fields generated by the current in the
wire.  Now the same scientists have, through a deft series of  steps
involving extra current-carrying coils and laser beams, been able to
herd cold lithium atoms to within a few microns of a patterned
microchip, where the atoms come under the control and guidance
of currents in the chip.  The goal of the Innsbruck physicists (Joerg
Schmiedmayer, joerg.schmiedmayer at uibk.ac.at,
011-43-512-507-6306) is to develop an integrated circuit for atoms
and eventually (when the source of the atoms is not a mere atom
beam but a true Bose-Einstein condensate) for atom waves.  Such a
device might be of service for doing quantum optics or
computation involving quantum entanglement.  (Folman et al.,
Physical Review Letters, 15 May 2000; Select Article.)

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 472  February 24, 2000   by Phillip F. Schewe and Ben Stein

ALL-OPTICAL NMR has been achieved by David Awschalom's research
group at the University of California, Santa Barbara.  Previous
versions of nuclear magnetic resonance (NMR) have relied on
radio-frequency electromagnetic fields to tip nuclear magnetic
moments.  This approach (nearing its 50th anniversary with proven
success in medical imaging and chemistry) is modified by the UCSB
scientists in the following way.  They use a laser to excite a bath
of electron spins, which then do all the work.   As these electron
magnetic moments swarm about the nuclei, their number and direction
are controlled by the laser in a way that tips nuclear spins. 
Nuclei are monitored during the process by a second laser beam,
making this an entirely optical approach.  Demonstrated in the
semiconductor GaAs, this fundamentally different alternative to
ordinary NMR offers potentially increased resolution because light
can be focused more tightly than RF fields.  Moreover, because the
UCSB strategy exploits electrons as an intermediary, individual
electron orbits themselves might be used to obtain atomic-scale
focusing.   (For background, see Kikkawa and Awschalom, Science,
Jan. 21, 2000).   One aim of this research is to "imprint" electron
spin on the nuclear system within integrated "spintronic" devices,
where electron spin supplants charge as a source of information.

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 475  March 17, 2000   by Phillip F. Schewe and Ben Stein

THE FIRST ENTANGLEMENT OF FOUR PARTICLES has been experimentally
achieved by researchers at NIST (Christopher Monroe, 303-497-7415),
demonstrating a technique that significantly advances the difficult
prospect of building a useful quantum computer.  To perform
powerful calculations, such as factoring huge numbers or quickly
finding items in large databases, a quantum computer typically must
contain many particles "entangled" with each other.  Entanglement
describes a very special interlinking that can occur between
particles (such as photons or ions) even if they are physically
separated or otherwise isolated from one another.  While entangled,
each particle is in a fuzzy, noncommittal state (for example, being
in a combination or "superposition" of a low and high energy state)
but has a precisely defined relationship with its partners. 
Specifically, when one particle eventually "collapses" into a
definite state, it essentially causes its entangled partner to
collapse into a complementary state, even if it is halfway across
the galaxy.  Entanglement is difficult enough to achieve in two
particles, or even three (Update 414), but last year, theorists in
Denmark proposed a practical method for entangling any number of
particles. (Molmer and Sorensen, Phys. Rev. Lett., 1 Mar 1999; see
article at Physics News Select Articles.)  Their proposal, based in
turn on an earlier idea (Cirac and Zoller, Phys. Rev. Lett., 15 May
1995), involves trapping a string of ions in electromagnetic
fields.  To create multiple entanglement, laser pulses can
interlink each ion's internal state (known as its spin) to the
overall motion of the ions rocking back and forth.  The
Molmer-Sorensen technique enables researchers to accomplish this in
a single pulse.  NIST researchers demonstrated this technique with
four ions (electrical noise made it difficult to do more), but they
showed that entanglement of many more particles is now possible.
(Sackett et al., Nature, 16 March 2000.)
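
The ideal four-ion state produced this way is a GHZ-type
superposition, (|0000> + |1111>)/sqrt(2).  A small numpy sketch (an
idealization, not the NIST analysis) showing that measuring all four
spins always yields perfectly correlated outcomes:

  import numpy as np

  rng = np.random.default_rng(0)

  state = np.zeros(16)                 # four two-level ions
  state[0b0000] = 1 / np.sqrt(2)       # all spins down
  state[0b1111] = 1 / np.sqrt(2)       # all spins up

  probs = state**2                     # measurement probabilities
  for _ in range(5):
      outcome = rng.choice(16, p=probs)
      print(format(outcome, "04b"))    # always 0000 or 1111, never mixed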

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 476  March 24, 2000   by Phillip F. Schewe and Ben Stein

TOPSY TURVY: THE FIRST TRUE "LEFT HANDED" MATERIAL
has been devised by scientists at the University of California at
San Diego.  In this medium, light waves are expected to exhibit a
reverse Doppler effect.  That is, the light from a source coming
toward you would be reddened and the light from a receding source
would be blue shifted.  The UCSD composite material, consisting of
an assembly of copper rings and wires (see figure at
www.aip.org/physnews/graphics), should eventually have important
optics and telecommunications applications.
   To understand how a reverse Doppler shift and other bizarre
optical effects come about, consider that a light wave is a set of
mutually reinforcing oscillating electric and magnetic fields.  The
relationship between the fields  and the light motion is described
picturesquely by what physicists call the "right hand rule": if the
outstretched fingers of the right hand point along the wave's
electric field and curl toward the magnetic field, then the thumb
indicates the direction of the flow of light energy.  Customarily one can depict
the light beam moving through a medium as an advancing plane of
radiation, and this plane, in turn, is equivalent to the sum of
many constituent wavelets, also moving in the same direction as the
energy flow.  But in the UCSD composite medium this is not the
case.  The velocity of the wavelets runs opposite to the energy
flow (an animated video illustrates this concept nicely: see www-
physics.ucsd.edu/~rshelby/lhmedia), and this makes the UCSD
composite a "left handed substance," the first of its kind.
    Such a material was first envisioned in the 1960s by the
Russian physicist Victor Veselago of the Lebedev Physics Institute
(Soviet Physics Uspekhi, Jan-Feb 1968), who argued that a material
with both a negative electric permittivity and a negative magnetic
permeability would, when light passed through it, result in novel
optical phenomena, including a reverse Doppler shift, an inverse
Snell effect (refraction opposite in sense to the familiar bending
that makes a pencil dipped into water seem kinked), and reverse
Cerenkov radiation.  Permittivity
(denoted by the Greek letter epsilon) is a measure of a material's
response to an applied electric field, while permeability (denoted
by the letter mu) is a measure of the material's response to an
applied magnetic field.  In Veselago's day no negative-mu materials
were known, nor thought likely to exist.  More recently, however,
John Pendry of Imperial College has shown how negative-epsilon
materials could be built from rows of wires (Pendry et al.,
Physical Review Letters, 17 June, 1996) and negative-mu materials
from arrays of tiny resonant rings (Pendry et al., IEEE, Trans. MTT
47, 2075, 1999). 
    Now, this week, at the American Physical Society meeting in
Minneapolis, Sheldon Schultz and David Smith of UCSD reported that
they had followed Pendry's prescriptions and succeeded in 
constructing a material with both a negative mu and a negative
epsilon, at least at microwave frequencies.  The raw materials
used, copper wires and copper rings, do not have unusual properties
of their own and indeed are non-magnetic.  But when incoming
microwaves fall upon alternating rows of the rings and wires
mounted on a playing-card-sized platform and set in a cavity, then
a resonant reaction between the light and the whole of the
ring-and-wire array sets up tiny induced currents, which contribute
fields of their own.  The net result is a set of fields moving to
the left even as electromagnetic energy is moving to the right. 
This effective medium is an example of a "meta-material."  Another
example is a photonic crystal (consisting of stacks of tiny rods or
solid material bored out with a honeycomb pattern of voids) which
excludes light at certain frequencies.
    At a press conference in Minneapolis, Schultz
(sschultz at ucsd.edu) and Smith (drs at sdss.ucsd.edu) said that having
demonstrated that their medium possessed a negative mu and epsilon,
they were now proceeding to explore the novel optical effects
predicted by Veselago.  Furthermore, they hope to adapt their
design to accommodate shorter wavelengths.  As for applications in
microwave communications, a medium which focuses waves when other
materials would disperse them (and vice versa) ought to be useful
in improving existing delay lines, antennas, and filters.
     Outside commentators at the press conference showed interest
and curiosity.  Marvin Cohen of UC Berkeley said that until he read
the UCSD paper (Smith et al., just accepted for publication in
Physical  Review Letters; science writers should go to
www.aip.org/physnews/select) he had not thought a negative-mu
material was possible.  Walter Kohn of UC Santa Barbara (winner of
the 1998 Nobel Prize in chemistry) considered the UCSD work "...an
extremely interesting result.  I would be surprised if there
weren't interesting applications."
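
In Veselago's analysis a medium with both epsilon and mu negative
must be assigned the negative branch of the refractive index,
n = -sqrt(eps_r * mu_r), so Snell's law sends the refracted ray to
the "wrong" side of the normal.  A minimal sketch (illustrative
values, not measured ones):

  import math

  def refraction_angle(theta_in_deg, n_in, n_out):
      """Snell's law: n_in*sin(theta_in) = n_out*sin(theta_out)."""
      s = n_in * math.sin(math.radians(theta_in_deg)) / n_out
      return math.degrees(math.asin(s))

  print(refraction_angle(30.0, 1.0, 1.5))    # ~19.5 deg, ordinary glass
  print(refraction_angle(30.0, 1.0, -1.0))   # -30.0 deg, Veselago medium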

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 479    April 13, 2000   by Phillip F. Schewe and Ben Stein

DARK ENERGY AND THE MICROWAVE BACKGROUND.  The
theory of general relativity introduced the notion that spacetime
could be warped or curved by the presence of matter.  Locally,
stars or any object with mass will curve space, but the expansion
of the universe itself may introduce a curvature of its own.  This
is how cosmologists summarize things:  a static universe with no
matter (if such a thing were possible) would have no curvature. 
If, however, the empty universe were expanding it would have
negative overall curvature.  Increase the mass density from zero
and the curvature would be less negative.  Add still more mass and
you might reach a net zero curvature.  The ratio of matter to the
critical matter needed for zero curvature is called omega; the
popular version of the big bang model, featuring a very rapid
expansion in an early "inflation" phase, predicts that omega should
equal 1 exactly.  A new paper in Physical Review Letters by Scott
Dodelson of Fermilab and Lloyd Knox of the University of Chicago
(773-834-3287) provides the theoretical underpinning for the
higher-precision mappings of the cosmic microwave background (CMB)
reported over the past nine months.  The paper was prepared just as
the first of the observational results appeared last summer: a
Princeton- Pennsylvania collaboration taking data from Cerro Toco
in Chile.  Their findings (preprint astro-ph/9906421) can be
plotted as the size of the observed fluctuations in the CMB as a
function of the angular size of the fluctuation region (actually
astrophysicists usually transform the data so that it can be
plotted against the multipole moment, or "l").  These data
and those of the "Boomerang" (preprint 9911444  and 9911445; also
see Update 460) and "Viper" (preprint 9910503) groups sit right on
top of a theoretical curve drawn by Dodelson and Knox corresponding
to the case where omega equals 1 and the net curvature of the
universe is zero.  With the contribution of matter (luminous and
dark) to the density of the universe expected to be about one-third
the critical value (of omega=1), this presents a stronger-
than-ever argument in favor of the existence of yet another form of
energy, often called "dark energy," to provide the missing
two-thirds of the energy needed to make omega=1.  This dark energy
would also provide the "negative pressure" or repulsiveness needed
to make the expansion of the universe faster now than in the past, a
development suggested independently by studies of distant
supernovas.  (Dodelson and Knox, Physical Review Letters, 17 April;
Select Article.)
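
The critical density that defines omega = 1 is
rho_c = 3*H0^2/(8*pi*G).  A quick sketch, assuming a Hubble constant
of 65 km/s/Mpc (a representative contemporary value, not a number
from the bulletin):

  import math

  G = 6.674e-11            # m^3 kg^-1 s^-2
  MPC = 3.086e22           # meters per megaparsec
  H0 = 65e3 / MPC          # s^-1, assuming 65 km/s/Mpc

  rho_crit = 3 * H0**2 / (8 * math.pi * G)
  print(rho_crit)          # ~8e-27 kg/m^3, a few protons per cubic meter

Matter supplies about a third of this, leaving roughly two-thirds to
the dark energy discussed above.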

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 481   April 27, 2000   by Phillip F. Schewe and Ben Stein

BEST MAP YET OF THE COSMIC MICROWAVE BACKGROUND (CMB). 
The CMB is a redshifted picture of the  universe at the moment
photons and newly formed hydrogen atoms parted company roughly
300,000 years after the big bang.  First detected in the 1960s, the
CMB appeared to be utterly uniform until, eight years ago, the COBE
satellite provided the first hint of slight temperature variations,
on a coarse scale, with an angular resolution of about 7 degrees. 
Since then several detectors have obtained resolutions of better
than 1 degree.  Actually, the contribution to small-scale
fluctuations in the CMB is customarily rendered in terms of
multipoles (a sort of coefficient), denoted by the letter l.    The
contribution to the temperature fluctuations in the CMB for a
multipole value of  l comes from patches on the sky with an angular
size of pi/l.   COBE's CMB measurements extended to a multipole of
only about 20, but a major new map, made using a detector mounted
on a balloon blown all the way around the Antarctic continent,
covers the multipole range from 50 to 600, thus probing CMB
fluctuations with much finer angular detail, over about 3% of the
sky.  The 36-member, international "Boomerang" (Balloon
Observations of Millimetric Extragalactic Radiation and
Geomagnetics) collaboration, led by Andrew Lange of Caltech and
Paolo de Bernardis of the University of  Rome, confirms that a plot
of CMB strength peaks at a multipole value of about 197
(corresponding to CMB patches about one degree in angular spread),
very close to what theorists had predicted for a cosmology in which
the universe's overall curvature is zero and the existence of cold
dark matter is invoked.  The absence of any noticeable subsidiary
peaks (higher harmonics) in the data, however, was not in accord
with theory.  The shape of the observed pattern of temperature
variations suggests that a disturbance very like a sound wave
moving through air passed through the high-density primordial
fluid and that the CMB map can be thought of as a sort of
sonogram of the infant universe.  (de Bernardis et al., Nature, 27
April 2000.)
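
Using the pi/l relation quoted above, the observed peak at l of
about 197 indeed corresponds to patches of roughly one degree.  A
quick check (not from the paper):

  import math

  for l in (20, 197, 600):    # COBE limit, Boomerang peak, upper range
      print(l, math.degrees(math.pi / l))   # ~9, ~0.9, ~0.3 degrees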

QUANTUM HEAT. The movement of electrons down a wire
becomes a quantum affair when the electron wavelength (the size
of the quantum wave counterpart of the particulate  electron) is
comparable in size to the width of the wire.  Theorists have thought
the same would be true of "particles" of heat (phonons) moving
down a wire.  In the case of electrons, quantum reality manifests
itself in the form of quantization: the electrons can only have
conductance values in multiples of a basic unit equal to 2 times the
electric charge squared, divided by Planck's constant.  In the case
of heat, the unit of  thermal conduction would equal the
temperature times pi squared times the square of Boltzmann's
constant, divided by three times Planck's constant.  Such quantized
thermal conduction has now been seen for the first time by
physicists at Caltech (Michael Roukes, roukes at caltech.edu),
where heat added to a tiny (4x4 micron) silicon nitride "phonon
cavity" can depart only across narrow bridges, essentially wires
only 500 atoms wide (Schwab et al., Nature, 27 April 2000).  Heat
is added, and the temperature of the cavity monitored, by tiny gold
circuits leading to SQUIDs (superconducting quantum interference
devices).  With further refinements, the researchers hope to explore
the particle nature of heat, in effect a sort of "quantum phonon
optics."  In the same issue, commentators Leo Kouwenhoven and
Liesbeth Venema refer to the Caltech observations as "the first
demonstration of quantum physics in nanomechanical structures."
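
Written compactly, the two quanta described above are G0 = 2e^2/h
for charge and g0 = pi^2 * kB^2 * T / (3h) for heat.  A numerical
sketch using standard constants (the numbers are not from the
bulletin):

  import math

  E = 1.602176634e-19      # C, elementary charge
  H = 6.62607015e-34       # J*s, Planck's constant
  K_B = 1.380649e-23       # J/K, Boltzmann's constant

  G0 = 2 * E**2 / H        # electrical conductance quantum

  def g0(T):
      """Thermal conductance quantum at temperature T (kelvin)."""
      return math.pi**2 * K_B**2 * T / (3 * H)

  print(G0, 1 / G0)        # ~7.75e-5 S, i.e. ~12.9 kilohm
  print(g0(1.0))           # ~9.5e-13 W/K at 1 K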

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 482   May 3, 2000   by Phillip F. Schewe and Ben Stein

BEST MEASUREMENT OF THE GRAVITATIONAL
CONSTANT.  At this week's American Physical Society Meeting in
Long Beach, Jens H. Gundlach of the University of Washington
(paper P11.3) reported a long-awaited higher precision measurement
of the gravitational constant, usually denoted by the letter G. 
Although G has been of fundamental importance to physics and
astronomy ever since it was introduced by Isaac Newton in the
seventeenth century (the gravitational force between two objects
equals G times the masses of the two objects and divided by their
distance apart squared), it has been relatively hard to measure,
owing to the weakness of gravity.  Now a group at the University of
Washington has reduced the uncertainty in the value of G by almost
a factor of ten.  Their preliminary value is G=6.67390 x 10^-11 
m^3/kg/s^2 with an uncertainty of 0.0014%. Combining this new
value of G with measurements made with the Lageos satellite
(which uses laser ranging to keep track of its orbital position to
within a millimeter) permits the calculation of a brand new, highest
precision mass for the earth: 5.97223 (+/- .00008) x 10^24 kg. 
Similarly the new mass of the sun becomes 1.98843 (+/- .00003) x
10^30 kg. Gundlach's (206-543-4080, jens at phys.washington.edu)
setup is not unlike Cavendish's venerable torsion balance of two
hundred years ago: a hanging pendulum is obliged to twist under the
influence of some nearby test weights.  But in the Washington
experiment measurement uncertainties are greatly reduced by using
a feedback mechanism to move the test weights, keeping pendulum
twisting to a minimum.  (See Gundlach's written summary at
http://www.aps.org/meet/APR00/baps/vpr/layp11-03.html; figures at
www.aip.org/physnews/graphics.)
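
The Earth's mass follows by dividing the very precisely known
product G*M (obtained from satellite ranging such as Lageos) by the
new G.  A sketch, assuming the standard geocentric value
GM = 3.986004418e14 m^3/s^2 (not quoted in the bulletin):

  GM_EARTH = 3.986004418e14    # m^3/s^2, assumed standard value
  G = 6.67390e-11              # m^3 kg^-1 s^-2, Gundlach's preliminary G

  print(GM_EARTH / G)          # ~5.9725e24 kg

The tiny difference from the 5.97223e24 kg quoted above reflects the
exact GM value used.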

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 483   May 5, 2000   by Phillip F. Schewe and Ben Stein

GRAVITY HAS BEEN MEASURED AT THE SUB-
MILLIMETER SCALE for the first time.  Gravity has of course
long been studied over planetary distances but is more difficult to
gauge at short laboratory distances, where intrusive electric and magnetic
fields, many orders of magnitude stronger than gravity fields, can be
overwhelming.  Nevertheless, Eric Adelberger and his colleagues at
the University of Washington have managed to measure the force of
gravity over distances as small as 150 microns using a disk-shaped
pendulum carefully suspended above another disk, with a copper
membrane stretched between them to help isolate electrical forces.
(This experiment should not be confused with another University of
Washington effort in which the gravitational constant is measured
with higher precision; see Update 482).   Adelberger (206-543-
4294, eric at gluon.npl.washington.edu) presented one of several
talks at this week's APS meeting in Long Beach, California devoted
to short-range gravity, a subject which has suddenly attracted much
theoretical and experimental interest owing to a relatively new
model which supposes the existence of extra spatial dimensions in
which gravity, but not other forces, might be operating.  According
to Nima Arkani-Hamed of LBL (arkani at thsrv.lbl.gov, 510-486-
4665) this is why gravity is so weak: it dilutes itself in the extra
dimensions. In other words, ordinary particles are tethered to our
conventional spacetime, or "brane," while gravitons are free to roam
into otherwise unseeable dimensions.  One implication of the model,
testable with tabletop experiments such as Adelberger's, is that the
gravitational force might depart from Newton's inverse square law
(gravity inversely proportional to the square of the distance between
two objects) at close range. Adelberger did not observe such a
departure at distances down to tenths of a millimeter and will
continue to explore even shorter distances.  For a list of tabletop
experiments underway, see
http://gravity.phys.psu.edu/mog/mog15/node12.html.
     Another interesting implication of  the model introduced by
Arkani-Hamed (and others; see preprint hep-th 9803315) two years
ago is that the unification of the four known forces would not
necessarily occur at energies as high as 10^19 GeV but possibly at
energies as low as 10^4 GeV, an energy scale within reach of the
Large Hadron Collider under construction at CERN.  Extra
dimensions could, for example, manifest themselves in proton-
proton smashups as an apparent disappearance of energy, implying
that some of the collision energy had been converted into gravitons
(the particle form of gravity) which then disappear into the extra
dimensions.  The gravitons produced in this way might come back
into our conventional world of 3 spatial dimensions and decay into
two photons.  Physicists have already looked for this kind of event. 
Gregory Landsberg of Brown University (401-863-1464;
landsberg at hep.brown.edu) reported that at the D0 experiment at
Fermilab some energetic two-photon events have been observed
(including one in which the energy of the photons added up to 574
GeV, representing the highest composite mass ever seen in the D0
experiment) but not often enough to constitute evidence for extra
dimensions.  In fact this shortage of events has been translated into a
lower limit of 1300 GeV for the energy at which a prospective
unification of the forces could take place.
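
Tabletop searches of this kind are commonly parametrized by a
Yukawa-type correction to the Newtonian potential,
V(r) = -(G*m1*m2/r) * (1 + alpha*exp(-r/lambda)).  A sketch of how
fast such a term dies off with separation (alpha and lambda here are
illustrative choices, not experimental limits):

  import math

  def yukawa_fraction(r, alpha, lam):
      """Fractional deviation from pure Newtonian gravity."""
      return alpha * math.exp(-r / lam)

  alpha, lam = 1.0, 100e-6     # gravity-strength force, 100-micron range
  for r_um in (50, 150, 500, 1000):
      print(r_um, yukawa_fraction(r_um * 1e-6, alpha, lam))

The deviation is of order unity at 50 microns but below 1e-4 by a
millimeter, which is why these experiments push to ever shorter
distances.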

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 492  July 6, 2000   by Phillip F. Schewe and Ben Stein

A SUPERCONDUCTING "SCHRODINGER'S CAT" has been
demonstrated in the lab by a group at Stony Brook.  Quantum
phenomena can be big things; examples include supercurrents,
consisting of billions of electron pairs, moving around a
macroscopically sized superconductor, or ensembles of billions of
photons making up a pulse of laser light, all residing in a single
quantum state.  By contrast, quantum superposition, in which the
system exists in two states (such as having two different values of
angular momentum or being in two different places) at the same
time, has mostly been a small thing, or a thing of few parts. 
Examples: a single ion simultaneously in two places (several nm
apart) within an atom trap (David Wineland, NIST); or wavelike
manifestations of C-60 molecules split and sent along separate paths
of an atom interferometer (Anton Zeilinger, Univ Vienna).  In the
Stony Brook experiment (Jonathan Friedman, 631-632-8079,
jonathan.friedman at sunysb.edu) the superposition of quantum states
is both big in size and in the number of parts.  The quantum system
in question is a supercurrent (containing billions of electron pairs)
flowing around a 140-micron-sized superconducting quantum
interference device (SQUID) circuit.  As for the superposition of
states in this case, it consists of the fact (improbably enough) that
the supercurrent can flow in both directions at the same time (note:
the current is in a superposition of clockwise and anticlockwise flow;
it is never zero).  Normally the two supercurrent quantum states
(clockwise and counterclockwise flow) sit in two separate potential
wells (in the abstract space of quantum states).  But the Stony Brook
researchers (James Lukens heads the team) apply a gentle blast of
microwaves that nudges the quantum current states part of the way
out of their valleys, high enough to make quantum tunneling
between the states possible (facilitating  currents flowing in both
directions at the same time) but not so high as to break up the
electron pairs which are the heart of the superconducting condition. 
One hope is that this type of large coherent quantum state, well
isolated from the outside (nonquantum) environment, could be put
to service in quantum computing.  (Friedman et al., Nature, 6 July
2000.)
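
A minimal two-level caricature of the experiment (not the actual
device model): the clockwise and counterclockwise current states are
coupled by a tunneling amplitude, and each energy eigenstate is a
superposition carrying current both ways at once.

  import numpy as np

  asym = 0.5      # energy asymmetry between the two wells (arbitrary units)
  tunnel = 0.2    # tunneling amplitude between |cw> and |ccw>

  # Hamiltonian in the {|cw>, |ccw>} basis.
  H = np.array([[ asym / 2, -tunnel  ],
                [-tunnel,   -asym / 2]])
  energies, states = np.linalg.eigh(H)

  for E, psi in zip(energies, states.T):
      print(E, psi)    # both amplitudes nonzero in each eigenstate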

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 505  October 4, 2000   by Phillip F. Schewe and Ben Stein

FIRST RESULTS FROM RHIC.  Brookhaven's Relativistic Heavy
Ion Collider (RHIC) had its first heavy-ion collisions back in
June and since then extremely energetic smashups between gold
atoms have been lighting up detectors in the four interaction halls,
creating fireballs that approximate tiny pieces of the universe as it
might have been only microseconds after the big bang. One conspicuous
goal at RHIC is to rip apart protons and neutrons inside the
colliding nuclei in order to create novel forms of nuclear
matter, such as quark gluon plasma.  The beam energies have been
as high as 130 GeV per nucleon and the beam density is up to
about 10% of its design value.  In the first published RHIC paper,
the PHOBOS collaboration (contact Gunther Roland, MIT,
gunther.roland at cern.ch) describes the "pseudorapidity" (related to
the velocity along the direction of the beams) of the myriad
particles emerging from the collisions.  The researchers pay special
attention to particles emerging at right angles to the incoming
beams.  These particles emanate from the most violent of
collisions, which on average create about 6000-7000 particles per
event, more than have ever been seen in accelerator experiments
before.  The number of particles produced in turn is indicative of
the energy density of the fireball produced at the moment of
collision; this density, 70% higher than in previous heavy-ion
experiments, carries the RHIC researchers into a new portion of
the nuclear phase diagram.  The data presented here help to
constrain models of this high-density nuclear realm.  (Back et al.,
Physical Review Letters, 9 Oct; Select Articles.)  All four RHIC
detector groups (STAR, PHENIX, and BRAHMS are the three
others) will be presenting their first scientific findings at the
American Physical Society Division of Nuclear Physics Meeting in
Williamsburg, VA on October 4-7 (www.aps.org/meet/DNP00). 
While no announcement of a quark gluon plasma is expected,
researchers plan to describe numerous impressive aspects of
RHIC's early operation.
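
Pseudorapidity is defined as eta = -ln(tan(theta/2)), where theta is
measured from the beam axis; eta = 0 is exactly the right-angle
region singled out above.  The definition in a few lines (not from
the paper):

  import math

  def pseudorapidity(theta_deg):
      """eta = -ln(tan(theta/2)), theta from the beam axis."""
      return -math.log(math.tan(math.radians(theta_deg) / 2))

  for theta in (90, 45, 10, 1):
      print(theta, pseudorapidity(theta))   # 0.0, ~0.88, ~2.4, ~4.7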
     
-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 480   April 24, 2000   by Phillip F. Schewe and Ben Stein

EXPLOITING QUANTUM "SPOOKINESS" TO CREATE
SECRET CODES has been demonstrated for the first time by three
independent research groups, advancing hopes for eventually
protecting sensitive data from any kind of computer attack.  In the
latest--and most foolproof--variation yet of the data-encryption
scheme known as quantum cryptography, researchers employ pairs
of "entangled" photons, particles that can be so intimately
interlinked even when far apart that a perplexed Einstein once
derided their behavior as "spooky action at a distance." 
Entanglement-based quantum cryptography has unique features for
sending coded data at practical transmission rates and detecting
eavesdroppers.  In short, the entanglement process can generate a
completely random sequence of 0s and 1s distributed exclusively to
two users at remote locations.  Any eavesdropper's attempt to
intercept this sequence will alter the message in a detectable way,
enabling the users to discard the appropriate parts of the data.  This
random sequence of digits, or "key," can then be plugged into a code
scheme known as a "one-time pad cipher," which converts the
message into a completely random sequence of letters.
   This code scheme--mathematically proven to be unbreakable
without knowledge of the key--actually dates back to World War I,
but its main flaw had been that the key could be intercepted by an
intermediary.  In the 1990s, Oxford's Artur Ekert
(artur.ekert at qubit.org) proposed an entanglement-based version of
this scheme, not realized until now.  In the most basic version, a
specially prepared crystal splits a single photon into a pair of
entangled photons.   Both the message sender (traditionally called
Alice) and the receiver (called Bob) get one of the photons.  Alice
and Bob each have a detector for measuring their photon's
polarization, the direction in which its electric field vibrates. 
Different polarizations could represent different digits, such as the 0
and 1 of binary code.  But according to quantum mechanics, each 
photon can be in a combination (or superposition) of polarization
states, and essentially be a 0 and 1 at the same time.  Only when one
of them is measured or otherwise disturbed does it "collapse" to a
definite value of 0 or 1, in a random way.  But once one particle
collapses, its entangled partner is also forced to collapse into a
specific digit correlated with the first digit.  With the right
combination of detector settings on each end, Alice and Bob will get
the exact same digit.  After receiving a string of entangled photons,
Alice and Bob discuss which detector settings they used, rather than
the actual readings they obtained, and they discard readings made
with the incorrect settings. At that point, Alice and Bob have a
random string of digits that can serve as a completely secure key for
the mathematically unbreakable one-time pad cipher.  In their
demonstration, Los Alamos researchers (Paul Kwiat, 505-667-6173,
kwiat at lanl.gov) simulated an eavesdropper (by passing the photons
through a filter on their way to Alice and Bob) and readily detected
disturbances in their transmissions (by employing what may be the
first practical application of the quantum-mechanical test known as
Bell's theorem), enabling them to discard the purloined information.
   In a separate demonstration of entangled cryptography for
completely isolated Alice and Bob stations separated by 1 km of
fiber optics, an Austrian research team (Thomas Jennewein,
University of Vienna, 011-43-1-4277-51207,
thomas.jennewein at univie.ac.at) created a secret key and then
securely transmitted an image of the "Venus of Willendorf," one of
the earliest known works of art. (See figures at www.quantum.at and
www.aip.org/physnews/graphics.)   Meanwhile, a University of
Geneva group (Nicolas Gisin, Nicolas.Gisin at physics.unige.ch,
011-41 22 702 65 97) demonstrates entangled cryptography over
many kilometers of fiber using a photon frequency closest to what is
used on real-life fiber optics lines.  In these first experiments, the
three groups demonstrated relatively slow data transmission rates. 
However, entanglement-based cryptography is potentially faster than
non-entangled quantum cryptography, which requires single-photon
sources (and therefore, faint light sources) to foil eavesdropping. 
Entangled cryptography also produces relatively few of the excess
photons that an eavesdropper could conceivably skim for
information.  (Three upcoming papers in Physical Review Letters;
Select Article.)
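
The one-time pad itself is classical and simple: XOR the message
with a truly random key of the same length, and XOR again with the
same key to decrypt.  A minimal sketch (the hard part, securely
generating and sharing the key, is what the entangled photons
provide):

  import secrets

  def one_time_pad(data: bytes, key: bytes) -> bytes:
      """XOR data with an equal-length key; applying it twice decrypts."""
      assert len(key) == len(data)
      return bytes(d ^ k for d, k in zip(data, key))

  message = b"ATTACK AT DAWN"
  key = secrets.token_bytes(len(message))   # stand-in for a quantum key

  ciphertext = one_time_pad(message, key)
  print(one_time_pad(ciphertext, key))      # b'ATTACK AT DAWN'

Security rests entirely on the key being random, secret, and never
reused, which is precisely what the entanglement-based distribution
schemes above aim to guarantee.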

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 472  February 24, 2000   by Phillip F. Schewe and Ben Stein

LIGHT AT 1 mph.  A year ago Lene Vestergaard Hau used a Bose-
Einstein condensate (BEC) as a special nonlinear optical medium for
slowing light from 3 x 10^8 m/sec to a mere 17 m/sec (38 mph;
Update 415).  This comes about when an incoming light pulse enters
the BEC and experiences an extremely abrupt change in index of
refraction (and as for absorption of the light, this is prevented
by applying two laser beams which induce a transparency at the
frequency of the incoming light).  In a talk presented at this
week's meeting of the American Association for the Advancement of
Science (AAAS) in Washington, DC, Hau said that she and her Harvard
colleagues had slowed the light further, to a speed of 1 mph.  She
said that if  the velocity could be slowed still more, to a value
of 1 cm/sec, then this would be comparable to the speed of sound in
the condensate and it might be possible to get atoms to surf on the
front of the light pulse.  Hau believes that this approach to
slowing light, if it can be simplified, would lead to highly
sensitive light switches and to low-power nonlinear optics (right
now high-power laser light is required to produce nonlinear
effects).
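
For scale, a simple unit conversion (not from the talk):

  C = 2.998e8        # m/s, vacuum speed of light
  MPH = 0.44704      # m/s per mile per hour

  for v in (17.0, 1 * MPH, 0.01):   # 1999 result, new result, sound-speed goal
      print(v, v / MPH, C / v)      # speed, mph, slowdown factor

Slowing light to 1 mph is a slowdown of almost a factor of a billion
from its vacuum speed.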

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 487  June 2, 2000   by Phillip F. Schewe and Ben Stein

DELIVERING GENES AND DRUGS WITH ULTRASOUND-
ACTIVATED BUBBLES.  At this week's meeting of the
Acoustical Society of America, Evan Unger of the University of
Arizona and ImaRx Therapeutics in Tucson (520-770-1259,
eunger at imarx.com) presented several new uses for ultrasound
contrast agents, micron-sized bubbles that are injected into the
bloodstream for medical ultrasound purposes.   Traditionally used
to enhance ultrasound images of the heart, because they reflect
sound so well, the bubbles can now dissolve blood clots and
potentially deliver genes and drugs to targeted parts of the body.  
Introducing the microbubbles into a rabbit's blood vessel, and
aiming ultrasound at it, Unger and his coworkers dissolved a blood
clot, by causing the bubbles to pop in that location and sweep away
the clot in small pieces.  In addition, Unger and his colleagues
have attached drugs and genes to the microbubbles in several
ways.  Introducing gene-containing microbubbles into an animal
and aiming ultrasound at its heart, the researchers observed
significant quantities of CAT-15, the protein expressed by the gene,
in the heart.  In traditional gene therapy, the gene is delivered via a
modified virus, which may cause serious allergic reactions in some
cases.  But ultrasound-activated microbubbles may provide a safer
alternative, and a more effective one, since the application of
ultrasound even without the bubbles seems to enhance  the
introduction of genes and drugs into cells in many cases.  Even
without the bubbles, Unger showed that ultrasound enabled the
tumor-suppressing drug interleukin-12 to be taken up in 10-1000
times greater amounts in mice.  Unger speculated that the
microbubbles might someday be used in outpatient heart exams,
first to detect plaque, then to dissolve the plaque if it is present. 
While promising, these applications all require further testing and
development.

-----------------------------
PHYSICS NEWS UPDATE                         
The American Institute of Physics Bulletin of Physics News
Number 503  September 22, 2000   by Phillip F. Schewe and Ben Stein

ENTANGLED PHOTONS CAN DEFEAT THE DIFFRACTION
LIMIT, a new paper suggests.  This might lead to a much sharper
form of microchip lithography than is possible with "classical"
photons.  The factor that ordinarily determines how small a standard
lithography technique can write features on a chip is known as the
diffraction limit, or Rayleigh criterion, which says that you can't
inscribe a feature, or see a detail, smaller than half the wavelength of
the light or other radiation used to perform the task.   But new
research (Jonathan Dowling, JPL/Caltech, 818-393-5343,
Jonathan.P.Dowling at jpl.nasa.gov) shows that the Rayleigh criterion
applies to classical physics but not quantum physics.  In their
proposal for "quantum interferometric lithography," two entangled
photons enter a setup containing mirrors and beamsplitters.   The two
photons--acting as a single unit--constitute a light wave which is split
up and then recombined on a surface, creating patterns on the surface
equivalent to those that would be made by a single photon with half
the wavelength.  On a 2-D surface, this would allow researchers to
write features four times smaller than prescribed by the Rayleigh
limit.  Preparing three entangled photons (still more difficult) and
sending them through the device would create even better results:
effectively a single photon with a third of the wavelength, enabling
nine-fold smaller features on a 2-D surface.  Although more work is
needed to realize this proposal, the technique potentially allows the
creation of features smaller than 25 nm, the size limit below which
classical computer designs would begin to fail because of phenomena
such as electron tunneling.  (Boto et al., Physical Review Letters, 25
Sept 2000; Select Articles.)
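
The claimed scaling is that N entangled photons act like one photon
of wavelength lambda/N, shrinking the minimum feature by N in each
dimension and the feature area on a 2-D surface by N^2.  A quick
check (the source wavelength is an illustrative choice, not from the
paper):

  wavelength_nm = 200.0    # illustrative lithography wavelength

  for n in (1, 2, 3):
      feature = wavelength_nm / 2 / n     # Rayleigh limit over N
      print(n, feature, n**2)             # photons, min feature (nm), area gain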

==============================END=================================



