Tag Archives: physics

Physicists solve quantum tunneling mystery

An international team of scientists studying ultrafast physics has solved a mystery of quantum mechanics, finding that quantum tunneling is an instantaneous process.

The new theory could lead to faster and smaller electronic components, for which quantum tunneling is a significant factor. It will also lead to a better understanding of diverse areas such as electron microscopy, nuclear fusion and DNA mutations.

“Timescales this short have never been explored before. It’s an entirely new world,” said one of the international team, Professor Anatoli Kheifets, from The Australian National University (ANU).

“We have modelled the most delicate processes of nature very accurately.”

At very small scales quantum physics shows that particles such as electrons have wave-like properties – their exact position is not well defined. This means they can occasionally sneak through apparently impenetrable barriers, a phenomenon called quantum tunneling.

Quantum tunneling plays a role in a number of phenomena, such as nuclear fusion in the sun, scanning tunneling microscopy, and flash memory for computers. However, the leakage of particles also limits the miniaturisation of electronic components.

Professor Kheifets and Dr. Igor Ivanov, from the ANU Research School of Physics and Engineering, are members of a team which studied ultrafast experiments at the attosecond scale (10⁻¹⁸ seconds), a field that has developed in the last 15 years.

Until their work, a number of attosecond phenomena could not be adequately explained, such as the time delay when a photon ionised an atom.

“At that timescale the time an electron takes to quantum tunnel out of an atom was thought to be significant. But the mathematics says the time during tunneling is imaginary – a complex number – which we realised meant it must be an instantaneous process,” said Professor Kheifets.

“A very interesting paradox arises, because electron velocity during tunneling may become greater than the speed of light. However, this does not contradict the special theory of relativity, as the tunneling velocity is also imaginary,” said Dr. Ivanov, who recently took up a position at the Center for Relativistic Laser Science in Korea.
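
The imaginary quantities in these remarks can be made concrete with a textbook one-dimensional barrier (a sketch for illustration only, not the atomic calculation reported here). For a particle of energy E hitting a barrier of height V0 > E and width d, the semiclassical momentum under the barrier is

\[
p = \sqrt{2m\,(E - V_0)} = i\hbar\kappa, \qquad \kappa = \frac{\sqrt{2m\,(V_0 - E)}}{\hbar},
\]

so the wavefunction decays as e^{-\kappa x} instead of oscillating, and both the "velocity" p/m and the naive traversal time \tau \sim d\,m/p come out imaginary rather than as real, measurable durations.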

The team’s calculations, which were made using the Raijin supercomputer, revealed that the delay in photoionisation originates not from quantum tunneling but from the electric field of the nucleus attracting the escaping electron.

The results give an accurate calibration for future attosecond-scale research, said Professor Kheifets.

“It’s a good reference point for future experiments, such as studying proteins unfolding, or speeding up electrons in microchips,” he said.

The research is published in Nature Physics.

Source: ANU

Experiment confirms quantum theory weirdness

The bizarre nature of reality as laid out by quantum theory has survived another test, with scientists performing a famous experiment and proving that reality does not exist until it is measured.

Physicists at The Australian National University (ANU) have conducted John Wheeler’s delayed-choice thought experiment, which involves a moving object that is given the choice to act like a particle or a wave. Wheeler’s experiment then asks – at which point does the object decide?

Common sense says the object is either wave-like or particle-like, independent of how we measure it. But quantum physics predicts that whether you observe wave-like behavior (interference) or particle-like behavior (no interference) depends only on how the object is actually measured at the end of its journey. This is exactly what the ANU team found.

“It proves that measurement is everything. At the quantum level, reality does not exist if you are not looking at it,” said Associate Professor Andrew Truscott from the ANU Research School of Physics and Engineering.

Despite the apparent weirdness, the results confirm the validity of quantum theory, which governs the world of the very small, and has enabled the development of many technologies such as LEDs, lasers and computer chips.

The ANU team not only succeeded in building the experiment, which seemed nearly impossible when it was proposed in 1978, but reversed Wheeler’s original concept of light beams being bounced by mirrors, and instead used atoms scattered by laser light.

“Quantum physics’ predictions about interference seem odd enough when applied to light, which seems more like a wave, but to have done the experiment with atoms, which are complicated things that have mass and interact with electric fields and so on, adds to the weirdness,” said Roman Khakimov, PhD student at the Research School of Physics and Engineering.

Professor Truscott’s team first trapped a collection of helium atoms in a suspended state known as a Bose-Einstein condensate, and then ejected them until there was only a single atom left.

The single atom was then dropped through a pair of counter-propagating laser beams, which formed a grating pattern that acted as crossroads in the same way a solid grating would scatter light.

A second light grating to recombine the paths was randomly added, which led to constructive or destructive interference as if the atom had travelled both paths. When the second light grating was not added, no interference was observed as if the atom chose only one path.

However, the random number determining whether the grating was added was only generated after the atom had passed through the crossroads.
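
As a purely illustrative aside, the statistics at stake can be mimicked with a toy Monte Carlo of a generic two-path interferometer. This is textbook quantum mechanics, not a model of the ANU atom apparatus: the port probabilities cos²(φ/2) with the recombining grating and 1/2 without it are the standard two-path results, and φ is an assumed relative phase between the paths.

import numpy as np

rng = np.random.default_rng(42)

def run_trial(phi, add_second_grating):
    """One toy trial: does the atom end up in output port 1?"""
    if add_second_grating:
        # Paths recombined: the port-1 probability oscillates with the relative phase.
        p_port1 = np.cos(phi / 2.0) ** 2
    else:
        # No recombination: which-path behaviour, each port equally likely.
        p_port1 = 0.5
    return rng.random() < p_port1

phi = np.pi / 3
for choice in (True, False):
    hits = sum(run_trial(phi, choice) for _ in range(100_000))
    print(f"second grating added = {choice}: port-1 fraction = {hits / 1e5:.3f}")

With the recombining grating the port-1 fraction tracks cos²(φ/2) (interference); without it the fraction sits at 0.5 for every φ, the particle-like outcome described above. In the toy it makes no difference when the choice is drawn, which is exactly the statistical point the delayed-choice experiment tests.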

If one chooses to believe that the atom really did take a particular path or paths then one has to accept that a future measurement is affecting the atom’s past, said Truscott.

“The atoms did not travel from A to B. It was only when they were measured at the end of the journey that their wave-like or particle-like behavior was brought into existence,” he said.

The research is published in Nature Physics.

Source: ANU

New tabletop detector “sees” single electrons

Magnet-based setup may help detect the elusive mass of neutrinos.

Jennifer Chu


MIT physicists have developed a new tabletop particle detector that is able to identify single electrons in a radioactive gas.
As the gas decays and gives off electrons, the detector uses a magnet to trap them in a magnetic bottle. A radio antenna then picks up very weak signals emitted by the electrons, which can be used to map the electrons’ precise activity over several milliseconds.

Shown here is “event zero,” the first detection of a trapped electron in the MIT physicists’ instrument. The color indicates the electron’s detected power as a function of frequency and time. The sudden “jumps” in frequency indicate an electron collision with the residual hydrogen gas in the cell.
Courtesy of the researchers
Source: MIT News

The team worked with researchers at Pacific Northwest National Laboratory, the University of Washington, the University of California at Santa Barbara (UCSB), and elsewhere to record the activity of more than 100,000 individual electrons in krypton gas.
The majority of electrons observed behaved in a characteristic pattern: As the radioactive krypton gas decays, it emits electrons that vibrate at a baseline frequency before petering out; this frequency spikes again whenever an electron hits an atom of radioactive gas. As an electron ping-pongs against multiple atoms in the detector, its energy appears to jump in a step-like pattern.
“We can literally image the frequency of the electron, and we see this electron suddenly pop into our radio antenna,” says Joe Formaggio, an associate professor of physics at MIT. “Over time, the frequency changes, and actually chirps up. So these electrons are chirping in radio waves.”
Formaggio says the group’s results, published in Physical Review Letters, are a big step toward a more elusive goal: measuring the mass of a neutrino.

A ghostly particle
Neutrinos are among the more mysterious elementary particles in the universe: Billions of them pass through every cell of our bodies each second, and yet these ghostly particles are incredibly difficult to detect, as they don’t appear to interact with ordinary matter. Scientists have set theoretical limits on neutrino mass, but researchers have yet to precisely detect it.
“We have [the mass] cornered, but haven’t measured it yet,” Formaggio says. “The name of the game is to measure the energy of an electron — that’s your signature that tells you about the neutrino.”
As Formaggio explains it, when a radioactive atom such as tritium decays, it turns into an isotope of helium and, in the process, also releases an electron and a neutrino. The energy of all particles released adds up to the original energy of the parent neutron. Measuring the energy of the electron, therefore, can illuminate the energy — and consequently, the mass — of the neutrino.
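In the textbook picture, and using the well-known tritium numbers rather than anything measured in this experiment, the decay and its energy budget read

\[
{}^{3}\mathrm{H} \;\rightarrow\; {}^{3}\mathrm{He}^{+} + e^{-} + \bar{\nu}_e,
\qquad E_{e} + E_{\nu}\ (+\ \text{a tiny recoil}) \;=\; Q \approx 18.6\ \mathrm{keV},
\]

so the electron spectrum ends at E_e^max = Q - m_ν c²: a nonzero neutrino mass pulls the endpoint down by exactly m_ν c², which is why extremely precise electron energies near the endpoint can reveal the neutrino mass.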
Scientists agree that tritium, a radioactive isotope of hydrogen, is key to obtaining a precise measurement: As a gas, tritium decays at such a rate that scientists can relatively easily observe its electron byproducts.
Researchers in Karlsruhe, Germany, hope to measure electrons in tritium using a massive spectrometer as part of an experiment named KATRIN (Karlsruhe Tritium Neutrino Experiment). Electrons, produced from the decay of tritium, pass through the spectrometer, which filters them according to their different energy levels. The experiment, which is just getting under way, may obtain measurements of single electrons, but at a cost.
“In KATRIN, the electrons are detected in a silicon detector, which means the electrons smash into the crystal, and a lot of random things happen, essentially destroying the electrons,” says Daniel Furse, a graduate student in physics, and a co-author on the paper. “We still want to measure the energy of electrons, but we do it in a nondestructive way.”
The group’s setup has an additional advantage: size. The detector essentially fits on a tabletop, and the space in which electrons are detected is smaller than a postage stamp. In contrast, KATRIN’s spectrometer, when delivered to Karlsruhe, barely fit through the city’s streets.
Tuning in
Furse and Formaggio’s detector — an experiment called “Project 8” — is based on a decades-old phenomenon known as cyclotron radiation, in which charged particles such as electrons emit radio waves in a magnetic field. It turns out electrons emit this radiation at a frequency similar to that of military radio communications.
“It’s the same frequency that the military uses — 26 gigahertz,” Formaggio says. “And it turns out the baseline frequency changes very slightly if the electron has energy. So we said, ‘Why not look at the radiation [electrons] emit directly?’”
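The frequency–energy relationship Formaggio describes is the relativistic cyclotron formula (the field value of about 1 tesla below is an assumption for illustration, not a number quoted in the article):

\[
f_c = \frac{1}{2\pi}\frac{eB}{\gamma m_e} \approx \frac{27.99\ \mathrm{GHz}}{\gamma}\cdot\frac{B}{1\ \mathrm{T}},
\qquad \gamma = 1 + \frac{E_{\mathrm{kin}}}{m_e c^{2}},
\]

so a roughly 30 keV conversion electron (γ ≈ 1.06) in a field of about 1 T radiates near 26 GHz, and each additional electronvolt of kinetic energy shifts the frequency by only about 50 kHz, which is why reading off the energy amounts to resolving a very small frequency change.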
Formaggio and former postdoc Benjamin Monreal, now an assistant professor of physics at UCSB, reasoned that if they could tune into this baseline frequency, they could catch electrons as they shot out of a decaying radioactive gas, and measure their energy in a magnetic field.
“If you could measure the frequency of this radio signal, you could measure the energy potentially much more accurately than you can with any other method,” Furse says. “The problem is, you’re looking at this really weak signal over a very short amount of time, and it’s tough to see, which is why no one has ever done it before.”
It took five years of fits and starts before the group was finally able to build an accurate detector. Once the researchers turned the detector on, they were able to record individual electrons within the first 100 milliseconds of the experiment — although the analysis took a bit longer.
“Our software was so slow at processing things that we could tell funny things were happening because, all of a sudden, our file size became larger, as these things started appearing,” Formaggio recalls.
He says the precision of the measurements obtained so far in krypton gas has encouraged the team to move on to tritium — a goal Formaggio says may be attainable in the next year or two — and pave a path toward measuring the mass of the neutrino.
Steven Elliott, a technical staff member at Los Alamos National Laboratory, says the group’s new detector “represents a very significant result.” In order to use the detector to measure the mass of a neutrino, Elliott adds, the group will have to make multiple improvements, including developing a bigger cell to contain a larger amount of tritium.
“This was the first step, albeit a very important step, along the way to building a next-generation experiment,” says Elliott, who did not contribute to the research. “As a result, the neutrino community is very impressed with the concept and execution of this experiment.”
This research was funded in part by the Department of Energy and the National Science Foundation.

Giant Galaxies Die from the Inside Out

VLT and Hubble observations show that star formation shuts down in the centres of elliptical galaxies first


Astronomers have shown for the first time how star formation in “dead” galaxies sputtered out billions of years ago. ESO’s Very Large Telescope and the NASA/ESA Hubble Space Telescope have revealed that three billion years after the Big Bang, these galaxies still made stars on their outskirts, but no longer in their interiors. The quenching of star formation seems to have started in the cores of the galaxies and then spread to the outer parts. The results will be published in the 17 April 2015 issue of the journal Science.

Star formation in what are now “dead” galaxies sputtered out billions of years ago. ESO’s Very Large Telescope and the NASA/ESA Hubble Space Telescope have revealed that three billion years after the Big Bang, these galaxies still made stars on their outskirts, but no longer in their interiors. The quenching of star formation seems to have started in the cores of the galaxies and then spread to the outer parts.
This diagram illustrates this process. Galaxies in the early Universe appear at the left. The blue regions are where star formation is in progress and the red regions are the “dead” regions where only older redder stars remain and there are no more young blue stars being formed. The resulting giant spheroidal galaxies in the modern Universe appear on the right.
Credit:
ESO

A major astrophysical mystery has centred on how massive, quiescent elliptical galaxies, common in the modern Universe, quenched their once furious rates of star formation. Such colossal galaxies, often also called spheroids because of their shape, typically pack in stars ten times as densely in the central regions as in our home galaxy, the Milky Way, and have about ten times its mass.

Astronomers refer to these big galaxies as red and dead as they exhibit an ample abundance of ancient red stars, but lack young blue stars and show no evidence of new star formation. The estimated ages of the red stars suggest that their host galaxies ceased to make new stars about ten billion years ago. This shutdown began right at the peak of star formation in the Universe, when many galaxies were still giving birth to stars at a pace about twenty times faster than nowadays.

“Massive dead spheroids contain about half of all the stars that the Universe has produced during its entire life,” said Sandro Tacchella of ETH Zurich in Switzerland, lead author of the article. “We cannot claim to understand how the Universe evolved and became as we see it today unless we understand how these galaxies come to be.”

Tacchella and colleagues observed a total of 22 galaxies, spanning a range of masses, from an era about three billion years after the Big Bang [1]. The SINFONI instrument on ESO’s Very Large Telescope (VLT) collected light from this sample of galaxies, showing precisely where they were churning out new stars. SINFONI could make these detailed measurements of distant galaxies thanks to its adaptive optics system, which largely cancels out the blurring effects of Earth’s atmosphere.

The researchers also trained the NASA/ESA Hubble Space Telescope on the same set of galaxies, taking advantage of the telescope’s location in space above our planet’s distorting atmosphere. Hubble’s WFC3 camera snapped images in the near-infrared, revealing the spatial distribution of older stars within the actively star-forming galaxies.

“What is amazing is that SINFONI’s adaptive optics system can largely beat down atmospheric effects and gather information on where the new stars are being born, and do so with precisely the same accuracy as Hubble allows for the stellar mass distributions,” commented Marcella Carollo, also of ETH Zurich and co-author of the study.

According to the new data, the most massive galaxies in the sample kept up a steady production of new stars in their peripheries. In their bulging, densely packed centres, however, star formation had already stopped.

“The newly demonstrated inside-out nature of star formation shutdown in massive galaxies should shed light on the underlying mechanisms involved, which astronomers have long debated,” says Alvio Renzini, Padova Observatory, of the Italian National Institute of Astrophysics.

A leading theory is that star-making materials are scattered by torrents of energy released by a galaxy’s central supermassive black hole as it sloppily devours matter. Another idea is that fresh gas stops flowing into a galaxy, starving it of fuel for new stars and transforming it into a red and dead spheroid.

“There are many different theoretical suggestions for the physical mechanisms that led to the death of the massive spheroids,” said co-author Natascha Förster Schreiber, at the Max-Planck-Institut für extraterrestrische Physik in Garching, Germany. “Discovering that the quenching of star formation started from the centres and marched its way outwards is a very important step towards understanding how the Universe came to look like it does now.”

Notes
[1] The Universe’s age is about 13.8 billion years, so the galaxies studied by Tacchella and colleagues are generally seen as they were more than 10 billion years ago.

Source: ESO


First Signs of Self-interacting Dark Matter?

Dark matter may not be completely dark after all


According to our current understanding of the universe, based on surveys such as the cosmic microwave background observations by Planck and WMAP, ordinary (baryonic) matter accounts for only about 4-5% of its content. The remaining 95-96% is still a mystery. This unknown portion of the dark universe is thought to consist of dark energy (the source of the accelerating expansion of the universe) and dark matter (the unexplained extra mass of galaxies). Despite indirect signatures pointing to their presence, neither has yet been observed directly.

For the first time dark matter may have been observed interacting with other dark matter in a way other than through the force of gravity. Observations of colliding galaxies made with ESO’s Very Large Telescope and the NASA/ESA Hubble Space Telescope have picked up the first intriguing hints about the nature of this mysterious component of the Universe.

This image from the NASA/ESA Hubble Space Telescope shows the rich galaxy cluster Abell 3827. The strange pale blue structures surrounding the central galaxies are gravitationally lensed views of a much more distant galaxy behind the cluster.
The distribution of dark matter in the cluster is shown with blue contour lines. The dark matter clump for the galaxy at the left is significantly displaced from the position of the galaxy itself, possibly implying dark matter-dark matter interactions of an unknown nature are occurring.
Credit:
ESO/R. Massey

Using the MUSE instrument on ESO’s VLT in Chile, along with images from Hubble in orbit, a team of astronomers studied the simultaneous collision of four galaxies in the galaxy cluster Abell 3827. The team could trace out where the mass lies within the system and compare the distribution of the dark matter with the positions of the luminous galaxies.

Although dark matter cannot be seen, the team could deduce its location using a technique called gravitational lensing. The collision happened to take place directly in front of a much more distant, unrelated source. The mass of dark matter around the colliding galaxies severely distorted spacetime, deviating the path of light rays coming from the distant background galaxy — and distorting its image into characteristic arc shapes.

Our current understanding is that all galaxies exist inside clumps of dark matter. Without the constraining effect of dark matter’s gravity, galaxies like the Milky Way would fling themselves apart as they rotate. In order to prevent this, 85 percent of the Universe’s mass [1] must exist as dark matter, and yet its true nature remains a mystery.
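
The scale of that constraining effect can be seen with a back-of-envelope Newtonian estimate, using round Milky Way numbers purely for illustration:

\[
\frac{v^{2}}{r} = \frac{G\,M(<r)}{r^{2}}
\;\Rightarrow\;
M(<r) = \frac{v^{2}r}{G} \approx \frac{(220\ \mathrm{km\,s^{-1}})^{2}\times 8\ \mathrm{kpc}}{G} \approx 9\times10^{10}\,M_{\odot}
\]

already inside the Sun's orbit. Because the measured rotation speed stays roughly flat well beyond the visible disc, the enclosed mass keeps growing with radius, far beyond what stars and gas can supply; that missing mass is attributed to the dark matter halo.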

In this study, the researchers observed the four colliding galaxies and found that one dark matter clump appeared to be lagging behind the galaxy it surrounds. The dark matter is currently 5000 light-years (50 000 million million kilometres) behind the galaxy — it would take NASA’s Voyager spacecraft 90 million years to travel that far.
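
Those figures are easy to check, taking Voyager 1's cruise speed of roughly 17 km/s (a value not stated in the article):

\[
5000\ \mathrm{ly} \times 9.46\times10^{12}\ \mathrm{km/ly} \approx 4.7\times10^{16}\ \mathrm{km},
\qquad
\frac{4.7\times10^{16}\ \mathrm{km}}{17\ \mathrm{km/s}} \approx 2.8\times10^{15}\ \mathrm{s} \approx 90\ \text{million years}.
\]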

A lag between dark matter and its associated galaxy is predicted during collisions if dark matter interacts with itself, even very slightly, through forces other than gravity [2]. Dark matter has never before been observed interacting in any way other than through the force of gravity.

Lead author Richard Massey at Durham University, explains: “We used to think that dark matter just sits around, minding its own business, except for its gravitational pull. But if dark matter were being slowed down during this collision, it could be the first evidence for rich physics in the dark sector — the hidden Universe all around us.”

The researchers note that more investigation will be needed into other effects that could also produce a lag. Similar observations of more galaxies, and computer simulations of galaxy collisions will need to be made.

Team member Liliya Williams of the University of Minnesota adds: “We know that dark matter exists because of the way that it interacts gravitationally, helping to shape the Universe, but we still know embarrassingly little about what dark matter actually is. Our observation suggests that dark matter might interact with forces other than gravity, meaning we could rule out some key theories about what dark matter might be.”

This result follows on from a recent result from the team which observed 72 collisions between galaxy clusters [3] and found that dark matter interacts very little with itself. The new work however concerns the motion of individual galaxies, rather than clusters of galaxies. Researchers say that the collision between these galaxies could have lasted longer than the collisions observed in the previous study — allowing the effects of even a tiny frictional force to build up over time and create a measurable lag [4].

Taken together, the two results bracket the behaviour of dark matter for the first time: its self-interaction is weaker than the upper limit set by the cluster study, yet, if the lag is confirmed, stronger than zero. Massey added: “We are finally homing in on dark matter from above and below — squeezing our knowledge from two directions.”

Notes
[1] Astronomers have found that the total mass/energy content of the Universe is split in the proportions 68% dark energy, 27% dark matter and 5% “normal” matter. So the 85% figure relates to the fraction of “matter” that is dark.
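
In other words, considering matter alone,

\[
\frac{27}{27 + 5} \approx 0.84 \approx 85\%\ \text{of it is dark.}
\]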

[2] Computer simulations show that the extra friction from the collision would make the dark matter slow down. The nature of that interaction is unknown; it could be caused by well-known effects or some exotic unknown force. All that can be said at this point is that it is not gravity.

All four galaxies might have been separated from their dark matter. But we happen to have a very good measurement from only one galaxy, because it is by chance aligned so well with the background, gravitationally lensed object. With the other three galaxies, the lensed images are further away, so the constraints on the location of their dark matter are too loose to draw statistically significant conclusions.

[3] Galaxy clusters contain up to a thousand individual galaxies.

[4] The main uncertainty in the result is the timespan for the collision: the friction that slowed the dark matter could have been a very weak force acting over about a billion years, or a relatively stronger force acting for “only” 100 million years.

Source: ESO

Opening a window on quantum gravity

Yale University has received a grant from the W. M. Keck Foundation to fund experiments that researchers hope will provide new insights into quantum gravity. Jack Harris, associate professor of physics, will lead a Yale team that aims to address a long-standing question in physics — how the classical behavior of macroscopic objects emerges from microscopic constituents that obey the laws of quantum mechanics.

Very small objects like photons and electrons are known for their odd behavior. Thanks to the laws of quantum mechanics, they can act as particles or waves, appear in multiple places at once, and mysteriously interact over great distances. The question is why these behaviors are not observed in larger objects.

A cartoon illustration of a levitated drop of superfluid helium. A single photon circulating inside the drop (red arrow) will be used to produce the superposition. The drop’s gravitational field (illustrated schematically in the background) may play a role in limiting the lifetime of such a superposition.
Credit: Yale News

Scientists know that friction plays an important part in producing classical behavior in macroscopic objects, but many suspect that gravity also suppresses quantum effects. Unfortunately, there has been no practical way to test this possibility, and in the absence of a full quantum theory of gravity, it is difficult even to make any quantitative predictions.

To address this problem, Harris will create a novel instrument that will enable a drop of liquid helium to exhibit quantum mechanical effects. “A millimeter across,” Harris said, “our droplet will be five orders of magnitude more massive than any other object in which quantum effects have been observed. It will enable us to explore quantum behavior on unprecedentedly macroscopic scales and to provide the first experimental tests of leading models of gravity at the quantum level.”
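
For a sense of scale, taking the standard density of liquid helium of about 0.145 g/cm³ (an assumed value for illustration, not a figure from the announcement), a drop one millimetre across has a mass of roughly

\[
m = \rho\,\tfrac{4}{3}\pi r^{3} \approx 0.145\ \mathrm{g\,cm^{-3}}\times\tfrac{4}{3}\pi\,(0.05\ \mathrm{cm})^{3} \approx 8\times10^{-5}\ \mathrm{g} \approx 80\ \mu\mathrm{g},
\]

consistent with the quoted five orders of magnitude if the heaviest objects shown to behave quantum mechanically so far sit at the nanogram scale.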

Game-changing research

The W.M. Keck Foundation grant will fund five years of activity at the Harris lab, which is part of Yale’s Department of Physics. In the first year, Harris and his team will construct their apparatus, and in subsequent years they will use it to perform increasingly sophisticated experiments.

“We are extremely grateful to the W.M. Keck Foundation for this generous support,” said Steven Girvin, the Eugene Higgins Professor of Physics and deputy provost for research. “This is a forward-looking grant that will advance truly ground-breaking research.”

Girvin, whose own research interests include quantum computing, described the Harris project as a possible game-changer. “Truly quantum mechanical behaviors have been observed in the flight of molecules through a vacuum and in the flow of electrons through superconductive circuits, but nothing has been accomplished on this scale. If Jack succeeds, this would be the first time that an object visible to the naked eye has bulk motion that exhibits genuine quantum mechanical effects.”

Into the whispering gallery

To explain his project, Harris invokes an architectural quirk of St. Paul’s cathedral, a London landmark with a famous “whispering gallery.” High up in its main dome, a whisper uttered against one wall is easily audible at great distances, as the sound waves skim along the dome’s interior. Harris plans to create his own whispering gallery, albeit on a smaller scale, using a droplet of liquid helium suspended in a powerful magnetic field. Rather than sound waves, Harris’ gallery will bounce a single photon.

This approach is closely related to an idea proposed by Albert Einstein in the 1920s, but until now it has remained beyond the technical capabilities of experimentalists. To complete the experiment, Harris will need to combine recent advances in three different areas of physics: the study of optical cavities (objects that can capture photons), magnetic levitation, and the strange, frictionless world of superfluid helium.

“Superfluid liquid helium has particular properties, like absence of viscosity and near-absence of optical absorption,” Harris explained. “In our device, a drop of liquid helium will be made to capture a single photon, which will bounce around inside. We expect to see the drop respond to the photon.”

“A photon always behaves quantum mechanically,” he added. “If you have a macroscopic object — our helium drop — that responds appreciably to a photon, the quantum mechanical behavior can be transferred to the large object. Our device will be ideally suited to studying quantum effects in the drop’s motion.”

Potential applications for Harris’ research include new approaches to computing, cryptography, and communications. But Harris is most excited about the implications for fundamental physics: “Finding a theory of quantum gravity has been an outstanding challenge in physics for several decades, and it has proceeded largely without input from experiments. We hope that our research can provide some empirical data in this arena.”

About the W.M. Keck Foundation

The W.M. Keck Foundation was established in 1954 by William Myron Keck, founder of the Superior Oil Company. The foundation supports pioneering research in science, engineering, and medicine and has provided generous funding for numerous research initiatives at Yale University. In 2014, the Keck Foundation awarded a separate grant to a team of scientists led by Corey O’Hern, associate professor of mechanical engineering at Yale, to explore the physics of systems composed of macro-sized particles.

Source: Yale News

Tsunami on demand: the power to harness catastrophic events

A new study published in Nature Physics features a nano-optical chip that makes it possible to generate and control nanoscale rogue waves. The innovative chip was developed by an international team of physicists, led by Andrea Fratalocchi from KAUST (Saudi Arabia), and is expected to have significant applications for energy research and environmental safety.

Can you imagine how much energy is in a tsunami wave, or in a tornado? Energy is all around us, but mainly contained in a quiet state. But there are moments in time when large amounts of energy build up spontaneously and create rare phenomena on a potentially disastrous scale. How these events occur, in many cases, is still a mystery.

To reveal the natural mechanisms behind such high-energy phenomena, Andrea Fratalocchi, assistant professor in the Computer, Electrical and Mathematical Science and Engineering Division of King Abdullah University of Science and Technology (KAUST), led a team of researchers from Saudi Arabia and three European universities and research centers to understand the dynamics of such destructive events and control their formation in new optical chips, which can open various technological applications. The results and implications of this study are published in the journal Nature Physics.

“I have always been fascinated by the unpredictability of nature,” Fratalocchi said. “And I believe that understanding this complexity is the next frontier that will open cutting edge pathways in science and offer novel applications in a variety of areas.”

Fratalocchi’s team began their research by developing new theoretical ideas to explain the formation of rare energetic natural events such as rogue waves — large surface waves that develop spontaneously in deep water and represent a potential risk for vessels and open-ocean oil platforms.

“Our idea was something never tested before,” Fratalocchi continued. “We wanted to demonstrate that small perturbations of a chaotic sea of interacting waves could, contrary to intuition, control the formation of rare events of exceptional amplitude.”
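
A deliberately crude, purely linear toy (it models nothing about the photonic chip itself) shows the basic ingredient: in a superposition of many waves with random phases, the amplitude occasionally spikes far above its typical level when the phases briefly align. All numbers below are arbitrary choices for illustration.

import numpy as np

rng = np.random.default_rng(7)
n_modes = 64
t = np.linspace(0.0, 5000.0, 200_000)

# A "sea" built from many waves with random frequencies and random phases.
sea = np.zeros_like(t)
for f, p in zip(rng.uniform(0.8, 1.2, n_modes), rng.uniform(0.0, 2.0 * np.pi, n_modes)):
    sea += np.cos(f * t + p)

sigma = sea.std()
extreme = np.abs(sea) > 4.0 * sigma   # rare moments when many phases happen to line up
print(f"fraction of samples above 4 sigma: {extreme.mean():.1e}")
print(f"largest excursion: {np.abs(sea).max() / sigma:.1f} sigma")

The hard part, and the subject of the Nature Physics paper, is not that such spikes occur but that their formation can be controlled by engineering which waves are present and how the chaotic system dissipates its energy; the toy above contains none of that physics.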

Fully experimental image of a nanoscaled and ultrafast optical rogue wave retrieved by Near-field Scanning Optical Microscope (NSOM). The flow lines visible in the image represent the direction of light energy.
Credit: KAUST

A planar photonic crystal chip, fabricated at the University of St. Andrews and tested at the FOM institute AMOLF in the Amsterdam Science Park, was used to generate ultrafast (163 fs long) and subwavelength (203 nm wide) nanoscale rogue waves, proving that Fratalocchi’s theory was correct. The newly developed photonic chip offered an exceptional level of controllability over these rare events.

Thomas F. Krauss, head of the Photonics Group and Nanocentre Cleanroom at the University of York, UK, was involved in the development of the experiment and the analysis of the data. He shared, “By realizing a sea of interacting waves on a photonic chip, we were able to study the formation of rare high energy events in a controlled environment. We noted that these events only happened when some sets of waves were missing, which is one of the key insights of our study.”

Kobus Kuipers, head of nanophotonics at FOM institute AMOLF, NL, who was involved in the experimental visualization of the rogue waves, was fascinated by their dynamics: “We have developed a microscope that allows us to visualize optical behavior at the nanoscale. Unlike conventional wave behavior, it was remarkable to see the rogue waves suddenly appear, seemingly out of nowhere, and then disappear again…as if they had never been there.”

Andrea Di Falco, leader of the Synthetic Optics group at the University of St. Andrews said, “The advantage of using light confined in an optical chip is that we can control very carefully how the energy in a chaotic system is dissipated, giving rise to these rare and extreme events. It is as if we were able to produce a determined amount of waves of unusual height in a small lake, just by accurately landscaping its coasts and controlling the size and number of its emissaries.”

The outcomes of this project offer leading edge technological applications in energy research, high speed communication and in disaster preparedness.

Fratalocchi and the team believe their research represents a major milestone for KAUST and for the field. “This discovery can change once and for all the way we look at catastrophic events,” concludes Fratalocchi, “opening new perspectives in preventing their destructive appearance on large scales, or using their unique power for ideating new applications at the nanoscale.”

The title of the Nature Physics paper is “Triggering extreme events at the nanoscale in photonic seas.” The paper is accessible on the Nature Physics website: http://dx.doi.org/10.1038/nphys3263

Source: KAUST News

Quantum sensor’s advantages survive entanglement breakdown

Preserving the fragile quantum property known as entanglement isn’t necessary to reap benefits.

By Larry Hardesty 


CAMBRIDGE, Mass. – The extraordinary promise of quantum information processing — solving problems that classical computers can’t, perfectly secure communication — depends on a phenomenon called “entanglement,” in which the physical states of different quantum particles become interrelated. But entanglement is very fragile, and the difficulty of preserving it is a major obstacle to developing practical quantum information systems.

In a series of papers since 2008, members of the Optical and Quantum Communications Group at MIT’s Research Laboratory of Electronics have argued that optical systems that use entangled light can outperform classical optical systems — even when the entanglement breaks down.

Two years ago, they showed that systems that begin with entangled light could offer much more efficient means of securing optical communications. And now, in a paper appearing in Physical Review Letters, they demonstrate that entanglement can also improve the performance of optical sensors, even when it doesn’t survive light’s interaction with the environment.

In the researchers’ new system, a returning beam of light is mixed with a locally stored beam, and the correlation of their phase, or period of oscillation, helps remove noise caused by interactions with the environment.
Illustration Credit: Jose-Luis Olivares/MIT

“That is something that has been missing in the understanding that a lot of people have in this field,” says senior research scientist Franco Wong, one of the paper’s co-authors and, together with Jeffrey Shapiro, the Julius A. Stratton Professor of Electrical Engineering, co-director of the Optical and Quantum Communications Group. “They feel that if unavoidable loss and noise make the light being measured look completely classical, then there’s no benefit to starting out with something quantum. Because how can it help? And what this experiment shows is that yes, it can still help.”

Phased in

Entanglement means that the physical state of one particle constrains the possible states of another. Electrons, for instance, have a property called spin, which describes their magnetic orientation. If two electrons are orbiting an atom’s nucleus at the same distance, they must have opposite spins. This spin entanglement can persist even if the electrons leave the atom’s orbit, but interactions with the environment break it down quickly.

In the MIT researchers’ system, two beams of light are entangled, and one of them is stored locally — racing through an optical fiber — while the other is projected into the environment. When light from the projected beam — the “probe” — is reflected back, it carries information about the objects it has encountered. But this light is also corrupted by the environmental influences that engineers call “noise.” Recombining it with the locally stored beam helps suppress the noise, recovering the information.

The local beam is useful for noise suppression because its phase is correlated with that of the probe. If you think of light as a wave, with regular crests and troughs, two beams are in phase if their crests and troughs coincide. If the crests of one are aligned with the troughs of the other, their phases are anti-correlated.
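
The classical core of that idea (keep a phase-correlated copy of what you sent, then correlate it against what comes back) can be illustrated with a few lines of numerics. This toy is entirely classical and says nothing about the entanglement-specific advantage the experiment demonstrates; the loss and noise values are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(0)
n = 4096
t = np.arange(n)
phase = rng.uniform(0.0, 2.0 * np.pi)       # phase shared by the probe and the stored copy

reference = np.cos(0.2 * t + phase)         # locally stored beam
kappa = 0.01                                 # assumed round-trip loss to the target and back
noise = rng.normal(0.0, 1.0, n)              # bright environmental noise on the return
returned = np.sqrt(kappa) * np.cos(0.2 * t + phase) + noise

# Receiver A: phase-sensitive cross-correlation against the stored reference.
corr_stat = np.dot(returned, reference) / n  # expected value ~ sqrt(kappa)/2 = 0.05

# Receiver B: measure only the returned power and subtract the (known) noise power.
energy_stat = np.mean(returned ** 2) - 1.0   # expected value ~ kappa/2 = 0.005

print(f"correlation statistic:   {corr_stat:+.4f}")
print(f"excess-energy statistic: {energy_stat:+.4f}")

Run repeatedly, the correlation statistic stands several standard deviations above its no-target fluctuations while the excess-energy statistic is buried in them: the stored, phase-correlated reference is what rescues the weak echo. What the MIT scheme adds, as described below, is that entangled beams start from correlations stronger than any classical reference allows, and that this head start survives even after the entanglement itself is destroyed.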

But light can also be thought of as consisting of particles, or photons. And at the particle level, phase is a murkier concept.

“Classically, you can prepare beams that are completely opposite in phase, but this is only a valid concept on average,” says Zheshen Zhang, a postdoc in the Optical and Quantum Communications Group and first author on the new paper. “On average, they’re opposite in phase, but quantum mechanics does not allow you to precisely measure the phase of each individual photon.”

Improving the odds

Instead, quantum mechanics interprets phase statistically. Given particular measurements of two photons, from two separate beams of light, there’s some probability that the phases of the beams are correlated. The more photons you measure, the greater your certainty that the beams are either correlated or not. With entangled beams, that certainty increases much more rapidly than it does with classical beams.

When a probe beam interacts with the environment, the noise it accumulates also increases the uncertainty of the ensuing phase measurements. But that’s as true of classical beams as it is of entangled beams. Because entangled beams start out with stronger correlations, even when noise causes them to fall back within classical limits, they still fare better than classical beams do under the same circumstances.

“Going out to the target and reflecting and then coming back from the target attenuates the correlation between the probe and the reference beam by the same factor, regardless of whether you started out at the quantum limit or started out at the classical limit,” Shapiro says. “If you started with the quantum case that’s so many times bigger than the classical case, that relative advantage stays the same, even as both beams become classical due to the loss and the noise.”

In experiments that compared optical systems that used entangled light and classical light, the researchers found that the entangled-light systems increased the signal-to-noise ratio — a measure of how much information can be recaptured from the reflected probe — by 20 percent. That accorded very well with their theoretical predictions.

But the theory also predicts that improvements in the quality of the optical equipment used in the experiment could double or perhaps even quadruple the signal-to-noise ratio. Since detection error declines exponentially with the signal-to-noise ratio, that could translate to a million-fold increase in sensitivity.

Source: MIT News Office

Yale physicists find a new form of quantum friction


Physicists at Yale University have observed a new form of quantum friction that could serve as a basis for robust information storage in quantum computers in the future. The researchers are building upon decades of research, experimentally demonstrating a procedure theorized nearly 30 years ago.

The results appear in the journal Science and are based on work in the lab of Michel Devoret, the F.W. Beinecke Professor of Applied Physics.

Quantum computers, a technology still in development, would rely on the laws of quantum mechanics to solve certain problems exponentially faster than classical computers. They would store information in quantum systems, such as the spin of an electron or the energy levels of an artificial atom. Called “qubits,” these storage units are the quantum equivalent of classical “bits.” But while bits can be in states like 0 or 1, qubits can simultaneously be in the 0 and 1 state. This property is called quantum superposition; it is a powerful resource, but also very fragile. Ensuring the integrity of quantum information is a major challenge of the field.

Illustration by Michael S. Helfenbein

Zaki Leghtas, first author on the paper and a postdoctoral researcher at Yale, offered the following metaphor to explain this new form of quantum friction:

Imagine a hill surrounded by two basins. If you put a ball at the top of the hill, it will roll down the sides and settle in one of the basins. As it rolls, it loses energy due to the friction between the ball and the ground, and it slows down. This is why it stops at the bottom of the basin. But friction also causes the ball to leave a path in its wake. By looking at either side of the hill and seeing where grass is flattened and stones are pushed aside, you can tell whether the ball rolled into the right or left basin.

This figure depicts the position of a quantum particle over a time of 19 microseconds. Dark colors indicate high probability of the particle existing at the specified position. It is a plot of the time evolution of the Wigner function W(α) of the quantum system, with black corresponding to 1.0, white to 0, and blue to –0.05.

If you replace the ball with a quantum particle, however, you run into a problem. Quantum particles can exist in many states at the same time, so in theory, the particle could occupy both basins simultaneously. But as the particle is rolling down, the friction between the particle and the hill leaves an impact on the environment, which can be measured. The same friction that stops the particle at the bottom also carves the path. This destroys the superposition and forces the particle to exist in only one basin.

Previously, researchers had been able to take advantage of this friction to trap quantum particles in particular basins. But now, Devoret’s lab demonstrates a new type of friction — one that slows the particle as it rolls, but does not carve a path that tells which side it is choosing. This allows the particle to simultaneously exist in both the left and right basins at the same time.

Each of these “basin” states is both stable and steady. While the quantum particle might move around in the basins, small perturbations won’t kick it out of the basins. Furthermore, any superpositions of these two basin states are also stable and steady. This means they could be used as a basis for storing quantum information.
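
One standard way to write down such a two-basin structure (an illustration of the mathematics, not necessarily the exact model engineered in this experiment) is a Lindblad dissipator built from the operator a² - α², which couples the oscillator to its environment only through photon pairs:

\[
\dot{\rho} = \kappa_{2}\,\mathcal{D}\!\left[a^{2} - \alpha^{2}\right]\rho,
\qquad
\mathcal{D}[c]\rho = c\rho c^{\dagger} - \tfrac{1}{2}\{c^{\dagger}c,\rho\}.
\]

Because (a² - α²)|±α⟩ = 0, the two coherent states |α⟩ and |-α⟩ (the two "basins") and every superposition c₊|α⟩ + c₋|-α⟩ are left untouched by this friction: the dissipation steers the system onto that two-dimensional set of states without ever recording which basin was chosen.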

Technically, this is called a two-dimensional quantum steady-state manifold. Devoret and Leghtas point out that the next step is expanding this two-dimensional manifold to four dimensions — adding two more basins to the landscape. This will allow scientists to redundantly encode quantum information and to do error correction within the manifold. Error correction is one of the key components that must be developed in order to make a practical quantum computer feasible.

Additional authors are Steven Touzard, Ioan Pop, Angela Kou, Brian Vlastakis, Andrei Petrenko, Katrina Sliwa, Anirudh Narla, Shyam Shankar, Michael Hatridge, Matthew Reagor, Luigi Frunzio, Robert Schoelkopf, and Mazyar Mirrahimi of Yale. Mirrahimi also has an appointment at the Institut National de Recherche en Informatique et en Automatique Paris-Rocquencourt.

(Main illustration by Michael S. Helfenbein)

Source: Yale News

NASA’s New Horizons Spacecraft Begins First Stages of Pluto Encounter

NASA’s New Horizons spacecraft recently began its long-awaited, historic encounter with Pluto. The spacecraft is entering the first of several approach phases that culminate July 14 with the first close-up flyby of the dwarf planet, 4.67 billion miles (7.5 billion kilometers) from Earth.

“NASA’s first mission to distant Pluto will also be humankind’s first close-up view of this cold, unexplored world in our solar system,” said Jim Green, director of NASA’s Planetary Science Division at the agency’s Headquarters in Washington. “The New Horizons team worked very hard to prepare for this first phase, and they did it flawlessly.”

The fastest spacecraft when it was launched, New Horizons lifted off in January 2006. It awoke from its final hibernation period last month after a voyage of more than 3 billion miles, and will soon pass close to Pluto, inside the orbits of its five known moons. In preparation for the close encounter, the mission’s science, engineering and spacecraft operations teams configured the piano-sized probe for distant observations of the Pluto system that start Sunday, Jan. 25 with a long-range photo shoot.

Timeline of the approach and departure phases — surrounding close approach on July 14, 2015 — of the New Horizons Pluto encounter.
Image Credit: NASA/JHU APL/SwRI

The images captured by New Horizons’ telescopic Long-Range Reconnaissance Imager (LORRI) will give mission scientists a continually improving look at the dynamics of Pluto’s moons. The images also will play a critical role in navigating the spacecraft as it covers the remaining 135 million miles (220 million kilometers) to Pluto.

“We’ve completed the longest journey any spacecraft has flown from Earth to reach its primary target, and we are ready to begin exploring,” said Alan Stern, New Horizons principal investigator from Southwest Research Institute in Boulder, Colorado.

LORRI will take hundreds of pictures of Pluto over the next few months to refine current estimates of the distance between the spacecraft and the dwarf planet. Though the Pluto system will resemble little more than bright dots in the camera’s view until May, mission navigators will use the data to design course-correction maneuvers to aim the spacecraft toward its target point this summer. The first such maneuver could occur as early as March.

“We need to refine our knowledge of where Pluto will be when New Horizons flies past it,” said Mark Holdridge, New Horizons encounter mission manager at Johns Hopkins University’s Applied Physics Laboratory (APL) in Laurel, Maryland. “The flyby timing also has to be exact, because the computer commands that will orient the spacecraft and point the science instruments are based on precisely knowing the time we pass Pluto – which these images will help us determine.”

The “optical navigation” campaign that begins this month marks the first time pictures from New Horizons will be used to help pinpoint Pluto’s location.

Throughout the first approach phase, which runs until spring, New Horizons will conduct a significant amount of additional science. Spacecraft instruments will gather continuous data on the interplanetary environment where the Pluto system orbits, including measurements of the high-energy particles streaming from the sun and dust-particle concentrations in the inner reaches of the Kuiper Belt. In addition to Pluto, this area, the unexplored outer region of the solar system, potentially includes thousands of similar icy, rocky small planets.

More intensive studies of Pluto begin in the spring, when the cameras and spectrometers aboard New Horizons will be able to provide image resolutions higher than the most powerful telescopes on Earth. Eventually, the spacecraft will obtain images good enough to map Pluto and its moons more accurately than achieved by previous planetary reconnaissance missions.

APL manages the New Horizons mission for NASA’s Science Mission Directorate in Washington. Alan Stern, of the Southwest Research Institute (SwRI), headquartered in San Antonio, is the principal investigator and leads the mission. SwRI leads the science team, payload operations, and encounter science planning. New Horizons is part of the New Frontiers Program managed by NASA’s Marshall Space Flight Center in Huntsville, Alabama. APL designed, built and operates the spacecraft.

For more information about the New Horizons mission, visit:

www.nasa.gov/newhorizons