
Physicists solve quantum tunneling mystery

An international team of scientists studying ultrafast physics has solved a mystery of quantum mechanics, finding that quantum tunneling is an instantaneous process.

The new theory could lead to faster and smaller electronic components, for which quantum tunneling is a significant factor. It will also lead to a better understanding of diverse areas such as electron microscopy, nuclear fusion and DNA mutations.

“Timescales this short have never been explored before. It’s an entirely new world,” said one of the international team, Professor Anatoli Kheifets, from The Australian National University (ANU).

“We have modelled the most delicate processes of nature very accurately.”

At very small scales quantum physics shows that particles such as electrons have wave-like properties – their exact position is not well defined. This means they can occasionally sneak through apparently impenetrable barriers, a phenomenon called quantum tunneling.
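
As a rough point of reference (this is the standard textbook estimate, not a result from the new study), the semiclassical WKB approximation gives the probability of an electron with energy E slipping through a barrier V(x) as:

```latex
T \;\approx\; \exp\!\left(-2\int_{x_1}^{x_2} \kappa(x)\,\mathrm{d}x\right),
\qquad
\kappa(x) \;=\; \frac{\sqrt{2m\,\bigl(V(x)-E\bigr)}}{\hbar},
```

where x₁ and x₂ are the points at which the barrier height equals the electron's energy. The exponential dependence is why tunneling currents are so sensitive to barrier width in devices such as flash memory.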

Quantum tunneling plays a role in a number of phenomena, such as nuclear fusion in the sun, scanning tunneling microscopy, and flash memory for computers. However, the leakage of particles also limits the miniaturisation of electronic components.

Professor Kheifets and Dr. Igor Ivanov, from the ANU Research School of Physics and Engineering, are members of a team which studied ultrafast experiments at the attosecond scale (10⁻¹⁸ seconds), a field that has developed in the last 15 years.

Until their work, a number of attosecond phenomena could not be adequately explained, such as the time delay when a photon ionised an atom.

“At that timescale the time an electron takes to quantum tunnel out of an atom was thought to be significant. But the mathematics says the time during tunneling is imaginary – a complex number – which we realised meant it must be an instantaneous process,” said Professor Kheifets.
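
A minimal sketch of where that imaginary number comes from, in the same semiclassical picture as above (illustrative only, not the team's actual computation): under the barrier the electron's classical momentum is imaginary, so the traversal time computed from it is purely imaginary as well.

```latex
p(x) \;=\; \sqrt{2m\,\bigl(E - V(x)\bigr)} \;=\; i\hbar\,\kappa(x)
\quad\text{for } E < V(x),
\qquad
\tau \;=\; \int_{x_1}^{x_2} \frac{m\,\mathrm{d}x}{p(x)}
\;=\; -\,\frac{i}{\hbar}\int_{x_1}^{x_2} \frac{m\,\mathrm{d}x}{\kappa(x)}.
```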

“A very interesting paradox arises, because electron velocity during tunneling may become greater than the speed of light. However, this does not contradict the special theory of relativity, as the tunneling velocity is also imaginary,” said Dr. Ivanov, who recently took up a position at the Center for Relativistic Laser Science in Korea.

The team’s calculations, which were made using the Raijin supercomputer, revealed that the delay in photoionisation originates not from quantum tunneling but from the electric field of the nucleus attracting the escaping electron.

The results give an accurate calibration for future attosecond-scale research, said Professor Kheifets.

“It’s a good reference point for future experiments, such as studying proteins unfolding, or speeding up electrons in microchips,” he said.

The research is published in Nature Physics.

Source: ANU


Quantum sensor’s advantages survive entanglement breakdown

Preserving the fragile quantum property known as entanglement isn’t necessary to reap benefits.

By Larry Hardesty 


CAMBRIDGE, Mass. – The extraordinary promise of quantum information processing — solving problems that classical computers can’t, perfectly secure communication — depends on a phenomenon called “entanglement,” in which the physical states of different quantum particles become interrelated. But entanglement is very fragile, and the difficulty of preserving it is a major obstacle to developing practical quantum information systems.

In a series of papers since 2008, members of the Optical and Quantum Communications Group at MIT’s Research Laboratory of Electronics have argued that optical systems that use entangled light can outperform classical optical systems — even when the entanglement breaks down.

Two years ago, they showed that systems that begin with entangled light could offer much more efficient means of securing optical communications. And now, in a paper appearing in Physical Review Letters, they demonstrate that entanglement can also improve the performance of optical sensors, even when it doesn’t survive light’s interaction with the environment.

In the researchers’ new system, a returning beam of light is mixed with a locally stored beam, and the correlation of their phase, or period of oscillation, helps remove noise caused by interactions with the environment.
Illustration: Jose-Luis Olivares/MIT

“That is something that has been missing in the understanding that a lot of people have in this field,” says senior research scientist Franco Wong, one of the paper’s co-authors and, together with Jeffrey Shapiro, the Julius A. Stratton Professor of Electrical Engineering, co-director of the Optical and Quantum Communications Group. “They feel that if unavoidable loss and noise make the light being measured look completely classical, then there’s no benefit to starting out with something quantum. Because how can it help? And what this experiment shows is that yes, it can still help.”

Phased in

Entanglement means that the physical state of one particle constrains the possible states of another. Electrons, for instance, have a property called spin, which describes their magnetic orientation. If two electrons are orbiting an atom’s nucleus at the same distance, they must have opposite spins. This spin entanglement can persist even if the electrons leave the atom’s orbit, but interactions with the environment break it down quickly.
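
In the standard notation of quantum mechanics (a generic textbook example, not specific to this experiment), such a pair of spins is described by the entangled singlet state, in which neither electron has a definite spin on its own but the two measurement outcomes are always opposite:

```latex
|\psi\rangle \;=\; \frac{1}{\sqrt{2}}\Bigl(\,|\!\uparrow\downarrow\rangle \;-\; |\!\downarrow\uparrow\rangle\Bigr).
```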

In the MIT researchers’ system, two beams of light are entangled, and one of them is stored locally — racing through an optical fiber — while the other is projected into the environment. When light from the projected beam — the “probe” — is reflected back, it carries information about the objects it has encountered. But this light is also corrupted by the environmental influences that engineers call “noise.” Recombining it with the locally stored beam helps suppress the noise, recovering the information.

The local beam is useful for noise suppression because its phase is correlated with that of the probe. If you think of light as a wave, with regular crests and troughs, two beams are in phase if their crests and troughs coincide. If the crests of one are aligned with the troughs of the other, their phases are anti-correlated.

But light can also be thought of as consisting of particles, or photons. And at the particle level, phase is a murkier concept.

“Classically, you can prepare beams that are completely opposite in phase, but this is only a valid concept on average,” says Zheshen Zhang, a postdoc in the Optical and Quantum Communications Group and first author on the new paper. “On average, they’re opposite in phase, but quantum mechanics does not allow you to precisely measure the phase of each individual photon.”

Improving the odds

Instead, quantum mechanics interprets phase statistically. Given particular measurements of two photons, from two separate beams of light, there’s some probability that the phases of the beams are correlated. The more photons you measure, the greater your certainty that the beams are either correlated or not. With entangled beams, that certainty increases much more rapidly than it does with classical beams.

When a probe beam interacts with the environment, the noise it accumulates also increases the uncertainty of the ensuing phase measurements. But that’s as true of classical beams as it is of entangled beams. Because entangled beams start out with stronger correlations, even when noise causes them to fall back within classical limits, they still fare better than classical beams do under the same circumstances.

“Going out to the target and reflecting and then coming back from the target attenuates the correlation between the probe and the reference beam by the same factor, regardless of whether you started out at the quantum limit or started out at the classical limit,” Shapiro says. “If you started with the quantum case that’s so many times bigger than the classical case, that relative advantage stays the same, even as both beams become classical due to the loss and the noise.”
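
A toy numerical illustration of Shapiro's point, assuming only that the round trip scales the quantum and the classical phase correlations by the same loss factor (every number below is invented for illustration):

```python
# Toy model: loss/noise multiplies both correlations by the same factor,
# so the quantum-over-classical advantage is unchanged.
quantum_correlation = 0.95    # hypothetical initial correlation with entangled light
classical_correlation = 0.10  # hypothetical classical-limit correlation
attenuation = 0.01            # hypothetical round-trip loss factor

quantum_after = attenuation * quantum_correlation
classical_after = attenuation * classical_correlation

print(f"advantage before loss: {quantum_correlation / classical_correlation:.1f}x")
print(f"advantage after loss:  {quantum_after / classical_after:.1f}x")  # same ratio
```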

In experiments that compared optical systems that used entangled light and classical light, the researchers found that the entangled-light systems increased the signal-to-noise ratio — a measure of how much information can be recaptured from the reflected probe — by 20 percent. That accorded very well with their theoretical predictions.

But the theory also predicts that improvements in the quality of the optical equipment used in the experiment could double or perhaps even quadruple the signal-to-noise ratio. Since detection error declines exponentially with the signal-to-noise ratio, that could translate to a million-fold increase in sensitivity.
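
A back-of-the-envelope sketch of why that compounding happens, under the illustrative assumption that detection error falls off roughly as exp(−SNR); the baseline exponent below is a made-up number, not a figure from the paper:

```python
import math

# Illustrative only: assume error probability ~ exp(-SNR) and pick a baseline exponent.
baseline_snr = 4.6                       # hypothetical effective exponent for the current setup
for gain in (1.2, 2.0, 4.0):             # the measured 20% gain, and projected 2x and 4x gains
    improvement = math.exp(baseline_snr * (gain - 1.0))
    print(f"SNR x{gain:.1f} -> error probability reduced about {improvement:,.0f}-fold")
```

Under these assumed numbers, a fourfold SNR gain cuts the error probability by roughly a factor of a million, which is the sense in which a modest improvement in the optics could translate into a dramatic gain in sensitivity.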

Source: MIT News Office


Yale physicists find a new form of quantum friction


Physicists at Yale University have observed a new form of quantum friction that could serve as a basis for robust information storage in quantum computers in the future. The researchers are building upon decades of research, experimentally demonstrating a procedure theorized nearly 30 years ago.

The results appear in the journal Science and are based on work in the lab of Michel Devoret, the F.W. Beinecke Professor of Applied Physics.

Quantum computers, a technology still in development, would rely on the laws of quantum mechanics to solve certain problems exponentially faster than classical computers. They would store information in quantum systems, such as the spin of an electron or the energy levels of an artificial atom. Called “qubits,” these storage units are the quantum equivalent of classical “bits.” But while bits can be in states like 0 or 1, qubits can simultaneously be in the 0 and 1 state. This property is called quantum superposition; it is a powerful resource, but also very fragile. Ensuring the integrity of quantum information is a major challenge of the field.
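
A minimal state-vector sketch of that distinction (generic Python/NumPy, not the Yale group's code or hardware):

```python
import numpy as np

# A classical bit is either 0 or 1.  A qubit is a normalized complex
# vector over the basis {|0>, |1>}; measuring it gives 0 or 1 with
# probabilities equal to the squared amplitudes.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

psi = (ket0 + ket1) / np.sqrt(2)       # equal superposition of 0 and 1
probabilities = np.abs(psi) ** 2
print("P(0), P(1) =", probabilities)   # [0.5 0.5]
```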

Illustration by Michael S. Helfenbein

Zaki Leghtas, first author on the paper and a postdoctoral researcher at Yale, offered the following metaphor to explain this new form of quantum friction:

Imagine a hill surrounded by two basins. If you put a ball at the top of the hill, it will roll down the sides and settle in one of the basins. As it rolls, it loses energy due to the friction between the ball and the ground, and it slows down. This is why it stops at the bottom of the basin. But friction also causes the ball to leave a path in its wake. By looking at either side of the hill and seeing where grass is flattened and stones are pushed aside, you can tell whether the ball rolled into the right or left basin.

This figure depicts the position of a quantum particle over a time of 19 microseconds. Dark colors indicate high probability of the particle existing at the specified position. It is a plot of the time-evolution of the Wigner function W(α) of the quantum system, with black corresponding to 1.0, white to 0, and blue to –0.05.
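
For reference, one standard way to define the Wigner function of a field mode is via displaced parity measurements (the paper's own conventions may differ):

```latex
W(\alpha) \;=\; \frac{2}{\pi}\,
\mathrm{Tr}\!\left[\hat{\rho}\,\hat{D}(\alpha)\,\hat{\Pi}\,\hat{D}^{\dagger}(\alpha)\right],
\qquad
\hat{\Pi} = e^{\,i\pi\hat{a}^{\dagger}\hat{a}},
```

where D̂(α) is the displacement operator and ρ̂ the state; negative values, such as the –0.05 regions in the figure, have no classical counterpart.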

If you replace the ball with a quantum particle, however, you run into a problem. Quantum particles can exist in many states at the same time, so in theory, the particle could occupy both basins simultaneously. But as the particle is rolling down, the friction between the particle and the hill leaves an impact on the environment, which can be measured. The same friction that stops the particle at the bottom also carves the path. This destroys the superposition and forces the particle to exist in only one basin.

Previously, researchers had been able to take advantage of this friction to trap quantum particles in particular basins. But now, Devoret’s lab demonstrates a new type of friction — one that slows the particle as it rolls, but does not carve a path that tells which side it is choosing. This allows the particle to simultaneously exist in both the left and right basins at the same time.

Each of these “basin” states is both stable and steady. While the quantum particle might move around in the basins, small perturbations won’t kick it out of the basins. Furthermore, any superpositions of these two basin states are also stable and steady. This means they could be used as a basis for storing quantum information.

Technically, this is called a two-dimensional quantum steady-state manifold. Devoret and Leghtas point out that the next step is expanding this two-dimensional manifold to four dimensions — adding two more basins to the landscape. This will allow scientists to redundantly encode quantum information and to do error correction within the manifold. Error correction is one of the key components that must be developed in order to make a practical quantum computer feasible.

Additional authors are Steven Touzard, Ioan Pop, Angela Kou, Brian Vlastakis, Andrei Petrenko, Katrina Sliwa, Anirudh Narla, Shyam Shankar, Michael Hatridge, Matthew Reagor, Luigi Frunzio, Robert Schoelkopf, and Mazyar Mirrahimi of Yale. Mirrahimi also has an appointment at the Institut National de Recherche en Informatique et en Automatique Paris-Rocquencourt.

(Main illustration by Michael S. Helfenbein)

Source: Yale News

Quantum computer as detector shows space is not squeezed

Robert Sanders

Ever since Einstein proposed his special theory of relativity in 1905, physics and cosmology have been based on the assumption that space looks the same in all directions – that it’s not squeezed in one direction relative to another.

A new experiment by UC Berkeley physicists used partially entangled atoms — identical to the qubits in a quantum computer — to demonstrate more precisely than ever before that this is true, to one part in a billion billion.

The classic experiment that inspired Albert Einstein was performed in Cleveland by Albert Michelson and Edward Morley in 1887 and disproved the existence of an “ether” permeating space through which light was thought to move like a wave through water. What it also proved, said Hartmut Häffner, a UC Berkeley assistant professor of physics, is that space is isotropic and that light travels at the same speed up, down and sideways.

“Michelson and Morley proved that space is not squeezed,” Häffner said. “This isotropy is fundamental to all physics, including the Standard Model of physics. If you take away isotropy, the whole Standard Model will collapse. That is why people are interested in testing this.”

The Standard Model of particle physics describes how all fundamental particles interact, and requires that all particles and fields be invariant under Lorentz transformations, and in particular that they behave the same no matter what direction they move.

Häffner and his team conducted an experiment analogous to the Michelson-Morley experiment, but with electrons instead of photons of light. In a vacuum chamber he and his colleagues isolated two calcium ions, partially entangled them as in a quantum computer, and then monitored the electron energies in the ions as Earth rotated over 24 hours.

As the Earth rotates every 24 hours, the orientation of the ions in the quantum computer/detector changes with respect to the Sun’s rest frame. If space were squeezed in one direction and not another, the energies of the electrons in the ions would have shifted with a 12-hour period. (Hartmut Haeffner image)

If space were squeezed in one or more directions, the energy of the electrons would change with a 12-hour period. It didn’t, showing that space is in fact isotropic to one part in a billion billion (10¹⁸), 100 times better than previous experiments involving electrons, and five times better than experiments like Michelson and Morley’s that used light.
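
A toy sketch of the kind of analysis this implies — fitting the amplitude of a 12-hour modulation to a day-long record of the energy difference — with every number invented for illustration (this is not the group's analysis code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 24-hour record of the energy difference between the two
# electronic states, sampled every 10 minutes, with white measurement noise
# and no injected signal (i.e. isotropic space).
t_hours = np.arange(0.0, 24.0, 1.0 / 6.0)
data = rng.normal(0.0, 1.0, size=t_hours.size)   # arbitrary units

# Least-squares fit of A*cos + B*sin with a 12-hour period; the fitted
# amplitude (or an upper bound on it) quantifies any anisotropy.
omega = 2.0 * np.pi / 12.0
design = np.column_stack([np.cos(omega * t_hours), np.sin(omega * t_hours)])
coeffs, *_ = np.linalg.lstsq(design, data, rcond=None)
amplitude = np.hypot(coeffs[0], coeffs[1])
print(f"fitted 12-hour modulation amplitude: {amplitude:.3f} (arbitrary units)")
```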

The results disprove at least one theory that extends the Standard Model by assuming some anisotropy of space, he said.

Häffner and his colleagues, including former graduate student Thaned Pruttivarasin, now at the Quantum Metrology Laboratory in Saitama, Japan, will report their findings in the Jan. 29 issue of the journal Nature.

Entangled qubits

Häffner came up with the idea of using entangled ions to test the isotropy of space while building quantum computers, which involve using ionized atoms as quantum bits, or qubits, entangling their electron wave functions, and forcing them to evolve to do calculations not possible with today’s digital computers. It occurred to him that two entangled qubits could serve as sensitive detectors of slight disturbances in space.

“I wanted to do the experiment because I thought it was elegant and that it would be a cool thing to apply our quantum computers to a completely different field of physics,” he said. “But I didn’t think we would be competitive with experiments being performed by people working in this field. That was completely out of the blue.”

He hopes to make more sensitive quantum computer detectors using other ions, such as ytterbium, to gain another 10,000-fold increase in the precision measurement of Lorentz symmetry. He is also exploring with colleagues future experiments to detect the spatial distortions caused by the effects of dark matter particles, which are a complete mystery despite comprising 27 percent of the mass of the universe.

“For the first time we have used tools from quantum information to perform a test of fundamental symmetries, that is, we engineered a quantum state which is immune to the prevalent noise but sensitive to the Lorentz-violating effects,” Häffner said. “We were surprised the experiment just worked, and now we have a fantastic new method at hand which can be used to make very precise measurements of perturbations of space.”

Other co-authors are UC Berkeley graduate student Michael Ramm, former UC Berkeley postdoc Michael Hohensee of Lawrence Livermore National Laboratory, and colleagues from the University of Delaware and Maryland and institutions in Russia. The work was supported by the National Science Foundation.

Source: UC Berkeley

Characteristics of a universal simulator: Study narrows the scope of research on quantum computing

Despite a great deal of work by research groups around the world, the field of quantum computing is still in its early stages, and much ground remains to be covered before a working quantum computer can perform the tasks expected of it. Recent research by a SISSA-led team aims to give future work in the area some direction, based on the current state of the field.


“A quantum computer may be thought of as a ‘simulator of overall Nature’,” explains Fabio Franchini, a researcher at the International School for Advanced Studies (SISSA) of Trieste, “in other words, it’s a machine capable of simulating Nature as a quantum system, something that classical computers cannot do”. Quantum computers are machines that carry out operations by exploiting the phenomena of quantum mechanics, and they are capable of performing different functions from those of current computers. This science is still very young and the systems produced to date are still very limited. Franchini is the first author of a study just published in Physical Review X which establishes a basic characteristic that this type of machine should possess and, in doing so, guides the direction of future research in this field.

The study used analytical and numerical methods. “What we found,” explains Franchini, “is that a system that does not exhibit ‘Majorana fermions’ cannot be a universal quantum simulator”. Majorana fermions were hypothesized by Ettore Majorana in a paper published in 1937, and they display a peculiar characteristic: a Majorana fermion is also its own antiparticle. “That means that if two Majorana fermions meet, they annihilate each other,” continues Franchini. “In recent years it has been suggested that these fermions could be found in states of matter useful for quantum computing, and our study confirms that they must be present, with a certain probability related to entanglement, in the material used to build the machine”.
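
In standard second-quantized notation (a textbook statement, not a result of the study), the defining property is that a Majorana operator is its own Hermitian conjugate, and one ordinary fermion mode can always be split into two Majorana modes:

```latex
\gamma \;=\; \gamma^{\dagger},
\qquad
c \;=\; \frac{\gamma_1 + i\gamma_2}{2},
\quad
c^{\dagger} \;=\; \frac{\gamma_1 - i\gamma_2}{2}.
```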

Entanglement, or “action at a distance”, is a property of quantum systems whereby an action done on one part of the system has an effect on another part of the same system, even if the latter has been split into two parts that are located very far apart. “Entanglement is a fundamental phenomenon for quantum computers,” explains Franchini.

“Our study helps to understand what types of devices research should be focusing on to construct this universal simulator. Until now, given the lack of criteria, research has proceeded somewhat randomly, with a huge consumption of time and resources”.

The study was conducted with the participation of many other international research institutes in addition to SISSA, including the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts, the University of Oxford and many others.

In more detail…

“Having a quantum computer would open up new worlds. For example, if we had one today we would be able to break into any bank account,” jokes Franchini. “But don’t worry, we’re nowhere near that goal”.

At the present time, several prototype quantum machines exist that rely on the properties of specific materials. Depending on the technology used, these computers range in size from a small box to a whole room, but so far they can only process a limited number of bits of information, far fewer than classical computers handle.

However, it’s not correct to say that quantum computers are, or will be, more powerful than traditional ones, points out Franchini. “There are several things that these devices are worse at. But, by exploiting quantum mechanics, they can perform operations that would be impossible for classical computers”.

Source: International School of Advanced Studies (SISSA)