Tag Archives: quantum

Illustration of superconducting detectors on arrayed waveguides on a photonic integrated circuit for detection of single photons.

Credit: F. Najafi/ MIT

Toward quantum chips

Packing single-photon detectors on an optical chip is a crucial step toward quantum-computational circuits.

By Larry Hardesty


CAMBRIDGE, Mass. – A team of researchers has built an array of light detectors sensitive enough to register the arrival of individual light particles, or photons, and mounted them on a silicon optical chip. Such arrays are crucial components of devices that use photons to perform quantum computations.

Single-photon detectors are notoriously temperamental: Of 100 deposited on a chip using standard manufacturing techniques, only a handful will generally work. In a paper appearing today in Nature Communications, the researchers at MIT and elsewhere describe a procedure for fabricating and testing the detectors separately and then transferring those that work to an optical chip built using standard manufacturing processes.

In addition to yielding much denser and larger arrays, the approach also increases the detectors’ sensitivity. In experiments, the researchers found that their detectors were up to 100 times more likely to accurately register the arrival of a single photon than those found in earlier arrays.

“You make both parts — the detectors and the photonic chip — through their best fabrication process, which is dedicated, and then bring them together,” explains Faraz Najafi, a graduate student in electrical engineering and computer science at MIT and first author on the new paper.

Thinking small

According to quantum mechanics, tiny physical particles are, counterintuitively, able to inhabit mutually exclusive states at the same time. A computational element made from such a particle — known as a quantum bit, or qubit — could thus represent zero and one simultaneously. If multiple qubits are “entangled,” meaning that their quantum states depend on each other, then a single quantum computation is, in some sense, like performing many computations in parallel.
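
In standard textbook notation (general quantum-information background, not specific to the MIT device), a qubit state is a weighted combination of the two basis states, and an entangled pair is a joint state that cannot be factored into two independent single-qubit states:

```latex
% A single qubit in superposition of 0 and 1, with |alpha|^2 + |beta|^2 = 1:
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle
% A maximally entangled two-qubit (Bell) state: measuring one qubit
% immediately fixes the outcome for the other.
\lvert \Phi^{+} \rangle = \tfrac{1}{\sqrt{2}} \left( \lvert 00 \rangle + \lvert 11 \rangle \right)
```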

With most particles, entanglement is difficult to maintain, but it’s relatively easy with photons. For that reason, optical systems are a promising approach to quantum computation. But any quantum computer — say, one whose qubits are laser-trapped ions or nitrogen atoms embedded in diamond — would still benefit from using entangled photons to move quantum information around.

“Because ultimately one will want to make such optical processors with maybe tens or hundreds of photonic qubits, it becomes unwieldy to do this using traditional optical components,” says Dirk Englund, the Jamieson Career Development Assistant Professor in Electrical Engineering and Computer Science at MIT and corresponding author on the new paper. “It’s not only unwieldy but probably impossible, because if you tried to build it on a large optical table, simply the random motion of the table would cause noise on these optical states. So there’s been an effort to miniaturize these optical circuits onto photonic integrated circuits.”

The project was a collaboration between Englund’s group and the Quantum Nanostructures and Nanofabrication Group, which is led by Karl Berggren, an associate professor of electrical engineering and computer science, and of which Najafi is a member. The MIT researchers were also joined by colleagues at IBM and NASA’s Jet Propulsion Laboratory.

Relocation

The researchers’ process begins with a silicon optical chip made using conventional manufacturing techniques. On a separate silicon chip, they grow a thin, flexible film of silicon nitride, upon which they deposit the superconductor niobium nitride in a pattern useful for photon detection. At both ends of the resulting detector, they deposit gold electrodes.

Then, to one end of the silicon nitride film, they attach a small droplet of polydimethylsiloxane, a type of silicone. They then press a tungsten probe, typically used to measure voltages in experimental chips, against the silicone.

“It’s almost like Silly Putty,” Englund says. “You put it down, it spreads out and makes high surface-contact area, and when you pick it up quickly, it will maintain that large surface area. And then it relaxes back so that it comes back to one point. It’s like if you try to pick up a coin with your finger. You press on it and pick it up quickly, and shortly after, it will fall off.”

With the tungsten probe, the researchers peel the film off its substrate and attach it to the optical chip.

In previous arrays, the detectors registered only 0.2 percent of the single photons directed at them. Even on-chip detectors deposited individually have historically topped out at about 2 percent. But the detectors on the researchers’ new chip got as high as 20 percent. That’s still a long way from the 90 percent or more required for a practical quantum circuit, but it’s a big step in the right direction.

Source: MIT News Office

In a pioneering study, Professor Menon and his team were able to discover half-light, half-matter particles in atomically thin semiconductors (thickness ~ a millionth of a single sheet of paper) consisting of a two-dimensional (2D) layer of molybdenum and sulfur atoms arranged similarly to graphene. They sandwiched this 2D material in a light-trapping structure to realize these composite quantum particles.

Credit: CCNY

Study Unveils New Half-Light Half-Matter Quantum Particles

Prospects of developing computing and communication technologies based on quantum properties of light and matter may have taken a major step forward thanks to research by City College of New York physicists led by Dr. Vinod Menon.

In a pioneering study, Professor Menon and his team were able to discover half-light, half-matter particles in atomically thin semiconductors (thickness ~ a millionth of a single sheet of paper) consisting of a two-dimensional (2D) layer of molybdenum and sulfur atoms arranged similarly to graphene. They sandwiched this 2D material in a light-trapping structure to realize these composite quantum particles.

“Besides being a fundamental breakthrough, this opens up the possibility of making devices which take the benefits of both light and matter,” said Professor Menon.  

For example, one can start envisioning logic gates and signal processors that take the best of both light and matter. The discovery is also expected to contribute to developing practical platforms for quantum computing.

Dr. Dirk Englund, a professor at MIT whose research focuses on quantum technologies based on semiconductor and optical systems, hailed the City College study.

“What is so remarkable and exciting in the work by Vinod and his team is how readily this strong coupling regime could actually be achieved. They have shown convincingly that by coupling a rather standard dielectric cavity to exciton–polaritons in a monolayer of molybdenum disulphide, they could actually reach this strong coupling regime with a very large binding strength,” he said. 

Professor Menon’s research team included City College PhD students Xiaoze Liu, Tal Galfsky and Zheng Sun, and scientists from Yale University, National Tsing Hua University (Taiwan) and École Polytechnique de Montréal (Canada).

The study appears in the January issue of the journal “Nature Photonics.” It was funded by the U.S. Army Research Laboratory’s Army Research Office and the National Science Foundation through the Materials Research Science and Engineering Center – Center for Photonic and Multiscale Nanomaterials. 

Source: The City College of New York

Quantum physics breakthrough: Scientists solve 100-year-old puzzle

Two fundamental concepts of the quantum world are actually just different manifestations of the same thing, says Waterloo researcher.

By Jenny Hogan

Centre for Quantum Technologies


A Waterloo researcher is part of an international team that has proven that two peculiar features of the quantum world – long thought to be distinct – are actually different manifestations of the same thing.

The breakthrough findings are published today in Nature Communications. The two distinct ideas in question have been fundamental concepts in quantum physics since the early 1900s. They are what is known as the wave-particle duality and the uncertainty principle.

“We were guided by a gut feeling, and only a gut feeling, that there should be a connection,” says Patrick Coles, now a postdoctoral fellow at the Institute for Quantum Computing and the Department of Physics and Astronomy at the University of Waterloo.

  • Wave-particle duality is the idea that a quantum particle can behave like a wave, but that the wave behavior disappears if you try to locate the object.
  • The uncertainty principle is the idea that it’s impossible to know certain pairs of things about a quantum particle at once. For example, the more precisely you know the position of an atom, the less precisely you can know the speed with which it’s moving (the textbook form of this relation appears just below).
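
In its standard position-momentum form, the uncertainty principle reads:

```latex
% Heisenberg uncertainty relation: the product of the uncertainties in
% position (x) and momentum (p) can never fall below hbar/2.
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```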

Coles was part of the research team at the National University of Singapore that made the discovery that wave-particle duality is simply the quantum uncertainty principle in disguise.

Like discovering the Rosetta Stone of quantum physics

“It was like we had discovered the ‘Rosetta Stone’ that connected two different languages,” says Coles. “The literature on wave-particle duality was like hieroglyphics that we could translate into our native tongue. We had several eureka moments when we finally understood what people had done.”

The research team at Singapore’s Centre for Quantum Technologies included Jedrzej Kaniewski and Stephanie Wehner, now both researchers at the Netherlands’ Delft University of Technology.

“The connection between uncertainty and wave-particle duality comes out very naturally when you consider them as questions about what information you can gain about a system. Our result highlights the power of thinking about physics from the perspective of information,” says Wehner.

The wave-particle duality is perhaps most simply seen in a double slit experiment, where single particles, electrons, say, are fired one by one at a screen containing two narrow slits. The particles pile up behind the slits not in two heaps as classical objects would, but in a stripy pattern like you’d expect for waves interfering. At least this is what happens until you sneak a look at which slit a particle goes through – do that and the interference pattern vanishes.

The discovery deepens our understanding of quantum physics and could prompt ideas for new applications of wave-particle duality.

New protocols for quantum cryptography possible

Coles, Kaniewski and Wehner are experts in a form of mathematical equations known as ‘entropic uncertainty relations.’ They discovered that all the maths previously used to describe wave-particle duality could be reformulated in terms of these relations.
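
For context, a widely used example of such a relation (the Maassen-Uffink inequality, given here as standard background rather than as the specific form used in the paper) bounds the combined entropy of two measurements by how different their measurement bases are:

```latex
% H is the Shannon entropy of the measurement outcomes for observables
% X and Z; c is the largest overlap between any eigenstate of X and any
% eigenstate of Z. The less the bases overlap, the higher the floor on
% the total uncertainty.
H(X) + H(Z) \;\geq\; \log_2 \frac{1}{c}, \qquad c = \max_{x,z} \, \lvert \langle x \vert z \rangle \rvert^{2}
```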

Because the entropic uncertainty relations used in their translation have also been used in proving the security of quantum cryptography – schemes for secure communication using quantum particles – the researchers suggest the work could help inspire new cryptography protocols.

How is nature itself constructed?

In earlier papers, the researchers found connections between the uncertainty principle and other physics, namely quantum ‘non-locality’ and the second law of thermodynamics. The tantalizing next goal for the researchers is to think about how these pieces fit together and what bigger picture that paints of how nature is constructed.

Source: University of Waterloo

Characteristics of a universal simulator | Study narrows the scope of research on quantum computing

Despite a lot of work being done by many research groups around the world, the field of quantum computing is still in its early stages. We still need to cover a lot of ground to achieve the goal of a working quantum computer capable of the tasks expected or predicted of it. Recent research by a SISSA-led team tries to give future research in quantum computing some direction, based on the current state of the field.


“A quantum computer may be thought of as a ‘simulator of overall Nature,’” explains Fabio Franchini, a researcher at the International School for Advanced Studies (SISSA) of Trieste; “in other words, it’s a machine capable of simulating Nature as a quantum system, something that classical computers cannot do.” Quantum computers are machines that carry out operations by exploiting the phenomena of quantum mechanics, and they are capable of performing different functions from those of current computers. This science is still very young and the systems produced to date are still very limited. Franchini is the first author of a study just published in Physical Review X which establishes a basic characteristic that this type of machine should possess and, in doing so, guides the direction of future research in this field.

The study used analytical and numerical methods. “What we found,” explains Franchini, “is that a system that does not exhibit ‘Majorana fermions’ cannot be a universal quantum simulator”. Majorana fermions were hypothesized by Ettore Majorana in a paper published in 1937, and they display peculiar characteristics: a Majorana fermion is also its own antiparticle. “That means that if Majorana fermions meet they annihilate among themselves,” continues Franchini. “In recent years it has been suggested that these fermions could be found in states of matter useful for quantum computing, and our study confirms that they must be present, with a certain probability related to entanglement, in the material used to build the machine”.
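
In the condensed-matter setting relevant to quantum computing, this self-conjugate property is usually stated as an operator condition (standard notation, added here for context):

```latex
% A Majorana mode is described by an operator equal to its own Hermitian
% conjugate (the particle is its own antiparticle), and it squares to one.
\gamma = \gamma^{\dagger}, \qquad \gamma^{2} = 1
```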

Entanglement, or “action at a distance”, is a property of quantum systems whereby an action done on one part of the system has an effect on another part of the same system, even if the latter has been split into two parts that are located very far apart. “Entanglement is a fundamental phenomenon for quantum computers,” explains Franchini.

“Our study helps to understand what types of devices research should be focusing on to construct this universal simulator. Until now, given the lack of criteria, research has proceeded somewhat randomly, with a huge consumption of time and resources”.

The study was conducted with the participation of many other international research institutes in addition to SISSA, including the Massachusetts Institute of Technology (MIT) in Cambridge, the University of Oxford and many others.

In more detail…

“Having a quantum computer would open up new worlds. For example, if we had one today we would be able to break into any bank account,” jokes Franchini. “But don’t worry, we’re nowhere near that goal”.

At the present time, several attempts at quantum machines exist that rely on the properties of specific materials. Depending on the technology used, these computers have sizes varying from a small box to a whole room, but so far they are only able to process a limited number of information bits, an amount far smaller than that processed by classical computers.

However, it’s not correct to say that quantum computers are, or will be, more powerful than traditional ones, points out Franchini. “There are several things that these devices are worse at. But, by exploiting quantum mechanics, they can perform operations that would be impossible for classical computers”.

Source: International School for Advanced Studies (SISSA)

 

By passing it through a special crystal, a light wave’s phase (denoting position along the wave’s cycle) can be delayed. A delay of a certain amount can denote a piece of data. In this experiment light pulses can be delayed by a zero amount, or by ¼ of a cycle, or 2/4, or ¾ of a cycle.
Credit: JQI

Best Quantum Receiver

RECORD HIGH DATA ACCURACY RATES FOR PHASE-MODULATED TRANSMISSION

We want data.  Lots of it.  We want it now.  We want it to be cheap and accurate.

 Researchers try to meet the inexorable demands made on the telecommunications grid by improving various components.  In October 2014, for instance, scientists at the Eindhoven University of Technology in The Netherlands did their part by setting a new record for transmission down a single optical fiber: 255 terabits per second.

Alan Migdall and Elohim Becerra and their colleagues at the Joint Quantum Institute do their part by attending to the accuracy at the receiving end of the transmission process.  They have devised a detection scheme with an error rate 25 times lower than the fundamental limit of the best conventional detector.  They did this by forgoing passive detection of incoming light pulses; instead, the light is split up and measured numerous times.

 The new detector scheme is described in a paper published in the journal Nature Photonics.

“By greatly reducing the error rate for light signals, we can lessen the amount of power needed to send signals reliably,” says Migdall.  “This will be important for a lot of practical applications in information technology, such as using less power in sending information to remote stations.  Alternatively, for the same amount of power, the signals can be sent over longer distances.”

Phase Coding

Most information comes to us nowadays in the form of light, whether radio waves sent through the air or infrared waves sent up a fiber.  The information can be coded in several ways.  Amplitude modulation (AM) maps analog information onto a carrier wave by momentarily changing its amplitude.  Frequency modulation (FM) maps information by changing the instantaneous frequency of the wave.  On-off modulation is even simpler: quickly turn the wave off (0) and on (1) to convey a desired pattern of binary bits.

 Because the carrier wave is coherent—for laser light this means a predictable set of crests and troughs along the wave—a more sophisticated form of encoding data can be used.  In phase modulation (PM) data is encoded in the momentary change of the wave’s phase; that is, the wave can be delayed by a fraction of its cycle time to denote particular data.  How are light waves delayed?  Usually by sending the waves through special electrically controlled crystals.

Instead of using just the two states (0 and 1) of binary logic, the waves in Migdall’s experiment are modulated to provide four states (1, 2, 3, 4), which correspond respectively to the wave being un-delayed, delayed by one-fourth of a cycle, two-fourths of a cycle, and three-fourths of a cycle.  The four phase-modulated states are more usefully depicted as four positions around a circle (figure 2).  The radius of each position corresponds to the amplitude of the wave, or equivalently the number of photons in the pulse of waves at that moment.  The angle around the graph corresponds to the signal’s phase delay.
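
As a minimal sketch of this encoding (illustrative values only; nothing here is taken from the experiment), the four states can be modeled as complex amplitudes of equal magnitude spaced a quarter-cycle apart:

```python
import numpy as np

# Mean photon number of a pulse; in these units the field amplitude is
# its square root. Illustrative value, not a parameter from the paper.
n_mean = 4.0

# Delays of 0, 1/4, 2/4, and 3/4 of a cycle map to phase angles
# 0, pi/2, pi, and 3*pi/2: four evenly spaced points around a circle.
delays = np.array([0.0, 0.25, 0.50, 0.75])
states = np.sqrt(n_mean) * np.exp(2j * np.pi * delays)

# Note: np.angle reports angles in (-pi, pi], so 3*pi/2 appears as -pi/2.
for label, s in zip("1234", states):
    print(f"state {label}: amplitude {abs(s):.2f}, phase {np.angle(s) / np.pi:+.2f} pi")
```

Plotting these four complex numbers reproduces the circle picture: equal radii (equal amplitude), with angles a quarter-turn apart.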

 The imperfect reliability of any data encoding scheme reflects the fact that signals might be degraded or the detectors poor at their job.  If you send a pulse in the 3 state, for example, is it detected as a 3 state or something else?  Figure 2, besides showing the relation of the 4 possible data states, depicts uncertainty inherent in the measurement as a fuzzy cloud.  A narrow cloud suggests less uncertainty; a wide cloud more uncertainty.  False readings arise from the overlap of these uncertainty clouds.  If, say, the clouds for states 2 and 3 overlap a lot, then errors will be rife.

In general the accuracy will go up if n, the mean number of photons (comparable to the intensity of the light pulse), goes up.  This principle is illustrated by the figure to the right, where the clouds are farther apart than in the left panel.  This means there is less chance of mistaken readings.  More intense beams require more power, but this reduces the chance of overlapping clouds.

Twenty Questions

So much for the sending of information pulses.  How about detecting and accurately reading that information?  Here the JQI detection approach resembles “20 questions,” the game in which a person identifies an object or person by asking question after question, thus eliminating all things the object is not.

In the scheme developed by Becerra (who is now at the University of New Mexico), the arriving information is split by a special mirror that typically sends part of the waves in the pulse into detector 1.  There the waves are combined with a reference pulse.  If the reference pulse phase is adjusted so that the two wave trains interfere destructively (that is, they cancel each other out exactly), the detector will register nothing.  This answers the question “What state was that incoming light pulse in?”  When the detector registers nothing, the phase of the reference light provides that answer, … probably.

That last caveat is added because it could also be the case that the detector (whose efficiency is less than 100%) would not fire even with incoming light present.  Conversely, perfect destructive interference might have occurred, and yet the detector still fires, an eventuality called a “dark count.”  Still another possible glitch: because of optics imperfections, even with a correct reference-phase setting the destructive interference might be incomplete, allowing some light to hit the detector.

The way the scheme handles these real world problems is that the system tests a portion of the incoming pulse and uses the result to determine the highest probability of what the incoming state must have been. Using that new knowledge the system adjusts the phase of the reference light to make for better destructive interference and measures again. A new best guess is obtained and another measurement is made.

As the process of comparing portions of the incoming information pulse with the reference pulse is repeated, the estimate of the incoming signal’s true state gets better and better.  In other words, the probability of being wrong decreases.
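
A toy simulation makes the logic concrete. The sketch below is a simplified model under assumed parameters (ideal detectors, no dark counts, a plain Bayesian update), not the actual JQI implementation: each pulse is split into slices, a reference nulls the current best guess against each slice, and the guess is updated according to whether the detector clicks.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

N_MEAN = 4.0       # mean photon number per pulse (assumed, illustrative)
N_SLICES = 10      # how many portions each pulse is split into
PHASES = np.array([0.0, 0.5, 1.0, 1.5]) * np.pi   # the four signal phases

def click_prob(true_phase, null_phase, n_slice):
    """Chance the single-photon detector fires on one pulse slice after
    interference with a reference tuned to cancel `null_phase` exactly."""
    residual = np.sqrt(n_slice) * (np.exp(1j * true_phase) - np.exp(1j * null_phase))
    return 1.0 - np.exp(-abs(residual) ** 2)   # ideal detector, no dark counts

def receive(true_phase):
    """Adaptive 'twenty questions' readout of one incoming pulse."""
    posterior = np.full(4, 0.25)               # flat prior over the 4 states
    n_slice = N_MEAN / N_SLICES
    for _ in range(N_SLICES):
        guess = PHASES[np.argmax(posterior)]   # null the current best guess
        clicked = rng.random() < click_prob(true_phase, guess, n_slice)
        # Bayesian update: likelihood of this outcome under each hypothesis
        p = np.array([click_prob(h, guess, n_slice) for h in PHASES])
        posterior *= p if clicked else (1.0 - p)
        posterior /= posterior.sum()
    return np.argmax(posterior)

trials = 20000
sent = rng.integers(0, 4, size=trials)
errors = sum(receive(PHASES[k]) != k for k in sent)
print(f"simulated error rate: {errors / trials:.4f}")
```

Even this crude model shows the trend the experiment quantifies: rerunning it with a larger N_MEAN drives the simulated error rate down steeply.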

By encoding millions of pulses with known information values and then comparing them with the measured values, the scientists can measure the actual error rates.  Moreover, the error rates can be determined as the input laser is adjusted so that the information pulse comprises a larger or smaller number of photons.  (Because of the uncertainties intrinsic to quantum processes, one never knows precisely how many photons are present, so the researchers must settle for knowing the mean number.)

A plot of the error rates shows that for a range of photon numbers, the error rates fall below the conventional limit, agreeing with results from Migdall’s experiment from two years ago. But now the error curve falls even more below the limit and does so for a wider range of photon numbers than in the earlier experiment. The difference with the present experiment is that the detectors are now able to resolve how many photons (particles of light) are present for each detection.  This allows the error rates to improve greatly.

For example, at a photon number of 4, the expected error rate of this scheme (how often does one get a false reading) is about 5%.  By comparison, with a more intense pulse, with a mean photon number of 20, the error rate drops to less than a part in a million.

The earlier experiment achieved error rates 4 times better than the “standard quantum limit,” a level of accuracy expected using a standard passive detection scheme.  The new experiment, using the same detectors as in the original experiment but in a way that could extract some photon-number-resolved information from the measurement, reaches error rates 25 times below the standard quantum limit.

“The detectors we used were good but not all that heroic,” says Migdall.  “With more sophistication the detectors can probably arrive at even better accuracy.”

The JQI detection scheme is an example of what would be called a “quantum receiver.”  Your radio receiver at home also detects and interprets waves, but it doesn’t merit the adjective quantum.  The difference here is that single-photon detection and an adaptive measurement strategy are used.  A stable reference pulse is required.  In the current implementation that reference pulse has to accompany the signal from transmitter to detector.

Suppose you were sending a signal across the ocean in the optical fibers under the Atlantic.  Would a reference pulse have to be sent along that whole way?  “Someday atomic clocks might be good enough,” says Migdall, “that we could coordinate timing so that the clock at the far end can be read out for reference rather than transmitting a reference along with the signal.”

See more at: http://jqi.umd.edu/news/best-quantum-receiver

Source: JQI

The mass difference spectrum: the LHCb result shows strong evidence of the existence of two new particles, the Xi_b'- (first peak) and Xi_b*- (second peak), with the very high confidence level of 10 sigma. The black points are the signal sample and the hatched red histogram is a control sample. The blue curve represents a model including the two new particles, fitted to the data. Delta_m is the difference between the mass of the Xi_b0 pi- pair and the sum of the individual masses of the Xi_b0 and pi-. INSET: Detail of the Xi_b'- region plotted with a finer binning.
Credit: CERN

LHCb experiment observes two new baryon particles never seen before

Geneva, 19 November 2014. Today the collaboration for the LHCb experiment at CERN[1]’s Large Hadron Collider announced the discovery of two new particles in the baryon family. The particles, known as the Xi_b’- and Xi_b*-, were predicted to exist by the quark model but had never been seen before. A related particle, the Xi_b*0, was found by the CMS experiment at CERN in 2012. The LHCb collaboration submitted a paper reporting the finding to Physical Review Letters.

Like the well-known protons that the LHC accelerates, the new particles are baryons made from three quarks bound together by the strong force. The types of quarks are different, though: the new Xi_b particles both contain one beauty (b), one strange (s), and one down (d) quark. Thanks to the heavyweight b quarks, they are more than six times as massive as the proton. But the particles are more than just the sum of their parts: their mass also depends on how they are configured. Each of the quarks has an attribute called “spin”. In the Xi_b’- state, the spins of the two lighter quarks point in the opposite direction to the b quark, whereas in the Xi_b*- state they are aligned. This difference makes the Xi_b*- a little heavier.

“Nature was kind and gave us two particles for the price of one,” said Matthew Charles of the CNRS’s LPNHE laboratory at Paris VI University. “The Xi_b’- is very close in mass to the sum of its decay products: if it had been just a little lighter, we wouldn’t have seen it at all using the decay signature that we were looking for.”

“This is a very exciting result. Thanks to LHCb’s excellent hadron identification, which is unique among the LHC experiments, we were able to separate a very clean and strong signal from the background,” said Steven Blusk from Syracuse University in New York. “It demonstrates once again the sensitivity and how precise the LHCb detector is.”

As well as the masses of these particles, the research team studied their relative production rates, their widths – a measure of how unstable they are – and other details of their decays. The results match up with predictions based on the theory of Quantum Chromodynamics (QCD).

QCD is part of the Standard Model of particle physics, the theory that describes the fundamental particles of matter, how they interact and the forces between them. Testing QCD at high precision is key to refining our understanding of quark dynamics, models of which are tremendously difficult to calculate.

“If we want to find new physics beyond the Standard Model, we need first to have a sharp picture,” said LHCb’s physics coordinator Patrick Koppenburg from Nikhef Institute in Amsterdam. “Such high precision studies will help us to differentiate between Standard Model effects and anything new or unexpected in the future.”

The measurements were made with the data taken at the LHC during 2011-2012. The LHC is currently being prepared – after its first long shutdown – to operate at higher energies and with more intense beams. It is scheduled to restart by spring 2015.

Further information

Link to the paper on arXiv: http://arxiv.org/abs/1411.4849
More about the result on the LHCb collaboration’s website: http://lhcb-public.web.cern.ch/lhcb-public/Welcome.html#StrBeaBa
Observation of a new Xi_b*0 beauty particle, on the CMS collaboration’s website: http://cms.web.cern.ch/news/observation-new-xib0-beauty-particle

Footnote(s)

1. CERN, the European Organization for Nuclear Research, is the world’s leading laboratory for particle physics. It has its headquarters in Geneva. At present, its Member States are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Israel, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the United Kingdom. Romania is a Candidate for Accession. Serbia is an Associate Member in the pre-stage to Membership. India, Japan, the Russian Federation, the United States of America, Turkey, the European Commission and UNESCO have Observer Status.

Source: CERN

Simulated view of a black hole in front of the Large Magellanic Cloud; the ratio between the black hole’s Schwarzschild radius and the observer’s distance to it is 1:9. Note the gravitational lensing effect known as an Einstein ring, which produces two fairly bright and large but highly distorted images of the Cloud as compared with its actual angular size. Across the top, the Milky Way disk appears distorted into an arc.
Image source: Wikimedia Commons, by User:Alain r (own work), CC-BY-SA-2.5 (http://creativecommons.org/licenses/by-sa/2.5)

Hawking Radiation Reportedly Observed in a Lab Experiment

A recently published paper in Nature Physics claims to have observed Hawking radiation, as predicted by one of the most respected theoretical physicists of all time, Stephen Hawking.

The author of the paper, Dr. Jeff Steinhauer, a physicist at the Technion-Israel Institute of Technology in Haifa, mimicked a charged black hole by creating a narrow, low-density, very low temperature atomic Bose–Einstein condensate (BEC) containing an analogue black-hole horizon and an inner horizon.

This model black hole produced the kind of emissions predicted by Stephen Hawking. The results may not fully confirm the existence of Hawking radiation, but they narrow down the possibilities and give a substantial boost to the search for the phenomenon. The discovery could also play a very important role in research areas related to quantum field theory and general relativity.

It will be interesting to see how the scientific community, especially other experimental physicists and observational astronomers, receives the findings.

The full abstract and paper are available at:

http://www.nature.com/nphys/journal/vaop/ncurrent/full/nphys3104.html

Magnetic states at oxide interfaces controlled by electricity. The top image shows the magnetic state with -3 volts applied, and the bottom image shows the nonmagnetic state with 0 volts applied. Credit: University of Pittsburgh

New Discovery Could Pave the Way for Spin-based Computing

Novel oxide-based magnetism follows electrical commands

PITTSBURGH—Electricity and magnetism rule our digital world. Semiconductors process electrical information, while magnetic materials enable long-term data storage. A University of Pittsburgh research team has discovered a way to fuse these two distinct properties in a single material, paving the way for new ultrahigh density storage and computing architectures.

While phones and laptops rely on electricity to process and temporarily store information, long-term data storage is still largely achieved via magnetism. Discs coated with magnetic material are locally oriented (e.g., North or South to represent “1” and “0”), and each independent magnet can be used to store a single bit of information. However, this information is not directly coupled to the semiconductors used to process information. Having a magnetic material that can store and process information would enable new forms of hybrid storage and processing capabilities. Such a material has been created by the Pitt research team led by Jeremy Levy, a Distinguished Professor of Condensed Matter Physics in Pitt’s Kenneth P. Dietrich School of Arts and Sciences and director of the Pittsburgh Quantum Institute.

Levy, other researchers at Pitt, and colleagues at the University of Wisconsin-Madison today published their work in Nature Communications, elucidating their discovery of a form of magnetism that can be stabilized with electric fields rather than magnetic fields. The University of Wisconsin-Madison researchers were led by Chang-Beom Eom, the Theodore H. Geballe Professor and Harvey D. Spangler Distinguished Professor in the Department of Materials Science and Engineering. Working with a material formed from a thick layer of one oxide—strontium titanate—and a thin layer of a second material—lanthanum aluminate—these researchers have found that the interface between these materials can exhibit magnetic behavior that is stable at room temperature. The interface is normally conducting, but by “chasing” away the electrons with an applied voltage (equivalent to that of two AA batteries), the material becomes insulating and magnetic. The magnetic properties are detected using “magnetic force microscopy,” an imaging technique that scans a tiny magnet over the material to gauge the relative attraction or repulsion from the magnetic layer.

The newly discovered magnetic properties come on the heels of a previous invention by Levy, so-called “Etch-a-Sketch Nanoelectronics” involving the same material. The discovery of magnetic properties can now be combined with ultra-small transistors, terahertz detectors, and single-electron devices previously demonstrated.

“This work is indeed very promising and may lead to a new type of magnetic storage,” says Stuart Wolf, head of the nanoSTAR Institute at the University of Virginia. Though not an author on this paper, Wolf is widely regarded as a pioneer in the area of spintronics.

“Magnetic materials tend to respond to magnetic fields and are not so sensitive to electrical influences,” Levy says. “What we have discovered is that a new family of oxide-based materials can completely change its behavior based on electrical input.”

This discovery was supported by grants from the National Science Foundation, the Air Force Office of Scientific Research, and the Army Research Office.

Source: University of Pittsburgh News

Carolina’s Laura Mersini-Houghton shows that black holes do not exist

The term black hole is entrenched in the English language. Can we let it go?

(Chapel Hill, N.C. – Sept. 23, 2014) Black holes have long captured the public imagination and been the subject of popular culture, from Star Trek to Hollywood. They are the ultimate unknown – the blackest and most dense objects in the universe that do not even let light escape. And as if they weren’t bizarre enough to begin with, now add this to the mix: they don’t exist.

By merging two seemingly conflicting theories, Laura Mersini-Houghton, a physics professor at UNC-Chapel Hill in the College of Arts and Sciences, has proven, mathematically, that black holes can never come into being in the first place. The work not only forces scientists to reimagine the fabric of space-time, but also rethink the origins of the universe.

“I’m still not over the shock,” said Mersini-Houghton. “We’ve been studying this problem for more than 50 years and this solution gives us a lot to think about.”

For decades, black holes were thought to form when a massive star collapses under its own gravity to a single point in space – imagine the Earth being squished into a ball the size of a peanut – called a singularity. So the story went, an invisible membrane known as the event horizon surrounds the singularity and crossing this horizon means that you could never cross back. It’s the point where a black hole’s gravitational pull is so strong that nothing can escape it.
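
For scale, the event horizon of a non-rotating black hole sits at the Schwarzschild radius:

```latex
% Schwarzschild radius for a mass M (G: Newton's constant, c: speed of
% light). For the Earth's mass it is about 9 mm, roughly the peanut-sized
% ball described above.
r_{s} = \frac{2 G M}{c^{2}}
```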

The reason black holes are so bizarre is that they pit two fundamental theories of the universe against each other. Einstein’s theory of gravity predicts the formation of black holes, but a fundamental law of quantum theory states that no information from the universe can ever disappear. Efforts to combine these two theories lead to mathematical nonsense, and became known as the information loss paradox.

In 1974, Stephen Hawking used quantum mechanics to show that black holes emit radiation. Since then, scientists have detected fingerprints in the cosmos that are consistent with this radiation, identifying an ever-increasing list of the universe’s black holes.
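
Hawking’s result assigns that radiation a temperature which falls as the black hole’s mass grows:

```latex
% Hawking temperature of a black hole of mass M (hbar: reduced Planck
% constant, c: speed of light, G: Newton's constant, k_B: Boltzmann's
% constant). Heavier black holes are colder and evaporate more slowly.
T_{H} = \frac{\hbar c^{3}}{8 \pi G M k_{B}}
```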

But now Mersini-Houghton describes an entirely new scenario. She and Hawking both agree that as a star collapses under its own gravity, it produces Hawking radiation. However, in her new work, Mersini-Houghton shows that by giving off this radiation, the star also sheds mass. So much so that as it shrinks it no longer has the density to become a black hole.

Before a black hole can form, the dying star swells one last time and then explodes. A singularity never forms and neither does an event horizon. The take home message of her work is clear: there is no such thing as a black hole.

The paper, which was recently submitted to arXiv, an online repository of physics papers that is not peer-reviewed, offers exact numerical solutions to this problem and was done in collaboration with Harald Pfeiffer, an expert on numerical relativity at the University of Toronto. An earlier paper by Mersini-Houghton, originally submitted to arXiv in June, was published in the journal Physics Letters B and offers approximate solutions to the problem.

Experimental evidence may one day provide physical proof as to whether or not black holes exist in the universe. But for now, Mersini-Houghton says the mathematics are conclusive.

Many physicists and astronomers believe that our universe originated from a singularity that began expanding with the Big Bang. However, if singularities do not exist, then physicists have to rethink their ideas of the Big Bang and whether it ever happened.

“Physicists have been trying to merge these two theories – Einstein’s theory of gravity and quantum mechanics – for decades, but this scenario brings these two theories together, into harmony,” said Mersini-Houghton. “And that’s a big deal.”

-Carolina-

Mersini-Houghton’s arXiv papers:

Approximate solutions: http://arxiv.org/abs/arXiv:1406.1525

Exact solutions: http://arxiv.org/abs/arXiv:1409.1837

Source: UNC News

Light enters a two-dimensional ring-resonator array from the lower left and exits at the lower right. Light that follows the edge of the array (blue) does not suffer energy loss and exits after a consistent amount of delay. Light that travels into the interior of the array (green) suffers energy loss. 
Credit: Sean Kelley/JQI

On-chip Topological Light

FIRST MEASUREMENTS OF TRANSMISSION AND DELAY

Topological transport of light is the photonic analog of topological electron flow in certain semiconductors. In the electron case, the current flows around the edge of the material but not through the bulk. It is “topological” in that even if electrons encounter impurities in the material, the electrons will continue to flow without losing energy.

In the photonic equivalent, light flows not through and around a regular material but in a meta-material consisting of an array of tiny glass loops fabricated on a silicon substrate. If the loops are engineered just right, the topological feature appears: light sent into the array easily circulates around the edge with very little energy loss (even if some of the loops aren’t working) while light taking an interior route suffers loss.

Mohammad Hafezi and his colleagues at the Joint Quantum Institute have published a series of papers on the subject of topological light. The first pointed out the potential application of robustness in delay lines and conceived a scheme to implement quantum Hall models in arrays of photonic loops. In photonics, signals sometimes need to be delayed, usually by sending light into a kilometers-long loop of optical fiber. In an on-chip scheme, such delays could be accomplished on the microscale; this is in addition to the energy-loss reduction made possible by topological robustness.

The 2D array consists of resonator rings, where light spends more time, and link rings, where light spends little time. Undergoing a circuit around a complete unit cell of rings, light will return to the starting point with a slight change in phase, phi.
Credit: Sean Kelley/JQI

The next paper reported on results from an actual experiment. Since the tiny loops aren’t perfect, they do allow a bit of light to escape vertically out of the plane of the array. This faint light allowed the JQI experimenters to image the course of light. This confirmed the expectation that light persists when it goes around the edge of the array but suffers energy loss when traveling through the bulk.

The third paper, appearing now in Physical Review Letters, and highlighted in a Viewpoint, actually delivers detailed measurements of the transmission (how much energy is lost) and delay for edge-state light and for bulk-route light. The paper is notable enough to have received an “editor’s suggestion” designation. “Apart from the potential photonic-chip applications of this scheme,” said Hafezi, “this photonic platform could allow us to investigate fundamental quantum transport properties.”

Another measured quality is consistency. Sunil Mittal, a graduate student at the University of Maryland and first author on the paper, points out that microchip manufacturing is not a perfect process. “Irregularities in integrated photonic device fabrication usually result in device-to-device performance variations,” he said. And this usually undercuts the microchip performance. But with topological protection (photons traveling at the edge of the array are practically invulnerable to impurities) at work, consistency is greatly strengthened.

Indeed, the authors, reporting trials with numerous array samples, reveal that for light taking the bulk (interior) route in the array, the delay and transmission of light can vary a lot, whereas for light making the edge route, the amount of energy loss is regularly less and the time delay for signals more consistent. Robustness and consistency are vital if you want to integrate such arrays into photonic schemes for processing quantum information.

How does the topological property emerge at the microscopic level? First, look at the electron topological behavior, which is an offshoot of the quantum Hall effect. Electrons, under the influence of an applied magnetic field, can execute tiny cyclonic orbits. In some materials, called topological insulators, no external magnetic field is needed since the necessary field is supplied by spin-orbit interactions, that is, the coupling between the orbital motion of electrons and their spins. In these materials the conduction regime is topological: the material is conductive around the edge but is an insulator in the interior.

And now for the photonic equivalent. Light waves do not usually feel magnetic fields, and when they do the effect is very weak. In the photonic case, the equivalent of a magnetic field is supplied by a subtle phase shift imposed on the light as it circulates around the loops. The loops in the array are actually of two kinds: resonator loops, designed to exactly accommodate light at a certain frequency so that the waves circle the loop many times, and link loops, which are not exactly suited to the waves and are designed chiefly to pass the light on to the neighboring resonator loop.

Light that circulates around one unit cell of the loop array will undergo a slight phase change, an amount signified by the letter phi. That is, the light signal, in coming around the unit cell, re-arrives where it started advanced or retarded just a bit from its original condition. Just this amount of change imparts the topological robustness to the global transmission of the light in the array.
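
For readers who want the underlying model: arrays of this kind are designed to emulate a quantum Hall (Harper-Hofstadter) tight-binding Hamiltonian. A common way to write it, sketched here in one gauge as general background rather than quoted from the paper, is:

```latex
% Photons hop between resonators at lattice sites (x, y) with strength J.
% Hops along x pick up a y-dependent phase, so circling one unit cell
% accumulates the synthetic flux phi, the photonic stand-in for a
% magnetic field threading the plaquette.
H = -J \sum_{x,y} \left( \hat{a}^{\dagger}_{x+1,y} \hat{a}_{x,y} \, e^{-i \phi y}
    + \hat{a}^{\dagger}_{x,y+1} \hat{a}_{x,y} + \mathrm{h.c.} \right)
```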

In summary, on-chip light delay and robust, consistent, low-loss transport of light have now been demonstrated. The transport is tunable over a range of frequencies, and the chip can be manufactured using standard micro-fabrication techniques.

RESEARCH CONTACTS: Mohammad Hafezi, Sunil Mittal

MEDIA CONTACT: Phillip F. Schewe, (301) 405-0989

See more at: http://jqi.umd.edu/news/on-chip-topological-light

Source: Joint Quantum Institute