The remains of a fatal interaction between a dead star and its asteroid supper have been studied in detail for the first time by an international team of astronomers using the Very Large Telescope at ESO’s Paranal Observatory in Chile. This gives a glimpse of the far-future fate of the Solar System.
Led by Christopher Manser, a PhD student at the University of Warwick in the United Kingdom, the team used data from ESO’s Very Large Telescope (VLT) and other observatories to study the shattered remains of an asteroid around a stellar remnant — a white dwarf called SDSS J1228+1040.
Using several instruments, including the Ultraviolet and Visual Echelle Spectrograph (UVES) and X-shooter, both attached to the VLT, the team obtained detailed observations of the light coming from the white dwarf and its surrounding material over an unprecedented period of twelve years between 2003 and 2015. Observations over periods of years were needed to probe the system from multiple viewpoints.
“The image we get from the processed data shows us that these systems are truly disc-like, and reveals many structures that we cannot detect in a single snapshot,” explained lead author Christopher Manser.
The team used a technique called Doppler tomography — similar in principle to medical tomographic scans of the human body — which allowed them to map out in detail the structure of the glowing gaseous remains of the dead star’s meal orbiting J1228+1040 for the first time.
While large stars — those more massive than around ten times the mass of the Sun — suffer a spectacularly violent climax as a supernova explosion at the ends of their lives, smaller stars are spared such dramatic fates. When stars like the Sun come to the ends of their lives they exhaust their fuel, expand as red giants and later expel their outer layers into space. The hot and very dense core of the former star — a white dwarf — is all that remains.
But would the planets, asteroids and other bodies in such a system survive this trial by fire? What would be left? The new observations help to answer these questions.
It is rare for white dwarfs to be surrounded by orbiting discs of gaseous material — only seven have ever been found. The team concluded that an asteroid had strayed dangerously close to the dead star and been ripped apart by the immense tidal forces it experienced to form the disc of material that is now visible.
The orbiting disc formed in a similar way to the photogenic rings seen around planets closer to home, such as Saturn. However, while J1228+1040 is more than seven times smaller in diameter than the ringed planet, it has a mass over 2500 times greater. The team also learned that the distance between the white dwarf and its disc is quite different — Saturn and its rings could comfortably sit in the gap between them.
The new long-term study with the VLT has now allowed the team to watch the disc precess under the influence of the very strong gravitational field of the white dwarf. They also find that the disc is somewhat lopsided and has not yet become circular.
“When we discovered this debris disc orbiting the white dwarf back in 2006, we could not have imagined the exquisite details that are now visible in this image, constructed from twelve years of data — it was definitely worth the wait,” added Boris Gänsicke, a co-author of the study.
Remnants such as J1228+1040 can provide key clues to understanding the environments that exist as stars reach the ends of their lives. This can help astronomers to understand the processes that occur in exoplanetary systems and even forecast the fate of the Solar System when the Sun meets its demise in about seven billion years.
 The white dwarf’s full designation is SDSS J122859.93+104032.9.
 The team identified the unmistakable trident-like spectral signature from ionised calcium, called the calcium (Ca II) triplet. The difference between the observed and known wavelengths of these three lines can determine the velocity of the gas with considerable precision.
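The velocity measurement rests on the non-relativistic Doppler relation v ≈ c·Δλ/λ, applied to each line of the triplet. Here is a minimal sketch in Python; the rest wavelengths are the standard Ca II triplet values, while the observed shift used below is purely illustrative and not a measurement from the study:

```python
# Doppler velocity of emitting gas from the shift of the Ca II triplet lines.
# Rest wavelengths are the standard values; the observed shift is illustrative.
C_KM_S = 299_792.458  # speed of light, km/s

CA_II_REST_ANGSTROM = [8498.02, 8542.09, 8662.14]  # Ca II triplet rest wavelengths

def radial_velocity(observed, rest):
    """Non-relativistic Doppler: v = c * (observed - rest) / rest, in km/s."""
    return C_KM_S * (observed - rest) / rest

# Example: each line observed redshifted by 0.1% of its rest wavelength
for rest in CA_II_REST_ANGSTROM:
    observed = rest * 1.001
    v = radial_velocity(observed, rest)
    print(f"{rest:.2f} A -> {observed:.2f} A : v = {v:+.1f} km/s")
```

Because all three lines come from the same emitting gas, obtaining consistent velocities across the triplet helps confirm that the shift reflects real motion rather than an instrumental artifact.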
 Although the disc around this white dwarf is much bigger than Saturn’s ring system in the Solar System, it is tiny compared to the debris discs that form planets around young stars.
On Aug. 7, 1972, in the heart of the Apollo era, an enormous solar flare exploded from the sun’s atmosphere. Along with a gigantic burst of light in nearly all wavelengths, this event accelerated a wave of energetic particles. Mostly protons, with a few electrons and heavier elements mixed in, this wash of quick-moving particles would have been dangerous to anyone outside Earth’s protective magnetic bubble. Luckily, the Apollo 16 crew had returned to Earth just a few months earlier, narrowly escaping this powerful event.
In the early days of human space flight, scientists were only just beginning to understand how events on the sun could affect space, and in turn how that radiation could affect humans and technology. Today, as a result of extensive space radiation research, we have a much better understanding of our space environment, its effects, and the best ways to protect astronauts—all crucial parts of NASA’s mission to send humans to Mars.
“The Martian” film highlights the radiation dangers that could occur on a round trip to Mars. While the mission in the film is fictional, NASA has already started working on the technology to enable an actual trip to Mars in the 2030s. In the film, the astronauts’ habitat on Mars shields them from radiation, and indeed, radiation shielding will be a crucial technology for the voyage. From better shielding to advanced biomedical countermeasures, NASA currently studies how to protect astronauts and electronics from radiation – efforts that will have to be incorporated into every aspect of Mars mission planning, from spacecraft and habitat design to spacewalk protocols.
“The space radiation environment will be a critical consideration for everything in the astronauts’ daily lives, both on the journeys between Earth and Mars and on the surface,” said Ruthan Lewis, an architect and engineer with the human spaceflight program at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “You’re constantly being bombarded by some amount of radiation.”
Radiation, at its most basic, is simply waves or sub-atomic particles that transport energy to another entity – whether an astronaut or a spacecraft component. The main concern in space is particle radiation. Energetic particles can be dangerous to humans because they pass right through the skin, depositing energy and damaging cells or DNA along the way. This damage can mean an increased risk of cancer later in life or, at worst, acute radiation sickness during the mission if the dose of energetic particles is large enough.
Fortunately for us, Earth’s natural protections block all but the most energetic of these particles from reaching the surface. Our planet is shielded by a huge magnetic bubble, called the magnetosphere, which deflects the vast majority of these particles. Our atmosphere then absorbs the majority of particles that do make it through this bubble. Importantly, since the International Space Station (ISS) is in low-Earth orbit within the magnetosphere, it also provides a large measure of protection for our astronauts.
“We have instruments that measure the radiation environment inside the ISS, where the crew are, and even outside the station,” said Kerry Lee, a scientist at NASA’s Johnson Space Center in Houston.
This ISS crew monitoring also includes tracking the short-term and lifetime radiation doses of each astronaut to assess the risk of radiation-related diseases. Although NASA’s radiation limits are more conservative than those allowed for radiation workers on Earth, astronauts are able to stay well under NASA’s limit while living and working on the ISS, within Earth’s magnetosphere.
But a journey to Mars requires astronauts to move out much further, beyond the protection of Earth’s magnetic bubble.
“There’s a lot of good science to be done on Mars, but a trip to interplanetary space carries more radiation risk than working in low-Earth orbit,” said Jonathan Pellish, a space radiation engineer at Goddard.
A human mission to Mars means sending astronauts into interplanetary space for a minimum of a year, even with a very short stay on the Red Planet. Nearly all of that time, they will be outside the magnetosphere, exposed to the harsh radiation environment of space. Mars has no global magnetic field to deflect energetic particles, and its atmosphere is much thinner than Earth’s, so they’ll get only minimal protection even on the surface of Mars.
Throughout the entire trip, astronauts must be protected from two sources of radiation. The first comes from the sun, which regularly releases a steady stream of solar particles, as well as occasional larger bursts in the wake of giant explosions, such as solar flares and coronal mass ejections, on the sun. These energetic particles are almost all protons, and, though the sun releases an unfathomably large number of them, the proton energy is low enough that they can almost all be physically shielded by the structure of the spacecraft.
Since solar activity strongly contributes to the deep-space radiation environment, a better understanding of the sun’s modulation of this radiation environment will allow mission planners to make better decisions for a future Mars mission. NASA currently operates a fleet of spacecraft studying the sun and the space environment throughout the solar system. Observations from this area of research, known as heliophysics, help us better understand the origin of solar eruptions and what effects these events have on the overall space radiation environment.
“If we know precisely what’s going on, we don’t have to be as conservative with our estimates, which gives us more flexibility when planning the mission,” said Pellish.
The second source of energetic particles is harder to shield against. These particles come from galactic cosmic rays, often known as GCRs. They’re particles accelerated to near the speed of light that shoot into our solar system from other stars in the Milky Way or even other galaxies. Like solar particles, galactic cosmic rays are mostly protons, but some of them are heavier elements, ranging from helium up to the heaviest elements. These more energetic particles can knock apart atoms in the material they strike, such as an astronaut’s body or the metal walls of a spacecraft, habitat, or vehicle, causing sub-atomic particles to shower into the structure. This secondary radiation, as it is known, can reach a dangerous level.
There are two ways to shield from these higher-energy particles and their secondary radiation: use a lot more mass of traditional spacecraft materials, or use more efficient shielding materials.
The sheer volume of material surrounding a structure would absorb the energetic particles and their associated secondary particle radiation before they could reach the astronauts. However, using sheer bulk to protect astronauts would be prohibitively expensive, since more mass means more fuel required to launch.
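The launch-mass penalty can be made concrete with the Tsiolkovsky rocket equation. This is a rough sketch; the exhaust velocity, delta-v, and masses below are illustrative assumptions, not figures from any NASA mission design:

```python
import math

# Tsiolkovsky rocket equation: delta_v = v_e * ln(m0 / m_f).
# Solving for the propellant needed to move a given dry mass:
#   m_prop = m_dry * (exp(delta_v / v_e) - 1)
# All numeric values below are illustrative, not actual mission figures.

def propellant_mass(dry_mass_kg, delta_v_m_s, exhaust_velocity_m_s):
    """Propellant required to give dry_mass_kg a velocity change of delta_v."""
    return dry_mass_kg * (math.exp(delta_v_m_s / exhaust_velocity_m_s) - 1)

V_E = 4400.0      # m/s, roughly a hydrogen/oxygen engine
DELTA_V = 9400.0  # m/s, roughly what reaching low-Earth orbit costs

base = propellant_mass(50_000, DELTA_V, V_E)               # spacecraft alone
shielded = propellant_mass(50_000 + 20_000, DELTA_V, V_E)  # + 20 t of bulk shielding
print(f"extra propellant for shielding: {shielded - base:,.0f} kg")
```

At these assumed values, each kilogram of bulk shielding demands roughly seven additional kilograms of propellant just to reach orbit, which is why more efficient shielding materials are so attractive.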
Using materials that shield more efficiently would cut down on weight and cost, but finding the right material takes research and ingenuity. NASA is currently investigating a handful of possibilities that could be used in anything from the spacecraft to the Martian habitat to space suits.
“The best way to stop particle radiation is by running that energetic particle into something that’s a similar size,” said Pellish. “Otherwise, it can be like you’re bouncing a tricycle off a tractor-trailer.”
Because protons and neutrons are similar in size, one element blocks both extremely well—hydrogen, which most commonly exists as just a single proton and an electron. Conveniently, hydrogen is the most abundant element in the universe, and makes up substantial parts of some common compounds, such as water and plastics like polyethylene. Engineers could take advantage of already-required mass by processing the astronauts’ trash into plastic-filled tiles used to bolster radiation protection. Water, already required for the crew, could be stored strategically to create a kind of radiation storm shelter in the spacecraft or habitat. However, this strategy comes with some challenges—the crew would need to use the water and then replace it with recycled water from the advanced life support systems.
Polyethylene, the same plastic commonly found in water bottles and grocery bags, also has potential as a candidate for radiation shielding. It is very high in hydrogen and fairly cheap to produce—however, it’s not strong enough to build a large structure, especially a spacecraft, which goes through high heat and strong forces during launch. And adding polyethylene to a metal structure would add quite a bit of mass, meaning that more fuel would be required for launch.
“We’ve made progress on reducing and shielding against these energetic particles, but we’re still working on finding a material that is a good shield and can act as the primary structure of the spacecraft,” said Sheila Thibeault, a materials researcher at NASA’s Langley Research Center in Hampton, Virginia.
One material in development at NASA has the potential to do both jobs: Hydrogenated boron nitride nanotubes—known as hydrogenated BNNTs—are tiny nanotubes made of boron and nitrogen, with hydrogen interspersed throughout the empty spaces left between the tubes. Boron is also an excellent absorber of secondary neutrons, making hydrogenated BNNTs an ideal shielding material.
“This material is really strong—even at high heat—meaning that it’s great for structure,” said Thibeault.
Remarkably, researchers have successfully made yarn out of BNNTs, so it’s flexible enough to be woven into the fabric of space suits, providing astronauts with significant radiation protection even while they’re performing spacewalks in transit or out on the harsh Martian surface. Though hydrogenated BNNTs are still in development and testing, they have the potential to be one of our key structural and shielding materials in spacecraft, habitats, vehicles, and space suits that will be used on Mars.
Physical shields aren’t the only option for stopping particle radiation from reaching astronauts: Scientists are also exploring the possibility of building force fields. Force fields aren’t just the realm of science fiction: Just like Earth’s magnetic field protects us from energetic particles, a relatively small, localized electric or magnetic field would—if strong enough and in the right configuration—create a protective bubble around a spacecraft or habitat. Currently, these fields would take a prohibitive amount of power and structural material to create on a large scale, so more work is needed for them to be feasible.
The risk of health effects can also be reduced in operational ways, such as having a special area of the spacecraft or Mars habitat that could be a radiation storm shelter; preparing spacewalk and research protocols to minimize time outside the more heavily-shielded spacecraft or habitat; and ensuring that astronauts can quickly return indoors in the event of a radiation storm.
Radiation risk mitigation can also be approached from the human body level. Though far off, a medication that would counteract some or all of the health effects of radiation exposure would make it much easier to plan for a safe journey to Mars and back.
“Ultimately, the solution to radiation will have to be a combination of things,” said Pellish. “Some of the solutions are technology we have already, like hydrogen-rich materials, but some of it will necessarily be cutting edge concepts that we haven’t even thought of yet.”
Researchers combine two types of photovoltaic material to make a cell that harnesses more sunlight.
By David Chandler
CAMBRIDGE, Mass–Researchers at MIT and Stanford University have developed a new kind of solar cell that combines two different layers of sunlight-absorbing material in order to harvest a broader range of the sun’s energy. The development could lead to photovoltaic cells that are more efficient than those currently used in solar-power installations, the researchers say.
The new cell uses a layer of silicon — which forms the basis for most of today’s solar panels — but adds a semi-transparent layer of a material called perovskite, which can absorb higher-energy particles of light. Unlike a “tandem” solar cell reported by members of the same team earlier this year — in which the two layers were physically stacked, but each had its own separate electrical connections — the new version has both layers connected together as a single device that needs only one control circuit.
The new findings are reported in the journal Applied Physics Letters by MIT graduate student Jonathan Mailoa; associate professor of mechanical engineering Tonio Buonassisi; Colin Bailie and Michael McGehee at Stanford; and four others.
“Different layers absorb different portions of the sunlight,” Mailoa explains. In the earlier tandem solar cell, the two layers of photovoltaic material could be operated independently of each other and required their own wiring and control circuits, allowing each cell to be tuned independently for optimal performance.
By contrast, the new combined version should be much simpler to make and install, Mailoa says. “It has advantages in terms of simplicity, because it looks and operates just like a single silicon cell,” he says, with only a single electrical control circuit needed.
One tradeoff is that the current produced is limited by the capacity of the lesser of the two layers. Electrical current, Buonassisi explains, can be thought of as analogous to the volume of water passing through a pipe, which is limited by the diameter of the pipe: If you connect two lengths of pipe of different diameters, one after the other, “the amount of water is limited by the narrowest pipe,” he says. Combining two solar cell layers in series has the same limiting effect on current.
To address that limitation, the team aims to match the current output of the two layers as precisely as possible. In this proof-of-concept solar cell, this means the total power output is about the same as that of conventional solar cells; the team is now working to optimize that output.
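The series-connection constraint Buonassisi describes can be sketched numerically. The layer currents and voltages below are invented for illustration and are not measurements from the MIT/Stanford device:

```python
# Two-terminal tandem cell: layers in series must share one current,
# so the output current is set by the weaker (current-limiting) layer,
# while the layer voltages add. All numbers are illustrative only.

def tandem_output(i_top_ma, v_top, i_bottom_ma, v_bottom):
    """Return (current_mA, voltage_V, power_mW) of two series-connected layers."""
    current = min(i_top_ma, i_bottom_ma)  # the "narrowest pipe"
    voltage = v_top + v_bottom            # voltages add in series
    return current, voltage, current * voltage

# Mismatched layers: the 14 mA layer is throttled down to 11 mA
i, v, p = tandem_output(11.0, 1.0, 14.0, 0.6)
print(f"mismatched: {i} mA x {v} V = {p:.1f} mW")

# Matched layers waste no current
i2, v2, p2 = tandem_output(12.5, 1.0, 12.5, 0.6)
print(f"matched:    {i2} mA x {v2} V = {p2:.1f} mW")
```

Matching the two layers’ photocurrents, as the team aims to do, recovers the power that a mismatched stack would otherwise throw away.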
Perovskites have been studied for potential electronic uses including solar cells, but this is the first time they have been successfully paired with silicon cells in this configuration, a feat that posed numerous technical challenges. Now the team is focusing on increasing the power efficiency — the percentage of sunlight’s energy that gets converted to electricity — that is possible from the combined cell. In this initial version, the efficiency is 13.7 percent, but the researchers say they have identified low-cost ways of improving this to about 30 percent — a substantial improvement over today’s commercial silicon-based solar cells — and they say this technology could ultimately achieve a power efficiency of more than 35 percent.
They will also explore how to easily manufacture the new type of device, but Buonassisi says that should be relatively straightforward, since the materials lend themselves to being made through methods very similar to conventional silicon-cell manufacturing.
One hurdle is making the material durable enough to be commercially viable: The perovskite material degrades quickly in open air, so it either needs to be modified to improve its inherent durability or encapsulated to prevent exposure to air — without adding significantly to manufacturing costs and without degrading performance.
This exact formulation may not turn out to be the most advantageous for better solar cells, Buonassisi says, but is one of several pathways worth exploring. “Our job at this point is to provide options to the world,” he says. “The market will select among them.”
The research team also included Eric Johlin PhD ’14 and postdoc Austin Akey at MIT, and Eric Hoke and William Nguyen of Stanford. It was supported by the Bay Area Photovoltaic Consortium and the U.S. Department of Energy.
CAMBRIDGE, Mass–Researchers at MIT have developed a method to stimulate brain tissue using external magnetic fields and injected magnetic nanoparticles — a technique allowing direct stimulation of neurons, which could be an effective treatment for a variety of neurological diseases, without the need for implants or external connections.
The research, conducted by Polina Anikeeva, an assistant professor of materials science and engineering, graduate student Ritchie Chen, and three others, has been published in the journal Science.
Previous efforts to stimulate the brain using pulses of electricity have proven effective in reducing or eliminating tremors associated with Parkinson’s disease, but the treatment has remained a last resort because it requires highly invasive implanted wires that connect to a power source outside the brain.
“In the future, our technique may provide an implant-free means to provide brain stimulation and mapping,” Anikeeva says.
In their study, the team injected magnetic iron oxide particles just 22 nanometers in diameter into the brain. When exposed to an external alternating magnetic field — which can penetrate deep inside biological tissues — these particles rapidly heat up.
The resulting local temperature increase can then lead to neural activation by triggering heat-sensitive capsaicin receptors — the same proteins that the body uses to detect both actual heat and the “heat” of spicy foods. (Capsaicin is the chemical that gives hot peppers their searing taste.) Anikeeva’s team used viral gene delivery to induce the sensitivity to heat in selected neurons in the brain.
The particles, which have virtually no interaction with biological tissues except when heated, tend to remain where they’re placed, allowing for long-term treatment without the need for further invasive procedures.
“The nanoparticles integrate into the tissue and remain largely intact,” Anikeeva says. “Then, that region can be stimulated at will by externally applying an alternating magnetic field. The goal for us was to figure out whether we could deliver stimuli to the nervous system in a wireless and noninvasive way.”
The new work has proven that the approach is feasible, but much work remains to turn this proof-of-concept into a practical method for brain research or clinical treatment.
The use of magnetic fields and injected particles has been an active area of cancer research; the thought is that this approach could destroy cancer cells by heating them. “The new technique is derived, in part, from that research,” Anikeeva says. “By calibrating the delivered thermal dosage, we can excite neurons without killing them. The magnetic nanoparticles also have been used for decades as contrast agents in MRI scans, so they are considered relatively safe in the human body.”
The team developed ways to make the particles with precisely controlled sizes and shapes, in order to maximize their interaction with the applied alternating magnetic field. They also developed devices to deliver the applied magnetic field: Existing devices for cancer treatment — intended to produce much more intense heating — were far too big and energy-inefficient for this application.
The next step toward making this a practical technology for clinical use in humans “is to understand better how our method works through neural recordings and behavioral experiments, and assess whether there are any other side effects to tissues in the affected area,” Anikeeva says.
In addition to Anikeeva and Chen, the research team also included postdoc Gabriela Romero, graduate student Michael Christiansen, and undergraduate Alan Mohr. The work was funded by the Defense Advanced Research Projects Agency, MIT’s McGovern Institute for Brain Research, and the National Science Foundation.
Light behaves both as a particle and as a wave. Since the days of Einstein, scientists have been trying to directly observe both of these aspects of light at the same time. Now, scientists at EPFL have succeeded in capturing the first-ever snapshot of this dual behavior.
Quantum mechanics tells us that light can behave simultaneously as a particle and as a wave. However, there has never been an experiment able to capture both natures of light at the same time; the closest we have come is seeing either wave or particle, but always at different times. Taking a radically different experimental approach, EPFL scientists have now been able to take the first-ever snapshot of light behaving both as a wave and as a particle. The breakthrough work is published in Nature Communications.
When UV light hits a metal surface, it causes an emission of electrons. Albert Einstein explained this “photoelectric” effect by proposing that light – thought to only be a wave – is also a stream of particles. Even though a variety of experiments have successfully observed both the particle- and wave-like behaviors of light, they have never been able to observe both at the same time.
A research team led by Fabrizio Carbone at EPFL has now carried out an experiment with a clever twist: using electrons to image light. The researchers have captured, for the first time ever, a single snapshot of light behaving simultaneously as both a wave and a stream of particles.
The experiment is set up like this: A pulse of laser light is fired at a tiny metallic nanowire. The laser adds energy to the charged particles in the nanowire, causing them to vibrate. Light travels along this tiny wire in two possible directions, like cars on a highway. When waves traveling in opposite directions meet each other they form a new wave that looks like it is standing in place. Here, this standing wave becomes the source of light for the experiment, radiating around the nanowire.
This is where the experiment’s trick comes in: The scientists shot a stream of electrons close to the nanowire, using them to image the standing wave of light. As the electrons interacted with the confined light on the nanowire, they either sped up or slowed down. Using the ultrafast microscope to image the position where this change in speed occurred, Carbone’s team could now visualize the standing wave, which acts as a fingerprint of the wave-nature of light.
While this phenomenon shows the wave-like nature of light, it simultaneously demonstrates its particle aspect as well. As the electrons pass close to the standing wave of light, they “hit” the light’s particles, the photons. As mentioned above, this affects their speed, making them move faster or slower. This change in speed appears as an exchange of energy “packets” (quanta) between electrons and photons. The very occurrence of these energy packets shows that the light on the nanowire behaves as a particle.
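The size of those energy packets is fixed by Planck’s relation E = hν = hc/λ. A quick sketch follows; the 800 nm wavelength is an arbitrary near-infrared example, not a value taken from the EPFL experiment:

```python
# Energy of a single photon from Planck's relation E = h * c / wavelength.
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_m):
    """Energy of one photon of the given wavelength, in electronvolts."""
    return H * C / wavelength_m / EV

# An electron passing the nanowire gains or loses energy only in whole
# multiples of this quantum -- evidence of the light's particle nature.
e = photon_energy_ev(800e-9)  # arbitrary near-IR example wavelength
print(f"one 800 nm photon carries about {e:.2f} eV")
```

An electron that gains or loses only whole multiples of this quantum is exchanging energy with individual photons, which is the kind of particle signature described above.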
“This experiment demonstrates that, for the first time ever, we can film quantum mechanics – and its paradoxical nature – directly,” says Fabrizio Carbone. In addition, the importance of this pioneering work can extend beyond fundamental science and to future technologies. As Carbone explains: “Being able to image and control quantum phenomena at the nanometer scale like this opens up a new route towards quantum computing.”
This work represents a collaboration between the Laboratory for Ultrafast Microscopy and Electron Scattering of EPFL, the Department of Physics of Trinity College (US) and the Physical and Life Sciences Directorate of the Lawrence Livermore National Laboratory. The imaging was carried out on EPFL’s ultrafast energy-filtered transmission electron microscope – one of only two such instruments in the world.
Piazza L, Lummen TTA, Quiñonez E, Murooka Y, Reed BW, Barwick B, Carbone F. Simultaneous observation of the quantization and the interference pattern of a plasmonic near-field. Nature Communications, 2 March 2015. DOI: 10.1038/ncomms7407
New understanding of how to halt photons could lead to miniature particle accelerators, improved data transmission.
By David L. Chandler
Researchers at MIT who succeeded last year in creating a material that could trap light and stop it in its tracks have now developed a more fundamental understanding of the process. The new work — which could help explain some basic physical mechanisms — reveals that this behavior is connected to a wide range of other seemingly unrelated phenomena.
The findings are reported in a paper in the journal Physical Review Letters, co-authored by MIT physics professor Marin Soljačić; postdocs Bo Zhen, Chia Wei Hsu, and Ling Lu; and Douglas Stone, a professor of applied physics at Yale University.
Light can usually be confined only with mirrors, or with specialized materials such as photonic crystals. Both of these approaches block light beams; last year’s finding demonstrated a new method in which the waves cancel out their own radiation fields. The new work shows that this light-trapping process, which involves twisting the polarization direction of the light, is based on a kind of vortex — the same phenomenon behind everything from tornadoes to water swirling down a drain.
In addition to revealing the mechanism responsible for trapping the light, the new analysis shows that this trapped state is much more stable than had been thought, making it easier to produce and harder to disturb.
“People think of this [trapped state] as very delicate,” Zhen says, “and almost impossible to realize. But it turns out it can exist in a robust way.”
In most natural light, the direction of polarization — which can be thought of as the direction in which the light waves vibrate — remains fixed. That’s the principle that allows polarizing sunglasses to work: Light reflected from a surface is selectively polarized in one direction; that reflected light can then be blocked by polarizing filters oriented at right angles to it.
But in the case of these light-trapping crystals, light that enters the material becomes polarized in a way that forms a vortex, Zhen says, with the direction of polarization changing depending on the beam’s direction.
Because the polarization is different at every point in this vortex, it produces a singularity — also called a topological defect, Zhen says — at its center, trapping the light at that point.
Hsu says the phenomenon makes it possible to produce something called a vector beam, a special kind of laser beam that could potentially create small-scale particle accelerators. Such devices could use these vector beams to accelerate particles and smash them into each other — perhaps allowing future tabletop devices to carry out the kinds of high-energy experiments that today require miles-wide circular tunnels.
The finding, Soljačić says, could also enable easy implementation of super-resolution imaging (using a method called stimulated emission depletion microscopy) and could allow the sending of far more channels of data through a single optical fiber.
“This work is a great example of how supposedly well-studied physical systems can contain rich and undiscovered phenomena, which can be unearthed if you dig in the right spot,” says Yidong Chong, an assistant professor of physics and applied physics at Nanyang Technological University in Singapore who was not involved in this research.
Chong says it is remarkable that such surprising findings have come from relatively well-studied materials. “It deals with photonic crystal slabs of the sort that have been extensively analyzed, both theoretically and experimentally, since the 1990s,” he says. “The fact that the system is so unexotic, together with the robustness associated with topological phenomena, should give us confidence that these modes will not simply be theoretical curiosities, but can be exploited in technologies such as microlasers.”
The research was partly supported by the U.S. Army Research Office through MIT’s Institute for Soldier Nanotechnologies, and by the Department of Energy and the National Science Foundation.
Despite extensive work by many research groups around the world, the field of quantum computing is still in its early stages. There is a long way to go before we have a working quantum computer capable of the tasks expected of it. A recent study by a SISSA-led team tries to give future research in quantum computing some direction, based on the current state of the field.
“A quantum computer may be thought of as a ‘simulator of overall Nature’,” explains Fabio Franchini, a researcher at the International School for Advanced Studies (SISSA) of Trieste. “In other words, it’s a machine capable of simulating Nature as a quantum system, something that classical computers cannot do.” Quantum computers are machines that carry out operations by exploiting the phenomena of quantum mechanics, and they are capable of performing functions different from those of current computers. The field is still very young, and the systems produced to date remain very limited. Franchini is the first author of a study just published in Physical Review X which establishes a basic characteristic that this type of machine should possess and, in doing so, guides the direction of future research in the field.
The study used analytical and numerical methods. “What we found,” explains Franchini, “is that a system that does not exhibit ‘Majorana fermions’ cannot be a universal quantum simulator.” Majorana fermions were hypothesized by Ettore Majorana in a paper published in 1937, and they display a peculiar characteristic: a Majorana fermion is its own antiparticle. “That means that if two Majorana fermions meet, they annihilate each other,” continues Franchini. “In recent years it has been suggested that these fermions could be found in states of matter useful for quantum computing, and our study confirms that they must be present, with a certain probability related to entanglement, in the material used to build the machine.”
Entanglement, or “action at a distance”, is a property of quantum systems whereby an action performed on one part of a system affects another part of the same system, even when the two parts are separated by a very large distance. “Entanglement is a fundamental phenomenon for quantum computers,” explains Franchini.
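Entanglement can be illustrated with the simplest textbook example, the two-qubit Bell state — a standard construction for illustration only, not one discussed in the study. A minimal sketch: tracing out one qubit of a Bell state leaves the other in a maximally mixed state, and the entanglement entropy comes out to exactly 1 bit.

```python
import math

# Illustrative sketch (textbook example, not from the article): the
# two-qubit Bell state |Phi+> = (|00> + |11>) / sqrt(2).
# Amplitudes are indexed by the basis values of (qubit A, qubit B).
amp = {(0, 0): 1 / math.sqrt(2), (0, 1): 0.0,
       (1, 0): 0.0, (1, 1): 1 / math.sqrt(2)}

# Reduced state of qubit A: probabilities obtained by summing out qubit B.
# (The reduced density matrix is diagonal here, so probabilities suffice.)
p_a = [sum(abs(amp[(a, b)]) ** 2 for b in (0, 1)) for a in (0, 1)]

# Entanglement entropy of qubit A, in bits: 1 bit means maximal
# entanglement; 0 would mean a product (unentangled) state.
entropy = -sum(p * math.log2(p) for p in p_a if p > 0)
print(entropy)  # → 1.0, maximally entangled
```

The point of the sketch is the correlation: measuring qubit A as 0 forces qubit B to be 0 as well, no matter how far apart the two qubits are.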
“Our study helps to understand what types of devices research should be focusing on to construct this universal simulator. Until now, given the lack of criteria, research has proceeded somewhat randomly, with a huge consumption of time and resources”.
The study was conducted with the participation of many other international research institutes in addition to SISSA, including the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts, the University of Oxford and many others.
In more detail…
“Having a quantum computer would open up new worlds. For example, if we had one today we would be able to break into any bank account,” jokes Franchini. “But don’t worry, we’re nowhere near that goal”.
At present, several prototype quantum machines exist that rely on the properties of specific materials. Depending on the technology used, these computers range in size from a small box to a whole room, but so far they can process only a limited number of bits of information, far fewer than a classical computer handles.
However, it’s not correct to say that quantum computers are, or will be, more powerful than traditional ones, points out Franchini. “There are several things that these devices are worse at. But, by exploiting quantum mechanics, they can perform operations that would be impossible for classical computers”.
Five new NASA airborne field campaigns will take to the skies starting in 2015 to investigate how long-range air pollution, warming ocean waters, and fires in Africa affect our climate.
These studies into several incompletely understood Earth system processes were competitively selected as part of NASA’s Earth Venture-class projects. Each project is funded at a total cost of no more than $30 million over five years. This funding includes initial development, field campaigns and analysis of data.
This is NASA’s second series of Earth Venture suborbital investigations — regularly solicited, quick-turnaround projects recommended by the National Research Council in 2007. The first series of five projects was selected in 2010.
“These new investigations address a variety of key scientific questions critical to advancing our understanding of how Earth works,” said Jack Kaye, associate director for research in NASA’s Earth Science Division in Washington. “These innovative airborne experiments will let us probe inside processes and locations in unprecedented detail that complements what we can do with our fleet of Earth-observing satellites.”
The five selected Earth Venture investigations are:
Atmospheric chemistry and air pollution – Steven Wofsy of Harvard University in Cambridge, Massachusetts, will lead the Atmospheric Tomography project to study the impact of human-produced air pollution on certain greenhouse gases. Airborne instruments will look at how atmospheric chemistry is transformed by various air pollutants and at the impact on methane and ozone which affect climate. Flights aboard NASA’s DC-8 will originate from the Armstrong Flight Research Center in Palmdale, California, fly north to the western Arctic, south to the South Pacific, east to the Atlantic, north to Greenland, and return to California across central North America.
Ecosystem changes in a warming ocean – Michael Behrenfeld of Oregon State University in Corvallis, Oregon, will lead the North Atlantic Aerosols and Marine Ecosystems Study, which seeks to improve predictions of how ocean ecosystems would change with ocean warming. The mission will study the annual life cycle of phytoplankton and the impact small airborne particles derived from marine organisms have on climate in the North Atlantic. The large annual phytoplankton bloom in this region may influence the Earth’s energy budget. Research flights by NASA’s C-130 aircraft from Wallops Flight Facility, Virginia, will be coordinated with a University-National Oceanographic Laboratory System (UNOLS) research vessel. UNOLS, located at the University of Rhode Island’s Graduate School of Oceanography in Narragansett, Rhode Island, is an organization of 62 academic institutions and national laboratories involved in oceanographic research.
Greenhouse gas sources – Kenneth Davis of Pennsylvania State University in University Park, will lead the Atmospheric Carbon and Transport-America project to quantify the sources of regional carbon dioxide, methane and other gases, and document how weather systems transport these gases in the atmosphere. The research goal is to improve identification and predictions of carbon dioxide and methane sources and sinks using spaceborne, airborne and ground-based data over the eastern United States. Research flights will use NASA’s C-130 from Wallops and the UC-12 from Langley Research Center in Hampton, Virginia.
African fires and Atlantic clouds – Jens Redemann of NASA’s Ames Research Center in Mountain View, California, will lead the Observations of Aerosols above Clouds and their Interactions project to probe how smoke particles from massive biomass burning in Africa influence cloud cover over the Atlantic. Particles from this seasonal burning that are lofted into the mid-troposphere and transported westward over the southeast Atlantic interact with permanent stratocumulus “climate radiators,” which are critical to the regional and global climate system. NASA aircraft, including a Wallops P-3 and an Armstrong ER-2, will be used to conduct the investigation flying out of Walvis Bay, Namibia.
Melting Greenland glaciers – Josh Willis of NASA’s Jet Propulsion Laboratory in Pasadena, California, will lead the Oceans Melting Greenland mission to investigate the role of warmer saltier Atlantic subsurface waters in Greenland glacier melting. The study will help pave the way for improved estimates of future sea level rise by observing changes in glacier melting where ice contacts seawater. Measurements of the ocean bottom as well as seawater properties around Greenland will be taken from ships and the air using several aircraft including a NASA S-3 from Glenn Research Center in Cleveland, Ohio, and Gulfstream III from Armstrong.
Seven NASA centers, 25 educational institutions, three U.S. government agencies and two industry partners are involved in these Earth Venture projects. The five investigations were selected from 33 proposals.
Earth Venture investigations are part of NASA’s Earth System Science Pathfinder program managed at Langley for NASA’s Science Mission Directorate in Washington. The missions in this program provide an innovative approach to address Earth science research with periodic windows of opportunity to accommodate new scientific priorities.
NASA monitors Earth’s vital signs from land, sea, air and space with a fleet of satellites and ambitious airborne and surface-based observation campaigns. With this information and computer analysis tools, NASA studies Earth’s interconnected systems to better see how our planet is changing. The agency shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.
For more information about NASA’s Earth science activities, visit:
Geneva 19 November 2014. Today the collaboration for the LHCb experiment at CERN1’s Large Hadron Collider announced the discovery of two new particles in the baryon family. The particles, known as the Xi_b’- and Xi_b*-, were predicted to exist by the quark model but had never been seen before. A related particle, the Xi_b*0, was found by the CMS experiment at CERN in 2012. The LHCb collaboration submitted a paper reporting the finding to Physical Review Letters.
Like the well-known protons that the LHC accelerates, the new particles are baryons made from three quarks bound together by the strong force. The types of quarks are different, though: the new Xi_b particles both contain one beauty (b), one strange (s), and one down (d) quark. Thanks to the heavyweight b quarks, they are more than six times as massive as the proton. But the particles are more than just the sum of their parts: their mass also depends on how they are configured. Each of the quarks has an attribute called “spin”. In the Xi_b’- state, the spins of the two lighter quarks point in the opposite direction to the b quark, whereas in the Xi_b*- state they are aligned. This difference makes the Xi_b*- a little heavier.
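As a rough cross-check of the “more than six times as massive as the proton” figure, one can compare the masses directly. The numerical values below are approximations drawn from public particle-data listings, not from this press release:

```python
# Approximate masses in MeV/c^2 (assumed values from public
# particle-data listings, not quoted in the press release).
PROTON_MEV = 938.27
XI_B_PRIME_MEV = 5935.0   # Xi_b'-  (approximate)
XI_B_STAR_MEV = 5955.3    # Xi_b*-  (approximate; the heavier, spin-aligned state)

for name, m in [("Xi_b'-", XI_B_PRIME_MEV), ("Xi_b*-", XI_B_STAR_MEV)]:
    # Both ratios come out a little above 6.3 proton masses.
    print(f"{name}: {m / PROTON_MEV:.2f} x proton mass")
```

Note also how small the spin-configuration effect is: the two states differ by only about 20 MeV, a fraction of a percent of either mass.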
“Nature was kind and gave us two particles for the price of one,” said Matthew Charles of the CNRS’s LPNHE laboratory at Paris VI University. “The Xi_b’- is very close in mass to the sum of its decay products: if it had been just a little lighter, we wouldn’t have seen it at all using the decay signature that we were looking for.”
“This is a very exciting result. Thanks to LHCb’s excellent hadron identification, which is unique among the LHC experiments, we were able to separate a very clean and strong signal from the background,” said Steven Blusk from Syracuse University in New York. “It demonstrates once again the sensitivity and precision of the LHCb detector.”
As well as the masses of these particles, the research team studied their relative production rates, their widths – a measure of how unstable they are – and other details of their decays. The results match up with predictions based on the theory of Quantum Chromodynamics (QCD).
QCD is part of the Standard Model of particle physics, the theory that describes the fundamental particles of matter, how they interact and the forces between them. Testing QCD at high precision is key to refining our understanding of quark dynamics, models of which are tremendously difficult to calculate.
“If we want to find new physics beyond the Standard Model, we need first to have a sharp picture,” said LHCb’s physics coordinator Patrick Koppenburg from Nikhef Institute in Amsterdam. “Such high precision studies will help us to differentiate between Standard Model effects and anything new or unexpected in the future.”
The measurements were made with the data taken at the LHC during 2011-2012. The LHC is currently being prepared – after its first long shutdown – to operate at higher energies and with more intense beams. It is scheduled to restart by spring 2015.
1. CERN, the European Organization for Nuclear Research, is the world’s leading laboratory for particle physics. It has its headquarters in Geneva. At present, its Member States are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Israel, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the United Kingdom. Romania is a Candidate for Accession. Serbia is an Associate Member in the pre-stage to Membership. India, Japan, the Russian Federation, the United States of America, Turkey, the European Commission and UNESCO have Observer Status.
CAMBRIDGE, Mass–A surprising phenomenon has been found in metal nanoparticles: They appear, from the outside, to be liquid droplets, wobbling and readily changing shape, while their interiors retain a perfectly stable crystal configuration.
The research team behind the finding, led by MIT professor Ju Li, says the work could have important implications for the design of components in nanotechnology, such as metal contacts for molecular electronic circuits.
The results, published in the journal Nature Materials, come from a combination of laboratory analysis and computer modeling, by an international team that included researchers in China, Japan, and Pittsburgh, as well as at MIT.
The experiments were conducted at room temperature, with particles of pure silver less than 10 nanometers across — less than one-thousandth of the width of a human hair. But the results should apply to many different metals, says Li, senior author of the paper and the BEA Professor of Nuclear Science and Engineering.
Silver has a relatively high melting point — 962 degrees Celsius, or 1763 degrees Fahrenheit — so observation of any liquidlike behavior in its nanoparticles was “quite unexpected,” Li says. Hints of the new phenomenon had been seen in earlier work with tin, which has a much lower melting point, he says.
The use of nanoparticles in applications ranging from electronics to pharmaceuticals is a lively area of research; generally, Li says, these researchers “want to form shapes, and they want these shapes to be stable, in many cases over a period of years.” So the discovery of these deformations reveals a potentially serious barrier to many such applications: For example, if gold or silver nanoligaments are used in electronic circuits, these deformations could quickly cause electrical connections to fail.
Only skin deep
The researchers’ detailed imaging with a transmission electron microscope and atomistic modeling revealed that while the exterior of the metal nanoparticles appears to move like a liquid, only the outermost layers — one or two atoms thick — actually move at any given time. As these outer layers of atoms move across the surface and redeposit elsewhere, they give the impression of much greater movement — but inside each particle, the atoms stay perfectly lined up, like bricks in a wall.
“The interior is crystalline, so the only mobile atoms are the first one or two monolayers,” Li says. “Everywhere except the first two layers is crystalline.”
By contrast, if the droplets were to melt to a liquid state, the orderliness of the crystal structure would be eliminated entirely — like a wall tumbling into a heap of bricks.
Technically, the particles’ deformation is pseudoelastic, meaning that the material returns to its original shape after the stresses are removed — like a squeezed rubber ball — as opposed to plasticity, as in a deformable lump of clay that retains a new shape.
The phenomenon of plasticity by interfacial diffusion was first proposed by Robert L. Coble, a professor of ceramic engineering at MIT, and is known as “Coble creep.” “What we saw is aptly called Coble pseudoelasticity,” Li says.
Now that the phenomenon has been understood, researchers working on nanocircuits or other nanodevices can quite easily compensate for it, Li says. If the nanoparticles are protected by even a vanishingly thin layer of oxide, the liquidlike behavior is almost completely eliminated, making stable circuits possible.
On the other hand, for some applications this phenomenon might be useful: For example, in circuits where electrical contacts need to withstand rotational reconfiguration, particles designed to maximize this effect might prove useful, using noble metals or a reducing atmosphere, where the formation of an oxide layer is destabilized, Li says.
The new finding flies in the face of expectations — in part, because of a well-understood relationship, in most materials, in which mechanical strength increases as size is reduced.
“In general, the smaller the size, the higher the strength,” Li says, but “at very small sizes, a material component can get very much weaker. The transition from ‘smaller is stronger’ to ‘smaller is much weaker’ can be very sharp.”
That crossover, he says, takes place at about 10 nanometers at room temperature — a size that microchip manufacturers are approaching as circuits shrink. When this threshold is reached, Li says, it causes “a very precipitous drop” in a nanocomponent’s strength.
The findings could also help explain a number of anomalous results seen in other research on small particles, Li says.
The research team included Jun Sun, Longbing He, Tao Xu, Hengchang Bi, and Litao Sun, all of Southeast University in Nanjing, China; Yu-Chieh Lo of MIT and Kyoto University; Ze Zhang of Zhejiang University; and Scott Mao of the University of Pittsburgh. It was supported by the National Basic Research Program of China; the National Natural Science Foundation of China; the Chinese Ministry of Education; the National Science Foundation of Jiangsu Province, China; and the U.S. National Science Foundation.