Monthly Archives: November 2014

The MPG/ESO 2.2-metre telescope at ESO’s La Silla Observatory in Chile captured this richly colourful view of the bright star cluster NGC 3532. Some of the stars still shine with a hot bluish colour, but many of the more massive ones have become red giants and glow with a rich orange hue.

Credit:

ESO/G. Beccari

A Colourful Gathering of Middle-aged Stars

The MPG/ESO 2.2-metre telescope at ESO’s La Silla Observatory in Chile has captured a richly colourful view of the bright star cluster NGC 3532. Some of the stars still shine with a hot bluish colour, but many of the more massive ones have become red giants and glow with a rich orange hue.

NGC 3532 is a bright open cluster located some 1300 light-years away in the constellation of Carina (The Keel of the ship Argo). It is informally known as the Wishing Well Cluster, as it resembles scattered silver coins which have been dropped into a well. It is also referred to as the Football Cluster, although how appropriate this is depends on which side of the Atlantic you live. It acquired the name because of its oval shape, which citizens of rugby-playing nations might see as resembling a rugby ball.

This very bright star cluster is easily seen with the naked eye from the southern hemisphere. It was discovered by French astronomer Nicolas Louis de Lacaille whilst observing from South Africa in 1752 and was catalogued three years later in 1755. It is one of the most spectacular open star clusters in the whole sky.

NGC 3532 covers an area of the sky that is almost twice the size of the full Moon. It was described as a binary-rich cluster by John Herschel who observed “several elegant double stars” here during his stay in southern Africa in the 1830s. Of additional, much more recent, historical relevance, NGC 3532 was the first target to be observed by the NASA/ESA Hubble Space Telescope, on 20 May 1990.

This grouping of stars is about 300 million years old, making it middle-aged by open star cluster standards [1]. The cluster stars that started off with moderate masses are still shining brightly with blue-white colours, but the more massive ones have already exhausted their supplies of hydrogen fuel and have become red giant stars. As a result the cluster appears rich in both blue and orange stars. The most massive stars of the original cluster ran through their brief but brilliant lives and exploded as supernovae long ago. There are also numerous less conspicuous fainter stars of lower mass that have longer lives and shine with yellow or red hues. In total, NGC 3532 consists of around 400 stars.

The background sky here in a rich part of the Milky Way is very crowded with stars. Some glowing red gas is also apparent, as well as subtle lanes of dust that block the view of more distant stars. These are probably not connected to the cluster itself, which is old enough to have cleared away any material in its surroundings long ago.

This image of NGC 3532 was captured by the Wide Field Imager instrument at ESO’s La Silla Observatory in February 2013.

Notes

[1] Stars with masses many times greater than the Sun's have lives of just a few million years; the Sun is expected to live for about ten billion years; and low-mass stars have expected lives of hundreds of billions of years, far longer than the current age of the Universe.
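The numbers in this note follow from a simple scaling: main-sequence lifetime goes roughly as fuel over burn rate (mass over luminosity), and luminosity rises steeply with mass (L ∝ M^3.5 is a common textbook approximation). A minimal sketch, taking the exponent and the ten-billion-year solar lifetime as assumed round numbers:

```python
def main_sequence_lifetime_gyr(mass_solar):
    """Rough main-sequence lifetime in Gyr, assuming t ~ M/L and L ~ M**3.5,
    normalised to a ~10 Gyr lifetime for the Sun. Illustrative only."""
    T_SUN_GYR = 10.0
    return T_SUN_GYR * mass_solar ** -2.5

for m in (0.5, 1.0, 5.0, 20.0):
    print(f"{m:5.1f} solar masses -> {main_sequence_lifetime_gyr(m):10.4f} Gyr")
```

A 20-solar-mass star comes out at a few million years, and a half-solar-mass star at several times the age of the Universe, consistent with the note above.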

More information

ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 15 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory, and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope, while the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is the European partner of ALMA, a revolutionary astronomical telescope and the largest astronomical project in existence. ESO is currently planning the 39-metre European Extremely Large optical/near-infrared Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

The mass difference spectrum: the LHCb result shows strong evidence of the existence of two new particles, the Xi_b'- (first peak) and Xi_b*- (second peak), with a very high confidence level of 10 sigma. The black points are the signal sample and the hatched red histogram is a control sample. The blue curve represents a model including the two new particles, fitted to the data. Delta_m is the difference between the mass of the Xi_b0 pi- pair and the sum of the individual masses of the Xi_b0 and pi-. INSET: detail of the Xi_b'- region plotted with a finer binning.
Credit: CERN

CERN makes public first data of LHC experiments

CERN[1] today launched its Open Data Portal, where data from real collision events produced by the LHC experiments will for the first time be made openly available to all. It is expected that these data will be of high value for the research community, and they will also be used for educational purposes.

“Launching the CERN Open Data Portal is an important step for our Organization. Data from the LHC programme are among the most precious assets of the LHC experiments, that today we start sharing openly with the world. We hope these open data will support and inspire the global research community, including students and citizen scientists,” said CERN Director General Rolf Heuer.

The principle of openness is enshrined in CERN’s founding Convention, and all LHC publications have been published Open Access, free for all to read and re-use. Widening the scope, the LHC collaborations recently approved Open Data policies and will release collision data over the coming years.

The first high-level and analysable collision data openly released come from the CMS experiment and were originally collected in 2010 during the first LHC run. This data set is now publicly available on the CERN Open Data Portal. Open source software to read and analyse the data is also available, together with the corresponding documentation. The CMS collaboration is committed to releasing its data three years after collection, after they have been thoroughly studied by the collaboration.

“This is all new and we are curious to see how the data will be re-used,” said CMS data preservation coordinator Kati Lassila-Perini. “We’ve prepared tools and examples of different levels of complexity from simplified analysis to ready-to-use online applications. We hope these examples will stimulate the creativity of external users.”

In parallel, the CERN Open Data Portal gives access to additional event data sets from the ALICE, ATLAS, CMS and LHCb collaborations, which have been specifically prepared for educational purposes, such as the international masterclasses in particle physics[2] benefiting over ten thousand high-school students every year. These resources are accompanied by visualisation tools.

“Our own data policy foresees data preservation and its sharing. We have seen in the past that students are fascinated by being able to analyse LHC data, and so we are very happy to take the first steps and make available some selected data for education,” said Silvia Amerio, data preservation coordinator of the LHCb experiment.

“The development of this Open Data Portal represents a first milestone in our mission to serve our users in preserving and sharing their research materials. It will ensure that the data and tools can be accessed and used, now and in the future,” said Tim Smith from CERN’s IT Department.

All data on OpenData.cern.ch are shared under a Creative Commons CC0[3] public domain dedication; data and software are assigned digital object identifiers (DOIs) to make them citable in scientific articles; and software is released under open source licenses. The CERN Open Data Portal is built on the open-source Invenio Digital Library software, which powers other CERN Open Science tools and initiatives.

Further information:

Open data portal

Open data policies

CMS Open Data

Footnote(s):

1. CERN, the European Organization for Nuclear Research, is the world’s leading laboratory for particle physics. It has its headquarters in Geneva. At present, its Member States are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Israel, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the United Kingdom. Romania is a Candidate for Accession. Serbia is an Associate Member in the pre-stage to Membership. India, Japan, the Russian Federation, the United States of America, Turkey, the European Commission and UNESCO have Observer Status.

2. http://www.physicsmasterclasses.org

3. http://creativecommons.org/publicdomain/zero/1.0/

The DC-8 airborne laboratory is one of several NASA aircraft that will fly in support of five new investigations into how different aspects of the interconnected Earth system influence climate change.
Image Credit: NASA

NASA Airborne Campaigns Tackle Climate Questions from Africa to Arctic

Five new NASA airborne field campaigns will take to the skies starting in 2015 to investigate how long-range air pollution, warming ocean waters, and fires in Africa affect our climate.

These studies into several incompletely understood Earth system processes were competitively selected as part of NASA’s Earth Venture-class projects. Each project is funded at a total cost of no more than $30 million over five years. This funding includes initial development, field campaigns and analysis of data.

This is NASA’s second series of Earth Venture suborbital investigations — regularly solicited, quick-turnaround projects recommended by the National Research Council in 2007. The first series of five projects was selected in 2010.

“These new investigations address a variety of key scientific questions critical to advancing our understanding of how Earth works,” said Jack Kaye, associate director for research in NASA’s Earth Science Division in Washington. “These innovative airborne experiments will let us probe inside processes and locations in unprecedented detail that complements what we can do with our fleet of Earth-observing satellites.”

The five selected Earth Venture investigations are:

  • Atmospheric chemistry and air pollution – Steven Wofsy of Harvard University in Cambridge, Massachusetts, will lead the Atmospheric Tomography project to study the impact of human-produced air pollution on certain greenhouse gases. Airborne instruments will look at how atmospheric chemistry is transformed by various air pollutants and at the impact on methane and ozone, which affect climate. Flights aboard NASA’s DC-8 will originate from the Armstrong Flight Research Center in Palmdale, California, fly north to the western Arctic, south to the South Pacific, east to the Atlantic, north to Greenland, and return to California across central North America.
  • Ecosystem changes in a warming ocean – Michael Behrenfeld of Oregon State University in Corvallis, Oregon, will lead the North Atlantic Aerosols and Marine Ecosystems Study, which seeks to improve predictions of how ocean ecosystems would change with ocean warming. The mission will study the annual life cycle of phytoplankton and the impact small airborne particles derived from marine organisms have on climate in the North Atlantic. The large annual phytoplankton bloom in this region may influence the Earth’s energy budget. Research flights by NASA’s C-130 aircraft from Wallops Flight Facility, Virginia, will be coordinated with a University-National Oceanographic Laboratory System (UNOLS) research vessel. UNOLS, located at the University of Rhode Island’s Graduate School of Oceanography in Narragansett, Rhode Island, is an organization of 62 academic institutions and national laboratories involved in oceanographic research.
  • Greenhouse gas sources – Kenneth Davis of Pennsylvania State University in University Park, will lead the Atmospheric Carbon and Transport-America project to quantify the sources of regional carbon dioxide, methane and other gases, and document how weather systems transport these gases in the atmosphere. The research goal is to improve identification and predictions of carbon dioxide and methane sources and sinks using spaceborne, airborne and ground-based data over the eastern United States. Research flights will use NASA’s C-130 from Wallops and the UC-12 from Langley Research Center in Hampton, Virginia.
  • African fires and Atlantic clouds – Jens Redemann of NASA’s Ames Research Center in Mountain View, California, will lead the Observations of Aerosols above Clouds and their Interactions project to probe how smoke particles from massive biomass burning in Africa influence cloud cover over the Atlantic. Particles from this seasonal burning that are lofted into the mid-troposphere and transported westward over the southeast Atlantic interact with permanent stratocumulus “climate radiators,” which are critical to the regional and global climate system. NASA aircraft, including a Wallops P-3 and an Armstrong ER-2, will be used to conduct the investigation flying out of Walvis Bay, Namibia.
  • Melting Greenland glaciers – Josh Willis of NASA’s Jet Propulsion Laboratory in Pasadena, California, will lead the Oceans Melting Greenland mission to investigate the role of warmer, saltier Atlantic subsurface waters in Greenland glacier melting. The study will help pave the way for improved estimates of future sea level rise by observing changes in glacier melting where ice contacts seawater. Measurements of the ocean bottom as well as seawater properties around Greenland will be taken from ships and the air using several aircraft, including a NASA S-3 from Glenn Research Center in Cleveland, Ohio, and a Gulfstream III from Armstrong.

Seven NASA centers, 25 educational institutions, three U.S. government agencies and two industry partners are involved in these Earth Venture projects. The five investigations were selected from 33 proposals.

Earth Venture investigations are part of NASA’s Earth System Science Pathfinder program managed at Langley for NASA’s Science Mission Directorate in Washington. The missions in this program provide an innovative approach to address Earth science research with periodic windows of opportunity to accommodate new scientific priorities.

NASA monitors Earth’s vital signs from land, sea, air and space with a fleet of satellites and ambitious airborne and surface-based observation campaigns. With this information and computer analysis tools, NASA studies Earth’s interconnected systems to better see how our planet is changing. The agency shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

For more information about NASA’s Earth science activities, visit:

http://www.nasa.gov/earthrightnow

Source: NASA

Stanford scientists seek to map origins of mental illness and develop noninvasive treatment

An interdisciplinary team of scientists has convened to map the origins of mental illnesses in the brain and develop noninvasive technologies to treat the conditions. The collaboration could lead to improved treatments for depression, anxiety and post-traumatic stress disorder.

BY AMY ADAMS


Over the years imaging technologies have revealed a lot about what’s happening in our brains, including which parts are active in people with conditions like depression, anxiety or post-traumatic stress disorder. But here’s the secret Amit Etkin wants the world to know about those tantalizing images: they show the result of a brain state, not what caused it.

This is important because until we know how groups of neurons, called circuits, are causing these conditions – not just which are active later – scientists will never be able to treat them in a targeted way.

“You see things activated in brain images but you can’t tell just by watching what is cause and what is effect,” said Amit Etkin, an assistant professor of psychiatry and behavioral sciences. Etkin is co-leader of a new interdisciplinary initiative to understand what brain circuits underlie mental health conditions and then direct noninvasive treatments to those locations.

“Right now, if a patient with a mental illness goes to see their doctor they would likely be given a medication that goes all over the brain and body,” Etkin said. “While medications can work well, they do so for only a portion of people and often only partially.” Medications don’t specifically act on the brain circuits critically affected in that illness or individual.

The Big Idea: treat roots of mental illness

The new initiative, called NeuroCircuit, has the goal of finding the brain circuits that are responsible for mental health conditions and then developing ways of remotely stimulating those circuits and, the team hopes, potentially treating those conditions.

The initiative is part of the Stanford Neurosciences Institute’s Big Ideas, which bring together teams of researchers from across disciplines to solve major problems in neuroscience and society. Stephen Baccus, an associate professor of neurobiology who co-leads the initiative with Etkin, said that what makes NeuroCircuit a big idea is the merging of teams trying to map circuits responsible for mental health conditions and teams developing new technologies to remotely access those circuits.

“Many psychiatric disorders, especially disorders of mood, probably involve malfunction within specific brain circuits that regulate emotion and motivation, yet our current pharmaceutical treatments affect circuits all over the brain,” said William Newsome, director of the Stanford Neurosciences Institute. “The ultimate goal of NeuroCircuit is more precise treatments, with minimal side effects, for specific psychiatric disorders.”

“The connection between the people who develop the technology and carry out research with the clinical goal, that’s what’s really come out of the Big Ideas,” Baccus said.

Brain control

Etkin has been working with a technology called transcranial magnetic stimulation, or TMS, to map and remotely stimulate parts of the brain. The device, which looks like a pair of doughnuts on a stick, generates a strong magnetic current that stimulates circuits near the surface of the brain.

TMS is currently used as a way of treating depression and anxiety, but Etkin said the brain regions being targeted are the ones available to TMS, not necessarily the ones most likely to treat a person’s condition. They are also not personalized for the individual.

Pairing TMS with another technology that shows which brain regions are active, Etkin and his team can stimulate one part of the brain with TMS and look for a reaction elsewhere. These studies can eventually help map the relationships between brain circuits and identify the circuits that underlie mental health conditions.

In parallel, the team is working to improve TMS to make it more useful as a therapy. TMS currently only reaches the surface of the brain and is not very focused. The goal is to improve the technology so that it can reach structures deeper in the brain in a more targeted way. “Right now they are hitting the only accessible target,” he said. “The parts we really want to hit for depression, anxiety or PTSD are likely deeper in the brain.”

Technology of the future

In parallel with the TMS work, Baccus and a team of engineers, radiologists and physiologists have been developing a way of using ultrasound to stimulate the brain. Ultrasound is widely used to image the body, most famously for producing images of developing babies in the womb. But in recent years scientists have learned that at the right frequency and focus, ultrasound can also stimulate nerves to fire.

In his lab, Baccus has been using ultrasound to stimulate nerve cells of the retina – the light-sensing structure at the back of the eye – as part of an effort to develop a prosthetic retina. He is also teaming up with colleagues to understand how ultrasound might be triggering that stimulation. It appears to compress the nerve cells in a way that could lead to activation, but the connection is far from clear.

Other members of the team are modifying existing ultrasound technology to direct it deep within the brain at a frequency that can stimulate nerves without harming them. If the team is successful, ultrasound could be a more targeted and focused tool than TMS for remotely stimulating circuits that underlie mental health conditions.

The group has been working together for about five years, and in 2012 got funding from Bio-X NeuroVentures, which eventually gave rise to the Stanford Neurosciences Institute, to pursue this technology. Baccus said that before merging with Etkin’s team they had been focusing on the technology without specific brain diseases in mind. “This merger really gives a target and a focus to the technology,” he said.

Etkin and Baccus said that if they are successful, they hope to have both a better understanding of how the brain functions and new tools for treating disabling mental health conditions.

Source: Stanford News

This map of Turkey shows the artists' interpretation of the North Anatolian Fault (blue line) and the possible site of an earthquake (white lines) that could strike beneath the Sea of Marmara.

Image: NASA, and Christine Daniloff and Jose-Luis Olivares/MIT

Groundwater composition as potential precursor to earthquakes

By Meres J. Weche

The world experiences over 13,000 earthquakes per year reaching a Richter magnitude of 4.0 or greater. But what if there were a way to predict these often deadly earthquakes and, through a reliable process, mitigate loss of life and damage to vital urban infrastructure?
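For a sense of scale, magnitude can be converted to radiated seismic energy with the standard Gutenberg–Richter relation log10(E) = 1.5 M + 4.8, with E in joules; a short sketch:

```python
def quake_energy_joules(magnitude):
    """Radiated seismic energy from magnitude via the Gutenberg-Richter
    relation: log10(E) = 1.5*M + 4.8, E in joules."""
    return 10 ** (1.5 * magnitude + 4.8)

# Each whole-magnitude step multiplies radiated energy by 10**1.5 (~31.6x).
ratio = quake_energy_joules(5.0) / quake_energy_joules(4.0)
print(f"M4.0 releases {quake_energy_joules(4.0):.2e} J; each step is ~{ratio:.1f}x")
```

This is why a magnitude 5.6 event, like the Icelandic quake discussed below, releases hundreds of times more energy than the magnitude 4.0 threshold used in the statistic above.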

Earthquake prediction is the “holy grail” of geophysics, says KAUST’s Dr. Sigurjón Jónsson, Associate Professor of Earth Science and Engineering and Principal Investigator of the Crustal Deformation and InSAR Group. But after some initial optimism among scientists in the 1970s about the reality of predicting earthquakes, ushered in by the successful prediction, within hours, of a major earthquake in China in 1975, several failed predictions have since moved the pendulum towards skepticism from the 1990s onwards.

In a study recently published in Nature Geoscience, a group of Icelandic and Swedish researchers, including Prof. Sigurjón Jónsson, established a correlation between two earthquakes greater than magnitude 5 in North Iceland, in 2012 and 2013, and changes in the chemical composition of area groundwater observed before these tectonic events. The changes included variations in dissolved element concentrations and fluctuations in the proportions of stable isotopes of oxygen and hydrogen.

Can We Really Predict Earthquakes?

The basic common denominator guiding scientists and general observers investigating the predictability of earthquakes is the detection of these noticeable changes before seismic events. Some of these observable precursors are changes in groundwater level, radon gas sometimes coming out from the ground, smaller quakes called foreshocks, and even strange behavior by some animals before large earthquakes.

There are essentially three prevailing schools of thought in earthquake prediction among scientists. There’s a first group of scientists who believe that earthquake prediction is achievable but we simply don’t yet know how to do it reliably. They believe that we may, at some point in the future, be able to give short-term predictions.

Then there’s another class of scientists who believe that we will never be able to predict earthquakes. Their philosophy is that the exact start of earthquakes is simply randomly occurring and that the best thing we can do is to retrofit our houses and make probability forecasts — but no short-term warnings.

The last group, which currently represents a minority of scientists who are not often taken seriously, believes that earthquakes are indeed predictable and that we have the tools to do it.

Following the wave of optimism in the ’70s and ’80s, the interest and confidence of scientists in predicting earthquakes have generally subsided, along with the funding. Scientists now tend to focus mainly on understanding the physics behind earthquakes. As Prof. Jónsson summarizes:

“From geology and from earthquake occurrence today we can more or less see where in the world we have large earthquakes and where we have areas which are relatively safe. Although we cannot make short-term predictions we can make what we call forecasts. We can give probabilities. But short-term predictions are not achievable and may never be. We will see.”

The Message from the Earth’s Cracking Crust

Iceland was an ideal location to conduct the collaborative study undertaken by the scientists from Akureyri University, the University of Iceland, Landsvirkjun (the National Power Company of Iceland), the University of Stockholm, the University of Gothenburg and Karolinska Institutet in Stockholm, and KAUST.

“Iceland is a good testing ground because, geologically speaking, it’s very active. It has erupting volcanoes and it has large earthquakes also happening relatively often compared to many other places. And these areas that are active are relatively accessible,” said Prof. Jónsson.

The team of researchers monitored the chemistry, temperature and pressure in a few water wells in north Iceland more or less continuously for a period of five years. “They have been doing this to form an understanding of the variability of these chemical compounds in the wells; and then possibly associate significant changes to tectonic or major events,” he adds.

Through the five-year data collection period, which began in 2008, they were able to detect perceptible changes in the aquifer system as much as four to six months prior to the two recorded earthquakes: one of a magnitude 5.6 in October 2012 and a second one of magnitude 5.5 in April 2013. Their main observation was that the proportion of young local precipitation water in the geothermal water increased – in proportion to water that fell as rain thousands of years ago (aquifer systems are typically a mix of these two). At the same time, alterations were evident in the dissolved chemicals like sodium, calcium and silicon during that precursor period. Interestingly, the proportion went back to its previous state about three months after the quakes.
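The balance between recent precipitation and ancient groundwater can be estimated with a standard two-endmember mixing model on a conservative tracer such as the oxygen-18 ratio. The sketch below uses made-up endmember values for illustration; they are not the study's measurements.

```python
def young_water_fraction(delta_sample, delta_young, delta_old):
    """Two-endmember mixing: fraction of recent precipitation in a sample,
    from a conservative tracer (e.g. delta-18O). Endmembers must differ."""
    if delta_young == delta_old:
        raise ValueError("endmember values must be distinct")
    return (delta_sample - delta_old) / (delta_young - delta_old)

# Illustrative delta-18O values in per mil (hypothetical numbers).
before = young_water_fraction(-12.0, delta_young=-10.0, delta_old=-13.0)
during = young_water_fraction(-11.4, delta_young=-10.0, delta_old=-13.0)
print(f"young-water fraction: {before:.2f} before vs {during:.2f} in the precursor period")
```

A shift like the one printed here (roughly a third to a half young water) is the kind of signal the researchers attribute to new rainwater entering the aquifer through freshly opened microfractures.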

While the scientists caution that this is not a confirmation that earthquake predictions are now feasible, the observations are promising and worthy of further investigation involving more exhaustive monitoring in additional locations. But, statistically speaking, it would be very difficult to dissociate these changes in the groundwater’s chemical composition from the two earthquakes.

A change in the ratio between old and new water in the aquifer system is important because it points to the development of small fractures as stress builds up on the rocks before an earthquake: new rainwater seeps down through the newly formed microfractures in the rock. Prof. Sigurjón Jónsson illustrates this as follows:

“It’s similar to when you take a piece of wood and you start to bend it. At some point before it snaps it starts to crack a little; and then poof it snaps. Something similar might be happening in the earth. Meaning that just before an earthquake happens, if you start to have a lot of micro-fracturing you will have water having an easier time to move around in the rocks.”

The team will be presenting their findings at the American Geophysical Union (AGU) meeting in San Francisco in December 2014. “It will be interesting to see the reaction there,” said Prof. Jónsson.

Source: KAUST News

Electrical and computer engineering Professor Barry Van Veen wears an electrode net used to monitor brain activity via EEG signals. His research could help untangle what happens in the brain during sleep and dreaming.

Photo Credit: Nick Berard/UW-Madison

Imagination, reality flow in opposite directions in the brain

By Scott Gordon


As real as that daydream may seem, its path through your brain runs opposite reality.

Aiming to discern discrete neural circuits, researchers at the University of Wisconsin–Madison have tracked electrical activity in the brains of people who alternately imagined scenes or watched videos.

“A really important problem in brain research is understanding how different parts of the brain are functionally connected. What areas are interacting? What is the direction of communication?” says Barry Van Veen, a UW-Madison professor of electrical and computer engineering. “We know that the brain does not function as a set of independent areas, but as a network of specialized areas that collaborate.”

Van Veen, along with Giulio Tononi, a UW-Madison psychiatry professor and neuroscientist, Daniela Dentico, a scientist at UW–Madison’s Waisman Center, and collaborators from the University of Liège in Belgium, published results recently in the journal NeuroImage. Their work could lead to the development of new tools to help Tononi untangle what happens in the brain during sleep and dreaming, while Van Veen hopes to apply the study’s new methods to understand how the brain uses networks to encode short-term memory.

During imagination, the researchers found an increase in the flow of information from the parietal lobe of the brain to the occipital lobe — from a higher-order region that combines inputs from several of the senses out to a lower-order region.


In contrast, visual information taken in by the eyes tends to flow from the occipital lobe — which makes up much of the brain’s visual cortex — “up” to the parietal lobe.

“There seems to be a lot in our brains and animal brains that is directional, that neural signals move in a particular direction, then stop, and start somewhere else,” says Van Veen. “I think this is really a new theme that had not been explored.”

The researchers approached the study as an opportunity to test the power of electroencephalography (EEG) — which uses sensors on the scalp to measure underlying electrical activity — to discriminate between different parts of the brain’s network.

Brains are rarely quiet, though, and EEG tends to record plenty of activity not necessarily related to a particular process researchers want to study.

To zero in on a set of target circuits, the researchers asked their subjects to watch short video clips before trying to replay the action from memory in their heads. Others were asked to imagine traveling on a magic bicycle — focusing on the details of shapes, colors and textures — before watching a short video of silent nature scenes.

Using an algorithm Van Veen developed to parse the detailed EEG data, the researchers were able to compile strong evidence of the directional flow of information.
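The article does not describe the algorithm itself, but directed ("effective") connectivity from EEG is commonly estimated with Granger-style measures: signal X is said to drive signal Y if X's past improves the prediction of Y beyond what Y's own past provides. The following is a minimal sketch of that idea on synthetic data, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic pair of signals in which x drives y with a one-sample lag.
n = 5000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

def granger_var_ratio(src, dst, lag=1):
    """Residual variance of dst predicted from its own past alone,
    divided by the residual variance when src's past is added.
    A ratio well above 1 suggests information flow src -> dst."""
    target = dst[lag:]
    past_dst = dst[:-lag]
    past_src = src[:-lag]
    ones = np.ones_like(target)
    # restricted model: dst's own past only
    A = np.column_stack([past_dst, ones])
    res_r = target - A @ np.linalg.lstsq(A, target, rcond=None)[0]
    # full model: dst's past plus src's past
    B = np.column_stack([past_dst, past_src, ones])
    res_f = target - B @ np.linalg.lstsq(B, target, rcond=None)[0]
    return res_r.var() / res_f.var()

print(granger_var_ratio(x, y))  # well above 1: x drives y
print(granger_var_ratio(y, x))  # near 1: little flow from y to x
```

Real EEG analyses must also contend with volume conduction, multiple channels and many lags, which is why sensitive signal-processing methods are the crux of the study.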

“We were very interested in seeing if our signal-processing methods were sensitive enough to discriminate between these conditions,” says Van Veen, whose work is supported by the National Institute of Biomedical Imaging and Bioengineering. “These types of demonstrations are important for gaining confidence in new tools.”

Source: UW-Madison News

Live longer? Save the planet? Better diet could nail both

New study shows healthier food choices could dramatically decrease environmental costs of agriculture


As cities and incomes increase around the world, so does consumption of refined sugars, refined fats, oils and resource- and land-intense agricultural products such as beef. A new study led by University of Minnesota ecologist David Tilman shows how a shift away from this trajectory and toward healthier traditional Mediterranean, pescatarian or vegetarian diets could not only boost human lifespan and quality of life, but also slash greenhouse gas emissions and save habitat for endangered species.

The study, published in the November 12 online edition of Nature by Tilman and graduate student Michael Clark, synthesized data on environmental costs of food production, diet trends, relationships between diet and health, and population growth. Their integrated analysis painted a striking picture of the human and environmental health costs of our current diet trajectory as well as how strategically modifying food choices could reduce not only incidence of type II diabetes, coronary heart disease and other chronic diseases, but global agricultural greenhouse gas emissions and habitat degradation, as well.

“We showed that the same dietary changes that can add about a decade to our lives can also prevent massive environmental damage,” said Tilman, a professor in the University’s College of Biological Sciences and resident fellow at the Institute on the Environment. “In particular, if the world were to adopt variations on three common diets, health would be greatly increased at the same time global greenhouse gas emissions were reduced by an amount equal to the current greenhouse gas emissions of all cars, trucks, planes, trains and ships. In addition, this dietary shift would prevent the destruction of an area of tropical forests and savannas as large as half of the United States.”

The researchers found that, as incomes increased between 1961 and 2009, people consumed more meat protein, empty calories and total calories per person. When these trends were combined with forecasts of population growth and income growth for the coming decades, the study predicted that diets in 2050 would contain fewer servings of fruits and vegetables, but about 60 percent more empty calories and 25 to 50 percent more pork, poultry, beef, dairy and eggs — a suite of changes that would increase incidence of type II diabetes, coronary heart disease and some cancers. Using life-cycle analyses of various food production systems, the study also calculated that, if current trends prevail, these 2050 diets would also lead to an 80 percent increase in global greenhouse gas emissions from food production as well as habitat destruction due to land clearing for agriculture around the world.

The study then compared health impacts of the global omnivorous diet with those reported for traditional Mediterranean, pescatarian and vegetarian diets. Adopting these alternative diets could reduce incidence of type II diabetes by about 25 percent, cancer by about 10 percent and death from heart disease by about 20 percent relative to the omnivore diet. Additionally, the adoption of these or similar alternative diets would prevent most or all of the increased greenhouse gas emissions and habitat destruction that would otherwise be caused by both current diet trends and increased global population.

The authors acknowledged that numerous factors go into diet choice, but also pointed out that the alternative diets already are part of the lives of countless people around the world. Noting that variations on the diets used in the scenario could potentially show even greater benefit, they concluded that “the evaluation and implementation of dietary solutions to the tightly linked diet-environment-health trilemma is a global challenge, and opportunity, of great environmental and public health importance.”

Tilman is a Regents Professor and McKnight Presidential Chair in Ecology in the College of Biological Sciences’ Department of Ecology, Evolution and Behavior and a resident fellow in the University of Minnesota’s Institute on the Environment, which seeks lasting solutions to Earth’s biggest challenges through research, partnerships and leadership development. Clark is currently a doctoral student in the College of Food, Agricultural and Natural Resource Sciences.

Source: University of Minnesota

Time to Wake Up: Artist’s impression of NASA’s New Horizons spacecraft, currently en route to Pluto. Operators at the Johns Hopkins University Applied Physics Laboratory are preparing to “wake” the spacecraft from electronic hibernation on Dec. 6, when the probe will be more than 2.9 billion miles from Earth. (Credit: NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute)

New Horizons Set to Wake Up for Pluto Encounter

NASA’s New Horizons spacecraft comes out of hibernation for the last time on Dec. 6. Between now and then, while the Pluto-bound probe enjoys three more weeks of electronic slumber, work on Earth is well under way to prepare the spacecraft for a six-month encounter with the dwarf planet that begins in January.

“New Horizons is healthy and cruising quietly through deep space – nearly three billion miles from home – but its rest is nearly over,” says Alice Bowman, New Horizons mission operations manager at the Johns Hopkins University Applied Physics Laboratory (APL) in Laurel, Md. “It’s time for New Horizons to wake up, get to work, and start making history.”

Since launching in January 2006, New Horizons has spent 1,873 days in hibernation – about two-thirds of its flight time – spread over 18 separate hibernation periods from mid-2007 to late 2014 that ranged from 36 days to 202 days long.

In hibernation mode much of the spacecraft is unpowered; the onboard flight computer monitors system health and broadcasts a weekly beacon-status tone back to Earth. On average, operators woke New Horizons just over twice each year to check out critical systems, calibrate instruments, gather science data, rehearse Pluto-encounter activities and perform course corrections when necessary.

New Horizons pioneered routine cruise-flight hibernation for NASA. Not only has hibernation reduced wear and tear on the spacecraft’s electronics, it has also lowered operations costs and freed up NASA Deep Space Network tracking and communication resources for other missions.

Ready to Go

Next month’s wake-up call was preprogrammed into New Horizons’ on-board computer in August, commanding it to come out of hibernation at 3 p.m. EST on Dec. 6. About 90 minutes later New Horizons will transmit word to Earth that it’s in “active” mode; those signals, even traveling at light speed, will need four hours and 25 minutes to reach home. Confirmation should reach the mission operations team at APL around 9:30 p.m. EST. At that time New Horizons will be more than 2.9 billion miles from Earth, and just 162 million miles – less than twice the distance between Earth and the sun – from Pluto.
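The quoted signal delay is just the one-way light-travel time. A quick check, taking "more than 2.9 billion miles" to be roughly 2.96 billion:

```python
LIGHT_MILES_PER_SECOND = 186_282  # speed of light, miles per second

def one_way_light_time_hours(distance_miles):
    """Hours for a radio signal to cover the given distance at light speed."""
    return distance_miles / LIGHT_MILES_PER_SECOND / 3600.0

hours = one_way_light_time_hours(2.96e9)
print(round(hours, 2))  # ≈ 4.41 hours, i.e. about 4 hours 25 minutes
```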


After several days of collecting navigation-tracking data, downloading and analyzing the cruise science and spacecraft housekeeping data stored on New Horizons’ digital recorders, the mission team will begin activities that include conducting final tests on the spacecraft’s science instruments and operating systems, and building and testing the computer-command sequences that will guide New Horizons through its flight to and reconnaissance of the Pluto system. Tops on the mission’s science list are characterizing the global geology and topography of Pluto and its large moon Charon, mapping their surface compositions and temperatures, examining Pluto’s atmospheric composition and structure, studying Pluto’s smaller moons and searching for new moons and rings.

New Horizons’ seven-instrument science payload, developed under direction of Southwest Research Institute, includes advanced imaging infrared and ultraviolet spectrometers, a compact multicolor camera, a high-resolution telescopic camera, two powerful particle spectrometers, a space-dust detector (designed and built by students at the University of Colorado) and two radio science experiments. The entire spacecraft, drawing electricity from a single radioisotope thermoelectric generator, operates on less power than a pair of 100-watt light bulbs.

Distant observations of the Pluto system begin Jan. 15 and will continue until late July 2015; closest approach to Pluto is July 14.

“We’ve worked years to prepare for this moment,” says Mark Holdridge, New Horizons encounter mission manager at APL. “New Horizons might have spent most of its cruise time across nearly three billion miles of space sleeping, but our team has done anything but, conducting a flawless flight past Jupiter just a year after launch, putting the spacecraft through annual workouts, plotting out each step of the Pluto flyby and even practicing the entire Pluto encounter on the spacecraft. We are ready to go.”

“The final hibernation wake up Dec. 6 signifies the end of an historic cruise across the entirety of our planetary system,” added New Horizons Principal Investigator Alan Stern, of the Southwest Research Institute. “We are almost on Pluto’s doorstep!”

The Johns Hopkins Applied Physics Laboratory manages the New Horizons mission for NASA’s Science Mission Directorate. Alan Stern, of the Southwest Research Institute (SwRI) is the principal investigator and leads the mission; SwRI leads the science team, payload operations, and encounter science planning. New Horizons is part of the New Frontiers Program managed by NASA’s Marshall Space Flight Center in Huntsville, Ala. APL designed, built and operates the New Horizons spacecraft.

Source: JHUAPL

This artist's impression shows schematically the mysterious alignments between the spin axes of quasars and the large-scale structures that they inhabit that observations with ESO’s Very Large Telescope have revealed. These alignments are over billions of light-years and are the largest known in the Universe.

The large-scale structure is shown in blue and quasars are marked in white with the rotation axes of their black holes indicated with a line.

This picture is for illustration only and does not depict the real distribution of galaxies and quasars.

Credit:

ESO/M. Kornmesser

Spooky Alignment of Quasars Across Billions of Light-years

VLT reveals alignments between supermassive black hole axes and large-scale structure


New observations with ESO’s Very Large Telescope (VLT) in Chile have revealed alignments over the largest structures ever discovered in the Universe. A European research team has found that the rotation axes of the central supermassive black holes in a sample of quasars are parallel to each other over distances of billions of light-years. The team has also found that the rotation axes of these quasars tend to be aligned with the vast structures in the cosmic web in which they reside.

Quasars are galaxies with very active supermassive black holes at their centres. These black holes are surrounded by spinning discs of extremely hot material that is often spewed out in long jets along their axes of rotation. Quasars can shine more brightly than all the stars in the rest of their host galaxies put together.


A team led by Damien Hutsemékers from the University of Liège in Belgium used the FORS instrument on the VLT to study 93 quasars that were known to form huge groupings spread over billions of light-years, seen at a time when the Universe was about one third of its current age.

“The first odd thing we noticed was that some of the quasars’ rotation axes were aligned with each other — despite the fact that these quasars are separated by billions of light-years,” said Hutsemékers.

The team then went further and looked to see if the rotation axes were linked, not just to each other, but also to the structure of the Universe on large scales at that time.

When astronomers look at the distribution of galaxies on scales of billions of light-years they find that they are not evenly distributed. They form a cosmic web of filaments and clumps around huge voids where galaxies are scarce. This intriguing and beautiful arrangement of material is known as large-scale structure.

The new VLT results indicate that the rotation axes of the quasars tend to be parallel to the large-scale structures in which they find themselves. So, if the quasars are in a long filament then the spins of the central black holes will point along the filament. The researchers estimate that the probability that these alignments are simply the result of chance is less than 1%.

“A correlation between the orientation of quasars and the structure they belong to is an important prediction of numerical models of evolution of our Universe. Our data provide the first observational confirmation of this effect, on scales much larger than what had been observed to date for normal galaxies,” adds Dominique Sluse of the Argelander-Institut für Astronomie in Bonn, Germany and University of Liège.

The team could not see the rotation axes or the jets of the quasars directly. Instead they measured the polarisation of the light from each quasar and, for 19 of them, found a significantly polarised signal. The direction of this polarisation, combined with other information, could be used to deduce the angle of the accretion disc and hence the direction of the spin axis of the quasar.
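The sub-1% chance probability quoted above is the kind of number a Monte Carlo test of axial alignment produces. The sketch below is a generic illustration on made-up angles, not the statistic used in the paper: position angles of axes repeat every 180°, so they are doubled before computing a Rayleigh-style resultant length.

```python
import numpy as np

rng = np.random.default_rng(1)

def alignment_p_value(angles_deg, n_sim=10_000):
    """Monte Carlo test of whether a set of axis orientations is more
    aligned than chance. Axes are 180-degree periodic, so the angles are
    doubled; the mean resultant length R is large when the axes cluster."""
    doubled = np.deg2rad(np.asarray(angles_deg)) * 2.0
    r_obs = np.abs(np.mean(np.exp(1j * doubled)))
    sims = rng.uniform(0.0, 2.0 * np.pi, size=(n_sim, len(angles_deg)))
    r_sim = np.abs(np.mean(np.exp(1j * sims), axis=1))
    return float(np.mean(r_sim >= r_obs))  # fraction of chance trials at least this aligned

# 19 axis angles clustered near 40 degrees (strong alignment)
aligned = rng.normal(40.0, 8.0, size=19) % 180.0
print(alignment_p_value(aligned))    # very small: unlikely by chance

# 19 axis angles drawn uniformly (no alignment)
scattered = rng.uniform(0.0, 180.0, size=19)
print(alignment_p_value(scattered))  # typically much larger
```

Nineteen objects is a small sample, which is exactly why a careful chance-probability estimate matters before claiming an alignment.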

“The alignments in the new data, on scales even bigger than current predictions from simulations, may be a hint that there is a missing ingredient in our current models of the cosmos,” concludes Dominique Sluse.

More information

This research was presented in a paper entitled “Alignment of quasar polarizations with large-scale structures“, by D. Hutsemékers et al., to appear in the journal Astronomy & Astrophysics on 19 November 2014.

The team is composed of D. Hutsemékers (Institut d’Astrophysique et de Géophysique, Université de Liège, Liège, Belgium), L. Braibant (Liège), V. Pelgrims (Liège) and D. Sluse (Argelander-Institut für Astronomie, Bonn, Germany; Liège).

ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 15 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is the European partner of a revolutionary astronomical telescope ALMA, the largest astronomical project in existence. ESO is currently planning the 39-metre European Extremely Large optical/near-infrared Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

Source: ESO


The mass difference spectrum: the LHCb result shows strong evidence of the existence of two new particles, the Xi_b'- (first peak) and Xi_b*- (second peak), with the very high-level confidence of 10 sigma. The black points are the signal sample and the hatched red histogram is a control sample. The blue curve represents a model including the two new particles, fitted to the data. Delta_m is the difference between the mass of the Xi_b0 pi- pair and the sum of the individual masses of the Xi_b0 and pi-. INSET: Detail of the Xi_b'- region plotted with a finer binning.
Credit: CERN

LHCb experiment observes two new baryon particles never seen before

Geneva 19 November 2014. Today the collaboration for the LHCb experiment at CERN’s[1] Large Hadron Collider announced the discovery of two new particles in the baryon family. The particles, known as the Xi_b’- and Xi_b*-, were predicted to exist by the quark model but had never been seen before. A related particle, the Xi_b*0, was found by the CMS experiment at CERN in 2012. The LHCb collaboration submitted a paper reporting the finding to Physical Review Letters.

Like the well-known protons that the LHC accelerates, the new particles are baryons made from three quarks bound together by the strong force. The types of quarks are different, though: the new Xi_b particles both contain one beauty (b), one strange (s), and one down (d) quark. Thanks to the heavyweight b quarks, they are more than six times as massive as the proton. But the particles are more than just the sum of their parts: their mass also depends on how they are configured. Each of the quarks has an attribute called “spin”. In the Xi_b’- state, the spins of the two lighter quarks point in the opposite direction to the b quark, whereas in the Xi_b*- state they are aligned. This difference makes the Xi_b*- a little heavier.
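The Delta_m variable defined in the figure caption is an invariant-mass difference. The sketch below shows how such a quantity is computed from four-momenta; the masses are approximate published values and the momenta are made up purely for illustration.

```python
import math

def invariant_mass(p4s):
    """Invariant mass of a system of four-momenta (E, px, py, pz), in GeV."""
    e = sum(p[0] for p in p4s)
    px = sum(p[1] for p in p4s)
    py = sum(p[2] for p in p4s)
    pz = sum(p[3] for p in p4s)
    return math.sqrt(e**2 - px**2 - py**2 - pz**2)

def along_z(mass, pz):
    """Four-momentum of a particle of the given mass moving along z."""
    return (math.sqrt(mass**2 + pz**2), 0.0, 0.0, pz)

M_XI_B0 = 5.7918  # GeV, approximate Xi_b0 mass
M_PION = 0.13957  # GeV, charged-pion mass

# Delta_m as in the caption: invariant mass of the Xi_b0 pi- pair
# minus the two individual masses. Momenta here are illustrative only.
pair = [along_z(M_XI_B0, 10.0), along_z(M_PION, 0.5)]
delta_m = invariant_mass(pair) - M_XI_B0 - M_PION
print(round(delta_m, 3))  # ≈ 0.032 GeV for these made-up momenta
```

A small Delta_m, as Charles notes below, means the decay releases little kinetic energy, which makes the signal harder to spot above background.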

“Nature was kind and gave us two particles for the price of one,” said Matthew Charles of the CNRS’s LPNHE laboratory at Paris VI University. “The Xi_b’- is very close in mass to the sum of its decay products: if it had been just a little lighter, we wouldn’t have seen it at all using the decay signature that we were looking for.”


“This is a very exciting result. Thanks to LHCb’s excellent hadron identification, which is unique among the LHC experiments, we were able to separate a very clean and strong signal from the background,” said Steven Blusk from Syracuse University in New York. “It demonstrates once again the sensitivity and how precise the LHCb detector is.”

As well as the masses of these particles, the research team studied their relative production rates, their widths – a measure of how unstable they are – and other details of their decays. The results match up with predictions based on the theory of Quantum Chromodynamics (QCD).

QCD is part of the Standard Model of particle physics, the theory that describes the fundamental particles of matter, how they interact and the forces between them. Testing QCD at high precision is key to refining our understanding of quark dynamics, models of which are tremendously difficult to calculate.

“If we want to find new physics beyond the Standard Model, we need first to have a sharp picture,” said LHCb’s physics coordinator Patrick Koppenburg from Nikhef Institute in Amsterdam. “Such high precision studies will help us to differentiate between Standard Model effects and anything new or unexpected in the future.”

The measurements were made with the data taken at the LHC during 2011-2012. The LHC is currently being prepared – after its first long shutdown – to operate at higher energies and with more intense beams. It is scheduled to restart by spring 2015.

Further information

Link to the paper on arXiv: http://arxiv.org/abs/1411.4849
More about the result on the LHCb collaboration website: http://lhcb-public.web.cern.ch/lhcb-public/Welcome.html#StrBeaBa
Observation of a new Xi_b*0 beauty particle, on the CMS collaboration website: http://cms.web.cern.ch/news/observation-new-xib0-beauty-particle

Footnote(s)

1. CERN, the European Organization for Nuclear Research, is the world’s leading laboratory for particle physics. It has its headquarters in Geneva. At present, its Member States are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Israel, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland and the United Kingdom. Romania is a Candidate for Accession. Serbia is an Associate Member in the pre-stage to Membership. India, Japan, the Russian Federation, the United States of America, Turkey, the European Commission and UNESCO have Observer Status.

Source: CERN