Tag Archives: cosmology

The powerful gravity of a galaxy embedded in a massive cluster of galaxies in this Hubble Space Telescope image is producing multiple images of a single distant supernova far behind it. Both the galaxy and the galaxy cluster are acting like a giant cosmic lens, bending and magnifying light from the supernova behind them, an effect called gravitational lensing.

The image shows the galaxy's location within a hefty cluster of galaxies called MACS J1149.6+2223, located more than 5 billion light-years away. In the enlarged inset view of the galaxy, the arrows point to the multiple copies of the exploding star, dubbed Supernova Refsdal, located 9.3 billion light-years from Earth. The images are arranged around the galaxy in a cross-shaped pattern called an Einstein Cross. The blue streaks wrapping around the galaxy are the stretched images of the supernova's host spiral galaxy, which has been distorted by the warping of space.

The four images were spotted on Nov. 11, 2014. This Hubble image combines data from three months of observations taken in visible light by the Advanced Camera for Surveys and in near-infrared light by the Wide Field Camera 3.

Object Names: SN Refsdal, MACS J1149.6+2223


Credit: NASA, ESA, and S. Rodney (JHU) and the FrontierSN team; T. Treu (UCLA), P. Kelly (UC Berkeley), and the GLASS team; J. Lotz (STScI) and the Frontier Fields team; M. Postman (STScI) and the CLASH team; and Z. Levay (STScI)

Significant progress in dark matter studies: Hubble Sees Supernova Split into Four Images by Cosmic Lens

Some of astronomy’s biggest goals include the study of dark matter and dark energy. These two phenomena were indirectly observed in the 20th century, and questions about their nature still puzzle us. Astronomers, cosmologists, particle physicists, theoretical physicists and researchers in related areas are working hard to find more clues about the nature of dark matter and dark energy, which together make up around 95% of our universe.


Astronomers using NASA’s Hubble Space Telescope have spotted for the first time a distant supernova split into four images. The multiple images of the exploding star are caused by the powerful gravity of a foreground elliptical galaxy embedded in a massive cluster of galaxies.

This unique observation will help astronomers refine their estimates of the amount and distribution of dark matter in the lensing galaxy and cluster. Dark matter cannot be seen directly but is believed to make up most of the universe’s mass.

The gravity from both the elliptical galaxy and the galaxy cluster distorts and magnifies the light from the supernova behind them, an effect called gravitational lensing. First predicted by Albert Einstein, this effect is similar to a glass lens bending light to magnify and distort the image of an object behind it. The multiple images are arranged around the elliptical galaxy in a cross-shaped pattern called an Einstein Cross, a name originally given to a particular multiply imaged quasar, the bright core of an active galaxy.

The elliptical galaxy and its cluster, MACS J1149.6+2223, are 5 billion light-years from Earth. The supernova behind it is 9.3 billion light-years away.

Although astronomers have discovered dozens of multiply imaged galaxies and quasars, they have never seen a stellar explosion resolved into several images. “It really threw me for a loop when I spotted the four images surrounding the galaxy — it was a complete surprise,” said Patrick Kelly of the University of California, Berkeley, a member of the Grism Lens Amplified Survey from Space (GLASS) collaboration. The GLASS group is working with the Frontier Field Supernova (FrontierSN) team to analyze the exploding star. Kelly is also the lead author on the science paper, which will appear on March 6 in a special issue of the journal Science celebrating the centenary of Albert Einstein’s Theory of General Relativity.

When the four images fade away, astronomers predict they will have a rare opportunity to catch a rerun of the supernova. This is because the current four-image pattern is only one part of the lensing display. The supernova may have appeared as a single image some 20 years ago elsewhere in the cluster field, and it is expected to reappear once more within the next five years.

This prediction is based on computer models of the cluster, which describe the various paths the supernova light is taking through the maze of clumpy dark matter in the galactic grouping. Each image takes a different route through the cluster and arrives at a different time, due, in part, to differences in the length of the pathways the light follows to reach Earth. The four supernova images captured by Hubble, for example, appeared within a few days or weeks of each other.
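The geometric part of those arrival-time differences is simple arithmetic: an image whose light path is longer arrives later by the extra path length divided by the speed of light. A minimal sketch, with invented path differences (real cluster models also include the gravitational Shapiro delay, which this toy ignores):

```python
# Hedged sketch of the geometric piece of a lensing time delay:
# delta_t = delta_L / c.  The extra path lengths below are invented
# for illustration; they are not from the actual cluster model.

C = 299_792_458.0   # speed of light, m/s
DAY = 86_400.0      # seconds per day

def delay_days(extra_path_m: float) -> float:
    """Arrival delay in days for a ray travelling extra_path_m extra metres."""
    return extra_path_m / C / DAY

# Hypothetical extra path lengths for three images, relative to the
# first-arriving one:
for extra in (2.6e13, 1.3e14, 5.2e14):   # metres
    print(f"{delay_days(extra):.1f} days")   # roughly 1, 5 and 20 days
```

Delays of days to weeks, as quoted above for the four Hubble images, correspond to path differences of only light-days over a journey of billions of light-years.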

The supernova’s various light paths are analogous to several trains that leave a station at the same time, all traveling at the same speed and bound for the same location. Each train, however, takes a different route, and the distance for each route is not the same. Some trains travel over hills. Others go through valleys, and still others chug around mountains. Because the trains travel over different track lengths across different terrain, they do not arrive at their destination at the same time. Similarly, the supernova images do not appear at the same time because some of the light is delayed by traveling around bends created by the gravity of dense dark matter in the intervening galaxy cluster.

“Our model for the dark matter in the cluster gives us the prediction of when the next image will appear because it tells us how long each train track is, which correlates with time,” said Steve Rodney of the Johns Hopkins University in Baltimore, Maryland, leader of the FrontierSN team. “We already missed one that we think appeared about 20 years ago, and we found these four images after they had already appeared. The prediction of this future image is the one that is most exciting because we might be able to catch it. We hope to come back to this field with Hubble, and we’ll keep looking to see when that expected next image appears.”

Measuring the time delays between images offers clues to the type of warped-space terrain the supernova’s light had to cover and will help the astronomers fine-tune the models that map out the cluster’s mass. “We will measure the time delays, and we’ll go back to the models and compare them to the model predictions of the light paths,” Kelly said. “The lens modelers, such as Adi Zitrin (California Institute of Technology) from our team, will then be able to adjust their models to more accurately recreate the landscape of dark matter, which dictates the light travel time.”

While making a routine search of the GLASS team’s data, Kelly spotted the four images of the exploding star on Nov. 11, 2014. The FrontierSN and GLASS teams have been searching for such highly magnified explosions since 2013, and this object is their most spectacular discovery. The supernova appears about 20 times brighter than its natural brightness, due to the combined effects of two overlapping lenses. The dominant lensing effect is from the massive galaxy cluster, which focuses the supernova light along at least three separate paths. A secondary lensing effect occurs when one of those light paths happens to be precisely aligned with a specific elliptical galaxy within the cluster. “The dark matter of that individual galaxy then bends and refocuses the light into four more paths,” Rodney explained, “generating the rare Einstein Cross pattern we are currently observing.”
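Because the two lenses act in series, their magnifications multiply. A toy illustration of how the quoted ~20x brightening could decompose into a cluster-scale and a galaxy-scale factor (the individual factors below are invented; only the 20x total comes from the text):

```python
# Magnifications of lenses in series multiply.  The split into 4x and 5x
# is a made-up example; only the ~20x total is quoted in the article.
import math

mu_cluster, mu_galaxy = 4.0, 5.0        # hypothetical individual boosts
mu_total = mu_cluster * mu_galaxy       # 20x brighter overall

# Astronomers often express magnification in magnitudes:
delta_m = 2.5 * math.log10(mu_total)
print(f"total magnification {mu_total:.0f}x = {delta_m:.2f} mag brighter")
```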

The two teams spent a week analyzing the object’s light, confirming it was the signature of a supernova. They then turned to the W.M. Keck Observatory on Mauna Kea, in Hawaii, to measure the distance to the supernova’s host galaxy.

The astronomers nicknamed the supernova Refsdal in honor of Norwegian astronomer Sjur Refsdal, who, in 1964, first proposed using time-delayed images from a lensed supernova to study the expansion of the universe. “Astronomers have been looking to find one ever since,” said Tommaso Treu of the University of California, Los Angeles, the GLASS project’s principal investigator. “The long wait is over!”

The Frontier Fields survey is a three-year program that uses Hubble and the gravitational-lensing effects of six massive galaxy clusters to probe not only what is inside the clusters but also what is beyond them. The three-year FrontierSN program studies supernovae that appear in and around the galaxy clusters of the Frontier Fields and GLASS surveys. The GLASS survey is using Hubble’s spectroscopic capabilities to study remote galaxies through the cosmic telescopes of 10 massive galaxy clusters, including the six in the Frontier Fields.

Supernova Refsdal and Galaxy Cluster MACS J1149.6+2223
Source: Hubblesite.org


Polarisation of the Cosmic Microwave Background

Credit: ESA/PLANCK

Planck Reveals First Stars Were Formed Much Later Than Previously Thought

New maps from ESA’s Planck satellite uncover the ‘polarised’ light from the early Universe across the entire sky, revealing that the first stars formed much later than previously thought.

The history of our Universe is a 13.8 billion-year tale that scientists endeavour to read by studying the planets, asteroids, comets and other objects in our Solar System, and gathering light emitted by distant stars, galaxies and the matter spread between them.


A major source of information used to piece together this story is the Cosmic Microwave Background, or CMB, the fossil light resulting from a time when the Universe was hot and dense, only 380 000 years after the Big Bang.

Thanks to the expansion of the Universe, we see this light today covering the whole sky at microwave wavelengths.
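The numbers behind this shift are a standard textbook calculation: expansion stretches the light's wavelength, cooling the blackbody spectrum from roughly 3000 K at release to the 2.725 K measured today. A back-of-envelope sketch:

```python
# Why the fossil light is now in the microwave: expansion stretches
# wavelengths, cooling the blackbody spectrum.  Values are standard
# textbook numbers, not Planck-specific results.

T_EMIT = 3000.0    # K, approximate temperature at recombination
T_NOW = 2.725      # K, measured CMB temperature today
WIEN_B = 2.898e-3  # m*K, Wien displacement constant

stretch = T_EMIT / T_NOW       # wavelength stretch factor, 1 + redshift
peak_then = WIEN_B / T_EMIT    # ~1 micron: near-infrared/visible light
peak_now = WIEN_B / T_NOW      # ~1 mm: microwaves

print(f"stretch factor 1+z ~ {stretch:.0f}")
print(f"peak wavelength then {peak_then*1e6:.2f} um, now {peak_now*1e3:.2f} mm")
```

The factor of about 1100 is why light released in the visible/near-infrared now arrives as microwaves.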

Between 2009 and 2013, Planck surveyed the sky to study this ancient light in unprecedented detail. Tiny differences in the background’s temperature trace regions of slightly different density in the early cosmos, representing the seeds of all future structure, the stars and galaxies of today.

Scientists from the Planck collaboration have published the results from the analysis of these data in a large number of scientific papers over the past two years, confirming the standard cosmological picture of our Universe with ever greater accuracy.

History of the Universe
Credit: ESA

“But there is more: the CMB carries additional clues about our cosmic history that are encoded in its ‘polarisation’,” explains Jan Tauber, ESA’s Planck project scientist.

“Planck has measured this signal for the first time at high resolution over the entire sky, producing the unique maps released today.”

Light is polarised when it vibrates in a preferred direction, something that may arise as a result of photons – the particles of light – bouncing off other particles. This is exactly what happened when the CMB originated in the early Universe.

Initially, photons were trapped in a hot, dense soup of particles that, by the time the Universe was a few seconds old, consisted mainly of electrons, protons and neutrinos. Owing to the high density, electrons and photons collided with one another so frequently that light could not travel any significant distance before bumping into another electron, making the early Universe extremely ‘foggy’.

Slowly but surely, as the cosmos expanded and cooled, photons and the other particles grew farther apart, and collisions became less frequent.

This had two consequences: electrons and protons could finally combine and form neutral atoms without them being torn apart again by an incoming photon, and photons had enough room to travel, being no longer trapped in the cosmic fog.

Once freed from the fog, the light was set on its cosmic journey that would take it all the way to the present day, where telescopes like Planck detect it as the CMB. But the light also retains a memory of its last encounter with the electrons, captured in its polarisation.

“The polarisation of the CMB also shows minuscule fluctuations from one place to another across the sky: like the temperature fluctuations, these reflect the state of the cosmos at the time when light and matter parted company,” says François Bouchet of the Institut d’Astrophysique de Paris, France.

“This provides a powerful tool to estimate in a new and independent way parameters such as the age of the Universe, its rate of expansion and its essential composition of normal matter, dark matter and dark energy.”

Planck’s polarisation data confirm the details of the standard cosmological picture determined from its measurement of the CMB temperature fluctuations, but add an important new answer to a fundamental question: when were the first stars born?

“After the CMB was released, the Universe was still very different from the one we live in today, and it took a long time until the first stars were able to form,” explains Marco Bersanelli of Università degli Studi di Milano, Italy.

“Planck’s observations of the CMB polarisation now tell us that these ‘Dark Ages’ ended some 550 million years after the Big Bang – more than 100 million years later than previously thought.

“While these 100 million years may seem negligible compared to the Universe’s age of almost 14 billion years, they make a significant difference when it comes to the formation of the first stars.”

The Dark Ages ended as the first stars began to shine. And as their light interacted with gas in the Universe, more and more of the atoms were turned back into their constituent particles: electrons and protons.

This key phase in the history of the cosmos is known as the ‘epoch of reionisation’.

The newly liberated electrons were once again able to collide with the light from the CMB, albeit much less frequently now that the Universe had significantly expanded. Nevertheless, just as they had 380 000 years after the Big Bang, these encounters between electrons and photons left a tell-tale imprint on the polarisation of the CMB.

“From our measurements of the most distant galaxies and quasars, we know that the process of reionisation was complete by the time that the Universe was about 900 million years old,” says George Efstathiou of the University of Cambridge, UK.

“But, at the moment, it is only with the CMB data that we can learn when this process began.”

Planck’s new results are critical, because previous studies of the CMB polarisation seemed to point towards an earlier dawn of the first stars, placing the beginning of reionisation about 450 million years after the Big Bang.

This posed a problem. Very deep images of the sky from the NASA–ESA Hubble Space Telescope have provided a census of the earliest known galaxies in the Universe, which started forming perhaps 300–400 million years after the Big Bang.

However, these would not have been powerful enough to succeed at ending the Dark Ages within 450 million years.

“In that case, we would have needed additional, more exotic sources of energy to explain the history of reionisation,” says Professor Efstathiou.

The new evidence from Planck significantly reduces the problem, indicating that reionisation started later than previously believed, and that the earliest stars and galaxies alone might have been enough to drive it.

This later end of the Dark Ages also implies that it might be easier to detect the very first generation of galaxies with the next generation of observatories, including the James Webb Space Telescope.

But the first stars are definitely not the limit. With the new Planck data released today, scientists are also studying the polarisation of foreground emission from gas and dust in the Milky Way to analyse the structure of the Galactic magnetic field.

The data have also enabled new important insights into the early cosmos and its components, including the intriguing dark matter and the elusive neutrinos, as described in papers also released today.

The Planck data have delved into the even earlier history of the cosmos, all the way to inflation – the brief era of accelerated expansion that the Universe underwent when it was a tiny fraction of a second old. As the ultimate probe of this epoch, astronomers are looking for a signature of gravitational waves triggered by inflation and later imprinted on the polarisation of the CMB.

No direct detection of this signal has yet been achieved, as reported last week. However, when combining the newest all-sky Planck data with those latest results, the limits on the amount of primordial gravitational waves are pushed even further down to achieve the best upper limits yet.

“These are only a few highlights from the scrutiny of Planck’s observations of the CMB polarisation, which is revealing the sky and the Universe in a brand new way,” says Jan Tauber.

“This is an incredibly rich data set and the harvest of discoveries has just begun.”

A series of scientific papers describing the new results was published on 5 February.

The new results from Planck are based on the complete surveys of the entire sky, performed between 2009 and 2013. New data, including temperature maps of the CMB at all nine frequencies observed by Planck and polarisation maps at four frequencies (30, 44, 70 and 353 GHz), are also released today.

The three principal scientific leaders of the Planck mission, Nazzareno Mandolesi, Jean-Loup Puget and Jan Tauber, were recently awarded the 2015 EPS Edison Volta Prize for “directing the development of the Planck payload and the analysis of its data, resulting in the refinement of our knowledge of the temperature fluctuations in the Cosmic Microwave Background as a vastly improved tool for doing precision cosmology at unprecedented levels of accuracy, and consolidating our understanding of the very early universe.”

More about Planck

Launched in 2009, Planck was designed to map the sky in nine frequencies using two state-of-the-art instruments: the Low Frequency Instrument (LFI), which includes three frequency bands in the range 30–70 GHz, and the High Frequency Instrument (HFI), which includes six frequency bands in the range 100–857 GHz.

HFI completed its survey in January 2012, while LFI continued to make science observations until 3 October 2013, before being switched off on 19 October 2013. Seven of Planck’s nine frequency channels were equipped with polarisation-sensitive detectors.
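The channel layout described above can be summarised as a small lookup table. Band centres are the standard published Planck values; the polarisation flags follow the seven-of-nine figure in the text:

```python
# The nine Planck frequency channels as a lookup table.
# GHz band centre: (instrument, polarisation-sensitive detectors)
PLANCK_BANDS = {
    30:  ("LFI", True),
    44:  ("LFI", True),
    70:  ("LFI", True),
    100: ("HFI", True),
    143: ("HFI", True),
    217: ("HFI", True),
    353: ("HFI", True),
    545: ("HFI", False),
    857: ("HFI", False),
}

lfi = [f for f, (inst, _) in PLANCK_BANDS.items() if inst == "LFI"]
pol = [f for f, (_, p) in PLANCK_BANDS.items() if p]
print(lfi)        # [30, 44, 70] — the 30–70 GHz LFI range
print(len(pol))   # 7 polarisation-sensitive channels
```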

The Planck Scientific Collaboration consists of all the scientists who have contributed to the development of the mission, and who participate in the scientific exploitation of the data during the proprietary period.

These scientists are members of one or more of four consortia: the LFI Consortium, the HFI Consortium, the DK-Planck Consortium, and ESA’s Planck Science Office. The two European-led Planck Data Processing Centres are located in Paris, France and Trieste, Italy.

The LFI consortium is led by N. Mandolesi, Università degli Studi di Ferrara, Italy (deputy PI: M. Bersanelli, Università degli Studi di Milano, Italy), and was responsible for the development and operation of LFI. The HFI consortium is led by J.L. Puget, Institut d’Astrophysique Spatiale in Orsay (CNRS/Université Paris-Sud), France (deputy PI: F. Bouchet, Institut d’Astrophysique de Paris (CNRS/UPMC), France), and was responsible for the development and operation of HFI.

Source: ESA

Quantum computer as detector shows space is not squeezed

By Robert Sanders
Ever since Einstein proposed his special theory of relativity in 1905, physics and cosmology have been based on the assumption that space looks the same in all directions – that it’s not squeezed in one direction relative to another.

A new experiment by UC Berkeley physicists used partially entangled atoms — identical to the qubits in a quantum computer — to demonstrate more precisely than ever before that this is true, to one part in a billion billion.

The classic experiment that inspired Albert Einstein was performed in Cleveland by Albert Michelson and Edward Morley in 1887 and disproved the existence of an “ether” permeating space through which light was thought to move like a wave through water. What it also proved, said Hartmut Häffner, a UC Berkeley assistant professor of physics, is that space is isotropic and that light travels at the same speed up, down and sideways.

“Michelson and Morley proved that space is not squeezed,” Häffner said. “This isotropy is fundamental to all physics, including the Standard Model of physics. If you take away isotropy, the whole Standard Model will collapse. That is why people are interested in testing this.”

The Standard Model of particle physics describes how all fundamental particles interact, and requires that all particles and fields be invariant under Lorentz transformations, and in particular that they behave the same no matter what direction they move.

Häffner and his team conducted an experiment analogous to the Michelson-Morley experiment, but with electrons instead of photons of light. In a vacuum chamber he and his colleagues isolated two calcium ions, partially entangled them as in a quantum computer, and then monitored the electron energies in the ions as Earth rotated over 24 hours.

As the Earth rotates every 24 hours, the orientation of the ions in the quantum computer/detector changes with respect to the Sun’s rest frame. If space were squeezed in one direction and not another, the energies of the electrons in the ions would have shifted with a 12-hour period. (Hartmut Haeffner image)

If space were squeezed in one or more directions, the energy of the electrons would change with a 12-hour period. It didn’t, showing that space is in fact isotropic to one part in a billion billion (10^18), 100 times better than previous experiments involving electrons, and five times better than experiments like Michelson and Morley’s that used light.
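The core of such an analysis, looking for a 12-hour modulation in a 24-hour energy record, can be sketched with a toy least-squares quadrature fit on simulated, signal-free data. The sampling and noise level here are invented; a null result shows up as a fitted amplitude consistent with zero:

```python
# Toy search for a 12-hour sinusoid in simulated (signal-free) data.
# Sampling cadence and noise level are made up for illustration.
import math
import random

random.seed(0)
HOURS = [0.5 * k for k in range(48)]     # 24 h of data, every 30 min
OMEGA = 2 * math.pi / 12.0               # angular frequency, 12 h period

# Simulated isotropic-space measurements: pure Gaussian noise, no signal.
data = [random.gauss(0.0, 1.0) for _ in HOURS]

# Least-squares cos/sin quadrature amplitudes at the 12-hour period
# (the samples evenly cover two full periods, so the fit is exact).
n = len(HOURS)
a = 2.0 / n * sum(d * math.cos(OMEGA * t) for d, t in zip(data, HOURS))
b = 2.0 / n * sum(d * math.sin(OMEGA * t) for d, t in zip(data, HOURS))
amplitude = math.hypot(a, b)
print(f"fitted 12 h amplitude: {amplitude:.3f} (noise sigma = 1)")
```

With no signal present, the fitted amplitude is of order sqrt(2/n) times the noise level, which is what a null test should return.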

The results disprove at least one theory that extends the Standard Model by assuming some anisotropy of space, he said.

Häffner and his colleagues, including former graduate student Thaned Pruttivarasin, now at the Quantum Metrology Laboratory in Saitama, Japan, will report their findings in the Jan. 29 issue of the journal Nature.

Entangled qubits

Häffner came up with the idea of using entangled ions to test the isotropy of space while building quantum computers, which involve using ionized atoms as quantum bits, or qubits, entangling their electron wave functions, and forcing them to evolve to do calculations not possible with today’s digital computers. It occurred to him that two entangled qubits could serve as sensitive detectors of slight disturbances in space.

“I wanted to do the experiment because I thought it was elegant and that it would be a cool thing to apply our quantum computers to a completely different field of physics,” he said. “But I didn’t think we would be competitive with experiments being performed by people working in this field. That was completely out of the blue.”

He hopes to make more sensitive quantum computer detectors using other ions, such as ytterbium, to gain another 10,000-fold increase in the precision measurement of Lorentz symmetry. He is also exploring with colleagues future experiments to detect the spatial distortions caused by the effects of dark matter particles, which are a complete mystery despite comprising 27 percent of the mass of the universe.

“For the first time we have used tools from quantum information to perform a test of fundamental symmetries, that is, we engineered a quantum state which is immune to the prevalent noise but sensitive to the Lorentz-violating effects,” Häffner said. “We were surprised the experiment just worked, and now we have a fantastic new method at hand which can be used to make very precise measurements of perturbations of space.”

Other co-authors are UC Berkeley graduate student Michael Ramm, former UC Berkeley postdoc Michael Hohensee of Lawrence Livermore National Laboratory, and colleagues from the University of Delaware and Maryland and institutions in Russia. The work was supported by the National Science Foundation.

Source: UC Berkeley

Researchers use real data rather than theory to measure the cosmos

For the first time researchers have measured large distances in the Universe using data, rather than calculations related to general relativity.

A research team from Imperial College London and the University of Barcelona has used data from astronomical surveys to measure a standard distance that is central to our understanding of the expansion of the universe.

Previously the size of this ‘standard ruler’ has only been predicted from theoretical models that rely on general relativity to explain gravity at large scales. The new study is the first to measure it using observed data. A standard ruler is an object of a consistent, known physical size, so comparing its actual size with its apparent size on the sky yields a measurement of its distance from Earth.
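The standard-ruler relation itself is one line of small-angle geometry: an object of known physical size L subtending an angle theta lies at (angular-diameter) distance D ≈ L/theta. A minimal sketch with an illustrative angle, not the paper's actual fit:

```python
# Small-angle standard-ruler relation: D ~ L / theta.
# The 4-degree angle below is hypothetical, chosen only to illustrate.
import math

def ruler_distance(size_mpc: float, angle_deg: float) -> float:
    """Angular-diameter distance in Mpc from physical size and apparent angle."""
    return size_mpc / math.radians(angle_deg)

# The BAO scale (~143 Mpc, from the text) seen under a ~4 degree angle:
print(f"{ruler_distance(143.0, 4.0):.0f} Mpc")
```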

“Our research suggests that current methods for measuring distance in the Universe are more complicated than they need to be,” said Professor Alan Heavens from the Department of Physics, Imperial College London who led the study. “Traditionally in cosmology, general relativity plays a central role in most models and interpretations. We have demonstrated that current data are powerful enough to measure the geometry and expansion history of the Universe without relying on calculations relating to general relativity.

“We hope this more data-driven approach, combined with an ever increasing wealth of observational data, could provide more precise measurements that will be useful for future projects that are planning to answer major questions around the acceleration of the Universe and dark energy.”

The standard ruler measured in the research is the baryon acoustic oscillation scale. This is a pattern of a specific length which is imprinted in the clustering of matter created by small variations in density in the very early Universe (about 400,000 years after the Big Bang). The length of this pattern, which is the same today as it was then, is the baryon acoustic oscillation scale.

The team calculated the length to be 143 Megaparsecs (nearly 480 million light years) which is similar to accepted predictions for this distance from models based on general relativity.

Published in Physical Review Letters, the findings of the research suggest it is possible to measure cosmological distances independently from models that rely on general relativity.

Einstein’s theory of general relativity replaced Newton’s law to become the accepted explanation of how gravity behaves at large scales. Many important astrophysics models are based on general relativity, including those dealing with the expansion of the Universe and black holes. However, some unresolved issues surround general relativity. These include its lack of reconciliation with the laws of quantum physics and the need to extrapolate it over many orders of magnitude in scale in order to apply it in cosmological settings. No other physical law has been extrapolated that far without needing adjustment, so its assumptions remain open to question.

Co-author of the study, Professor Raul Jimenez from the University of Barcelona said: “The uncertainties around general relativity have motivated us to develop methods to derive more direct measurements of the cosmos, rather than relying so heavily on inferences from models. For our study we only made some minimal theoretical assumptions such as the symmetry of the Universe and a smooth expansion history.”

Co-author Professor Licia Verde from the University of Barcelona added: “There is a big difference between measuring distance and inferring its value indirectly. Usually in cosmology we can only do the latter and this is one of these rare and precious cases where we can directly measure distance. Most statements in cosmology assume general relativity works and does so on extremely large scales, which means we are often extrapolating figures out of our comfort zone. So it is reassuring to discover that we can make strong and important statements without depending on general relativity and which match previous statements. It gives one confidence that the observations we have of the Universe, as strange and puzzling as they might be, are realistic and sound!”

The research used current data from astronomical surveys on the brightness of exploding stars (supernovae) and on the regular pattern in the clustering of matter (baryonic acoustic oscillations) to measure the size of this ‘standard ruler’. The matter that created this standard ruler formed about 400,000 years after the Big Bang. This period was a time when the physics of the Universe was still relatively simple so the researchers did not need to consider more ‘exotic’ concepts such as dark energy in their measurements.

“In this study we have used measurements that are very clean,” Professor Heavens explained, “And the theory that we do apply comes from a time relatively soon after the Big Bang when the physics was also clean. This means we have what we believe to be a precise method of measurement based on observations of the cosmos. Astrophysics is an incredibly active but changeable field and the support for the different models is liable to change. Even when models are abandoned, measurements of the cosmos will survive. If we can rely on direct measurements based on real observations rather than theoretical models then this is good news for cosmology and astrophysics.”

The research was supported by the Royal Society and the European Research Council.

Source : Imperial College

This artist’s impression depicts the formation of a galaxy cluster in the early Universe. The galaxies are vigorously forming new stars and interacting with each other. Such a scene closely resembles the Spiderweb Galaxy (formally known as MRC 1138-262) and its surroundings, which is one of the best-studied protoclusters.

Credit:

ESO/M. Kornmesser

Universe may face a darker future

Since the discovery of the accelerating expansion of the Universe in 1998 by the High-Z Supernova Search Team led by Prof. Brian Schmidt and Adam Riess, and by the Supernova Cosmology Project led by Prof. Saul Perlmutter, the question of the nature of this expansion and the role of the mysterious dark energy has puzzled many theoretical and observational physicists and astrophysicists.

Another puzzling question in astronomy comes from the unusual behavior of stars orbiting galaxies at higher velocities than can be explained by the visible (baryonic) matter alone. This has led to many new questions about what we call dark matter, another unexplained phenomenon.
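The mismatch can be illustrated with a simple Newtonian estimate: for a star orbiting outside most of a galaxy's visible mass M, the expected circular speed is v = sqrt(GM/r), which falls off with radius, whereas observed rotation curves stay roughly flat. The sketch below uses illustrative, not measured, values:

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
KPC = 3.086e19    # metres per kiloparsec

def keplerian_velocity(mass_kg, radius_m):
    """Circular orbital speed expected from the enclosed mass alone: v = sqrt(GM/r)."""
    return math.sqrt(G * mass_kg / radius_m)

# Illustrative galaxy: ~1e11 solar masses of visible matter
visible_mass = 1e11 * M_SUN
for r_kpc in (10, 20, 40):
    v_kms = keplerian_velocity(visible_mass, r_kpc * KPC) / 1000.0
    print(f"r = {r_kpc:2d} kpc -> expected v ~ {v_kms:.0f} km/s")
# The predicted speed falls as 1/sqrt(r); real rotation curves remain nearly
# flat out to large radii, hinting at additional unseen (dark) mass.
```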


New research offers a novel insight into the nature of dark matter and dark energy and what the future of our Universe might be.

Researchers in Portsmouth and Rome have found hints that dark matter, the cosmic scaffolding on which our Universe is built, is being slowly erased, swallowed up by dark energy.

The findings appear in the journal Physical Review Letters, published by the American Physical Society. In the paper, cosmologists at the Universities of Portsmouth and Rome argue that the latest astronomical data favour a dark energy that grows as it interacts with dark matter, and that this interaction appears to be slowing the growth of structure in the cosmos.

Professor David Wands, Director of Portsmouth's Institute of Cosmology and Gravitation, is one of the research team.

He said: “This study is about the fundamental properties of space-time. On a cosmic scale, this is about our Universe and its fate.

“If the dark energy is growing and dark matter is evaporating we will end up with a big, empty, boring Universe with almost nothing in it.

 

“Dark matter provides a framework for structures to grow in the Universe. The galaxies we see are built on that scaffolding and what we are seeing here, in these findings, suggests that dark matter is evaporating, slowing that growth of structure.”

Cosmology underwent a paradigm shift in 1998 when researchers announced that the rate at which the Universe was expanding was accelerating. The idea of a constant dark energy throughout space-time (the “cosmological constant”) became the standard model of cosmology, but now the Portsmouth and Rome researchers believe they have found a better description, including energy transfer between dark energy and dark matter.
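One common way such an energy transfer is written, as a generic sketch of this class of interacting models rather than the specific equations of the paper, is to add a coupling term Q to the continuity equations for the dark matter and dark energy densities:

```latex
\dot{\rho}_{\rm dm} + 3H\rho_{\rm dm} = -Q, \qquad
\dot{\rho}_{\rm de} + 3H(1+w)\rho_{\rm de} = +Q
```

Here H is the Hubble rate and w the dark energy equation of state; with Q > 0, energy flows from dark matter into dark energy, which would dilute the dark matter component and slow the growth of structure, in line with the behaviour the researchers describe.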

Research students Valentina Salvatelli and Najla Said from the University of Rome worked in Portsmouth with Dr Marco Bruni and Professor Wands, and with Professor Alessandro Melchiorri in Rome. They examined data from a number of astronomical surveys, including the Sloan Digital Sky Survey, and used the growth of structure revealed by these surveys to test different models of dark energy.

Professor Wands said: “Valentina and Najla spent several months here over the summer looking at the consequences of the latest observations. Much more data is available now than was available in 1998 and it appears that the standard model is no longer sufficient to describe all of the data. We think we’ve found a better model of dark energy.

“Since the late 1990s astronomers have been convinced that something is causing the expansion of our Universe to accelerate. The simplest explanation was that empty space – the vacuum – had an energy density that was a cosmological constant. However there is growing evidence that this simple model cannot explain the full range of astronomical data researchers now have access to; in particular the growth of cosmic structure, galaxies and clusters of galaxies, seems to be slower than expected.”

Professor Dragan Huterer, of the University of Michigan, has read the research and said scientists need to take notice of the findings.

He said: “The paper does look very interesting. Any time there is a new development in the dark energy sector we need to take notice since so little is understood about it. I would not say, however, that I am surprised at the results, that they come out different than in the simplest model with no interactions. We’ve known for some months now that there is some problem in all data fitting perfectly to the standard simplest model.”

Source: Materials taken from UoP News