Astronomers have used ALMA to detect a huge mass of glowing stardust in a galaxy seen when the Universe was only four percent of its present age. This galaxy was observed shortly after its formation and is the most distant galaxy in which dust has been detected. This observation is also the most distant detection of oxygen in the Universe. These new results provide fresh insights into the birth and explosive deaths of the very first stars.
An international team of astronomers, led by Nicolas Laporte of University College London, has used the Atacama Large Millimeter/submillimeter Array (ALMA) to observe A2744_YD4, the youngest and most remote galaxy ever seen by ALMA. They were surprised to find that this youthful galaxy contained an abundance of interstellar dust — dust formed by the deaths of an earlier generation of stars.
Follow-up observations using the X-shooter instrument on ESO’s Very Large Telescope confirmed the enormous distance to A2744_YD4. The galaxy appears to us as it was when the Universe was only 600 million years old, during the period when the first stars and galaxies were forming.
“Not only is A2744_YD4 the most distant galaxy yet observed by ALMA,” comments Nicolas Laporte, “but the detection of so much dust indicates early supernovae must have already polluted this galaxy.”
Cosmic dust is mainly composed of silicon, carbon and aluminium, in grains as small as a millionth of a centimetre across. The chemical elements in these grains are forged inside stars and are scattered across the cosmos when the stars die, most spectacularly in supernova explosions, the final fate of short-lived, massive stars. Today, this dust is plentiful and is a key building block in the formation of stars, planets and complex molecules; but in the early Universe — before the first generations of stars died out — it was scarce.
The observations of the dusty galaxy A2744_YD4 were made possible because this galaxy lies behind a massive galaxy cluster called Abell 2744. Because of a phenomenon called gravitational lensing, the cluster acted like a giant cosmic “telescope” to magnify the more distant A2744_YD4 by about 1.8 times, allowing the team to peer far back into the early Universe.
The ALMA observations also detected the glowing emission of ionised oxygen from A2744_YD4. This is the most distant, and hence earliest, detection of oxygen in the Universe, surpassing another ALMA result from 2016.
The detection of dust in the early Universe provides new information on when the first supernovae exploded and hence the time when the first hot stars bathed the Universe in light. Determining the timing of this “cosmic dawn” is one of the holy grails of modern astronomy, and it can be indirectly probed through the study of early interstellar dust.
The team estimates that A2744_YD4 contained an amount of dust equivalent to 6 million times the mass of our Sun, while the galaxy’s total stellar mass — the mass of all its stars — was 2 billion times the mass of our Sun. The team also measured the rate of star formation in A2744_YD4 and found that stars are forming at a rate of 20 solar masses per year — compared to just one solar mass per year in the Milky Way.
“This rate is not unusual for such a distant galaxy, but it does shed light on how quickly the dust in A2744_YD4 formed,” explains Richard Ellis (ESO and University College London), a co-author of the study. “Remarkably, the required time is only about 200 million years — so we are witnessing this galaxy shortly after its formation.”
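As a rough cross-check of that timescale, the galaxy’s stellar mass can be divided by its star-formation rate, using only the figures quoted in this release. A minimal sketch in Python, assuming the rate has been roughly constant:

```python
# Back-of-the-envelope check of the formation timescale quoted above,
# using only numbers from this release and assuming a constant rate.
stellar_mass = 2e9   # total stellar mass of A2744_YD4, in solar masses
sfr = 20.0           # star-formation rate, in solar masses per year

build_up_time = stellar_mass / sfr                  # years to assemble the stars
print(f"{build_up_time / 1e6:.0f} million years")   # -> 100 million years
```

A constant-rate estimate gives about 100 million years; the roughly 200-million-year figure quoted above comes from the team’s more detailed analysis, and the simple estimate agrees with it to within a factor of two.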
This means that significant star formation began approximately 200 million years before the epoch at which the galaxy is being observed. This provides a great opportunity for ALMA to help study the era when the first stars and galaxies “switched on” — the earliest epoch yet probed. Our Sun, our planet and our existence are the products — 13 billion years later — of this first generation of stars. By studying their formation, lives and deaths, we are exploring our origins.
“With ALMA, the prospects for performing deeper and more extensive observations of similar galaxies at these early times are very promising,” says Ellis.
And Laporte concludes: “Further measurements of this kind offer the exciting prospect of tracing early star formation and the creation of the heavier chemical elements even further back into the early Universe.”
Abell 2744 is a massive object, lying 3.5 billion light-years away (redshift 0.308), that is thought to be the result of four smaller galaxy clusters colliding. It has been nicknamed Pandora’s Cluster because of the many strange and different phenomena that were unleashed by the huge collision, which occurred over a period of about 350 million years. The galaxies make up only five percent of the cluster’s mass, while dark matter makes up seventy-five percent, providing the massive gravitational influence necessary to bend and magnify the light of background galaxies. The remaining twenty percent of the total mass is thought to be in the form of hot gas.
This rate means that the total mass of the stars formed every year is equivalent to 20 times the mass of the Sun.
This research was presented in a paper entitled “Dust in the Reionization Era: ALMA Observations of a z = 8.38 Gravitationally-Lensed Galaxy” by Laporte et al., to appear in The Astrophysical Journal Letters.
The team is composed of N. Laporte (University College London, UK), R. S. Ellis (University College London, UK; ESO, Garching, Germany), F. Boone (Institut de Recherche en Astrophysique et Planétologie (IRAP), Toulouse, France), F. E. Bauer (Pontificia Universidad Católica de Chile, Instituto de Astrofísica, Santiago, Chile), D. Quénard (Queen Mary University of London, London, UK), G. Roberts-Borsani (University College London, UK), R. Pelló (Institut de Recherche en Astrophysique et Planétologie (IRAP), Toulouse, France), I. Pérez-Fournon (Instituto de Astrofísica de Canarias, Tenerife, Spain; Universidad de La Laguna, Tenerife, Spain), and A. Streblyanska (Instituto de Astrofísica de Canarias, Tenerife, Spain; Universidad de La Laguna, Tenerife, Spain).
The Atacama Large Millimeter/submillimeter Array (ALMA), an international astronomy facility, is a partnership of ESO, the U.S. National Science Foundation (NSF) and the National Institutes of Natural Sciences (NINS) of Japan in cooperation with the Republic of Chile. ALMA is funded by ESO on behalf of its Member States, by NSF in cooperation with the National Research Council of Canada (NRC) and the National Science Council of Taiwan (NSC) and by NINS in cooperation with the Academia Sinica (AS) in Taiwan and the Korea Astronomy and Space Science Institute (KASI).
ALMA construction and operations are led by ESO on behalf of its Member States; by the National Radio Astronomy Observatory (NRAO), managed by Associated Universities, Inc. (AUI), on behalf of North America; and by the National Astronomical Observatory of Japan (NAOJ) on behalf of East Asia. The Joint ALMA Observatory (JAO) provides the unified leadership and management of the construction, commissioning and operation of ALMA.
ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory, and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope, and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.
NASA’s Spitzer Space Telescope has revealed the first known system of seven Earth-size planets around a single star. Three of these planets are firmly located in the habitable zone, the area around the parent star where a rocky planet is most likely to have liquid water.
The discovery sets a new record for greatest number of habitable-zone planets found around a single star outside our solar system. All of these seven planets could have liquid water – key to life as we know it – under the right atmospheric conditions, but the chances are highest with the three in the habitable zone.
“This discovery could be a significant piece in the puzzle of finding habitable environments, places that are conducive to life,” said Thomas Zurbuchen, associate administrator of the agency’s Science Mission Directorate in Washington. “Answering the question ‘are we alone’ is a top science priority and finding so many planets like these for the first time in the habitable zone is a remarkable step forward toward that goal.”
At about 40 light-years (235 trillion miles) from Earth, the system of planets is relatively close to us, in the constellation Aquarius. Because they are located outside of our solar system, these planets are scientifically known as exoplanets.
This exoplanet system is called TRAPPIST-1, named for The Transiting Planets and Planetesimals Small Telescope (TRAPPIST) in Chile. In May 2016, researchers using TRAPPIST announced they had discovered three planets in the system. Assisted by several ground-based telescopes, including the European Southern Observatory’s Very Large Telescope, Spitzer confirmed the existence of two of these planets and discovered five additional ones, increasing the number of known planets in the system to seven.
The new results were published Wednesday in the journal Nature, and announced at a news briefing at NASA Headquarters in Washington.
Using Spitzer data, the team precisely measured the sizes of the seven planets and developed first estimates of the masses of six of them, allowing their densities to be estimated.
Based on their densities, all of the TRAPPIST-1 planets are likely to be rocky. Further observations will not only help determine whether they are rich in water, but also possibly reveal whether any could have liquid water on their surfaces. The mass of the seventh and farthest exoplanet has not yet been estimated – scientists believe it could be an icy, “snowball-like” world, but further observations are needed.
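To see how a density estimate follows from these measurements: the transit depth yields a radius, timing variations yield a mass, and the two combine into a bulk density. A minimal sketch, using illustrative Earth-like placeholder values rather than the measured TRAPPIST-1 parameters:

```python
import math

# How bulk density follows from a measured mass and radius. The planet
# values below are hypothetical placeholders, not TRAPPIST-1 measurements.
M_EARTH = 5.972e24   # kg
R_EARTH = 6.371e6    # m

mass = 0.9 * M_EARTH     # hypothetical mass, e.g. from transit-timing variations
radius = 1.1 * R_EARTH   # hypothetical radius, from the transit depth

volume = (4.0 / 3.0) * math.pi * radius**3
density = mass / volume            # kg per cubic metre
print(f"{density:.0f} kg/m^3")     # prints ~3700; Earth's mean density is ~5500
```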
“The seven wonders of TRAPPIST-1 are the first Earth-size planets that have been found orbiting this kind of star,” said Michael Gillon, lead author of the paper and the principal investigator of the TRAPPIST exoplanet survey at the University of Liege, Belgium. “It is also the best target yet for studying the atmospheres of potentially habitable, Earth-size worlds.”
In contrast to our sun, the TRAPPIST-1 star – classified as an ultra-cool dwarf – is so cool that liquid water could survive on planets orbiting very close to it, closer than is possible on planets in our solar system. All seven of the TRAPPIST-1 planetary orbits are closer to their host star than Mercury is to our sun. The planets also are very close to each other. If a person were standing on the surface of one of the planets, they could gaze up and potentially see geological features or clouds of neighboring worlds, which would sometimes appear larger than the moon in Earth’s sky.
The planets may also be tidally locked to their star, which means the same side of the planet always faces the star, so each side is in either perpetual day or perpetual night. This could mean they have weather patterns totally unlike those on Earth, such as strong winds blowing from the day side to the night side, and extreme temperature changes.
Spitzer, an infrared telescope that trails Earth as it orbits the sun, was well-suited for studying TRAPPIST-1 because the star glows brightest in infrared light, whose wavelengths are longer than the eye can see. In the fall of 2016, Spitzer observed TRAPPIST-1 nearly continuously for 500 hours. Spitzer is uniquely positioned in its orbit to observe enough crossings – transits – of the planets in front of the host star to reveal the complex architecture of the system. Engineers optimized Spitzer’s ability to observe transiting planets during Spitzer’s “warm mission,” which began after the spacecraft’s coolant ran out, as planned, after the first five years of operations.
“This is the most exciting result I have seen in the 14 years of Spitzer operations,” said Sean Carey, manager of NASA’s Spitzer Science Center at Caltech/IPAC in Pasadena, California. “Spitzer will follow up in the fall to further refine our understanding of these planets so that the James Webb Space Telescope can follow up. More observations of the system are sure to reveal more secrets.”
Following up on the Spitzer discovery, NASA’s Hubble Space Telescope has initiated the screening of four of the planets, including the three inside the habitable zone. These observations aim to assess the presence of puffy, hydrogen-dominated atmospheres, typical of gaseous worlds like Neptune, around these planets.
In May 2016, the Hubble team observed the two innermost planets, and found no evidence for such puffy atmospheres. This strengthened the case that the planets closest to the star are rocky in nature.
“The TRAPPIST-1 system provides one of the best opportunities in the next decade to study the atmospheres around Earth-size planets,” said Nikole Lewis, co-leader of the Hubble study and astronomer at the Space Telescope Science Institute in Baltimore, Maryland.

NASA’s planet-hunting Kepler space telescope also is studying the TRAPPIST-1 system, making measurements of the star’s minuscule changes in brightness due to transiting planets. Operating as the K2 mission, the spacecraft’s observations will allow astronomers to refine the properties of the known planets, as well as search for additional planets in the system. The K2 observations conclude in early March and will be made available on the public archive.
Spitzer, Hubble, and Kepler will help astronomers plan for follow-up studies using NASA’s upcoming James Webb Space Telescope, launching in 2018. With much greater sensitivity, Webb will be able to detect the chemical fingerprints of water, methane, oxygen, ozone, and other components of a planet’s atmosphere. Webb also will analyze planets’ temperatures and surface pressures – key factors in assessing their habitability.
NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, California, manages the Spitzer Space Telescope mission for NASA’s Science Mission Directorate. Science operations are conducted at the Spitzer Science Center, at Caltech, in Pasadena, California. Spacecraft operations are based at Lockheed Martin Space Systems Company, Littleton, Colorado. Data are archived at the Infrared Science Archive housed at Caltech/IPAC. Caltech manages JPL for NASA.
The mix of products that countries export is a good predictor of income distribution, study finds.
By Larry Hardesty
CAMBRIDGE, Mass. – In a series of papers over the past 10 years, MIT Professor César Hidalgo and his collaborators have argued that the complexity of a country’s exports — not just their diversity but the expertise and technological infrastructure required to produce them — is a better predictor of future economic growth than factors economists have historically focused on, such as capital and education.
Now, a new paper by Hidalgo and his colleagues, appearing in the journal World Development, argues that everything else being equal, the complexity of a country’s exports also correlates with its degree of economic equality: The more complex a country’s products, the greater equality it enjoys relative to similar-sized countries with similar-sized economies.
“When people talk about the role of policy in inequality, there is an implicit assumption that you can always reduce inequality using only redistributive policies,” says Hidalgo, the Asahi Broadcasting Corporation Associate Professor of Media Arts and Sciences at the MIT Media Lab. “What these new results are telling us is that the effectiveness of policy is limited because inequality lives within a range of values that are determined by your underlying industrial structure.
“So if you’re a country like Venezuela, no matter how much money Chavez or Maduro gives out, you’re not going to be able to reduce inequality, because, well, all the money is coming in from one industry, and the 30,000 people involved in that industry of course are going to have an advantage in the economy. While if you’re in a country like Germany or Switzerland, where the economy is very diversified, and there are many people who are generating money in many different industries, firms are going to be under much more pressure to be more inclusive and redistributive.”
Joining Hidalgo on the paper are first author Dominik Hartmann, who was a postdoc in Hidalgo’s group when the work was done and is now a research fellow at the Fraunhofer Center for International Management and Knowledge Economy in Leipzig, Germany; Cristian Jara-Figueroa and Manuel Aristarán, MIT graduate students in media arts and sciences; and Miguel Guevara, a professor of computer science at Playa Ancha University in Valparaíso, Chile, who earned his PhD at the MIT Media Lab.
For Hidalgo and his colleagues, the complexity of a product is related to the breadth of knowledge required to produce it. The PhDs who operate a billion-dollar chip-fabrication facility are repositories of knowledge, and the facility itself is the embodiment of knowledge. But complexity also factors in the infrastructure and institutions that facilitate the aggregation of knowledge, such as reliable transportation and communication systems, and a culture of trust that enables productive collaboration.
In the new study, rather than try to itemize and quantify all such factors — probably an impossible task — the researchers made a simplifying assumption: Complex products are rare products exported by countries with diverse export portfolios. For instance, both chromium ore and nonoptical microscopes are rare exports, but the Czech Republic, which is the second-leading exporter of nonoptical microscopes, has a more diverse export portfolio than South Africa, the leading exporter of chromium ore.
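One way to make that assumption concrete, as a toy sketch with an invented export matrix (not the paper’s actual index), is to score each product by the average diversity of its exporters, echoing the microscopes-versus-chromium example above:

```python
import numpy as np

# Toy version of the simplifying assumption described above: complex
# products are rare (low ubiquity) and are exported by diversified
# countries. Binary country x product export matrix, values invented.
M = np.array([[1, 1, 1, 1, 0],   # a diversified exporter
              [1, 1, 0, 0, 0],
              [1, 0, 0, 0, 1]])  # exports one common and one rare product

diversity = M.sum(axis=1)        # how many products each country exports
ubiquity = M.sum(axis=0)         # how many countries export each product

# Average diversity of a product's exporters: rare products shipped by
# diversified countries score as "complex" under this assumption.
product_complexity = (M.T @ diversity) / ubiquity
print(diversity)            # [4 2 2]
print(ubiquity)             # [3 2 1 1 1]
print(product_complexity)   # the two rare products of country 0 score highest
```

Note how the last product is just as rare as the third and fourth, yet scores lower because its only exporter has a narrow portfolio, mirroring the chromium-ore case.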
The researchers compared each country’s complexity measure to its Gini coefficient, the most widely used measure of income inequality. They also compared Gini coefficients to countries’ per-capita gross domestic products (GDPs) and to standard measures of institutional development and education.
According to the researchers’ analysis of economic data from 1996 to 2008, per-capita GDP predicts only 36 percent of the variation in Gini coefficients, but product complexity predicts 58 percent. Combining per-capita GDP, export complexity, education levels, and population predicts 69 percent of variation. However, whereas leaving out any of the other three factors lowers that figure to about 68 percent, leaving out complexity lowers it to 61 percent, indicating that the complexity measure captures something crucial that the other factors leave out.
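The leave-one-out comparison behind those percentages can be illustrated with a small regression experiment. The sketch below runs on synthetic data with an invented dependence on complexity, so its printed numbers are illustrative only; the study itself used country-level data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for the country-level data: inequality depends most
# strongly on complexity by construction, to mirror the reported pattern.
rng = np.random.default_rng(0)
n = 200
gdp, complexity, education, population = rng.normal(size=(4, n))
gini = (-0.8 * complexity - 0.3 * gdp - 0.2 * education
        + 0.1 * population + rng.normal(scale=0.5, size=n))

features = {"gdp": gdp, "complexity": complexity,
            "education": education, "population": population}

def r_squared(names):
    X = np.column_stack([features[name] for name in names])
    return LinearRegression().fit(X, gini).score(X, gini)

full = r_squared(list(features))
for dropped in features:
    kept = [name for name in features if name != dropped]
    print(f"without {dropped:>10}: R^2 = {r_squared(kept):.2f}"
          f"  (full model: {full:.2f})")
```

Dropping the complexity column lowers the explained variance far more than dropping any other column, which is the signature the researchers report.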
Using trade data from 1963 to 2008, the researchers also showed that countries whose economic complexity increased, such as South Korea, saw reductions in income inequality, while countries whose economic complexity decreased, such as Norway, saw income inequality increase.
Researchers devise efficient power converter for internet of things
By Larry Hardesty
CAMBRIDGE, Mass. – The “internet of things” is the idea that vehicles, appliances, civil structures, manufacturing equipment, and even livestock will soon have sensors that report information directly to networked servers, aiding with maintenance and the coordination of tasks.
Those sensors will have to operate at very low powers, in order to extend battery life for months or make do with energy harvested from the environment. But that means that they’ll need to draw a wide range of electrical currents. A sensor might, for instance, wake up every so often, take a measurement, and perform a small calculation to see whether that measurement crosses some threshold. Those operations require relatively little current, but occasionally, the sensor might need to transmit an alert to a distant radio receiver. That requires much larger currents.
Generally, power converters, which take an input voltage and convert it to a steady output voltage, are efficient only within a narrow range of currents. But at the International Solid-State Circuits Conference last week, researchers from MIT’s Microsystems Technology Laboratories (MTL) presented a new power converter that maintains its efficiency at currents ranging from 500 picoamps to 1 milliamp, a span that encompasses a 2,000,000-fold increase in current levels.
“Typically, converters have a quiescent power, which is the power that they consume even when they’re not providing any current to the load,” says Arun Paidimarri, who was a postdoc at MTL when the work was done and is now at IBM Research. “So, for example, if the quiescent power is a microamp, then even if the load pulls only a nanoamp, it’s still going to consume a microamp of current. My converter is something that can maintain efficiency over a wide range of currents.”
Paidimarri, who also earned doctoral and master’s degrees from MIT, is first author on the conference paper. He’s joined by his thesis advisor, Anantha Chandrakasan, the Vannevar Bush Professor of Electrical Engineering and Computer Science at MIT.
The researchers’ converter is a step-down converter, meaning that its output voltage is lower than its input voltage. In particular, it takes input voltages ranging from 1.2 to 3.3 volts and reduces them to between 0.7 and 0.9 volts.
“In the low-power regime, the way these power converters work, it’s not based on a continuous flow of energy,” Paidimarri says. “It’s based on these packets of energy. You have these switches, and an inductor, and a capacitor in the power converter, and you basically turn on and off these switches.”
The control circuitry for the switches includes a circuit that measures the output voltage of the converter. If the output voltage is below some threshold — in this case, 0.9 volts — the controllers throw a switch and release a packet of energy. Then they perform another measurement and, if necessary, release another packet.
If no device is drawing current from the converter, or if the current is going only to a simple, local circuit, the controllers might release between 1 and a couple hundred packets per second. But if the converter is feeding power to a radio, it might need to release a million packets a second.
To accommodate that range of outputs, a typical converter — even a low-power one — will simply perform 1 million voltage measurements a second; on that basis, it will release anywhere from 1 to 1 million packets. Each measurement consumes energy, but for most existing applications, the power drain is negligible. For the internet of things, however, it’s intolerable.
Paidimarri and Chandrakasan’s converter thus features a variable clock, which can run the switch controllers at a wide range of rates. That, however, requires more complex control circuits. The circuit that monitors the converter’s output voltage, for instance, contains an element called a voltage divider, which siphons off a little current from the output for measurement. In a typical converter, the voltage divider is just another element in the circuit path; it is, in effect, always on.
But siphoning current lowers the converter’s efficiency, so in the MIT researchers’ chip, the divider is surrounded by a block of additional circuit elements, which grant access to the divider only for the fraction of a second that a measurement requires. The result is a 50 percent reduction in quiescent power over even the best previously reported experimental low-power, step-down converter and a tenfold expansion of the current-handling range.
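A toy simulation makes the scheme concrete: sample the output only as fast as the load requires, and release a packet whenever a sample falls below the threshold. All component values below are invented for illustration; this is a sketch of the idea, not the MIT chip:

```python
# Toy simulation of packet-based regulation with a load-matched clock.
# All component values are illustrative assumptions, not from the chip.
V_THRESHOLD = 0.9      # volts: release a packet when the output drops below this
PACKET_CHARGE = 1e-9   # coulombs delivered to the output per packet
CAPACITANCE = 1e-6     # farads of output capacitance

def packets_released(load_current, sample_rate, duration=1.0):
    """Count energy packets needed to hold a constant load at the threshold."""
    v_out = V_THRESHOLD
    packets = 0
    dt = 1.0 / sample_rate
    for _ in range(int(duration / dt)):
        v_out -= load_current * dt / CAPACITANCE   # the load drains the capacitor
        if v_out < V_THRESHOLD:                    # one measurement per clock tick
            v_out += PACKET_CHARGE / CAPACITANCE   # release one packet of energy
            packets += 1
    return packets

# A sleeping sensor needs only a slow clock; a radio burst needs a fast one.
print(packets_released(load_current=1e-9, sample_rate=1_000))      # ~1 packet/s
print(packets_released(load_current=1e-3, sample_rate=1_000_000))  # ~1e6 packets/s
```

Matching the sampling rate to the load is what the variable clock buys: the slow case gets by on a thousand measurements a second instead of a million, and each avoided measurement is avoided quiescent power.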
“This opens up exciting new opportunities to operate these circuits from new types of energy-harvesting sources, such as body-powered electronics,” Chandrakasan says.
The work was funded by Shell and Texas Instruments, and the prototype chips were built by the Taiwan Semiconductor Manufacturing Company, through its University Shuttle Program.
Is your business failing to capture the full potential of a lucrative market?
Is your “one size fits all” strategy not working?
Is your organizational structure aligned with the latest trends and technologies?
Is your overall business strategy well connected to the real world?
If you are a business executive, such questions pop up in your mind almost daily. Sometimes you figure out the answer; other times you find yourself in the doldrums, struggling to find the right direction in which to channel your resources. This is where you need the services of strategy consultants, who will help you connect and shape your thoughts and vision with the real world through a game plan.
In Pakistan, the role of strategy consultants has long been neglected, and as a result businesses often fail to scale up to their full potential in the absence of a clear, tailor-made strategy. But gradually, as businesses mature and market dynamics change rapidly, executives are realizing that they need companions to help them sail through uncharted waters.
Outsource some of the brainwork and opt for a tailor-made strategy that fits your vision and needs.
A team of astronomers has used the SPHERE instrument on ESO’s Very Large Telescope to image the first planet ever found in a wide orbit inside a triple-star system. The orbit of such a planet had been expected to be unstable, probably resulting in the planet being quickly ejected from the system. But somehow this one survives. This unexpected observation suggests that such systems may actually be more common than previously thought. The results will be published online in the journal Science on 7 July 2016.
Luke Skywalker’s home planet, Tatooine, in the Star Wars saga, was a strange world with two suns in the sky, but astronomers have now found a planet in an even more exotic system, where an observer would either experience constant daylight or enjoy triple sunrises and sunsets each day, depending on the seasons, which last longer than human lifetimes.
This world has been discovered by a team of astronomers led by the University of Arizona, USA, using direct imaging at ESO’s Very Large Telescope (VLT) in Chile. The planet, HD 131399Ab, is unlike any other known world — its orbit around the brightest of the three stars is by far the widest known within a multi-star system. Such orbits are often unstable, because of the complex and changing gravitational attraction from the other two stars in the system, and planets in stable orbits were thought to be very unlikely.
Located about 320 light-years from Earth in the constellation of Centaurus (The Centaur), HD 131399Ab is about 16 million years old, making it also one of the youngest exoplanets discovered to date, and one of very few directly imaged planets. With a temperature of around 580 degrees Celsius and an estimated mass of four Jupiter masses, it is also one of the coldest and least massive directly-imaged exoplanets.
“HD 131399Ab is one of the few exoplanets that have been directly imaged, and it’s the first one in such an interesting dynamical configuration,” said Daniel Apai, from the University of Arizona, USA, and one of the co-authors of the new paper.
“For about half of the planet’s orbit, which lasts 550 Earth-years, three stars are visible in the sky; the fainter two are always much closer together, and change in apparent separation from the brightest star throughout the year,” adds Kevin Wagner, the paper’s first author and discoverer of HD 131399Ab.
Kevin Wagner, who is a PhD student at the University of Arizona, identified the planet among hundreds of candidate planets and led the follow-up observations to verify its nature.
The planet also marks the first discovery of an exoplanet made with the SPHERE instrument on the VLT. SPHERE is sensitive to infrared light, allowing it to detect the heat signatures of young planets, and it has sophisticated features that correct for atmospheric disturbances and block out the otherwise blinding light of the host stars.
Although repeated and long-term observations will be needed to precisely determine the planet’s trajectory among its host stars, observations and simulations seem to suggest the following scenario: the brightest star, dubbed HD 131399A, is estimated to be eighty percent more massive than the Sun, and it is orbited by the two less massive stars, B and C, at about 300 au (one au, or astronomical unit, equals the average distance between the Earth and the Sun). All the while, B and C twirl around each other like a spinning dumbbell, separated by a distance roughly equal to that between the Sun and Saturn (10 au).
In this scenario, planet HD 131399Ab travels around the star A in an orbit with a radius of about 80 au, about twice as large as Pluto’s in the Solar System, and brings the planet to about one third of the separation between star A and the B/C star pair. The authors point out that a range of orbital scenarios is possible, and the verdict on the long-term stability of the system will have to wait for planned follow-up observations that will better constrain the planet’s orbit.
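Those proportions check out against the numbers given (a quick arithmetic verification, with Pluto’s semi-major axis of about 39.5 au supplied for the comparison):

```python
# Sanity check of the orbital geometry described above (distances in au).
pluto_orbit = 39.5      # Pluto's semi-major axis, for comparison
planet_orbit = 80.0     # HD 131399Ab's orbital radius around star A
bc_distance = 300.0     # distance from star A to the B/C pair

print(planet_orbit / pluto_orbit)   # ~2.0: "about twice as large as Pluto's"
print(planet_orbit / bc_distance)   # ~0.27: "about one third of the separation"
```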
“If the planet was further away from the most massive star in the system, it would be kicked out of the system,” Apai explained. “Our computer simulations have shown that this type of orbit can be stable, but if you change things around just a little bit, it can become unstable very quickly.”
Planets in multi-star systems are of special interest to astronomers and planetary scientists because they provide an example of how the mechanism of planetary formation functions in these more extreme scenarios. While multi-star systems seem exotic to us in our orbit around our solitary star, multi-star systems are in fact just as common as single stars.
“It is not clear how this planet ended up on its wide orbit in this extreme system, and we can’t say yet what this means for our broader understanding of the types of planetary systems, but it shows that there is more variety out there than many would have deemed possible,” concludes Kevin Wagner. “What we do know is that planets in multi-star systems have been studied far less often, but are potentially just as numerous as planets in single-star systems.”
The three components of the triple star are named HD 131399A, HD 131399B and HD 131399C respectively, in decreasing order of brightness. The planet orbits the brightest star and hence is named HD 131399Ab.
For much of the planet’s year the stars would appear close together in the sky, giving it a familiar night side and day side, with a unique triple sunset and sunrise each day. As the planet moves along its orbit, the stars grow further apart each day, until they reach a point where the setting of one coincides with the rising of the other — at which point the planet is in near-constant daytime for about one-quarter of its orbit, or roughly 140 Earth-years.
On December 26, 2015 at 03:38:53 UTC, scientists observed gravitational waves–ripples in the fabric of spacetime–for the second time.
The gravitational waves were detected by both of the twin Laser Interferometer Gravitational-Wave Observatory (LIGO) detectors, located in Livingston, Louisiana, and Hanford, Washington, USA.
The LIGO Observatories are funded by the National Science Foundation (NSF), and were conceived, built, and are operated by Caltech and MIT. The discovery, accepted for publication in the journal Physical Review Letters, was made by the LIGO Scientific Collaboration (which includes the GEO Collaboration and the Australian Consortium for Interferometric Gravitational Astronomy) and the Virgo Collaboration using data from the two LIGO detectors.
Gravitational waves carry information about their origins and about the nature of gravity that cannot otherwise be obtained. Physicists have concluded that these gravitational waves were produced during the final moments of the merger of two black holes–14 and 8 times the mass of the sun–to produce a single, more massive spinning black hole that is 21 times the mass of the sun.
“It is very significant that these black holes were much less massive than those observed in the first detection,” says Gabriela González, LIGO Scientific Collaboration (LSC) spokesperson and professor of physics and astronomy at Louisiana State University. “Because of their lighter masses compared to the first detection, they spent more time–about one second–in the sensitive band of the detectors. It is a promising start to mapping the populations of black holes in our universe.”
During the merger, which occurred approximately 1.4 billion years ago, a quantity of energy roughly equivalent to the mass of the sun was converted into gravitational waves. The detected signal comes from the last 27 orbits of the black holes before their merger. Based on the arrival time of the signals–with the Livingston detector measuring the waves 1.1 milliseconds before the Hanford detector–the position of the source in the sky can be roughly determined.
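The mass bookkeeping is simple: 14 + 8 = 22 solar masses going in, 21 coming out, with the missing solar mass radiated away. Its energy scale follows from E = mc², as this quick check with standard constants shows:

```python
# Energy radiated as gravitational waves: about one solar mass, via E = m c^2.
SOLAR_MASS = 1.989e30   # kg
C = 2.998e8             # speed of light, m/s

radiated_mass = (14 + 8 - 21) * SOLAR_MASS   # ~1 solar mass, per the release
energy = radiated_mass * C**2
print(f"{energy:.1e} J")                     # -> ~1.8e+47 joules
```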
“In the near future, Virgo, the European interferometer, will join a growing network of gravitational wave detectors, which work together with ground-based telescopes that follow up on the signals,” notes Fulvio Ricci, the Virgo Collaboration spokesperson, a physicist at Istituto Nazionale di Fisica Nucleare (INFN) and professor at Sapienza University of Rome. “The three interferometers together will permit a far better localization in the sky of the signals.”
The first detection of gravitational waves, announced on February 11, 2016, was a milestone in physics and astronomy; it confirmed a major prediction of Albert Einstein’s 1915 general theory of relativity, and marked the beginning of the new field of gravitational-wave astronomy.
The second discovery “has truly put the ‘O’ for Observatory in LIGO,” says Caltech’s Albert Lazzarini, deputy director of the LIGO Laboratory. “With detections of two strong events in the four months of our first observing run, we can begin to make predictions about how often we might be hearing gravitational waves in the future. LIGO is bringing us a new way to observe some of the darkest yet most energetic events in our universe.”
“We are starting to get a glimpse of the kind of new astrophysical information that can only come from gravitational wave detectors,” says MIT’s David Shoemaker, who led the Advanced LIGO detector construction program.
Both discoveries were made possible by the enhanced capabilities of Advanced LIGO, a major upgrade that increases the sensitivity of the instruments compared to the first generation LIGO detectors, enabling a large increase in the volume of the universe probed.
“With the advent of Advanced LIGO, we anticipated researchers would eventually succeed at detecting unexpected phenomena, but these two detections thus far have surpassed our expectations,” says NSF Director France A. Córdova. “NSF’s 40-year investment in this foundational research is already yielding new information about the nature of the dark universe.”
Advanced LIGO’s next data-taking run will begin this fall. By then, further improvements in detector sensitivity are expected to allow LIGO to reach as much as 1.5 to 2 times more of the volume of the universe. The Virgo detector is expected to join in the latter half of the upcoming observing run.
LIGO research is carried out by the LIGO Scientific Collaboration (LSC), a group of more than 1,000 scientists from universities around the United States and in 14 other countries. More than 90 universities and research institutes in the LSC develop detector technology and analyze data; approximately 250 students are strong contributing members of the collaboration. The LSC detector network includes the LIGO interferometers and the GEO600 detector.
Virgo research is carried out by the Virgo Collaboration, consisting of more than 250 physicists and engineers belonging to 19 different European research groups: 6 from Centre National de la Recherche Scientifique (CNRS) in France; 8 from the Istituto Nazionale di Fisica Nucleare (INFN) in Italy; 2 in The Netherlands with Nikhef; the MTA Wigner RCP in Hungary; the POLGRAW group in Poland and the European Gravitational Observatory (EGO), the laboratory hosting the Virgo detector near Pisa in Italy.
The NSF leads in financial support for Advanced LIGO. Funding organizations in Germany (Max Planck Society), the U.K. (Science and Technology Facilities Council, STFC) and Australia (Australian Research Council) also have made significant commitments to the project.
Several of the key technologies that made Advanced LIGO so much more sensitive have been developed and tested by the German-UK GEO collaboration. Significant computer resources have been contributed by the AEI Hannover Atlas Cluster, the LIGO Laboratory, Syracuse University, the ARCCA cluster at Cardiff University, the University of Wisconsin-Milwaukee, and the Open Science Grid. Several universities designed, built, and tested key components and techniques for Advanced LIGO: The Australian National University, the University of Adelaide, the University of Western Australia, the University of Florida, Stanford University, Columbia University in the City of New York, and Louisiana State University. The GEO team includes scientists at the Max Planck Institute for Gravitational Physics (Albert Einstein Institute, AEI), Leibniz Universität Hannover, along with partners at the University of Glasgow, Cardiff University, the University of Birmingham, other universities in the United Kingdom and Germany, and the University of the Balearic Islands in Spain.
The Computational Vision and Geometry Lab has developed a robot prototype that could soon autonomously move among us, following normal human social etiquette. It’s named ‘Jackrabbot’ after the springy hares that bounce around campus.
BY VIGNESH RAMACHANDRAN
In order for robots to circulate on sidewalks and mingle with humans in other crowded places, they’ll have to understand the unwritten rules of pedestrian behavior. Stanford researchers have created a short, non-humanoid prototype of just such a moving, self-navigating machine.
The robot is nicknamed “Jackrabbot” – after the jackrabbits often seen darting across the Stanford campus – and looks like a ball on wheels. Jackrabbot is equipped with sensors to be able to understand its surroundings and navigate streets and hallways according to normal human etiquette.
The idea behind the work is that by observing how Jackrabbot navigates among students in the halls and on the sidewalks of Stanford’s School of Engineering, and how over time it learns the unwritten conventions of these social behaviors, the researchers will gain critical insight into how to design the next generation of everyday robots so that they operate smoothly alongside humans in crowded open spaces like shopping malls or train stations.
“By learning social conventions, the robot can be part of ecosystems where humans and robots coexist,” said Silvio Savarese, an assistant professor of computer science and director of the Stanford Computational Vision and Geometry Lab.
The researchers will present their system for predicting human trajectories in crowded spaces at the Computer Vision and Pattern Recognition conference in Las Vegas on June 27.
As robotic devices become more common in human environments, it becomes increasingly important that they understand and respect human social norms, Savarese said. How should they behave in crowds? How do they share public resources, like sidewalks or parking spots? When should a robot take its turn? What are the ways people signal each other to coordinate movements and negotiate other spontaneous activities, like forming a line?
These human social conventions aren’t necessarily explicit, nor are they written down, complete with lane markings and traffic lights, like the traffic rules that govern the behavior of autonomous cars.
So Savarese’s lab is using machine learning techniques to create algorithms that will, in turn, allow the robot to recognize and react appropriately to unwritten rules of pedestrian traffic. The team’s computer scientists have been collecting images and video of people moving around the Stanford campus and transforming those images into coordinates. From those coordinates, they can train an algorithm.
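As a minimal sketch of that data-preparation step (a generic illustration, not the lab’s actual pipeline or model), the tracked (x, y) positions can be sliced into observed-history and future-continuation pairs for training a trajectory predictor:

```python
import numpy as np

# Slice one pedestrian's tracked (x, y) positions into supervised
# (observed history, future continuation) training pairs. The window
# lengths and the random-walk "track" are illustrative assumptions.
def make_training_pairs(track, history_len=8, future_len=12):
    pairs = []
    window = history_len + future_len
    for start in range(len(track) - window + 1):
        history = track[start:start + history_len]
        future = track[start + history_len:start + window]
        pairs.append((history, future))
    return pairs

# Hypothetical track: (x, y) coordinates at successive video frames.
track = np.cumsum(np.random.randn(40, 2), axis=0)
pairs = make_training_pairs(track)
print(len(pairs), pairs[0][0].shape, pairs[0][1].shape)   # 21 (8, 2) (12, 2)
```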
“Our goal in this project is to actually learn those (pedestrian) rules automatically from observations – by seeing how humans behave in these kinds of social spaces,” Savarese said. “The idea is to transfer those rules into robots.”
Jackrabbot already moves automatically and can navigate without human assistance indoors, and the team members are fine-tuning the robot’s self-navigation capabilities outdoors. The next step in their research is the implementation of “social aspects” of pedestrian navigation, such as deciding rights of way on the sidewalk. This work, described in their newest conference paper, has been demonstrated in computer simulations.
“We have developed a new algorithm that is able to automatically move the robot with social awareness, and we’re currently integrating that in Jackrabbot,” said Alexandre Alahi, a postdoctoral researcher in the lab.
Even though social robots may someday roam among humans, Savarese said he believes they don’t necessarily need to look like humans. Instead they should be designed to look as lovable and friendly as possible. In demos, the roughly three-foot-tall Jackrabbot roams around campus wearing a Stanford tie and sun-hat, generating hugs and curiosity from passersby.
Today, Jackrabbot is an expensive prototype. But Savarese estimates that in five or six years social robots like this could become as cheap as $500, making it possible for companies to release them to the mass market.
“It’s possible to make these robots affordable for on-campus delivery, for aiding impaired people in navigating a public space like a train station, or for guiding people to find their way through an airport,” Savarese said.
The conference paper is titled “Social LSTM: Human Trajectory Prediction in Crowded Spaces.” See conference program for details.
Using a new satellite-based method, scientists at NASA, Environment and Climate Change Canada, and two universities have located 39 unreported and major human-made sources of toxic sulfur dioxide emissions.
A known health hazard and contributor to acid rain, sulfur dioxide (SO2) is one of six air pollutants regulated by the U.S. Environmental Protection Agency. Currently, sulfur dioxide monitoring activities include the use of emission inventories that are derived from ground-based measurements and factors such as fuel usage. The inventories are used to evaluate regulatory policies for air quality improvements and to anticipate future emission scenarios that may occur with economic and population growth.
But, to develop comprehensive and accurate inventories, industries, government agencies and scientists first must know the location of pollution sources.
“We now have an independent measurement of these emission sources that does not rely on what was known or thought known,” said Chris McLinden, an atmospheric scientist with Environment and Climate Change Canada in Toronto and lead author of the study published this week in Nature Geoscience. “When you look at a satellite picture of sulfur dioxide, you end up with it appearing as hotspots – bull’s-eyes, in effect – which makes the estimates of emissions easier.”
The 39 unreported emission sources, found in the analysis of satellite data from 2005 to 2014, are clusters of coal-burning power plants, smelters, oil and gas operations found notably in the Middle East, but also in Mexico and parts of Russia. In addition, reported emissions from known sources in these regions were — in some cases — two to three times lower than satellite-based estimates.
Altogether, the unreported and underreported sources account for about 12 percent of all human-made emissions of sulfur dioxide – a discrepancy that can have a large impact on regional air quality, said McLinden.
The research team also located 75 natural sources of sulfur dioxide — non-erupting volcanoes slowly leaking the toxic gas throughout the year. While not necessarily unknown, many volcanoes are in remote locations and not monitored, so this satellite-based data set is the first to provide regular annual information on these passive volcanic emissions.
“Quantifying the sulfur dioxide bull’s-eyes is a two-step process that would not have been possible without two innovations in working with the satellite data,” said co-author Nickolay Krotkov, an atmospheric scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.
First was an improvement in the computer processing that transforms raw satellite observations from the Dutch-Finnish Ozone Monitoring Instrument aboard NASA’s Aura spacecraft into precise estimates of sulfur dioxide concentrations. Krotkov and his team now are able to more accurately detect smaller sulfur dioxide concentrations, including those emitted by human-made sources such as oil-related activities and medium-size power plants.
Being able to detect smaller concentrations led to the second innovation. McLinden and his colleagues used a new computer program to more precisely detect sulfur dioxide that had been dispersed and diluted by winds. They then used accurate estimates of wind strength and direction derived from a satellite data-driven model to trace the pollutant back to the location of the source, and also to estimate how much sulfur dioxide was emitted from the smoke stack.
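Conceptually, that last step amounts to running the transport backwards through the wind field. The toy sketch below uses a constant wind and made-up coordinates; the real analysis works with gridded wind data and averages over many days of observations:

```python
import numpy as np

# Toy back-trajectory: step a detected SO2 plume backwards through a
# constant wind to find a candidate source location. All numbers invented.
def trace_back(position, wind, hours):
    pos = np.asarray(position, dtype=float)   # km east/north of a reference point
    v = np.asarray(wind, dtype=float)         # wind velocity, km/h east/north
    for _ in range(hours):
        pos -= v                              # undo one hour of transport
    return pos

plume = [120.0, 40.0]   # where the satellite sees the SO2 enhancement
wind = [15.0, 5.0]      # wind that carried the plume
print(trace_back(plume, wind, hours=6))       # -> [30. 10.], candidate source
```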
“The unique advantage of satellite data is spatial coverage,” said Bryan Duncan, an atmospheric scientist at Goddard. “This paper is the perfect demonstration of how new and improved satellite datasets, coupled with new and improved data analysis techniques, allow us to identify even smaller pollutant sources and to quantify these emissions over the globe.”
The University of Maryland, College Park, and Dalhousie University in Halifax, Nova Scotia, contributed to this study.
NASA uses the vantage point of space to increase our understanding of our home planet, improve lives, and safeguard our future. NASA develops new ways to observe and study Earth’s interconnected natural systems with long-term data records. The agency freely shares this unique knowledge and works with institutions around the world to gain new insights into how our planet is changing.