

Gravitational waves detected from second pair of colliding black holes

The LIGO Scientific Collaboration and the Virgo Collaboration identify a second gravitational wave event in the data from Advanced LIGO detectors


PAPER: http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.116.241103

IMAGES & AUDIO: https://caltech.app.box.com/v/LIGO-JuneAAS


On December 26, 2015, at 03:38:53 UTC, scientists observed gravitational waves, ripples in the fabric of spacetime, for the second time.

The gravitational waves were detected by both of the twin Laser Interferometer Gravitational-Wave Observatory (LIGO) detectors, located in Livingston, Louisiana, and Hanford, Washington, USA.

The LIGO Observatories are funded by the National Science Foundation (NSF), and were conceived, built, and are operated by Caltech and MIT. The discovery, accepted for publication in the journal Physical Review Letters, was made by the LIGO Scientific Collaboration (which includes the GEO Collaboration and the Australian Consortium for Interferometric Gravitational Astronomy) and the Virgo Collaboration using data from the two LIGO detectors.

Gravitational waves carry information about their origins and about the nature of gravity that cannot otherwise be obtained. Physicists have concluded that these gravitational waves were produced during the final moments of the merger of two black holes, 14 and 8 times the mass of the sun, which produced a single, more massive spinning black hole 21 times the mass of the sun.

“It is very significant that these black holes were much less massive than those observed in the first detection,” says Gabriela González, LIGO Scientific Collaboration (LSC) spokesperson and professor of physics and astronomy at Louisiana State University. “Because of their lighter masses compared to the first detection, they spent more time, about one second, in the sensitive band of the detectors. It is a promising start to mapping the populations of black holes in our universe.”

During the merger, which occurred approximately 1.4 billion years ago, a quantity of energy roughly equivalent to the mass of the sun was converted into gravitational waves. The detected signal comes from the last 27 orbits of the black holes before their merger. Based on the arrival time of the signals, with the Livingston detector measuring the waves 1.1 milliseconds before the Hanford detector, the position of the source in the sky can be roughly determined.
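
The figures quoted above fit together with a quick back-of-the-envelope check: the initial masses (14 + 8 = 22 solar masses) exceed the final 21 solar masses by roughly one solar mass, which is the energy radiated, and the 1.1 millisecond arrival-time difference constrains the source direction relative to the roughly 3,000 km Hanford-Livingston baseline. The short Python sketch below reproduces that arithmetic; the baseline length is an approximate value supplied here for illustration.

```python
# Back-of-the-envelope check of the numbers quoted above (all values approximate).
import math

c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

# Energy radiated: (14 + 8) - 21 leaves about 1 solar mass converted to gravitational waves.
delta_m = (14 + 8 - 21) * M_sun
print(f"radiated energy ~ {delta_m * c**2:.1e} J")   # ~1.8e47 J

# Sky localization: the 1.1 ms arrival-time difference over the ~3,000 km
# Hanford-Livingston baseline fixes the angle between the baseline and the
# source direction, tracing a ring on the sky rather than a single point.
baseline = 3.0e6   # m, approximate detector separation
dt = 1.1e-3        # s, measured time delay
theta = math.acos(c * dt / baseline)
print(f"angle from baseline ~ {math.degrees(theta):.0f} degrees")
```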

“In the near future, Virgo, the European interferometer, will join a growing network of gravitational wave detectors, which work together with ground-based telescopes that follow up on the signals,” notes Fulvio Ricci, the Virgo Collaboration spokesperson, a physicist at the Istituto Nazionale di Fisica Nucleare (INFN) and professor at Sapienza University of Rome. “The three interferometers together will permit a far better localization in the sky of the signals.”

The first detection of gravitational waves, announced on February 11, 2016, was a milestone in physics and astronomy; it confirmed a major prediction of Albert Einstein’s 1915 general theory of relativity, and marked the beginning of the new field of gravitational-wave astronomy.

The second discovery “has truly put the ‘O’ for Observatory in LIGO,” says Caltech’s Albert Lazzarini, deputy director of the LIGO Laboratory. “With detections of two strong events in the four months of our first observing run, we can begin to make predictions about how often we might be hearing gravitational waves in the future. LIGO is bringing us a new way to observe some of the darkest yet most energetic events in our universe.”

“We are starting to get a glimpse of the kind of new astrophysical information that can only come from gravitational wave detectors,” says MIT’s David Shoemaker, who led the Advanced LIGO detector construction program.

Both discoveries were made possible by the enhanced capabilities of Advanced LIGO, a major upgrade that increases the sensitivity of the instruments compared to the first-generation LIGO detectors, enabling a large increase in the volume of the universe probed.

“With the advent of Advanced LIGO, we anticipated researchers would eventually succeed at detecting unexpected phenomena, but these two detections thus far have surpassed our expectations,” says NSF Director France A. Córdova. “NSF’s 40-year investment in this foundational research is already yielding new information about the nature of the dark universe.”

Advanced LIGO’s next data-taking run will begin this fall. By then, further improvements in detector sensitivity are expected to allow LIGO to reach as much as 1.5 to 2 times more of the volume of the universe. The Virgo detector is expected to join in the latter half of the upcoming observing run.
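
That projection follows from simple geometry: the volume of space surveyed grows as the cube of the distance out to which a signal can be detected, so even a modest gain in sensitivity (range) translates into a much larger gain in volume. A minimal illustration:

```python
# Survey volume scales as the cube of the detection range, so a modest
# improvement in range yields a much larger gain in volume (illustrative only).
for volume_gain in (1.5, 2.0):
    range_gain = volume_gain ** (1.0 / 3.0)
    print(f"{volume_gain:.1f}x volume <- ~{range_gain:.2f}x detection range")
```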

LIGO research is carried out by the LIGO Scientific Collaboration (LSC), a group of more than 1,000 scientists from universities around the United States and in 14 other countries. More than 90 universities and research institutes in the LSC develop detector technology and analyze data; approximately 250 students are strong contributing members of the collaboration. The LSC detector network includes the LIGO interferometers and the GEO600 detector.

Virgo research is carried out by the Virgo Collaboration, consisting of more than 250 physicists and engineers belonging to 19 different European research groups: 6 from the Centre National de la Recherche Scientifique (CNRS) in France; 8 from the Istituto Nazionale di Fisica Nucleare (INFN) in Italy; 2 from Nikhef in the Netherlands; the MTA Wigner RCP in Hungary; the POLGRAW group in Poland; and the European Gravitational Observatory (EGO), the laboratory hosting the Virgo detector near Pisa in Italy.

The NSF leads in financial support for Advanced LIGO. Funding organizations in Germany (Max Planck Society), the U.K. (Science and Technology Facilities Council, STFC) and Australia (Australian Research Council) also have made significant commitments to the project.

Several of the key technologies that made Advanced LIGO so much more sensitive have been developed and tested by the German-UK GEO collaboration. Significant computer resources have been contributed by the AEI Hannover Atlas Cluster, the LIGO Laboratory, Syracuse University, the ARCCA cluster at Cardiff University, the University of Wisconsin-Milwaukee, and the Open Science Grid. Several universities designed, built, and tested key components and techniques for Advanced LIGO: The Australian National University, the University of Adelaide, the University of Western Australia, the University of Florida, Stanford University, Columbia University in the City of New York, and Louisiana State University. The GEO team includes scientists at the Max Planck Institute for Gravitational Physics (Albert Einstein Institute, AEI), Leibniz Universität Hannover, along with partners at the University of Glasgow, Cardiff University, the University of Birmingham, other universities in the United Kingdom and Germany, and the University of the Balearic Islands in Spain.


MEDIA CONTACTS

For more information and interview requests, please contact:

MIT
Kimberly Allen
Director of Media Relations
Deputy Director, MIT News Office
617-253-2702 (office)
617-852-6094 (cell)
allenkc@mit.edu

Caltech
Whitney Clavin
Senior Content and Media Strategist
626-390-9601 (cell)
wclavin@caltech.edu

NSF
Ivy Kupec
Media Officer
703-292-8796 (Office)
703-225-8216 (Cell)
ikupec@nsf.gov

LIGO Scientific Collaboration
Mimi LaValle
External Relations Manager
Louisiana State University
225-439-5633 (Cell)

mlavall@lsu.edu

EGO-European Gravitational Observatory
Séverine Perus
Media Contact
severine.perus@ego-gw.it
Tel +39 050752325

Stanford’s social robot ‘Jackrabbot’ seeks to understand pedestrian behavior


Stanford’s Computational Vision and Geometry Lab has developed a robot prototype that could soon autonomously move among us, following normal human social etiquette. It’s named ‘Jackrabbot’ after the springy hares that bounce around campus.

BY VIGNESH RAMACHANDRAN


In order for robots to circulate on sidewalks and mingle with humans in other crowded places, they’ll have to understand the unwritten rules of pedestrian behavior. Stanford researchers have created a short, non-humanoid prototype of just such a moving, self-navigating machine.

The robot is nicknamed “Jackrabbot” – after the jackrabbits often seen darting across the Stanford campus – and looks like a ball on wheels. Jackrabbot is equipped with sensors to be able to understand its surroundings and navigate streets and hallways according to normal human etiquette.

The idea behind the work is that by observing how Jackrabbot navigates among students in the halls and on the sidewalks of Stanford’s School of Engineering, and over time learns the unwritten conventions of these social behaviors, the researchers will gain critical insight into how to design the next generation of everyday robots so that they operate smoothly alongside humans in crowded open spaces like shopping malls or train stations.

“By learning social conventions, the robot can be part of ecosystems where humans and robots coexist,” said Silvio Savarese, an assistant professor of computer science and director of the Stanford Computational Vision and Geometry Lab.

The researchers will present their system for predicting human trajectories in crowded spaces at the Computer Vision and Pattern Recognition conference in Las Vegas on June 27.

As robotic devices become more common in human environments, it becomes increasingly important that they understand and respect human social norms, Savarese said. How should they behave in crowds? How do they share public resources, like sidewalks or parking spots? When should a robot take its turn? What are the ways people signal each other to coordinate movements and negotiate other spontaneous activities, like forming a line?

These human social conventions aren’t necessarily explicit, nor are they written down, complete with lane markings and traffic lights, like the traffic rules that govern the behavior of autonomous cars.

So Savarese’s lab is using machine learning techniques to create algorithms that will, in turn, allow the robot to recognize and react appropriately to unwritten rules of pedestrian traffic. The team’s computer scientists have been collecting images and video of people moving around the Stanford campus and transforming those images into coordinates. From those coordinates, they can train an algorithm.

“Our goal in this project is to actually learn those (pedestrian) rules automatically from observations – by seeing how humans behave in these kinds of social spaces,” Savarese said. “The idea is to transfer those rules into robots.”
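
As a rough illustration of the pipeline described above (video, tracked pedestrian coordinates, a learned predictor), the sketch below trains a small recurrent network to predict the next (x, y) position of a walker from its recent history. The model, synthetic data, and hyperparameters are hypothetical stand-ins, not the lab’s Social LSTM; the point is only the shape of the learning task.

```python
# Minimal, generic sketch (not the team's Social LSTM): learn to predict a
# pedestrian's next (x, y) position from a short history of tracked positions.
# The trajectories below are synthetic; a real setup would use positions
# extracted from campus video.
import torch
import torch.nn as nn

class TrajectoryPredictor(nn.Module):
    def __init__(self, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 2)    # next (x, y)

    def forward(self, xy):                       # xy: (batch, time, 2)
        out, _ = self.lstm(xy)
        return self.head(out[:, -1])             # predict from the last time step

# Synthetic stand-in for observed walks: noisy straight lines.
starts = torch.randn(256, 1, 2)                  # random start points
vels = 0.1 * torch.randn(256, 1, 2)              # random constant velocities
steps = torch.arange(9, dtype=torch.float32).view(1, 9, 1)
trajs = starts + steps * vels + 0.02 * torch.randn(256, 9, 2)
obs, target = trajs[:, :8], trajs[:, 8]          # observe 8 steps, predict the 9th

model = TrajectoryPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(obs), target)
    loss.backward()
    opt.step()
print(f"final training loss: {loss.item():.4f}")
```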

Jackrabbot already moves automatically and can navigate without human assistance indoors, and the team members are fine-tuning the robot’s self-navigation capabilities outdoors. The next step in their research is the implementation of “social aspects” of pedestrian navigation such as deciding rights of way on the sidewalk. This work, described in their newest conference papers, has been demonstrated in computer simulations.

“We have developed a new algorithm that is able to automatically move the robot with social awareness, and we’re currently integrating that in Jackrabbot,” said Alexandre Alahi, a postdoctoral researcher in the lab.

Even though social robots may someday roam among humans, Savarese said he believes they don’t necessarily need to look like humans. Instead they should be designed to look as lovable and friendly as possible. In demos, the roughly three-foot-tall Jackrabbot roams around campus wearing a Stanford tie and sun-hat, generating hugs and curiosity from passersby.

Today, Jackrabbot is an expensive prototype. But Savarese estimates that in five or six years social robots like this could become as cheap as $500, making it possible for companies to release them to the mass market.

“It’s possible to make these robots affordable for on-campus delivery, or for aiding impaired people to navigate in a public space like a train station or for guiding people to find their way through an airport,” Savarese said.

The conference paper is titled “Social LSTM: Human Trajectory Prediction in Crowded Spaces.” See conference program for details.

Source: Stanford University News Service

NASA Satellite Finds Unreported Sources of Toxic Air Pollution

Using a new satellite-based method, scientists at NASA, Environment and Climate Change Canada, and two universities have located 39 unreported and major human-made sources of toxic sulfur dioxide emissions.

A known health hazard and contributor to acid rain, sulfur dioxide (SO2) is one of six air pollutants regulated by the U.S. Environmental Protection Agency. Current sulfur dioxide monitoring relies on emission inventories derived from ground-based measurements and factors such as fuel usage. The inventories are used to evaluate regulatory policies for air quality improvements and to anticipate future emission scenarios that may occur with economic and population growth.


But to develop comprehensive and accurate inventories, industries, government agencies, and scientists first must know the locations of pollution sources.

“We now have an independent measurement of these emission sources that does not rely on what was known or thought known,” said Chris McLinden, an atmospheric scientist with Environment and Climate Change Canada in Toronto and lead author of the study published this week in Nature Geoscience. “When you look at a satellite picture of sulfur dioxide, you end up with it appearing as hotspots (bull’s-eyes, in effect), which makes the estimates of emissions easier.”

The 39 unreported emission sources, found in the analysis of satellite data from 2005 to 2014, are clusters of coal-burning power plants, smelters, and oil and gas operations located mostly in the Middle East, but also in Mexico and parts of Russia. In addition, reported emissions from known sources in these regions were, in some cases, two to three times lower than satellite-based estimates.

Altogether, the unreported and underreported sources account for about 12 percent of all human-made emissions of sulfur dioxide, a discrepancy that can have a large impact on regional air quality, said McLinden.

The research team also located 75 natural sources of sulfur dioxide: non-erupting volcanoes slowly leaking the toxic gas throughout the year. While not necessarily unknown, many volcanoes are in remote locations and not monitored, so this satellite-based data set is the first to provide regular annual information on these passive volcanic emissions.

“Quantifying the sulfur dioxide bull’s-eyes is a two-step process that would not have been possible without two innovations in working with the satellite data,” said co-author Nickolay Krotkov, an atmospheric scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

First was an improvement in the computer processing that transforms raw satellite observations from the Dutch-Finnish Ozone Monitoring Instrument aboard NASA’s Aura spacecraft into precise estimates of sulfur dioxide concentrations. Krotkov and his team now are able to more accurately detect smaller sulfur dioxide concentrations, including those emitted by human-made sources such as oil-related activities and medium-size power plants.

Being able to detect smaller concentrations led to the second innovation. McLinden and his colleagues used a new computer program to more precisely detect sulfur dioxide that had been dispersed and diluted by winds. They then used accurate estimates of wind strength and direction derived from a satellite data-driven model to trace the pollutant back to the location of the source, and also to estimate how much sulfur dioxide was emitted from the smokestack.
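
As a conceptual illustration only (not the algorithm used in the study), the sketch below shows the basic idea of tracing dispersed pollution back upwind: each detection is shifted against the wind by its estimated transport time, and the shifted points cluster near the emission source. The detections, wind field, and plume ages are invented for the example.

```python
# Conceptual sketch only (not the paper's method): shift each SO2 detection
# upwind by its estimated transport time; the shifted points should cluster
# near the emission source.
import numpy as np

def back_trace(positions_km, wind_kmh, hours):
    """Return candidate source locations by stepping detections back along the wind."""
    positions_km = np.asarray(positions_km, dtype=float)   # (n, 2) observed plume positions
    wind_kmh = np.asarray(wind_kmh, dtype=float)           # (2,) wind vector
    hours = np.asarray(hours, dtype=float).reshape(-1, 1)  # (n, 1) plume ages
    return positions_km - wind_kmh * hours

# Invented example: three detections of a plume carried east by a 20 km/h wind.
obs = [(40.0, 5.0), (61.0, 4.0), (81.0, 6.5)]   # (x, y) in km
wind = (20.0, 0.0)                               # km/h, blowing toward +x
ages = [2.0, 3.0, 4.0]                           # hours since emission
candidates = back_trace(obs, wind, ages)
print("estimated source location:", candidates.mean(axis=0))
```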

“The unique advantage of satellite data is spatial coverage,” said Bryan Duncan, an atmospheric scientist at Goddard. “This paper is the perfect demonstration of how new and improved satellite datasets, coupled with new and improved data analysis techniques, allow us to identify even smaller pollutant sources and to quantify these emissions over the globe.”

The University of Maryland, College Park, and Dalhousie University in Halifax, Nova Scotia, contributed to this study.

For more information about, and access to, NASA’s air quality data, visit:

http://so2.gsfc.nasa.gov/

NASA uses the vantage point of space to increase our understanding of our home planet, improve lives, and safeguard our future. NASA develops new ways to observe and study Earth’s interconnected natural systems with long-term data records. The agency freely shares this unique knowledge and works with institutions around the world to gain new insights into how our planet is changing.

For more information about NASA Earth science research, visit:

http://www.nasa.gov/earth