

Gravitational waves detected from second pair of colliding black holes

The LIGO Scientific Collaboration and the Virgo Collaboration identify a second gravitational wave event in the data from Advanced LIGO detectors


PAPER: http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.116.241103

IMAGES & AUDIO: https://caltech.app.box.com/v/LIGO-JuneAAS


On December 26, 2015 at 03:38:53 UTC, scientists observed gravitational waves–ripples in the fabric of spacetime–for the second time.

The gravitational waves were detected by both of the twin Laser Interferometer Gravitational-Wave Observatory (LIGO) detectors, located in Livingston, Louisiana, and Hanford, Washington, USA.

The LIGO Observatories are funded by the National Science Foundation (NSF), and were conceived, built, and are operated by Caltech and MIT. The discovery, accepted for publication in the journal Physical Review Letters, was made by the LIGO Scientific Collaboration (which includes the GEO Collaboration and the Australian Consortium for Interferometric Gravitational Astronomy) and the Virgo Collaboration using data from the two LIGO detectors.

Gravitational waves carry information about their origins and about the nature of gravity that cannot otherwise be obtained. Physicists have concluded that these gravitational waves were produced during the final moments of the merger of two black holes–14 and 8 times the mass of the sun–to produce a single, more massive spinning black hole that is 21 times the mass of the sun.

“It is very significant that these black holes were much less massive than those observed in the first detection,” says Gabriela González, LIGO Scientific Collaboration (LSC) spokesperson and professor of physics and astronomy at Louisiana State University. “Because of their lighter masses compared to the first detection, they spent more time–about one second–in the sensitive band of the detectors. It is a promising start to mapping the populations of black holes in our universe.”

During the merger, which occurred approximately 1.4 billion years ago, a quantity of energy roughly equivalent to the mass of the sun was converted into gravitational waves. The detected signal comes from the last 27 orbits of the black holes before their merger. Based on the arrival time of the signals–with the Livingston detector measuring the waves 1.1 milliseconds before the Hanford detector–the position of the source in the sky can be roughly determined.
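The figures above can be checked with back-of-envelope arithmetic. The sketch below is a hedged illustration, assuming round values for the solar mass and for the roughly 3,000 km Hanford-Livingston separation; the time-delay cone is the standard two-detector localization argument, not this paper's full analysis:

```python
import math

# Back-of-envelope checks for the figures quoted above (all values approximate).
C = 299_792_458.0          # speed of light, m/s
M_SUN = 1.989e30           # solar mass, kg

# Energy radiated: (14 + 8) - 21 = ~1 solar mass converted to gravitational waves.
delta_m = (14 + 8 - 21) * M_SUN
energy_joules = delta_m * C**2          # E = mc^2

# Localization from arrival times: the delay dt between detectors satisfies
# cos(theta) = c * dt / d, where d is the detector separation (assumed ~3,002 km),
# placing the source on a ring at angle theta from the inter-detector axis.
d = 3.002e6                             # Hanford-Livingston separation, m
dt = 1.1e-3                             # observed delay, s
theta = math.degrees(math.acos(C * dt / d))

print(f"{energy_joules:.2e} J radiated; source ring ~{theta:.0f} deg from detector axis")
```

With these assumed numbers, the radiated energy comes out near 1.8e47 joules and the 1.1 ms delay constrains the source to a ring roughly 84 degrees from the axis between the two sites, which is why two detectors alone localize only coarsely.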

“In the near future, Virgo, the European interferometer, will join a growing network of gravitational wave detectors, which work together with ground-based telescopes that follow up on the signals,” notes Fulvio Ricci, the Virgo Collaboration spokesperson, a physicist at the Istituto Nazionale di Fisica Nucleare (INFN) and professor at Sapienza University of Rome. “The three interferometers together will permit a far better localization in the sky of the signals.”

The first detection of gravitational waves, announced on February 11, 2016, was a milestone in physics and astronomy; it confirmed a major prediction of Albert Einstein’s 1915 general theory of relativity, and marked the beginning of the new field of gravitational-wave astronomy.

The second discovery “has truly put the ‘O’ for Observatory in LIGO,” says Caltech’s Albert Lazzarini, deputy director of the LIGO Laboratory. “With detections of two strong events in the four months of our first observing run, we can begin to make predictions about how often we might be hearing gravitational waves in the future. LIGO is bringing us a new way to observe some of the darkest yet most energetic events in our universe.”

“We are starting to get a glimpse of the kind of new astrophysical information that can only come from gravitational wave detectors,” says MIT’s David Shoemaker, who led the Advanced LIGO detector construction program.

Both discoveries were made possible by the enhanced capabilities of Advanced LIGO, a major upgrade that increases the sensitivity of the instruments compared to the first generation LIGO detectors, enabling a large increase in the volume of the universe probed.

“With the advent of Advanced LIGO, we anticipated researchers would eventually succeed at detecting unexpected phenomena, but these two detections thus far have surpassed our expectations,” says NSF Director France A. Córdova. “NSF’s 40-year investment in this foundational research is already yielding new information about the nature of the dark universe.”

Advanced LIGO’s next data-taking run will begin this fall. By then, further improvements in detector sensitivity are expected to allow LIGO to reach as much as 1.5 to 2 times more of the volume of the universe. The Virgo detector is expected to join in the latter half of the upcoming observing run.
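The rate payoff from these sensitivity gains follows from simple geometry: the volume surveyed, and hence the expected event rate, scales as the cube of the detector's range. A minimal sketch (the specific range gains below are assumed values chosen only to reproduce the quoted 1.5 to 2 times volume figures):

```python
# Survey volume grows as the cube of detector range: a modest improvement in
# how far away a source can be detected multiplies the expected event rate.
def volume_gain(range_gain):
    """Factor by which surveyed volume grows for a given range improvement."""
    return range_gain ** 3

# Assumed illustrative range gains of ~15% and ~26%:
print(volume_gain(1.15), volume_gain(1.26))
```

A 15 percent range improvement already yields roughly 1.5 times the volume, and about 26 percent yields double, which is why incremental engineering gains translate into substantially more detections.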

LIGO research is carried out by the LIGO Scientific Collaboration (LSC), a group of more than 1,000 scientists from universities around the United States and in 14 other countries. More than 90 universities and research institutes in the LSC develop detector technology and analyze data; approximately 250 students are strong contributing members of the collaboration. The LSC detector network includes the LIGO interferometers and the GEO600 detector.

Virgo research is carried out by the Virgo Collaboration, consisting of more than 250 physicists and engineers belonging to 19 different European research groups: 6 from the Centre National de la Recherche Scientifique (CNRS) in France; 8 from the Istituto Nazionale di Fisica Nucleare (INFN) in Italy; 2 in the Netherlands with Nikhef; the MTA Wigner RCP in Hungary; the POLGRAW group in Poland; and the European Gravitational Observatory (EGO), the laboratory hosting the Virgo detector near Pisa in Italy.

The NSF leads in financial support for Advanced LIGO. Funding organizations in Germany (Max Planck Society), the U.K. (Science and Technology Facilities Council, STFC) and Australia (Australian Research Council) also have made significant commitments to the project.

Several of the key technologies that made Advanced LIGO so much more sensitive were developed and tested by the German-UK GEO collaboration. Significant computer resources have been contributed by the AEI Hannover Atlas Cluster, the LIGO Laboratory, Syracuse University, the ARCCA cluster at Cardiff University, the University of Wisconsin-Milwaukee, and the Open Science Grid. Several universities designed, built, and tested key components and techniques for Advanced LIGO: The Australian National University, the University of Adelaide, the University of Western Australia, the University of Florida, Stanford University, Columbia University in the City of New York, and Louisiana State University. The GEO team includes scientists at the Max Planck Institute for Gravitational Physics (Albert Einstein Institute, AEI), Leibniz Universität Hannover, along with partners at the University of Glasgow, Cardiff University, the University of Birmingham, other universities in the United Kingdom and Germany, and the University of the Balearic Islands in Spain.


 

MEDIA CONTACTS

For more information and interview requests, please contact:

MIT
Kimberly Allen
Director of Media Relations
Deputy Director, MIT News Office
617-253-2702 (office)
617-852-6094 (cell)
allenkc@mit.edu

Caltech
Whitney Clavin
Senior Content and Media Strategist
626-390-9601 (cell)
wclavin@caltech.edu

NSF
Ivy Kupec
Media Officer
703-292-8796 (Office)
703-225-8216 (Cell)
ikupec@nsf.gov

LIGO Scientific Collaboration
Mimi LaValle
External Relations Manager
Louisiana State University
225-439-5633 (Cell)

mlavall@lsu.edu

EGO-European Gravitational Observatory
Séverine Perus
Media Contact
severine.perus@ego-gw.it
Tel +39 050752325

Stanford’s social robot ‘Jackrabbot’ seeks to understand pedestrian behavior


The Computational Vision and Geometry Lab has developed a robot prototype that could soon autonomously move among us, following normal human social etiquette. It’s named ‘Jackrabbot’ after the springy hares that bounce around campus.

BY VIGNESH RAMACHANDRAN


In order for robots to circulate on sidewalks and mingle with humans in other crowded places, they’ll have to understand the unwritten rules of pedestrian behavior. Stanford researchers have created a short, non-humanoid prototype of just such a moving, self-navigating machine.

The robot is nicknamed “Jackrabbot” – after the jackrabbits often seen darting across the Stanford campus – and looks like a ball on wheels. Jackrabbot is equipped with sensors to be able to understand its surroundings and navigate streets and hallways according to normal human etiquette.

The idea behind the work is that, by observing how Jackrabbot navigates among students in the halls and on the sidewalks of Stanford’s School of Engineering and learns the unwritten conventions of these social behaviors over time, the researchers will gain critical insight into how to design the next generation of everyday robots so that they operate smoothly alongside humans in crowded open spaces such as shopping malls or train stations.

“By learning social conventions, the robot can be part of ecosystems where humans and robots coexist,” said Silvio Savarese, an assistant professor of computer science and director of the Stanford Computational Vision and Geometry Lab.

The researchers will present their system for predicting human trajectories in crowded spaces at the Computer Vision and Pattern Recognition conference in Las Vegas on June 27.

As robotic devices become more common in human environments, it becomes increasingly important that they understand and respect human social norms, Savarese said. How should they behave in crowds? How do they share public resources, like sidewalks or parking spots? When should a robot take its turn? What are the ways people signal each other to coordinate movements and negotiate other spontaneous activities, like forming a line?

These human social conventions aren’t necessarily explicit, nor are they written down complete with lane markings and traffic lights, like the traffic rules that govern the behavior of autonomous cars.

So Savarese’s lab is using machine learning techniques to create algorithms that will, in turn, allow the robot to recognize and react appropriately to unwritten rules of pedestrian traffic. The team’s computer scientists have been collecting images and video of people moving around the Stanford campus and transforming those images into coordinates. From those coordinates, they can train an algorithm.
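As a hedged illustration of what training on coordinates means, the snippet below implements only the simplest non-learned baseline on such (x, y) tracks: constant-velocity extrapolation. The actual model in the paper is a "Social LSTM" neural network; a baseline like this merely serves as the benchmark a learned model must beat:

```python
# Toy baseline for pedestrian trajectory prediction: extrapolate the last
# observed step. Any learned model (e.g. the Social LSTM) must outperform this.
def predict_next(track):
    """Predict the next (x, y) point by repeating the last step's displacement."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    return (2 * x1 - x0, 2 * y1 - y0)

# A pedestrian walking diagonally at a steady pace (invented coordinates):
track = [(0.0, 0.0), (0.5, 0.4), (1.0, 0.8)]
print(predict_next(track))
```

Constant-velocity prediction fails exactly where social conventions matter, such as when two people on a collision course both swerve; capturing those interactions is what the learned model adds.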

“Our goal in this project is to actually learn those (pedestrian) rules automatically from observations – by seeing how humans behave in these kinds of social spaces,” Savarese said. “The idea is to transfer those rules into robots.”

Jackrabbot already moves automatically and can navigate without human assistance indoors, and the team members are fine-tuning the robot’s self-navigation capabilities outdoors. The next step in their research is the implementation of “social aspects” of pedestrian navigation such as deciding rights of way on the sidewalk. This work, described in their newest conference papers, has been demonstrated in computer simulations.

“We have developed a new algorithm that is able to automatically move the robot with social awareness, and we’re currently integrating that in Jackrabbot,” said Alexandre Alahi, a postdoctoral researcher in the lab.

Even though social robots may someday roam among humans, Savarese said he believes they don’t necessarily need to look like humans. Instead they should be designed to look as lovable and friendly as possible. In demos, the roughly three-foot-tall Jackrabbot roams around campus wearing a Stanford tie and sun-hat, generating hugs and curiosity from passersby.

Today, Jackrabbot is an expensive prototype. But Savarese estimates that in five or six years social robots like this could become as cheap as $500, making it possible for companies to release them to the mass market.

“It’s possible to make these robots affordable for on-campus delivery, or for aiding impaired people to navigate in a public space like a train station or for guiding people to find their way through an airport,” Savarese said.

The conference paper is titled “Social LSTM: Human Trajectory Prediction in Crowded Spaces.” See conference program for details.

Source: Stanford University News Service

NASA Satellite Finds Unreported Sources of Toxic Air Pollution

Using a new satellite-based method, scientists at NASA, Environment and Climate Change Canada, and two universities have located 39 unreported and major human-made sources of toxic sulfur dioxide emissions.

A known health hazard and contributor to acid rain, sulfur dioxide (SO2) is one of six air pollutants regulated by the U.S. Environmental Protection Agency. Currently, sulfur dioxide monitoring relies on emission inventories that are derived from ground-based measurements and factors such as fuel usage. The inventories are used to evaluate regulatory policies for air quality improvements and to anticipate future emission scenarios that may occur with economic and population growth.


But, to develop comprehensive and accurate inventories, industries, government agencies and scientists first must know the location of pollution sources.

“We now have an independent measurement of these emission sources that does not rely on what was known or thought known,” said Chris McLinden, an atmospheric scientist with Environment and Climate Change Canada in Toronto and lead author of the study published this week in Nature Geoscience. “When you look at a satellite picture of sulfur dioxide, you end up with it appearing as hotspots, bull’s-eyes in effect, which makes the estimates of emissions easier.”

The 39 unreported emission sources, found in the analysis of satellite data from 2005 to 2014, are clusters of coal-burning power plants, smelters, oil and gas operations found notably in the Middle East, but also in Mexico and parts of Russia. In addition, reported emissions from known sources in these regions were — in some cases — two to three times lower than satellite-based estimates.

Altogether, the unreported and underreported sources account for about 12 percent of all human-made emissions of sulfur dioxide – a discrepancy that can have a large impact on regional air quality, said McLinden.

The research team also located 75 natural sources of sulfur dioxide — non-erupting volcanoes slowly leaking the toxic gas throughout the year. While not necessarily unknown, many volcanoes are in remote locations and not monitored, so this satellite-based data set is the first to provide regular annual information on these passive volcanic emissions.

“Quantifying the sulfur dioxide bull’s-eyes is a two-step process that would not have been possible without two innovations in working with the satellite data,” said co-author Nickolay Krotkov, an atmospheric scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

First was an improvement in the computer processing that transforms raw satellite observations from the Dutch-Finnish Ozone Monitoring Instrument aboard NASA’s Aura spacecraft into precise estimates of sulfur dioxide concentrations. Krotkov and his team now are able to more accurately detect smaller sulfur dioxide concentrations, including those emitted by human-made sources such as oil-related activities and medium-size power plants.

Being able to detect smaller concentrations led to the second innovation. McLinden and his colleagues used a new computer program to more precisely detect sulfur dioxide that had been dispersed and diluted by winds. They then used accurate estimates of wind strength and direction derived from a satellite data-driven model to trace the pollutant back to the location of the source, and also to estimate how much sulfur dioxide was emitted from the smoke stack.
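The back-tracing idea can be sketched in a few lines. This is a hedged toy, not the authors' algorithm: it assumes a single plume centroid and a constant wind vector, and all numbers are invented for illustration:

```python
# Hedged illustration of wind back-tracing: if SO2 appears as a plume displaced
# downwind, the source sits upwind of the observed enhancement by roughly
# (wind vector x transport time).
def trace_back(plume_xy, wind_xy, hours):
    """Shift an observed plume centroid upwind to estimate the source location (km)."""
    px, py = plume_xy
    wx, wy = wind_xy                     # wind vector, km/h
    return (px - wx * hours, py - wy * hours)

# Plume centroid seen 30 km east of a grid origin, steady 10 km/h westerly wind,
# and an assumed 3 hours of transport: the source traces back to the origin.
print(trace_back((30.0, 0.0), (10.0, 0.0), 3.0))
```

The real analysis does this with model-derived wind fields varying in space and time, which is also what lets it estimate how much sulfur dioxide left the stack, not just where the stack is.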

“The unique advantage of satellite data is spatial coverage,” said Bryan Duncan, an atmospheric scientist at Goddard. “This paper is the perfect demonstration of how new and improved satellite datasets, coupled with new and improved data analysis techniques, allow us to identify even smaller pollutant sources and to quantify these emissions over the globe.”

The University of Maryland, College Park, and Dalhousie University in Halifax, Nova Scotia, contributed to this study.

For more information about, and access to, NASA’s air quality data, visit:

http://so2.gsfc.nasa.gov/

NASA uses the vantage point of space to increase our understanding of our home planet, improve lives, and safeguard our future. NASA develops new ways to observe and study Earth’s interconnected natural systems with long-term data records. The agency freely shares this unique knowledge and works with institutions around the world to gain new insights into how our planet is changing.

For more information about NASA Earth science research, visit:

http://www.nasa.gov/earth

The elliptical galaxy NGC 1600, 200 million light-years away — shown in the centre of the image and highlighted in the box — hosts in its centre one of the biggest supermassive black holes known. Until the discovery of this example, astronomers assumed that such huge black holes could only be found in the centres of massive galaxies at the centre of galaxy clusters. NGC 1600, however, is a rather isolated galaxy.

The image is a composition of a ground based view and observations made with the NASA/ESA Hubble Space Telescope.

Credit:
NASA, ESA, Digital Sky Survey 2

NGC 1600’s supermassive black hole discovery puzzles astronomers

Astronomers have uncovered one of the biggest supermassive black holes, with the mass of 17 billion Suns, in an unlikely place: the centre of a galaxy that lies in a quiet backwater of the Universe. The observations, made with the NASA/ESA Hubble Space Telescope and the Gemini Telescope in Hawaii, indicate that these monster objects may be more common than once thought. The results of this study are released in the journal Nature.


Until now, the biggest supermassive black holes — those having more than 10 billion times the mass of our Sun — have only been found at the cores of very large galaxies in the centres of massive galaxy clusters. Now, an international team of astronomers using the NASA/ESA Hubble Space Telescope has discovered a supersized black hole with a mass of 17 billion Suns in the centre of the rather isolated galaxy NGC 1600.

NGC 1600 is an elliptical galaxy which is located not in a cluster of galaxies, but in a small group of about twenty. The group is located 200 million light-years away in the constellation Eridanus. While finding a gigantic supermassive black hole in a massive galaxy within a cluster of galaxies is to be expected, finding one in an average-sized galaxy group like the one surrounding NGC 1600 is much more surprising.

“Even though we already had hints that the galaxy might host an extreme object in the centre, we were surprised that the black hole in NGC 1600 is ten times more massive than predicted by the mass of the galaxy,” explains lead author of the study Jens Thomas from the Max Planck-Institute for Extraterrestrial Physics, Germany.

Based on previous Hubble surveys of supermassive black holes, astronomers had discovered a correlation between a black hole’s mass and the mass of its host galaxy’s central bulge of stars: the larger the galaxy bulge, the more massive the black hole is expected to be. “It appears from our finding that this relation does not work so well with extremely massive black holes,” says Thomas. “These monster black holes account for a much larger fraction of the host galaxy’s mass than the previous correlations would suggest.”

Finding this extremely massive black hole in NGC 1600 leads astronomers to ask whether these objects are more common than previously thought. “There are quite a few galaxies the size of NGC 1600 that reside in average-size galaxy groups,” explains co-author Chung-Pei Ma, an astronomer from the University of California, Berkeley, USA, and head of the MASSIVE Survey [1]. “We estimate that these smaller groups are about fifty times more abundant than large, dense galaxy clusters. So the question now is: is this the tip of an iceberg? Maybe there are a lot more monster black holes out there.”

It is assumed that this black hole grew by merging with another supermassive black hole from another galaxy. It may then have continued to grow by gobbling up gas funneled to the core of the galaxy by further galaxy collisions. This may also explain why NGC 1600 resides in a sparsely populated region of the Universe and why it is at least three times brighter than its neighbours.

As the supermassive black hole is currently dormant, astronomers were only able to find it and estimate its mass by measuring the velocities of stars close to it, using the Gemini North 8-metre telescope on Mauna Kea, Hawaii. Using these data the team discovered that stars lying about 3000 light-years from the core are moving as if there had been many more stars in the core in the distant past. This indicates that most of the stars in this region have been kicked out from the centre of the galaxy.
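The mass estimate from stellar velocities can be sketched, to order of magnitude, with the virial-style relation M ≈ v²r/G. Below is a rough illustration with assumed round numbers; the 300 km/s stellar velocity is an invented illustrative value, not the paper's measured fit:

```python
# Order-of-magnitude dynamical mass estimate: stars at radius r moving at
# speed v around a central mass M satisfy roughly M ~ v^2 * r / G.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
LY = 9.461e15        # metres per light-year

v = 3.0e5            # assumed stellar speed, m/s (~300 km/s, illustrative)
r = 3000 * LY        # the ~3000 light-year radius quoted above

m_bh = v**2 * r / G  # enclosed mass, order of magnitude only
print(f"~{m_bh / M_SUN:.1e} solar masses")
```

With these assumed inputs the estimate lands near 2e10 solar masses, the same order as the 17-billion-solar-mass result, which is why measuring stellar velocities near the core suffices even when the black hole itself is dormant.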

Archival Hubble images, taken with the Near Infrared Camera and Multi-Object Spectrometer (NICMOS), support the idea that the two merging supermassive black holes in the distant past gave stars the boot. The NICMOS images revealed that the galaxy’s core is unusually faint, indicating a lack of stars close to the galactic centre. “We estimate that the mass of stars tossed out of the central region of NGC 1600 is equal to 40 billion Suns,” concludes Thomas. “This is comparable to ejecting the entire disc of our Milky Way galaxy.”

Notes
[1] The MASSIVE Survey, which began in 2014, measures the mass of stars, dark matter, and the central black hole of the 100 most massive, nearby galaxies, those larger than 300 billion solar masses and within 350 million light-years of Earth. Among its goals are to find the descendants of luminous quasars that may be sleeping unsuspected in large nearby galaxies and to understand how galaxies form and grow supermassive black holes.

More information
The Hubble Space Telescope is a project of international cooperation between ESA and NASA.

The study “A 17-billion-solar-mass black hole in a group galaxy with a diffuse core” appeared in the journal Nature.

The international team of astronomers in this study consists of J. Thomas (Max Planck Institute for Extraterrestrial Physics, Germany), C.-P. Ma (University of California, Berkeley, USA), N. McConnell (Dominion Astrophysical Observatory, Canada), J. Greene (Princeton University, USA), J. Blakeslee (Dominion Astrophysical Observatory, Canada), and R. Janish (University of California, Berkeley, USA).

Source: Space Telescope

Academic and research collaboration to improve people to people contacts for peace and progress

Syed Faisal ur Rahman

The Muslim world, especially the Middle East and surrounding regions where we live, is facing some of the worst political turmoil in its history. We are seeing wars, terrorism, a refugee crisis and the resulting economic damage. The toughest calamities are faced by common people, who have very little or no control over the policies that have produced the current mess. Worst of all is the exploitation of sectarianism as a tool to advance foreign policy and strategic agendas. Muslims in many parts of the world criticize Western powers for this situation, but we also need to do some serious soul-searching.

We need to ask: why are we in this mess?

For me one major reason is that OIC members have failed to find enough common constructive goals to bring their people together.

After the Second World War, Europe realized the importance of academic and economic cooperation for promoting peace and stability. CERN is a prime example of how former foes can join hands for the purpose of discovery and innovation.

France and Germany have established joint institutes, and their universities regularly conduct joint research projects. The UK and USA, despite the enormous bloodshed of the American War of Independence, enjoy exemplary people-to-people relationships, and academic collaboration is a major part of that. It is this attitude of thinking big, finding common constructive goals and building strong academic collaboration that has put them at the forefront of science and technology.

Over the last few decades, humanity has sent probes like Voyager that are crossing the limits of our solar system; countries are thinking about colonizing Mars; satellites like Planck and WMAP have tracked radiation from the early stages of our universe; quantum computing now looks like a real possibility; and plans are being made for hypersonic flight. But in most of the so-called Muslim world, we are stuck with centuries-old, good-for-nothing sectarian disputes.

Despite some efforts in the defense sector, OIC member countries largely lack the technology base to independently produce jets, automobiles, advanced electronics, precision instruments and many other things that are produced by public or private sector companies in the USA, China, Russia, Japan and Europe. Most of the things that are produced indigenously by OIC countries rely heavily on foreign core components such as engines or high-precision electronics. This is due to our lack of investment in fundamental research, especially physics.

OIC countries like Turkey, Pakistan, Malaysia, Iran, Saudi Arabia and some others have basic infrastructure on which they can build to conduct research projects and joint ventures in areas like space probes, ground-based optical and radio astronomy, particle physics, climate change and the development of a strong industrial technology base. All we need is the will to start joint projects and to promote knowledge sharing via exchanges of researchers or joint academic and industrial research projects.

These joint projects will not only be helpful in enhancing people-to-people contacts and improving academic research standards, but they will also contribute positively to the overall progress of humanity. It is a great loss for humanity as a whole that a civilization which once led the efforts to develop astronomy, medicine and other key areas of science is now making little or no contribution to advancing our understanding of the universe.

The situation is bad and if we look at Syria, Afghanistan, Iraq, Yemen or Libya then it seems we have hit the rock bottom. It is “Us” who need to find the way out of this mess as no one is going to solve our problems especially the current sectarian mess which is a result of narrow mindsets taking weak decisions. To come out of this dire state, we need broad minds with big vision and a desire of moving forward through mutual respect and understanding.

 

New device could provide electrical power source from walking and other ambient motions: MIT Research

Harnessing the energy of small bending motions
New device could provide electrical power source from walking and other ambient motions.

By David Chandler


 

CAMBRIDGE, Mass.–For many applications such as biomedical, mechanical, or environmental monitoring devices, harnessing the energy of small motions could provide a small but virtually unlimited power supply. While a number of approaches have been attempted, researchers at MIT have now developed a completely new method based on electrochemical principles, which could be capable of harvesting energy from a broader range of natural motions and activities, including walking.

The new system, based on the slight bending of a sandwich of metal and polymer sheets, is described in the journal Nature Communications, in a paper by MIT professor Ju Li, graduate students Sangtae Kim and Soon Ju Choi, and four others.

Most previously designed devices for harnessing small motions have been based on the triboelectric effect (essentially friction, like rubbing a balloon against a wool sweater) or piezoelectrics (crystals that produce a small voltage when bent or compressed). These work well for high-frequency sources of motion such as those produced by the vibrations of machinery. But for typical human-scale motions such as walking or exercising, such systems have limits.

“When you put in an impulse” to such traditional materials, “they respond very well, in microseconds. But this doesn’t match the timescale of most human activities,” says Li, who is the Battelle Energy Alliance Professor in Nuclear Science and Engineering and professor of materials science and engineering. “Also, these devices have high electrical impedance and bending rigidity and can be quite expensive,” he says.

Simple and flexible

By contrast, the new system uses technology similar to that in lithium ion batteries, so it could likely be produced inexpensively at large scale, Li says. In addition, these devices would be inherently flexible, making them more compatible with wearable technology and less likely to break under mechanical stress.

While piezoelectric materials are based on a purely physical process, the new system is electrochemical, like a battery or a fuel cell. It uses two thin sheets of lithium alloys as electrodes, separated by a layer of porous polymer soaked with liquid electrolyte that is efficient at transporting lithium ions between the metal plates. But unlike a rechargeable battery, which takes in electricity, stores it, and then releases it, this system takes in mechanical energy and puts out electricity.

When bent even a slight amount, the layered composite produces a pressure difference that squeezes lithium ions through the polymer (like the reverse osmosis process used in water desalination). It also produces a counteracting voltage and an electrical current in the external circuit between the two electrodes, which can be then used directly to power other devices.

Because it requires only a small amount of bending to produce a voltage, such a device could simply have a tiny weight attached to one end to cause the metal to bend as a result of ordinary movements, when strapped to an arm or leg during everyday activities. Unlike batteries and solar cells, the output from the new system comes in the form of alternating current (AC), with the flow moving first in one direction and then the other as the material bends first one way and then back.

This device converts mechanical to electrical energy; therefore, “it is not limited by the second law of thermodynamics,” Li says, which sets an upper limit on the theoretically possible efficiency. “So in principle, [the efficiency] could be 100 percent,” he says. In this first-generation device developed to demonstrate the electrochemomechanical working principle, he says, “the best we can hope for is about 15 percent” efficiency. But the system could easily be manufactured in any desired size and is amenable to industrial manufacturing processes.

Test of time

The test devices maintain their properties through many cycles of bending and unbending, Li reports, with little reduction in performance after 1,500 cycles. “It’s a very stable system,” he says.

Previously, the phenomenon underlying the new device “was considered a parasitic effect in the battery community,” according to Li, and voltage put into the battery could sometimes induce bending. “We do just the opposite,” Li says, putting in the stress and getting a voltage as output. Besides being a potential energy source, he says, this could also be a complementary diagnostic tool in electrochemistry. “It’s a good way to evaluate damage mechanisms in batteries, a way to understand battery materials better,” he says.

In addition to harnessing daily motion to power wearable devices, the new system might also be useful as an actuator with biomedical applications, or used for embedded stress sensors in settings such as roads, bridges, keyboards, or other structures, the researchers suggest.

The team also included postdoc Kejie Zhao (now an assistant professor at Purdue University), visiting graduate student Giorgia Gobbi, and Hui Yang and Sulin Zhang at Penn State. The work was supported by the National Science Foundation, the MIT MADMEC Contest, the Samsung Scholarship Foundation, and the Kwanjeong Educational Foundation.

Source: MIT News Office

Light behaves both as a particle and as a wave. Since the days of Einstein, scientists have been trying to directly observe both of these aspects of light at the same time. Now, scientists at EPFL have succeeded in capturing the first-ever snapshot of this dual behavior.
Credit: EPFL

Entering 2016 with new hope

Syed Faisal ur Rahman

Year 2015 left many good and bad memories for many of us. On one hand we saw more wars, terrorist attacks and political confrontations; on the other hand we saw humanity raising voices for peace, sheltering refugees and joining hands to confront climate change.

In science, we saw the first-ever photograph of light behaving as both wave and particle. We also saw serious developments in machine learning, data science and artificial intelligence, along with voices urging caution about AI overtaking humanity and about related privacy issues. The big question of energy and climate change remained a key point of discussion in scientific and political circles. The biggest breakthrough came near the end of the year with the Paris deal during COP21.

The deal, involving around 200 countries, represents a true spirit of humanity: a commitment to limit global warming to below 2°C, and to strive to keep temperatures within 1.5°C above pre-industrial levels. This truly global commitment also brought rival countries together for a common cause, to save humanity from self-destruction. I hope the spirit will continue in other areas of common interest as well.

This spectacular view from the NASA/ESA Hubble Space Telescope shows the rich galaxy cluster Abell 1689. The huge concentration of mass bends light coming from more distant objects and can increase their total apparent brightness and make them visible. One such object, A1689-zD1, is located in the box — although it is still so faint that it is barely seen in this picture. New observations with ALMA and ESO’s VLT have revealed that this object is a dusty galaxy seen when the Universe was just 700 million years old. Credit: NASA; ESA; L. Bradley (Johns Hopkins University); R. Bouwens (University of California, Santa Cruz); H. Ford (Johns Hopkins University); and G. Illingworth (University of California, Santa Cruz)

Space sciences also saw some enormous advancements: New Horizons sent photographs from Pluto, and SpaceX successfully landed its reusable Falcon 9 rocket after a successful launch. We also saw the discovery, by Prof Lajos Balazs, of the largest regular formation in the Universe: a ring of nine galaxies 7 billion light years away and 5 billion light years wide, covering a third of our sky. We also learnt this year that Mars once had more water than Earth’s Arctic Ocean, and NASA later confirmed evidence that water flows on the surface of Mars. The announcement led to some interesting insights into the atmospheric studies and history of the red planet.

In the researchers' new system, a returning beam of light is mixed with a locally stored beam, and the correlation of their phase, or period of oscillation, helps remove noise caused by interactions with the environment. Illustration: Jose-Luis Olivares/MIT

We also saw some encouraging advancements in neuroscience, where MIT researchers developed a technique allowing direct stimulation of neurons, which could be an effective treatment for a variety of neurological diseases, without the need for implants or external connections. Researchers also reactivated neuroplasticity in older mice, restoring their brains to a younger state, and good progress was made in combating Alzheimer’s disease.

Quantum physics again stayed a key area of scientific advancement. Quantum computing is getting closer to becoming a viable alternative to current computing architectures. The packing of single-photon detectors on an optical chip is a crucial step toward quantum-computational circuits. Researchers at the Australian National University (ANU) performed an experiment showing that reality does not exist until it is measured.

There are many other areas where science and technology reached new heights and will hopefully continue to do so in 2016. I hope these advancements will not only help us grow economically but also help us become better human beings and a better society.


SpaceX successfully landed its Falcon 9 rocket after launching it into space

SpaceX, founded by Elon Musk, has landed its Falcon 9 rocket after launching it into space. The rocket is part of an attempt to develop a credible reusable launch platform for sending satellites into space.

According to SpaceX’s YouTube page:

“With this mission, SpaceX’s Falcon 9 rocket will deliver 11 satellites to low-Earth orbit for ORBCOMM, a leading global provider of Machine-to-Machine communication and Internet of Things solutions. The ORBCOMM launch is targeted for an evening launch from Space Launch Complex 40 at Cape Canaveral Air Force Station, Fla. If all goes as planned, the 11 satellites will be deployed approximately 20 minutes after liftoff, completing a 17-satellite, low Earth orbit constellation for ORBCOMM. This mission also marks SpaceX’s return-to-flight as well as its first attempt to land a first stage on land. The landing of the first stage is a secondary test objective.”

The YouTube video link is given below:
ORBCOMM-2 Full Launch Webcast

Finding new employees who support company culture a top concern for businesses expanding abroad, EIU report finds

  • New report identifies “softer” aspects of business expansions, such as sourcing new employees who support and enhance the brand’s existing culture, as a top concern
  • Other findings include the desire to open new markets and gain market share as the main drivers for corporate expansions abroad
  • A location’s level of taxation or skills shortages do not seem to be as much of a concern to companies expanding overseas as might have been expected

A new report released on December 3rd by The Economist Intelligence Unit (EIU) states that bringing new people into a company’s culture and values is among the biggest challenges during international expansions. Corporate overseas expansion: Opportunities and barriers, sponsored by TMF Group, builds on a survey of 155 senior executives with knowledge of the issues involved in their company’s expansion into foreign markets.

Among those interviewed for the report there was near-unanimous agreement that maintaining company culture while respecting local customs and cultural differences is a fundamental objective for a successful international expansion. By contrast, policymakers may have overstated the importance of a location’s level of taxation, as this seems to be far less of a concern in companies’ expansion projects than might have been expected.

The survey also finds that a desire to open new markets and gain market share are the principal drivers of corporate expansions abroad, selected by 59% and 57% of respondents respectively. This is especially the case for European countries, as sluggish growth in domestic markets has encouraged many European companies to seek stronger returns overseas. By contrast, the majority of respondents in Asia-Pacific (53%) are particularly driven by the need to find new sources of capital.

Martin Koehring, the editor of the report, said: “It’s clear from our report that once a company’s executive team has identified its scope for an overseas expansion, much of the success will rest on comprehensive planning. This includes ‘softer’ brand-authenticity elements, such as maintaining the company culture and values, that are in some regards more pressing—or perhaps more challenging to master—than ‘harder’ aspects such as currency hedging, integrating operational systems and ensuring compliance with local regulations.”

Read Corporate overseas expansion: Opportunities and barriers here

Source: EIU

Stanford study finds promise in expanding renewables based on results in three major economies

A new Stanford study found that renewable energy can make a major and increasingly cost-effective contribution to alleviating climate change.

BY TERRY NAGEL


Stanford energy experts have released a study that compares the experiences of three large economies in ramping up renewable energy deployment and concludes that renewables can make a major and increasingly cost-effective contribution to climate change mitigation.

The report from Stanford’s Steyer-Taylor Center for Energy Policy and Finance analyzes the experiences of Germany, California and Texas, the world’s fourth, eighth and 12th largest economies, respectively. It found, among other things, that Germany, which gets about half as much sunshine as California and Texas, nevertheless generates electricity from solar installations at a cost comparable to that of Texas and only slightly higher than in California.

The report was released in time for the United Nations Climate Change Conference that started this week, where international leaders are gathering to discuss strategies to deal with global warming, including massive scale-ups of renewable energy.

“As policymakers from around the world gather for the climate negotiations in Paris, our report draws on the experiences of three leaders in renewable-energy deployment to shed light on some of the most prominent and controversial themes in the global renewables debate,” said Dan Reicher, executive director of the Steyer-Taylor Center, which is a joint center between Stanford Law School and Stanford Graduate School of Business. Reicher also is interim president and chief executive officer of the American Council on Renewable Energy.

“Our findings suggest that renewable energy has entered the mainstream and is ready to play a leading role in mitigating global climate change,” said Felix Mormann, associate professor of law at the University of Miami, faculty fellow at the Steyer-Taylor Center and lead author of the report.

Other conclusions of the report, “A Tale of Three Markets: Comparing the Solar and Wind Deployment Experiences of California, Texas, and Germany,” include:

  • Germany’s success in deploying renewable energy at scale is due largely to favorable treatment of “soft cost” factors such as financing, permitting, installation and grid access. This approach has allowed the renewable energy policies of some countries to deliver up to four times the average deployment of other countries, despite offering only half the financial incentives.
  • Contrary to widespread concern, a higher share of renewables does not automatically translate to higher electricity bills for ratepayers. While Germany’s residential electric rates are two to three times those of California and Texas, this price differential is only partly due to Germany’s subsidies for renewables. The average German household’s electricity bill is, in fact, lower than in Texas and only slightly higher than in California, partly as a result of energy-efficiency efforts in German homes.
  • An increase in the share of intermittent solar and wind power need not jeopardize the stability of the electric grid. From 2006 to 2013, Germany tripled the amount of electricity generated from solar and wind to a market share of 26 percent, while managing to reduce average annual outage times for electricity customers in its grid from an already impressive 22 minutes to just 15 minutes. During that same period, California tripled the amount of electricity produced from solar and wind to a joint market share of 8 percent and reduced its outage times from more than 100 minutes to less than 90 minutes. However, Texas increased its outage times from 92 minutes to 128 minutes after ramping up its wind-generated electricity sixfold to a market share of 10 percent.
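The grid-reliability figures in the last bullet can be summarized as relative changes. A minimal sketch, using only the numbers quoted in the report (California’s endpoints are approximated from “more than 100” and “less than 90” minutes):

```python
# Average annual outage minutes, 2006 vs. 2013, as quoted in the report.
outage_minutes = {
    "Germany": (22, 15),
    "California": (100, 90),  # approximate endpoints
    "Texas": (92, 128),
}

for region, (before, after) in outage_minutes.items():
    pct = 100.0 * (after - before) / before
    print(f"{region}: {before} -> {after} min/year ({pct:+.0f}%)")
```

Germany and California improved reliability by roughly a third and a tenth respectively while tripling solar and wind output, whereas Texas saw outage times rise by about 39 percent.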

The study may inform the energy debate in the United States, where expanding the nation’s renewable energy infrastructure is a top priority of the Obama administration and the subject of debate among presidential candidates.

The current share of renewables in U.S. electricity generation is 14 percent – half that of Germany. Germany’s ambitious – and controversial – Energiewende (Energy Transition) initiative commits the country to meeting 80 percent of its electricity needs with renewables by 2050. In the United States, 29 states, including California and Texas, have set mandatory targets for renewable energy.

In California, Gov. Jerry Brown recently signed legislation committing the state to producing 50 percent of its electricity from renewables by 2030. Texas, the leading U.S. state for wind development, set a mandate of 10,000 megawatts of renewable energy capacity by 2025, but reached this target 15 years ahead of schedule and now generates over 10 percent of the state’s electricity from wind alone.

Source: Stanford News