KARACHI: At least 209 people lost their lives and hundreds of others sustained injuries in collapsed structures and landslides caused by a powerful 7.5-magnitude earthquake that jolted northern parts of Pakistan on Monday.
The enormity of the quake can be gauged from the fact that tremors were felt all across South Asia.
The majority of deaths were reported from Shangla, and the toll is feared to rise further given the scale of the calamity.
The powerful quake caused a large number of walls, houses and other structures to cave in, while landslides were reported in parts of the affected areas.
The earthquake was also felt in several parts of Punjab, including Lahore, where thousands of people rushed out of their houses, shops and offices for safety. Residents said that never before had an earthquake caused them such panic.
Tremors were also felt in Islamabad, Sargodha, Kashmir and several other parts of the country.
According to the Commissioner of Malakand, 137 people died in the Swat-Malakand division and 835 were injured. He said as many as 813 houses collapsed in Malakand.
The Chief Minister of Gilgit-Baltistan said the intensity of the earthquake seemed much greater than that of the 2005 quake.
The US Geological Survey put the epicentre near Jurm in northeast Afghanistan, 250 kilometres (160 miles) from the capital Kabul and at a depth of 213.5 kilometres.
The Met Office in Pakistan said the magnitude was 8.1 on the Richter scale.
The epicentre is just a few hundred kilometres from the site of a 7.6 magnitude quake that struck in October 2005, killing more than 75,000 people and displacing some 3.5 million more.
The earthquake was said to be one of the most powerful ever recorded in Pakistan’s history.
Quake in Afghanistan and India
Thousands of frightened people rushed into the streets across Afghanistan and India as the quake rocked a swathe of the subcontinent. Shockwaves were felt in areas as far away as New Delhi in India and Kabul in Afghanistan.
Hundreds of people raced from buildings onto the streets in different cities while the quake was also felt in the Kashmir region.
System that replaces human intuition with algorithms outperforms 615 of 906 human teams.
By Larry Hardesty
Big-data analysis consists of searching for buried patterns that have some kind of predictive power. But choosing which “features” of the data to analyze usually requires some human intuition. In a database containing, say, the beginning and end dates of various sales promotions and weekly profits, the crucial data may not be the dates themselves but the spans between them, or not the total profits but the averages across those spans.
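The promotions example above can be made concrete with a short sketch. The table, its column names, and the numbers below are all illustrative, not from any real dataset; the point is only that the useful features are derived from the raw columns rather than being the raw columns themselves.

```python
import pandas as pd

# Hypothetical promotions table: the raw columns are start/end dates
# and a total profit figure for each promotion.
promos = pd.DataFrame({
    "start": pd.to_datetime(["2015-01-05", "2015-02-10", "2015-03-02"]),
    "end":   pd.to_datetime(["2015-01-12", "2015-02-24", "2015-03-09"]),
    "total_profit": [7000.0, 21000.0, 6300.0],
})

# Derived features: the span between the dates, and the average
# profit per day across that span, rather than the raw values.
promos["span_days"] = (promos["end"] - promos["start"]).dt.days
promos["profit_per_day"] = promos["total_profit"] / promos["span_days"]

print(promos[["span_days", "profit_per_day"]])
```

Here the second promotion has the highest total profit but only a middling profit per day, which is exactly the kind of pattern the raw columns hide.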
MIT researchers aim to take the human element out of big-data analysis, with a new system that not only searches for patterns but designs the feature set, too. To test the first prototype of their system, they enrolled it in three data science competitions, in which it competed against human teams to find predictive patterns in unfamiliar data sets. Of the 906 teams participating in the three competitions, the researchers’ “Data Science Machine” finished ahead of 615.
In two of the three competitions, the predictions made by the Data Science Machine were 94 percent and 96 percent as accurate as the winning submissions. In the third, the figure was a more modest 87 percent. But where the teams of humans typically labored over their prediction algorithms for months, the Data Science Machine took somewhere between two and 12 hours to produce each of its entries.
“We view the Data Science Machine as a natural complement to human intelligence,” says Max Kanter, whose MIT master’s thesis in computer science is the basis of the Data Science Machine. “There’s so much data out there to be analyzed. And right now it’s just sitting there not doing anything. So maybe we can come up with a solution that will at least get us started on it, at least get us moving.”
Between the lines
Kanter and his thesis advisor, Kalyan Veeramachaneni, a research scientist at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), describe the Data Science Machine in a paper that Kanter will present next week at the IEEE International Conference on Data Science and Advanced Analytics.
Veeramachaneni co-leads the Anyscale Learning for All group at CSAIL, which applies machine-learning techniques to practical problems in big-data analysis, such as determining the power-generation capacity of wind-farm sites or predicting which students are at risk for dropping out of online courses.
“What we observed from our experience solving a number of data science problems for industry is that one of the very critical steps is called feature engineering,” Veeramachaneni says. “The first thing you have to do is identify what variables to extract from the database or compose, and for that, you have to come up with a lot of ideas.”
In predicting dropout, for instance, two crucial indicators proved to be how long before a deadline a student begins working on a problem set and how much time the student spends on the course website relative to his or her classmates. MIT’s online-learning platform MITx doesn’t record either of those statistics, but it does collect data from which they can be inferred.
Kanter and Veeramachaneni use a couple of tricks to manufacture candidate features for data analyses. One is to exploit structural relationships inherent in database design. Databases typically store different types of data in different tables, indicating the correlations between them using numerical identifiers. The Data Science Machine tracks these correlations, using them as a cue to feature construction.
For instance, one table might list retail items and their costs; another might list items included in individual customers’ purchases. The Data Science Machine would begin by importing costs from the first table into the second. Then, taking its cue from the association of several different items in the second table with the same purchase number, it would execute a suite of operations to generate candidate features: total cost per order, average cost per order, minimum cost per order, and so on. As numerical identifiers proliferate across tables, the Data Science Machine layers operations on top of each other, finding minima of averages, averages of sums, and so on.
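The retail example can be sketched in a few lines. The tables and values are invented for illustration, but the two steps mirror the description: import costs across the identifier, then aggregate along the order relationship to generate candidate features.

```python
import pandas as pd

# Two hypothetical tables linked by numerical identifiers, as in the
# retail example: items with costs, and line items within purchases.
items = pd.DataFrame({"item_id": [1, 2, 3],
                      "cost": [5.0, 12.0, 3.0]})
lines = pd.DataFrame({"order_id": [100, 100, 101, 101, 101],
                      "item_id":  [1, 2, 1, 2, 3]})

# Step 1: import costs from the first table into the second.
lines = lines.merge(items, on="item_id")

# Step 2: aggregate along the order_id relationship to produce
# candidate features: total, average and minimum cost per order.
per_order = lines.groupby("order_id")["cost"].agg(
    total="sum", average="mean", minimum="min")
print(per_order)
```

Layering works the same way: if orders were in turn linked to customers, each of these per-order columns could itself be averaged or summed per customer, giving minima of averages, averages of sums, and so on.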
It also looks for so-called categorical data, which appear to be restricted to a limited range of values, such as days of the week or brand names. It then generates further feature candidates by dividing up existing features across categories.
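A minimal sketch of dividing an existing feature across a categorical column (the log and its values are hypothetical): one aggregate feature becomes several, one per category value.

```python
import pandas as pd

# Hypothetical purchase log: each row is one purchase by a customer,
# with a categorical day-of-week column.
log = pd.DataFrame({
    "customer": [1, 1, 1, 2, 2],
    "day": ["Mon", "Mon", "Sat", "Tue", "Sat"],
    "amount": [10.0, 30.0, 5.0, 20.0, 15.0],
})

# Splitting the "amount" feature across the categories turns a single
# mean-spend feature into one candidate feature per day of the week.
wide = log.pivot_table(index="customer", columns="day",
                       values="amount", aggfunc="mean")
print(wide)
```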
Once it’s produced an array of candidates, it reduces their number by identifying those whose values seem to be correlated. Then it starts testing its reduced set of features on sample data, recombining them in different ways to optimize the accuracy of the predictions they yield.
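The pruning step can be illustrated with a simple stand-in: drop one feature from every pair whose correlation exceeds a threshold. The synthetic features and the 0.95 cutoff are assumptions for the sketch, not details from the paper.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Synthetic candidate-feature matrix; f_redundant is almost an exact
# copy of f_base, so the pair is highly correlated.
f_base = rng.normal(size=200)
features = pd.DataFrame({
    "f_base": f_base,
    "f_redundant": f_base + rng.normal(scale=0.01, size=200),
    "f_independent": rng.normal(size=200),
})

# Keep only the upper triangle of the correlation matrix so each
# pair is inspected once, then drop one member of correlated pairs.
corr = features.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [col for col in upper.columns if (upper[col] > 0.95).any()]
reduced = features.drop(columns=to_drop)
print(to_drop)
```

What survives this filter is the reduced set the system then tests and recombines on sample data.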
“The Data Science Machine is one of those unbelievable projects where applying cutting-edge research to solve practical problems opens an entirely new way of looking at the problem,” says Margo Seltzer, a professor of computer science at Harvard University who was not involved in the work. “I think what they’ve done is going to become the standard quickly — very quickly.”
In a world transformed by climate change and human activity, Stanford scientists say that conserving biodiversity and protecting species will require an interdisciplinary combination of ecological and social research methods.
By Ker Than
A threatened tree species in Alaska could serve as a model for integrating ecological and social research methods in efforts to safeguard species that are vulnerable to climate change effects and human activity.
In a new Stanford-led study, published online this week in the journal Biological Conservation, scientists assessed the health of yellow cedar, a culturally and commercially valuable tree throughout coastal Alaska that is experiencing climate change-induced dieback.
In an era when climate change touches every part of the globe, the traditional conservation approach of setting aside lands to protect biodiversity is no longer sufficient to protect species, said the study’s first author, Lauren Oakes, a research associate at Stanford University.
“A lot of that kind of conservation planning was intended to preserve historic conditions, which, for example, might be defined by the population of a species 50 years ago or specific ecological characteristics when a park was established,” said Oakes, who is a recent PhD graduate of the Emmett Interdisciplinary Program in Environment and Resources (E-IPER) at Stanford’s School of Earth, Energy, & Environmental Sciences.
But as the effects of climate change become increasingly apparent around the world, resource managers are beginning to recognize that “adaptive management” strategies are needed that account for how climate change affects species now and in the future.
Similarly, because climate change effects will vary across regions, new management interventions must consider not only local laws, policies and regulations, but also local peoples’ knowledge about climate change impacts and their perceptions about new management strategies. For yellow cedar, new strategies could include assisting migration of the species to places where it may be more likely to survive or increasing protection of the tree from direct uses, such as harvesting.
Gathering these perspectives requires an interdisciplinary social-ecological approach, said study leader Eric Lambin, the George and Setsuko Ishiyama Provostial Professor in the School of Earth, Energy, & Environmental Sciences.
“The impact of climate change on ecosystems is not just a biophysical issue. Various actors depend on these ecosystems and on the services they provide for their livelihoods,” said Lambin, who is also a senior fellow at the Stanford Woods Institute for the Environment.
“Moreover, as the geographic distribution of species is shifting due to climate change, new areas that are currently under human use will need to be managed for biodiversity conservation. Any feasible management solution needs to integrate the ecological and social dimensions of this challenge.”
Gauging yellow cedar health
The scientists used aerial surveys to map the distribution of yellow cedar in Alaska’s Glacier Bay National Park and Preserve (GLBA) and collected data about the trees’ health and environmental conditions from 18 randomly selected plots inside the park and just south of the park on designated wilderness lands.
“Some of the plots were really challenging to access,” Oakes said. “We would get dropped off by boat for 10 to 15 days at a time, travel by kayak on the outer coast, and hike each day through thick forests to reach the sites. We’d wake up at 6 a.m. and it wouldn’t be until 11 a.m. that we reached the sites and actually started the day’s work of measuring trees.”
The field surveys revealed that yellow cedars inside of GLBA were relatively healthy and unstressed compared to trees outside the park, to the south. Results also showed reduced crowns and browned foliage in yellow cedar trees at sites outside the park, indicating early signs of the dieback progressing toward the park.
Additionally, modeling by study co-authors Paul Hennon, David D’Amore, and Dustin Wittwer at the USDA Forest Service suggested the dieback is expected to emerge inside GLBA in the future. As the region warms, reductions in snow cover, which helps insulate the tree’s shallow roots, leave the roots vulnerable to sudden springtime cold events.
In addition to collecting data about the trees themselves with a team of research assistants, Oakes conducted interviews with 45 local residents and land managers to understand their perceptions about climate change-induced yellow cedar dieback; whether or not they thought humans should intervene to protect the species in GLBA; and what forms those interventions should take.
One unexpected and interesting pattern that emerged from the interviews is that those participants who perceived protected areas as “separate” from nature commonly expressed strong opposition to intervention inside protected areas, like GLBA. In contrast, those who thought of humans as being “a part of” protected areas viewed intervention more favorably.
“Native Alaskans told me stories of going to yellow cedar trees to walk with their ancestors,” Oakes said. “There were other interview participants who said they’d go to a yellow cedar tree every day just to be in the presence of one.”
These people tended to support new kinds of interventions because they believed humans were inherently part of the system and they derived many intangible values, like spiritual or recreational values, from the trees. In contrast, those who perceived protected areas as “natural” and separate from humans were more likely to oppose new interventions in the protected areas.
Lambin said he was not surprised to see this pattern for individuals because people’s choices are informed by their values. “It was less expected for land managers who occupy an official role,” he added. “We often think about an organization and its missions, but forget that day-to-day decisions are made by people who carry their own value systems and perceptions of risks.”
The insights provided by combining ecological and social techniques could inform decisions about when, where, and how to adapt conservation practices in a changing climate, said study co-author Nicole Ardoin, an assistant professor at Stanford’s Graduate School of Education and a center fellow at the Woods Institute.
“Some initial steps in southeast Alaska might include improving tree monitoring in protected areas and increasing collaboration among the agencies that oversee managed and protected lands, as well as working with local community members to better understand how they value these species,” Ardoin said.
The team members said they believe their interdisciplinary approach is applicable to other climate-sensitive ecosystems and species, ranging from redwood forests in California to wild herbivore species in African savannas, and especially those that are currently surrounded by human activities.
“In a human-dominated planet, such studies will have to become the norm,” Lambin said. “Humans are part of these land systems that are rapidly transforming.”
This study was done in partnership with the U.S. Forest Service Pacific Northwest Research Station. It was funded with support from the George W. Wright Climate Change Fellowship; the Morrison Institute for Population and Resource Studies and the School of Earth, Energy & Environmental Sciences at Stanford University; the Wilderness Society Gloria Barron Fellowship; the National Forest Foundation; and U.S. Forest Service Pacific Northwest Research Station and Forest Health Protection.
The recent postponement by the government of Pakistan, citing security reasons, of the first Organisation of Islamic Cooperation (OIC) summit on science and technology and the 15th COMSTECH general assembly says a lot about our national priorities.
The summit was to be the first meeting of its kind, bringing heads of state and dignitaries from across the Muslim world together on the issue of science and technology.
Today most Muslim countries are seen in other parts of the world as backward, narrow-minded and violent. Recent wars in the Middle East, sectarian rifts and totalitarian regimes do not present a great picture either. While the rest of the world is sending probes towards the edge of our solar system, missions to Mars and explorers to the moons of Saturn, we are busy failing to sight the moon on the right dates of the Islamic calendar.
Any average person can figure out that something drastic is needed to change this situation. This summit was exactly the kind of step we needed for a jump start. Serious efforts were made by the COMSTECH staff under the leadership of Dr. Shaukat Hameed Khan, and even the secretary general of the OIC pushed hard for the summit. According to reports, he personally visited more than a dozen OIC member countries to convince their heads of state to attend.
The summit would also have provided an opportunity to bring harmony and peace to the Muslim world, as many Muslim countries are at odds with each other over regional issues such as Syria, Iraq, Yemen and Afghanistan.
The last century saw enormous developments in fundamental science, which helped countries rapidly develop their potential in industry, medicine, defense, space and many other sectors. Countries that made science and technology research and education priority areas emerged stronger than those that merely relied on agriculture and an abundance of natural resources. We now live in an era in which humanity is reaching the edge of the solar system through probes like Voyager 1, launched decades ago by NASA carrying messages from our civilization. Quantum computing is well on its way to becoming a reality. Multi-national projects aim to colonize other planets. And we are looking deeper into space than ever, at new stars and galaxies and, through cosmic microwave background probes like Planck, at some of the earliest moments after the birth of our universe.
Unfortunately, in Pakistan, anti-science and anti-research attitudes are getting stronger. These attitudes are not limited to religious zealots; the so-called liberals of Pakistan, too, pay little heed to what is going on in the world of science.
If you follow the political arena and the daily news coverage, and keep your ears open to what is happening in the country, you can easily see what our priorities are as a nation. How many talk shows did the mainstream media run on the cancellation of the summit? How many questions were raised in parliament?
The near-total absence of such discussion is conspicuous. Apart from one senator, Sehar Kamran, who wrote a piece in a newspaper, no politician even bothered to raise the relevant questions.
Forget the mainstream media and politicians. On social media or in drawing-room discussions, did you hear anyone debate the issue? Meanwhile we make a fuss over what dress some model wore to her court hearing in a money laundering case, which politician’s marriage is supposedly in trouble, or whose hand Junaid Jamshed was holding in a group photo.
We boast about our success in reducing terrorism through military operations and use that success to attract investors, sports teams and tourists, yet we cite security concerns as an excuse to cancel an important summit on the development of science and technology. This shows that we are either confused, or hypocrites, or simply not ready for any kind of intellectual growth.
There is a need for some serious brainstorming and soul searching about our priorities. One thing I have learned as a student of astronomy is that we are insignificant compared to the vastness of our universe; the only thing that can make us somewhat special, compared to other species on Earth or a lifeless rock on Pluto, is our ability to learn, to explore and to discover. Unfortunately, in our country we are losing this special capacity day by day.
Researchers use engineered viruses to provide quantum-based enhancement of energy transport.
By David Chandler
CAMBRIDGE, Mass.–Nature has had billions of years to perfect photosynthesis, which directly or indirectly supports virtually all life on Earth. In that time, the process has achieved almost 100 percent efficiency in transporting the energy of sunlight from receptors to reaction centers where it can be harnessed — a performance vastly better than even the best solar cells.
One way plants achieve this efficiency is by making use of the exotic effects of quantum mechanics — effects sometimes known as “quantum weirdness.” These effects, which include the ability of a particle to exist in more than one place at a time, have now been used by engineers at MIT to achieve a significant efficiency boost in a light-harvesting system.
Surprisingly, the MIT researchers achieved this new approach to solar energy not with high-tech materials or microchips — but by using genetically engineered viruses.
This achievement in coupling quantum research and genetic manipulation, described this week in the journal Nature Materials, was the work of MIT professors Angela Belcher, an expert on engineering viruses to carry out energy-related tasks, and Seth Lloyd, an expert on quantum theory and its potential applications; research associate Heechul Park; and 14 collaborators at MIT and in Italy.
Lloyd, a professor of mechanical engineering, explains that in photosynthesis, a photon hits a receptor called a chromophore, which in turn produces an exciton — a quantum particle of energy. This exciton jumps from one chromophore to another until it reaches a reaction center, where that energy is harnessed to build the molecules that support life.
But the hopping pathway is random and inefficient unless it takes advantage of quantum effects that allow it, in effect, to take multiple pathways at once and select the best ones, behaving more like a wave than a particle.
This efficient movement of excitons has one key requirement: The chromophores have to be arranged just right, with exactly the right amount of space between them. This, Lloyd explains, is known as the “Quantum Goldilocks Effect.”
That’s where the virus comes in. By engineering a virus that Belcher has worked with for years, the team was able to get it to bond with multiple synthetic chromophores — or, in this case, organic dyes. The researchers were then able to produce many varieties of the virus, with slightly different spacings between those synthetic chromophores, and select the ones that performed best.
In the end, they were able to more than double excitons’ speed, increasing the distance they traveled before dissipating — a significant improvement in the efficiency of the process.
The project started from a chance meeting at a conference in Italy. Lloyd and Belcher, a professor of biological engineering, were reporting on different projects they had worked on, and began discussing the possibility of a project encompassing their very different expertise. Lloyd, whose work is mostly theoretical, pointed out that the viruses Belcher works with have the right length scales to potentially support quantum effects.
In 2008, Lloyd had published a paper demonstrating that photosynthetic organisms transmit light energy efficiently because of these quantum effects. When he saw Belcher’s report on her work with engineered viruses, he wondered if that might provide a way to artificially induce a similar effect, in an effort to approach nature’s efficiency.
“I had been talking about potential systems you could use to demonstrate this effect, and Angela said, ‘We’re already making those,’” Lloyd recalls. Eventually, after much analysis, “We came up with design principles to redesign how the virus is capturing light, and get it to this quantum regime.”
Within two weeks, Belcher’s team had created their first test version of the engineered virus. Many months of work then went into perfecting the receptors and the spacings.
Once the team engineered the viruses, they were able to use laser spectroscopy and dynamical modeling to watch the light-harvesting process in action, and to demonstrate that the new viruses were indeed making use of quantum coherence to enhance the transport of excitons.
“It was really fun,” Belcher says. “A group of us who spoke different [scientific] languages worked closely together, to both make this class of organisms, and analyze the data. That’s why I’m so excited by this.”
While this initial result is essentially a proof of concept rather than a practical system, it points the way toward an approach that could lead to inexpensive and efficient solar cells or light-driven catalysis, the team says. So far, the engineered viruses collect and transport energy from incoming light, but do not yet harness it to produce power (as in solar cells) or molecules (as in photosynthesis). But this could be done by adding a reaction center, where such processing takes place, to the end of the virus where the excitons end up.
The research was supported by the Italian energy company Eni through the MIT Energy Initiative. In addition to MIT postdocs Nimrod Heldman and Patrick Rebentrost, the team included researchers at the University of Florence, the University of Perugia, and Eni.
Stanford sociologist Robb Willer finds that an effective way to persuade people in politics is to reframe arguments to appeal to the moral values of those holding opposing positions.
BY CLIFTON B. PARKER
In today’s American politics, it might seem impossible to craft effective political messages that reach across the aisle on hot-button issues like same-sex marriage, national health insurance and military spending. But, based on new research by Stanford sociologist Robb Willer, there’s a way to craft messages that could lead to politicians finding common ground.
“We found the most effective arguments are ones in which you find a new way to connect a political position to your target audience’s moral values,” Willer said.
While most people’s natural inclination is to make political arguments grounded in their own moral values, Willer said, these arguments are less persuasive than “reframed” moral arguments.
To be persuasive, reframe political arguments to appeal to the moral values of those holding the opposing political positions, said Matthew Feinberg, assistant professor of organizational behavior at the University of Toronto, who co-authored the study with Willer. Their work was published recently online in the Personality and Social Psychology Bulletin.
Such reframed moral appeals are persuasive because they increase the apparent agreement between a political position and the target audience’s moral values, according to the research, Feinberg said.
In fact, Willer pointed out, the research shows a “potential effective path for building popular support in our highly polarized political world.” Creating bipartisan success on legislative issues – whether in Congress or in state legislatures – requires such a sophisticated approach to building coalitions among groups not always in agreement with each other, he added.
Different moral values
Feinberg and Willer drew upon past research showing that American liberals and conservatives tend to endorse different moral values to different extents. For example, liberals tend to be more concerned with care and equality, while conservatives are more concerned with values like group loyalty, respect for authority and purity.
They then conducted four studies testing the idea that moral arguments reframed to fit a target audience’s moral values could be persuasive on even deeply entrenched political issues. In one study, conservative participants recruited via the Internet were presented with passages that supported legalizing same-sex marriage.
Conservative participants were ultimately persuaded by a patriotism-based argument that “same-sex couples are proud and patriotic Americans … [who] contribute to the American economy and society.”
On the other hand, they were significantly less persuaded by a passage that argued for legalized same-sex marriage in terms of fairness and equality.
Feinberg and Willer found similar results for studies targeting conservatives with a pro-national health insurance message and liberals with arguments for high levels of military spending and making English the official language of the United States. In all cases, messages were significantly more persuasive when they fit the values endorsed more by the target audience.
“Morality can be a source of political division, a barrier to building bi-partisan support for policies,” Willer said. “But it can also be a bridge if you can connect your position to your audience’s deeply held moral convictions.”
Values and framing messages
“Moral reframing is not intuitive to people,” Willer said. “When asked to make moral political arguments, people tend to make the ones they believe in, not those of an opposing audience – but the research finds this type of argument unpersuasive.”
To test this, the researchers conducted two additional studies examining the moral arguments people typically make. They asked a panel of self-reported liberals to make arguments that would convince a conservative to support same-sex marriage, and a panel of conservatives to convince liberals to support English being the official language of the United States.
They found that, in both studies, most participants crafted messages with significant moral content, and most of that moral content reflected their own moral values, precisely the sort of arguments their other studies showed were ineffective.
“Our natural tendency is to make political arguments in terms of our own morality,” Feinberg said. “But the most effective arguments are based on the values of whomever you are trying to persuade.”
In all, Willer and Feinberg conducted six online studies involving 1,322 participants.
Using images from ESO’s Very Large Telescope and the NASA/ESA Hubble Space Telescope, astronomers have discovered never-before-seen structures within a dusty disc surrounding a nearby star. The fast-moving wave-like features in the disc of the star AU Microscopii are unlike anything ever observed, or even predicted, before now. The origin and nature of these features present a new mystery for astronomers to explore. The results are published in the journal Nature on 8 October 2015.
AU Microscopii, or AU Mic for short, is a young, nearby star surrounded by a large disc of dust. Studies of such debris discs can provide valuable clues about how planets, which form from these discs, are created.
Astronomers have been searching AU Mic’s disc for any signs of clumpy or warped features, as such signs might give away the location of possible planets. In 2014 they used the powerful high-contrast imaging capabilities of ESO’s newly installed SPHERE instrument, mounted on the Very Large Telescope, for their search — and discovered something very unusual.
“Our observations have shown something unexpected,” explains Anthony Boccaletti, LESIA (Observatoire de Paris/CNRS/UPMC/Paris-Diderot), France, and lead author on the paper. “The images from SPHERE show a set of unexplained features in the disc which have an arch-like, or wave-like, structure, unlike anything that has ever been observed before.”
Five wave-like arches at different distances from the star show up in the new images, reminiscent of ripples in water. After spotting the features in the SPHERE data the team turned to earlier images of the disc taken by the NASA/ESA Hubble Space Telescope in 2010 and 2011 to see whether the features were also visible in these. They were not only able to identify the features on the earlier Hubble images — but they also discovered that they had changed over time. It turns out that these ripples are moving — and very fast!
“We reprocessed images from the Hubble data and ended up with enough information to track the movement of these strange features over a four-year period,” explains team member Christian Thalmann (ETH Zürich, Switzerland). “By doing this, we found that the arches are racing away from the star at speeds of up to about 40 000 kilometres/hour!”
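The unit conversion behind a figure like “40 000 kilometres/hour” is straightforward to sketch. The drift rate below is an illustrative number chosen for the example, not a measured value from the study; the point is only how a projected motion of a few astronomical units per year translates into such a speed.

```python
# Converting a projected drift in a disc image into a speed:
# if a feature's offset from the star grows by about 2.3 astronomical
# units per year (an assumed, illustrative figure), its speed in
# kilometres per hour is:
AU_KM = 1.495978707e8          # kilometres per astronomical unit
HOURS_PER_YEAR = 365.25 * 24

drift_au_per_year = 2.3        # assumed projected motion
speed_kmh = drift_au_per_year * AU_KM / HOURS_PER_YEAR
print(round(speed_kmh))
```

A drift of roughly this size corresponds to speeds of around forty thousand kilometres per hour, the order of magnitude quoted for the fastest features.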
The features further away from the star seem to be moving faster than those closer to it. At least three of the features are moving so fast that they could well be escaping from the gravitational attraction of the star. Such high speeds rule out the possibility that these are conventional disc features caused by objects — like planets — disturbing material in the disc while orbiting the star. There must have been something else involved to speed up the ripples and make them move so quickly, meaning that they are a sign of something truly unusual.
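The escape claim can be sanity-checked with a rough back-of-the-envelope calculation: convert the reported ripple speed to km/s and compare it against the escape velocity from the star at a few distances. The stellar mass (about half a solar mass, a typical estimate for an M dwarf like AU Mic) and the sample distances below are illustrative assumptions, not figures taken from the study.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

# Assumed stellar mass for AU Mic (~0.5 solar masses; an illustrative
# M-dwarf estimate, not a value quoted in the paper).
m_star = 0.5 * M_SUN

# Reported ripple speed: up to about 40 000 km/h.
ripple_speed = 40_000 * 1000 / 3600  # m/s (~11.1 km/s)

def escape_velocity(mass_kg, r_m):
    """Escape velocity sqrt(2GM/r) in m/s at distance r from a point mass."""
    return math.sqrt(2 * G * mass_kg / r_m)

# Compare the ripple speed with escape velocity at a few sample distances.
for r_au in (10, 30, 50):
    v_esc = escape_velocity(m_star, r_au * AU)
    print(f"r = {r_au:2d} AU: v_esc = {v_esc / 1000:5.2f} km/s, "
          f"ripple exceeds escape: {ripple_speed > v_esc}")
```

Under these assumptions the ~11 km/s ripple speed comfortably exceeds the escape velocity beyond roughly 10 AU, consistent with the article's statement that at least some features could be leaving the system.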
“Everything about this find was pretty surprising!” comments co-author Carol Grady of Eureka Scientific, USA. “And because nothing like this has been observed or predicted in theory we can only hypothesise when it comes to what we are seeing and how it came about.”
The team cannot say for sure what caused these mysterious ripples around the star. But they have considered and ruled out a series of phenomena as explanations, including the collision of two massive and rare asteroid-like objects releasing large quantities of dust, and spiral waves triggered by instabilities in the system’s gravity.
But other ideas that they have considered look more promising.
“One explanation for the strange structures links them to the star’s flares. AU Mic is a star with high flaring activity — it often lets off huge and sudden bursts of energy from on or near its surface,” explains co-author Glenn Schneider of Steward Observatory, USA. “One of these flares could perhaps have triggered something on one of the planets — if there are planets — like a violent stripping of material which could now be propagating through the disc, propelled by the flare’s force.”
“It is very satisfying that SPHERE has proved to be very capable at studying discs like this in its first year of operation,” adds Jean-Luc Beuzit, who is both a co-author of the new study and also led the development of SPHERE itself.
The team plans to continue to observe the AU Mic system with SPHERE and other facilities, including ALMA, to try to understand what is happening. But, for now, these curious features remain an unsolved mystery.
AU Microscopii lies just 32 light-years away from Earth. The disc essentially comprises asteroids that have collided with such vigour that they have been ground to dust.
The data were gathered by Hubble’s Space Telescope Imaging Spectrograph (STIS).
The edge-on view of the disc complicates the interpretation of its three-dimensional structure.
This research was presented in a paper entitled “Fast-Moving Structures in the Debris Disk Around AU Microscopii”, to appear in the journal Nature on 8 October 2015.
On Aug. 7, 1972, in the heart of the Apollo era, an enormous solar flare exploded from the sun’s atmosphere. Along with a gigantic burst of light in nearly all wavelengths, this event accelerated a wave of energetic particles. Mostly protons, with a few electrons and heavier elements mixed in, this wash of quick-moving particles would have been dangerous to anyone outside Earth’s protective magnetic bubble. Luckily, the Apollo 16 crew had returned to Earth just a few months earlier, narrowly escaping this powerful event.
In the early days of human space flight, scientists were only just beginning to understand how events on the sun could affect space, and in turn how that radiation could affect humans and technology. Today, as a result of extensive space radiation research, we have a much better understanding of our space environment, its effects, and the best ways to protect astronauts—all crucial parts of NASA’s mission to send humans to Mars.
“The Martian” film highlights the radiation dangers that could occur on a round trip to Mars. While the mission in the film is fictional, NASA has already started working on the technology to enable an actual trip to Mars in the 2030s. In the film, the astronauts’ habitat on Mars shields them from radiation, and indeed, radiation shielding will be a crucial technology for the voyage. From better shielding to advanced biomedical countermeasures, NASA currently studies how to protect astronauts and electronics from radiation – efforts that will have to be incorporated into every aspect of Mars mission planning, from spacecraft and habitat design to spacewalk protocols.
“The space radiation environment will be a critical consideration for everything in the astronauts’ daily lives, both on the journeys between Earth and Mars and on the surface,” said Ruthan Lewis, an architect and engineer with the human spaceflight program at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “You’re constantly being bombarded by some amount of radiation.”
Radiation, at its most basic, is simply waves or sub-atomic particles that transport energy to another entity – whether it is an astronaut or spacecraft component. The main concern in space is particle radiation. Energetic particles can be dangerous to humans because they pass right through the skin, depositing energy and damaging cells or DNA along the way. This damage can mean an increased risk for cancer later in life or, at its worst, acute radiation sickness during the mission if the dose of energetic particles is large enough.
Fortunately for us, Earth’s natural protections block all but the most energetic of these particles from reaching the surface. A huge magnetic bubble, called the magnetosphere, which deflects the vast majority of these particles, protects our planet. And our atmosphere subsequently absorbs the majority of particles that do make it through this bubble. Importantly, since the International Space Station (ISS) is in low-Earth orbit within the magnetosphere, it also provides a large measure of protection for our astronauts.
“We have instruments that measure the radiation environment inside the ISS, where the crew are, and even outside the station,” said Kerry Lee, a scientist at NASA’s Johnson Space Center in Houston.
This ISS crew monitoring also includes tracking of the short-term and lifetime radiation doses for each astronaut to assess the risk for radiation-related diseases. Although NASA’s conservative radiation limits are greater than those allowed for radiation workers on Earth, the astronauts are able to stay well under NASA’s limit while living and working on the ISS, within Earth’s magnetosphere.
But a journey to Mars requires astronauts to travel much farther, beyond the protection of Earth’s magnetic bubble.
“There’s a lot of good science to be done on Mars, but a trip to interplanetary space carries more radiation risk than working in low-Earth orbit,” said Jonathan Pellish, a space radiation engineer at Goddard.
A human mission to Mars means sending astronauts into interplanetary space for a minimum of a year, even with a very short stay on the Red Planet. Nearly all of that time, they will be outside the magnetosphere, exposed to the harsh radiation environment of space. Mars has no global magnetic field to deflect energetic particles, and its atmosphere is much thinner than Earth’s, so they’ll get only minimal protection even on the surface of Mars.
Throughout the entire trip, astronauts must be protected from two sources of radiation. The first comes from the sun, which regularly releases a steady stream of solar particles, as well as occasional larger bursts in the wake of giant explosions, such as solar flares and coronal mass ejections, on the sun. These energetic particles are almost all protons, and, though the sun releases an unfathomably large number of them, the proton energy is low enough that they can almost all be physically shielded by the structure of the spacecraft.
Since solar activity strongly contributes to the deep-space radiation environment, a better understanding of the sun’s modulation of this radiation environment will allow mission planners to make better decisions for a future Mars mission. NASA currently operates a fleet of spacecraft studying the sun and the space environment throughout the solar system. Observations from this area of research, known as heliophysics, help us better understand the origin of solar eruptions and what effects these events have on the overall space radiation environment.
“If we know precisely what’s going on, we don’t have to be as conservative with our estimates, which gives us more flexibility when planning the mission,” said Pellish.
The second source of energetic particles is harder to shield. These particles come from galactic cosmic rays, often known as GCRs. They’re particles accelerated to near the speed of light that shoot into our solar system from other stars in the Milky Way or even other galaxies. Like solar particles, galactic cosmic rays are mostly protons. However, some of them are heavier elements, ranging from helium up to the heaviest elements. These more energetic particles can knock apart atoms in the material they strike, such as in the astronaut, the metal walls of a spacecraft, habitat, or vehicle, causing sub-atomic particles to shower into the structure. This secondary radiation, as it is known, can reach a dangerous level.
There are two ways to shield from these higher-energy particles and their secondary radiation: use a lot more mass of traditional spacecraft materials, or use more efficient shielding materials.
The sheer volume of material surrounding a structure would absorb the energetic particles and their associated secondary particle radiation before they could reach the astronauts. However, using sheer bulk to protect astronauts would be prohibitively expensive, since more mass means more fuel required to launch.
Using materials that shield more efficiently would cut down on weight and cost, but finding the right material takes research and ingenuity. NASA is currently investigating a handful of possibilities that could be used in anything from the spacecraft to the Martian habitat to space suits.
“The best way to stop particle radiation is by running that energetic particle into something that’s a similar size,” said Pellish. “Otherwise, it can be like you’re bouncing a tricycle off a tractor-trailer.”
Because protons and neutrons are similar in size, one element blocks both extremely well—hydrogen, which most commonly exists as just a single proton and an electron. Conveniently, hydrogen is the most abundant element in the universe, and makes up substantial parts of some common compounds, such as water and plastics like polyethylene. Engineers could take advantage of already-required mass by processing the astronauts’ trash into plastic-filled tiles used to bolster radiation protection. Water, already required for the crew, could be stored strategically to create a kind of radiation storm shelter in the spacecraft or habitat. However, this strategy comes with some challenges—the crew would need to use the water and then replace it with recycled water from the advanced life support systems.
Polyethylene, the same plastic commonly found in water bottles and grocery bags, also has potential as a candidate for radiation shielding. It is very high in hydrogen and fairly cheap to produce—however, it’s not strong enough to build a large structure, especially a spacecraft, which goes through high heat and strong forces during launch. And adding polyethylene to a metal structure would add quite a bit of mass, meaning that more fuel would be required for launch.
“We’ve made progress on reducing and shielding against these energetic particles, but we’re still working on finding a material that is a good shield and can act as the primary structure of the spacecraft,” said Sheila Thibeault, a materials researcher at NASA’s Langley Research Center in Hampton, Virginia.
One material in development at NASA has the potential to do both jobs: hydrogenated boron nitride nanotubes—known as hydrogenated BNNTs—are tiny nanotubes made of carbon, boron, and nitrogen, with hydrogen interspersed throughout the empty spaces left in between the tubes. Boron is also an excellent absorber of secondary neutrons, making hydrogenated BNNTs an ideal shielding material.
“This material is really strong—even at high heat—meaning that it’s great for structure,” said Thibeault.
Remarkably, researchers have successfully made yarn out of BNNTs, so it’s flexible enough to be woven into the fabric of space suits, providing astronauts with significant radiation protection even while they’re performing spacewalks in transit or out on the harsh Martian surface. Though hydrogenated BNNTs are still in development and testing, they have the potential to be one of our key structural and shielding materials in spacecraft, habitats, vehicles, and space suits that will be used on Mars.
Physical shields aren’t the only option for stopping particle radiation from reaching astronauts: Scientists are also exploring the possibility of building force fields. Force fields aren’t just the realm of science fiction: Just like Earth’s magnetic field protects us from energetic particles, a relatively small, localized electric or magnetic field would—if strong enough and in the right configuration—create a protective bubble around a spacecraft or habitat. Currently, these fields would take a prohibitive amount of power and structural material to create on a large scale, so more work is needed for them to be feasible.
The risk of health effects can also be reduced in operational ways, such as having a special area of the spacecraft or Mars habitat that could be a radiation storm shelter; preparing spacewalk and research protocols to minimize time outside the more heavily-shielded spacecraft or habitat; and ensuring that astronauts can quickly return indoors in the event of a radiation storm.
Radiation risk mitigation can also be approached from the human body level. Though far off, a medication that would counteract some or all of the health effects of radiation exposure would make it much easier to plan for a safe journey to Mars and back.
“Ultimately, the solution to radiation will have to be a combination of things,” said Pellish. “Some of the solutions are technology we have already, like hydrogen-rich materials, but some of it will necessarily be cutting edge concepts that we haven’t even thought of yet.”
Hurricane Joaquin had become a Category 4 hurricane on the Saffir-Simpson Wind Scale by 2 p.m. EDT on October 1. At NASA, satellite imagery from NOAA’s GOES-East satellite was compiled into an animation that showed the hurricane strengthening. Earlier in the day, NASA-NOAA’s Suomi NPP satellite saw powerful thunderstorms within, indicating further strengthening.
The GOES-East satellite is managed by NOAA, and at NASA’s GOES Project at the NASA Goddard Space Flight Center in Greenbelt, Maryland, imagery from GOES-East was compiled into an animation. The infrared and visible imagery from September 29 to October 1 showed Hurricane Joaquin becoming a major hurricane in the Bahamas.
Earlier in the morning, NASA-NOAA’s Suomi NPP satellite passed over Joaquin at 06:10 UTC (2:10 a.m. EDT) as it was strengthening from a Category 2 to a Category 3 hurricane. The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard captured an infrared image that showed cloud top temperatures colder than -63F/-53C, indicative of powerful storms within the hurricane. NASA research has shown that storms with cloud tops that high (and that stretch that high into the troposphere) have the capability to generate heavy rain.
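The paired Fahrenheit/Celsius cloud-top figure can be checked with the standard temperature conversion; this is purely an arithmetic illustration of the -63F/-53C threshold quoted above.

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# The -63 F cloud-top threshold works out to about -52.8 C,
# matching the rounded -53 C figure in the text.
print(f_to_c(-63))
```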
On October 1, a Hurricane Warning was in effect for the central Bahamas; the northwestern Bahamas including the Abacos, Berry Islands, Eleuthera, Grand Bahama Island, and New Providence; and Acklins, Crooked Island, and Mayaguana in the southeastern Bahamas. A Hurricane Watch was in effect for Bimini and Andros Island, and a Tropical Storm Warning was in effect for the remainder of the southeastern Bahamas excluding the Turks and Caicos Islands and Andros Island.
At 2 p.m. EDT (1800 UTC), the center of Hurricane Joaquin was located near latitude 23.0 North, longitude 74.2 West. Joaquin was moving generally southwestward at about 6 mph (9 kph), and the National Hurricane Center forecast a turn toward the northwest and north on Friday, October 2. On the forecast track, the center of Joaquin will move near or over portions of the central Bahamas today and tonight and pass near or over portions of the northwestern Bahamas on Friday, October 2.
Reports from an Air Force Reserve Hurricane Hunter aircraft indicated that maximum sustained winds have increased to near 130 mph (210 kph) with higher gusts. Joaquin is now a Category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale. Some additional strengthening is possible during the next 24 hours, with some fluctuations in intensity possible Friday night and Saturday.
Hurricane force winds extend outward up to 45 miles (75 km) from the center and tropical storm force winds extend outward up to 140 miles (220 km).
The latest minimum central pressure extrapolated from Hurricane Hunter aircraft data is 936 millibars. For effects on the Bahamas, updates to forecasts, watches and warnings, visit the National Hurricane Center website: http://www.nhc.noaa.gov.
The NHC updated forecast takes Joaquin on a more northerly track from Saturday, October 3 through Tuesday, October 6 toward Long Island, New York. Tracks and forecasts are subject to change.
Achromatopsia is a rare, inherited vision disorder that affects the eye’s cone cells, resulting in problems with daytime vision, clarity and color perception. It often strikes people early in life, and currently there is no cure for the condition.
One of the most promising avenues for developing a cure, however, is through gene therapy, and to create those therapies requires animal models of disease that closely replicate the human condition.
In a new study, a collaboration between University of Pennsylvania and Temple University scientists has identified two naturally occurring genetic mutations in dogs that result in achromatopsia. Having identified the mutations responsible, they used structural modeling and molecular dynamics on the Titan supercomputer at Oak Ridge National Laboratory and the Stampede supercomputer at the Texas Advanced Computing Center to simulate how the mutations would impact the resulting protein, showing that the mutations destabilized a molecular channel essential to light signal transduction.
The findings provide new insights into the molecular cause of this form of blindness and also present new opportunities for conducting preclinical assessments of curative gene therapy for achromatopsia in both dogs and humans.
“Our work in the dogs, in vitro and in silico shows us the consequences of these mutations in disrupting the function of these crucial channels,” said Karina Guziewicz, senior author on the study and a senior research investigator at Penn’s School of Veterinary Medicine. “Everything we found suggests that gene therapy will be the best approach to treating this disease, and we are looking forward to taking that next step.”
The study was published in the journal PLOS ONE and coauthored by Penn Vet’s Emily V. Dutrow and Temple’s Naoto Tanaka. Additional coauthors from Penn Vet included Gustavo D. Aguirre, Keiko Miyadera, Shelby L. Reinstein, William R. Crumley and Margret L. Casal. Temple’s team, all from the College of Science and Technology, included Lucie Delemotte, Christopher M. MacDermaid, Michael L. Klein and Jacqueline C. Tanaka. Christopher J. Dixon of Veterinary Vision in the United Kingdom also contributed.
The research began with a German shepherd that was brought to Penn Vet’s Ryan Hospital. The owners were worried about its vision.
“This dog displayed a classical loss of cone vision; it could not see well in daylight but had no problem in dim light conditions,” said Aguirre, professor of medical genetics and ophthalmology at Penn Vet.
The Penn Vet researchers wanted to identify the genetic cause, but the dog had none of the “usual suspects,” the known gene mutations responsible for achromatopsia in dogs. To find the new mutation, the scientists looked at five key genes that play a role in phototransduction, or the process by which light signals are transmitted through the eye to the brain.
They found what they were looking for on the CNGA3 gene, which encodes a cyclic nucleotide channel and plays a key role in transducing visual signals. The change was a “missense” mutation, meaning that the mutation results in the production of a different amino acid. Meanwhile, they heard from colleague Dixon that he had examined Labrador retrievers with similar symptoms. When the Penn team performed the same genetic analysis, they found a different mutation on the same part of the same gene where the shepherd’s mutation was found. Neither mutation had ever been characterized previously in dogs.
“The next step was to take this further and look at the consequences of these particular mutations,” Guziewicz said.
The group had the advantage of using the Titan and Stampede supercomputers, which can simulate models of the atomic structure of proteins and thereby elucidate how the protein might function. That work revealed that both mutations disrupted the function of the channel, making it unstable.
“The computational approach allows us to model, right down to the atomic level, how small changes in protein sequence can have a major impact on signaling,” said MacDermaid, assistant professor of research at Temple’s Institute for Computational Molecular Science. “We can then use these insights to help us understand and refine our experimental and clinical work.”
The Temple researchers recreated these mutated channels and showed that one resulted in a loss of channel function. Further in vitro experiments showed that the second mutation caused the channels to be routed improperly within the cell.
Penn Vet researchers have had success in treating various forms of blindness in dogs with gene therapy, setting the stage to treat human blindness. In human achromatopsia, nearly 100 different mutations have been identified in the CNGA3 gene, including the very same one identified in the German shepherd in this study.
The results, therefore, lay the groundwork for designing gene therapy constructs that can target this form of blindness with the same approach.
The study was supported by the Foundation Fighting Blindness, the National Eye Institute, the National Science Foundation, the European Union Seventh Framework Program, Hope for Vision, the Macula Vision Research Foundation and the Van Sloun Fund for Canine Genetic Research.