Tag Archives: human

NASA Satellite Finds Unreported Sources of Toxic Air Pollution

Using a new satellite-based method, scientists at NASA, Environment and Climate Change Canada, and two universities have located 39 major, previously unreported human-made sources of toxic sulfur dioxide emissions.

A known health hazard and contributor to acid rain, sulfur dioxide (SO2) is one of six air pollutants regulated by the U.S. Environmental Protection Agency. Current sulfur dioxide monitoring activities include the use of emission inventories that are derived from ground-based measurements and factors such as fuel usage. The inventories are used to evaluate regulatory policies for air quality improvements and to anticipate future emission scenarios that may occur with economic and population growth.

 Source: NASA

But to develop comprehensive and accurate inventories, industries, government agencies and scientists first must know the location of pollution sources.

“We now have an independent measurement of these emission sources that does not rely on what was known or thought known,” said Chris McLinden, an atmospheric scientist with Environment and Climate Change Canada in Toronto and lead author of the study published this week in Nature Geoscience. “When you look at a satellite picture of sulfur dioxide, you end up with it appearing as hotspots – bull’s-eyes, in effect – which makes the estimates of emissions easier.”

The 39 unreported emission sources, found in the analysis of satellite data from 2005 to 2014, are clusters of coal-burning power plants, smelters, oil and gas operations found notably in the Middle East, but also in Mexico and parts of Russia. In addition, reported emissions from known sources in these regions were — in some cases — two to three times lower than satellite-based estimates.

Altogether, the unreported and underreported sources account for about 12 percent of all human-made emissions of sulfur dioxide – a discrepancy that can have a large impact on regional air quality, said McLinden.

The research team also located 75 natural sources of sulfur dioxide — non-erupting volcanoes slowly leaking the toxic gas throughout the year. While not necessarily unknown, many volcanoes are in remote locations and not monitored, so this satellite-based data set is the first to provide regular annual information on these passive volcanic emissions.

“Quantifying the sulfur dioxide bull’s-eyes is a two-step process that would not have been possible without two innovations in working with the satellite data,” said co-author Nickolay Krotkov, an atmospheric scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

First was an improvement in the computer processing that transforms raw satellite observations from the Dutch-Finnish Ozone Monitoring Instrument aboard NASA’s Aura spacecraft into precise estimates of sulfur dioxide concentrations. Krotkov and his team now are able to more accurately detect smaller sulfur dioxide concentrations, including those emitted by human-made sources such as oil-related activities and medium-size power plants.

Being able to detect smaller concentrations led to the second innovation. McLinden and his colleagues used a new computer program to more precisely detect sulfur dioxide that had been dispersed and diluted by winds. They then used accurate estimates of wind strength and direction derived from a satellite data-driven model to trace the pollutant back to the location of the source, and also to estimate how much sulfur dioxide was emitted from the smoke stack.
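To make that idea concrete, here is a minimal, purely conceptual sketch in Python of a wind-rotated, mass-balance style estimate. It is not the algorithm used in the study; the synthetic scene, the selection box, the six-hour effective lifetime and the per-pixel area are all assumptions chosen only to illustrate how dispersed column measurements plus wind direction can be turned into a rough emission rate for a candidate source.

```python
# Conceptual sketch (not the published algorithm): rotate observations around
# a candidate source so they are aligned with the wind, keep the pixels just
# downwind, and convert the SO2 mass found there into an emission rate using
# an assumed effective lifetime. All inputs below are synthetic.
import numpy as np

def rotate_to_wind_frame(x_km, y_km, wind_dir_deg):
    """Project (east, north) offsets from a candidate source onto downwind
    and crosswind axes, given the direction the wind is blowing toward."""
    theta = np.deg2rad(wind_dir_deg)
    downwind = x_km * np.sin(theta) + y_km * np.cos(theta)
    crosswind = x_km * np.cos(theta) - y_km * np.sin(theta)
    return downwind, crosswind

def estimate_emission_rate(so2_col_mol_m2, x_km, y_km, wind_dir_deg,
                           pixel_area_m2, lifetime_hours=6.0):
    """Crude mass-balance estimate: SO2 mass in a downwind box divided by an
    assumed effective lifetime gives an emission rate in kg/s."""
    downwind, crosswind = rotate_to_wind_frame(x_km, y_km, wind_dir_deg)
    near_source = (downwind > -10) & (downwind < 50) & (np.abs(crosswind) < 30)
    molar_mass_so2 = 0.064  # kg/mol
    mass_kg = so2_col_mol_m2[near_source].sum() * pixel_area_m2 * molar_mass_so2
    return mass_kg / (lifetime_hours * 3600.0)

# Synthetic scene: a faint plume blown toward the northeast, plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-100, 100, 5000)   # km east of the candidate source
y = rng.uniform(-100, 100, 5000)   # km north of the candidate source
plume = 2e-4 * np.exp(-((x - y) ** 2) / 800.0) * ((x + y) > 0)
columns = plume + rng.normal(0.0, 2e-5, x.size)      # mol/m^2
scene_area_m2 = (200e3) ** 2                          # 200 km x 200 km scene
print(estimate_emission_rate(columns, x, y, wind_dir_deg=45.0,
                             pixel_area_m2=scene_area_m2 / x.size))
```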

“The unique advantage of satellite data is spatial coverage,” said Bryan Duncan, an atmospheric scientist at Goddard. “This paper is the perfect demonstration of how new and improved satellite datasets, coupled with new and improved data analysis techniques, allow us to identify even smaller pollutant sources and to quantify these emissions over the globe.”

The University of Maryland, College Park, and Dalhousie University in Halifax, Nova Scotia, contributed to this study.

For more information about, and access to, NASA’s air quality data, visit:

http://so2.gsfc.nasa.gov/

NASA uses the vantage point of space to increase our understanding of our home planet, improve lives, and safeguard our future. NASA develops new ways to observe and study Earth’s interconnected natural systems with long-term data records. The agency freely shares this unique knowledge and works with institutions around the world to gain new insights into how our planet is changing.

For more information about NASA Earth science research, visit:

http://www.nasa.gov/earth

Automating big-data analysis: MIT Research

System that replaces human intuition with algorithms outperforms 615 of 906 human teams.

By Larry Hardesty


Big-data analysis consists of searching for buried patterns that have some kind of predictive power. But choosing which “features” of the data to analyze usually requires some human intuition. In a database containing, say, the beginning and end dates of various sales promotions and weekly profits, the crucial data may not be the dates themselves but the spans between them, or not the total profits but the averages across those spans.
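As a hypothetical illustration of that point, the short pandas snippet below derives a “span” feature and an average-profit-per-week feature from raw promotion dates and totals; the table and column names are invented for the example.

```python
# Hypothetical example: the useful features may not be the raw columns
# (start/end dates, total profit) but quantities derived from them, such as
# the length of each promotion and the average profit over that span.
import pandas as pd

promos = pd.DataFrame({
    "promo_start": pd.to_datetime(["2024-01-01", "2024-02-10"]),
    "promo_end":   pd.to_datetime(["2024-01-15", "2024-03-01"]),
    "total_profit": [42_000.0, 57_000.0],
})

# Derived features: the span of each promotion and the average weekly profit.
promos["span_days"] = (promos["promo_end"] - promos["promo_start"]).dt.days
promos["avg_weekly_profit"] = promos["total_profit"] / (promos["span_days"] / 7)
print(promos)
```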

MIT researchers aim to take the human element out of big-data analysis, with a new system that not only searches for patterns but designs the feature set, too. To test the first prototype of their system, they enrolled it in three data science competitions, in which it competed against human teams to find predictive patterns in unfamiliar data sets. Of the 906 teams participating in the three competitions, the researchers’ “Data Science Machine” finished ahead of 615.

In two of the three competitions, the predictions made by the Data Science Machine were 94 percent and 96 percent as accurate as the winning submissions. In the third, the figure was a more modest 87 percent. But where the teams of humans typically labored over their prediction algorithms for months, the Data Science Machine took somewhere between two and 12 hours to produce each of its entries.

“We view the Data Science Machine as a natural complement to human intelligence,” says Max Kanter, whose MIT master’s thesis in computer science is the basis of the Data Science Machine. “There’s so much data out there to be analyzed. And right now it’s just sitting there not doing anything. So maybe we can come up with a solution that will at least get us started on it, at least get us moving.”

Between the lines

Kanter and his thesis advisor, Kalyan Veeramachaneni, a research scientist at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), describe the Data Science Machine in a paper that Kanter will present next week at the IEEE International Conference on Data Science and Advanced Analytics.

Veeramachaneni co-leads the Anyscale Learning for All group at CSAIL, which applies machine-learning techniques to practical problems in big-data analysis, such as determining the power-generation capacity of wind-farm sites or predicting which students are at risk for dropping out of online courses.

“What we observed from our experience solving a number of data science problems for industry is that one of the very critical steps is called feature engineering,” Veeramachaneni says. “The first thing you have to do is identify what variables to extract from the database or compose, and for that, you have to come up with a lot of ideas.”

In predicting dropout, for instance, two crucial indicators proved to be how long before a deadline a student begins working on a problem set and how much time the student spends on the course website relative to his or her classmates. MIT’s online-learning platform MITx doesn’t record either of those statistics, but it does collect data from which they can be inferred.

Featured composition

Kanter and Veeramachaneni use a couple of tricks to manufacture candidate features for data analyses. One is to exploit structural relationships inherent in database design. Databases typically store different types of data in different tables, indicating the correlations between them using numerical identifiers. The Data Science Machine tracks these correlations, using them as a cue to feature construction.

For instance, one table might list retail items and their costs; another might list items included in individual customers’ purchases. The Data Science Machine would begin by importing costs from the first table into the second. Then, taking its cue from the association of several different items in the second table with the same purchase number, it would execute a suite of operations to generate candidate features: total cost per order, average cost per order, minimum cost per order, and so on. As numerical identifiers proliferate across tables, the Data Science Machine layers operations on top of each other, finding minima of averages, averages of sums, and so on.
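The pandas sketch below mirrors that description with made-up tables; the real Data Science Machine works over relational databases rather than data frames, but the mechanics sketched here, following the linking identifiers and then stacking aggregations, are the same in spirit.

```python
# Rough sketch of relational feature construction with invented tables.
import pandas as pd

items = pd.DataFrame({
    "item_id": [1, 2, 3],
    "cost": [9.99, 4.50, 20.00],
})
purchases = pd.DataFrame({
    "order_id": [101, 101, 102, 102, 102],
    "item_id":  [1, 2, 1, 2, 3],
})

# Step 1: import costs from the items table into the purchases table.
purchases = purchases.merge(items, on="item_id")

# Step 2: aggregate over each order to produce candidate features.
order_features = purchases.groupby("order_id")["cost"].agg(
    total_cost="sum", average_cost="mean", minimum_cost="min")

# Step 3 (stacking): if orders in turn belong to customers, the same
# aggregations can be applied again, e.g. the average of per-order sums.
print(order_features)
```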

It also looks for so-called categorical data, which appear to be restricted to a limited range of values, such as days of the week or brand names. It then generates further feature candidates by dividing up existing features across categories.
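Continuing the same made-up example, a categorical column such as day of week can be used to split an existing feature into per-category candidates:

```python
# Hypothetical example of expanding features across a categorical column.
import pandas as pd

orders = pd.DataFrame({
    "order_id": [101, 102, 103, 104],
    "total_cost": [14.49, 34.49, 9.99, 24.50],
    "day_of_week": ["Mon", "Mon", "Sat", "Sat"],
})

# Per-category aggregates become new candidate features.
per_day = orders.groupby("day_of_week")["total_cost"].agg(["mean", "sum"])
print(per_day)
```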

Once it’s produced an array of candidates, it reduces their number by identifying those whose values seem to be correlated. Then it starts testing its reduced set of features on sample data, recombining them in different ways to optimize the accuracy of the predictions they yield.
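Here is a heavily simplified sketch of those last two steps, using invented data, an arbitrary correlation threshold and an off-the-shelf classifier; the actual system’s selection and tuning procedure is more involved than this.

```python
# Simplified sketch: drop candidate features that are strongly correlated
# with one another, then test the reduced set on sample data. Data, the 0.95
# threshold and the model choice are assumptions for illustration only.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 6)),
                 columns=[f"feat_{i}" for i in range(6)])
X["feat_dup"] = X["feat_0"] * 0.98 + rng.normal(scale=0.05, size=200)  # redundant copy
y = (X["feat_0"] + X["feat_1"] > 0).astype(int)

# Step 1: prune features whose pairwise correlation exceeds a threshold.
corr = X.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [col for col in upper.columns if (upper[col] > 0.95).any()]
X_reduced = X.drop(columns=to_drop)

# Step 2: evaluate the reduced feature set on sample data.
scores = cross_val_score(RandomForestClassifier(random_state=0), X_reduced, y, cv=5)
print(f"dropped: {to_drop}, mean accuracy: {scores.mean():.2f}")
```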

“The Data Science Machine is one of those unbelievable projects where applying cutting-edge research to solve practical problems opens an entirely new way of looking at the problem,” says Margo Seltzer, a professor of computer science at Harvard University who was not involved in the work. “I think what they’ve done is going to become the standard quickly — very quickly.”

Source: MIT News Office

 

Three internal forces that can make your employees work your way!

Employee Motivation

TOP LEADERSHIP:

  • Top leadership should be more interactive and should build a strong system that does not rely excessively on personal opinions.
  • Bring more hope and more certainty by continuously interacting with employees, even at lower levels of the organisation; otherwise you will only be left with mediocre talent.
  • They should also realize that things have changed (in both the internal and external environment) since the time they held lower-cadre positions themselves. They should no longer be looking for their own reflections in their subordinates.
  • Bring young leadership up. Formulate interdepartmental teams of youngsters to interact with top leadership directly; this will not only keep top management well informed of ground realities but will also make youngsters feel more involved.
  • Create a culture of respect.
  • Build a mentoring culture.

HUMAN RESOURCES DEPARTMENT:

  • Identify and train well-educated employees with a positive attitude as leaders, drawing on a diverse pool.
  • Encourage interdepartmental transfers, as employees with an appetite for learning do not like being stuck in a one-dimensional routine job for years.
  • Money and benefits are not the only sources of motivation.
  • Perform salary comparability / benchmarking, and add benefits such as corporate club memberships to bring more charm to the lives of employees and their families.
  • HR should have a direct, personalized and confidential channel of communication with employees, who often ask themselves: How do I see my future here? It seems that much of it is not in my control. Should I keep waiting for the upper hierarchy to have some movement or vacancies? Should business figures be the only criterion for promotion (even if I had no part in formulating a clear strategy), while I keep waiting for a crop to grow from dead land? Should I keep worrying about ways to please my bosses, even when I know that at times it is not in the company’s interest?
  • Formulate a talent management program.
  • Make interdepartmental teams of employees and assign them general tasks such as canteen management or event management. This will keep them more involved and give them a sense of achievement (in case they do not get it from their immediate leaders). Such activities will also help HR identify talent.
  • The system should be made stronger, and too much reliance should not be placed on perceptions; it often happens that the perceiver has only a hammer in his or her toolbox and sees every problem as a nail.
  • HR should be well aware of the capabilities (in terms of knowledge, skills and attitude) of employees. This is difficult but not impossible, especially when employees are treated as HR’s customers, to be served in the best possible way.
  • Create a culture of respect.
  • Give Sense of Recognition/Achievement to Employees.

IMMEDIATE LEADERS:

They have a vital role to play as far as the motivation level of employees is concerned.

Priority in this role should not be given to people whose only source of learning has been this organisation; such people cannot bring change, and their capabilities remain a subset of their previous bosses’. They neither bring change nor encourage it; they want to keep the ball rolling as it has been rolling since their time.

Give Sense of Recognition/Achievement to Employees.

Create a culture of respect.

Build a mentoring culture.

How to build a proactive workforce

Building a proactive workforce is every manager’s dream, as it can boost a company’s performance, but a new study has found that if job satisfaction is low, those ‘agents of change’ quickly lose their can-do attitude.

Researchers followed 75 workers for two years, measuring their job satisfaction levels and how proactive they were.

They found that those with high levels of job satisfaction remained proactive two years later, but the proactivity of those with low levels tailed off. Interestingly, there was a group that had high job satisfaction but did not promote change in their organisation, and still didn’t two years later.

The study by Karoline Strauss, of Warwick Business School, Mark Griffin and Sharon Parker, of the University of Western Australia, and Claire Mason, of the Commonwealth Scientific and Industrial Research Organisation, also looked at how adaptive workers were and discovered that the more easily they adapted to change, the more likely they were to remain proactive over the long term.

“Proactivity is important for innovation and implementing organisational change,” said Dr Strauss, who is part of the Organisation & Human Resources Management Group at Warwick Business School.

“So it is important to sustain a proactive workforce and we have found that job satisfaction is important, not just as an instigator of proactivity, but as a force for maintaining momentum.

“There has been research showing that job satisfaction leads to a more compliant workforce, and we did find that highly satisfied employees who had not tried to promote change at work were unlikely to do so in the future. But we also found that those with high levels of job satisfaction who were proactive maintained that over two years.

“Low levels of job satisfaction may motivate high levels of proactive behaviour in the short term as workers looked to change things to become more satisfied, but this is not sustained over the long term. Our findings suggest that these workers will either succeed in changing their environment at work and so no longer see the need to seek change, or fail, become frustrated and not persevere with their proactive behaviour.”

Management research has found that effective change in an organisation requires proactivity among the workforce to be maintained over a long period. As well as job satisfaction, the study of an Australian healthcare organisation, entitled Building and Sustaining Proactive Behaviors: The Role of Adaptivity and Job Satisfaction and published in the Journal of Business and Psychology, found that adaptability was also an important factor.

“If employees do not adapt to change, they are consequently unlikely to support proactivity,” said Dr Strauss. “This research found a significant positive link between a worker’s adaptivity and proactivity.

“Those who fail to adapt to change seem to be less likely to initiate change in the future as they may see change as threatening and may lose confidence in their own ability to be proactive. Irrespective of their past proactivity we found that employees’ proactivity may decrease if they fail to adapt to change and that may impact on a company’s performance and profitability.”

Dr Karoline Strauss also teaches Organisational Behaviour on the Warwick MBA by full-time study, Warwick Executive MBA, Warwick MBA by distance learning and Global Energy MBA. She also teaches Management, Organisation and Society on Warwick Business School’s Undergraduate courses.

Source: Warwick Business School

Simple isn’t better when talking about science, Stanford philosopher suggests

Taking a philosophical approach to the assumptions that surround the study of human behavior, Stanford philosophy Professor Helen Longino suggests that no single research method is capable of answering the question of nature vs. nurture.


 

By Barbara Wilcox

Studies of the origins of human sexuality and aggression are typically in the domain of the sciences, where researchers examine genetic, neurobiological, social and environmental factors.

Behavioral research findings draw intense interest from other researchers, policymakers and the general public. But Stanford’s Helen E. Longino, the Clarence Irving Lewis Professor of Philosophy, says there’s more to the story.

Longino, who specializes in the philosophy of science, asserts in her latest book that the limitations of behavioral research are not clearly communicated in academic or popular discourse, and that this lack of communication distorts how the scope of current behavioral research is understood.

In her book Studying Human Behavior: How Scientists Investigate Aggression and Sexuality, Longino examines five common scientific approaches to the study of behavior – quantitative behavioral genetics, molecular behavioral genetics, developmental psychology, neurophysiology and anatomy, and social/environmental methods.

Applying the analytical tools of philosophy, Longino defines what is – and is not – measured by each of these approaches. She also reflects on how this research is depicted in academic and popular media.

In her analysis of citations of behavioral research, Longino found that the demands of journalism and of the culture at large favor science with a very simple storyline. Research that looks for a single “warrior gene” or a “gay gene,” for example, receives more attention in both popular and scholarly media than research that takes an integrative approach across scientific approaches or disciplines.

Longino spoke with the Stanford News Service about why it is important for scientists and the public to understand the parameters of behavioral research:

 

Your research suggests that social-science researchers are not adequately considering the limitations of their processes and findings. To what do you attribute this phenomenon?

The sciences have become hyper-specialized. Scientists rarely have the opportunity or support to step back from their research and ask how it connects with other work on similar topics. I see one role of philosophers of science as the provision of that larger, interpretive picture. This is not to say that there is one correct interpretation, rather that as philosophers we can show that the interpretive questions are askable.

 

Why study behavioral research through a philosophic lens?

Philosophy deals, in part, with the study of how things are known. A philosopher can ask, “What are the grounds for believing any of the claims here? What are the relationships between these approaches? The differences? What can we learn? What can this way of thinking not tell us?”

These are the questions I asked of each article I read. I developed a grid system for analyzing and recording the way the behavior under study was defined and measured, the correlational or observational data – including size and character of sample population – developed, the hypotheses evaluated.

 

What about your findings do you think would surprise people most?

I went into the project thinking that what would differentiate each approach was its definition of behavior. As the patterns emerged, I saw that what differentiated each approach was how it characterized the range of possible causal factors.

Because each approach characterized this range differently, the measurements of different research approaches were not congruent. Thus, their results could not be combined or integrated or treated as empirical competitors. But this is what is required if the nature vs. nurture – or nature and nurture – question is to be meaningful.

I also investigated the representation of this research in public media. I found that research that locates the roots of behavior in the individual is cited far more often than population-based studies, and that research that cites genetic or neurobiological factors is cited more frequently than research into social or environmental influences on behavior. Interestingly, science journalists fairly consistently described biological studies as being more fruitful and promising than studies into social factors of behavior.

Social research was always treated as “terminally inconclusive,” using terms that amount to “we’ll never get an answer.” Biological research was always treated as being a step “on the road to knowledge.”

 

What prompted you to begin the research that became Studying Human Behavior?

In 1992, an East Coast conference on “genetic factors and crime” was derailed under pressure from activists and the Congressional Black Caucus, which feared that the findings being presented might be misused to find a racial basis for crime or links between race and intelligence. I became interested in the conceptual and theoretical foundations of the conference – the voiced and unvoiced assumptions made by both the conference participants and by the activists, policymakers and other users of the research.

 

Why did you pair human aggression and sexuality as a subject for a book?

While I started with the research on aggression, research on sexual orientation started popping up in the news and I wanted to include research on at least two behaviors or families of behavior in order to avoid being misled by potential sample bias. Of course, these behaviors are central to social life, so how we try to understand them is intrinsically interesting.

 

What could science writers be doing better?

Articles in the popular media, such as the science sections of newspapers, rarely discuss the methodology of studies that they cover as news. Yet methodology and the disciplinary approach of the scientists doing the research are critical because they frame the question.

For example, quantitative behavioral genetics research will consider a putatively shared genome against social factors such as birth order, parental environment and socioeconomic status. Molecular genetics research seeks to associate specific traits with specific alleles or combinations within the genome, but the social factors examined by quantitative behavioral genetics lie outside its purview. Neurobiological research might occupy a middle ground. But no single approach or even a combination of approaches can measure all the factors that bear on a behavior.

It’s also important to know that often, behavior is not what’s being studied. It’s a tool, not the subject. The process of serotonin re-uptake, for example, may be of primary interest to the researcher, not the behavior that it yields. Yet behavior is what’s being reported.

 

What advice do you have for people who might be concerned about potential political ramifications of research into sexuality or aggression?

I see political ramifications in what is not studied.

In studying sexual orientation, the 7-point Kinsey scale was an improvement over a previous binary measure of orientation. Researchers employing the Kinsey scale still tend to find greater concentrations at the extremes. Middle points still get dropped out of the analysis. In addition to more attention to intermediates on the scale, there could be focus on other dimensions of erotic orientation in addition to, or instead of, the sex of the individual to which one is attracted.

Similarly, there are a number of standard ways to measure aggressive response, but they are all focused on the individual. Collective action is not incorporated. If the interest in studying aggression is to shed light on crime, there’s a whole lot of behavior that falls outside that intersection, including white-collar crime and state- or military-sponsored crime.

 

What other fields of inquiry could benefit from your findings?

Climate study is as complex as behavioral study. We’d have a much better debate about climate change if we were not looking for a single answer or silver bullet. The public should understand the complexities that the IPCC [Intergovernmental Panel on Climate Change] must cope with in producing its findings.

Source: Stanford News Service