
Physicists solve quantum tunneling mystery

An international team of scientists studying ultrafast physics has solved a mystery of quantum mechanics, finding that quantum tunneling is an instantaneous process.

The new theory could lead to faster and smaller electronic components, for which quantum tunneling is a significant factor. It could also lead to a better understanding of diverse areas such as electron microscopy, nuclear fusion and DNA mutations.

“Timescales this short have never been explored before. It’s an entirely new world,” said one of the international team, Professor Anatoli Kheifets, from The Australian National University (ANU).

“We have modelled the most delicate processes of nature very accurately.”

At very small scales quantum physics shows that particles such as electrons have wave-like properties – their exact position is not well defined. This means they can occasionally sneak through apparently impenetrable barriers, a phenomenon called quantum tunneling.

Quantum tunneling plays a role in a number of phenomena, such as nuclear fusion in the sun, scanning tunneling microscopy, and flash memory for computers. However, the leakage of particles also limits the miniaturisation of electronic components.

Professor Kheifets and Dr. Igor Ivanov, from the ANU Research School of Physics and Engineering, are members of a team that studied ultrafast experiments at the attosecond scale (10⁻¹⁸ seconds), a field that has developed over the last 15 years.

Until their work, a number of attosecond phenomena could not be adequately explained, such as the time delay when a photon ionises an atom.

“At that timescale the time an electron takes to quantum tunnel out of an atom was thought to be significant. But the mathematics says the time during tunneling is imaginary – a complex number – which we realised meant it must be an instantaneous process,” said Professor Kheifets.

“A very interesting paradox arises, because electron velocity during tunneling may become greater than the speed of light. However, this does not contradict the special theory of relativity, as the tunneling velocity is also imaginary,” said Dr. Ivanov, who recently took up a position at the Center for Relativistic Laser Science in Korea.
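
Why the under-barrier time comes out imaginary can be seen in the standard textbook (WKB) picture, which is only a sketch and not the team's full attosecond calculation. Inside a barrier of height V₀ greater than the electron's energy E, the classical momentum is imaginary, and so is any traversal time built from it:

p(x) = \sqrt{2m\,(E - V_0)} = i\,\sqrt{2m\,(V_0 - E)}, \qquad \tau = \int_{x_1}^{x_2} \frac{m}{p(x)}\,dx

Since p(x) is purely imaginary wherever E < V₀, the “time” τ spent under the barrier is imaginary as well, which is the mathematical fact behind the researchers' conclusion.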

The team’s calculations, which were made using the Raijin supercomputer, revealed that the delay in photoionisation originates not from quantum tunneling but from the electric field of the nucleus attracting the escaping electron.

The results give an accurate calibration for future attosecond-scale research, said Professor Kheifets.

“It’s a good reference point for future experiments, such as studying proteins unfolding, or speeding up electrons in microchips,” he said.

The research is published in Nature Physics.

Source: ANU

Better debugger

System to automatically find a common type of programming bug significantly outperforms its predecessors.

By Larry Hardesty


CAMBRIDGE, Mass. – Integer overflows are one of the most common bugs in computer programs — not only causing programs to crash but, even worse, potentially offering points of attack for malicious hackers. Computer scientists have devised a battery of techniques to identify them, but all have drawbacks.

This month, at the Association for Computing Machinery’s International Conference on Architectural Support for Programming Languages and Operating Systems, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) will present a new algorithm for identifying integer-overflow bugs. The researchers tested the algorithm on five common open-source programs, in which previous analyses had found three bugs. The new algorithm found all three known bugs — and 11 new ones.

The variables used by computer programs come in a few standard types, such as floating-point numbers, which can contain decimals; characters, like the letters of this sentence; or integers, which are whole numbers. Every time the program creates a new variable, it assigns it a fixed amount of space in memory.

If a program tries to store too large a number at a memory address reserved for an integer, the operating system will simply lop off the bits that don’t fit. “It’s like a car odometer,” says Stelios Sidiroglou-Douskos, a research scientist at CSAIL and first author on the new paper. “You go over a certain number of miles, you go back to zero.”

In itself, an integer overflow won’t crash a program; in fact, many programmers use integer overflows to perform certain types of computations more efficiently. But if a program tries to do something with an integer that has overflowed, havoc can ensue. Say, for instance, that the integer represents the number of pixels in an image the program is processing. If the program allocates memory to store the image, but its estimate of the image’s size is off by several orders of magnitude, the program will crash.
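
A minimal sketch of that wraparound, using Python's ctypes to mimic a C program's fixed-width integers (the image-size scenario and the variable names are ours, for illustration):

import ctypes

def to_int32(x):
    # Truncate a Python integer to a signed 32-bit value, the way a C
    # program's int behaves: the "odometer" rollover described above.
    return ctypes.c_int32(x).value

# Hypothetical image-size computation: width * height * bytes per pixel.
width, height, bytes_per_pixel = 100_000, 100_000, 4
true_size = width * height * bytes_per_pixel   # 40,000,000,000 bytes
stored_size = to_int32(true_size)              # what a 32-bit int actually holds

print(true_size)     # 40000000000
print(stored_size)   # 1345294336, several orders of magnitude too small

A program that allocates stored_size bytes and then writes true_size bytes of pixel data will run far past the end of its buffer, which is exactly the crash (or attack) scenario described above.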

Charting a course

Any program can be represented as a flow chart — or, more technically, a graph, with boxes that represent operations connected by line segments that represent the flow of data between operations. Any given program input will trace a single route through the graph. Prior techniques for finding integer-overflow bugs would start at the top of the graph and begin working through it, operation by operation.

For even a moderately complex program, however, that graph is enormous; exhaustive exploration of the entire thing would be prohibitively time-consuming. “What this means is that you can find a lot of errors in the early input-processing code,” says Martin Rinard, an MIT professor of computer science and engineering and a co-author on the new paper. “But you haven’t gotten past that part of the code before the whole thing poops out. And then there are all these errors deep in the program, and how do you find them?”

Rinard, Sidiroglou-Douskos, and several other members of Rinard’s group — researchers Eric Lahtinen and Paolo Piselli and graduate students Fan Long, Deokhwan Kim, and Nathan Rittenhouse — take a different approach. Their system, dubbed DIODE (for Directed Integer Overflow Detection), begins by feeding the program a single sample input. As that input is processed, however — as it traces a path through the graph — the system records each of the operations performed on it by adding new terms to what’s known as a “symbolic expression.”

“These symbolic expressions are complicated like crazy,” Rinard explains. “They’re bubbling up through the very lowest levels of the system into the program. This 32-bit integer has been built up of all these complicated bit-level operations that the lower-level parts of your system do to take this out of your input file and construct those integers for you. So if you look at them, they’re pages long.”
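
As a toy illustration of that recording step (ours, not DIODE's actual machinery), each traced value can carry both its concrete 32-bit result and a symbolic expression describing how the input produced it:

class Traced:
    # Pairs a concrete 32-bit value with the expression that produced it.
    def __init__(self, value, expr):
        self.value = value & 0xFFFFFFFF   # model 32-bit wraparound
        self.expr = expr

    def __add__(self, other):
        return Traced(self.value + other.value, f"({self.expr} + {other.expr})")

    def __mul__(self, other):
        return Traced(self.value * other.value, f"({self.expr} * {other.expr})")

# A header field read from a (hypothetical) input file, traced from the start.
n = Traced(16, "input[0]")
four = Traced(4, "4")
size = n * four + four   # concrete arithmetic and symbolic recording together

print(size.value)   # 68
print(size.expr)    # ((input[0] * 4) + 4)

In a real run the expression grows with every bit-level operation the input passes through, which is why the expressions Rinard describes run to pages.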

Trigger warning

When the program reaches a point at which an integer is involved in a potentially dangerous operation — like a memory allocation — DIODE records the current state of the symbolic expression. The initial test input won’t trigger an overflow, but DIODE can analyze the symbolic expression to calculate an input that will.

The process still isn’t over, however: Well-written programs frequently include input checks specifically designed to prevent problems like integer overflows, and the new input, unlike the initial input, might fail those checks. So DIODE seeds the program with its new input, and if that input fails such a check, it imposes a new constraint on the symbolic expression and computes a new overflow-triggering input. The process continues until the system either finds an input that passes the checks but still triggers an overflow, or concludes that triggering an overflow is impossible.
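
That “calculate an input that will” step is a constraint-solving problem. A minimal sketch using the Z3 solver (our stand-in for illustration; the paper's own machinery differs) asks for an input that satisfies a hypothetical sanity check yet still wraps a 32-bit allocation size:

from z3 import BitVec, Solver, sat

n = BitVec("n", 32)   # symbolic stand-in for an integer read from the input

s = Solver()
s.add(n > 0)          # hypothetical input check: the field must be positive
s.add(n * 4 < n)      # overflow goal: the 32-bit product wraps and "shrinks"

if s.check() == sat:
    print("overflow-triggering input:", s.model()[n])
else:
    print("no such input: the overflow cannot be triggered")

Each time the program rejects a candidate input, a loop like DIODE's would add the failed check as a new constraint and re-solve, until the solver either returns a trigger input that passes the checks or reports the constraints unsatisfiable.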

If DIODE does find a trigger value, it reports it, providing developers with a valuable debugging tool. Indeed, since DIODE doesn’t require access to a program’s source code but works on its “binary” — the executable version of the program — a program’s users could run it and then send developers the trigger inputs as graphic evidence that they may have missed security vulnerabilities.

Source: News Office

The rise and fall of cognitive skills

Neuroscientists find that different parts of the brain work best at different ages.

By Anne Trafton


CAMBRIDGE, Mass. – Scientists have long known that our ability to think quickly and recall information, also known as fluid intelligence, peaks around age 20 and then begins a slow decline. However, more recent findings, including a new study from neuroscientists at MIT and Massachusetts General Hospital (MGH), suggest that the real picture is much more complex.

The study, which appears in the journal Psychological Science, finds that different components of fluid intelligence peak at different ages, some as late as age 40.

“At any given age, you’re getting better at some things, you’re getting worse at some other things, and you’re at a plateau at some other things. There’s probably not one age at which you’re peak on most things, much less all of them,” says Joshua Hartshorne, a postdoc in MIT’s Department of Brain and Cognitive Sciences and one of the paper’s authors.

“It paints a different picture of the way we change over the lifespan than psychology and neuroscience have traditionally painted,” adds Laura Germine, a postdoc in psychiatric and neurodevelopmental genetics at MGH and the paper’s other author.

Measuring peaks

Until now, it has been difficult to study how cognitive skills change over time because of the challenge of getting large numbers of people older than college students and younger than 65 to come to a psychology laboratory to participate in experiments. Hartshorne and Germine were able to take a broader look at aging and cognition because they have been running large-scale experiments on the Internet, where people of any age can become research subjects.

Their websites, gameswithwords.org and testmybrain.org, feature cognitive tests designed to be completed in just a few minutes. Through these sites, the researchers have accumulated data from nearly 3 million people in the past several years.

In 2011, Germine published a study showing that the ability to recognize faces improves until the early 30s before gradually starting to decline. This finding did not fit into the theory that fluid intelligence peaks in late adolescence. Around the same time, Hartshorne found that subjects’ performance on a visual short-term memory task also peaked in the early 30s.

Intrigued by these results, the researchers, then graduate students at Harvard University, decided that they needed to explore a different source of data, in case some aspect of collecting data on the Internet was skewing the results. They dug out sets of data, collected decades ago, on adult performance at different ages on the Wechsler Adult Intelligence Scale, which is used to measure IQ, and the Wechsler Memory Scale. Together, these tests measure about 30 different subsets of intelligence, such as digit memorization, visual search, and assembling puzzles.

Hartshorne and Germine developed a new way to analyze the data that allowed them to compare the age peaks for each task. “We were mapping when these cognitive abilities were peaking, and we saw there was no single peak for all abilities. The peaks were all over the place,” Hartshorne says. “This was the smoking gun.”

However, the dataset was not as large as the researchers would have liked, so they decided to test several of the same cognitive skills with their larger pools of Internet study participants. For the Internet study, the researchers chose four tasks that peaked at different ages, based on the data from the Wechsler tests. They also included a test of the ability to perceive others’ emotional state, which is not measured by the Wechsler tests.

The researchers gathered data from nearly 50,000 subjects and found a very clear picture showing that each cognitive skill they were testing peaked at a different age. For example, raw speed in processing information appears to peak around age 18 or 19, then immediately starts to decline. Meanwhile, short-term memory continues to improve until around age 25, when it levels off and then begins to drop around age 35.

For the ability to evaluate other people’s emotional states, the peak occurred much later, in the 40s or 50s.

More work will be needed to reveal why each of these skills peaks at different times, the researchers say. However, previous studies have hinted that genetic changes or changes in brain structure may play a role.

“If you go into the data on gene expression or brain structure at different ages, you see these lifespan patterns that we don’t know what to make of. The brain seems to continue to change in dynamic ways through early adulthood and middle age,” Germine says. “The question is: What does it mean? How does it map onto the way you function in the world, or the way you think, or the way you change as you age?”

Accumulated intelligence

The researchers also included a vocabulary test, which serves as a measure of what is known as crystallized intelligence — the accumulation of facts and knowledge. These results confirmed that crystallized intelligence peaks later in life, as previously believed, but the researchers also found something unexpected: While data from the Wechsler IQ tests suggested that vocabulary peaks in the late 40s, the new data showed a later peak, in the late 60s or early 70s.

The researchers believe this may be a result of better education, more people having jobs that require a lot of reading, and more opportunities for intellectual stimulation for older people.

Hartshorne and Germine are now gathering more data from their websites and have added new cognitive tasks designed to evaluate social and emotional intelligence, language skills, and executive function. They are also working on making their data public so that other researchers can access it and perform other types of studies and analyses.

“We took the existing theories that were out there and showed that they’re all wrong. The question now is: What is the right one? To get to that answer, we’re going to need to run a lot more studies and collect a lot more data,” Hartshorne says.

The research was funded by the National Institutes of Health, the National Science Foundation, and a National Defense Science and Engineering Graduate Fellowship.

Source: MIT News Office

Neuroscientists reverse memories’ emotional associations

MIT study also identifies the brain circuit that links feelings to memories.

By Anne Trafton

Most memories have some kind of emotion associated with them: Recalling the week you just spent at the beach probably makes you feel happy, while reflecting on being bullied provokes more negative feelings.

A new study from MIT neuroscientists reveals the brain circuit that controls how memories become linked with positive or negative emotions. Furthermore, the researchers found that they could reverse the emotional association of specific memories by manipulating brain cells with optogenetics — a technique that uses light to control neuron activity.

The findings, described in the Aug. 27 issue of Nature, demonstrated that a neuronal circuit connecting the hippocampus and the amygdala plays a critical role in associating emotion with memory. This circuit could offer a target for new drugs to help treat conditions such as post-traumatic stress disorder, the researchers say.

“In the future, one may be able to develop methods that help people to remember positive memories more strongly than negative ones,” says Susumu Tonegawa, the Picower Professor of Biology and Neuroscience, director of the RIKEN-MIT Center for Neural Circuit Genetics at MIT’s Picower Institute for Learning and Memory, and senior author of the paper.


This image depicts the injection sites and the expression of the viral constructs in the two areas of the brain studied: the Dentate Gyrus of the hippocampus (middle) and the Basolateral Amygdala (bottom corners).
Credits: Image courtesy of the researchers/MIT

The paper’s lead authors are Roger Redondo, a Howard Hughes Medical Institute postdoc at MIT, and Joshua Kim, a graduate student in MIT’s Department of Biology.

Shifting memories

Memories are made of many elements, which are stored in different parts of the brain. A memory’s context, including information about the location where the event took place, is stored in cells of the hippocampus, while emotions linked to that memory are found in the amygdala.

Previous research has shown that many aspects of memory, including emotional associations, are malleable. Psychotherapists have taken advantage of this to help patients suffering from depression and post-traumatic stress disorder, but the neural circuitry underlying such malleability is not known.

In this study, the researchers set out to explore that malleability with an experimental technique they recently devised that allows them to tag neurons that encode a specific memory, or engram. To achieve this, they label hippocampal cells that are turned on during memory formation with a light-sensitive protein called channelrhodopsin. From that point on, any time those cells are activated with light, the mice recall the memory encoded by that group of cells.

Last year, Tonegawa’s lab used this technique to implant, or “incept,” false memories in mice by reactivating engrams while the mice were undergoing a different experience. In the new study, the researchers wanted to investigate how the context of a memory becomes linked to a particular emotion. First, they used their engram-labeling protocol to tag neurons associated with either a rewarding experience (for male mice, socializing with a female mouse) or an unpleasant experience (a mild electrical shock). In this first set of experiments, the researchers labeled memory cells in a part of the hippocampus called the dentate gyrus.

Two days later, the mice were placed into a large rectangular arena. For three minutes, the researchers recorded which half of the arena the mice naturally preferred. Then, for mice that had received the fear conditioning, the researchers stimulated the labeled cells in the dentate gyrus with light whenever the mice went into the preferred side. The mice soon began avoiding that area, showing that the reactivation of the fear memory had been successful.

The reward memory could also be reactivated: For mice that were reward-conditioned, the researchers stimulated them with light whenever they went into the less-preferred side, and they soon began to spend more time there, recalling the pleasant memory.

A couple of days later, the researchers tried to reverse the mice’s emotional responses. For male mice that had originally received the fear conditioning, they activated the memory cells involved in the fear memory with light for 12 minutes while the mice spent time with female mice. For mice that had initially received the reward conditioning, memory cells were activated while they received mild electric shocks.

Next, the researchers again put the mice in the large two-zone arena. This time, the mice that had originally been fear-conditioned, and had avoided the side of the chamber where the laser activated their hippocampal cells, began spending more time in that side when those cells were activated, showing that a pleasant association had replaced the fearful one. The reversal also took place in mice that went from reward to fear conditioning.

Altered connections

The researchers then performed the same set of experiments but labeled memory cells in the basolateral amygdala, a region involved in processing emotions. This time, they could not induce a switch by reactivating those cells — the mice continued to behave as they had been conditioned when the memory cells were first labeled.

This suggests that emotional associations, also called valences, are encoded somewhere in the neural circuitry that connects the dentate gyrus to the amygdala, the researchers say. A fearful experience strengthens the connections between the hippocampal engram and fear-encoding cells in the amygdala, but that connection can be weakened later on as new connections are formed between the hippocampus and amygdala cells that encode positive associations.

“That plasticity of the connection between the hippocampus and the amygdala plays a crucial role in the switching of the valence of the memory,” Tonegawa says.

These results indicate that while dentate gyrus cells are neutral with respect to emotion, individual amygdala cells are precommitted to encode fear or reward memory. The researchers are now trying to discover molecular signatures of these two types of amygdala cells. They are also investigating whether reactivating pleasant memories has any effect on depression, in hopes of identifying new targets for drugs to treat depression and post-traumatic stress disorder.

David Anderson, a professor of biology at the California Institute of Technology, says the study makes an important contribution to neuroscientists’ fundamental understanding of the brain and also has potential implications for treating mental illness.

“This is a tour de force of modern molecular-biology-based methods for analyzing processes, such as learning and memory, at the neural-circuitry level. It’s one of the most sophisticated studies of this type that I’ve seen,” he says.

The research was funded by the RIKEN Brain Science Institute, Howard Hughes Medical Institute, and the JPB Foundation.