MIT Research News' Journal
 

Thursday, March 31st, 2016

    11:59a
    How the brain processes emotions

    Some mental illnesses may stem, in part, from the brain’s inability to correctly assign emotional associations to events. For example, people who are depressed often do not feel happy even when experiencing something that they normally enjoy.

    A new study from MIT reveals how two populations of neurons in the brain contribute to this process. The researchers found that these neurons, located in an almond-sized region known as the amygdala, form parallel channels that carry information about pleasant or unpleasant events.

    Learning more about how this information is routed and misrouted could shed light on mental illnesses including depression, addiction, anxiety, and posttraumatic stress disorder, says Kay Tye, the Whitehead Career Development Assistant Professor of Brain and Cognitive Sciences and a member of MIT’s Picower Institute for Learning and Memory.

    “I think this project really cuts across specific categorizations of diseases and could be applicable to almost any mental illness,” says Tye, the senior author of the study, which appears in the March 31 online issue of Neuron.

    The paper’s lead authors are postdoc Anna Beyeler and graduate student Praneeth Namburi.

    Emotional circuits

    In a previous study, Tye’s lab identified two populations of neurons involved in processing positive and negative emotions. One of these populations relays information to the nucleus accumbens, which plays a role in learning to seek rewarding experiences, while the other sends input to the centromedial amygdala.

    In the new study, the researchers wanted to find out what those neurons actually do as an animal reacts to a frightening or pleasurable stimulus. To do that, they first tagged each population with a light-sensitive protein called channelrhodopsin. In three groups of mice, they labeled cells projecting to the nucleus accumbens, the centromedial amygdala, and a third population that connects to the ventral hippocampus. Tye’s lab has previously shown that the connection to the ventral hippocampus is involved in anxiety.

    Tagging the neurons is necessary because the populations that project to different targets are otherwise indistinguishable. “As far as we can tell they’re heavily intermingled,” Tye says. “Unlike some other regions of the brain, there is no topographical separation based on where they go.”

    After labeling each cell population, the researchers trained the mice to discriminate between two different sounds, one associated with a reward (sugar water) and the other associated with a bitter taste (quinine). They then recorded electrical activity from each group of neurons as the mice encountered the two stimuli. This technique allows scientists to compare the brain’s anatomy (which neurons are connected to each other) and its physiology (how those neurons respond to environmental input).

    The researchers were surprised to find that neurons within each subpopulation did not all respond the same way. Some responded to one cue, some to the other, and some to both. Some neurons were excited by a cue while others were inhibited by it.

    “The neurons within each projection are very heterogeneous. They don’t all do the same thing,” Tye says.

    However, despite these differences, the researchers did find overall patterns for each population. Among the neurons that project to the nucleus accumbens, most were excited by the rewarding stimulus and did not respond to the aversive one. Among neurons that project to the centromedial amygdala, most were excited by the aversive cue but not the rewarding cue. Neurons that project to the ventral hippocampus were more evenly balanced between responding to the positive and negative cues.
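    The per-neuron readout described above can be sketched in code. This is an illustrative toy (the data, threshold, and neuron labels are invented, not the study's recordings): each neuron is classified by how its firing rate changes after the reward cue versus the aversive cue, which is the kind of analysis that revealed the heterogeneity within each projection.

```python
# Toy classification of recorded neurons by cue response.
# Data and threshold are hypothetical, for illustration only.

THRESHOLD = 2.0  # firing-rate change (Hz) counted as a real response


def classify(delta_hz: float) -> str:
    """Label one neuron's response to one cue."""
    if delta_hz > THRESHOLD:
        return "excited"
    if delta_hz < -THRESHOLD:
        return "inhibited"
    return "no response"


# (neuron id, rate change to reward cue, rate change to aversive cue)
recordings = [
    ("n1", +6.1, -0.4),  # excited by reward only
    ("n2", -3.5, +5.0),  # inhibited by reward, excited by aversive
    ("n3", +4.2, +4.8),  # excited by both cues
]

for name, d_reward, d_aversive in recordings:
    print(name, classify(d_reward), classify(d_aversive))
```

    Aggregating these per-neuron labels across each tagged population is what yields the overall biases reported above, without hiding the individual variability.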

    “This is consistent with the previous paper, but we added the actual neural dynamics of the firing and the heterogeneity that was masked by the previous approach of optogenetic manipulation,” Tye says. “The missing piece of that story was what are these neurons actually doing, in real time, when the animal is being presented with stimuli.”

    Digging deep

    The findings suggest that to fully understand how the brain processes emotions, neuroscientists will have to delve deeper into more specific populations, Tye says.

    “Five or 10 years ago, everything was all about specific brain regions. And then in the past four or five years there’s been more focus on specific projections. And now, this study presents a window into the next era, when even specific projections are not specific enough. There’s still heterogeneity even when you subdivide at this level,” she says. “We’ve still got a long way to go in terms of appreciating the full complexities of the brain.”

    “Neuroscience is quickly moving beyond the classical idea of ‘one brain region equals one function,’” says Joshua Johansen, a team leader at the RIKEN Brain Science Institute in Japan, who was not involved in the research. “This paper represents an important step in this process by showing that within the amygdala, the way distinct populations of cells process information is a critical determinant of how emotional responses arise.”

    Another remaining question is why these different populations are intermingled in the amygdala. One hypothesis is that cells responding to different inputs need to be able to interact quickly with each other, coordinating responses to an urgent signal, such as an alert that danger is present. “We are exploring the interactions between these different projections, and we think that could be a key to how we so quickly select an appropriate action when we’re presented with a stimulus,” Tye says.

    In the long term, the researchers hope their work will lead to new therapies for mental illnesses. “The first step is to define the circuits and then try to go in animal models of these pathologies and see how these circuits are functioning differently. Then we can try to develop strategies to restore them and try to translate that to human patients,” says Beyeler, who is soon starting her own lab at the University of Lausanne to further pursue this line of research.

    1:59p
    A programming language for living cells

    MIT biological engineers have created a programming language that allows them to rapidly design complex, DNA-encoded circuits that give new functions to living cells.

    Using this language, anyone can write a program for the function they want, such as detecting and responding to certain environmental conditions. They can then generate a DNA sequence that will achieve it.

    “It is literally a programming language for bacteria,” says Christopher Voigt, an MIT professor of biological engineering. “You use a text-based language, just like you’re programming a computer. Then you take that text and you compile it and it turns it into a DNA sequence that you put into the cell, and the circuit runs inside the cell.”

    Voigt and colleagues at Boston University and the National Institute of Standards and Technology have used this language, which they describe in the April 1 issue of Science, to build circuits that can detect up to three inputs and respond in different ways. Future applications for this kind of programming include designing bacterial cells that can produce a cancer drug when they detect a tumor, or creating yeast cells that can halt their own fermentation process if too many toxic byproducts build up.

    The researchers plan to make the user design interface available on the Web.

    No experience needed

    Over the past 15 years, biologists and engineers have designed many genetic parts, such as sensors, memory switches, and biological clocks, that can be combined to modify existing cell functions and add new ones.

    However, designing each circuit is a laborious process that requires great expertise and often a lot of trial and error. “You have to have this really intimate knowledge of how those pieces are going to work and how they’re going to come together,” Voigt says.

    Users of the new programming language, however, need no special knowledge of genetic engineering.

    “You could be completely naive as to how any of it works. That’s what’s really different about this,” Voigt says. “You could be a student in high school and go onto the Web-based server and type out the program you want, and it spits back the DNA sequence.”

    The language is based on Verilog, which is commonly used to program computer chips. To create a version of the language that would work for cells, the researchers designed computing elements such as logic gates and sensors that can be encoded in a bacterial cell’s DNA. The sensors can detect different compounds, such as oxygen or glucose, as well as light, temperature, acidity, and other environmental conditions. Users can also add their own sensors. “It’s very customizable,” Voigt says.
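    The workflow described above can be sketched as code. This is a hedged illustration, not the MIT tool's actual implementation or input format: the sensor names, wiring, and gate choices below are hypothetical. It shows the core idea that a text-level logic specification reduces to a network of simple gates, which a compiler like the one described would then map onto repressor/promoter pairs in DNA. NOR is used throughout because a repressed promoter naturally computes NOR over its inputs.

```python
# Illustrative sketch: a small logic circuit built entirely from NOR
# gates, the style of network a genetic-circuit compiler targets.
# Sensor names and the specification are invented for illustration.


def nor(a: bool, b: bool) -> bool:
    """One gate. In a cell, this would be a promoter that is active
    only when neither of its two repressor inputs is present."""
    return not (a or b)


def circuit(oxygen: bool, glucose: bool, light: bool) -> bool:
    """Hypothetical spec: express the output gene only when oxygen is
    present and neither glucose nor light is detected."""
    w1 = nor(glucose, light)     # TRUE when glucose and light are both absent
    w2 = nor(oxygen, oxygen)     # NOT oxygen (NOR with both inputs tied)
    return nor(w2, nor(w1, w1))  # oxygen AND w1, expressed via NOR gates

print(circuit(oxygen=True, glucose=False, light=False))  # circuit is ON
print(circuit(oxygen=True, glucose=True, light=False))   # circuit is OFF
```

    In the real system, the equivalent specification would be written in the Verilog-based text language, and the compiler's job is to assign each abstract gate a DNA-encoded part that behaves correctly alongside the others inside the cell.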

    The biggest challenge, he says, was designing the 14 logic gates used in the circuits so that they wouldn’t interfere with each other once placed in the complex environment of a living cell.

    In the current version of the programming language, these genetic parts are optimized for E. coli, but the researchers are working on expanding the language for other strains of bacteria, including Bacteroides, commonly found in the human gut, and Pseudomonas, which often lives in plant roots, as well as the yeast Saccharomyces cerevisiae. This would allow users to write a single program and then compile it for different organisms to get the right DNA sequence for each one.

    Biological circuits

    Using this language, the researchers programmed 60 circuits with different functions, and 45 of them worked correctly the first time they were tested. Many of the circuits were designed to measure one or more environmental conditions, such as oxygen level or glucose concentration, and respond accordingly. Another circuit was designed to rank three different inputs and then respond based on the priority of each one.
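    The ranking behavior mentioned above can be sketched as follows. This is a hypothetical illustration (the signal names, priority order, and responses are invented): given several inputs with an assigned priority, the circuit responds only to the highest-priority signal that is present, much like a priority encoder in digital logic.

```python
# Hypothetical sketch of a circuit that ranks three inputs and responds
# to the highest-priority one present. Names are invented for illustration.

PRIORITY = ["toxin", "low_oxygen", "glucose"]  # highest priority first


def respond(signals: set) -> str:
    """Return the response for the highest-priority signal detected."""
    for s in PRIORITY:
        if s in signals:
            return "response_to_" + s
    return "idle"

print(respond({"glucose", "toxin"}))  # toxin outranks glucose
print(respond({"glucose"}))
print(respond(set()))
```

    In a genetic implementation, this ordering would be fixed by how the gates for each sensor are wired together, so lower-priority responses are suppressed whenever a higher-priority input is active.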

    One of the new circuits is the largest biological circuit ever built, containing seven logic gates and about 12,000 base pairs of DNA.

    Another advantage of this technique is its speed. Until now, “it would take years to build these types of circuits. Now you just hit the button and immediately get a DNA sequence to test,” Voigt says.

    His team plans to work on several different applications using this approach: bacteria that can be swallowed to aid in digestion of lactose; bacteria that can live on plant roots and produce insecticide if they sense the plant is under attack; and yeast that can be engineered to shut off when they are producing too many toxic byproducts in a fermentation reactor.

    The lead author of the Science paper is MIT graduate student Alec Nielsen. Other authors are former MIT postdoc Bryan Der, MIT postdoc Jonghyeon Shin, Boston University graduate student Prashant Vaidyanathan, Boston University associate professor Douglas Densmore, and National Institute of Standards and Technology researchers Vanya Paralanov, Elizabeth Strychalski, and David Ross.

    2:00p
    Pharmacy on demand

    MIT researchers have developed a compact, portable pharmaceutical manufacturing system that can be reconfigured to produce a variety of drugs on demand.

    Just as an emergency generator supplies electricity to handle a power outage, this system could be rapidly deployed to produce drugs needed to handle an unexpected disease outbreak, or to prevent a drug shortage caused by a manufacturing plant shutdown, the researchers say.

    “Think of this as the emergency backup for pharmaceutical manufacturing,” says Allan Myerson, an MIT professor of the practice in the Department of Chemical Engineering. “The purpose is not to replace traditional manufacturing; it’s to provide an alternative for these special situations.”

    Such a system could also be used to produce small quantities of drugs needed for clinical trials or to treat rare diseases, says Klavs Jensen, the Warren K. Lewis Professor of Chemical Engineering at MIT.

    “The goal of this project was to build a small-scale, portable unit that was completely integrated, so you could imagine being able to ship it anywhere. And as long as you had the right chemicals, you could make pharmaceuticals,” Jensen says.

    Jensen, Myerson, and Timothy Jamison, the head of MIT’s Department of Chemistry, are the senior authors of a paper describing the new system in the March 31 online edition of Science. The lead author is MIT research associate Andrea Adamo.

    More flexibility

    Traditional drug manufacturing, also known as “batch processing,” can take weeks or months. Active pharmaceutical ingredients are synthesized in chemical manufacturing plants and then shipped to other sites to be converted into a form that can be given to patients, such as tablets, drug solutions, or suspensions. This system offers little flexibility to respond to surges in demand and is susceptible to severe disruption if one of the plants has to shut down.

    Many pharmaceutical companies are now looking into developing an alternative approach known as flow processing — a continuous process that is done all in one location. Five years ago, an MIT team that included Jamison, Jensen, and Myerson demonstrated a larger prototype (24 by 8 by 8 feet) for the continuous integrated manufacturing of drugs from chemical synthesis to tablets. That project has ended, but the continuous manufacturing initiative, funded by Novartis, is still underway as the researchers develop new methods for synthesis, purification, and formulation.   

    In the new endeavor, funded by the Defense Advanced Research Projects Agency (DARPA), the MIT researchers built on what they learned from the Novartis-funded project to create a much smaller, transportable device. Their new system can produce four drugs formulated as solutions or suspensions — Benadryl, lidocaine, Valium, and Prozac. Using this apparatus, the researchers can manufacture about 1,000 doses of a given drug in 24 hours.

    Key to the continuous system is the development of chemical reactions that can take place as the reactants flow through relatively small tubes as opposed to the huge vats in which most pharmaceutical reactions now take place. Traditional batch processing is limited by the difficulty of cooling these vats, but the flow system allows reactions that produce a great deal of heat to be run safely.

    “In many cases we were developing syntheses of targets that had never been done in a continuous flow platform,” Jamison says. “That presents a lot of challenges even if there is a good precedent from the batch perspective. We also recognized it as an opportunity where, because of some of the phenomena that one can leverage in [a flow-based system], you can make molecules differently.”

    The chemical reactions required to synthesize each drug take place in the first of two modules. The reactions were designed so that they can take place at temperatures up to 250 degrees Celsius and pressures up to 17 atmospheres.

    By swapping in different module components, the researchers can easily reconfigure the system to produce different drugs. “Within a few hours we could change from one compound to the other,” Jensen says.

    In the second module, the crude drug solution is purified by crystallization, filtered, and dried to remove solvent, then dissolved or suspended in water as the final dosage form. The researchers also incorporated an ultrasound monitoring system that ensures the formulated drug solution is at the correct concentration.
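    The two-module design described above can be sketched schematically. This is a hedged toy model, not the actual system's control software: the module interfaces, purity numbers, and drug name are invented. It shows the design choice that matters, namely that synthesis and purification/formulation are separate stages, so swapping the synthesis module reconfigures the system for a different drug while the downstream module stays the same.

```python
# Toy model of the reconfigurable two-module flow system.
# All values and interfaces here are hypothetical, for illustration.

from dataclasses import dataclass


@dataclass
class Stream:
    compound: str
    purity: float  # fraction of product in the stream


def synthesis_module(drug: str) -> Stream:
    # Module 1: flow reactions in small tubes permit hot, exothermic
    # chemistry that would be hard to cool in a large batch vat.
    return Stream(compound=drug, purity=0.6)


def purification_module(crude: Stream) -> Stream:
    # Module 2: crystallize, filter, dry, then redissolve or suspend
    # in water as the final dosage form.
    return Stream(compound=crude.compound, purity=0.99)


def run(drug: str) -> Stream:
    return purification_module(synthesis_module(drug))

dose = run("lidocaine")
print(dose.compound, dose.purity)
```

    Changing drugs then amounts to swapping the synthesis stage, consistent with the reconfiguration within a few hours described above.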

    Small-scale manufacturing

    John Lewin, the division director of critical care and surgery pharmacy at Johns Hopkins Hospital, says this type of manufacturing could bring down production costs and help patients get better access to the drugs they need.

    “This sets the foundation for a new paradigm in terms of the way we manufacture pharmaceuticals and distribute them to patients,” says Lewin, who was not involved in the study. “Such a device could really meet a lot of the supply chain challenges here in the U.S. and around the world.”

    One of the advantages of this small-scale system is that it could be used to make small amounts of drugs that would be prohibitively expensive to make in a large-scale plant. This would be useful for so-called “orphan drugs” — drugs needed by a small number of patients. “Sometimes it’s very difficult to get those drugs, because economically it makes no sense to have a huge production operation for those,” Jensen says.

    It could also be useful in regions with few pharmaceutical storage facilities, because drugs can be produced on demand, eliminating the need for long-term storage.

    “The idea here is you make what you need, and you make a simple dosage form, because they’re going to be taken on demand. The dosages don’t have to have long-term stability,” Myerson says. “People line up, you make it, and they take it.”

    The researchers are now working on the second phase of the project, which includes making the system about 40 percent smaller and producing drugs whose chemical syntheses are more complex. They are also working on producing tablets, which are more complicated to manufacture than liquid drugs.
