MIT Research News' Journal
 

Wednesday, October 11th, 2017

    5:01a
    Regina Barzilay wins MacArthur “genius grant”

    Regina Barzilay, a professor in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) who does research in natural language processing and machine learning, is a recipient of a 2017 MacArthur Fellowship, sometimes referred to as a “genius grant.”

    The fellowships carry a five-year, $625,000 prize, which recipients are free to use as they see fit. Twenty-one current MIT faculty members and three staff members have won MacArthur Fellowships, which were established in 1981 and are usually given out to roughly 25 people each year.
     
    In accepting the award, Barzilay credited MIT for being an energizing and encouraging community.

    "I have been blessed to work with amazing students and colleagues who challenge my thinking, inspire me, and give me a new perspective on research,” Barzilay says. "From my first days at MIT, it was clear to me that you don't have to conform to existing standards in the field. You are free to explore any direction you like."

    The Delta Electronics Professor of Electrical Engineering and Computer Science, Barzilay does research in natural language processing (NLP) and machine learning. Her research covers multiple areas of NLP, from syntactic parsing and the deciphering of dead languages, to developing new ways to train neural networks that can provide rationales for their decisions.

    “I’m rarely interested in providing yet another solution to traditional NLP tasks,” she says. “I’m most excited about solving problems not within the mainstream of the field that require new perspectives.”

    She has also been active in applying machine learning methods to oncology and drug design, arguing that data-driven approaches will soon revolutionize early detection and treatment of cancer.  

    The MacArthur Foundation cited Barzilay for making “significant contributions to a wide range of problems in computational linguistics, including both interpretation and generation of human language.”

    Barzilay joined the MIT faculty in 2003 after earning her PhD at Columbia University, where her dissertation centered on developing systems that can summarize news stories. She is the recipient of a National Science Foundation CAREER Award, the Microsoft Faculty Fellowship, and multiple “best paper” awards in her field.

    Barzilay also co-teaches 6.036, MIT’s popular Introduction to Machine Learning course, which enrolled more than 700 students this spring. For her contributions to teaching machine learning and natural language processing, she was awarded the Jamieson Award for Excellence in Teaching.

    Other recent MacArthur Fellows on the MIT faculty include economist Heidi Williams (2015); computer scientist Dina Katabi and astrophysicist Sara Seager (2013); writer Junot Díaz (2012); physicist Nergis Mavalvala (2010); development economist Esther Duflo (2009); and architectural engineer John Ochsendorf and physicist Marin Soljačić (2008).

    12:00p
    Will metal supplies limit battery expansion?

    The dramatic rise in production of electric vehicles, coupled with expected growth in the use of grid-connected battery systems for storing electricity from renewable sources, raises a crucial question: Are there enough raw materials to enable significantly increased production of lithium-ion batteries, which are the dominant type of rechargeable batteries on the market?

    A new analysis by researchers at MIT and elsewhere indicates that for the near future, there will be no absolute limitations on battery manufacturing due to shortages of the critical metals they require. But, without proper planning, there could be short-term bottlenecks in the supplies of some metals, particularly lithium and cobalt, that could cause temporary slowdowns in production.

    The analysis, by professor Elsa Olivetti and doctoral student Xinkai Fu in MIT’s Department of Materials Science and Engineering, Gerbrand Ceder at the University of California at Berkeley, and Gabrielle Gaustad at the Rochester Institute of Technology, appears today in the journal Joule.

    Olivetti, who is the Atlantic Richfield Assistant Professor of Energy Studies, says the new journal’s editors asked her to look at possible resource limitations as battery production escalates globally. To do that, Olivetti and her co-authors concentrated on five of the most essential ingredients needed to produce today’s lithium-ion batteries: lithium, cobalt, manganese, nickel, and carbon in the form of graphite. Other key ingredients, such as copper, aluminum, and some polymers used as membranes, are considered abundant enough that they are not likely to be a limiting factor.

    Among those five materials, it was quickly clear that nickel and manganese are used much more widely in other industries; battery production, even if significantly increased, is “not a significant part of the pie,” Olivetti says, so nickel and manganese supplies are not likely to be impacted. Ultimately, the most significant materials whose supply chains could become limited are lithium and cobalt, she says.

    For those two elements, the team looked at the diversity of the supply options in terms of geographical distribution, production facilities, and other variables. For lithium, there are two main pathways to production: the mining of ore and the processing of brines. Of those, production from brine can be ramped up to meet demand much more rapidly, within as little as six or eight months, compared to bringing a new underground mine into production, Olivetti says. Although there might still be disruptions in the supply of lithium, she says, these are unlikely to seriously disrupt battery production.

    Cobalt is a bit more complex. Its major source is the Democratic Republic of the Congo, which has a history of violent conflict and corruption. “That’s been a challenge,” Olivetti says. Cobalt is typically produced as a byproduct of other mining activity. “Often a mine’s revenue comes from nickel, and cobalt is a secondary product,” she says.

    But the main potential cause of delays in obtaining new supplies of the metal comes not from its inherent geographic distribution, but from the actual extraction infrastructure. “The delay is in the ability to open new mines,” she says. “With any of these things, the material is out there, but the question is at what price.” To guard against possible disruptions in the cobalt supply, she says, researchers “are trying to move to cathode materials [for lithium-ion batteries] that are less cobalt-dependent.”

    The study looked out over the next 15 years, and within that time frame, Olivetti says, there are potentially some bottlenecks in the supply chain, but no serious obstacles to meeting the rising demand. Still, she says, “it’s important for stakeholders to be aware of the bottlenecks,” as unanticipated supply disruptions could put some companies out of business. Companies need to think about alternative sources, and “know where and when to panic.”

    And understanding which materials are most subject to disruption could help guide research directions, in deciding “where do we put our development efforts. It does make sense to think of cathodes that use less cobalt,” Olivetti says.

    Overall, she says, “in most cases there are reasonable supplies” of the critical materials, “but there are potential challenges that should be approached with eyes wide open. What we tried to present is a framework by which to think about these challenges in a bit more quantitative way than you usually see.”

    The work was supported by the National Science Foundation.

    12:00p
    Brain waves reflect different types of learning

    Figuring out how to pedal a bike and memorizing the rules of chess require two different types of learning, and now for the first time, researchers have been able to distinguish each type of learning by the brain-wave patterns it produces.

    These distinct neural signatures could guide scientists as they study the underlying neurobiology of how we both learn motor skills and work through complex cognitive tasks, says Earl K. Miller, the Picower Professor of Neuroscience at the Picower Institute for Learning and Memory and the Department of Brain and Cognitive Sciences, and senior author of a paper describing the findings in the Oct. 11 edition of Neuron.

    When neurons fire, they produce electrical signals that combine to form brain waves that oscillate at different frequencies. “Our ultimate goal is to help people with learning and memory deficits,” notes Miller. “We might find a way to stimulate the human brain or optimize training techniques to mitigate those deficits.”

    The neural signatures could help identify changes in learning strategies that occur in diseases such as Alzheimer’s, with an eye to diagnosing these diseases earlier or enhancing certain types of learning to help patients cope with the disorder, says Roman F. Loonis, a graduate student in the Miller Lab and first author of the paper. Picower Institute research scientist Scott L. Brincat and former MIT postdoc Evan G. Antzoulatos, now at the University of California at Davis, are co-authors.

    Explicit versus implicit learning

    Scientists used to think all learning was the same, Miller explains, until they learned about patients such as the famous Henry Molaison, or “H.M.,” who developed severe amnesia in 1953 after having part of his brain removed in an operation to control his epileptic seizures. Molaison couldn’t remember eating breakfast a few minutes after the meal, but he was able to learn and retain new motor skills, such as tracing objects like a five-pointed star in a mirror.

    “H.M. and other amnesiacs got better at these skills over time, even though they had no memory of doing these things before,” Miller says.

    The divide revealed that the brain engages in two types of learning and memory — explicit and implicit.

    Explicit learning “is learning that you have conscious awareness of, when you think about what you’re learning and you can articulate what you’ve learned, like memorizing a long passage in a book or learning the steps of a complex game like chess,” Miller explains.

    “Implicit learning is the opposite. You might call it motor skill learning or muscle memory, the kind of learning that you don’t have conscious access to, like learning to ride a bike or to juggle,” he adds. “By doing it you get better and better at it, but you can’t really articulate what you’re learning.”

    Many tasks, like learning to play a new piece of music, require both kinds of learning, he notes.

    Brain waves from earlier studies

    When the MIT researchers studied the behavior of animals learning different tasks, they found signs that different tasks might require either explicit or implicit learning. In tasks that required comparing and matching two things, for instance, the animals appeared to use both correct and incorrect answers to improve their next matches, indicating an explicit form of learning. But in a task where the animals learned to move their gaze one direction or another in response to different visual patterns, they only improved their performance in response to correct answers, suggesting implicit learning.
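    As a rough, purely illustrative sketch of that behavioral distinction (this is not the study's analysis code, and the task probabilities, learning rate, and exploration rate below are invented), the two feedback strategies can be written as toy value-update rules in Python. The rule that uses both correct and incorrect outcomes learns that the poor option is poor; the rule that updates only on rewarded trials never does:

        import random
        random.seed(0)

        def run(use_errors, trials=500, lr=0.1, eps=0.1):
            # Toy two-choice task: option 1 pays off 80% of the time, option 0 only 20%.
            q = [0.5, 0.5]                       # value estimates for the two responses
            for _ in range(trials):
                if random.random() < eps:        # occasional random exploration
                    choice = random.randrange(2)
                else:
                    choice = 0 if q[0] > q[1] else 1
                p_reward = 0.8 if choice == 1 else 0.2
                reward = 1.0 if random.random() < p_reward else 0.0
                if reward == 1.0 or use_errors:  # the reward-only rule skips unrewarded trials
                    q[choice] += lr * (reward - q[choice])
            return [round(v, 2) for v in q]

        print("uses correct and incorrect feedback:", run(use_errors=True))
        print("uses correct feedback only:         ", run(use_errors=False))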

    What’s more, the researchers found, these different types of behavior are accompanied by different patterns of brain waves.

    During explicit learning tasks, there was an increase in alpha2-beta brain waves (oscillating at 10-30 hertz) following a correct choice, and an increase in delta-theta waves (3-7 hertz) after an incorrect choice. The alpha2-beta waves strengthened as the animals learned explicit tasks, then tapered off once the tasks were learned. The researchers also saw signs of a neural spike in activity that occurs in response to behavioral errors, called event-related negativity, only in the tasks that were thought to require explicit learning.

    The increase in alpha2-beta brain waves during explicit learning “could reflect the building of a model of the task,” Miller explains. “And then after the animal learns the task, the alpha-beta rhythms then drop off, because the model is already built.”

    By contrast, delta-theta rhythms increased only with correct answers during an implicit learning task, and they decreased as learning progressed. Miller says this pattern could reflect neural “rewiring” that encodes the motor skill during learning.

    “This showed us that there are different mechanisms at play during explicit versus implicit learning,” he notes.
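    For readers who want a concrete sense of what such a signature involves, the sketch below shows one common way to compare power in the two frequency bands quoted above, using Welch's method on a synthetic trace. It is a minimal illustration, not the study's analysis pipeline; the sampling rate and the test signal are assumptions made for the example.

        import numpy as np
        from scipy.signal import welch

        fs = 1000                                    # sampling rate in Hz (assumed)
        t = np.arange(0, 2.0, 1 / fs)
        rng = np.random.default_rng(0)
        # Synthetic trace: a 20 Hz component (inside the alpha2-beta band) plus noise.
        x = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)

        freqs, psd = welch(x, fs=fs, nperseg=1024)   # power spectral density estimate

        def band_power(lo, hi):
            # Sum the spectral density over the band, scaled by the frequency resolution.
            mask = (freqs >= lo) & (freqs <= hi)
            return psd[mask].sum() * (freqs[1] - freqs[0])

        print("alpha2-beta (10-30 Hz) power:", band_power(10, 30))
        print("delta-theta (3-7 Hz) power: ", band_power(3, 7))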

    Future boost to learning

    Loonis says the brain wave signatures might be especially useful in shaping how we teach or train a person as they learn a specific task. “If we can detect the kind of learning that’s going on, then we may be able to enhance or provide better feedback for that individual,” he says. “For instance, if they are using implicit learning more, that means they’re more likely relying on positive feedback, and we could modify their learning to take advantage of that.”

    The neural signatures could also help detect disorders such as Alzheimer’s disease at an earlier stage, Loonis says. “In Alzheimer’s, a kind of explicit fact learning disappears with dementia, and there can be a reversion to a different kind of implicit learning,” he explains. “Because the one learning system is down, you have to rely on another one.”

    Earlier studies have shown that certain parts of the brain such as the hippocampus are more closely related to explicit learning, while areas such as the basal ganglia are more involved in implicit learning. But Miller says that the brain wave study indicates “a lot of overlap in these two systems. They share a lot of the same neural networks.”

    The research was funded by the National Institute of Mental Health and the Picower Institute Innovation Fund.

    12:00p
    Making renewable power more viable for the grid

    Wind and solar power are increasingly popular sources for renewable energy. But intermittency issues keep them from connecting widely to the U.S. grid: They require energy-storage systems that, at the cheapest, run about $100 per kilowatt hour and function only in certain locations.

    Now MIT researchers have developed an “air-breathing” battery that could store electricity for very long durations for about one-fifth the cost of current technologies, with minimal location restraints and zero emissions. The battery could be used to make sporadic renewable power a more reliable source of electricity for the grid.

    For its anode, the rechargeable flow battery uses cheap, abundant sulfur dissolved in water. An aerated liquid salt solution in the cathode continuously takes in and releases oxygen that balances charge as ions shuttle between the electrodes. Oxygen flowing into the cathode causes the anode to discharge electrons to an external circuit. Oxygen flowing out sends electrons back to the anode, recharging the battery.

    “This battery literally inhales and exhales air, but it doesn’t exhale carbon dioxide, like humans — it exhales oxygen,” says Yet-Ming Chiang, the Kyocera Professor of Materials Science and Engineering at MIT and co-author of a paper describing the battery. The research appears today in the journal Joule.

    The battery’s total chemical cost — the combined price of the cathode, anode, and electrolyte materials — is about 1/30th the cost of competing batteries, such as lithium-ion batteries. Scaled-up systems could be used to store electricity from wind or solar power, for multiple days to entire seasons, for about $20 to $30 per kilowatt hour.

    Co-authors with Chiang on the paper are: first author Zheng Li, who was a postdoc at MIT during the research and is now a professor at Virginia Tech; Fikile R. Brushett, the Raymond A. and Helen E. St. Laurent Career Development Professor of Chemical Engineering; research scientist Liang Su; graduate students Menghsuan Pan and Kai Xiang; and undergraduate students Andres Badel, Joseph M. Valle, and Stephanie L. Eiler.

    Finding the right balance

    Development of the battery began in 2012, when Chiang joined the Department of Energy’s Joint Center for Energy Storage Research, a five-year project that brought together about 180 researchers to collaborate on energy storage technologies. Chiang, for his part, focused on developing an efficient battery that could reduce the cost of grid-scale energy storage.

    A major issue with batteries over the past several decades, Chiang says, has been a focus on synthesizing materials that offer greater energy density but are very expensive. The most widely used materials in lithium-ion batteries for cellphones, for instance, have a cost of about $100 for each kilowatt hour of energy stored.

    “This meant maybe we weren’t focusing on the right thing, with an ever-increasing chemical cost in pursuit of high energy-density,” Chiang says. He brought the issue to other MIT researchers. “We said, ‘If we want energy storage at the terawatt scale, we have to use truly abundant materials.’”

    The researchers first decided the anode needed to be sulfur, a widely available byproduct of natural gas and petroleum refining that’s very energy dense, having the lowest cost per stored charge next to water and air. The challenge then was finding an inexpensive liquid cathode material that remained stable while producing a meaningful charge. That seemed improbable — until a serendipitous discovery in the lab.

    On a short list of candidates was a compound called potassium permanganate. If used as a cathode material, that compound is “reduced” — a reaction that draws ions from the anode to the cathode, discharging electricity. However, the reduction of the permanganate is normally impossible to reverse, meaning the battery wouldn’t be rechargeable.

    Still, Li tried. As expected, the reversal failed. However, the battery was, in fact, recharging, due to an unexpected oxygen reaction in the cathode, which was running entirely on air. “I said, ‘Wait, you figured out a rechargeable chemistry using sulfur that does not require a cathode compound?’ That was the ah-ha moment,” Chiang says.

    Using that concept, the team of researchers created a type of flow battery, where electrolytes are continuously pumped through electrodes and travel through a reaction cell to create charge or discharge. The battery consists of a liquid anode (anolyte) of polysulfide that contains lithium or sodium ions, and a liquid cathode (catholyte) that consists of an oxygenated dissolved salt, separated by a membrane.

    Upon discharging, the anolyte releases electrons into an external circuit and the lithium or sodium ions travel to the cathode. At the same time, to maintain electroneutrality, the catholyte draws in oxygen, creating negatively charged hydroxide ions. When charging, the process is simply reversed: Oxygen is expelled from the catholyte as hydroxide ions are consumed, and electrons return to the anolyte through the external circuit.

    “What this does is create a charge balance by taking oxygen in and out of the system,” Chiang says.
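    That description of oxygen intake producing hydroxide ions matches the familiar oxygen reduction/evolution couple in alkaline solution, written below in LaTeX; whether this is the precise half-reaction operating in the reported cell is an assumption here, but it illustrates how moving oxygen in and out of the catholyte can balance charge:

        \mathrm{O_2} + 2\,\mathrm{H_2O} + 4e^- \;\rightleftharpoons\; 4\,\mathrm{OH^-}

    Read left to right, oxygen entering the catholyte is reduced to hydroxide during discharge; read right to left, the hydroxide gives the oxygen back up during charging.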

    Because the battery uses ultra-low-cost materials, its chemical cost is one of the lowest — if not the lowest — of any rechargeable battery to enable cost-effective long-duration discharge. Its energy density is slightly lower than today’s lithium-ion batteries.

    “It’s a creative and interesting new concept that could potentially be an ultra-low-cost solution for grid storage,” says Venkat Viswanathan, an assistant professor of mechanical engineering at Carnegie Mellon University who studies energy-storage systems.

    Lithium-sulfur and lithium-air batteries — where sulfur or oxygen are used in the cathode — exist today. But the key innovation of the MIT research, Viswanathan says, is combining the two concepts to create a lower-cost battery with comparable efficiency and energy density. The design could inspire new work in the field, he adds: “It’s something that immediately captures your imagination.”

    Making renewables more reliable

    The prototype is currently about the size of a coffee cup. But flow batteries are highly scalable, Chiang says, and cells can be combined into larger systems.

    As the battery can discharge over months, the best use may be for storing electricity from notoriously unpredictable wind and solar power sources. “The intermittency for solar is daily, but for wind it’s longer-scale intermittency and not so predictable. When it’s not so predictable you need more reserve — the capability to discharge a battery over a longer period of time — because you don’t know when the wind is going to come back next,” Chiang says. Seasonal storage is important too, he adds, especially with increasing distance north of the equator, where the amount of sunlight varies more widely from summer to winter.

    Chiang says this could be the first technology to compete, in cost and energy density, with pumped hydroelectric storage systems, which provide most of the energy storage for renewables around the world but are very restricted by location.

    “The energy density of a flow battery like this is more than 500 times higher than pumped hydroelectric storage. It’s also so much more compact, so that you can imagine putting it anywhere you have renewable generation,” Chiang says.

    The research was supported by the Department of Energy.

    4:40p
    Grad students earn Department of Energy computational fellowships

    Four first-year graduate students have been awarded U.S. Department of Energy (DOE) Computational Science Graduate Fellowships to support their research. Fellows receive full tuition and fees plus an annual stipend and academic allowance, renewable for up to four years. Less than 5 percent of applicants are chosen for the fellowship each year.

    The computational science fellowship program is administered by the Krell Institute and funded by the DOE’s Office of Science and the National Nuclear Security Administration. Each year, the program grants fellowships to support doctoral students who focus on using high-performance computers to solve complex science and engineering problems of national importance. Recipients must complete courses in a scientific or engineering discipline, plus computer science and applied mathematics. They also must do a three-month research practicum at one of 21 DOE laboratories or sites across the country.

    Four MIT students were awarded DOE Computational Science Graduate Fellowships for 2017.

    Peter J. Ahrens is a graduate student in the Department of Electrical Engineering and Computer Science, and his research focuses on using computer science to improve numerical software for scientists. Writing performance-engineered code is difficult because one must optimize for each combination of numerical operation, scientific application, and available hardware, so Ahrens creates algorithms and software that can programmatically generate optimized code for each use case. He also uses code generation within the Julia programming language to write novel interfaces that make it easier to use numerical software.
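    The idea of generating specialized code rather than hand-writing every variant can be illustrated with a deliberately small Python sketch (Ahrens's actual work targets Julia and real performance tuning, so this is only an analogy; the kernel shape and expression below are invented):

        def make_kernel(expr, name="kernel"):
            # Build source for a loop specialized to `expr`, then compile it with exec.
            src = (
                f"def {name}(xs, ys):\n"
                f"    out = [0.0] * len(xs)\n"
                f"    for i in range(len(xs)):\n"
                f"        x, y = xs[i], ys[i]\n"
                f"        out[i] = {expr}\n"
                f"    return out\n"
            )
            namespace = {}
            exec(src, namespace)
            return namespace[name]

        # Generate a kernel for one particular operation instead of writing it by hand.
        axpy_like = make_kernel("2.0 * x + y")
        print(axpy_like([1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))   # [12.0, 24.0, 36.0]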

    Miriam Rathbun is a graduate student in the Department of Nuclear Science and Engineering and a member of Benoit Forget’s research group. Rathbun’s computational reactor physics research focuses on high-fidelity modeling of nuclear reactors. In particular, she is interested in multiphysics problems where several physical phenomena influence each other. Multiphysics research seeks to create a platform for solvers to be more compatible with each other and to make simulations that more accurately predict reality.
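    One standard way such mutually dependent physics can be coupled is fixed-point (Picard) iteration between single-physics solvers. The Python sketch below is a generic illustration of that pattern, not Rathbun's models; the toy feedback relations and all constants are invented:

        def power_from_temperature(T):
            # Toy feedback: hotter fuel lowers reactivity and therefore power.
            return 100.0 / (1.0 + 0.002 * (T - 300.0))

        def temperature_from_power(P):
            # Toy thermal solve: coolant at 300 K plus a rise proportional to power.
            return 300.0 + 1.5 * P

        T = 300.0                              # initial temperature guess, in kelvin
        for it in range(50):
            P = power_from_temperature(T)      # "neutronics" pass
            T_new = temperature_from_power(P)  # "thermal" pass
            if abs(T_new - T) < 1e-6:          # stop once the fields are self-consistent
                break
            T = T_new

        print(f"converged after {it} iterations: power = {P:.2f}, temperature = {T_new:.2f} K")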

    Kevin Silmore of the Department of Chemical Engineering studies the dynamics and self-assembly of anisotropic colloidal particles, or particles that are not spherical. There are countless examples of particles that either occur in nature or are engineered in the laboratory that are not spherical. One prime example is carbon nanotubes, which exhibit many interesting properties and can be used in applications ranging from biosensors to energy harvesters. A better understanding of the physical processes that govern the behavior of such particles could therefore help inform the design of advanced materials with tunable electronic, optical, or mechanical properties.

    Annie Yuan Wei, a student in the Department of Physics, intends to work in quantum information, quantum algorithms, and quantum computing. Her research involves thinking about how quantum mechanics relates to information processing and how researchers can come up with ways to do things with quantum computers that might not be possible today.

    Since it was launched in 1991, the fellowship program has supported 436 students at more than 65 universities. The four MIT students are among 20 first-year recipients nationwide this year, bringing the total number of current fellows to 79, spread across 14 states.

    For more information on the DOE Computational Science Graduate Fellowships, please visit the Krell Institute website.

    5:30p
    Establishing interdisciplinary approaches to agriculture and fundamental biological processes

    From optimizing food production to feed a growing population to discovering the fundamental behaviors and processes of biopolymers, faculty in the Department of Civil and Environmental Engineering (CEE) are leveraging the interdisciplinary nature of the department to establish two new, innovative projects.

    H.M. King Bhumibol Professor Dennis McLaughlin, a hydrologist, and Mitsui Chair Professor Serguei Saavedra, a network and community ecologist, are working together to examine the impact of resource allocation and community ecology on crop productivity and resilience. Additionally, assistant professors Tal Cohen and Otto X. Cordero are joining forces in an effort to discover the relationship between the physical and biological processes that underlie how biopolymers are consumed by bacteria.

    The projects are funded by CEE’s Cross-Disciplinary Seed Funds, which invite CEE faculty from differing disciplines to apply for funding to support the creation of collaborative research projects.

    “Through the Cross-Disciplinary Seed Funds and department events such as CEE Research Speed Dating, I encourage faculty members to reflect on how their expertise could be used in conjunction with another faculty member in a disparate area of the department to create new and innovative approaches to a topic,” says Markus Buehler, head of CEE, McAfee Professor of Engineering, and creator of the initiative. “The aim of the Cross-Disciplinary Seed Funds is to allow faculty to act on these ideas and make them realities. Every year I look forward to seeing the creative applications and the exciting research and findings these collaborations produce.”

    For McLaughlin and Saavedra, the new research project will allow them to carve out a new way of approaching the growing demand for the land and water resources needed to feed the global population. By combining Saavedra’s ecological theories with McLaughlin’s experience in optimizing resource use, the pair are charting a new approach to balancing ecological diversity with food production.

    “Questions that are relevant to this project are issues of land use in areas with high biological diversity that are fragile ecosystems, but that are attractive, at least superficially, for agricultural development,” McLaughlin said. “We’ve got this growing global population that is pushing food demand, so there’s a demand for more food, but it’s not clear where it’s going to be grown, except places where the local ecosystem might have to be greatly changed.”

    Previous research on food production has considered factors such as climate, water, and soil. McLaughlin and Saavedra are adding factors of ecological and landscape diversity, including resource competition among plants and the ecosystem’s tolerance to environmental stresses, to this list of considerations. Through their newly launched project, the pair are attempting to identify the effects of plant diversity and plot scale on crop productivity and resilience. Using this information, they are aiming to create a model for allocating land and water resources in environments where ecological factors, such as biodiversity and environmental stress, are present.

    “We are trying to combine our research into an ‘eco-optimization model’ that optimizes production, but at the same time takes into account the ecological implications of this growth. These two things cannot be separated, but so far they have been treated separately,” Saavedra explained.
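    As a toy example of what an allocation model of this kind might look like (all crops, yields, and limits below are invented, and the real project is far richer), one can pose a small linear program that maximizes production subject to land and water budgets, with a floor on each crop's area standing in crudely for a diversity requirement:

        from scipy.optimize import linprog

        # Maximize 4a + 6b tons of food (so minimize the negative of it).
        c = [-4.0, -6.0]
        A_ub = [
            [1.0, 1.0],   # land:  a + b   <= 100 hectares
            [2.0, 5.0],   # water: 2a + 5b <= 350 units
        ]
        b_ub = [100.0, 350.0]
        bounds = [(20.0, None), (20.0, None)]   # at least 20 ha of each crop ("diversity" floor)

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        print("hectares per crop:", res.x, "  total yield:", -res.fun)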

    The project, which has its genesis in a thesis project by one of McLaughlin’s graduate students, uses Hawaii as its case study. CEE faculty and students have been pursuing various research projects in Hawaii, including air quality sensing and plant health on farms, during the department’s Traveling Research Environmental eXperiences (TREX) program. McLaughlin and Saavedra’s project offers a new potential topic of research for TREX students in upcoming years.

    The project will also be useful as supplementary case studies for Saavedra’s course 1.s977 (Modeling Community Diversity) and McLaughlin’s course 1.74 (Land, Water, Food, and Climate).

    The other project funded by this year’s CEE Seed Fund award is being spearheaded by Cohen, who specializes in nonlinear solid mechanics and material instabilities, and Cordero, a microbiologist who studies micro-scale ecology.

    By combining forces, the pair are initiating a project to understand how bacteria colonize and decompose complex biological materials, a process that is crucial to the global carbon cycle.

    Materials such as plant fibers can only be broken down by microorganisms that release extracellular enzymes capable of dissolving complex biopolymer networks. This breakdown of polymers releases into the environment considerable amounts of carbon that would otherwise remain sequestered inside the biological material.

    This decomposition of plant fiber fragments, which occurs in soils, oceans, and animal guts, drives the planet’s carbon cycle. The process, however, depends on the interplay between mechanical instabilities that emerge within the plant tissue and the ecology of the microbes growing on the surface of the material. That interplay between physics and ecology has not been studied before, and it is unclear to what extent the speed of degradation depends on it. To address the question, Cohen and Cordero are integrating mathematical models and experiments in a multidisciplinary approach.

    To fully understand how biopolymers are decomposed by bacteria, Cordero’s lab is creating a micro-scale model ecosystem to understand how the individual bacteria break down a controlled hydrogel, representing the biopolymer. An understanding of a single particle will provide insight into the fundamental principles that govern the global function of microbial ecosystems.
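    A generic way to think about such a particle-scale system (not the labs' actual model, and with all rate constants and initial values invented) is a pair of coupled differential equations in which enzymatic degradation of the polymer feeds bacterial growth:

        import numpy as np
        from scipy.integrate import solve_ivp

        k, K, biomass_yield, death_rate = 1.0, 0.5, 0.4, 0.05   # assumed constants

        def rhs(t, state):
            S, B = state                         # polymer substrate and bacterial biomass
            uptake = k * B * S / (K + S)         # degradation rate set by surface enzymes
            return [-uptake, biomass_yield * uptake - death_rate * B]

        sol = solve_ivp(rhs, (0.0, 50.0), [10.0, 0.01], t_eval=np.linspace(0.0, 50.0, 6))
        for t, S, B in zip(sol.t, sol.y[0], sol.y[1]):
            print(f"t = {t:5.1f}   substrate = {S:6.3f}   biomass = {B:6.3f}")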

    “It was clear to us that there were biological and ecological problems that were directly interfacing with physics. Some of this physics we can take care of in our lab, but once it starts to get to the physics of the materials and why they break and the forces at play, it becomes much more complicated. And that was actually something Tal was working on in different contexts,” Cordero said. 

    Thus, Cohen’s lab is creating a theoretical model to describe the mechanics of polymer surface growth and degradation, such as those at play in Cordero’s controlled setting. The theoretical models of the system will then be applied to the model system developed by the Cordero lab to get a complete understanding of the process from both a biological and a mechanical perspective.

    Together, the two labs plan to develop a mathematical model integrating the mechanical approach with biological observations.

    “I think it would be super exciting if we find something that shows a non-trivial interaction between the physics and biology determining how fast those things can break up,” Cordero said. With a fundamental understanding of how the degradation process works using his model ecosystem and Cohen’s theories, Cordero notes that the model could potentially be used for more complex organisms. 

    “For me, primarily being a theoretician, anytime I get this opportunity to work with people who are working on ideas that are very different from the types of problems I’ve been working on, but I can apply the same tools to it, is a super exciting opportunity,” Cohen said.

    The project stems from a conversation between graduate students from Cohen’s lab and Cordero’s lab at CEE’s annual Research Speed Dating event, and the students will be integral to the theoretical and experimental components of the research project.

    “I think [the collaboration is] a great opportunity for students in the sense that they get acquainted with a different community and a different type of thinking. We tend to have these disciplines that think about problems in very specific ways but it’s very fruitful to talk and kind of cross-pollinate,” Cohen said.

    Now in its fourth year, the Cross-Disciplinary Seed Funds support one graduate student from each of the selected projects for one year. The student is chosen by the faculty members to receive the funding and will subsequently be mentored by both of them. This style of project also gives the graduate student a broader perspective on the research at hand. Selected projects also demonstrate the potential for a long-term collaboration beyond the one year of department funding.

    CEE faculty often collaborate to create multi-faceted, comprehensive solutions to a variety of major problems. Researchers in CEE also partner with other departments, labs, and centers at MIT and at other institutions, including universities, governmental organizations, and industry practitioners, to broaden the potential impact of their work and apply it to other domains.

    “In order to do good research, you have to collaborate with the experts in those fields. These days, science is not done by a single person, it has two or more people involved, so definitely that helps a lot,” Saavedra said. “There are many things that you don’t see because you’re so biased by your own expertise, that once you start explaining things to people that are not necessarily experts or see the world in a different way, that is hugely interesting and beneficial, not only for the project but also for your work.”

