MIT Research News' Journal
 

Thursday, November 13th, 2014

    10:30a
    3 Questions: Thomas Malone on the Climate CoLab’s progress

    Last week, the Climate CoLab — an international effort, based at MIT’s Center for Collective Intelligence, that aims to harness the collective intelligence of people around the world to address climate change — hosted its annual conference. Participants discussed how new technology-enabled, crowd-based approaches can help to develop creative new ideas and take meaningful action on climate change. Thomas Malone, the Patrick J. McGovern Professor of Management at the MIT Sloan School of Management and principal investigator of the Climate CoLab, spoke with MIT News about the event.

    Q. How is the Climate CoLab doing? 

    A. The Climate CoLab’s online community has more than tripled in size in the last 14 months. We now have more than 33,000 members from all over the world, including some of the world’s leading climate-change experts as well as scientists, businesspeople, students, policymakers, and many others. In our second annual Climate CoLab Conference at MIT, the winners of this year’s contests presented their ideas and discussed them with experts and others who could help put the ideas into practice.

    For instance, this year’s winning ideas included proposals to (a) institute a U.S. carbon tax and use the revenue to benefit poor households, reduce corporate income taxes, and reduce the federal deficit; (b) better educate building technicians to take advantage of the often-unused energy-saving potential of sophisticated heating and cooling systems; and (c) develop radio programs to help residents of coastal areas in Tanzania, and other developing countries, adapt to changing weather and other effects of climate change.

    Q. What is the Climate CoLab learning about climate change?

    A. This insight is not novel, but I have become increasingly convinced that many of the key problems of climate change today are not just about the physical sciences and technology, but about the social sciences of how to change human systems. Part of our work in the Climate CoLab has involved creating a systematic taxonomy of what we humans can do about the whole problem of climate change, including not only the physical actions we could take, but also the social, political, and economic ones.

    And we’ve been pleased by the kinds of creative ideas we’ve been getting from the Climate CoLab community. For instance, one of last year’s winning proposals was about how to create and promote the “China dream,” an aspirational lifestyle for Chinese consumers that would be more sustainable than the “American dream.” And several of this year’s winners suggested possible ways of putting a price on carbon emissions using currencies like Bitcoin that wouldn’t require government involvement.

    By the way, in a recent survey of Climate CoLab members, we found evidence that winning ideas in the CoLab contests were just as likely to come from people who didn’t have previous climate-change experience as from those who did.

    Q. What is the Climate CoLab learning about crowdsourcing hard problems?

    A. We believe the general approach to solving problems we’ve been using in the Climate CoLab can be applied to many other kinds of hard problems, in society, business, and other areas. The core of the approach involves (a) breaking a problem down into pieces; (b) attracting a wide community of people to work on those pieces; (c) evaluating the potential solutions; and then (d) reassembling the pieces into solutions for the whole problem. None of these steps is easy, but we’ve been pleased by the breadth of insight that’s possible when this happens.

    For instance, one lesson we learned is about a way of combining experts and crowds that takes advantage of the best of both. In the first year of the Climate CoLab contests, anyone could create a proposal, and anyone could vote for the proposal they liked best. The proposal that won the vote that year was one called “350 ppm or bust,” which proposed reducing all emissions in all regions of the world by almost 99 percent in 20 years.

    Many people liked this proposal because it resulted in low carbon concentration in the atmosphere, but all the experts we talked to said it would be completely impossible to reduce emissions that fast unless there were some kind of global catastrophe. So in all the subsequent contests, we’ve had expert judges pick finalists that are at least feasible before we ask community members to vote on the options they like best. In this way, we take advantage of the experts’ deep knowledge about what is technically and economically feasible, and we take advantage of the crowd’s knowledge of what the public wants.
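    The division of labor Malone describes maps onto a simple pipeline: experts screen for feasibility, then the crowd ranks the finalists. The short Python sketch below illustrates only that structure; the proposal names echo the contests mentioned above, but the vote counts and the screening rule are invented, and the real CoLab process is run by human judges and community members, not code.

        # Toy sketch of the expert-filter-then-crowd-vote pipeline described
        # above. Vote counts and the screening rule are hypothetical.

        def is_feasible(proposal):
            # Stand-in for expert judgment; in reality, judges screen proposals.
            return proposal != "350 ppm or bust"

        def select_winner(proposals, crowd_votes):
            """Experts screen for feasibility; the crowd ranks the finalists."""
            finalists = [p for p in proposals if is_feasible(p)]
            return max(finalists, key=lambda p: crowd_votes[p]) if finalists else None

        proposals = ["carbon tax", "building technicians",
                     "Tanzania radio programs", "350 ppm or bust"]
        crowd_votes = {"carbon tax": 812, "building technicians": 945,
                       "Tanzania radio programs": 1130, "350 ppm or bust": 2400}

        print(select_winner(proposals, crowd_votes))
        # -> "Tanzania radio programs"; the popular but infeasible proposal
        #    never reaches the community vote.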

    1:15p
    Herman Eisen, professor emeritus of biology, dies at 96

    Herman Eisen, a professor emeritus of biology and founding member of the MIT Center for Cancer Research, died Nov. 2 at age 96.

    Over a 70-year career, Eisen forged a path as a pioneering immunologist whose research has significantly shaped the field. He joined the MIT faculty in 1973, having been recruited as a founding member of MIT’s Center for Cancer Research (now the Koch Institute for Integrative Cancer Research).

    Eisen retired from MIT in 1989, albeit only in the official sense: As a professor emeritus, he maintained an active laboratory and continued to research, publish, and advise students and postdocs until his passing. In all, Eisen spent 41 years at MIT, during which he taught, mentored, and collaborated with thousands of students, faculty members, and staff.

    Early years

    Born in 1918 in Brooklyn, N.Y., Eisen developed a keen interest in science at an early age, when a high-school chemistry class helped frame his perception of the world as a collection of atoms and molecules. Eisen began premedical studies at New York University in 1934, but halfway through his undergraduate career he needed to leave school — and his position as first baseman on NYU’s baseball team — after developing tuberculosis. Though Eisen’s TB kept him out of school for one year, the illness sparked a curiosity about the immune system that would endure for the rest of his life.

    Eisen returned to NYU to complete his bachelor’s degree, and he then enrolled at the university's medical school. He graduated with an MD in 1943 and then worked as an assistant in the pathology department at the Columbia University College of Physicians and Surgeons before going back to NYU for his residency.

    Eisen had a strong interest in basic science research, particularly in trying to better understand the body’s immune system. Though career options for physician-scientists had historically been limited, the federal government began to increase its funding of biomedical research through the National Institutes of Health (NIH) following World War II. Seizing these new opportunities, Eisen became one of the first recipients of an NIH fellowship, which supported his research on sulfonamide-induced antibodies at NYU. These investigations helped him and colleague Fred Karush to determine the number of antigen-binding sites on antibodies.

    After his two-year NIH fellowship, Eisen worked briefly at the Sloan Kettering Institute before returning once again to NYU as a faculty member. Inspired by the work of his recently deceased role model, physician-scientist Karl Landsteiner, he studied immune reactions of the skin. In doing so, he clarified the basis for certain allergic responses and showed that only those chemicals capable of forming covalent bonds to skin proteins could cause a characteristic itchy rash.

    In 1955, Washington University in St. Louis recruited Eisen to join the faculty of its School of Medicine. There, he served as dermatologist-in-chief for five years before moving to the Department of Microbiology to serve as chair. While at WUSTL, Eisen published groundbreaking research in which he described affinity maturation: the process by which activated B cells produce antibodies with progressively higher affinity for invading pathogens over the course of an infection. This process is fundamental to the development of potent immune responses.

    “Our understanding of affinity maturation begins with Herman’s papers,” says Arup K. Chakraborty, director of MIT’s Institute for Medical Engineering and Science and the Robert T. Haslam Professor in Chemical Engineering, Physics, Chemistry, and Biological Engineering. “Understanding this evolutionary process is critical for vaccine design, and affinity maturation is also mimicked in countless academic laboratories and companies to design antibody-based therapies.” 

    Joining the MIT cancer research community

    In response to the signing of the National Cancer Act of 1971, MIT tasked Nobel Prize-winning biology professor Salvador Luria with establishing and leading a new MIT Center for Cancer Research. Wanting to include cancer immunology as a focus of this new center, Luria approached Eisen about joining as a founding faculty member. Eisen accepted the role, and arrived at MIT in 1973 as a professor in the Department of Biology.

    Eisen brought his immunology expertise to MIT’s new cancer center to study how cancer cells evade the body’s natural immune response. Much of his work focused on studying myeloma tumors in mice and screening their associated proteins. He found that if he used myeloma proteins from one mouse to immunize other mice from the same strain, they were resistant when challenged with cancer cells.

    Eisen and his laboratory went on to study how CD8 T cells develop into cytotoxic, or “killer,” T cells and long-lived memory T cells. Therapeutic vaccines that exploit CD8 responses have not yet been developed for human populations; existing immunotherapies rely on helper T cells and other immune cells, and they do not mount the same aggressive offense against targets. Eisen was working to understand and overcome the barriers to creating effective CD8 vaccines, and his research on the subject was of particular importance to the advancement of cancer immunology. As part of this research, Eisen collaborated closely with Koch Institute faculty members Jianzhu Chen and Richard Young, who is also a member of the Whitehead Institute.

    “Herman's lifelong pursuit of science, even to the very last day of his life, has been an inspiration to many of us,” says Chen, the Ivan R. Cottrell Professor of Immunology. “He was a great human being with a great attitude and a clear mind. He will be missed greatly.”

    As a fixture of MIT’s cancer research community since its formal inception, Eisen could, in recent years, often be found in his second-floor Koch Institute office consulting with students and younger investigators.

    “Herman was a true treasure: an inspiring colleague, a caring mentor, and a wonderful human being,” says Tyler Jacks, director of the Koch Institute and the David H. Koch Professor of Biology. “We all aspire to be Herman Eisen.”

    Other colleagues remember Eisen not only as a groundbreaking immunologist, but also as a hardworking collaborator and a generous man of integrity. He continued to be an active scientist and had been working with Chakraborty on a paper until his passing.

    “Herman was a giant in the field of immunology, with many seminal discoveries,” Chakraborty says. “He was also the kindest and most generous and moral person I have known. Until the end, he was working on scientific problems with junior colleagues and students who benefited from his wisdom. I am lucky to have worked with this great scientist and wonderful human being.”

    “Herman was a wonderful colleague and a terrific person with the highest integrity — a true mensch,” adds Alan Grossman, the Praecis Professor of Biology and head of MIT's biology department. “We will miss his wisdom, thoughtfulness, and vitality.”

    Eisen was elected to the American Academy of Arts and Sciences in 1965, the National Academy of Sciences in 1969, and the Institute of Medicine of the National Academies in 1974. He received numerous other awards and honors throughout his career, including the Behring-Heidelberger Award from the American Association of Immunologists, an Outstanding Investigator Award from the National Cancer Institute, and the Lifetime Service Award from the American Association of Immunologists, of which he served as president from 1968 to 1969.

    Eisen is survived by his wife Natalie; their children, Ellen, Jane, Jim, Tom, and Matthew; and 12 grandchildren. 

    2:00p
    Bacteria become “genomic tape recorders”

    MIT engineers have transformed the genome of the bacterium E. coli into a long-term storage device for memory. They envision that this stable, erasable, and easy-to-retrieve memory will be well suited for applications such as sensors for environmental and medical monitoring.

    “You can store very long-term information,” says Timothy Lu, an associate professor of electrical engineering and computer science and biological engineering. “You could imagine having this system in a bacterium that lives in your gut, or environmental bacteria. You could put this out for days or months, and then come back later and see what happened at a quantitative level.”

    The new strategy, described in the Nov. 13 issue of the journal Science, overcomes several limitations of existing methods for storing memory in bacterial genomes, says Lu, the paper’s senior author. Those methods require a large number of genetic regulatory elements, limiting the amount of information that can be stored.

    The earlier efforts are also limited to digital memory, meaning that they can record only all-or-nothing memories, such as whether a particular event occurred. Lu and graduate student Fahim Farzadfard, the paper’s lead author, set out to create a system for storing analog memory, which can reveal how much exposure there was, or how long it lasted. To achieve that, they designed a “genomic tape recorder” that lets researchers write new information into any bacterial DNA sequence.

    Stable memory

    To program E. coli bacteria to store memory, the MIT researchers engineered the cells to produce a recombinase enzyme, which can insert a specific sequence of single-stranded DNA into a targeted site. However, this DNA is produced only when activated by the presence of a predetermined molecule or another type of input, such as light.

    After the DNA is produced, the recombinase inserts the DNA into the cell’s genome at a preprogrammed site. “We can target it anywhere in the genome, which is why we’re viewing it as a tape recorder, because you can direct where that signal is written,” Lu says.

    Once an exposure is recorded through this process, the memory is stored for the lifetime of the bacterial population and is passed on from generation to generation.

    There are a couple of different ways to retrieve this stored information. If the DNA is inserted into a nonfunctional part of the genome, sequencing the genome will reveal whether the memory is stored in a particular cell. Or, researchers can target the sequences to alter a gene. For example, in this study, the new DNA sequence turned on an antibiotic resistance gene, allowing the researchers to determine how many cells had gotten the memory sequence by adding antibiotics to the cells and observing how many survived.

    By measuring the proportion of cells in the population that have the new DNA sequence, researchers can determine how much exposure there was and how long it lasted. In this paper, the researchers used the system to detect light, a lactose analog called IPTG, and an antibiotic derivative called aTc, but it could be tailored to many other molecules, or even to signals produced by the cell, Lu says.
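    One way to see how a cell population can store an analog value is a toy model in which each cell independently records the signal with a probability that grows with exposure time. The rate constant and kinetics below are invented for illustration and are not measurements from the paper.

        import math, random

        # Hypothetical per-hour recording rate; the real value would be set by
        # the recombinase system's efficiency.
        K = 0.02

        def simulate_population(n_cells, exposure_hours):
            """Fraction of cells that record the signal, assuming each cell
            recombines independently with probability 1 - exp(-K * t)."""
            p = 1 - math.exp(-K * exposure_hours)
            return sum(random.random() < p for _ in range(n_cells)) / n_cells

        def infer_exposure(fraction):
            """Invert the model: estimate exposure duration from the recombined
            fraction (e.g., the fraction surviving antibiotic selection)."""
            return -math.log(1 - fraction) / K

        f = simulate_population(100_000, exposure_hours=24)
        print(f"recorded fraction {f:.3f} -> inferred exposure {infer_exposure(f):.1f} h")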

    The information can also be erased by stimulating the cells to incorporate a different piece of DNA in the same spot. This process is currently not very efficient, but the researchers are working to improve it.

    “This work is very exciting because it integrates many useful capabilities in a single system: long-lasting, analog, distributed genomic storage with a variety of readout options,” says Shawn Douglas, an assistant professor at the University of California at San Francisco who was not involved in the study. “Rather than treating each individual cell as a digital storage device, Farzadfard and Lu treat an entire population of cells as an analog ‘hard drive,’ greatly increasing the total amount of information that can be stored and retrieved.”

    Bacterial sensors

    Environmental applications for this type of sensor include monitoring the ocean for carbon dioxide levels, acidity, or pollutants. In addition, the bacteria could potentially be designed to live in the human digestive tract to monitor someone’s dietary intake, such as how much sugar or fat is being consumed, or to detect inflammation from inflammatory bowel disease.

    These engineered bacteria could also be used as biological computers, Lu says, adding that they would be particularly useful in types of computation that require a lot of parallel processing, such as picking patterns out of an image.

    “Because there are billions and billions of bacteria in a given test tube, and now we can start leveraging more of that population for memory storage and for computing, it might be interesting to do highly parallelized computing. It might be slow, but it could also be energy-efficient,” he says.

    Another possible application is engineering brain cells of living animals or human cells grown in a petri dish to allow researchers to track whether a certain disease marker is expressed or whether a neuron is active at a certain time. “If you could turn the DNA inside a cell into a little memory device on its own and then link that to something you care about, you can write that information and then later extract it,” Lu says.

    The research was funded by the National Institutes of Health, the Office of Naval Research, and the Defense Advanced Research Projects Agency.

    2:00p
    Q&A: Christopher Knittel on the EPA’s greenhouse gas plan

    With cap-and-trade legislation on greenhouse-gas emissions having stalled in Congress in 2010, the Obama administration has taken a different approach to climate policy: It has used the mandate of the Environmental Protection Agency (EPA) to propose a policy limiting power-plant emissions, since electricity generation produces about 40 percent of U.S. greenhouse gases. (The administration also announced a bilateral agreement with China this week, which sets overall emissions-reductions targets.)

    The EPA’s initial proposal is now under public review, before the agency issues a final rule in 2015. Christopher Knittel, the William Barton Rogers Professor of Energy Economics at the MIT Sloan School of Management, is one of 13 economists who co-authored an article about the policy in the journal Science this week. While the plan offers potential benefits, the economists assert, some of its details might limit the policy’s effectiveness. MIT News talked with Knittel about the issue.

    Q. How is the EPA’s policy for power plants intended to work?

    A. The Clean Power Plan calls for different emissions reductions depending on the state. This state-specific formula has four “buckets”: efficiency increases at the power plant; shifting from coal to natural gas; increases in generation from low-carbon renewables such as wind; and increases in energy efficiency within the state. The EPA applied these four levers and asked what changes were “adequately demonstrated” in order to set each state’s required reductions.

    Q. The Science piece emphasizes that the EPA’s plan uses a ratio-based means of limiting emissions: the amount of greenhouse gases emitted divided by the amount of electricity generated. So a state could add renewable energy and lower its ratio without reducing total emissions. What are the advantages and disadvantages of doing this?

    A. The targets are an emissions rate: tons of CO2 [emitted] per megawatt-hour of electricity generation. Then it’s really up to the states to determine how they’re going to achieve the reductions in this rate. So one strategy is to increase total electricity generated. This compliance strategy, unfortunately, is what makes rate-based regulation economically inefficient.
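    A small numeric example (with invented figures, not actual EPA targets) makes the loophole concrete: a state can meet a rate target either by abating emissions or by simply enlarging the denominator with zero-carbon generation, leaving total emissions untouched.

        # Rate-based vs. mass-based compliance, with illustrative numbers.
        emissions = 50_000_000.0     # tons CO2 per year
        generation = 100_000_000.0   # MWh per year
        target_rate = 0.40           # tons CO2 per MWh

        current_rate = emissions / generation                      # 0.50 t/MWh

        # Path 1: hold generation fixed and actually cut emissions to 40 Mt.
        required_emissions = target_rate * generation

        # Path 2: add zero-carbon generation until the ratio hits the target;
        # total emissions stay at 50 Mt.
        extra_generation = emissions / target_rate - generation    # +25 TWh
        diluted_rate = emissions / (generation + extra_generation) # 0.40 t/MWh

        print(current_rate, required_emissions, extra_generation, diluted_rate)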

    The states also have the option to convert that rate-based ratio target into what the EPA is calling a mass-based target, total tons of greenhouse-gas emissions. This would effectively imply the state is going to adopt a cap-and-trade program to reach its requirements.

    In current work, we — scholars Jim Bushnell, Stephen Holland, Jonathan Hughes, and I — are investigating the incentives states have to convert their rate-based mandate into a mass-based mandate. Unfortunately, we are finding that states rarely want to [use a mass-based target], which is a pity, because mass-based regulation is the most efficient regulation, from an economist’s perspective. Holland, Hughes, and I have done work in the transportation sector showing that when you regulate on a rate basis, as opposed to a mass basis, it is at least three times more costly to society — often more than five times more costly.

    Q. Why did the EPA approach it this way?

    A. I can only speculate as to why the EPA chose to define the regulation as a rate instead of as total greenhouse-gas emissions. Regulating a rate is often cheaper from the firm’s perspective, even though it is economically inefficient. Why the EPA chose to define things at the state level is clearer: The Clean Air Act … is written in such a way as to leave it up to the states.

    But if everyone’s doing their own rate- or mass-based standard, then you don’t take advantage of a potentially large efficiency benefit from trading compliance across states. That is, it might be cheaper for one state to increase its reductions, allowing another state to abate less.

    The ideal regulatory model would put everyone under one giant mass-based standard, one big cap-and-trade market. Even if every state runs its own cap-and-trade market, that’s unlikely to lead to the efficient outcome. It might be cheaper for California or Montana or Oregon to reduce their greenhouse-gas emissions further, but as soon as they meet their own standards, they’re going to stop.
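    The efficiency gain from trading can be seen in a toy two-state model. The abatement-cost curves below are hypothetical; the point is only that letting the low-cost state abate more achieves the same total reduction at lower overall cost, which is what a single cap-and-trade market would do automatically.

        # Two states with linear marginal abatement costs (slopes are made up).
        slopes = {"A": 10.0, "B": 40.0}   # $/ton per ton abated
        total_abatement = 100.0           # tons required across both states

        def cost(slope, tons):
            return 0.5 * slope * tons ** 2   # area under a linear MC curve

        # No trading: each state abates half the total on its own.
        separate = cost(slopes["A"], 50.0) + cost(slopes["B"], 50.0)

        # Trading: equalize marginal costs (10 * a_A == 40 * a_B, a_A + a_B = 100).
        a_A = total_abatement * slopes["B"] / (slopes["A"] + slopes["B"])  # 80 tons
        a_B = total_abatement - a_A                                        # 20 tons
        traded = cost(slopes["A"], a_A) + cost(slopes["B"], a_B)

        print(separate, traded)   # 62500.0 vs 40000.0: same abatement, lower cost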

    Q. The Science article says that certifying efficiency-based gains is a crucial factor. Could you explain this?

    A. Given how the regulation treats efficiency, it really brings to the forefront the importance of understanding the real-world reduction in energy consumption coming from efficiency investments. Let’s say I reduce electricity consumption by 100 megawatt-hours through increasing efficiency in buildings. Within the [EPA’s] policy, that reduction is treated as if I’m generating 100 megawatt-hours from a zero-carbon technology. So it increases the denominator in the ratio [of greenhouse gases emitted to electricity generated]. One concern, though, is that often the actual returns from energy-efficiency investments aren’t as large as the predicted returns. And that can be because of rebound [the phenomenon by which better energy efficiency allows people to consume more energy], which is a hot topic now, or other behavioral changes.
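    In miniature, and again with made-up numbers: efficiency savings are credited as zero-carbon megawatt-hours in the denominator, so if predicted savings exceed what the investments actually deliver, the credited rate overstates a state’s progress.

        # Efficiency credits in the EPA ratio (illustrative numbers only).
        emissions = 45_000.0          # tons CO2
        generation = 90_000.0         # MWh actually generated

        predicted_savings = 10_000.0  # MWh the efficiency program claims
        actual_savings = 6_000.0      # MWh delivered after rebound (hypothetical)

        credited_rate = emissions / (generation + predicted_savings)  # 0.450 t/MWh
        true_rate = emissions / (generation + actual_savings)         # ~0.469 t/MWh

        print(credited_rate, true_rate)  # the credited rate looks better than reality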

    Behavioral changes can make those efficiency gains larger or smaller, so getting the right number is very important. I’ve heard stories of people who get all-new windows, and the old windows used to let in air, but now they think the house is stuffy, so they keep their windows cracked. We should be doing more field experiments, more randomized controlled trials, to measure the actual returns to energy efficiency.

    Another related concern is that it might be left up to the states to tell the EPA what the reduction was from these energy-efficiency investments. And the state might not have any incentive at all to measure them correctly. So there has to be an increase in oversight, and it likely has to be federal oversight.

    Q. While you clearly have concerns about the efficacy of the policy, isn’t this one measure among others, intended to lessen the magnitude of the climate crisis?

    A. For many of us, the potential real benefit from the clean power rule is that it will change the dynamic in the [forthcoming international climate] negotiations in Paris. For a long time the U.S. could say it was making some improvements in transportation, but it really wasn’t doing anything about electricity for climate change. My view is that there are a lot of countries out there that aren’t going to do anything unless the U.S. does. This might bring some of those countries on board.

    2:00p
    Pulling together the early solar system

    Infant planetary systems are usually nothing more than swirling disks of gas and dust. Over the course of a few million years, this gas gets sucked into the center of the disk to build a star, while the remaining dust accumulates into larger and larger chunks — the building blocks for terrestrial planets.

    Astronomers have observed this protoplanetary disk evolution throughout our galaxy — a process that our own solar system underwent early in its history. However, the mechanism by which planetary disks evolve at such a rapid rate has eluded scientists for decades.

    Now researchers at MIT, Cambridge University, and elsewhere have provided the first experimental evidence that our solar system’s protoplanetary disk was shaped by an intense magnetic field that drove a massive amount of gas into the sun within just a few million years. The same magnetic field may have propelled dust grains along collision courses, eventually smashing them together to form the initial seeds of terrestrial planets.

    The team analyzed a meteorite known as Semarkona — a space rock that crashed in northern India in 1940, and which is considered one of the most pristine known relics of the early solar system. In their experiments, the researchers painstakingly extracted individual pellets, or chondrules, from a small sample of the meteorite, and measured the magnetic orientation of each grain to determine that, indeed, the meteorite had remained unaltered since its formation in the early solar nebula.

    The researchers then measured the magnetic strength of each grain, and calculated the original magnetic field in which those grains were created. Based on their calculations, the group determined that the early solar system harbored a magnetic field as strong as 5 to 54 microteslas — up to 100,000 times stronger than what exists in interstellar space today. Such a magnetic field would be strong enough to drive gas toward the sun at an extremely fast rate.
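    A quick back-of-envelope check of that comparison, taking roughly half a nanotesla as a typical present-day interstellar field (a textbook figure, not a number from the paper):

        # 54 microteslas vs. ~0.5 nanoteslas in interstellar space today.
        nebular_field = 54e-6        # teslas, upper end of the inferred range
        interstellar_field = 0.5e-9  # teslas, assumed typical present-day value
        print(nebular_field / interstellar_field)   # ~1.1e5, i.e., about 100,000x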

    “Explaining the rapid timescale in which these disks evolve — in only a few million years — has always been a big mystery,” says Roger Fu, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “It turns out that this magnetic field is strong enough to affect the motion of gas at a large scale, in a very significant way.”

    Fu and his colleagues, including Ben Weiss, a professor of planetary sciences at MIT, publish their results today in the journal Science.

    High fidelity

    More than 99 percent of the mass in a primordial planet-forming disk is ionized gas, leaving less than 1 percent as solid particles — the dusty seeds of terrestrial planets. Observations of young stellar systems throughout the galaxy have revealed that such massive amounts of gas are accreted, or absorbed, into the central star within just a few million years. However, theoretical models have been unable to identify a mechanism to explain such a rapid accretion rate.

    “The idea that the disk gets depleted within just 3 million years is fundamental to understanding how planets form,” Fu says. “But theoretically, that’s difficult to do, and people have had to invoke all these intricate mechanisms to make that happen.”

    There are theoretical models that incorporate magnetic fields as a mechanism for disk evolution, but until now, there has been no observational data to support the theories.

    Fu points out that researchers have been searching since the 1960s — “with little success” — for evidence of early magnetic fields in meteorite samples. That’s because, for the most part, the meteorites studied had been altered in some form or other.

    “Most of these meteorites … were heated, or had water coursing through them, so the chances of any one meteorite retaining a recording of the most primordial magnetic field in the nebula were almost zero,” Fu says.

    He and his colleagues chose to analyze the Semarkona meteorite because of its reputation as a pristine sample from the early solar system.

    “This thing has the unusual advantage of being unaltered, but also happens to be a really excellent magnetic recording device,” Weiss says. “When it formed, it formed the right kind of metal. Many things, even though pristine, didn’t form the right magnetic recording properties. So this thing is really high-fidelity.”

    From millimeter- to kilometer-sized planets

    To determine whether the meteorite was indeed unchanged since its formation, the group identified and extracted a handful of millimeter-sized grains, or chondrules, from a small sample of the meteorite, and then measured their individual magnetic orientations.

    As the meteorite likely formed from the accumulation of individual grains that tumbled onto the meteorite parent body during its assembly, their collective magnetic directions should be random if they have not been remagnetized since they were free-floating in space. If, however, the meteorite underwent heating at some point after its assembly, the individual magnetic orientations would have been wiped clean, replaced by a uniform orientation.
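    That logic lends itself to a simple statistic: the length of the mean magnetization vector across grains, which is near zero for random orientations and near one for a uniform overprint. The sketch below simulates both cases; it is an illustrative check, not the paper’s actual paleomagnetic analysis.

        import math, random

        def random_unit_vector():
            """Uniformly random direction on the sphere."""
            z = random.uniform(-1.0, 1.0)
            t = random.uniform(0.0, 2.0 * math.pi)
            r = math.sqrt(1.0 - z * z)
            return (r * math.cos(t), r * math.sin(t), z)

        def mean_resultant_length(vectors):
            """Length of the average vector: near 0 if random, 1 if aligned."""
            n = len(vectors)
            sx, sy, sz = (sum(v[i] for v in vectors) for i in range(3))
            return math.sqrt(sx**2 + sy**2 + sz**2) / n

        pristine = [random_unit_vector() for _ in range(8)]  # unaltered grains
        reheated = [(0.0, 0.0, 1.0)] * 8                     # uniform overprint

        print(mean_resultant_length(pristine))  # well below 1 -> random directions
        print(mean_resultant_length(reheated))  # exactly 1.0 -> remagnetized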

    The researchers found that each grain they analyzed bore a unique magnetic orientation — proof that the meteorite was indeed pristine.

    “There’s no other alternative but to say this recording is coming from an original nebular field,” Fu says.

    The group then calculated the strength of the original magnetic field, based on the magnetic strength of each chondrule. Their result could support one of two theories of early planetary disk formation: magnetorotational instability, the theory that a turbulent configuration of magnetic fields drove gas toward the sun, or magnetocentrifugal wind, the idea that gas accreted onto the sun via a more orderly, hourglass-shaped pattern of magnetic fields.

    The group’s data also supports two theories of very early planet formation, while ruling out a third.

    “A persistent challenge for understanding how planets form is how to go from micron-sized dust to kilometer-sized planets in only a few million years,” Fu says. “How chondrules formed was probably instrumental to how planets formed.”

    Now, based on the group’s results, Fu says it’s likely that chondrules formed either as molten droplets resulting from the collisions of 10- to 1,000-kilometer rocky bodies, or through the spontaneous compression of surrounding gas, which melted dust particles together.

    It’s unlikely that chondrules formed via electric currents, or X-wind — flash-heating events that occur close to the sun. According to theoretical models, such events can only take place within magnetic fields stronger than 100 microteslas — far greater than what Fu and his colleagues measured.

    “Until now, we were missing data,” Fu says. “Now there is a data point. And to understand fully the implications of what 50 microteslas can do in a gas, there’s a lot more theoretical work to be done.”

    Jerome Gattacceca, research director at the European Centre for Research and Education in Environmental Geosciences, says the solar system would have looked very different today if it had not been exposed to magnetic fields.

    “Without this kind of mechanism, all the matter in the solar system would have ended up in the sun, and we would not be here to discuss it,” says Gattacceca, who was not involved in the research. “There has to be a mechanism to prevent that. Several models exist, and this paper provides a viable mechanism, based on the existence of a significant magnetic field, to form the solar system as we know it.”

    This work was funded in part by NASA and the National Science Foundation.

