MIT Research News' Journal
Monday, November 13th, 2017
9:40a | MIT to construct new, cutting-edge Wright Brothers Wind Tunnel
MIT has announced it will replace its venerable 79-year-old Wright Brothers Wind Tunnel with a new facility that will be the largest and most advanced academic wind tunnel in the United States.
To facilitate construction of the new tunnel and ongoing operations, Boeing has pledged funding to become the lead donor of the $18 million project. Boeing’s gift reflects a century-long relationship between the company and MIT that helped ignite the global aerospace industry, and it confirms a commitment to research and development that will fuel future innovation.
Like its predecessor, the new tunnel will be operated by the MIT Department of Aeronautics and Astronautics, and it will retain the Wright Brothers Wind Tunnel name.
The new tunnel will:
- permit increased test speeds, from the current 150 miles per hour to 200 miles per hour;
- greatly improve research data acquisition;
- halve the power requirements of the original 2,000 horsepower fan motor;
- increase test section volume from 850 cubic feet to 1,600 cubic feet, and test section area from 57 square feet to 80 square feet;
- improve ability to test autonomous vehicles (“drones”) and aerodynamic components including wings, bodies, and wind turbines; and
- enable new MIT classes in advanced aerodynamics and fluid mechanics.
Greg Hyslop, Boeing’s chief technology officer and senior vice president of Engineering, Test and Technology, says, “Few relationships in aerospace can compare to the ties between MIT and Boeing. We’re thrilled and gratified to be part of this critically important renovation that will launch our relationship into the second century of aerospace.”
Hyslop noted that a number of Boeing’s founding leaders studied at MIT, including Donald Douglas Sr., James S. McDonnell, and Wong Tsu, the first engineer hired by company founder Bill Boeing. Currently, Boeing employs more than 800 MIT alumni around the world. More than 50 Boeing executives, as well as more than 60 members of the Boeing Technical Fellowship, hold MIT degrees. In addition, Boeing hires on the order of 25 MIT students as interns each year.
“We’ve worked with the great people and facilities at MIT over the decades, and with this gift, we will continue in the years to come,” Hyslop says.
The current tunnel was dedicated in September 1938. From its early days during World War II, when technicians worked around the clock designing military aircraft, testing has branched out to include ground antenna configurations, aircraft and ground structure aeroelasticity, ski gear, space suits, bicycles, motorcycles, subway station entrances, ship sails, wind turbines, solar cars, and, most recently, a design for a clean, quiet, super-efficient commercial aircraft.
Now at the end of its eighth decade, the tunnel is showing its age. Its drive system is inefficient, and all aspects of the structure and the adjacent controls building are in need of renovation and modernization. Because AeroAstro and other MIT departments, labs, and centers use the Wright Brothers Wind Tunnel as a teaching and research tool for classes and projects, Boeing's support of the renovation is vital to enabling MIT to use the tunnel to its full potential.
“The new Wright Brothers Wind Tunnel will present MIT with a state-of-the-art research and teaching tool for many years to come,” says AeroAstro department head Jaime Peraire. “We greatly appreciate Boeing’s generosity and commitment to future generations of aerospace engineers and their research.”
The new tunnel will be constructed on the site of the current one, which will be dismantled. The MIT Museum has indicated an interest in preserving artifacts from the 1938 tunnel when they become available. Renovations will be made to MIT Building 17, which houses the control facilities, and a direct connection will be made to the AeroAstro workshops. The project is expected to be completed in 2020.
In acknowledgement of the Wright Brothers Wind Tunnel’s storied history, AeroAstro and the MIT Employees’ Activity Committee will sponsor a tunnel open house on Thursday, November 16, from noon until 1:30 p.m. All members of the MIT community are invited to visit and step inside the tunnel (Building 17), which will be running (at low speed) throughout the event.
10:59a | Synthetic circuits can harvest light energy
By organizing pigments on a DNA scaffold, an MIT-led team of researchers has designed a light-harvesting material that closely mimics naturally occurring photosynthetic structures.
The researchers showed that their synthetic material can absorb light and efficiently transfer its energy along precisely controlled pathways. This type of structure could be incorporated into materials such as glass or textiles, enabling them to harvest or otherwise control incoming energy from sunlight, says Mark Bathe, an associate professor of biological engineering at MIT.
“This is the first demonstration of a purely synthetic mimic of a natural light-harvesting circuit that consists of densely packed clusters of dyes that are precisely organized spatially at the nanometer scale, as found in bacterial systems,” Bathe says. One nanometer is one billionth of a meter, or roughly 1/100,000 the thickness of a human hair.
Bathe is one of the senior authors of the new study, along with Alan Aspuru-Guzik, a professor of chemistry and chemical biology at Harvard University, and Hao Yan, a professor of chemistry and biochemistry at Arizona State University. Lead authors of the paper, which appears in the Nov. 13 issue of Nature Materials, are former MIT postdoc Etienne Boulais, Harvard graduate student Nicolas Sawaya, and MIT postdoc Rémi Veneziano.
Capturing light
Over billions of years, plants and photosynthetic bacteria have evolved efficient cellular structures for harvesting energy from the sun. This process requires capturing photons (packets of light energy) and converting them into excitons — a special type of quasiparticle that can carry energy. Energy from these excitons is then passed to other molecules at a complex of protein and pigments known as a reaction center, and eventually used by the plant to build sugar molecules.
While scientists have developed reliable techniques for carrying electrons (such as semiconductors) and photons (fiber optics), coming up with ways to control excitons has proven more challenging.
Four years ago, Bathe, Aspuru-Guzik, and Yan began working on synthetic structures that could mimic natural light-harvesting assemblies. These assemblies, usually found in cell organelles called chloroplasts, have an intricate structure that efficiently captures and transports solar energy at the scale of nanometers.
“What’s really amazing about photosynthetic light-harvesting is how well it meets the organism’s needs,” says Gabriela Schlau-Cohen, an MIT assistant professor of chemistry who is also an author of the paper. “When it is required, every absorbed photon can migrate through the network of proteins that surrounds the reaction center, to generate electricity.”
The researchers set out to mimic these structures by attaching light-harvesting pigments to scaffolds made of DNA. Over the past several years, Bathe’s lab has devised new ways to program DNA to fold into particular shapes, and last year Bathe and his colleagues created a new computer-programming tool that automates the process of designing DNA scaffolds of nearly any shape.
For this study, the researchers wanted to use DNA scaffolds to spatially organize densely packed clusters of pigments similar to those found in nature. Boulais found a 1977 paper that showed that a synthetic pigment called pseudoisocyanine (PIC) aggregates onto specific sequences of naturally occurring DNA to form the type of structure the researchers were seeking, called a J-aggregate. However, because this approach used naturally occurring DNA, there was no way to control the spacing, size, or 3-D spatial organization of the clusters.
Veneziano tested the researchers’ ability to template these J-aggregates into discrete clusters with distinct 2-D organizations using synthetic DNA, and Boulais and Sawaya worked to computationally design customizable, synthetic DNA scaffolds that organize these aggregates into circuits that absorb photons and transport the resulting excitons along a predictable path. By programming specific DNA sequences, the researchers can control the precise location and density of the clusters of dye molecules, which sit on a rigid, double-stranded DNA scaffold. They computationally modeled how factors such as the number of dye molecules, their orientation, and the distances between them would affect the efficiency of the resulting circuits, analyzing many versions of the circuits for their efficiency of energy transfer.
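The study's quantitative models are not reproduced here, but the sensitivity to dye spacing that this paragraph describes can be illustrated with a generic Förster-type rate law, in which energy transfer falls off as the sixth power of the donor-acceptor distance. This is a minimal sketch under assumed, illustrative parameters (Förster radius, donor lifetime); it is not the exciton model the authors used for their densely coupled J-aggregates.

```python
import numpy as np

def forster_rate(r_nm, r0_nm=5.0, tau_donor_ns=4.0):
    """Generic Forster transfer rate (1/ns) at donor-acceptor distance r_nm.

    r0_nm (Forster radius) and tau_donor_ns (donor lifetime) are
    illustrative placeholders, not values from the paper.
    """
    return (1.0 / tau_donor_ns) * (r0_nm / r_nm) ** 6

def transfer_efficiency(r_nm, r0_nm=5.0):
    """Fraction of excitations handed to the acceptor rather than lost."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# Halving the spacing between dyes changes the transfer efficiency dramatically:
for r in (2.0, 4.0, 8.0):  # donor-acceptor spacings in nanometers
    print(f"r = {r:4.1f} nm  rate = {forster_rate(r):8.2f} /ns  "
          f"efficiency = {transfer_efficiency(r):.3f}")
```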
“Photosynthetic organisms organize their light-harvesting molecules precisely using a protein scaffold. Up to now, this kind of structural control has been difficult to realize in synthetic systems. It looks like DNA origami provides a means of mimicking many of the principles of photosynthetic light-harvesting complexes,” says Gregory Scholes, a professor of chemistry at Princeton University who was not involved in the study.
Part of the ASU team, led by co-author Su Lin, performed a series of spectroscopic measurements to demonstrate that the designed DNA structures produced the desired J-aggregates, and to characterize their photophysical properties. Schlau-Cohen, who uses advanced spectroscopy techniques to analyze light-harvesting systems, both natural and synthetic, showed that these dense pigment assemblies were able to efficiently absorb light energy and transport it along specific pathways.
“We demonstrated the ability to control the traffic patterns using J-aggregated dyes, not just how far the excitons can travel. That’s important because it offers versatility in designing such circuits for functional materials,” Bathe says.
“Bottom-up design of excitonics systems has been a focused goal of our Energy Frontiers Research Center (EFRC). I am glad to see an important stepping stone toward demonstrating bottom-up control of exciton flow,” Aspuru-Guzik says. He adds that “multidisciplinary research that tightly couples synthesis, theory, and characterization was required to get to this point.”
New materials
The researchers believe that these synthetic structures could be integrated into 2-D and 3-D materials such as glass or textiles, giving those materials the ability to absorb sunlight and convert it into other forms of energy such as electricity, or to otherwise store or harness the energy. The structures might also form a new basis for quantum computers, implemented at the nanoscale, using excitonic circuits as quantum logic gates.
The researchers now plan to explore ways to make these synthetic light-harvesting systems even better, including looking for more efficient pigments, which may lie in the recently announced Max Weaver Dye Library at North Carolina State University, which houses 98,000 unique dyes.
“There are still a lot of ways that we can imagine improving this,” Schlau-Cohen says. “We have the ability to control individual molecular parameters to explore the basic science questions of how can we transport energy efficiently in a disordered material.” Schlau-Cohen is also the senior author of a companion publication that will be published in the Journal of Physical Chemistry Letters next week.
Other authors of the Nature Materials paper are MIT postdocs James Banal and Toru Kondo, who led the Journal of Physical Chemistry Letters paper; former ASU postdoc Alessio Andreoni; ASU postdoc Sarthak Mandal; ASU Senior Research Professor Su Lin; and ASU Professor Neal Woodbury.
The research was funded by the U.S. Department of Defense’s Multidisciplinary University Research Initiative, the U.S. Department of Energy through MIT’s Center for Excitonics, the Office of Naval Research, a Smith Family Graduate Science and Engineering Fellowship, and the Natural Sciences and Engineering Research Council of Canada.
10:59a | CRISPR-carrying nanoparticles edit the genome
In a new study, MIT researchers have developed nanoparticles that can deliver the CRISPR genome-editing system and specifically modify genes in mice. The team used nanoparticles to carry the CRISPR components, eliminating the need to use viruses for delivery.
Using the new delivery technique, the researchers were able to cut out certain genes in about 80 percent of liver cells, the best success rate ever achieved with CRISPR in adult animals.
“What’s really exciting here is that we’ve shown you can make a nanoparticle that can be used to permanently and specifically edit the DNA in the liver of an adult animal,” says Daniel Anderson, an associate professor in MIT’s Department of Chemical Engineering and a member of MIT’s Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science (IMES).
One of the genes targeted in this study, known as Pcsk9, regulates cholesterol levels. Mutations in the human version of the gene are associated with a rare disorder called dominant familial hypercholesterolemia, and the FDA recently approved two antibody drugs that inhibit Pcsk9. However, these antibodies need to be taken regularly, and for the rest of the patient’s life, to provide therapy. The new nanoparticles permanently edit the gene following a single treatment, and the technique also offers promise for treating other liver disorders, according to the MIT team.
Anderson is the senior author of the study, which appears in the Nov. 13 issue of Nature Biotechnology. The paper’s lead author is Koch Institute research scientist Hao Yin. Other authors include David H. Koch Institute Professor Robert Langer of MIT, professors Victor Koteliansky and Timofei Zatsepin of the Skolkovo Institute of Science and Technology, and Professor Wen Xue of the University of Massachusetts Medical School.
Targeting disease
Many scientists are trying to develop safe and efficient ways to deliver the components needed for CRISPR, which consists of a DNA-cutting enzyme called Cas9 and a short RNA that guides the enzyme to a specific area of the genome, directing Cas9 where to make its cut.
In most cases, researchers rely on viruses to carry the gene for Cas9, as well as the RNA guide strand. In 2014, Anderson, Yin, and their colleagues developed a nonviral delivery system in the first-ever demonstration of curing a disease (the liver disorder tyrosinemia) with CRISPR in an adult animal. However, this type of delivery requires a high-pressure injection, a method that can also cause some damage to the liver.
Later, the researchers showed they could deliver the components without the high-pressure injection by packaging messenger RNA (mRNA) encoding Cas9 into a nanoparticle instead of a virus. Using this approach, in which the guide RNA was still delivered by a virus, the researchers were able to edit the target gene in about 6 percent of hepatocytes, which is enough to treat tyrosinemia.
While that delivery technique holds promise, in some situations it would be better to have a completely nonviral delivery system, Anderson says. One consideration is that once a particular virus is used, the patient will develop antibodies to it, so it couldn’t be used again. Also, some patients have pre-existing antibodies to the viruses being tested as CRISPR delivery vehicles.
In the new Nature Biotechnology paper, the researchers came up with a system that delivers both Cas9 and the RNA guide using nanoparticles, with no need for viruses. To deliver the guide RNAs, they first had to chemically modify the RNA to protect it from enzymes in the body that would normally break it down before it could reach its destination.
The researchers analyzed the structure of the complex formed by Cas9 and the RNA guide, or sgRNA, to figure out which sections of the guide RNA strand could be chemically modified without interfering with the binding of the two molecules. Based on this analysis, they created and tested many possible combinations of modifications.
“We used the structure of the Cas9 and sgRNA complex as a guide and did tests to figure out we can modify as much as 70 percent of the guide RNA,” Yin says. “We could heavily modify it and not affect the binding of sgRNA and Cas9, and this enhanced modification really enhances activity.”
Reprogramming the liver
The researchers packaged these modified RNA guides (which they call enhanced sgRNA) into lipid nanoparticles, which they had previously used to deliver other types of RNA to the liver, and injected them into mice along with nanoparticles containing mRNA that encodes Cas9.
They experimented with knocking out a few different genes expressed by hepatocytes, but focused most of their attention on the cholesterol-regulating Pcsk9 gene. The researchers were able to eliminate this gene in more than 80 percent of liver cells, and the Pcsk9 protein was undetectable in these mice. They also found a 35 percent drop in the total cholesterol levels of the treated mice.
The researchers are now working on identifying other liver diseases that might benefit from this approach, and advancing these approaches toward use in patients.
“I think having a fully synthetic nanoparticle that can specifically turn genes off could be a powerful tool not just for Pcsk9 but for other diseases as well,” Anderson says. “The liver is a really important organ and also is a source of disease for many people. If you can reprogram the DNA of your liver while you’re still using it, we think there are many diseases that could be addressed.”
“We are very excited to see this new application of nanotechnology open new avenues for gene editing,” Langer adds.
The research was funded by the National Institutes of Health (NIH), the Russian Scientific Fund, the Skoltech Center, and the Koch Institute Support (core) Grant from the National Cancer Institute.
3:00p | Texas’ odds of Harvey-scale rainfall to increase by end of century
As the city of Houston continues to recover and rebuild following the historic flooding unleashed by Hurricane Harvey, the region will also have to prepare for a future in which storms of Harvey’s magnitude are more likely to occur.
A new MIT study, published online this week in the Proceedings of the National Academy of Sciences, reports that as climate change progresses, the city of Houston, and Texas in general, will face an increasing risk of devastating, Harvey-scale rainfall.
According to the study, the state of Texas had a 1 percent chance of experiencing rainfall of Harvey’s magnitude for any given year between 1981 and 2000. By the end of this century, the annual probability of Hurricane Harvey’s record rainfall returning to Texas will rise to 18 percent, if the growth of greenhouse gas emissions to the atmosphere continues unmitigated.
If the risk for such an event during this century increased in a steady, linear fashion, it would mean that there was a 6 percent chance of having Harvey’s magnitude of rainfall in Texas this year.
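As a rough check on that figure, here is a minimal sketch of the linear interpolation described above. Anchoring the 1 percent and 18 percent values at the midpoints of the 1981-2000 and 2081-2100 periods is an assumption made here for illustration, not a detail taken from the paper.

```python
def annual_probability(year, p_start=0.01, p_end=0.18,
                       year_start=1990.5, year_end=2090.5):
    """Linearly interpolate the annual probability of Harvey-scale rainfall in Texas.

    p_start and p_end come from the study (1% for 1981-2000, 18% for 2081-2100);
    anchoring them at the period midpoints is an assumption made here.
    """
    frac = (year - year_start) / (year_end - year_start)
    return p_start + frac * (p_end - p_start)

# Prints roughly 5.5%, in line with the article's figure of about 6 percent
# given the assumed anchor years.
print(f"2017: {annual_probability(2017):.1%}")
```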
“You’re rolling the dice every year,” says study author Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science and co-director of the Lorenz Center at MIT. “And we believe the odds of a flood like Harvey are changing.”
When the past isn’t a guide
In the wake of a large disaster, Emanuel says it is natural, and in some cases essential, to ask whether and how soon such an event will occur again.
“Suppose you’re the mayor of Houston, and you’ve just had a terrible disaster that cost you an unbelievable fortune, and you’re going to try over the next few years to put things back in order in your city,” Emanuel says. “Should you be putting in a more advanced storm-sewer system that may cost billions of dollars, or not? The answer to that question depends upon whether you think Harvey was a one-off — very unlikely to happen any time in the next 100 years — or whether it may be more common than you thought.”
Looking at historical records of extreme rainfall will not provide much insight into the future, Emanuel says. That’s because past measurements have been spotty and difficult to extrapolate across larger regions, and the period over which rainfall data have been recorded is relatively short. What’s more, climate change is shifting the odds in terms of the frequency of high-intensity storms around the world.
“If the underlying statistics are changing, the past may not be a good guide to the future,” Emanuel notes in the paper.
Instead, scientists are turning to climate models to try to forecast the future of storms like Harvey. But there, challenges also arise, as models that simulate changing climate at a global scale do so at relatively coarse resolution, on the order of hundreds of kilometers, while hurricanes require resolutions of a few kilometers.
“[Climate models] do simulate slushy hurricane-like storms, but they’re very poorly resolved,” Emanuel says. “We don’t have the computational firepower to resolve storms like hurricanes in today’s climate models.”
Hurricanes, embedded
Emanuel and his colleagues had previously devised a technique to simulate hurricane development in a changing climate, using a specialized computational model they developed that simulates hurricanes at high spatial resolutions. The model is designed so that they can embed it within coarser global climate models — a combination that results in precise simulations of hurricanes in the context of a globally changing climate.
Emanuel used the team’s technique to model past and future hurricane activity for both the city of Houston and the state of Texas. To do so, he first embedded the hurricane model in three gridded climate analyses — simulations of global climate, based on actual data from the past — to simulate hurricane activity near Houston between 1980 and 2016.
He randomly seeded each climate model with hundreds of thousands of “proto-hurricanes,” or early-stage storms, the majority of which naturally peter out and don’t grow to become full-fledged hurricanes. Of the remaining storms, he focused on the 3,700 storms that passed within 300 kilometers of Houston between 1980 and 2016. He then noted the frequency of storms that produced 500 millimeters of rainfall or more — the amount of rain that was initially estimated immediately following Hurricane Harvey.
During this historical period, he calculated that the probability of a Harvey-like storm producing at least 500 millimeters of rain in Houston was around once in 2,000 years. Such an event, he writes, was “‘biblical’ in the sense that it likely occurred around once since the Old Testament was written.”
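To unpack the "biblical" framing, an annual probability of 1 in 2,000 implies roughly one expected occurrence over a few thousand years. A small sketch, assuming independent years and an illustrative 2,500-year window (the window length is an assumption, not a figure from the paper):

```python
p = 1 / 2000      # annual probability of a 500+ mm Houston event, historical climate
years = 2500      # illustrative span back to roughly when the Old Testament was written

expected_events = p * years                  # about 1.25 expected occurrences
prob_at_least_one = 1 - (1 - p) ** years     # about a 71% chance of at least one

print(f"expected events: {expected_events:.2f}")
print(f"P(at least one): {prob_at_least_one:.0%}")
```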
Stormy odds
To get a sense for how this probability, or risk of such a storm, will change in the future, he performed the same analysis, this time embedding the hurricane model within six global climate models, and running each model from the years 2081 to 2100, under a future scenario in which the world’s climate changes as a result of unmitigated growth of greenhouse gas emissions.
While Houston’s yearly risk of experiencing a 500-millimeter rainfall event was around 1 in 2,000 at the end of the last century, Emanuel found the city’s annual odds will increase significantly, to one in 100 by the end of this century.
When he performed the same set of analyses for Texas as a whole, he found that, at the end of the 20th century, the state faced a 1 percent risk each year of experiencing a Harvey-scale storm. By the end of this century, that annual risk will increase to 18 percent. If this increase happens linearly, he calculates that this year, the state’s odds were at about 6 percent — a sixfold increase since the late 20th century.
“When you take a very, very rare, extreme rainfall event like Hurricane Harvey, and you shift the distribution of rain toward heavier amounts because of climate change, you get really big changes in the probability of those rare events,” Emanuel says. “People have to understand that damage is usually caused by extreme events.”
Emanuel hopes that the study’s results will help city planners and government officials to decide where and how to rebuild and fortify infrastructure, as well as whether to update building codes to stand up to stronger storms and more damaging floods.
“We’re seeing for Texas an event whose annual probability was 1 percent at the end of the last century, and it might be 18 percent by the end of this century,” Emanuel says. “That’s a huge increase in the probability of that event. So people had better plan for that.”
This research was supported, in part, by the National Science Foundation.
3:30p | Next-generation optogenetic molecules control single neurons
Researchers at MIT and Paris Descartes University have developed a new optogenetic technique that sculpts light to target individual cells bearing engineered light-sensitive molecules, so that individual neurons can be precisely stimulated.
Until now, it has been challenging to use optogenetics to target single cells with such precise control over both the timing and location of the activation. This new advance paves the way for studies of how individual cells, and connections among those cells, generate specific behaviors such as initiating a movement or learning a new skill.
“Ideally what you would like to do is play the brain like a piano. You would want to control neurons independently, rather than having them all march in lockstep the way traditional optogenetics works, but which normally the brain doesn’t do,” says Ed Boyden, an associate professor of brain and cognitive sciences and biological engineering at MIT, and a member of MIT’s Media Lab and McGovern Institute for Brain Research.
The new technique relies on a new type of light-sensitive protein that can be embedded in neuron cell bodies, combined with holographic light-shaping that can focus light on a single cell.
Boyden and Valentina Emiliani, a research director at France’s National Center for Scientific Research (CNRS) and director of the Neurophotonics Laboratory at Paris Descartes University, are the senior authors of the study, which appears in the Nov. 13 issue of Nature Neuroscience. The lead authors are MIT postdoc Or Shemesh and CNRS postdocs Dimitrii Tanese and Valeria Zampini.
Precise control
More than 10 years ago, Boyden and his collaborators first pioneered the use of light-sensitive proteins known as microbial opsins to manipulate neuron electrical activity. These opsins can be embedded into the membranes of neurons, and when they are exposed to certain wavelengths of light, they silence or stimulate the cells.
Over the past decade, scientists have used this technique to study how populations of neurons behave during brain tasks such as memory recall or habit formation. Traditionally, many cells are targeted simultaneously because the light shining into the brain strikes a relatively large area. However, as Boyden points out, neurons may have different functions even when they are near each other.
“Two adjacent cells can have completely different neural codes. They can do completely different things, respond to different stimuli, and play different activity patterns during different tasks,” he says.
To achieve independent control of single cells, the researchers combined two new advances: a localized, more powerful opsin and an optimized holographic light-shaping microscope.
For the opsin, the researchers used a protein called CoChR, which the Boyden lab discovered in 2014. They chose this molecule because it generates a very strong electric current in response to light (about 10 times stronger than that produced by channelrhodopsin-2, the first protein used for optogenetics).
They fused CoChR to a small protein that directs the opsin into the cell bodies of neurons and away from axons and dendrites, which extend from the neuron body. This helps to prevent crosstalk between neurons, since light that activates one neuron can also strike axons and dendrites of other neurons that intertwine with the target neuron.
Boyden then worked with Emiliani to combine this approach with a light-stimulation technique that she had previously developed, known as two-photon computer-generated holography (CGH). This can be used to create three-dimensional sculptures of light that envelop a target cell.
Traditional holography is based on reproducing, with light, the shape of a specific object in the absence of that original object. This is achieved by creating an “interferogram” that contains the information needed to reconstruct an object that was previously illuminated by a reference beam. In computer-generated holography, the interferogram is calculated by a computer without the need for any original object. Years ago, Emiliani’s research group demonstrated that, combined with two-photon excitation, CGH can be used to refocus laser light to precisely illuminate a cell or a defined group of cells in the brain.
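The team's two-photon CGH system is considerably more sophisticated, but the core idea of computing a phase pattern that produces a desired intensity distribution can be illustrated with the classic Gerchberg-Saxton iteration. This is a generic sketch, not the authors' implementation; the grid size and the target spot are arbitrary.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, n_iter=50, seed=0):
    """Iteratively retrieve a phase mask whose far field matches target_amplitude."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)  # random initial phase
    source_amplitude = np.ones_like(target_amplitude)          # uniform illumination
    for _ in range(n_iter):
        field_slm = source_amplitude * np.exp(1j * phase)            # field at the modulator
        field_focal = np.fft.fftshift(np.fft.fft2(field_slm))        # propagate to focal plane
        field_focal = target_amplitude * np.exp(1j * np.angle(field_focal))  # impose target
        field_back = np.fft.ifft2(np.fft.ifftshift(field_focal))     # propagate back
        phase = np.angle(field_back)                                 # keep only the phase
    return phase

# Example: the target is a single bright spot (one "cell body") offset from center.
target = np.zeros((128, 128))
target[70:76, 80:86] = 1.0
hologram_phase = gerchberg_saxton(target)
```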
In the new study, by combining this approach with new opsins that cluster in the cell body, the researchers showed they could stimulate individual neurons with not only precise spatial control but also great control over the timing of the stimulation. When they target a specific neuron, it responds consistently every time, with variability that is less than one millisecond, even when the cell is stimulated many times in a row.
“For the first time ever, we can bring the precision of single-cell control toward the natural timescales of neural computation,” Boyden says.
Mapping connections
Using this technique, the researchers were able to stimulate single neurons in brain slices and then measure the responses from cells that are connected to that cell. This paves the way for possible diagramming of the connections of the brain, and analyzing how those connections change in real time as the brain performs a task or learns a new skill.
One possible experiment, Boyden says, would be to stimulate neurons connected to each other to try to figure out if one is controlling the others or if they are all receiving input from a far-off controller.
“It’s an open question,” he says. “Is a given function being driven from afar, or is there a local circuit that governs the dynamics and spells out the exact chain of command within a circuit? If you can catch that chain of command in action and then use this technology to prove that that’s actually a causal link of events, that could help you explain how a sensation, or movement, or decision occurs.”
As a step toward that type of study, the researchers now plan to extend this approach into living animals. They are also working on improving their targeting molecules and developing high-current opsins that can silence neuron activity.
The research was funded by the National Institutes of Health, France’s National Research Agency, the Simons Foundation for the Social Brain, the Human Frontiers Science Program, John Doerr, the Open Philanthropy Project, the Howard Hughes Medical Institute, and the Defense Advanced Research Projects Agency.
5:40p | MIT researchers release evaluation of solar pumps for irrigation and salt mining in India
In 2014, the government of India set an ambitious goal of replacing 26 million groundwater pumps run on costly diesel with more efficient and environmentally friendly options such as solar pumps.
Groundwater pumps are a critical technology in India, especially for small-scale farmers who depend on them to irrigate crops during dry seasons. Given the lack of reliable electrical grid connections and the high price and variable supply of diesel fuel, solar-powered pumps have great potential to meet farmers’ needs while reducing costs and better preserving natural resources.
MIT researchers have just released a new report evaluating a range of solar pump technologies and business models available in India for irrigation and salt mining to better understand which technologies can best fit farmers’ needs.
The report, “Solar Water Pumps: Technical, Systems, and Business Model Approaches to Evaluation,” details the study design and findings of the latest experimental evaluation implemented by the Comprehensive Initiative on Technology Evaluation (CITE), a program supported by the U.S. Agency for International Development (USAID) and led by a multidisciplinary team of faculty, staff, and students at MIT.
Launched at MIT in 2012, CITE is a pioneering program dedicated to developing methods for product evaluation in global development. CITE researchers evaluate products from three perspectives: suitability (how well a product performs its purpose), scalability (how well the product’s supply chain effectively reaches consumers), and sustainability (how well the product is used correctly, consistently, and continuously by users over time).
Designing the study to fill information gaps in the market
Despite the tremendous potential for solar pumps to fill this technological need, consumers face a wide range of available products and little independent information about which ones work best for their needs.
“There’s a lot of potential for these technologies to make a difference, but there is a large variance in the cost and performance of these pumps, and a lot of confusion in finding the right-sized pump for your application,” says Jennifer Green, CITE sustainability research lead and MIT Sociotechnical Systems Research Center research scientist. “In many areas, the only people to turn to for information are the people selling the pumps, so an independent evaluation of the pumps working with our partners provides a third-party, non-biased information alternative.”
To conduct the evaluation, MIT researchers worked closely with the Technology Exchange Lab in Cambridge, Massachusetts, as well as the Gujarat, India-based Self Employed Women’s Association, a trade union that organizes women in India’s informal economy toward full employment and is currently piloting use of solar pumps in their programs.
Researchers tested the technical performance of small solar pump systems in the workshop at MIT D-Lab, and tested larger solar pump systems in communities in India where they were in active use. This allowed for more rigorous, controlled lab testing as well as a more real-life, grounded look at how systems operated in the environment in which they would be deployed. Researchers also used a complex systems modeling technique to examine how the pumps impacted the social, economic, and environmental conditions around them, and how different government policies might impact these conditions at a macro level.
“That was very important because although these are ‘clean pumps’ from the perspective of using solar, there is a concern that there is not a cost incentive to pump less and use less water,” Green says. “When people are using diesel, they pay by the liter, so they use as little as possible. With solar, once people make the capital investment to purchase the equipment, they’re incentivized to pump as much as possible to get a good return on investment and have potential to do serious harm to the groundwater supply.”
Identifying the most appropriate, accessible technologies
In the lab, MIT researchers procured and tested five pumps — the Falcon FCM 115, the Harbor Freight, the Kirloskar SKDS116++, the Rotomag MBP30, and the Shakti SMP1200-20-30. Lab tests on flow rate, priming ease, and overall efficiency demonstrated that two of the lower-cost pumps — the Falcon and the Rotomag — performed the best, and the most expensive pump — the Shakti — performed poorly. MIT researchers also studied pump usage, installing remote sensors in panels and pumps being used in Gujarat, India, to ensure that the pumps were being used consistently over the course of a day, and operating properly.
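For readers unfamiliar with the metric, "overall efficiency" for a solar water pump is typically the ratio of hydraulic power delivered to electrical power drawn (wire-to-water efficiency). A minimal sketch with placeholder numbers, not measurements from the CITE lab tests:

```python
RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

def pump_efficiency(flow_lpm, head_m, electrical_power_w):
    """Overall (wire-to-water) efficiency: hydraulic power out / electrical power in."""
    flow_m3s = flow_lpm / 1000.0 / 60.0              # liters per minute -> m^3/s
    hydraulic_power_w = RHO_WATER * G * flow_m3s * head_m
    return hydraulic_power_w / electrical_power_w

# Placeholder example: 40 L/min lifted 10 m using 200 W of PV power.
print(f"efficiency = {pump_efficiency(40, 10, 200):.1%}")
```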
Because solar pumps are often too expensive for small-scale farmers, CITE also conducted a business case analysis to understand what financing mechanisms might make solar pump technology more affordable for these critical end users. For example, researchers looked at government policies such as subsidizing the cost of solar equipment and paying for excess electricity production as a combination that might help farmers make this transition.
“The cost of solar pumps is still prohibitively high for individual farmers to buy them straight out,” Green says. “It will be critical to ensure financing mechanisms are accessible to these users. Coupling solar pump systems with well-thought out government policies and other technologies for minimizing water use is the best approach to optimizing the food-water-energy nexus.”
In addition to the evaluation, CITE created a pump sizing tool that can be used to help farmers understand what size pump they need given their particular field sizes, water requirements, and other factors.
“That gives them more knowledge and power when they go to talk to the water pump manufacturers,” Green says. “If they know what they need, they’re less likely to be talked into buying something too big for their needs. We don’t want them to overpay.”
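CITE's sizing tool itself is not reproduced here; the sketch below only illustrates the kind of back-of-the-envelope calculation such a tool performs, with assumed values for irrigation requirement, pumping head, sun hours, and efficiency.

```python
RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

def required_pump_power_w(field_area_m2, water_mm_per_day, head_m,
                          sun_hours=6.0, wire_to_water_eff=0.30):
    """Rough pump/PV power needed to meet a daily irrigation requirement.

    All parameter defaults are illustrative assumptions, not CITE recommendations.
    """
    daily_volume_m3 = field_area_m2 * (water_mm_per_day / 1000.0)
    flow_m3s = daily_volume_m3 / (sun_hours * 3600.0)        # pump only while the sun shines
    hydraulic_power_w = RHO_WATER * G * flow_m3s * head_m
    return hydraulic_power_w / wire_to_water_eff

# Placeholder example: a 0.5-hectare plot, 5 mm of water per day, lifted from 12 m.
print(f"{required_pump_power_w(5000, 5, 12):.0f} W")
```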
“CITE’s evaluation work has been a great value-add [for the Self Employed Women’s Association] because we can better understand which pumps are most efficient,” says Reema Nanavaty, director of the Self Employed Women’s Association. “We’re not a technical organization, and we did not want to set back the livelihoods of these poor salt pan workers by bringing in the wrong kind of pump or an inefficient pump.”
CITE’s research is funded by the USAID U.S. Global Development Lab. CITE is led by principal investigator Bishwapriya Sanyal of MIT’s Department of Urban Studies and Planning, and supported by MIT faculty and staff from D-Lab, the Priscilla King Gray Public Service Center, Sociotechnical Systems Research Center, the Center for Transportation and Logistics, School of Engineering, and Sloan School of Management.
In addition to Green, co-authors on this report include CITE research assistants Amit Gandhi, Jonars B. Spielberg, and Christina Sung; Technology Exchange Lab’s Brennan Lake and Éadaoin Ilten; as well as Vandana Pandya and Sara Lynn Pesek.