MIT Research News' Journal
 

Thursday, July 12th, 2018

    12:00a
    Could gravitational waves reveal how fast our universe is expanding?

    Since it first exploded into existence 13.8 billion years ago, the universe has been expanding, dragging along with it hundreds of billions of galaxies and stars, much like raisins in a rapidly rising dough.

    Astronomers have pointed telescopes to certain stars and other cosmic sources to measure their distance from Earth and how fast they are moving away from us — two parameters that are essential to estimating the Hubble constant, a unit of measurement that describes the rate at which the universe is expanding.
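The relationship described above is Hubble's law: recession velocity equals the Hubble constant times distance. A minimal sketch, with purely illustrative numbers (not measurements from any survey):

```python
# Hubble's law: v = H0 * d. Given one source's recession velocity and
# distance, the Hubble constant follows directly. The values below are
# illustrative only.

def hubble_constant(velocity_km_s, distance_mpc):
    """Estimate H0 in km/s/Mpc from a source's velocity and distance."""
    return velocity_km_s / distance_mpc

# A hypothetical source receding at 2,100 km/s at a distance of 30 Mpc:
h0 = hubble_constant(2100.0, 30.0)
print(h0)  # 70.0 km/s/Mpc
```

In practice the challenge is entirely in measuring those two inputs precisely, which is why independent distance estimates matter so much.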

    But to date, the most precise efforts have landed on very different values of the Hubble constant, offering no definitive resolution to exactly how fast the universe is growing. This information, scientists believe, could shed light on the universe’s origins, as well as its fate, and whether the cosmos will expand indefinitely or ultimately collapse.

    Now scientists from MIT and Harvard University have proposed a more accurate and independent way to measure the Hubble constant, using gravitational waves emitted by a relatively rare system: a black hole-neutron star binary, a hugely energetic pairing of a spiraling black hole and a neutron star. As these objects circle in toward each other, they should produce space-shaking gravitational waves and a flash of light when they ultimately collide.

    In a paper published today in Physical Review Letters, the researchers report that the flash of light would give scientists an estimate of the system’s velocity, or how fast it is moving away from the Earth. The emitted gravitational waves, if detected on Earth, should provide an independent and precise measurement of the system’s distance. Even though black hole-neutron star binaries are incredibly rare, the researchers calculate that detecting even a few should yield the most accurate value yet for the Hubble constant and the rate of the expanding universe.

    “Black hole-neutron star binaries are very complicated systems, which we know very little about,” says Salvatore Vitale, assistant professor of physics at MIT and lead author of the paper. “If we detect one, the prize is that they can potentially give a dramatic contribution to our understanding of the universe.”

    Vitale’s co-author is Hsin-Yu Chen of Harvard.

    Competing constants

    Two independent measurements of the Hubble constant were made recently, one using NASA's Hubble Space Telescope and another using the European Space Agency's Planck satellite. The Hubble Space Telescope’s measurement is based on observations of a type of star known as a Cepheid variable, as well as on observations of supernovae. Both of these objects are considered “standard candles” for their predictable patterns of brightness, which scientists can use to estimate their distance and velocity.

    The other type of estimate is based on observations of the fluctuations in the cosmic microwave background — the electromagnetic radiation that was left over in the immediate aftermath of the Big Bang, when the universe was still in its infancy. While the observations by both probes are extremely precise, their estimates of the Hubble constant disagree significantly.

    “That’s where LIGO comes into the game,” Vitale says.

    LIGO, or the Laser Interferometer Gravitational-Wave Observatory, detects gravitational waves — ripples in the Jell-O of space-time, produced by cataclysmic astrophysical phenomena.

    “Gravitational waves provide a very direct and easy way of measuring the distances of their sources,” Vitale says. “What we detect with LIGO is a direct imprint of the distance to the source, without any extra analysis.”

    In 2017, scientists got their first chance at estimating the Hubble constant from a gravitational-wave source, when LIGO and its Italian counterpart Virgo detected a pair of colliding neutron stars for the first time. The collision released a huge amount of gravitational waves, which researchers measured to determine the distance of the system from Earth. The merger also released a flash of light, which astronomers focused on with ground and space telescopes to determine the system’s velocity.

    With both measurements, scientists calculated a new value for the Hubble constant. However, the estimate came with a relatively large uncertainty of 14 percent, much more uncertain than the values calculated using the Hubble Space Telescope and the Planck satellite.

    Vitale says much of the uncertainty stems from the fact that it can be challenging to interpret a neutron star binary’s distance from Earth using the gravitational waves that this particular system gives off.   

    “We measure distance by looking at how ‘loud’ the gravitational wave is, meaning how clear it is in our data,” Vitale says. “If it’s very clear, you can see how loud it is, and that gives the distance. But that’s only partially true for neutron star binaries.”

    That’s because these systems, which create a whirling disc of energy as two neutron stars spiral in toward each other, emit gravitational waves in an uneven fashion. The majority of gravitational waves shoot straight out from the center of the disc, while a much smaller fraction escapes out the edges. If scientists detect a “loud” gravitational wave signal, it could indicate one of two scenarios: the detected waves stemmed from the edge of a system that is very close to Earth, or the waves emanated from the center of a much more distant system.

    “With neutron star binaries, it’s very hard to distinguish between these two situations,” Vitale says.
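The ambiguity described above can be sketched in a toy model: the observed strain amplitude scales roughly as an orientation factor divided by distance, so a nearby edge-on binary and a distant face-on binary can look equally "loud." The orientation factor below is a simplification and the numbers are illustrative, not real detector values:

```python
import math

# Toy model of the distance-inclination degeneracy. The observed
# amplitude goes as orientation / distance, where the orientation
# factor is maximal for a face-on binary (inclination = 0) and
# weaker for an edge-on one (inclination = pi/2).

def observed_amplitude(distance, inclination):
    orientation = (1.0 + math.cos(inclination) ** 2) / 2.0
    return orientation / distance

face_on_far = observed_amplitude(distance=100.0, inclination=0.0)
edge_on_near = observed_amplitude(distance=50.0, inclination=math.pi / 2)
# Two very different distances, essentially the same loudness:
print(face_on_far, edge_on_near)
```

Anything that independently pins down the orientation — such as a precessing black hole spin — breaks this degeneracy and sharpens the distance estimate.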

    A new wave

    In 2014, before LIGO made the first detection of gravitational waves, Vitale and his colleagues observed that a binary system composed of a black hole and a neutron star could give a more accurate distance measurement, compared with neutron star binaries. The team was investigating how accurately one could measure a black hole’s spin, given that the objects are known to spin on their axes, similarly to Earth but much more quickly.

    The researchers simulated a variety of systems with black holes, including black hole-neutron star binaries and neutron star binaries. As a byproduct of this effort, the team noticed that they were able to more accurately determine the distance of black hole-neutron star binaries, compared to neutron star binaries. Vitale says this is due to the black hole’s spin, which can help scientists better pinpoint from where in the system the gravitational waves are emanating.

    “Because of this better distance measurement, I thought that black hole-neutron star binaries could be a competitive probe for measuring the Hubble constant,” Vitale says. “Since then, a lot has happened with LIGO and the discovery of gravitational waves, and all this was put on the back burner.”

    Vitale recently circled back to his original observation, and in this new paper, he set out to answer a theoretical question:

    “Is the fact that every black hole-neutron star binary will give me a better distance going to compensate for the fact that potentially, there are far fewer of them in the universe than neutron star binaries?” Vitale says.

    To answer this question, the team ran simulations to predict the occurrence of both types of binary systems in the universe, as well as the accuracy of their distance measurements. From their calculations, they concluded that, even if neutron star binaries outnumbered black hole-neutron star binaries by 50-1, the latter would yield a Hubble constant similar in accuracy to the former.

    More optimistically, if black hole-neutron star binaries were slightly more common, but still rarer than neutron star binaries, the former would produce a Hubble constant that is four times as accurate.
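The trade-off the team quantified can be sketched with a standard statistical rule: averaging N independent measurements shrinks the combined uncertainty by a factor of the square root of N, so a few precise events can rival many imprecise ones. The fractional uncertainties below are illustrative, not the paper's actual figures:

```python
import math

# Combining N independent measurements of equal fractional
# uncertainty sigma_single gives sigma_single / sqrt(N).

def combined_uncertainty(sigma_single, n_events):
    return sigma_single / math.sqrt(n_events)

# Hypothetical comparison: 50 neutron star binaries at ~14% each
# versus a single black hole-neutron star binary at ~2%:
many_nsb = combined_uncertainty(0.14, 50)  # about 0.02
one_bhns = combined_uncertainty(0.02, 1)   # 0.02
print(many_nsb, one_bhns)
```

Under these made-up numbers the two approaches land at comparable precision, which mirrors the paper's qualitative conclusion about rare-but-precise sources.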

    “So far, people have focused on binary neutron stars as a way of measuring the Hubble constant with gravitational waves,” Vitale says. “We’ve shown there is another type of gravitational wave source which so far has not been exploited as much: black holes and neutron stars spiraling together. LIGO will start taking data again in January 2019, and it will be much more sensitive, meaning we’ll be able to see objects farther away. So LIGO should see at least one black hole-neutron star binary, and as many as 25, which will help resolve the existing tension in the measurement of the Hubble constant, hopefully in the next few years.”

    This research was supported, in part, by the National Science Foundation and the LIGO Laboratory.

    4:50p
    Scientists sharpen the edges of cancer chemotherapy

    Tackling unsolved problems is a cornerstone of scientific research, propelled by the power and promise of new technologies. Indeed, one of the shiniest tools in the biomedical toolkit these days is the genome editing system known as CRISPR/Cas9. Whitehead Institute Member David Sabatini and his colleagues pioneered the use of this tool as a foundation for large-scale genetic screens in human cells, turning up a treasure trove of new insights into cellular metabolism, in both normal cells and cancer cells.

    When Naama Kanarek, a postdoc in Sabatini’s laboratory, pondered how to apply these state-of-the-art CRISPR/Cas9 screens to her own research, her thoughts turned to a classic cancer chemotherapy drug, methotrexate, which has been in clinical use for nearly seven decades. Often used to treat a form of pediatric leukemia, known as acute lymphoblastic leukemia (ALL), the drug, when deployed as part of a multifaceted treatment plan, can be highly effective. But its power comes at a cost. Because methotrexate can damage not only cancer cells but also healthy tissues, it must be administered with great care. For children who receive high doses of the drug, a mainstay of ALL treatment, that can mean several days spent in the hospital with rigorous clinical monitoring.

    In other forms of cancer, methotrexate’s efficacy is more uncertain. For example, in pediatric osteosarcoma, only 65 percent of patients respond. Unfortunately, there is currently no way for doctors to pinpoint which patients will respond and which will not.

    “From a scientific standpoint, methotrexate is quite special because it was the first metabolic drug to be developed, but much of its biology remains to be discovered — particularly what drives these different responses in patients,” Kanarek says. “So, this is really one of these old, classic questions that has been lingering in the field for some time. We thought we could learn something new.”

    And they did. In the July 11 online issue of the journal Nature, Kanarek, Sabatini, and their colleagues report the findings of a CRISPR/Cas9 screen for factors involved in methotrexate sensitivity. The team’s work yielded a surprising set of discoveries that point to the breakdown of histidine — one of several amino acids used by the body to construct proteins — as a critical gatekeeper of cancer cells’ vulnerability to methotrexate. The researchers’ findings not only help illuminate the biology of a well-known cancer chemotherapy, but also suggest a simple dietary supplement that could help broaden its therapeutic window and reduce its toxicity.

    “This study is an example of the power of modern genomic tools to shine a bright light on longstanding questions in human biology,” says senior author David Sabatini, who is also a professor of biology at MIT and investigator with the Howard Hughes Medical Institute (HHMI). “While cancer chemotherapies can be quite effective, their biological effects are often poorly understood. By laying bare their biology, we may be able to devise ways to utilize them more wisely.”

    Attack the cancer, not the patient

    The history of methotrexate stretches back to the 1940s, a time when strikingly little was known about the origins of cancer, much less how best to treat it. The birth of methotrexate as a chemotherapeutic agent was sparked by the astute observations of Sidney Farber, a pediatric pathologist at Boston Children’s Hospital who cared for children with a variety of maladies, including ALL. In the course of caring for patients with ALL, Farber recognized that cancer cells depended on the nutrient folic acid for their own proliferation. That gave him the idea of using folate antagonists to treat ALL. Methotrexate was developed in 1949 precisely for this purpose and was subsequently shown to induce remission in children with ALL. Fast forward to today, and the drug has evolved into a significant tool in oncologists’ toolkit.

    “Methotrexate is a major part of the backbone of chemotherapy treatment across many human cancers,” says Loren Walensky, a pediatric hematologist/oncologist at the Dana-Farber Cancer Institute who is not a study co-author but served as an early adviser on the project and will also play a deeper role in planning future follow-up studies. “It is also used outside of the cancer field for the treatment of several autoimmune diseases.”

    He added, “But as with all chemotherapy, the critical issue is how to best use it to inflict maximal damage on the cancer without irreparably harming the patient.”

    The basic mechanics of methotrexate are fairly well known. The drug inhibits dihydrofolate reductase (DHFR), an enzyme that generates the functional form of folate, known as tetrahydrofolate (THF). THF is essential for preparing the raw materials needed to make nucleic acids, such as DNA, which carries cells’ genetic information, and RNA, a close chemical relative involved in making proteins. “Proliferating cells must duplicate their DNA, so they need a lot of THF,” Kanarek explains. “But even cells that are not dividing need to make RNA, and that requires THF, too.”

    The results of Kanarek’s CRISPR/Cas9 screen now bring greater clarity to this molecular picture. She and her colleagues uncovered another enzyme, called FTCD, which is involved in the breakdown of histidine. Interestingly, FTCD also requires THF for its function — though not nearly as much as the main target of methotrexate, DHFR. Despite the differential demands of the two enzymes, they both draw from the same, shared pool of THF.

    “Under normal conditions, this pool is sufficiently full, so there is no competition for resources, even in rapidly dividing cells,” Kanarek says.

    But when the amount of THF becomes limiting — as it does in cells that are treated with methotrexate — the story is quite different, the Whitehead Institute team discovered. In that case, the activity of FTCD poses serious problems, because there isn’t enough THF in the pool to support both cell proliferation and histidine breakdown. When that happens, the cells die.

    That got Kanarek thinking more about histidine: Could the nutrient provide a way to tinker with FTCD activity and, by virtue of the cancer cells’ own metabolism, make them more vulnerable to methotrexate?

    To explore this question, the researchers used mouse models of leukemia, engineered by transplanting human leukemia cells under the skin of immunocompromised mice. A subset of the mice received injections of methotrexate together with histidine. This one-two punch, Kanarek hypothesized, should ramp up the function of FTCD and more rapidly drain the THF pool, thereby making the cells more sensitive to the cancer-killing effects of methotrexate.

    That is precisely what the team observed. Notably, these experiments involved lower than normal doses of methotrexate, suggesting the cells had indeed been made more sensitive to the cancer drug. Moreover, the studies included a human leukemia cell line, called SEM, which harbors a specific genetic mutation that is associated with a particularly poor prognosis in patients — further underscoring the power of the histidine degradation pathway to weaken cells’ defenses.

    Now, Kanarek and her colleagues are working to extend these initial findings with additional preclinical studies and, together with Walensky, determine how to best evaluate the potential benefits of histidine supplementation in cancer patients. Their ultimate goal: to pursue clinical trials that will assess histidine’s ability to improve the effectiveness of methotrexate in humans.

    In addition to making cancer cells more vulnerable to methotrexate, the Whitehead Institute team’s research also holds promise for another therapeutic challenge: identifying which patients will or will not respond to the drug.

    Two other enzymes cooperate with FTCD in breaking down histidine. The levels of one of the enzymes, known as HAL, appear to correlate with cells’ sensitivity to methotrexate: That is, cancer cells with high levels of HAL tend to be more sensitive to the drug. More work is needed to determine whether this correlation extends to a broader swath of patient samples and if it has predictive value in the clinic. Nevertheless, Kanarek and her colleagues are already beginning work on this front. Together with Abner Louissaint Jr., a hematopathologist at Massachusetts General Hospital who also served as an early adviser on the Nature study, the Whitehead Institute team will launch a second clinical study to examine whether HAL levels can predict methotrexate response in patients with lymphoma.

    “Being able to understand who is going to respond to methotrexate and who is not, and how to achieve a therapeutic benefit while mitigating the drug’s potential side effects, could have a profound impact on patient care,” Walensky says. “The insights from this study bring an entirely new dimension to our understanding of a decades-old and critically important cancer medicine. And as a physician and a scientist, that’s truly exciting.”

    11:59p
    A solution for urban storm flooding

    Flooding, on the rise due to climate change, can devastate urban areas and result in drawn-out, costly repairs. Cities are in dire need of new strategies to manage the influx of stormwater. An interdisciplinary team of engineers and urban planners at MIT has now developed a solution: multifunctional urban stormwater wetlands and ponds that seamlessly integrate the control and cleaning of stormwater with ecological and recreational benefits.

    Stormwater flooding in cities is exacerbated by urban infrastructure, as many of the natural ecosystems that would absorb rainfall have been replaced with pavement, which greatly limits an area’s infiltration capacity. This keeps stormwater on the surface, where it picks up all kinds of pollutants — trash, heavy metals, industrial chemicals — that are eventually carried into nearby bodies of water, often including the local water supply.

    Many cities do not have adequate systems in place to handle stormwater runoff, the largest single cause of stream impairment in urban areas. Stormwater treatment plants are large investments that need to be integrated into existing drainage and water treatment systems. Without spaces or processes that can sequester and purify contaminated water before it reenters circulation or the natural environment, urban centers lose fresh water that could be available for drinking and groundwater recharge, among other ecosystem needs.

    Natural stormwater management systems — engineered green spaces — are becoming more popular options for cities, in part due to their affordability. The MIT team’s wetlands have been designed to be much more effective than existing designs, such as simple basins and serpentines, at controlling water circulation and purifying stormwater, while also delivering ecosystem and recreational benefits.

    The MIT team has released the details of their study in a freely available report, "Design Guidelines for Urban Stormwater Wetlands," in the hope that cities will adopt this approach. The report is based on two years of research funded by a seed grant from MIT’s Abdul Latif Jameel World Water and Food Security Lab (J-WAFS) and further supported by the MIT Norman B. Leventhal Center for Advanced Urbanism (LCAU) in MIT’s School of Architecture and Planning. These guidelines are based on physical experiments undertaken in the MIT Nepf Environmental Fluid Mechanics Lab and recently published in the journal Ecological Engineering.

    “The goal of our study is to help cities mitigate their own problems in the face of rapidly changing climates, large storms, and a lack of economically feasible solutions,” says co-author Alan M. Berger, the Norman B. and Muriel Leventhal Professor of Advanced Urbanism and LCAU co-director. Berger and his co-authors welcome interested city representatives to reach out to them to discuss how to implement their designs. In May, the group conducted an outreach campaign to ensure that these open-sourced designs reach urban stakeholders such as government officials and regional planners across the U.S.

    The guidelines combine engineering, urban planning, and landscape architecture expertise to design a versatile green space. On top of managing stormwater, the wetland or pond creates greenery for the city, recreational space for the community, and valuable wildlife habitats.

    The designs, which feature a series of clustered islands, are modular and scalable, so they can be tailored to fit the needs and resources of varying urban settings. The work was developed with two specific case studies, Houston and Los Angeles, to help ensure the adaptability of the guidelines to different localities.

    “We picked L.A. and Houston because they are both large cities in warm climates, rapidly growing, mostly suburban, with good prospects for green space,” says lead researcher and lead author Celina Balderas Guzmán ’07, MCP ’13, SM ’13. “Moreover, one is very dry and one is very wet. We wanted to show our design’s adaptability to different conditions.” Balderas Guzmán, then an LCAU member and now at the University of California at Berkeley, is an alum of MIT’s School of Architecture and Planning, where she developed a master's thesis on stormwater wetlands that eventually led to this collaborative, interdisciplinary project.

    The guidelines have yet to be used in practice. However, the team is currently in contact with city leaders in several locations about the prospect of building pilot wetland systems. Unaffiliated members of the research community speak positively about the merit of the guidelines.

    “As far as I know, there is nothing available to the practitioner community that translates research findings from engineers and landscape architects into reality so cleanly,” says David L. Sedlak, professor of environmental engineering at UC Berkeley and co-director of the Berkeley Water Center.

    To develop the guidelines, researchers in the Environmental Fluid Mechanics Lab led by Heidi Nepf, the MIT Donald and Martha Harleman Professor of Civil and Environmental Engineering, tested more than 30 different wetland system designs. They monitored water circulation through sculpted models to determine which topography was most effective in slowing down the stormwater and evenly distributing its flow, in order to best enable the natural processes that cleanse the water of pollutants. This comprehensive testing strategy led to designs based on clusters of streamlined islands placed close together near the wetland inlets.

    Controlling the water’s movement so it lingers in the wetland is crucial to give the ecosystem time to improve water quality. Wetlands purify water through a combination of biological and chemical processes, including giving contaminants time to settle out of the water. Wetland vegetation is another good filter, as plant surfaces and the biofilms they support are very effective at capturing pollutants and excess nutrients.
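The effect of residence time described above can be sketched with a first-order decay model, a common simplification in treatment-wetland engineering: if pollutants settle out or break down at a constant rate, the fraction remaining falls exponentially with time spent in the wetland. The rate constant and times below are illustrative, not values from the MIT study:

```python
import math

# First-order decay sketch: pollutants removed at rate k leave a
# fraction exp(-k * t) remaining after residence time t.

def fraction_remaining(k_per_hour, residence_hours):
    return math.exp(-k_per_hour * residence_hours)

# Doubling residence time from 5 to 10 hours at a hypothetical
# removal rate of 0.2 per hour:
print(fraction_remaining(0.2, 5))   # ~0.37 remaining
print(fraction_remaining(0.2, 10))  # ~0.14 remaining
```

Under this simple model, slowing the water down so it lingers twice as long removes substantially more of the pollutant load, which is why the island topography that distributes and slows the flow matters.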

    Determining the most effective design for stormwater treatment was a key aspect of the project, but the team emphasizes that the value of their wetland system is more than its water management functionality. Collaboration between engineers and urban planners led to a design that maximized efficiency without sacrificing aesthetic, ecological, or recreational quality.

    “Stormwater management guidelines are typically written by engineers and they are very prescriptive. They are not traditionally designed to promote ecology or facilitate recreation,” Balderas Guzmán says.

    The team was able to create multifaceted wetland system designs thanks to its unique interdisciplinary makeup. Nepf, co-author of the study, says the engineers contributed hydraulic function innovations while the landscape architects envisioned how to make the wetland a valued part of the fabric of the city.

    Sparking interdisciplinary collaborations is a goal of J-WAFS seed grants, and Nepf credits J-WAFS with helping the engineers and urban planners to work together, bridging their different design processes and “different languages.”

    “J-WAFS provided a place where we could learn how to talk to each other,” Nepf says.

    Because of this unique collaboration, the guidelines offer a rich variety of benefits. They include recreational trails, which bridge the island clusters and connect city streets to inviting green space. The largest islands can hold event spaces for public programming, while floodplains beside the wetland can be used as sports fields, picnic areas, or playgrounds. The islands provide multiple ecological habitat zones, from dry upland to shallow and then deeper water. This habitat could be especially valuable to wetland species as natural wetlands disappear.

    The multiuse designs have a political advantage as well. They can help cities win public approval to implement stormwater wetlands, which have often proved to be challenging projects to get local residents to support. Communities unaware of the extent to which stormwater pollutes their water supply may not support using a space that could be a park or a playground for such a project. The addition of recreational features makes artificial wetlands an easier sell.

    “I hope these guidelines open people’s eyes to how they can multipurpose land in urban areas,” Nepf says. “I hope we make them think, ‘Okay, I need something to deal with stormwater runoff, so how do I make something that might also benefit the environment and the livability of the city.’”

