MIT Research News' Journal
 

Tuesday, June 2nd, 2020

    9:00a
    Counting your antigens

    Normally, the immune system is able to differentiate between healthy and abnormal cells. Peptides, fragments created by the synthesis and breakdown of proteins inside each cell, are presented on the cell surface as antigens and signal to immune cells whether the cell should be left alone or flagged for destruction and removal.

    Because cancer cells display a small number of tumor-associated antigens and antigens that result from genetic mutations, they can be targeted by the immune system. However, cancer cells can develop strategies for evading detection by the immune system. Cancer immunotherapies counteract those strategies, but only for some cancers and only in some patients. Those that do work produce powerful results.

    Researchers and clinicians are exploring how to improve the success rate of immunotherapies for more cancer types and patients. In this effort, they are combining immunotherapies with targeted therapies, small molecules designed to inhibit selected protein targets in the cell. To design effective combinations, a better understanding is needed of how targeted therapies change the immunopeptidome — the repertoire of surface-presented peptide antigens.

    A team of researchers including Koch Institute members Forest White, the Ned C. and Janet Bemis Rice Professor and member of the MIT Center for Precision Cancer Medicine, and Douglas Lauffenburger, Ford Professor of Biological Engineering, Chemical Engineering, and Biology, developed a technique for accurately quantifying changes in the immunopeptidome.

    In a study led by graduate student Lauren Stopfer and appearing in Nature Communications, researchers used the platform to analyze the effect of CDK4/6 inhibitors, a class of known anticancer agents, on the immunopeptidome of melanoma cell lines. In addition to identifying potential antigen targets for drug development, their results highlighted the potential of CDK4/6 inhibitors to be an effective partner for certain kinds of immunotherapies. Ultimately, the platform could help cancer researchers design new targeted drugs and immunotherapies, or clinical trials for combinations of these types of therapies.

    High-quality quantification

    Currently, in order to examine how a cell changes its immunopeptidome in response to a drug or other perturbation, researchers use mass spectrometry to quantify the fold change, or relative change in magnitude between measurements, of the expression of peptide antigens. However, most current mass spectrometry-based methods do not provide a complete — or even reliably accurate — picture of immunopeptidome dynamics.

    The process of preparing a sample for mass spectrometry analysis can result in substantial losses of antigens. In isolating the relatively small number of antigen peptides from the entire contents of cells, there can be significant variation in the proportion of peptide antigens recovered from sample to sample or from peptide to peptide. Existing methods for accounting for how many antigens are lost are laborious and have limited effectiveness.

    Fold change alone does not indicate the magnitude of a change in peptide antigen levels. For example, a three-fold increase in antigens may mean an increase from 10 to 30 antigens, or it may mean an increase from 1,000 to 3,000. Because different drugs require different antigens to be present at different quantities in order to be effective, an accurate count of the change in antigen levels is needed to identify drugs that elicit the optimal response in the cell. Furthermore, the measurement may be undermined by underlying “noise” in the sample — data that can cloud the relative proportion of observable “signal” produced by the antigen of interest.
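
    The distinction can be made concrete with a toy calculation (the numbers below are illustrative, not from the study): two antigens with the same fold change can differ enormously in the absolute number of copies gained.

```python
# Toy illustration: identical fold changes can hide very different
# absolute changes in peptide antigen copies per cell.

def fold_change(before: float, after: float) -> float:
    """Relative change between two measurements."""
    return after / before

low = (10, 30)        # hypothetical antigen A: 10 -> 30 copies per cell
high = (1000, 3000)   # hypothetical antigen B: 1000 -> 3000 copies per cell

# Both show the same three-fold increase ...
assert fold_change(*low) == fold_change(*high) == 3.0

# ... but the absolute gains differ by a factor of 100, which matters
# if a therapy needs a minimum antigen density on the cell surface.
gain_low = low[1] - low[0]     # 20 extra copies
gain_high = high[1] - high[0]  # 2,000 extra copies
print(gain_low, gain_high)     # prints: 20 2000
```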

    “People will say that you need a certain number of a peptide antigen in order for an immunotherapy to work, but, right now, that number is typically based on anecdotal evidence,” says White. “To make truly informed decisions about immunotherapy options, there needs to be a way to quantify antigens very accurately and very reliably.”

    The new platform enables the accurate quantification of peptide antigens presented at the cell surface, accounting for variation in sample processing and giving an absolute number of detectable peptides. The method uses a widely available ultraviolet light-based technology to insert peptides loaded with heavy isotopes into genetically engineered versions of the molecules that present antigens on the cell surface, class I major histocompatibility complexes (MHCs). The labeled peptide-MHC (pMHC) complexes are then added to samples of the contents of whole cells. When the antigen peptides are extracted, the heavy isotope-labeled peptides can be used to account for how many antigens have been lost to processing.

    To determine how many of a specific antigen are presented on cells, heavy isotope labeled pMHCs can be added to samples of cell contents at different concentrations. The resulting standard curve, or graph, can be used to extrapolate the number of peptide antigens.
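    In code, the standard-curve step might look like the following minimal sketch. All numbers are hypothetical, and the real workflow involves mass spectrometry signal processing far beyond this, but the fit-and-invert logic is the same: fit a line to signal versus spiked-in amount, then invert it to estimate the endogenous amount.

```python
# Minimal sketch of absolute quantification via a spike-in standard
# curve (hypothetical numbers): known amounts of heavy-isotope-labeled
# pMHC are added to a sample, the measured signal is fit to a line,
# and the line is inverted to estimate the endogenous peptide amount.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical standard curve: femtomoles of labeled peptide spiked in
# vs. the integrated mass-spec signal observed at each spike level.
spike_fmol = [1.0, 5.0, 10.0, 50.0]
signal = [2.1, 10.2, 19.8, 100.5]

slope, intercept = fit_line(spike_fmol, signal)

# Signal measured for the endogenous (unlabeled) peptide in the sample:
endogenous_signal = 40.0
endogenous_fmol = (endogenous_signal - intercept) / slope
print(round(endogenous_fmol, 1))  # roughly 19.9 fmol
```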

    Making antigens count

    The researchers used the new platform to quantify how CDK4/6 inhibitors change the repertoire of antigens presented on the surface of melanoma cells.

    Melanoma can be treated effectively with a class of immunotherapy called immune checkpoint blockade inhibitors, but as many as 40 percent of patients do not respond to these therapies. Recent studies have suggested that checkpoint blockade immunotherapies may be more effective in more patients when combined with other anticancer agents, particularly those that stimulate an immune response, such as CDK4/6 inhibitors. CDK4/6 inhibitors are thought to strengthen the immune system’s response to cancer in part by increasing expression of MHCs, thereby rendering cancer cells more visible to the immune system.

    Researchers profiled peptide antigen repertoires in four melanoma cell lines treated with the CDK4/6 inhibitor palbociclib at low and high doses, finding that low doses of palbociclib resulted in a larger increase in MHC presentation than the higher-dose therapy. At lower doses, the immunopeptidome showed increases in tumor-associated peptide antigens derived from intracellular pathways known to be affected by the inhibition of CDK4 and CDK6. These results add to a growing body of evidence that CDK4/6 inhibitors could be used together with checkpoint blockade to increase the immune system’s ability to respond to tumors, and suggest that CDK4/6 inhibitors and other treatments like them could be used to tune which peptides are presented to the immune system.

    The researchers were also able to identify an antigen, a serine-phosphorylated IRS2 peptide, that occurs exclusively in malignant tumors. They found that it was expressed at high levels, demonstrating that the platform could also be used to help cancer researchers identify immunotherapy targets.

    Because of its sensitivity and speed, the new platform could be used in the clinic to develop treatment strategies on a patient-specific basis. The multiplexed platform can analyze many samples in tandem, allowing for the short time scale critical to clinical trials. Its sensitivity allows it to be used on small samples, including samples from individual patients’ tumors. Analysis of peptide antigen repertoire changes could be used to optimize the order and timing of therapies for the greatest impact, in addition to calibrating cancer cells’ antigen presentation for targeting by immunotherapies.

    “One of the most promising applications for this tool is to better understand how much of some of these peptide antigen targets are presented, not just on cell lines, but in real tumors,” says Stopfer. “Knowing how much antigen is present in tumor cells could inform what kind of therapies we develop and our ability to make informed decisions about immunotherapy options.”

    The research was funded by the National Institutes of Health, a Melanoma Research Alliance Team Science Award, the MIT Center for Precision Cancer Medicine, the Koch Institute Frontier Research Program through the Kathy and Curt Marble Cancer Research Fund, and the Takeda Pharmaceuticals Immune Oncology Research Fund.

    9:10a
    Study: Reflecting sunlight to cool the planet will cause other global changes

    How can the world combat the continued rise in global temperatures? How about shading the Earth from a portion of the sun’s heat by injecting the stratosphere with reflective aerosols? After all, volcanoes do essentially the same thing, albeit in short, dramatic bursts: When a Vesuvius erupts, it blasts fine ash into the atmosphere, where the particles can linger as a kind of cloud cover, reflecting solar radiation back into space and temporarily cooling the planet.

    Some researchers are exploring proposals to engineer similar effects, for example by launching reflective aerosols into the stratosphere — via planes, balloons, and even blimps — in order to block the sun’s heat and counteract global warming. But such solar geoengineering schemes, as they are known, could have other long-lasting effects on the climate.

    Now scientists at MIT have found that solar geoengineering would significantly change extratropical storm tracks — the zones in the middle and high latitudes where storms form year-round and are steered by the jet stream across the oceans and land. Extratropical storm tracks give rise to extratropical cyclones, as distinct from their tropical cousins, hurricanes. The strength of extratropical storm tracks determines the severity and frequency of storms such as nor’easters in the United States.

    The team considered an idealized scenario in which solar radiation was reflected enough to offset the warming that would occur if carbon dioxide were to quadruple in concentration. In a number of global climate models under this scenario, the strength of storm tracks in both the northern and southern hemispheres weakened significantly in response.

    Weakened storm tracks would mean less powerful winter storms, but the team cautions that weaker storm tracks also lead to stagnant conditions, particularly in summer, and less wind to clear away air pollution. Changes in winds could also affect the circulation of ocean waters and, in turn, the stability of ice sheets.

    “About half the world’s population lives in the extratropical regions where storm tracks dominate weather,” says Charles Gertler, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “Our results show that solar geoengineering will not simply reverse climate change. Instead, it has the potential itself to induce novel changes in climate.”

    Gertler and his colleagues have published their results this week in the journal Geophysical Research Letters. Co-authors include EAPS Professor Paul O’Gorman, along with Ben Kravitz of Indiana State University, John Moore of Beijing Normal University, Steven Phipps of the University of Tasmania, and Shingo Watanabe of the Japan Agency for Marine-Earth Science and Technology.

    A not-so-sunny picture

    Scientists have previously modeled what Earth’s climate might look like if solar geoengineering scenarios were to play out on a global scale, with mixed results. On the one hand, spraying aerosols into the stratosphere would reduce incoming solar heat and, to a degree, counteract the warming caused by carbon dioxide emissions. On the other hand, such cooling of the planet would not prevent other greenhouse gas-induced effects such as regional reductions in rainfall and ocean acidification.

    There have also been signs that intentionally reducing solar radiation would shrink the temperature difference between the Earth’s equator and poles or, in climate parlance, weaken the planet’s meridional temperature gradient, cooling the equator while the poles continue to warm. This last consequence was especially intriguing to Gertler and O’Gorman.

    “Storm tracks feed off of meridional temperature gradients, and storm tracks are interesting because they help us to understand weather extremes,” Gertler says. “So we were interested in how geoengineering affects storm tracks.”  

    The team looked at how extratropical storm tracks might change under a scenario of solar geoengineering known to climate scientists as experiment G1 of the Geoengineering Model Intercomparison Project (GeoMIP), a project that provides various geoengineering scenarios for scientists to run on climate models to assess their climate effects.

    The G1 experiment assumes an idealized scenario in which a solar geoengineering scheme blocks enough solar radiation to counterbalance the warming that would occur if carbon dioxide concentrations were to quadruple.

    The researchers used results from various climate models run forward in time under the conditions of the G1 experiment. They also used results from a more sophisticated geoengineering scenario, with a doubling of carbon dioxide concentrations and aerosols injected into the stratosphere at more than one latitude. In each model, they recorded the day-to-day change in sea level pressure at various locations along the storm tracks. These changes reflect the passage of storms and measure a storm track’s energy.

    “If we look at the variance in sea level pressure, we have a sense of how often and how strongly cyclones pass over each area,” Gertler explains. “We then average the variance across the whole extratropical region, to get an average value of storm track strength for the northern and southern hemispheres.”
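    As a schematic illustration of that metric (with synthetic data, not the models’ actual output), the computation reduces to a variance in time at each grid point, followed by a regional average:

```python
# Schematic version of the storm-track strength metric: variance of
# daily sea level pressure (SLP) at each location, averaged over the
# extratropical region. The SLP data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily SLP anomalies (hPa): 365 days at 20 extratropical
# grid points, with stormier points given larger day-to-day swings.
days, points = 365, 20
amplitudes = np.linspace(2.0, 8.0, points)  # hPa, one per grid point
slp = rng.normal(0.0, amplitudes, size=(days, points))

# Variance in time at each grid point reflects how often and how
# strongly cyclones pass over that location ...
var_per_point = slp.var(axis=0)

# ... and averaging across the region gives a single number for
# storm-track strength; a weaker track shows up as a smaller value.
storm_track_strength = var_per_point.mean()
print(f"storm-track strength: {storm_track_strength:.1f} hPa^2")
```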

    An imperfect counterbalance

    Their results, across climate models, showed that solar geoengineering would weaken storm tracks in both Northern and Southern hemispheres. Depending on the scenario they considered, the storm track in the Northern Hemisphere would be 5 to 17 percent weaker than it is today.

    “A weakened storm track, in both hemispheres, would mean weaker winter storms but also lead to more stagnant weather, which could affect heat waves,” Gertler says. “Across all seasons, this could affect ventilation of air pollution. It also may contribute to a weakening of the hydrological cycle, with regional reductions in rainfall. These are not good changes, compared to a baseline climate that we are used to.”

    The researchers were curious to see how the same storm tracks would respond to global warming alone, without the addition of solar geoengineering, so they ran the climate models again under several warming-only scenarios. Surprisingly, they found that, in the Northern Hemisphere, global warming would also weaken storm tracks, by the same magnitude as with the addition of solar geoengineering. This suggests solar geoengineering, and efforts to cool the Earth by reducing incoming heat, would not do much to alter global warming’s effects, at least on storm tracks — a puzzling outcome that the researchers are unsure how to explain.

    In the Southern Hemisphere, there is a slightly different story. They found that global warming alone would strengthen storm tracks there, whereas the addition of solar geoengineering would prevent this strengthening, and even further, would weaken the storm tracks there.

    “In the Southern Hemisphere, winds drive ocean circulation, which in turn could affect uptake of carbon dioxide, and  the stability of the Antarctic ice sheet,” O’Gorman adds. “So how storm tracks change over the Southern Hemisphere is quite important.”

    The team also observed that the weakening of storm tracks was strongly correlated with changes in temperature and humidity. Specifically, the climate models showed that in response to reduced incoming solar radiation, the equator cooled significantly as the poles continued to warm. This reduced temperature gradient appears to be sufficient to explain the weakening storm tracks — a result that the group is the first to demonstrate.

    “This work highlights that solar geoengineering is not reversing climate change, but is substituting one unprecedented climate state for another,” Gertler says. “Reflecting sunlight isn’t a perfect counterbalance to the greenhouse effect.”

    Adds O’Gorman: “There are multiple reasons to avoid doing this, and instead to favor reducing emissions of CO2 and other greenhouse gases.”

    This research was funded, in part, by the National Science Foundation, NASA, and the Industry and Foundation sponsors of the MIT Joint Program on the Science and Policy of Global Change.

    2:59p
    Citizen scientists spot closest young brown dwarf disk yet

    Brown dwarfs are the middle child of astronomy: too big to be a planet, yet not big enough to be a star. Like their stellar siblings, these objects form from the gravitational collapse of gas and dust. But rather than condensing into a star’s fiery hot nuclear core, brown dwarfs settle into a more zen-like equilibrium: a stable, milder state than that of fusion-powered stars.

    Brown dwarfs are considered to be the missing link between the most massive gas giant planets and the smallest stars, and because they glow relatively dimly they have been difficult to spot in the night sky. Like stars, some brown dwarfs can retain the disk of swirling gas and dust left over from their initial formation. This material can collide and accumulate to form planets, though it’s unclear exactly what kind of planets brown dwarfs can generate.

    Now researchers at MIT, the University of Oklahoma, and elsewhere, with the help of citizen scientists, have identified a brown dwarf with a disk that is the youngest of its kind within about 100 parsecs of Earth. The brown dwarf, named W1200-7845, appears to have the kind of disk that could potentially form planets. It is about 3.7 million years old and sits at a nearby 102 parsecs, or about 332 light years from Earth.

    At this proximity, scientists may be able to zoom in on the young system with future high-powered telescopes, to examine the earliest conditions of a brown dwarf’s disk and perhaps learn more about the kind of planets brown dwarfs might support.

    The new system was discovered through Disk Detective, a crowdsourced project funded by NASA and hosted by Zooniverse that provides images of objects in space for the public to classify, with the aim of picking out objects that are likely stars with disks that could potentially host planets.

    The researchers are presenting their findings, as well as announcing a new version of the Disk Detective website, this week at the all-virtual meeting of the American Astronomical Society.

    “Within our solar neighborhood”

    Users of Diskdetective.org, which first launched in 2014, can look through “flipbooks” — images of the same object in space, taken by NASA’s Wide-field Infrared Survey Explorer, or WISE, which detects infrared emissions such as the thermal radiation given off by the gas and dust debris in stellar disks. A user can classify an object based on certain criteria, such as whether the object appears oval — a shape that more resembles a galaxy — or round — a sign that the object is more likely a disk-hosting star.

    “We have multiple citizen scientists look at each object and give their own independent opinion, and trust the wisdom of the crowd to decide what things are probably galaxies and what things are probably stars with disks around them,” says study co-author Steven Silverberg, a postdoc in MIT’s Kavli Institute for Astrophysics and Space Research.

    From there, a science team including Silverberg follows up on crowd-classified disks, using more sophisticated methods and telescopes to determine if indeed they are disks, and what characteristics the disks may have.

    In the case of the newly discovered W1200-7845, citizen scientists first classified the object as a disk in 2016. The science team, including Silverberg and Maria Schutte, a graduate student at the University of Oklahoma, then looked more closely at the source with an infrared instrument on the Magellan 6.5-meter telescopes at Las Campanas Observatory in Chile.

    With these new observations, they determined that the source was indeed a disk around a brown dwarf residing within a “moving group” — a cluster of stars that tend to move as one across the night sky. In astronomy, it’s far easier to determine the age of a group of objects than of one alone. Because the brown dwarf was part of a moving group of about 30 stars, previous researchers were able to estimate an average age for the group, about 3.7 million years old, which is likely also the age of the brown dwarf.

    The brown dwarf is also very close to the Earth, at about 102 parsecs away, making it the closest young brown dwarf with a disk detected yet. For comparison, our nearest stellar neighbor, the Alpha Centauri system, is about 1.3 parsecs from Earth.

    “When it’s this close, we consider it to be within the solar neighborhood,” Schutte says. “That proximity is really important, because brown dwarfs are lower in mass and inherently less bright than other objects like stars. So the closer these objects are to us, the more detail we’ll be able to see.”

    Looking for Peter Pan

    The team plans to zoom further in on W1200-7845 with other telescopes, such as ALMA, the Atacama Large Millimeter Array in Chile, comprising 66 huge radio dishes that work together as one powerful telescope to observe the universe between the radio and infrared bands. At this range and precision, the researchers hope to see the brown dwarf’s disk itself, to measure its mass and radius.

    “A disk’s mass just tells you how much stuff is in the disk, which would tell us if planet formation happens around these systems, and what sorts of planets you’d be able to produce,” Silverberg says. “You could also use that data to determine what kinds of gas are in the system, which would tell you about the disk’s composition.”

    In the meantime, the researchers are launching a new version of Disk Detective. In April 2019, the website went on hiatus, as its hosting platform, the popular citizen scientist portal Zooniverse, briefly retired its previous software platform in favor of an updated version. The updated platform has prompted Silverberg and his colleagues to revamp Disk Detective. The new version, launching this week, will include images from a full-sky survey, PanSTARRS, that observes most of the sky in high-resolution optical bands.

    “We’re getting more current images with different telescopes with better spatial resolution this time around,” says Silverberg, who will be managing the new site at MIT.

    Where the site’s previous version was aimed at finding any disks around stars and other objects, the new site is designed to pick out “Peter Pan” disks — disks of gas and dust that should be old enough to have formed planets, but for some reason haven’t quite yet.

    “We call them Peter Pan disks because they seem to never grow up,” Silverberg says.

    The team identified its first Peter Pan disk with Disk Detective in 2016. Since then, seven others have been found, each at least 20 million years old. With the new site, they hope to identify and study more of these disks, which could help to nail down conditions under which planets, and possibly life, may form.

    “The disks we find will be excellent places to look for exoplanets,” Silverberg says.

    “If planets take longer to form than we previously thought, the star they orbit will have fewer gigantic flares when the planets finally form. If the planet receives fewer flares than it would around a younger star, that could significantly impact our expectations for discovering life there.”

    This research was funded, in part, by NASA.

