MIT Research News' Journal
Monday, April 27th, 2020
11:11a
New model of the GI tract could speed drug development
MIT engineers have devised a way to speed up the development of new drugs by rapidly testing how well they are absorbed in the small intestine. This approach could also be used to find new ways to improve the absorption of existing drugs so that they can be taken orally.
Developing drugs that can be easily absorbed in the gastrointestinal tract is a particular challenge for treating neglected tropical diseases, tuberculosis, and malaria, says Giovanni Traverso, an assistant professor of mechanical engineering at MIT and a gastroenterologist at Brigham and Women’s Hospital.
“Many of the drugs that are being developed today for neglected tropical diseases are insoluble and poorly permeable,” Traverso says. “We can potentially identify better formulations much faster using this new system.”
With their new method, based on pig intestinal tissue grown in the lab, the researchers can test thousands of different versions of a drug in just hours. In a paper appearing today in Nature Biomedical Engineering, the researchers used this approach to identify a formulation of the hormone oxytocin that can accumulate in the blood at concentrations about 10 times higher than regular oxytocin.
Traverso and Robert Langer, the David H. Koch Institute Professor at MIT and a member of the Koch Institute for Integrative Cancer Research, are the senior authors of the paper. The lead author is former MIT postdoc Thomas von Erlach.
A better model
Drugs ingested by mouth are absorbed primarily in the small intestine. Most small-molecule drugs are readily absorbed by the intestine, but some do not dissolve well because they are hydrophobic. Drug developers can overcome that to some extent by adding compounds called excipients, which help to stabilize drugs and can make them less hydrophobic.
However, certain drugs cannot be absorbed despite being completely dissolved in the intestine. These drugs are unable to get past the mucus layer and lining of the intestine. Due to the complexity of the GI tissue, one of the only ways to try to solve this problem is by studying model systems that mimic intestinal absorption.
Currently, the most widely used industrial approach is to test these formulations — different combinations of drugs and excipients — in human colorectal cancer cells. Because these cells are cancerous, they can survive for extended periods outside the body, but they don’t always accurately mimic the structure and function of the human intestine. Furthermore, the process is time-consuming, requiring several weeks for the cells to be ready for evaluation.
“There isn’t really a good GI model system for drug delivery applications,” von Erlach says. “We wanted to develop something that measures GI drug absorption similar to animal studies but also allows us to test many formulations at the same time.”
To more accurately replicate the intestine, the MIT team decided to take large sections of a mammalian GI tract and try to keep the tissue alive for a prolonged period of time. They decided on pigs as a model because their size and genomes are relatively similar to those of humans.
The researchers tried growing the tissue in many types of growth media and found one that has the right combination of nutrients to help the tissue to stay viable and functional for up to a week. They also found that the tissue grows best when it includes the deep muscle layers that surround the intestine.
The researchers also developed new mechanical devices that press the tissue between two plates, systematically expose it to different drug formulations, and measure their absorption. By exposing tissue to a specific drug on one plate, then measuring how much of it passes through the tissue to the other plate, the researchers can determine how well that particular drug is absorbed. This system can be used to test up to 10,000 samples per day.
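The article does not say how the team quantifies absorption from these donor/receiver measurements, but a standard metric for this kind of two-compartment setup is the apparent permeability coefficient, Papp = (dQ/dt) / (A x C0). The sketch below, with made-up numbers, shows how such a calculation might look; it is an illustrative assumption, not the authors' analysis code.

    # Minimal sketch (not the authors' code): apparent permeability from a
    # two-plate (donor/receiver) absorption measurement. All numbers are made up.
    def apparent_permeability(flux_umol_per_s, area_cm2, donor_conc_umol_per_ml):
        """P_app = (dQ/dt) / (A * C0), in cm/s ((umol/s) / (cm2 * umol/cm3))."""
        return flux_umol_per_s / (area_cm2 * donor_conc_umol_per_ml)

    # Example: 0.002 umol/s crossing 1.1 cm2 of tissue from a 0.5 umol/mL donor solution.
    papp = apparent_permeability(2e-3, 1.1, 0.5)
    print(f"P_app = {papp:.2e} cm/s")  # about 3.6e-03 cm/s

Comparing a permeability value like this across thousands of formulations is one way such a screen could rank candidates.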
To see how well the system predicts human drug absorption, the researchers tested about 60 FDA-approved drugs with varying properties and levels of oral absorption that had already been tested on human cancer cells. They found that the absorption rates predicted by their new system were about 90 percent accurate compared to available data on how well these drugs are absorbed in the human digestive tract. In contrast, predictions generated by tests in colorectal cancer cells were only about 50 percent accurate.
Improved absorption
The researchers then decided to try testing some new formulations of oxytocin, a drug that currently has to be given intravenously because it is very difficult to absorb in the small intestine. Oxytocin is commonly used to stimulate uterine contractions during labor or to reduce bleeding after delivery.
The researchers mixed oxytocin with a variety of excipients, creating nearly 3,000 variants, all of which could be tested in less than a day for their ability to be absorbed across the intestine. “What that screen showed us was that we could quickly tell which combinations were actually enhancing the transport of oxytocin,” Traverso says.
The researchers selected two of the most promising formulations to test in pigs, and found that the best one boosted oxytocin absorption by 11 times.
“This new system really opens up the possibility of accurately predicting, for the first time, drug transport in the gastrointestinal tract, in a high-throughput manner,” Langer says. “This will allow us to test ways to greatly improve oral drug bioavailability.”
Langer, Traverso, and von Erlach have founded a company called Vivtex that is now working on commercializing the technology. They hope to use this approach to find effective ways to deliver biologic drugs such as peptides, antibodies, and nucleic acids, says von Erlach, who is the company’s chief scientific officer.
“Despite decades of research, peptides, proteins and antibodies, as well as oligonucleotide therapeutics, still cannot be delivered orally. We believe that our technology has an enormous potential to address this challenge,” he says.
Other authors of the paper include Sarah Saxton, Yunhua Shi, Daniel Minahan, Daniel Reker, Farhad Javid, Young-Ah Lucy Lee, Carl Schoellhammer, Tina Esfandiary, Cody Cleveland, Lucas Booth, Jiaqi Lin, Hannah Levy, Sophie Blackburn, and Alison Hayward.
The research was funded by the National Institutes of Health, the Bill & Melinda Gates Foundation, and the Swiss National Science Foundation.
11:36a
Study analyzes contamination in drug manufacturing plants
Over the past few decades, there have been a handful of incidents in which manufacturing processes for protein drugs became contaminated with viruses. These were all discovered before the drugs reached patients, but many of the incidents led to costly cleanups, and in one instance to a drug shortage.
A new study from an MIT-led consortium has analyzed 18 of these incidents, most of which had not been publicly reported until now. The report offers insight into the most common sources of viral contamination and makes several recommendations to help companies avoid such incidents in the future.
While the study focused on biopharmaceuticals (protein drugs produced by living cells), the findings could also help biotech companies to create safety guidelines for the manufacture of new gene therapies and cell-based therapies, many of which are now in development and could face similar contamination risks.
“As the biotech industry starts to think about manufacturing these really exciting new products, which are highly effective and even in some cases curative, we want to make sure that the viral safety aspects of manufacturing them are considered,” says Stacy Springs, senior director of programs for MIT’s Center for Biomedical Innovation (CBI).
Springs is the senior author of the study, which appears today in Nature Biotechnology. Paul Barone, co-director of the CBI’s Biomanufacturing Program and director of the Consortium on Adventitious Agent Contamination in Biomanufacturing (CAACB), is the lead author. The other authors from CBI are Michael Wiebe and James Leung.
Sharing information
Many therapeutic proteins are produced using recombinant DNA technology, which allows bacterial, yeast, or mammalian cells to be engineered to produce a desired protein. While this practice has a strong safety record, there is a risk that the cultured mammalian cells can be infected with viruses. The CAACB, which performed the study, was launched in 2010 following a well-publicized contamination incident at a Genzyme manufacturing plant in Boston. The plant had to shut down for about 10 months when some of its production processes became infected with a virus in 2009.
When such incidents occur, drug companies aren’t required to make them public unless the incident affects their ability to provide the drug. The CBI team assembled a group of 20 companies that were willing to share information on such incidents, on the condition that the data would be released anonymously.
“We thought it would be very valuable to have industry share their experience of viral contamination, since most companies have had none of these incidents if they’re lucky, or maybe one or two at the most,” Springs says. “All of that knowledge about how they discovered and managed the event, identified the virus and its source, disinfected and restarted the production facility, and took action to prevent a recurrence was all siloed within individual companies.”
The study, which focused on protein drugs produced by mammalian cells, revealed 18 viral contamination incidents since 1985. These occurred at nine of the 20 biopharmaceutical companies that reported data. In 12 of the incidents, the infected cells were Chinese hamster ovary (CHO) cells, which are commonly used to produce protein drugs. The other incidents involved human or nonhuman primate cells.
The viruses that were found in the human and nonhuman primate cells included herpesvirus; human adenovirus, which causes the common cold; and reovirus, which can cause mild gastroenteritis. These viruses may have spread from workers at the plants, the researchers suggest.
In many cases, contamination incidents were first detected because cells were dying or didn’t look healthy. In two cases, the cells looked normal but the viral contamination was detected by required safety testing. The most commonly used test takes at least two weeks to yield results, so the contaminating virus can spread further through the manufacturing process before it is detected.
Some companies also use a faster test based on polymerase chain reaction (PCR) technology, but this test has to be customized to look for specific DNA sequences, so it works best when the manufacturers know of specific viruses that are most likely to be found in their manufacturing processes.
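A PCR-based screen detects only what it was designed to look for: it amplifies DNA between primer sequences chosen in advance for particular viruses. The toy sketch below, with entirely made-up primer and sample sequences, illustrates that limitation; it is not the assay any of these companies use.

    # Toy illustration (hypothetical sequences): a PCR-style panel only flags
    # viruses whose primers were chosen ahead of time.
    VIRUS_PRIMERS = {
        "virus_A": ("ATGCGT", "TTAGGC"),  # forward/reverse primer fragments (made up)
        "virus_B": ("GGCATA", "CCGTTA"),
    }

    def screen_sample(sample_dna):
        """Return the panel viruses whose primer pair both appear in the sample."""
        return [name for name, (fwd, rev) in VIRUS_PRIMERS.items()
                if fwd in sample_dna and rev in sample_dna]

    print(screen_sample("ATGCGTAAACCCTTAGGC"))  # ['virus_A']
    # A contaminant outside the panel goes undetected, which is one reason
    # broad-spectrum rapid tests are still being sought.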
“This work demonstrates how sharing of data and information yields better understanding of difficult problems in manufacturing biologics,” says Janet Woodcock, director of the Center for Drug Evaluation and Research at the FDA, who was not involved in the study. “Innovators contemplating new processes or products may be able to avoid pitfalls or disasters by learning from the collective past experiences documented here.”
New technology
Many of the CAACB member companies are exploring new technologies to inactivate or remove viruses from cell culture media before use, and from products during purification. Additionally, companies are developing rapid virus detection systems that are both sensitive and able to detect a broad spectrum of viruses.
CBI researchers are also working on several technologies that could enable more rapid tests for viral contamination. Much of this research is taking place within a new interdisciplinary research group at the Singapore-MIT Alliance for Science and Technology (SMART), called the Critical Analytics for Manufacturing Personalized Medicines. Led by Krystyn Van Vliet, MIT associate provost and a professor of biological engineering and materials science and engineering, this group, which includes several other MIT faculty members from across departments, is working on about half a dozen technologies to more rapidly detect viruses and other microbes.
“I think that there’s a lot of potential for technology development to ameliorate some of the challenges we see,” Barone says.
Another strategy that the report recommends, and that some companies are already using, is to reduce or eliminate the use of cell growth medium components that are derived from animal products such as bovine serum. When that isn’t possible, another strategy is to perform virus removal or inactivation processes on media before use, which can prevent viruses from entering and contaminating manufacturing processes. Some companies are using a pasteurization-like process called high temperature short time (HTST) treatment, while others use ultraviolet light or nanofiltration.
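For orientation, “high temperature short time” means holding the medium at a pasteurization-like temperature for a brief, defined interval. The sketch below checks a temperature log against an illustrative target of roughly 100 C held for at least 10 seconds; the actual validated HTST parameters vary by process and are not given in the article.

    # Minimal sketch: checking a media temperature log against an HTST-style target.
    # The target (about 100 C for at least 10 s) is illustrative only.
    def htst_hold_satisfied(samples_c, target_c=100.0, hold_s=10.0, sample_period_s=1.0):
        """Return True if the log contains a continuous run at or above target_c
        lasting at least hold_s seconds (one reading every sample_period_s)."""
        needed = int(round(hold_s / sample_period_s))
        run = 0
        for temp_c in samples_c:
            run = run + 1 if temp_c >= target_c else 0
            if run >= needed:
                return True
        return False

    log = [25, 60, 101, 102, 102, 101, 103, 102, 101, 102, 101, 100, 98, 40]
    print(htst_hold_satisfied(log))  # True: 10 consecutive readings at or above 100 C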
The researchers hope that their study will also help guide manufacturers of new gene- and cell-therapy products. These therapies, which make use of genes or cells to either replace defective cells or produce a therapeutic molecule within the body, could face similar safety challenges as biopharmaceuticals, the researchers say, as they are often grown in media containing bovine serum or human serum.
“Having done this sharing of information in a systematic way, I think we can accelerate the dissemination of information on best practices, not only within the protein manufacturing industry but also the new industry of cell-based modalities,” says James Leung.
The research was funded by the members of the CAACB.
1:30p
Muscle signals can pilot a robot
Albert Einstein famously postulated that “the only real valuable thing is intuition,” arguably one of the most important keys to understanding intention and communication.
But intuitiveness is hard to teach — especially to a machine. Looking to improve this, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) came up with a method that dials us closer to more seamless human-robot collaboration. The system, called “Conduct-A-Bot,” uses human muscle signals from wearable sensors to pilot a robot’s movement.
“We envision a world in which machines help people with cognitive and physical work, and to do so, they adapt to people rather than the other way around,” says Professor Daniela Rus, director of CSAIL, deputy dean of research for the MIT Stephen A. Schwarzman College of Computing, and co-author on a paper about the system.
To enable seamless teamwork between people and machines, electromyography and motion sensors are worn on the biceps, triceps, and forearms to measure muscle signals and movement. Algorithms then process the signals to detect gestures in real time, without any offline calibration or per-user training data. The system uses just two or three wearable sensors, and nothing in the environment, greatly reducing the barrier for casual users interacting with robots.
While Conduct-A-Bot could potentially be used for various scenarios, including navigating menus on electronic devices or supervising autonomous robots, for this research the team used a Parrot Bebop 2 drone, although any commercial drone could be used.
By detecting actions like rotational gestures, clenched fists, tensed arms, and activated forearms, Conduct-A-Bot can move the drone left, right, up, down, and forward, as well as allow it to rotate and stop.
If you gestured toward the right to your friend, they could likely interpret that they should move in that direction. Similarly, if you waved your hand to the left, for example, the drone would follow suit and make a left turn.
In tests, the drone correctly responded to 82 percent of over 1,500 human gestures when it was remotely controlled to fly through hoops. The system also correctly identified approximately 94 percent of cued gestures when the drone was not being controlled.
“Understanding our gestures could help robots interpret more of the nonverbal cues that we naturally use in everyday life,” says Joseph DelPreto, lead author on the new paper. “This type of system could help make interacting with a robot more similar to interacting with another person, and make it easier for someone to start using robots without prior experience or external sensors.”
This type of system could eventually target a range of applications for human-robot collaboration, including remote exploration, assistive personal robots, or manufacturing tasks like delivering objects or lifting materials.
These intelligent tools are also consistent with social distancing, and could potentially open up a realm of future contactless work. For example, you can imagine machines being controlled by humans to safely clean a hospital room or drop off medications, while letting humans stay at a safe distance.
Muscle signals can often provide information about states that are hard to observe from vision, such as joint stiffness or fatigue.
For example, if you watch a video of someone holding a large box, you might have difficulty guessing how much effort or force was needed — and a machine would also have difficulty gauging that from vision alone. Using muscle sensors opens up possibilities to estimate not only motion, but also the force and torque required to execute that physical trajectory.
For the gesture vocabulary currently used to control the robot, the movements were detected as follows (a minimal mapping sketch appears after the list):
- stiffening the upper arm to stop the robot (similar to briefly cringing when seeing something going wrong): biceps and triceps muscle signals;
- waving the hand left/right and up/down to move the robot sideways or vertically: forearm muscle signals (with the forearm accelerometer indicating hand orientation);
- fist clenching to move the robot forward: forearm muscle signals; and
- rotating clockwise/counterclockwise to turn the robot: forearm gyroscope.
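As a rough illustration of how such a vocabulary might be wired to a drone, the sketch below maps gesture labels to high-level movement commands. The gesture names and the send_command function are hypothetical placeholders, not the Conduct-A-Bot code or the Parrot SDK.

    # Hypothetical sketch: routing recognized gestures to high-level drone commands.
    # Gesture labels and send_command() are placeholders, not the actual system's API.
    GESTURE_TO_COMMAND = {
        "arm_stiffen": "stop",
        "wave_left": "move_left",
        "wave_right": "move_right",
        "wave_up": "move_up",
        "wave_down": "move_down",
        "fist_clench": "move_forward",
        "rotate_cw": "turn_right",
        "rotate_ccw": "turn_left",
    }

    def send_command(command):
        """Stand-in for whatever drone SDK call would execute the command."""
        print(f"drone <- {command}")

    def on_gesture(label):
        """Dispatch a classified gesture; ignore anything outside the vocabulary."""
        command = GESTURE_TO_COMMAND.get(label)
        if command is not None:
            send_command(command)

    on_gesture("fist_clench")  # drone <- move_forward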
Machine learning classifiers detected the gestures using the wearable sensors. Unsupervised classifiers processed the muscle and motion data and clustered it in real time to learn how to separate gestures from other motions. A neural network also predicted wrist flexion or extension from forearm muscle signals.
The system essentially calibrates itself to each person's signals while they're making gestures that control the robot, making it faster and easier for casual users to start interacting with robots.
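The article does not spell out the exact pipeline, but the idea of clustering muscle activity on the fly to separate gestures from rest can be sketched with a simple two-centroid online update over an EMG-envelope feature. This is an illustrative stand-in under that assumption, not the authors' classifier.

    import numpy as np

    # Illustrative sketch (not the authors' method): online two-cluster separation of
    # "rest" vs. "gesture" from a rectified, smoothed EMG amplitude (the envelope).
    class OnlineGestureDetector:
        def __init__(self, learning_rate=0.05):
            self.centroids = np.array([0.1, 1.0])  # rough initial guesses: rest, gesture
            self.lr = learning_rate

        def update(self, emg_envelope):
            """Assign the sample to the nearest centroid, nudge that centroid toward it,
            and report True when the sample falls in the higher-activity cluster."""
            idx = int(np.argmin(np.abs(self.centroids - emg_envelope)))
            self.centroids[idx] += self.lr * (emg_envelope - self.centroids[idx])
            return idx == int(np.argmax(self.centroids))

    detector = OnlineGestureDetector()
    stream = [0.08, 0.12, 0.9, 1.1, 0.95, 0.1]  # made-up envelope values
    print([detector.update(x) for x in stream])  # [False, False, True, True, True, False]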
In the future, the team hopes to expand the tests to include more subjects. And while the movements for Conduct-A-Bot cover common gestures for robot motion, the researchers want to extend the vocabulary to include more continuous or user-defined gestures. Eventually, the hope is to have the robots learn from these interactions to better understand the tasks and provide more predictive assistance or increase their autonomy.
“This system moves one step closer to letting us work seamlessly with robots so they can become more effective and intelligent tools for everyday tasks,” says DelPreto. “As such collaborations continue to become more accessible and pervasive, the possibilities for synergistic benefit continue to deepen.”
DelPreto and Rus presented the paper virtually earlier this month at the ACM/IEEE International Conference on Human-Robot Interaction.
2:00p
Dirty carbon reveals a sophisticated side
Tar, the everyday material that seals seams in our roofs and driveways, has an unexpected and unappreciated complexity, according to an MIT research team: It might someday be useful as a raw material for a variety of high-tech devices, including energy storage systems, thermally active coatings, and electronic sensors.
And it’s not just tar. Professor Jeffrey Grossman has a very different view of other fossil fuels as well. Rather than using these materials as cheap commodities to burn up, seal cracks with, or dispose of, he sees potential for a wide variety of applications that take advantage of the highly complex chemistry embedded in these ancient mixtures of biomass-derived carbon compounds.
A significant benefit of such applications is that they provide a way to repurpose materials that would otherwise be burned, adding to greenhouse gas emissions, or disposed of in landfills. These uses could lead to a “greening” of otherwise climate-damaging coal and other carbon-based materials, Grossman says.
In his latest research, Grossman, along with postdoc Xining Zang, research scientist Nicola Ferralis, and five others, found ways to use coal, tar, and pitch to produce thin coatings with highly controllable and reproducible electrical conductivity, porosity, and other properties. Using a laser, they were able to make prototype devices from these inexpensive, ubiquitous materials, including a supercapacitor to store electricity, a flexible strain gauge, and a transparent heater.
The work, described in the journal Science Advances, explores alternative ways of using carbonaceous heavy hydrocarbons, which have formed over millions of years of geological processing of decayed plant matter through heat and pressure. These materials, Grossman says, provide a rich variety of atomic configurations with different chemical and structural properties, unmatched by any synthetic, processed carbon-based nanomaterials.
To make use of these material properties, the team used a process called laser annealing to create ultrathin layers of carbonaceous materials, deposited on a substrate. They produced specific functional devices by depositing and etching patterns in layers made of different carbon-based materials.
In a sense, what the team did is the inverse of traditional processing of fossil fuels, in which the complex mix of hydrocarbons undergoes stage after stage of breaking down chemical bonds and separating out different compounds. In this work, the various kinds of heavy hydrocarbon complexes were used just as they are, making use of the wide variety of properties to be found in the different materials — types of coal, petroleum steam cracked tar, and mesophase pitch, most of which are either byproducts that typically need to be disposed of or fuels that are rapidly being phased out.
Through a combination of selecting just the right feedstock material and varying the timing and strength of laser pulses used to anneal the material, the team was able to control a range of physical, optical, electrical, magnetic, and other properties. By combining different materials, they say, a whole range of devices could be produced at once on a single substrate.
“We can then create everything from graphene to some sort of aromatic rich polymers,” says Ferralis, “and with properties that could change widely, from being thermal and electrical insulators, to thermal and electric conductors. We can change porosity, so that allows us not only to create solid films, but also to create materials that are highly porous, so we can actually make membranes.”
This assortment of material properties can be mixed and matched, perhaps enabling, for example, the creation of a variety of carbonaceous “inks” for 3D printing, he says.
“But rather than changing the colors,” Ferralis says, “you actually change the type of precursor you make. You add a little more tar, a little less pitch, or a little more of any of the other things that we highlighted in the paper. That could give, for example, the ability to make, within the same film, a membrane, an electrical device, and an energy storage system, and so on and so forth, on demand.”
The materials can be virtually any kind of heavy hydrocarbon, many of which exist in great abundance as waste products from petroleum production or chemical processing. “Essentially what we’re looking for is any material that is heavy in aromatics, meaning heavy hydrocarbons that people don't know what to do with,” Zang says. “So we are pretty agnostic about what we can use.”
By using precisely timed and tuned pulses from a carbon dioxide laser, the team was able to control the properties of the coated material, blasting it with pulses that could generate highly localized temperatures as high as 2,000 degrees Celsius, while leaving surrounding areas so unaffected that the process could be carried out even on soft substrates such as plastics, they say.
“We have this highly heterogeneous, messy feedstock,” says Grossman, “but it’s so cheap and rich with useful chemistry.” The idea is to understand it well enough to be able to “apply simple, scalable manufacturing tools so that we can take advantage of this understanding to make it do something different for us.” In a nutshell, he says, “we’re finding this material that was previously thought of as limited in its use (as only a fuel to burn, for example), and by understanding its atomic structure, we're able to apply principles of materials design and engineering to make it useful in broader ways.”
While this initial work focused on thin films, the raw materials are so inexpensive that ultimately such materials might also be used for bulk applications, Ferralis says. “If we can scale up this process to bulk systems, this might be used in structural materials, for example, or insulation for homes. Stuff that actually requires a lot of the material.” It might even provide an economic boost for coal-producing regions now suffering from the collapse of the coal-fired power plant industry, allowing them to become producers of a whole new family of higher-value products, he suggests.
The research team also included C. Jian at York University in Canada, S. Ingersoll and Z. Lu at MIT, Huashan Li at Sun Yat-sen University in China, and J.J. Adams at the Western Research Institute in Wyoming. The work was supported by ExxonMobil and the Natural Sciences and Engineering Research Council of Canada.