MIT Research News' Journal
Wednesday, January 30th, 2019
12:00a
Engineers program marine robots to take calculated risks

We know far less about the Earth’s oceans than we do about the surface of the moon or Mars. The sea floor is carved with expansive canyons, towering seamounts, deep trenches, and sheer cliffs, most of which are considered too dangerous or inaccessible for autonomous underwater vehicles (AUVs) to navigate.
But what if the reward for traversing such places was worth the risk?
MIT engineers have now developed an algorithm that lets AUVs weigh the risks and potential rewards of exploring an unknown region. For instance, if a vehicle tasked with identifying underwater oil seeps approached a steep, rocky trench, the algorithm could assess the reward level (the probability that an oil seep exists near this trench) and the risk level (the probability of colliding with an obstacle) if it were to take a path through the trench.
“If we were very conservative with our expensive vehicle, saying its survivability was paramount above all, then we wouldn’t find anything of interest,” says Benjamin Ayton, a graduate student in MIT’s Department of Aeronautics and Astronautics. “But if we understand there’s a tradeoff between the reward of what you gather, and the risk or threat of going toward these dangerous geographies, we can take certain risks when it’s worthwhile.”
Ayton says the new algorithm can compute tradeoffs of risk versus reward in real time, as a vehicle decides where to explore next. He and his colleagues in the lab of Brian Williams, professor of aeronautics and astronautics, are implementing this algorithm and others on AUVs, with the vision of deploying fleets of bold, intelligent robotic explorers for a number of missions, including looking for offshore oil deposits, investigating the impact of climate change on coral reefs, and exploring extreme environments analogous to Europa, an ice-covered moon of Jupiter that the team hopes vehicles will one day traverse.
“If we went to Europa and had a very strong reason to believe that there might be a billion-dollar observation in a cave or crevasse, which would justify sending a spacecraft to Europa, then we would absolutely want to risk going in that cave,” Ayton says. “But algorithms that don’t consider risk are never going to find that potentially history-changing observation.”
Ayton and Williams, along with Richard Camilli of the Woods Hole Oceanographic Institution, will present their new algorithm at the Association for the Advancement of Artificial Intelligence conference this week in Honolulu.
A bold path
The team’s new algorithm is the first to enable “risk-bounded adaptive sampling.” An adaptive sampling mission is designed, for instance, to automatically adapt an AUV’s path based on new measurements that the vehicle takes as it explores a given region. Adaptive sampling missions that consider risk typically do so by finding paths whose risk stays below a fixed, acceptable level. For instance, AUVs may be programmed to chart only paths whose chance of collision doesn’t exceed 5 percent.
But the researchers found that accounting for risk alone could severely limit a mission’s potential rewards.
“Before we go into a mission, we want to specify the risk we’re willing to take for a certain level of reward,” Ayton says. “For instance, if a path were to take us to more hydrothermal vents, we would be willing to take this amount of risk, but if we’re not going to see anything, we would be willing to take less risk.”
The team’s algorithm takes in bathymetric data, or information about the ocean topography, including any surrounding obstacles, along with the vehicle’s dynamics and inertial measurements, to compute the level of risk for a certain proposed path. The algorithm also takes in all previous measurements that the AUV has taken, to compute the probability that such high-reward measurements may exist along the proposed path.
If the risk-to-reward ratio meets a certain value, determined by scientists beforehand, then the AUV goes ahead with the proposed path, taking more measurements that feed back into the algorithm to help it evaluate the risk and reward of other paths as the vehicle moves forward.
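The decision rule described above (compute a path’s risk and expected reward, then commit only if the tradeoff clears a preset bar) can be sketched in a few lines. Everything below, from grid cells as dictionary keys to independent per-cell collision probabilities and a simple risk cap, is an illustrative simplification rather than the authors’ actual formulation:

```python
# Toy risk-bounded path selection. Cells, probabilities, and the
# independence assumption are illustrative, not the paper's model.

def path_risk(path, obstacle_prob):
    """Probability of at least one collision along the path, assuming
    independent per-cell collision probabilities."""
    p_safe = 1.0
    for cell in path:
        p_safe *= 1.0 - obstacle_prob[cell]
    return 1.0 - p_safe

def expected_reward(path, reward_prob):
    """Expected number of high-value measurements along the path."""
    return sum(reward_prob[cell] for cell in path)

def choose_path(candidates, obstacle_prob, reward_prob, risk_bound):
    """Highest-reward candidate whose collision risk stays within the
    mission's acceptable bound; None if every candidate is too risky."""
    feasible = [p for p in candidates
                if path_risk(p, obstacle_prob) <= risk_bound]
    if not feasible:
        return None
    return max(feasible, key=lambda p: expected_reward(p, reward_prob))
```

With a low risk bound the planner keeps to safe, low-reward cells; raising the bound lets it accept a riskier, higher-reward route, mirroring the scenarios the researchers tested.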
The researchers tested their algorithm in a simulation of an AUV mission east of Boston Harbor. They used bathymetric data collected from the region during a previous NOAA survey, and simulated an AUV exploring at a depth of 15 meters through regions at relatively high temperatures. They looked at how the algorithm planned out the vehicle’s route under three different scenarios of acceptable risk.
In the scenario with the lowest acceptable risk, meaning that the vehicle should avoid any regions that would have a very high chance of collision, the algorithm mapped out a conservative path, keeping the vehicle in a safe region that also did not have any high rewards — in this case, high temperatures. For scenarios of higher acceptable risk, the algorithm charted bolder paths that took a vehicle through a narrow chasm, and ultimately to a high-reward region.
The team also ran the algorithm through 10,000 numerical simulations, generating random environments in each simulation through which to plan a path, and found that the algorithm “trades off risk against reward intuitively, taking dangerous actions only when justified by the reward.”
A risky slope
Last December, Ayton, Williams, and others spent two weeks on a cruise off the coast of Costa Rica, deploying underwater gliders, on which they tested several algorithms, including this newest one. For the most part, the algorithm’s path planning agreed with those proposed by several onboard geologists who were looking for the best routes to find oil seeps.
Ayton says there was a particular moment when the risk-bounded algorithm proved especially handy. An AUV was making its way up a precarious slump, or landslide, where the vehicle couldn’t take too many risks.
“The algorithm found a method to get us up the slump quickly, while being the most worthwhile,” Ayton says. “It took us up a path that, while it didn’t help us discover oil seeps, did help us refine our understanding of the environment.”
“What was really interesting was to watch how the machine algorithms began to ‘learn’ after the findings of several dives, and began to choose sites that we geologists might not have chosen initially,” says Lori Summa, a geologist and guest investigator at the Woods Hole Oceanographic Institution, who took part in the cruise. “This part of the process is still evolving, but it was exciting to watch the algorithms begin to identify the new patterns from large amounts of data, and couple that information to an efficient, ‘safe’ search strategy.”
In their long-term vision, the researchers hope to use such algorithms to help autonomous vehicles explore environments beyond Earth.
“If we went to Europa and weren’t willing to take any risks in order to preserve a probe, then the probability of finding life would be very, very low,” Ayton says. “You have to risk a little to get more reward, which is generally true in life as well.”
This research was supported, in part, by ExxonMobil, as part of the MIT Energy Initiative, and by NASA.

4:59a
Ingestible, expanding pill monitors the stomach for up to a month

MIT engineers have designed an ingestible, Jell-O-like pill that, upon reaching the stomach, quickly swells to the size of a soft, squishy ping-pong ball, one big enough to stay in the stomach for an extended period of time.
The inflatable pill is embedded with a sensor that continuously tracks the stomach’s temperature for up to 30 days. If the pill needs to be removed from the stomach, a patient can drink a solution of calcium that triggers the pill to quickly shrink to its original size and pass safely out of the body.
The new pill is made from two types of hydrogels — mixtures of polymers and water that resemble the consistency of Jell-O. The combination enables the pill to quickly swell in the stomach while remaining impervious to the stomach’s churning acidic environment.
The hydrogel-based design is softer, more biocompatible, and longer-lasting than current ingestible sensors, which either can only remain in the stomach for a few days, or are made from hard plastics or metals that are orders of magnitude stiffer than the gastrointestinal tract.
“The dream is to have a Jell-O-like smart pill that, once swallowed, stays in the stomach and monitors the patient’s health for a long time, such as a month,” says Xuanhe Zhao, associate professor of mechanical engineering at MIT.
Zhao and senior collaborator Giovanni Traverso, a visiting scientist who will join the MIT faculty in 2019, along with lead authors Xinyue Liu, Christoph Steiger, and Shaoting Lin, have published their results today in Nature Communications.
Pills, ping-pongs, and pufferfish
The design for the new inflatable pill is inspired by the defense mechanisms of the pufferfish, or blowfish. Normally a slow-moving species, the pufferfish will quickly inflate when threatened, like a spiky balloon. It does so by sucking in a large amount of water, fast.
The puffer’s tough, fast-inflating body was exactly what Zhao was looking to replicate in hydrogel form. The team had been looking for ways to design a hydrogel-based pill to carry sensors into the stomach and stay there to monitor, for example, vital signs or disease states for a relatively long period of time.
They realized that if a pill were small enough to be swallowed and passed down the esophagus, it would also be small enough to pass out of the stomach, through an opening known as the pylorus. To keep it from exiting the stomach, the group would have to design the pill to quickly swell to the size of a ping-pong ball.
“Currently, when people try to design these highly swellable gels, they usually use diffusion, letting water gradually diffuse into the hydrogel network,” Liu says. “But to swell to the size of a ping-pong ball takes hours, or even days. It’s longer than the emptying time of the stomach.”
The researchers instead looked for ways to design a hydrogel pill that could inflate much more quickly, at a rate comparable to that of a startled pufferfish.
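Liu’s point about diffusion being too slow can be checked with a back-of-the-envelope estimate: the characteristic time for water to diffuse a distance L is roughly t ≈ L²/D. The numbers below are illustrative assumptions (a ~1.5 cm swelling radius and a typical small-molecule diffusivity in a hydrogel of ~10⁻⁹ m²/s), not figures from the paper:

```python
# Back-of-the-envelope diffusion timescale: t ~ L^2 / D.
L = 0.015   # m: ~radius of a ping-pong-ball-sized gel (assumed)
D = 1e-9    # m^2/s: typical water diffusivity in a hydrogel (assumed)

t_hours = (L ** 2 / D) / 3600
print(f"diffusion-limited swelling: ~{t_hours:.0f} hours")  # roughly 2.6 days
```

The estimate lands in the hours-to-days range Liu describes, far longer than the stomach’s emptying time, which is why a diffusion-driven design was ruled out.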

A new hydrogel device swells to more than twice its size in just a few minutes in water.
An ingestible tracker
The design they ultimately landed on resembles a small, Jell-O-like capsule, made from two hydrogel materials. The inner material contains sodium polyacrylate — superabsorbent particles that are used in commercial products such as diapers for their ability to rapidly soak up liquid and inflate.
The researchers realized, however, that if the pill were made only from these particles, it would immediately break apart and pass out of the stomach as individual beads. So they designed a second, protective hydrogel layer to encapsulate the fast-swelling particles. This outer membrane is made from a multitude of nanoscopic, crystalline chains, each folded over another, in a nearly impenetrable, gridlock pattern — an “anti-fatigue” feature that the researchers reported in an earlier paper.
“You would have to crack through many crystalline domains to break this membrane,” Lin says. “That’s what makes this hydrogel extremely robust, and at the same time, soft.”
In the lab, the researchers dunked the pill in various solutions of water and fluid resembling gastric juices, and found the pill inflated to 100 times its original size in about 15 minutes — much faster than existing swellable hydrogels. Once inflated, the pill is about the softness of tofu or Jell-O, yet surprisingly strong, Zhao says.
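A quick sanity check on those numbers: volume scales with the cube of linear size, so a 100-fold volume increase corresponds to roughly a 4.6-fold increase in diameter. The 40 mm ping-pong-ball diameter below is a standard figure, and the implied starting capsule size is an inference for illustration, not a measurement from the paper:

```python
# Volume grows 100x, so each linear dimension grows by the cube root.
volume_ratio = 100.0
linear_ratio = volume_ratio ** (1 / 3)   # ~4.64x growth in diameter

ball_diameter_mm = 40.0                  # standard ping-pong ball (assumed target)
capsule_diameter_mm = ball_diameter_mm / linear_ratio
print(f"{linear_ratio:.2f}x linear growth, "
      f"starting capsule ~{capsule_diameter_mm:.1f} mm across")
```

The implied starting size, under a centimeter across, is consistent with something small enough to swallow yet able to balloon past the pylorus’s exit size.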
To test the pill’s toughness, the researchers mechanically squeezed it thousands of times, at forces even greater than what the pill would experience from regular contractions in the stomach.
“The stomach applies thousands to millions of cycles of load to grind food down,” Lin explains. “And we found that even when we make a small cut in the membrane, and then stretch and squeeze it thousands of times, the cut does not grow larger. Our design is very robust.”
The researchers further determined that a solution of calcium ions, at a concentration higher than what’s in milk, can shrink the swollen particles. This triggers the pill to deflate and pass out of the stomach.
Finally, Steiger and Traverso embedded small, commercial temperature sensors into several pills, and fed the pills to pigs, whose stomachs and gastrointestinal tracts are very similar to those of humans. The team later retrieved the temperature sensors from the pigs’ stool and plotted the sensors’ temperature measurements over time. They found that the sensor was able to accurately track the animals’ daily activity patterns for up to 30 days.
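One simple way to turn a month of timestamped temperature readings into a daily activity pattern is to fold the series onto a 24-hour clock and average by hour. The (timestamp-in-hours, temperature) input format is an assumption for illustration; the paper does not specify how the authors processed the logs:

```python
from collections import defaultdict

def hourly_profile(readings):
    """Average temperature by hour of day from a long series of
    (timestamp_in_hours, temperature) pairs, exposing the kind of
    daily rhythm the team read out of the animal data."""
    buckets = defaultdict(list)
    for t_hours, temp in readings:
        buckets[int(t_hours) % 24].append(temp)  # fold onto a 24 h clock
    return {hour: sum(temps) / len(temps)
            for hour, temps in sorted(buckets.items())}
```

Body temperature rises with activity and feeding, so peaks and troughs in the resulting 24-hour profile trace the animals’ daily routine.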
“Ingestible electronics is an emerging area to monitor important physiological conditions and biomarkers,” says Hanqing Jiang, a professor of mechanical and aerospace engineering at Arizona State University, who was not involved in the work. “Conventional ingestible electronics are made of non-bio-friendly materials. Professor Zhao’s group is making a big leap on the development of biocompatible and soft but tough gel-based ingestible devices, which significantly extends the horizon of ingestible electronics. It also represents a new application of tough hydrogels that the group has been devoted to for years.”
Down the road, the researchers envision the pill may safely deliver a number of different sensors to the stomach to monitor, for instance, pH levels, or signs of certain bacteria or viruses. Tiny cameras may also be embedded into the pills to image the progress of tumors or ulcers, over the course of several weeks. Zhao says the pill might also be used as a safer, more comfortable alternative to the gastric balloon diet, a form of diet control in which a balloon is threaded through a patient’s esophagus and into the stomach, using an endoscope.
“With our design, you wouldn’t need to go through a painful process to implant a rigid balloon,” Zhao says. “Maybe you can take a few of these pills instead, to help fill out your stomach, and lose weight. We see many possibilities for this hydrogel device.”
This research was supported, in part, by the National Science Foundation, National Institutes of Health, and the Bill and Melinda Gates Foundation.

1:59p
MIT robot combines vision and touch to learn the game of Jenga

In the basement of MIT’s Building 3, a robot is carefully contemplating its next move. It gently pokes at a tower of blocks, looking for the best block to extract without toppling the tower, in a solitary, slow-moving, yet surprisingly agile game of Jenga.
The robot, developed by MIT engineers, is equipped with a soft-pronged gripper, a force-sensing wrist cuff, and an external camera, all of which it uses to see and feel the tower and its individual blocks.
As the robot carefully pushes against a block, a computer takes in visual and tactile feedback from its camera and cuff, and compares these measurements to moves that the robot previously made. It also considers the outcomes of those moves — specifically, whether a block, in a certain configuration and pushed with a certain amount of force, was successfully extracted or not. In real time, the robot then “learns” whether to keep pushing or move to a new block, in order to keep the tower from falling.
Details of the Jenga-playing robot are published today in the journal Science Robotics. Alberto Rodriguez, the Walter Henry Gale Career Development Assistant Professor in the Department of Mechanical Engineering at MIT, says the robot demonstrates something that’s been tricky to attain in previous systems: the ability to quickly learn the best way to carry out a task, not just from visual cues, as it is commonly studied today, but also from tactile, physical interactions.
“Unlike in more purely cognitive tasks or games such as chess or Go, playing the game of Jenga also requires mastery of physical skills such as probing, pushing, pulling, placing, and aligning pieces. It requires interactive perception and manipulation, where you have to go and touch the tower to learn how and when to move blocks,” Rodriguez says. “This is very difficult to simulate, so the robot has to learn in the real world, by interacting with the real Jenga tower. The key challenge is to learn from a relatively small number of experiments by exploiting common sense about objects and physics.”
He says the tactile learning system the researchers have developed can be used in applications beyond Jenga, especially in tasks that need careful physical interaction, including separating recyclable objects from landfill trash and assembling consumer products.
“In a cellphone assembly line, in almost every single step, the feeling of a snap-fit, or a threaded screw, is coming from force and touch rather than vision,” Rodriguez says. “Learning models for those actions is prime real-estate for this kind of technology.”
The paper’s lead author is MIT graduate student Nima Fazeli. The team also includes Miquel Oller, Jiajun Wu, Zheng Wu, and Joshua Tenenbaum, professor of brain and cognitive sciences at MIT.
Push and pull
In the game of Jenga — Swahili for “build” — 54 rectangular blocks are stacked in 18 layers of three blocks each, with the blocks in each layer oriented perpendicular to the blocks below. The aim of the game is to carefully extract a block and place it at the top of the tower, thus building a new level, without toppling the entire structure.
To program a robot to play Jenga, traditional machine-learning schemes might require capturing everything that could possibly happen between a block, the robot, and the tower — an expensive computational task requiring data from thousands if not tens of thousands of block-extraction attempts.
Instead, Rodriguez and his colleagues looked for a more data-efficient way for a robot to learn to play Jenga, inspired by human cognition and the way we ourselves might approach the game.
The team customized an industry-standard ABB IRB 120 robotic arm, then set up a Jenga tower within the robot’s reach, and began a training period in which the robot first chose a random block and a location on the block against which to push. It then exerted a small amount of force in an attempt to push the block out of the tower.
For each block attempt, a computer recorded the associated visual and force measurements, and labeled whether each attempt was a success.
Rather than carry out tens of thousands of such attempts (which would involve reconstructing the tower almost as many times), the robot trained on just about 300, with attempts of similar measurements and outcomes grouped in clusters representing certain block behaviors. For instance, one cluster of data might represent attempts on a block that was hard to move, versus one that was easier to move, or that toppled the tower when moved. For each data cluster, the robot developed a simple model to predict a block’s behavior given its current visual and tactile measurements.
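The cluster-then-model idea can be sketched with a toy implementation: a crude k-means grouping of push measurements, with each cluster’s observed success rate serving as its predictive model. The two-feature measurement (force, displacement), the cluster count, and k-means itself are all assumptions for illustration; the paper’s actual features and per-cluster models are richer:

```python
import math

class ClusteredPushModel:
    """Group push attempts into clusters of similar (force, displacement)
    measurements, then predict extraction success from each cluster's
    observed outcomes. A toy stand-in for the paper's approach."""

    def __init__(self, n_clusters=4):
        self.n_clusters = n_clusters
        self.centroids = []   # one (force, displacement) centre per cluster
        self.outcomes = []    # [successes, attempts] per cluster

    def _nearest(self, x):
        return min(range(len(self.centroids)),
                   key=lambda i: math.dist(x, self.centroids[i]))

    def fit(self, attempts, n_iters=10):
        """attempts: list of ((force, displacement), success_bool)."""
        xs = [x for x, _ in attempts]
        self.centroids = xs[:self.n_clusters]      # crude k-means init
        for _ in range(n_iters):                   # Lloyd's iterations
            groups = [[] for _ in self.centroids]
            for x in xs:
                groups[self._nearest(x)].append(x)
            self.centroids = [
                tuple(sum(coord) / len(g) for coord in zip(*g)) if g else c0
                for g, c0 in zip(groups, self.centroids)]
        self.outcomes = [[0, 0] for _ in self.centroids]
        for x, ok in attempts:
            tally = self.outcomes[self._nearest(x)]
            tally[0] += int(ok)
            tally[1] += 1

    def success_prob(self, x):
        """Predicted probability that a push like x extracts the block."""
        s, n = self.outcomes[self._nearest(x)]
        return s / n if n else 0.5
```

The data efficiency comes from sharing outcomes within a cluster: a few hundred attempts are enough to estimate a success rate per behavior type, rather than learning one monolithic model of everything the tower can do.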
Fazeli says this clustering technique dramatically increases the efficiency with which the robot can learn to play the game, and is inspired by the natural way in which humans cluster similar behavior: “The robot builds clusters and then learns models for each of these clusters, instead of learning a model that captures absolutely everything that could happen.”
Stacking up
The researchers tested their approach against other state-of-the-art machine learning algorithms, in a computer simulation of the game using the simulator MuJoCo. The lessons learned in the simulator informed the researchers of the way the robot would learn in the real world.
“We provide to these algorithms the same information our system gets, to see how they learn to play Jenga at a similar level,” Oller says. “Compared with our approach, these algorithms need to explore orders of magnitude more towers to learn the game.”
Curious as to how their machine-learning approach stacks up against actual human players, the team carried out a few informal trials with several volunteers.
“We saw how many blocks a human was able to extract before the tower fell, and the difference was not that much,” Oller says.
But there is still a way to go if the researchers want to competitively pit their robot against a human player. In addition to physical interactions, Jenga requires strategy, such as extracting just the right block that will make it difficult for an opponent to pull out the next block without toppling the tower.
For now, the team is less interested in developing a robotic Jenga champion, and more focused on applying the robot’s new skills to other application domains.
“There are many tasks that we do with our hands where the feeling of doing it ‘the right way’ comes in the language of forces and tactile cues,” Rodriguez says. “For tasks like these, a similar approach to ours could figure it out.”
This research was supported, in part, by the National Science Foundation through the National Robotics Initiative.

2:00p
Eruption spurs creation of real-time air pollution network

As red molten lava oozed out of Kilauea on the Island of Hawaii (“the Big Island”) in May 2018, destroying houses and property in its path, clouds of ash particles and toxic gases from the volcano — known as vog — filled the air and drifted across the island with the wind.
Even before this most recent phase of the Kilauea eruption, air quality was a major concern for citizens across the island. Researchers from MIT’s Department of Civil and Environmental Engineering (CEE) have worked closely with citizens on Hawaii Island for several years to monitor air quality from the volcano using low-cost sensors. The researchers were even planning to launch a large-scale air quality project funded by the U.S. Environmental Protection Agency (EPA), but the emergency conditions created by Kilauea starting in the spring of last year, and the urgent demands for air pollution data from community groups and state government officials, prompted the MIT researchers to jump into action months before schedule.
“We realized that because we’d been building these instruments for measuring gases and particles relatively quickly and inexpensively, we had the tools to help people in Hawaii understand the quality of the air they were breathing,” says Jesse Kroll, associate professor of civil and environmental engineering and chemical engineering, who leads the air quality research projects across the island with Colette Heald, a CEE professor. “In a period of just about two weeks, we organized this effort in which we built a number of sensor boxes and came over here to Hawaii to try to put them up all over the island.”
Since the researchers had a few sensors on hand, and because time was of the essence, they immediately sent the instruments they had to the Hawaii Department of Health (DoH) before getting to work building the new ones. These sensors were the first to be deployed in the affected zone, as the DoH awaited other air quality monitors from government agencies. The emergency-response initiative was supported entirely by CEE, which provided funds for Kroll and Heald, along with postdoc Ben Crawford and graduate student David Hagan, to purchase supplies to build the air quality sensors and travel to Hawaii to deploy the sensors around the island in May.
“We had been working with MIT for almost two years on developing a project and it was, on our part, to help MIT place monitors and sensors so that they could construct and test a group of sensors that would provide air quality information both back to the university and be set up as a way to inform the public in general,” says Betsy Cole, the director of strategic projects at The Kohala Center, a nonprofit organization that helped put the MIT researchers in contact with citizens, schools, and organizations across the island. Cole notes that an increase in the number of requests for information prompted her to contact Kroll to see if there was anything MIT could do to accelerate the process of providing sensors and measurements for citizens to understand the impact of the eruption — and its lasting impact — on their air.
The MIT sensors can detect and measure sulfur dioxide, which is an irritant and can be toxic in large quantities, as well as particulate matter, including sulfuric acid. “With this eruption, there was some concern about ash coming from the volcano as well. So we can measure that with particulate sensors, too,” Crawford says. The sensors provide real-time air quality data, and the information is published on a website created by the researchers. Currently, the website reflects data from 16 sensors across the Island, and more sensors will be added as the project progresses.
There are many benefits to deploying the MIT sensors in place of larger, more expensive instruments typically used by government agencies. Hagan, the developer of the website and one of the original creators of these sensors, explains, “[our sensors] have a much smaller footprint, so you can put them in more places; they are solar powered, so you can really put them in remote areas, and they communicate wirelessly over a 3G network, so we get all this data remotely in real-time at very high spatial and temporal resolution.”
The design of the sensors makes it feasible for the researchers to install them in many areas across the island, but this required buy-in from local citizens. “When deploying a sensor network like this, where you want to get measurements made throughout a region, it’s really important to interact directly with members of the communities,” Kroll explains. In turn, The Kohala Center established connections with schools and health centers in preparation for the EPA-funded research project, and the researchers were able to leverage these connections early as part of their emergency response project. The locations were strategically selected for their positions as community congregation spaces, and for the educational opportunities afforded by the sensor’s data, as education and outreach are a central facet of the long-term research project.
Crawford explains that, as part of the EPA project, “we’re working with the teachers so they can use the sensors in different ways in their STEM curriculum, to engage with the students about data analysis, environmental science, [and] some programming skills.” He moved to Hawaii in September both to maintain the network and to provide professional development opportunities for teachers.
As Crawford and Hagan installed sensors at different locations shortly after the major eruption in May, teachers and administrators told the researchers about the impact of the eruption on their students, often reporting an increase in absences and, in a few cases, the loss of students’ homes. Steve Hirakami, the principal and founder of the Hawaii Academy of Arts and Science (HAAS), estimated that almost 40 percent of the school’s staff and students had been impacted by the evacuation. “This has a major impact on [our] school,” he said in May, when Kilauea was still active. Hirakami used the MIT sensors to determine school closures and expressed gratitude to the researchers for providing him with the resource.
In the immediate wake of the fissures, Wendy Baker, a history teacher at HAAS, worked with Crawford to install a sensor that the researchers had sent through the mail on the school’s property, even before the researchers arrived on the island. She, too, highlighted the value of the sensors for the peace of mind for the community during the eruption, and also as a teaching tool. “The day that we came back, I pulled it up [on the projector], and we’ve been looking at it every morning, looking at the data and checking the air quality,” she recalled. Baker also explained that the sensor was helpful for connecting the science behind the air quality with what students were experiencing in their everyday lives.
Ted Brattstrom, a high school teacher at Ka‘u High School, was similarly enthusiastic about having a sensor installed at his school.
“The sensors are going to give us two benefits. The first and foremost benefit is, by having this data in one-minute intervals, we’re going to know when we actually have an SO2 event occurring,” he said.
“That lets us keep the kids inside, and in as air conditioned an area and as filtered an area as we can, and then say when it’s safe to go outside,” he explained in May as the sensor at his school was initially installed. “As a science geek for myself and my class, we now get to see how the atmosphere is running, how not only the caldera itself — the volcano itself — is operating and putting out gases, but also how that’s coming downwind, working with the topography of the Island, and getting the [vog] here.”
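Brattstrom’s first use case, flagging an SO2 event from one-minute readings, amounts to simple threshold detection on a time series. The threshold and minimum duration below are made-up illustrative values, not Department of Health guidance or anything from the MIT system:

```python
def so2_events(readings_ppm, threshold_ppm=0.1, min_minutes=5):
    """Flag stretches of one-minute SO2 readings that stay above a
    threshold long enough to count as an 'event'. Returns a list of
    (start_minute, end_minute) pairs, end exclusive. Threshold and
    duration are illustrative assumptions."""
    events, start = [], None
    for minute, value in enumerate(readings_ppm):
        if value >= threshold_ppm:
            if start is None:
                start = minute          # an exceedance run begins
        elif start is not None:
            if minute - start >= min_minutes:
                events.append((start, minute))
            start = None                # run ended; reset
    if start is not None and len(readings_ppm) - start >= min_minutes:
        events.append((start, len(readings_ppm)))
    return events
```

Requiring a minimum duration filters out single-reading spikes, so a school would only be alerted when elevated SO2 actually persists.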
The sensors themselves are rooted in education. They were initially developed as part of the CEE subject 1.091 (Traveling Research Environmental Experiences, or TREX), an annual undergraduate fieldwork project which takes students to Hawaii Island to conduct research over Independent Activities Period. Over the years, the students discovered and worked through the glitches and issues with the sensors, leading to the development of the current iteration. It was thus natural for Kroll and Heald to engage with the EPA on a new project to use the sensors for real-time data but to also have a similar educational component with the schools and health centers.
“The ultimate goal is for each school to have one of these air quality monitors, and by doing that the students get information on the air that they’re breathing, really connecting these abstract concepts of chemistry and of measurements to something they actually know: the vog in the air they’re breathing,” Kroll says of the long-term project. “On top of that, it puts a data set in their hands. We make the data freely available so we can see all these numbers corresponding to concentrations of SO2 and particulate matter, and they can learn how to plot the data, how to analyze it, how to think about it in the larger context of environmental science.”
In early August, as abruptly as it started, the eruption ended. Kilauea is currently the quietest it has been in decades. While the immediate threat has dissipated, and the air quality in Hawaii is better than it has been since the beginning of the eruption in 1983, the network continues to collect and publish valuable data on background pollution levels.
Since installing the sensors, the researchers have collected a unique dataset on the air quality across the island. They are currently analyzing their measurements from the eruption to better understand the atmospheric transport and transformation of vog components. The researchers are also hoping to learn how sensor data relates to — and complements — air quality measurements from other platforms such as monitoring stations and satellites.
“Even though the eruption seems to be over, the network is still running. Right now, we’re measuring very low levels of pollutants, as expected. This is good not only for the local air quality but also for the science: When researching pollution, it’s not often you get to measure what the underlying background levels are,” Kroll says of the ongoing research. “More importantly, we now have the sensor network up, so we'll be ready to measure air quality across the whole island the next time Kilauea erupts.”