MIT Research News' Journal
Friday, July 14th, 2017
12:00p
Ultra-high-contrast digital sensing

Virtually any modern information-capture device — such as a camera, audio recorder, or telephone — has an analog-to-digital converter in it, a circuit that converts the fluctuating voltages of analog signals into strings of ones and zeroes.
Almost all commercial analog-to-digital converters (ADCs), however, have voltage limits. If an incoming signal exceeds that limit, the ADC either cuts it off or flatlines at the maximum voltage. This phenomenon is familiar as the pops and skips of a “clipped” audio signal or as “saturation” in digital images — when, for instance, a sky that looks blue to the naked eye shows up on-camera as a sheet of white.
Last week, at the International Conference on Sampling Theory and Applications, researchers from MIT and the Technical University of Munich presented a technique that they call unlimited sampling, which can accurately digitize signals whose voltage peaks are far beyond an ADC’s voltage limit.
The consequence could be cameras that capture all the gradations of color visible to the human eye, audio that doesn’t skip, and medical and environmental sensors that can handle both long periods of low activity and the sudden signal spikes that are often the events of interest.
The paper’s chief result, however, is theoretical: The researchers establish a lower bound on the rate at which an analog signal with wide voltage fluctuations should be measured, or “sampled,” in order to ensure that it can be accurately digitized. Their work thus extends one of several seminal results from longtime MIT Professor Claude Shannon’s groundbreaking 1948 paper “A Mathematical Theory of Communication”: the so-called Nyquist-Shannon sampling theorem.
Ayush Bhandari, a graduate student in media arts and sciences at MIT, is the first author on the paper, and he’s joined by his thesis advisor, Ramesh Raskar, an associate professor of media arts and sciences, and Felix Krahmer, an assistant professor of mathematics at the Technical University of Munich.
Wraparound
The researchers’ work was inspired by a new type of experimental ADC that captures not the voltage of a signal but its “modulo.” In the case of the new ADCs, the modulo is the remainder produced when the voltage of an analog signal is divided by the ADC’s maximum voltage.
“The idea is very simple,” Bhandari says. “If you have a number that is too big to store in your computer memory, you can take the modulo of the number. The act of taking the modulo is just to store the remainder.”
“The modulo architecture is also called the self-reset ADC,” Bhandari explains. “By self-reset, what it means is that when the voltage crosses some threshold, it resets, which is actually implementing a modulo. The self-reset ADC sensor was proposed in electronic architecture a couple years back, and ADCs that have this capability have been prototyped.”
One of those prototypes was designed to capture information about the firing of neurons in the mouse brain. The baseline voltage across a neuron is relatively low, and the sudden voltage spikes when the neuron fires are much higher. It’s difficult to build a sensor that is sensitive enough to detect the baseline voltage but won’t saturate during spikes.
When a signal exceeds the voltage limit of a self-reset ADC, it’s cut off, and it starts over again at the circuit’s minimum voltage. Similarly, if the signal drops below the circuit’s minimum voltage, it’s reset to the maximum voltage. If the signal’s peak voltage is several times the voltage limit, the signal can thus wrap around on itself again and again.
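To make the wraparound concrete, here is a minimal Python sketch — not the researchers’ code; the threshold M and the test signal are arbitrary values chosen for illustration — comparing how a conventional ADC and a self-reset ADC treat a signal that overshoots the range:

```python
import numpy as np

# Illustrative values only: M stands in for the ADC's maximum voltage.
M = 1.0
t = np.linspace(0, 1, 1000)                    # sample times
x = 2.3 + 3.5 * np.sin(2 * np.pi * 2 * t)      # peaks far above M, dips below 0

clipped = np.clip(x, 0.0, M)    # a conventional ADC saturates at its limits
folded = np.mod(x, M)           # a self-reset ADC wraps around instead

# The folded output always stays inside [0, M), however large the true signal
# gets; each threshold crossing shows up as an abrupt jump in the record.
print("clipped range:", clipped.min(), clipped.max())
print("folded range: ", folded.min(), folded.max())
```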
This poses a problem for digitization. Digitization is the process of sampling an analog signal — essentially, making many discrete measurements of its voltage. The Nyquist-Shannon theorem establishes the number of measurements required to ensure that the signal can be accurately reconstructed.
But existing sampling algorithms assume that the signal varies continuously up and down. If, in fact, the signal from a self-reset ADC is sampled right before it exceeds the maximum, and again right after the circuit resets, it looks to the standard sampling algorithm like a signal whose voltage decreases between the two measurements, rather than one whose voltage increases.
Big mistakes
Bhandari and his colleagues were interested in the theoretical question of how many samples are required to resolve that ambiguity, and the practical question of how to reconstruct the original signal. They found that the number of samples dictated by the Nyquist-Shannon theorem, multiplied by pi and by Euler’s number e, or roughly 8.5, would guarantee faithful reconstruction.
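In symbols — this simply restates the article’s figure, writing f_s for the required sampling rate and 2Ω for the usual Nyquist rate of a signal with bandwidth Ω:

\[
f_s \;\ge\; \pi e \cdot 2\Omega \;\approx\; 8.54 \cdot 2\Omega
\]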
The researchers’ reconstruction algorithm relies on some clever mathematics. In a self-reset ADC, the voltage sampled after a reset is the modulo of the true voltage. Recovering the true voltage is thus a matter of adding some multiple of the ADC’s maximum voltage — call it M — to the sampled value. What that multiple should be, however — M, 2M, 5M, 10M — is unknown.
The most basic principle in calculus is that of the derivative, which provides a formula for calculating the slope of a curve at any given point. In computer science, however, derivatives are often approximated arithmetically. Suppose, for instance, that you have a series of samples from an analog signal. Take the difference between samples 1 and 2, and store it. Then take the difference between samples 2 and 3, and store that, then 3 and 4, and so on. The end result will be a string of values that approximate the derivative of the sampled signal.
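As a sketch of that procedure in Python (the sample values here are made up for illustration):

```python
import numpy as np

samples = np.array([0.0, 0.2, 0.5, 0.9, 1.4])   # hypothetical sampled voltages

# Successive differences -- sample 2 minus sample 1, sample 3 minus sample 2,
# and so on -- serve as a discrete stand-in for the derivative.
diffs = np.diff(samples)
print(diffs)    # [0.2 0.3 0.4 0.5]
```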
The derivative of the true signal fed to a self-reset ADC is thus equal to the derivative of its modulo plus the derivative of a bunch of multiples of the threshold voltage — the Ms, 2Ms, 5Ms, and so on. But the derivative of the M-multiples is itself always a string of M-multiples, because taking the difference between two consecutive M-multiples will always yield another M-multiple.
Now, if you take the modulo of both derivatives, all the M-multiples disappear, since they leave no remainder when divided by M. The modulo of the derivative of the true signal is thus equivalent to the modulo of the derivative of the modulo signal.
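That cancellation is easy to check numerically; continuing the illustrative sketch above, with the same arbitrary M and test signal:

```python
import numpy as np

M = 1.0
t = np.linspace(0, 1, 1000)
x = 2.3 + 3.5 * np.sin(2 * np.pi * 2 * t)   # the "true" signal
folded = np.mod(x, M)                       # what the self-reset ADC records

# Differencing and then taking the modulo gives the same result whether we
# start from the true signal or from its folded version: the unknown
# multiples of M drop out.
print(np.allclose(np.mod(np.diff(x), M), np.mod(np.diff(folded), M)))   # True
```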
Inverting the derivative is also one of the most basic operations in calculus, but deducing the original signal does require adding in an M-multiple whose value has to be inferred. Fortunately, using the wrong M-multiple will yield signal voltages that are wildly implausible. The researchers’ proof of their theoretical result involved an argument about the number of samples necessary to guarantee that the correct M-multiple can be inferred.
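A minimal first-order version of that recovery can be sketched as follows. This is a simplification for illustration, not the researchers’ actual algorithm: it assumes the signal is sampled densely enough that true sample-to-sample changes stay well below M, and it recovers the signal only up to the unknown multiple of M discussed above.

```python
import numpy as np

M = 1.0
t = np.linspace(0, 1, 1000)
x = 2.3 + 3.5 * np.sin(2 * np.pi * 2 * t)   # ground truth, kept only for comparison
folded = np.mod(x, M)                        # the self-reset ADC's output

# Difference the folded samples, then fold each difference back to within
# plus or minus M/2; with dense sampling these match the true differences.
d = np.diff(folded)
d -= M * np.round(d / M)

# Invert the "derivative" by summing the differences back up. The result is
# correct up to a constant multiple of M added to every sample -- the
# constant that the researchers' sampling argument pins down.
recovered = folded[0] + np.concatenate(([0.0], np.cumsum(d)))

offset = recovered - x
print(np.allclose(offset, np.round(offset[0] / M) * M))   # True: off by k * M
```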
“If you have the wrong constant, then the constant has to be wrong by a multiple of M,” Krahmer says. “So if you invert the derivative, that adds up very quickly. One sample will be correct, the next sample will be wrong by M, the next sample will be wrong by 2M, and so on. We need to set the number of samples to make sure that if we have the wrong answer in the previous step, our reconstruction would grow so large that we know it can’t be correct.”
“Unlimited sampling is an intriguing concept that addresses the important and real issue of saturation in analog-to-digital converters,” says Richard Baraniuk, a professor of electrical and computer engineering at Rice University and one of the co-inventors of the single-pixel camera. “It is promising that the computations required to recover the signal from modulo measurements are practical with today’s hardware. Hopefully this concept will spur the development of the kind of sampling hardware needed to make unlimited sampling a reality.”

3:00p
A rose by any other name would smell as yeast

From afar, the multistory fermenters — towering metal cylinders encompassed by scaffolding, ladders, and pipes — look like rockets on a launch pad. Climbing to the top of the fermenter, visitors to the Mexico City manufacturing plant can peer down at a set of paddles churning 50,000 liters of frothy, golden broth. Within the mixture, genetically engineered yeast are synthesizing lactones — a family of molecules responsible for the aromas of fruits and flowers. MIT Department of Biology alumna Emily Havens Greenhagen ’05 has visited the plant over several weeks to monitor her company’s most promising project: a plan to make perfume from yeast cells.
Greenhagen is director of fermentation engineering at Ginkgo Bioworks, a Boston-based synthetic biology company seeking to turn microbes into customizable factories to produce products ranging from pesticides to perfumes. Scientists such as Greenhagen know that microbes can manufacture organic products more efficiently than any machine or production line. Instead of workers at an assembly line, yeast cells use enzymes — proteins that perform a specific task such as removing or adding a group of atoms to a molecule. Working together, teams of enzymes incrementally transform basic nutrients like sugar into other compounds such as alcohol and carbon dioxide. “By inserting DNA encoding for enzymes from different plants or animals into yeast cells,” Greenhagen says, “we can tailor microbes to produce different materials.”
Manipulating yeast cells is an old idea. For hundreds of years, people have been brewing beer through fermentation, a natural process where yeast consume sugar to create alcohol and different flavorings. In the 1930s, scientists began using microbes to create antibiotics and biofuels. “What makes Ginkgo cutting edge is the automation and scale,” Greenhagen explains of the company co-founded in 2008 by synthetic biologist Tom Knight ’69, SM ’79, PhD ’83. “Using robots, we can genetically modify hundreds of yeast strains and test how they perform in just one week.”
Ginkgo Bioworks’ automation played a critical role in creating the yeast strain that Greenhagen observed in the Mexico City manufacturing plant. These microbes, containing a set of enzymes needed to turn fatty acids into lactones, were engineered through thousands of trial-and-error experiments performed by robots. For each enzyme encoded in the final strain, a team of scientists and machines led by Greenhagen had tested hundreds of versions originating from diverse plant species and strategically mutated in different ways to work more efficiently in yeast cells.
During the testing process, researchers grew thousands of yeast strains, each genetically engineered for a single enzyme, in small tabletop fermenters before combining the most effective genetic modifications into a single cell to produce lactones. Ten years ago, when Greenhagen first started in industry, fermenting was a labor-intensive process done by hand, requiring a single researcher to manage four reactors at a time. In contrast, members of Greenhagen’s team — using robots to mix different concentrations of nutrients, add yeast extract, and monitor the output of desired molecules — routinely managed 24 experiments simultaneously. As a result of automation, the development process took six months instead of a year and a half. After a purification process, the developed scent could then be delivered to Robertet, a French fragrance and flavor manufacturer and Ginkgo’s first commercial customer. “The robots don’t just speed up projects,” Greenhagen says, “they take care of the monotonous tasks so that we have more time to think about results and design new experiments — the fun stuff that scientists really want to do.”
Like Ginkgo Bioworks, Greenhagen embodies the convergence of biology and engineering. As a freshman at MIT, Greenhagen had decided to pursue chemical engineering but changed her plans when she learned about synthetic biology. “I couldn’t stop reading my textbook,” she recalls. “I was so excited by the idea that there are living organisms that we can modify to use as tools. That’s when I decided to become a scientist.”
Greenhagen credits one particular microbiology course, taught by Professor Graham Walker, for shaping her thinking as a biologist. While lecturing on organelles, Walker encouraged students to imagine the room as a cell and to point out where the nucleus or mitochondria might be located. “His teaching style inspired me to really visualize what’s going on in the fermentation vessel,” Greenhagen says. “Whereas fermentation engineers worry more about physical variables like oxygen transfer and think of organisms as black boxes, my training at MIT gave me an empathy for microbes. It’s an approach that a lot of chemical engineers don’t necessarily think about and has been key to my success.”
By harnessing biology and engineering, Greenhagen and others in the field of synthetic biology are beginning to transform manufacturing. Many consumer goods, from food to clothes to perfume, are produced using traditional ingredients and industrial processes that are increasingly unsustainable as planetary resources dwindle. By using microbes, companies can generate many of the raw resources they require with less energy and waste than factories.
In the case of flavors and fragrances, companies like Robertet had been extracting the majority of their starting compounds from plants and flowers, whose growth may vary dramatically from year to year. In contrast, Ginkgo Bioworks’ yeast feed on basic crops like corn and sugar cane, which are more economical to raise, and release their products in highly reliable reactions. Encouraged by its success with perfumes, Ginkgo Bioworks is using similar approaches to make organic pesticides, industrial enzymes, and products for human health.
For Greenhagen, scented yeast represent not only the potential of synthetic biology but also the culmination of years of work and education. Before joining Ginkgo Bioworks, Greenhagen had spent six years at another startup without seeing any of her projects commercialized. “When we were bought out by a larger company and taken off the commercialization path, I was told to sit back and relax but that’s not what I do,” she says.
Greenhagen traces her driving mentality back to MIT lab courses and an Undergraduate Research Opportunities Program (UROP) project in Professor Leona Samson’s laboratory. There, Greenhagen says, students were expected to read papers, design their own experiments, and place their findings in the context of published results. “This critical thinking and problem solving combined with real-life experience inspired me to think of my education as a gift that I should put to use,” she says. Years later, having traveled thousands of miles to see her first commercialized product, Greenhagen says she is still moved by MIT. “When I walk through campus, I get goosebumps from all the memories. It is such an amazing place.”

4:15p
3 Questions: The future of the electric utility

Francis O’Sullivan, director of research for the MIT Energy Initiative (MITEI), recently led discussions about the future of the electric grid and clean energy technologies with leaders in industry, government, and academia at MITEI’s Associate Member Symposium. In the wake of the symposium, O’Sullivan reflects on several of its main themes: current trends in the industry, changes in customer behavior, and innovative potential responses to the challenges facing the utility industry today.
Q: There’s been a lot of talk about three current megatrends in energy: decentralization, digitalization, and decarbonization. Can you address briefly what each of these entails, and what’s driving this movement?
A: These three megatrends are deeply connected. First, broadly, people appreciate that decarbonization is critical if we are to address climate change in a meaningful way, and electricity is the sector that can be decarbonized most rapidly. Today, ever-improving economics are driving a secular expansion in the use of clean energy technologies, particularly wind and solar for power generation. Solar is especially important, because as a technology, it’s unique. It can be effectively deployed at any scale, which adds flexibility to how power systems can be designed. It also provides end users with a new option for meeting their individual energy needs. People can choose solar on an individual, house-to-house basis.
In this way, decarbonization is connected to decentralization. It’s not just individual households driving decentralization, either — in fact, commercial and industry users are now in the vanguard of distributed energy adoption. The ambition is to realize a future energy system that is cleaner, more decentralized, and has lower operating costs and higher resiliency.
This is where digitalization comes in. Having these new assets connected to the system is one thing, but you need to be able to control and coordinate them in real time if their efficiency and resiliency potential is to be fully realized.
Q: How do you see electricity customers’ behavior changing, and what does this mean for utilities?
A: Historically, consumers had very little choice in how they got their electricity. Then, starting in the ’90s, the restructuring of the energy industry and the introduction of retail choice meant that consumers gained the ability to choose from whom they bought their electricity. However, the modes of generation were still traditional ones. Today’s improved technology means people have much more choice now in terms of not just who supplies their power, but also how it is generated. There’s a subset of the public that actively seeks that greater choice. They’re interested in the environmental impact of their energy decisions. Cost-effectiveness and added resiliency are also important drivers behind this desire for greater diversity in energy services.
For the first time, we now have avenues for offering electricity customers more choice. Utilities are responding to the fact that consumers want more bespoke solutions. The adoption of smart energy devices like Nest, for example, is indicative of this larger movement towards greater transparency and customer empowerment.
Q: What kind of infrastructure challenges are utilities facing now, and what kinds of emerging technologies are needed to help overcome them?
A: The age of a utility’s infrastructure and the rate of demand growth across the region it covers are normal stresses that will affect any system over the years. More salient right now is the need to put in place the digital infrastructure that will support the effective integration of today’s new generation and storage technologies onto the grid. In addition to offering a pathway to greater resiliency and environmental benefits, a more digitized system has the potential to unlock new commercial value and improve overall welfare if it is used to communicate price signals that are more accurate and more finely resolved in space and time for services up and down the electricity value chain.
There’s pressure on utilities to make these infrastructure improvements, but there’s also a tension with regulators who must ensure that these investments are just and reasonable and for the broad benefit of ratepayers. The truth is, though, that we need this new digitized infrastructure if we wish to fully realize the technical and indeed economic benefits that the power sector’s newly expanded technology toolbox can offer. |