MIT Research News' Journal
 

Monday, November 16th, 2015

    12:00a
    “Shrinking bull’s-eye” algorithm speeds up complex modeling from days to hours

    To work with computational models is to work in a world of unknowns: Models that simulate complex physical processes — from Earth’s changing climate to the performance of hypersonic combustion engines — are staggeringly complex, sometimes incorporating hundreds of parameters, each of which describes a piece of the larger process.

    Parameters are often question marks within their models, their contributions to the whole largely unknown. Estimating the value of each unknown parameter requires plugging in hundreds, if not thousands, of values and running the model each time to narrow in on an accurate value — a computation that can take days, and sometimes weeks.

    Now MIT researchers have developed a new algorithm that vastly reduces the cost of this computation for virtually any computational model. The algorithm may be thought of as a shrinking bull’s-eye that, over several runs of a model, and in combination with some relevant data points, incrementally narrows in on its target: a probability distribution of values for each unknown parameter.

    With this method, the researchers were able to arrive at the same answer as classic computational approaches, but 200 times faster.

    Youssef Marzouk, an associate professor of aeronautics and astronautics, says the algorithm is versatile enough to apply to a wide range of computationally intensive problems.

    “We’re somewhat flexible about the particular application,” Marzouk says. “These models exist in a vast array of fields, from engineering and geophysics to subsurface modeling, very often with unknown parameters. We want to treat the model as a black box and say, ‘Can we accelerate this process in some way?’ That’s what our algorithm does.”

    Marzouk and his colleagues — recent PhD graduate Patrick Conrad, Natesh Pillai from Harvard University, and Aaron Smith from the University of Ottawa — have published their findings this week in the Journal of the American Statistical Association.

    Modeling “Monopoly”

    In working with complicated models involving multiple unknown parameters, computer scientists typically employ a technique called Markov chain Monte Carlo (MCMC) analysis — a statistical sampling method that is often explained in the context of the board game “Monopoly.”

    To plan out a monopoly, you want to know which properties players land on most often — essentially, an unknown parameter. Each space on the board has a probability of being landed on, determined by the rules of the game, the positions of each player, and the roll of two dice. To determine the probability distribution on the board — the range of chances each space has of being landed on — you could roll the dice hundreds of times.

    If you roll the dice enough times, you can get a pretty good idea of where players will most likely land. This, essentially, is how an MCMC analysis works: by running a model over and over, with different inputs, to determine a probability distribution for one unknown parameter. For more complicated models involving multiple unknowns, the same method could take days to weeks to compute an answer.
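    As a rough illustration of this sampling idea (a minimal sketch, not the researchers' code), a basic Metropolis-style MCMC for a single unknown parameter might look like the Python below; the toy forward model, synthetic data, and noise level are all invented for the example.

```python
import numpy as np

# Toy "forward model": predicts 20 observations from one unknown
# parameter. A stand-in for an expensive simulation.
def forward_model(theta):
    return theta * np.linspace(0.0, 1.0, 20)

# Synthetic data generated with a "true" parameter value of 2.5.
rng = np.random.default_rng(0)
data = forward_model(2.5) + rng.normal(0.0, 0.1, size=20)

def log_posterior(theta):
    # Gaussian likelihood (noise std 0.1) with a flat prior on theta.
    residual = data - forward_model(theta)
    return -0.5 * np.sum(residual**2) / 0.1**2

# Metropolis-Hastings: propose a nearby value, then accept or reject it
# based on how well it explains the data.
def metropolis(n_steps, step_size=0.05):
    theta = 0.0
    log_p = log_posterior(theta)
    samples = []
    for _ in range(n_steps):
        proposal = theta + rng.normal(0.0, step_size)
        log_p_new = log_posterior(proposal)  # one full model run per step
        if np.log(rng.uniform()) < log_p_new - log_p:
            theta, log_p = proposal, log_p_new
        samples.append(theta)
    return np.array(samples)

samples = metropolis(5000)
# Discard the first 1,000 steps as burn-in before summarizing.
print("posterior mean:", samples[1000:].mean(), "posterior std:", samples[1000:].std())
```

    Every step of this sampler calls the full model once; when a single run takes minutes or hours, that per-step cost is what makes the analysis stretch into days or weeks.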

    Shrinking bull’s-eye

    With their new algorithm, Marzouk and his colleagues aim to significantly speed up the conventional sampling process.

    “What our algorithm does is short-circuits this model and puts in an approximate model,” Marzouk explains. “It may be orders of magnitude cheaper to evaluate.”

    The algorithm can be applied to any complex model to quickly determine the probability distribution, or the most likely values, for an unknown parameter. Like the MCMC analysis, the algorithm runs a given model with various inputs — though sparingly, as this process can be quite time-consuming. To speed the process up, the algorithm also uses relevant data to help narrow in on approximate values for unknown parameters.

    In the context of “Monopoly,” imagine that the board is essentially a three-dimensional terrain, with each space represented as a peak or valley. The higher a space’s peak, the higher the probability that space is a popular landing spot. To figure out the exact contours of the board — the probability distribution — the algorithm rolls the dice at each turn and alternates between using the computationally expensive model and the approximation. With each roll of the dice, the algorithm refers back to the relevant data and any previous evaluations of the model that have been collected.

    At the beginning of the analysis, the algorithm essentially draws large, vague bull’s-eyes over the board’s entire terrain. After successive runs with either the model or the data, the algorithm’s bull’s-eyes progressively shrink, zeroing in on the peaks in the terrain — the spaces, or values, that are most likely to represent the unknown parameter.
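    The construction in the paper comes with refinement rules that preserve the statistical guarantees Marzouk describes below; purely as an illustration of the alternating idea (and with hypothetical helper names such as surrogate_log_post and refine), a sketch might store every true model run, fit a cheap local approximation to the nearest stored points, and pay for a new true run only occasionally.

```python
import numpy as np

rng = np.random.default_rng(1)

# Expensive "true" log-posterior, standing in for a costly simulation.
def expensive_log_post(theta):
    return -0.5 * (theta - 2.5)**2 / 0.3**2

# Store of points where the expensive model has actually been run.
evaluated_thetas, evaluated_logps = [], []

def refine(theta):
    """Run the expensive model at theta and remember the result."""
    lp = expensive_log_post(theta)
    evaluated_thetas.append(theta)
    evaluated_logps.append(lp)
    return lp

def surrogate_log_post(theta, k=3):
    """Cheap local approximation: fit a quadratic through the k stored
    points nearest to theta (an illustrative scheme, not the paper's)."""
    pts = np.array(evaluated_thetas)
    vals = np.array(evaluated_logps)
    nearest = np.argsort(np.abs(pts - theta))[:k]
    coeffs = np.polyfit(pts[nearest], vals[nearest], deg=2)
    return np.polyval(coeffs, theta)

# Seed the store with a few true runs spread over the parameter range.
for t in np.linspace(-2.0, 6.0, 5):
    refine(t)

theta, log_p = 0.0, surrogate_log_post(0.0)
samples = []
for step in range(5000):
    proposal = theta + rng.normal(0.0, 0.3)
    # Mostly rely on the cheap surrogate; occasionally pay for a true
    # model run, which also improves the approximation near the peak.
    if rng.uniform() < 0.02:
        log_p_new = refine(proposal)
    else:
        log_p_new = surrogate_log_post(proposal)
    if np.log(rng.uniform()) < log_p_new - log_p:
        theta, log_p = proposal, log_p_new
    samples.append(theta)

print("expensive model runs:", len(evaluated_thetas), "out of 5000 MCMC steps")
print("posterior mean estimate:", np.mean(samples[1000:]))
```

    In this toy setting nearly all of the 5,000 steps are answered by the cheap quadratic fit; the published algorithm achieves this kind of saving while rigorously controlling the approximation error, which is what underpins the guarantee Marzouk describes below.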

    “Outside the normal”

    The group tested the algorithm on two relatively complex models, each with a handful of unknown parameters. On average, the algorithm arrived at the same answer as the conventional approach for each model, but 200 times faster.

    “What this means in the long run is, things that you thought were not tractable can now become doable,” Marzouk says. “For an intractable problem, if you had two months and a huge computer, you could get some answer, but you would not necessarily know how accurate it was. Now for the first time, we can say that if you run our algorithm, you can guarantee that you’ll find the right answer, and you might be able to do it in a day. Previously that guarantee was absent.”

    Marzouk and his colleagues have applied the algorithm to a complex model for simulating the movement of sea ice in Antarctica, involving 24 unknown parameters, and found that it arrived at an estimate 60 times faster than current methods. He plans to test the algorithm next on models of combustion systems for supersonic jets.

    “This is a super-expensive model for a very futuristic technology,” Marzouk says. “There might be hundreds of unknown parameters, because you’re operating outside the normal regime. That’s exciting to us.”

    This research was supported, in part, by the Department of Energy.

    11:59p
    Scaling commercial energy efficiency

    Big data may soon make buildings greener. With a recent major acquisition, MIT alumni-founded Retroficiency, which has assessed hundreds of thousands of buildings, is poised to bring its advanced energy analytics platform to millions of commercial buildings.

    According to the Environmental Protection Agency (EPA), 6 million U.S. commercial and industrial buildings account for roughly 45 percent of greenhouse gas emissions and, on average, 30 percent of the energy in those buildings is wasted. The EPA also estimates that improving the energy efficiency of those buildings by just 10 percent could significantly reduce greenhouse gas emissions, by an amount equivalent to taking 30 million vehicles off the road.

    Retroficiency, recently acquired by the energy and sustainability company Ecova, has been tackling that problem with a software data analytics platform that comprehensively analyzes the energy usage in thousands of buildings serviced by a utility company. It then shows the utility specific energy-conservation measures that buildings can take — such as replacing heating, ventilation, and air conditioning (HVAC) units, or even adjusting a thermostat — to cut wasted energy.

    Founded in 2009 by MIT Sloan School of Management alumni Bennett Fisher MBA ’09 and Bryan Long MBA ’09, the startup has since analyzed about 3.5 billion square feet of building space worldwide, identifying about 6 terawatt-hours of energy savings collectively — roughly the energy consumed by 600,000 U.S. homes.

    Now under Ecova, which services nearly 50 utilities and 700,000 commercial sites across the nation, the startup aims to fulfill its long-term mission of drastically curbing building emissions, says Fisher, now Retroficiency’s CEO.

    “We started this company with a very specific mission in mind: To scale energy efficiency in commercial buildings and to help change the world,” Fisher says. “The difference is we now have the platform and resources to go do it. The next few years will be the fulfillment of that mission.”

    Automating audits

    Increasingly, regulators and customers are pressing utilities to find better ways to save energy. Often, companies send auditors from building to building to gauge the efficiency of lights and HVAC units and to offer suggestions for savings, a process that can take weeks and cost thousands of dollars, Fisher says.

    But Retroficiency offers a fully automated approach, which makes it scalable, Fisher says. “Our ability to take in consumption data and analyze buildings without a manual touch point enables us to analyze those buildings in a matter of minutes, and scale cost-effectively to hundreds of thousands of assets,” he says.

    To do so, Retroficiency grabs energy consumption data — such as monthly bills and interval data from meters — from thousands to hundreds of thousands of commercial buildings serviced by a utility company. That data is combined with publicly available information — such as local weather, the age of the building’s equipment, and types of windows — to determine energy use patterns.  

    From that data analytics process, Retroficiency creates unique thermodynamic energy models for each building and can show current energy consumption at the end-use level and propose building-specific measures to improve efficiency.
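    Retroficiency's models themselves are proprietary, but the general flavor of relating metered consumption to outside conditions can be sketched with a simple change-point regression; everything below (the hourly data, the column names, and the 65 F and 55 F balance temperatures) is an assumption made for illustration.

```python
import numpy as np
import pandas as pd

# Hypothetical hourly data for one building: metered consumption in kWh
# and outdoor temperature in degrees F, both synthesized for the example.
hours = pd.date_range("2015-01-01", periods=24 * 365, freq="h")
rng = np.random.default_rng(2)
outdoor_temp = (55 + 20 * np.sin(2 * np.pi * np.arange(len(hours)) / (24 * 365))
                + rng.normal(0, 3, len(hours)))
usage = (40 + 1.5 * np.clip(outdoor_temp - 65, 0, None)
         + 0.8 * np.clip(55 - outdoor_temp, 0, None)
         + rng.normal(0, 2, len(hours)))
df = pd.DataFrame({"kwh": usage, "temp_f": outdoor_temp}, index=hours)

# Simple change-point model: an always-on baseload plus separate cooling
# and heating slopes above/below the assumed balance temperatures.
X = np.column_stack([
    np.ones(len(df)),
    np.clip(df["temp_f"] - 65, 0, None),   # cooling-driven load
    np.clip(55 - df["temp_f"], 0, None),   # heating-driven load
])
coef, *_ = np.linalg.lstsq(X, df["kwh"].values, rcond=None)
baseload, cooling_slope, heating_slope = coef
print(f"baseload ~{baseload:.1f} kWh/h, "
      f"cooling ~{cooling_slope:.2f} kWh/h per degree, "
      f"heating ~{heating_slope:.2f} kWh/h per degree")
```

    A fit like this separates weather-driven heating and cooling loads from the always-on baseload, which is the kind of decomposition that lets an end-use-level picture of consumption be built up remotely.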

    Some buildings, for instance, may leave the lights on overnight or when no one is there, or simply fail to adjust a thermostat when heating or cooling isn’t needed. Retroficiency will inform the utility company of the exact buildings that should implement solutions to those issues, such as adjusting lighting times or thermostat settings.

    The platform can also detect defective or outdated equipment for potential repairs or replacements, says Long, now Retroficiency’s chief technology officer. “We can see how the HVAC system responds to heat and we can tell, generally speaking, if it’s stressed, or undersized, or out of maintenance,” he says. The same goes for inefficient light bulbs that could use an upgrade. All data analytics is done remotely.  
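    As a hypothetical example of that kind of remote screen (not the company's actual analytics), interval data alone is enough to flag nights when a building's load stays well above its usual overnight baseload:

```python
import numpy as np
import pandas as pd

# Hypothetical 15-minute interval data for one building: higher load
# during business hours, a lower overnight baseload, plus noise.
idx = pd.date_range("2015-11-01", periods=4 * 24 * 30, freq="15min")
rng = np.random.default_rng(3)
kwh = np.where((idx.hour >= 7) & (idx.hour < 19), 25.0, 12.0) + rng.normal(0, 1.5, len(idx))
load = pd.Series(kwh, index=idx)

# Average load between midnight and 5 a.m. for each night, compared
# with the building's typical overnight level.
overnight = load[load.index.hour < 5]
nightly_avg = overnight.resample("D").mean()
baseline = nightly_avg.median()
flagged = nightly_avg[nightly_avg > 1.3 * baseline]

print(f"typical overnight load: {baseline:.1f} kWh per interval")
print(f"nights flagged for review (e.g., lights left on): {len(flagged)}")
```

    In practice, a screen like this would run across an entire utility's portfolio of buildings, which is where the scalability Fisher describes comes in.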

    Utility companies can use this information in different ways: through a Retroficiency-powered Web portal, say, that lets customers check on their energy usage and savings opportunities, or by giving the data to account managers at energy service providers who engage with building managers about how to increase efficiency.

    Surprising savings

    Over the years, Retroficiency has uncovered surprising data for building efficiency.

    In a 2014 analysis of 30,000 commercial buildings in New York, for instance, the company predicted that small, seasonal changes in thermostat temperatures — even by just 1 degree — could save $145 million, or about 2 percent of the energy across all buildings studied. Additionally, replacing inefficient windows could save $227 million in energy costs, or about 4.5 percent of the energy.

    That study marked the first in Retroficiency’s Building Genome Project, which aims to map the distinct markers, equipment, and operational characteristics that influence how buildings function in major cities around the world. The company plans to launch similar studies in other cities in the future.

    Energy savings also extend to outdated or replaceable technologies. T12 fluorescent light bulbs, for instance, have supposedly been largely phased out of commercial buildings due to their inefficiency — but that’s not always the case.

    “Someone in industry will say, ‘No one has T12s anywhere,’” Long says. “What we can find is they do exist and those folks, for whatever reason, haven’t done anything. There’s more opportunity [for savings] than people realize — the analytics allows you to figure out what area to investigate first.”

    Combining forces

    Fisher and Long met in 2007 as classmates at MIT Sloan. Fisher had a background in commercial real estate and understood building-inefficiency challenges; Long had spent 10 years as a software engineer at the U.S. Department of Transportation’s Volpe Center, a research and development facility for transportation issues.

    Both came to MIT with entrepreneurship in mind. “The beautiful part of MIT is they bring together some diverse people that are all passionate about similar things — one of which is building a company,” Fisher says.

    Seeking to design energy efficiency technologies, they combined forces to build an initial prototype of the Retroficiency software: Fisher had the industry knowledge; Long had the engineering chops. “It was a lot of Bennett saying, ‘This is what I want,’ and me saying, ‘Yeah, I can do that,’” Long says.

    In their final year at MIT Sloan, Fisher and Long launched Retroficiency with help from many experts and entrepreneurial mentors on campus. Among those were early advisor Sarah Slaughter, a visiting lecturer in civil and environmental engineering and then-director of MIT Sloan’s sustainability initiative; current advisor Shari Loessberg, a senior lecturer of technological innovation, entrepreneurship, and strategic management; and tech-savvy students from the Building Technology group.

    “The [MIT] community has led to some awesome colleagues to help us save the world together,” Fisher says. “Connecting with awesome people to answer the right questions, at the right time, has been an invaluable opportunity of being associated with MIT.”
