A smooth-running electrical grid finely balances resources against need at superhuman speed, day in and day out. Pulling that off in the midst of a massive shift from giant coal and gas plants able to run 24/7 to scattershot arrays of wind turbines and solar panels fluctuating by the minute is a tough problem. But it is exactly the kind of problem that artificial intelligence (AI) should be good at.
Grids are vast, interconnected systems, and they already rely heavily on old-school automation to manage the ever-shifting dance of power generation and consumption among thousands of variables and millions of users. But that centralized, hardwired kind of automation has its limits.
In England, for instance, where I grew up, there has been much hand-wringing over the carbon footprint of the 60 billion nice cups of tea we drink each year. Fierce debates have raged over how much eco-friendlier loose-leaf is than packaged teabags, and over the damage that a milky cuppa does to the climate (thanks to habitually flatulent cows). Often overlooked, however, is the peculiar national phenomenon of “TV pickup.” After a big soccer match, a huge surge in demand saps energy from the grid as people switch on their electric kettles and open their fridges. In 1990, the nail-biting climax of the World Cup semifinal created a record TV pickup of 2.8 gigawatts—more than the output of two nuclear power stations.
Grid operators dread such spikes. When supply fails to match demand, the frequency of the alternating current swerves. In the United States, even a little dip from the standard 60-Hz frequency of electricity can make wired clocks run slow. Severe deviations can damage televisions and crash computers, so utilities typically resort to rolling blackouts in such cases. Earlier this year, for example, a heat wave in South Australia caused people to crank up their air conditioners, forcing network operators to pull the plug on tens of thousands of customers.
Utilities hold extra generators in reserve to react to TV pickup, heat waves, and other predictable surges in demand. But the swiftest response is rarely the greenest. Boil water for tea on a normal day, and you might use clean power from Britain’s offshore wind farms. Boil it after the FA Cup final, and you will likely tap electricity from a much dirtier source, such as “diesel farms” or natural gas–fired “peaker” generators.
The cost for this reliability is measured in both dollars and tons of carbon dioxide. EnerNOC, an energy software company, estimates that 10 percent of all generating capacity in the United States is there just to meet the last 1 percent of demand. “Some gas power stations operate for only 100 to 200 hours a year, but they have to be kept open and staffed,” says Valentin Robu, an assistant professor in smart grid systems at Edinburgh’s Heriot-Watt University. “They’re extremely expensive.” And they are increasingly common, as many countries shift their power mixes to include more wind and solar farms, whose output can vary from minute to minute even on the briskest and sunniest days.
Utilities could get by with fewer backup power plants—reducing costs and emissions simultaneously—if they could flatten the peaks and troughs of electricity demand on a national scale. But to do that efficiently and quickly enough, grid managers would need superhuman abilities to see spikes coming and to coordinate myriad complex adjustments. They also need better ways to store energy cheaply and to push some power uses from surge times to lulls.
Luckily, technology may offer an answer: put AIs in charge of our grids. Algorithms and software are already capable of weighing thousands of variables and making millions of tiny decisions a day. AI is also increasingly able to blend customer preferences with inferences about how customers usually behave—to anticipate when people will adjust their thermostats or switch on their washing machines, and even to do it for them. Some see in this technology the potential for a revolution in the way we organize the generation and delivery of energy.
A revolution of this kind is badly needed. The demand for energy—and electricity in particular—is all but certain to rise substantially throughout the first half of this century, even as carbon emissions must decline precipitously to avoid accelerating global warming even further. Researchers now think that handing over control of electricity grids to AIs could curb our carbon footprint without destroying our way of life. “Timing is critical,” says Robu. “This is exactly where we need AI.”
How AI Can Suck Carbon Out of Design & Manufacturing
Experts project that by 2050 emissions from aviation will consume about a quarter of the world’s remaining carbon budget. AI offers a way to slash airplanes’ weight—and do the same for their emissions.
Aircraft manufacturers are now experimenting with generative design, which mimics natural evolution in the way it arrives at an optimal plan for a part. The process starts with engineers entering design goals for an engine component or a wing spar into software. Given the specified materials and structural requirements, the AI software then quickly generates many alternative designs. It simulates the performance of each candidate and calculates its weight and cost. Equally important, the AI learns from each iteration which aspects of each design work and which don’t. In the time it takes a human designer to come up with one idea, an AI can spin through thousands to home in on the optimal solution.
The results often look bizarre—organic shapes that bend and gape like skeletons or ancient trees—but the cost and weight savings can be remarkable. An aircraft part that Airbus designed last year using AI was 45 percent lighter than the best a human could manage.
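In spirit (though certainly not in Airbus’s actual tooling, which is far more sophisticated), the generate-simulate-learn loop resembles a hill-climbing search. In this minimal sketch, a “design” is reduced to a vector of strut thicknesses, and the strength and weight functions are invented stand-ins for a real structural simulation:

```python
import random

STRUTS = 8
MIN_STRENGTH = 2.0  # hypothetical structural requirement

def strength(design):
    # Stand-in for a physics simulation: strength grows with material used.
    return sum(design) / 2

def weight(design):
    return sum(design)

def mutate(design):
    # Tweak one strut's thickness slightly, never below a minimum gauge.
    i = random.randrange(len(design))
    new = design[:]
    new[i] = max(0.1, new[i] + random.uniform(-0.2, 0.2))
    return new

def generative_search(iterations=5000, seed=42):
    random.seed(seed)
    best = [1.0] * STRUTS  # naive starting design, weight 8.0
    for _ in range(iterations):
        candidate = mutate(best)
        # Keep the lighter design, but only if it still meets spec.
        if strength(candidate) >= MIN_STRENGTH and weight(candidate) < weight(best):
            best = candidate
    return best

best = generative_search()
print(round(weight(best), 2))  # far lighter than the 8.0 starting weight
```

The search sheds weight until the strength constraint binds, which is why generative designs tend to end up with material only where the physics demands it.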
The search for greener ways to meet surges in demand has led utilities to try some creative solutions. One is to build what amount to giant capacitors: systems that can fill up gradually with energy during quiet periods and then discharge it quickly when demand spikes. Where geography allows, power stations can pump water uphill into dams (or compressed air down into caverns), then discharge it to spin generator turbines at tea time. Other storage options include massive flywheels and even giant versions of the lithium-ion batteries that power your laptop. But the giant scale needed to smooth out the tallest peaks of demand poses tough economic and engineering challenges for all these approaches.
A smarter idea, Robu says, is to leverage high-capacity batteries that consumers are already connecting to the grid, such as those in fancy new electric vehicles (EVs) or in some houses equipped with rooftop solar panels. “First of all, you coordinate charging so you don’t charge cars at all when there’s high demand,” he says. “Then you could support the grid by partially discharging their batteries at these critical times. Of course, people who own these EVs would get paid for that.” The trick is getting everything to work together seamlessly. No one wants to finish their morning cuppa and head out to work, only to find that their EV’s battery is flat.
Enter artificial intelligence. AI is a blanket term for computer software that mimics a few of the smarter things that humans can do, including learning, reasoning, pattern recognition, and problem solving. AI can already surpass humans at selected tasks, such as ingesting mountains of data from millions of devices (such as car batteries) and quickly figuring out the most efficient way to charge and discharge them. At other jobs, AI is less capable than an infant. Computers still struggle to understand the simplest social relationships, for example, and they often come to ridiculous conclusions when they try to make deductions that require common-sense knowledge about the world.
Where machine learning has succeeded most impressively—such as in recognizing faces or responding helpfully to certain kinds of spoken commands—it works not because the computer has learned to do the task the way a human would, but because the software continually changes its own processes to reach the desired outcome. With AI based on this kind of machine learning, the more data it ingests, and the more widely and often it is used, the more accurate it becomes.
While Amazon, Google, and other tech companies were hoovering up AI experts to make smarter speakers and self-driving cars, Robu left his native Romania to work as a researcher at Harvard, Microsoft, and several European universities with the hope of putting AI into the grid. He envisions a scheme, based on game theory, in which an AI absorbs the preferences of thousands of EV owners, matches them each second to the needs of the grid, and prices everything fairly. Some owners may insist that their car battery always remain at least 75 percent charged. Others may be willing to let it drop lower overnight, as long as it is full in the morning. “The system will find the best time to charge and discharge, and owners would get some share of the payment,” Robu explains. “People who are more patient are likely to be paid better because the system has more flexibility in when to charge.”
Such vehicle-to-grid systems sound great, but repeatedly charging and discharging lithium-ion batteries reduces their capacity. In 2015, even the forward-looking chief technology officer of Tesla said that vehicle-to-grid is “something that I don’t see being a very economic or viable solution—perhaps ever, but certainly not in the near term.” Future improvements in battery technology could change that calculus. In the meantime, there are other options.
AI could give new life to an older, simpler idea: encouraging people to shift some electricity use to quieter times. Utilities call this approach demand response, and they have already signed up many large industrial customers, such as aluminum smelters, to reduce demand when necessary in return for discounted prices. But for demand response to put peaker plants out of business, it must gain far wider adoption. In high-income countries, homes and commercial users together consume the lion’s share of electricity—40 percent in the United States, compared to about 20 percent in China.
Utilities have enticed household customers to help them manage demand before. When I was growing up in Britain in the 1980s, many homes had thermal-storage heaters—basically ceramic bricks warmed by electric heating elements—that operated only during the off-peak hours overnight. Some of my friends woke up toasty warm every winter morning but crawled into chilly beds each evening. Given such trade-offs in comfort, the hassle of needing a special electricity meter, and the gradually shrinking fraction of household spending going toward utility bills, the scheme never really caught on.
How AI Can Suck Carbon Out of Traffic
Every minute your car sits idling, it pumps CO2 out the tailpipe. Now a startup, Rapid Flow Technologies, is trying to use AI to ease congestion in cities. The company employs a system known as Surtrac that was developed at Carnegie Mellon University to bring more intelligence to traffic lights. Radar sensors and cameras monitor car flow and wait times at intersections. The AI then adjusts the timing of the lights to move vehicles through as efficiently as possible.
Although the AI at each intersection works individually (to prevent mass outages), the smart systems can share data with others nearby. A pilot test at nine intersections in Pittsburgh reduced average travel times by one-fourth and average wait times by 40 percent. Surtrac systems are now running at 50 intersections in the city, with plans to expand to 150 more in the next three years. Where the smart lights are in place, travel times have dropped by 25 percent, braking by 30 percent, and idling by more than 40 percent.
Meanwhile, Google is putting AI to work on the problem of parking. Cars hunting for a space can account for one-third of all traffic in the most congested downtown areas—wasting time, burning fuel, and spewing greenhouse gases. Google Maps for Android phones can now predict parking availability near a destination in 25 US cities. Hopefully, the app will persuade drivers to park and ride instead.
But Robu and others believe that technology has now evolved to the point that a demand-response system could appeal to enough consumers, and cover enough appliances, to reach the necessary scale. Beyond just remotely controlling a few megawatt-munching smelters, this system would reach into thousands of homes during peak events and switch off, say, all the tumble dryers for an hour until demand subsides.
It’s a tempting idea, but it faces several hurdles. The first challenge is understanding what you can switch off—and when. The low-hanging fruit is power-hungry appliances such as ovens, fridges, washing machines, and clothes dryers. The system will need to draw on a database of both user preferences and appliance specifications to know whether it is acceptable to turn off a dryer at any point in its cycle, or whether a washing machine should be shut down only after it has drained. Some refrigerators and freezers could be driven to cooler-than-usual temperatures during the day and then turned off at peak demand—but only for so long, to avoid spoiling food inside.
Using AI is essential, Robu argues, because there’s no way humans could look at all the relevant data. “Once you instrument a fridge, you can sample it every five minutes and get a lot of information from it—and you might have 10,000 fridges,” he says. Software is also the only way to coordinate such a diverse collection of machines, some available for only brief windows, so that too many don’t turn back on at the same time and create a rebound peak.
Tech companies are already selling some of the many components needed to translate this vision into action. Smart meters now wirelessly transmit data on home electricity use to the cloud and receive signals in response. AI-enabled energy monitors pinpoint appliances in use by sensing their distinctive power signatures. Many of the latest domestic appliances offer remote control by smart-home software. Wi-Fi-networked outlet adapters add energy monitoring to older devices and put them under remote control, even by voice-activated gadgets such as Amazon’s Echo. And Robu and other researchers are developing AI-powered software to manage the whole process.
Reimagining a country’s entire electricity grid, daunting as that seems, is necessary but not sufficient. To make demand response work on a massive scale, engineers also have to surmount an even bigger obstacle: people.
Not many people today care enough about cutting their household emissions or energy bills to opt into conventional demand-response systems, says Long Tran-Thanh, who has been studying possible solutions to that thorny problem. Tran-Thanh’s background allows him to see a bigger picture. Born in Vietnam, he moved with his family to Hungary when he was just seven years old. He went to university in Budapest before relocating again to Britain, where he is now an assistant professor in AI at the University of Southampton. “People here [in the UK] are aware of the disadvantages of high energy usage, and they are quite open to new technologies and new ways of thinking,” he says. “But in Hungary, they don’t have this kind of mindset yet.” Though popular thinking is slowly shifting worldwide, even in Britain, he notes, “You can offer someone a very good plan to improve [energy] usage, but they’re not really interested in changing their habits. That’s why they’re called habits.”
Some have tried the obvious: offering customers financial incentives to adjust their behavior. “We thought this would be very useful, but in practice it’s not,” Tran-Thanh says. “The total amount of savings you can offer is £20–£30 ($25–35) per year. If you show someone annual savings of £20, they just laugh.”
How AI Can Suck Carbon Out of Data Centers
If you want to be really green, ease up on social media. Data centers consume about 2 percent of all the electricity in the US, and that share grows every year. Most of that energy is dedicated to actual computation—serving up all those ads and streaming video—but around one-third is spent on cooling the servers to prevent them from overheating.
Last year, at one of its data centers, Google set loose on the pumps, chillers, and cooling towers AI software developed by its DeepMind subsidiary. The AI spent several months observing thousands of sensors within the center and learning the complex, non-linear interactions among the cooling devices. It then took control and was able to consistently achieve a 40 percent reduction in the amount of energy used for cooling. Google now plans to roll out the algorithms to its other data centers—and share the technology so other tech firms can reduce their carbon footprints, too.
Photo: Insulated pipes running through a Google data center
Although any individual’s savings are likely to be small, the potential benefits of demand response add up when multiplied across large populations. The US Energy Information Administration calculates that demand response could save a typical American household $40 and 100 kilowatt hours annually. If adopted nationwide, the practice could cut $5 billion a year from power bills and nearly 9 million metric tons of carbon dioxide from greenhouse gas emissions. Even more impressively, according to 2014 calculations by Alexander Smith, an energy researcher then at the Georgia Institute of Technology, by 2040 demand response could save the US up to $28 billion in infrastructure costs and avoid construction of 150 gigawatts’ worth of power stations, most of them fossil-fueled.
Tran-Thanh has now turned to AI in the hope of overcoming the people problem. He is developing a system called interactive demand response (IDR) that can realize the financial and environmental gains of demand response without asking people to do their laundry or drink tea in the middle of the night.
IDR teaches machines to understand how people typically use electricity. Tran-Thanh and his colleagues at the University of Southampton developed algorithms to extract patterns from a large dataset on household energy use that was assembled at M.I.T. The AI learns in a surprisingly similar way to humans, exploiting a software technique known as a neural network. No programmer decides in advance how to mathematically translate various kinds of consumption data into distinct demand profiles.
Instead, developers train the neural network by showing it examples. Gradually, it learns by experience—much like a child learning to distinguish pictures of cats from those of dogs. The more examples it sees, the more accurate its output. Young children soon understand that images of dogs and cats represent animals that can move and make noises. But the neural network cannot understand how what it has learned corresponds to the real world, and that proved to be an important limitation.
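As a toy illustration of training-by-example (not the Southampton team’s actual model), a single artificial neuron can be shown labeled snippets of household consumption and adjust its own weights until its guesses match the labels. No classification rule is programmed in advance; the usage figures below are invented:

```python
# Each example: three hourly consumption readings (kWh) and a label,
# 0 for a "morning-peak" household, 1 for an "evening-peak" one.
examples = [
    ([3.0, 0.5, 0.4], 0),
    ([2.5, 0.6, 0.3], 0),
    ([0.4, 0.5, 3.2], 1),
    ([0.3, 0.8, 2.7], 1),
]

weights = [0.0, 0.0, 0.0]
bias = 0.0
rate = 0.1

for _ in range(50):  # a few passes over the data
    for x, label in examples:
        activation = sum(w * xi for w, xi in zip(weights, x)) + bias
        guess = 1 if activation > 0 else 0
        error = label - guess  # -1, 0, or +1
        # Nudge the weights toward whatever reduces the error.
        weights = [w + rate * error * xi for w, xi in zip(weights, x)]
        bias += rate * error

# The trained neuron generalizes to a household it has never seen.
unseen = [0.5, 0.4, 3.0]
activation = sum(w * xi for w, xi in zip(weights, unseen)) + bias
print("evening-peak" if activation > 0 else "morning-peak")
```

Real demand-profiling networks have many layers and far richer inputs, but the principle is the same: the mapping from data to label emerges from repeated correction, not from a programmer’s rules.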
The Southampton team’s AI system was able to turn electricity consumption data from any household into a reliable prediction of that home’s future demand. But homeowners hated it.
The system failed, Tran-Thanh realized, because it was too passive. “Machine learning collects the current data and learns patterns, but it cannot infer possibilities,” he says. “You can see only what the user is doing. You don’t know whether he is willing to deviate from that behavior.” To find out which activities users were comfortable shifting and which were sacrosanct, the software would have to ask them.
“But asking questions is also a problem,” Tran-Thanh found. “Every time you ask a question or require a user to interact with the system, it’s an annoyance—what we call the ‘bother cost.’ If you ask too many questions, a percentage of people will just not use it anymore.”
His solution to this weakness of AI was . . . more AI. The team folded into IDR a powerful sequential decision-making process called Pandora’s Rule to optimize the number of user interactions and minimize the bother cost. Pandora’s Rule is a mathematical model tailored for scenarios, such as house- or job-hunting, where you must decide when to stop looking and just make a choice. Adding that model to IDR enabled the system to track the accumulating bother cost and do a quick calculation before asking each user about their preferences, which are gathered via an iPad app. If the risk of asking another question outweighs the energy-saving benefits likely to be obtained, IDR quits while it’s ahead.
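The stop-when-not-worth-it logic can be caricatured in a few lines. This is a drastic simplification of Pandora’s Rule, and the savings estimates and bother cost are invented; the point is only the comparison that decides whether another question is worth asking:

```python
def questions_to_ask(expected_gains, bother_cost):
    """Ask questions in order until the next one isn't worth the bother."""
    asked = []
    for gain in expected_gains:
        if gain <= bother_cost:
            break  # quitting while ahead
        asked.append(gain)
    return asked

# Each successive question about preferences yields diminishing savings.
gains = [5.0, 2.5, 1.2, 0.6, 0.3]  # estimated value per answer, arbitrary units
asked = questions_to_ask(gains, bother_cost=1.0)
print(len(asked))  # 3 -- the fourth question would annoy more than it gains
```

The real system’s estimates are uncertain and updated as answers come in, which is why a sequential decision model is needed rather than a fixed list, but the trade-off it weighs at each step is exactly this one.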
Tran-Thanh used a reserved portion of the M.I.T. consumption data (data that had not been used for training) to simulate human responses and compare the performance of the AI-enhanced system to that of previous, less “intelligent” algorithms. His conclusion: IDR reliably generated 35 percent greater financial savings for users.
But would these savings be reflected in actual use, or would yet another unexpected problem rear its head? Testing IDR in a community is tricky. “It’s hard to wire up old houses, and it’s not that efficient,” Tran-Thanh says. “It’s easier to equip new-build homes—but it might not be cheap.” A field trial would have to be large enough to generate good statistics on the benefits (and problems) of IDR, yet small enough to remain affordable.
Luckily for Tran-Thanh, just such a development is taking shape now in the Netherlands: a remarkable new community built as a floating village of the kind that might become more common along coastlines as sea levels rise.
How AI Can Suck Carbon Out of Factories
Systems do not get much more complicated and interconnected than factories filled with precisely choreographed robots and assembly lines. General Electric thinks that AI has a role to play here, in what it calls “brilliant manufacturing.”
An AI manager inhales data from supply chains, design teams, production lines, and quality-control checkpoints. The AI can then order new supplies just before they run out, monitor problems on the production line, and minimize electricity usage. Procter & Gamble says that its use of GE’s system has helped the company cut unplanned downtime by 10–20 percent. Because idle factories still run heating, ventilation, and lighting—accounting for around one-third of the entire electricity bill for a car assembly plant—less downtime translates directly into reduced emissions.
Rising from a canal in a heavily polluted industrial quarter of Amsterdam, the Schoonschip (“clean ship” in Dutch) project is building a floating neighborhood of 46 new homes connected by dock-like sidewalks. More than 100 residents will soon move into permanently anchored houses made from recycled materials and topped with plant-covered or translucent roofs to capture rain and sunshine. Water will be filtered and recirculated. Electricity generated by 500 solar panels will run 30 heat pumps to provide hot water and climate control.
As a tightly integrated, sustainable community, Schoonschip could be an ideal testing ground for AI-enabled, smart-grid technology. Last year, the builders linked up with Grid-Friends, a demand-response research group led by Michael Kaisers at Centrum Wiskunde & Informatica (CWI), the Netherlands’ national research institute for mathematics and computer science.
“I want to do new and original work in smart grids, but I want it to be feasible,” Kaisers tells me on a Skype video call from the Dutch capital. “I’ve seen a number of research projects reach for a very innovative idea that either is nowhere close to the current situation or has not identified who would actually use it,” he says. Schoonschip seems to have avoided those pitfalls. Oversubscribed and with construction about to begin, the community aims to use an AI grid from the outset to achieve zero net carbon emissions, measured over the year. Although Schoonschip homes will draw power in the winter from the Dutch national grid, the community’s solar panels will generate excess power in summer months, and that clean energy will flow onshore for others in Amsterdam to use.
This kind of “net metering” is now familiar to homeowners who have rooftop solar panels and pay their utility only for the difference between the power they use and the power they generate. What is unique about Schoonschip is that the entire community will share a single connection to the national grid, with just one electricity meter. Behind that meter, Schoonschip will run its own mini smart grid, tying together the solar arrays and batteries of each house and intelligently coordinating the flow of energy from home to home.
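The single-meter arrangement is easy to picture as arithmetic: per-home surpluses and deficits net out inside the community, and only the total crosses the one grid connection. The kilowatt-hour figures below are invented for illustration:

```python
# Per-home generation and use over some interval (invented kWh figures).
homes = {
    "home_a": {"generated": 6.0, "used": 4.0},
    "home_b": {"generated": 1.0, "used": 5.0},
    "home_c": {"generated": 3.0, "used": 2.5},
}

# Surplus homes implicitly supply deficit homes behind the meter.
net_per_home = {name: v["generated"] - v["used"] for name, v in homes.items()}
community_net = sum(net_per_home.values())

print(net_per_home)
# Negative community total: the village imports; positive: it exports.
print(community_net)  # -1.5 -> the community draws 1.5 kWh from the grid
```

The hard part, of course, is not this arithmetic but fairly settling accounts among the homes behind the meter, which is where the AI-coordinated internal grid comes in.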
Even a grid of just 46 homes quickly gets complex. With no central authority, every homeowner must decide for themselves how much to invest in equipping their house to conserve energy or make its own. “There were long discussions about whether to build bigger terraces or leave more room on the sunny side for harvesting renewable energy,” Kaisers says. “In the coordination mechanisms for the automated intelligent control, we will have to take into account the people who made a sacrifice for an additional square meter for solar.” Even in this communal village, everyone will pay their fair share for the electricity used.
AI will be baked into Schoonschip’s grid from the very start. Kaisers’s team is building algorithmic AI “agents” to represent the users in demand-response negotiations among households. The algorithms will base their actions on predictions generated by learning each home’s generation and consumption patterns. The system will also use tablet computers to collect homeowners’ preferences, such as the minimum charge they want to keep in their batteries at various times of day. “AI agents can be better than humans in reaching good win-win outcomes, making sure that households don’t rip each other off but instead collaborate in a positive way,” says Kaisers. “Plus, lots of people hate haggling.”
AI also doesn’t mind sweating the details. The grid will have to make multiple split-second decisions and weigh tiny barters of solar power and battery space that would just annoy a person. “Human time is very expensive,” Kaisers says. Schoonschip will probably use Long Tran-Thanh’s research to work out the optimal times to ask for users’ input; Kaisers suspects owners will be asked no more than five questions a week.
By next summer, Schoonschip should be populated enough to turn off its connection to the national grid for a few weeks, to see how well the AI can run things. But that is just the start. “The big utility companies make good money the way things are going right now. They need an incentive to move, and that incentive is going to be self-sufficient local entities such as Schoonschip cutting into their business. These residential communities could be pivotal entities to put the energy transition into practice,” Kaisers says.
Whether that happens could hinge in part on economics. An AI-controlled smart grid comes at a price above that of the solar panels and heat pumps alone. Each house will likely have an extra smart meter or two, plus computers and sensors. Kaisers expects the bill to come in at less than $500 per household, with a small annual fee for the AI management software. These prices will likely fall as the technology improves—and by locating many of its innovations behind the meter, Schoonschip is not relying on conservative utility companies to drive innovation.
Schoonschip might seem like an insignificant thumb in the dike of climate change, but this bottom-up effort is being watched around the world. Nothing stops environmentally conscious towns in the US from following Schoonschip’s model. A community in Germany is already planning to roll out CWI’s Grid-Friends technology. And if the system can be proven in cloudy northern climes, it could be applied to even greater effect in poorer, sunnier parts of the world where existing power generation is more carbon-intensive. A 2010 McKinsey report noted that 70 to 80 percent of the India of 2030 is yet to be built. With cities and electricity grids that are still unrealized, developing nations—especially software-savvy ones like India—are well positioned to put lessons learned from pilot tests of AI into action.
Long Tran-Thanh sees great potential for the land where he was born. “Vietnam is still struggling with its basic electricity supply,” he says. “It still has daily planned blackouts because it cannot provide electricity to the whole country. But I know some researchers there have already started talking about smart grids. It’s at a very theoretical level only, but smart grids can definitely help Vietnam. We have the tools. We have the AI.”
Huge challenges remain, the biggest of which is the mind-boggling cost of retrofitting a planet full of dumb houses with the smart sensors and switches needed to let AIs take control. But if the past 30 years have taught us anything, it is that hardware gets smaller and cheaper, and software more powerful, by the month.
The beauty of adding artificial smarts to our grids is that it saves us hidebound, stick-in-the-mud humans from having to change our ways. Our attention spans are short. Our wills are weak. But software that can tirelessly observe and subtly intervene in our daily lives promises to achieve what decades of nagging have not: a meaningful reduction in our energy use that still lets us enjoy a hot cup of tea after every soccer game.
Mark Harris is an investigative science and technology reporter originally from the UK but now based in Seattle, with a particular interest in robotics, transportation, green technologies, and medical devices. He is a contributing editor at IEEE Spectrum and writes for a wide range of outlets including The Economist, The Guardian, and Wired.