It doesn’t feel like thinking uses all that much energy, does it? If we want to lose weight, we’re told to go for a run, not play chess.
However, evolutionary biologists have pointed out that our current, oversized brains did not evolve until after we had mastered the use of fire to cook our food. That’s because they burn a surprising number of calories. Our nearest primate cousins, with their raw-food diets, simply don’t have enough hours in the day to eat and digest the calories it would take to support a human brain. Cooking means we can extract more calories from the same amount of food, more quickly.
Is the same true of electronic brains, then? Well, there was a great story in the press recently about a swimming pool in Exmouth that is being heated with the waste heat from just a small part of a data centre. A data server the size of a washing machine is enough to heat the pool to 30°C around 60% of the time. So yes, all those transistors may not look like they’re doing much, but they are using a huge amount of energy.
What might this mean for autonomous vehicles? It’s generally taken as a given that most autonomous vehicles will be electric, since that’s the way we’re going. And EVs already have a problem with energy storage. If the vehicle also has to do its own thinking, what might that do to its range?
I should credit the Emissions Analytics blog for first alerting me to this question. Back in 2020 they did a few ‘back of the envelope’ calculations, and came to the conclusion that a fully autonomous vehicle, with multiple sensing and processing systems, might use as much energy to do its sensing and ‘thinking’ as it needed to actually drive the wheels.
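Emissions Analytics’ claim amounts to saying that the compute and sensing load could be comparable to the power needed to turn the wheels. A rough sanity check of that comparison can be sketched in a few lines. Every figure below is an illustrative assumption of mine (a typical EV consumption, an average speed, and a hypothetical full-stack compute draw), not a number from their blog:

```python
# Illustrative back-of-envelope check of the "compute rivals drive power" claim.
# All figures are assumptions chosen for illustration, not measured values.

DRIVE_ENERGY_WH_PER_MILE = 250  # assumed consumption for a mid-size EV
AVERAGE_SPEED_MPH = 30          # assumed mixed urban/suburban average speed

# Average power at the wheels: (Wh/mile) * (miles/hour) = W
drive_power_w = DRIVE_ENERGY_WH_PER_MILE * AVERAGE_SPEED_MPH

# Hypothetical draw of a full sensing and compute stack (lidar, radar,
# cameras, plus the processors fusing them). Estimates circulating around
# 2020 varied widely; a few kilowatts is often quoted at the high end.
compute_power_w = 5000

ratio = compute_power_w / drive_power_w
print(f"Drive power:   {drive_power_w / 1000:.1f} kW")
print(f"Compute power: {compute_power_w / 1000:.1f} kW")
print(f"Compute is {ratio:.0%} of drive power")
```

On those assumptions the drive power works out at 7.5 kW, so a 5 kW compute stack would already be two-thirds of it; at low urban speeds, where drive power falls but compute draw stays constant, the two could plausibly reach parity.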
That frankly astonishing conclusion suggested this might be a bigger problem than most people realise – even if Emissions Analytics’ calculations were a fair way out. At the time I wondered if any other evidence might back this up.
Then last year I went to visit a start-up based at Millbrook called Hypermile AI. They were kind enough to take me around the Millbrook bowl in an HGV tractor unit that was basically being driven by a mobile phone. Their very clever kit works with the truck’s cruise control to anticipate the movement of other vehicles much better than the usual crude adaptive cruise control algorithms, thus achieving fairly impressive fuel savings.
While we were chatting after the demonstration, I asked about the energy use of fully autonomous systems – could they be as energy-hungry as Emissions Analytics had suggested? The reply: ‘absolutely’. That’s one of the reasons the Hypermile system only uses a single camera – as soon as multiple sensing systems need to be integrated into that type of system, the processing power (and associated energy use) increases exponentially. Anecdotally, they told me that many Tesla drivers report that when they are using full autopilot, the range of the vehicle drops by around 25%. (I checked this later, and found the Tesla owner forums are awash with discussions that absolutely back this up.)
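It’s worth working through what a 25% range drop would imply as a continuous power draw. Again, the pack size, baseline consumption and average speed below are illustrative assumptions of mine, not figures from Tesla or the forums:

```python
# What continuous power draw would a 25% range drop imply?
# All figures are illustrative assumptions, not manufacturer data.

BATTERY_KWH = 75        # assumed pack size
BASE_WH_PER_MILE = 250  # assumed consumption without autopilot
SPEED_MPH = 30          # assumed average speed

base_range_miles = BATTERY_KWH * 1000 / BASE_WH_PER_MILE   # 300 miles
reduced_range_miles = base_range_miles * 0.75              # 25% drop: 225 miles

# Consumption per mile with the range reduced, from the same pack.
new_wh_per_mile = BATTERY_KWH * 1000 / reduced_range_miles  # ~333 Wh/mile
extra_wh_per_mile = new_wh_per_mile - BASE_WH_PER_MILE      # ~83 Wh/mile

# Extra energy per mile times miles per hour gives the implied extra watts.
implied_compute_w = extra_wh_per_mile * SPEED_MPH
print(f"Implied continuous extra draw: {implied_compute_w / 1000:.1f} kW")
```

On those assumptions the implied extra draw is about 2.5 kW – squarely in the range quoted for autonomous sensing and compute stacks, which at least makes the anecdote plausible (though some of the drop could equally come from more conservative driving behaviour under autopilot).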
Of course processors are constantly getting more efficient, and system designers will find things to optimise, but the scale of energy use would appear to be too big to ignore. This may be yet another argument for the real benefits of autonomous vehicles being found around Level 4 (where sensing and ‘thinking’ can be streamlined to particular use cases) rather than Level 5, where the AI needed to handle all those open-ended situations may need so much power that it destroys the business case.
Tiny data centre used to heat public swimming pool:
Could vehicle automation make carbon dioxide emissions and air quality worse?