In one of my favorite movies of all time, The Matrix, humans become the power source that keeps the machines alive.
Elon Musk must have watched that movie recently, because he just pitched a similar idea. Except he wants idle machines to power the future of intelligence, not the other way around.
On Tesla's recent third-quarter earnings call, Musk floated this wild idea:
Actually, one of the things I thought, if we've got all these cars that maybe are bored, while they're sort of, if they're bored, we could actually have a giant distributed inference fleet and say, if they're not actively driving, let's just have a giant distributed inference fleet.
Translation: every idle Tesla could soon act as a node in a massive AI network. Tens of millions of parked cars, thinking together.
But how would Elon's mobile supercomputer work?
That's where things get really interesting…
A Fleet That Thinks
Estimates vary, but as of 2024, there were around 5 million Teslas on the road worldwide.
Elon Musk has much bigger plans, predicting the fleet might eventually total 100 million cars.
Here's what he said during Tesla's recent earnings call:
At some point, if you've got tens of millions of cars in the fleet, or maybe at some point 100 million cars in the fleet, and let's say they had at that point, I don't know, a kilowatt of inference capability, of high-performance inference capability, that's 100 gigawatts of inference distributed with power and cooling taken, with cooling and power conversion taken care of. That seems like a pretty significant asset.
In other words, 100 million Teslas, each capable of about one kilowatt of high-performance inference.
That works out to roughly 100 gigawatts of compute power.
To put that in perspective, 100 gigawatts is close to the combined output of 100 nuclear reactors, or enough electricity to power 75 million U.S. homes.
A single hyperscale data center from Amazon Web Services or Google Cloud can draw 50 to 100 megawatts of power. You'd need around 1,000 of those to match Musk's theoretical 100-gigawatt network.
And all that potential computing power would already be built, paid for and sitting in driveways.
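The fleet math is easy to sanity-check. Here's a quick back-of-the-envelope calculation using the figures from Musk's quote, with a hyperscale data center pegged at the high end of 100 megawatts:

```python
# Back-of-the-envelope check of Musk's fleet-compute math.
fleet_size = 100_000_000           # cars, Musk's long-term target
inference_per_car_kw = 1           # kW of high-performance inference per car

# Convert total kilowatts to gigawatts (1 GW = 1,000,000 kW).
total_gw = fleet_size * inference_per_car_kw / 1_000_000
print(total_gw)                    # → 100.0 (gigawatts)

# Compare with a hyperscale data center drawing ~100 MW.
datacenter_mw = 100
equivalent_datacenters = total_gw * 1_000 / datacenter_mw
print(equivalent_datacenters)      # → 1000.0
```

The numbers line up: 100 million cars at one kilowatt each is 100 gigawatts, or about 1,000 of today's largest data centers.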

Image: Tesla
Tesla's full-self-driving computer, known as Hardware 4, is designed to approach the kind of performance seen in high-end data center chips.
And a next-generation system called AI5 is in development that could deliver several times more processing power, giving every Tesla the kind of onboard compute once reserved for data centers.
What's more, every car already contains a high-performance processor and power system capable of running complex AI tasks. Each one has a built-in thermal-management system that keeps chips cool and batteries balanced. And every vehicle is connected to Tesla's cloud through the same over-the-air update network that pushes new software and maps.
The difference is that, unlike a server rack, these systems spend most of their time doing nothing. Because the average car sits parked 95% of the day.
So Musk's pitch is simple. Let's put those idle processors to work.
If you could borrow a little energy and compute from every parked Tesla, you could form a global computing grid that would make today's cloud networks look centralized and inefficient by comparison.
Need to run an image-recognition model, simulate an autonomous-driving scenario or process video data?
Tesla could parcel those jobs out across millions of cars overnight.
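What might that parceling look like? Here's a minimal, purely hypothetical sketch of a scheduler that hands inference jobs only to cars that are parked and sufficiently charged. All of the names and thresholds are illustrative; Tesla has not published any API for fleet compute:

```python
# Hypothetical sketch: round-robin inference jobs across idle fleet nodes.
# Nothing here reflects an actual Tesla interface.

def eligible(car):
    """A car can take work only if parked with a healthy charge."""
    return car["parked"] and car["battery_pct"] >= 50

def assign_jobs(cars, jobs):
    """Spread jobs round-robin across eligible cars; returns {car_id: [jobs]}."""
    nodes = [c for c in cars if eligible(c)]
    if not nodes:
        return {}
    assignments = {c["id"]: [] for c in nodes}
    for i, job in enumerate(jobs):
        node = nodes[i % len(nodes)]
        assignments[node["id"]].append(job)
    return assignments

fleet = [
    {"id": "car-1", "parked": True,  "battery_pct": 80},
    {"id": "car-2", "parked": False, "battery_pct": 90},  # driving: skipped
    {"id": "car-3", "parked": True,  "battery_pct": 30},  # low battery: skipped
    {"id": "car-4", "parked": True,  "battery_pct": 65},
]
jobs = ["classify-image-{}".format(n) for n in range(5)]
print(assign_jobs(fleet, jobs))
```

Only the two eligible cars receive work, which captures the core constraint of the idea: the network's capacity rises and falls with how many cars happen to be parked and charged at any moment.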
This would give Tesla a potential moat that no other automaker or cloud company could easily match.
After all, GM and Ford don't have proprietary chips like the AI5 in their cars. And Amazon doesn't have 5 million connected vehicles plugged into its cloud.
It would also help shift AI from centralized supercomputers to distributed inference. That's the same kind of edge-computing model that powers smartphones, drones and industrial robots today.
Because in this scenario, the network wouldn't need to exist in one central place.
It would live wherever a Tesla is parked.
Here's My Take
If Musk can actually execute on this wild idea, Tesla's fleet could rival the largest AI compute clusters on Earth.
But there are hurdles to clear before it becomes reality.
Running inference jobs on vehicle batteries could shorten their lifespan if they aren't managed carefully.
Some owners might refuse to let their car be used for Tesla's compute work, even if they're compensated. And data-privacy laws in Europe and California would require consent and transparency.
But Tesla already has experience orchestrating massive distributed systems. Every time it updates Autopilot or trains new vision models, it collects and processes video data from millions of cars worldwide.
The difference here is that Musk would want the Tesla fleet not just to train AI, but to run it.
In this future, Tesla's cars would stop being just vehicles and start acting as mobile computing assets. Owners might opt in through software, allowing their vehicles to rent out compute cycles while parked and earning credits or cash in return.
For Tesla, it would be an entirely new revenue stream layered on top of the existing fleet. And like Musk's robotaxi business, it would scale automatically.
Because every new car sold would grow the network's computing power.
It's a radical idea. And it would represent a radical shift for the company. If Tesla can pull it off, Musk could end up running the world's most powerful, most distributed AI network…
Without ever building a data center.
Regards,

Ian King
Chief Strategist, Banyan Hill Publishing
Editor's Note: We'd love to hear from you!
If you want to share your thoughts or suggestions about the Daily Disruptor, or if there are any specific topics you'd like us to cover, just send an email to dailydisruptor@banyanhill.com.
Don't worry, we won't reveal your full name if we publish a response. So feel free to comment away!

