Overview
- Elon Musk reiterated confidence on October 29 in using idle Teslas for AI inference after first floating the idea on Tesla’s October 22 earnings call.
- He outlined a scenario of roughly one kilowatt of inference compute per vehicle, scaling to about 100 gigawatts if tens of millions to 100 million cars participate.
- The envisioned system would tap parked vehicles, likely prioritizing those that are charging, to execute inference tasks outside traditional data centers.
- Coverage notes that owners might opt in for compensation, but Tesla has not announced timelines, program mechanics, or privacy and consent frameworks.
- Reports highlight current AI4 (HW4) capabilities and a projected AI5 with large performance gains, alongside practical concerns about battery drain, heat stress, and connectivity.
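The scaling claim above is straightforward arithmetic: about one kilowatt per vehicle times a fleet of up to 100 million cars yields the quoted 100 gigawatt ceiling. A minimal sanity check of that math (the function name and the intermediate fleet sizes are illustrative, not from the coverage):

```python
def fleet_inference_gw(per_car_kw: float, num_cars: int) -> float:
    """Aggregate fleet inference capacity in gigawatts (1 GW = 1e6 kW)."""
    return per_car_kw * num_cars / 1e6

# Assumed ~1 kW per parked vehicle, at a few hypothetical fleet sizes.
for cars in (10_000_000, 50_000_000, 100_000_000):
    print(f"{cars:>11,} cars -> {fleet_inference_gw(1.0, cars):.0f} GW")
# At 100 million cars and 1 kW each, the total is 100 GW, matching the stated ceiling.
```

Note this counts nameplate compute only; the practical concerns listed above (battery drain, heat, connectivity) would reduce the usable fraction.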