Uber AV Labs’ Sensor Grid Could Redefine Autonomous Driving Data

Uber gave up on building self-driving cars years ago, but the company just revealed a plan that could make it the most powerful player in the autonomous vehicle industry anyway.
Uber has a long-term ambition that goes well beyond shuttling passengers: the company eventually wants to outfit its human drivers’ cars with sensors to soak up real-world data for autonomous vehicle companies, and potentially other companies training AI models on physical-world scenarios. Praveen Neppalli Naga, Uber’s chief technology officer, revealed the plan at TechCrunch’s StrictlyVC event in San Francisco, describing it as a natural extension of a nascent program the company announced in late January called AV Labs.
“That is the direction we want to go eventually,” Naga said. “But first, we need to get an understanding of the sensor kits and how they all work. There are some regulations. We have to make sure every state has clarity on what sensors mean, and what sharing it means.”
For now, the Uber AV Labs sensor grid runs on a separate, dedicated fleet. However, Naga is thinking much bigger than that.
Uber has millions of drivers globally, and if even a fraction of those cars could be transformed into rolling data-collection platforms, the scale of what Uber could offer the AV industry would dwarf what any individual AV company could assemble on its own. “The bottleneck is data,” Naga said.
He made the value proposition concrete: companies like Waymo need highly specific real-world data. “You may be able to say: in San Francisco, ‘At this school intersection, I want some data at this time of day so I can train my models,’” Naga said. “The problem for all these companies is access to that data, because they don’t have the capital to deploy the cars and go collect all this information.”
How Uber Positions Itself in the AV Ecosystem
Uber has already established partnerships with 25 autonomous vehicle companies, including London-based firm Wayve. The centrepiece of these partnerships is the “AV Cloud,” a library of labelled sensor data. The system lets partners test their software in “shadow mode” during real-world Uber trips, simulating AV performance without a computer actually controlling the steering wheel.
Naga was clear about the stated intent. “Our goal is not to make money out of this data,” he said. “We want to democratise it.”
However, the commercial reality is harder to ignore. The company has already made equity investments in numerous AV players, and its ability to offer proprietary training data at scale could give it significant leverage over a sector that right now depends on Uber’s ride marketplace to reach customers.
The Uber AV Labs sensor grid, then, is more than a data business. It is Uber’s answer to a future in which self-driving cars reduce the need for human drivers entirely.
