Could drivers someday skim email while their pickup hums down the interstate? Ford, speaking at CES and in interviews with Reuters, says yes — or at least that it plans to let people take their eyes off the road in certain conditions by 2028.
The headline: Ford will offer a Level 3 driver‑assistance system on its new Universal EV Platform, slated to debut on a midsize electric truck and roll out across other vehicles. Unlike today’s Level 2 systems — think Tesla’s Full Self‑Driving (FSD) and GM’s Super Cruise, which demand constant driver attention — Level 3 lets the vehicle handle driving tasks without the driver watching, so long as the car signals when it’s time to intervene.
Eyes off, but who’s behind the wheel?
Ford’s Doug Field, the company’s chief EV, digital and design officer, framed the move as practical rather than theatrical: this isn’t full autonomy, and a human will still need to be present to take control. That distinction matters. Level 3 is a legal and engineering gray area in many markets because responsibility shifts between machine and human mid‑drive — a handoff that has tripped up other companies.
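To make that handoff concrete, here is a minimal sketch — hypothetical, not Ford’s implementation — of the state transitions a Level 3 controller has to manage: the car drives eyes‑off inside its operational domain, issues a takeover request when it exits that domain, and falls back to a minimal‑risk maneuver if the driver never responds. All names and the 10‑second grace window are illustrative assumptions.

```python
from enum import Enum, auto

class Mode(Enum):
    DRIVER = auto()            # human in control (Level 2-style behavior)
    EYES_OFF = auto()          # system drives; driver may look away
    TAKEOVER_REQUEST = auto()  # system asks the human to resume
    MINIMAL_RISK = auto()      # fallback: slow or stop safely

class Level3Controller:
    """Hypothetical sketch of an SAE Level 3 handoff state machine."""
    GRACE_SECONDS = 10.0  # illustrative takeover window, not a real spec

    def __init__(self):
        self.mode = Mode.DRIVER
        self.request_elapsed = 0.0

    def engage(self, conditions_ok: bool):
        # Eyes-off driving is only offered inside the operational domain,
        # e.g. divided highways below a speed cap.
        if conditions_ok and self.mode is Mode.DRIVER:
            self.mode = Mode.EYES_OFF

    def tick(self, dt: float, conditions_ok: bool, driver_hands_on: bool):
        if self.mode is Mode.EYES_OFF and not conditions_ok:
            # Leaving the operational domain: signal the driver to intervene.
            self.mode = Mode.TAKEOVER_REQUEST
            self.request_elapsed = 0.0
        elif self.mode is Mode.TAKEOVER_REQUEST:
            if driver_hands_on:
                self.mode = Mode.DRIVER  # human resumed control
            else:
                self.request_elapsed += dt
                if self.request_elapsed >= self.GRACE_SECONDS:
                    self.mode = Mode.MINIMAL_RISK  # driver never responded
```

The tricky part is exactly the gray area the article describes: between the takeover request and the driver actually resuming, both parties can plausibly claim the other was responsible.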
The company said the tech will appear on vehicles built on its new $30,000 EV architecture, but it won’t be standard on the cheapest models. Field said Ford is still deciding whether to sell the feature as a one‑time buy or a subscription; pricing was not disclosed. The first mass‑market Ford EV on that platform is expected in 2027, with the Level 3 option to follow in 2028.
Lidar makes a comeback
A notable technical choice: Ford plans to use lidar sensors. That’s a rebuke to Elon Musk’s long‑standing argument that lidar is an expensive crutch; Tesla has leaned instead on cameras, dropping radar from most of its lineup. Ford’s wager is that the extra depth and redundancy lidar provides will smooth the path to a reliable Level 3 system — and perhaps give the automaker a safety and validation advantage when regulators ask for proof.
Safety is the subtext here. After decades of falling roadway deaths, U.S. traffic fatalities spiked in the past decade and peaked near 43,000 in 2021, renewing interest in automation as a potential safety fix. Level 3 promises reduced human error in limited situations, but it also raises questions: will drivers be ready to re‑engage when the car asks? How will liability be apportioned if a system misjudges a scenario?
More than a feature: a business model experiment
Ford’s executive team is treating the software as a product experiment as much as a safety upgrade. Field explicitly raised the business‑model question — subscription vs. upfront purchase — signaling that automakers now see advanced driver aids as recurring revenue engines, not just hardware specs.
That approach mirrors broader industry moves: automakers are monetizing software, from over‑the‑air updates to optional driver aids. It’s a necessary pivot as margins on EV hardware come under pressure and carmakers chase steady recurring income.
The competitive scoreboard
Ford isn’t alone in chasing autonomy. Tesla pitches FSD as the frontier of consumer driver assistance but classifies it as Level 2 in many regulatory regimes; Mercedes has a certified Level 3 system in Drive Pilot for select markets; and Alphabet’s Waymo and Amazon‑backed Zoox pursue robotaxi models in city cores. Waymo, for example, has logged tens of millions of autonomous miles in ride‑hail service, and smaller players continue to iterate on routing, perception, and business models.
Ford’s announcement also hints at robotaxi ambitions down the line: reliable eyes‑off tech could serve both consumers and commercial fleets. That said, the transition from highway‑limited Level 3 to dense urban autonomy remains a vast technical and regulatory leap.
Why maps and AI matter
Autonomy isn’t just sensors and compute; it’s data, routing and maps. The navigation layer — real‑time routing, hazard updates, and driver handoff cues — will need to be flawless. That’s where advances in map‑aware navigation and AI copilots come into play. Recent work on conversational and context‑aware navigation systems suggests the stack beyond the sensor suite will be just as crucial to success as lidar itself. See how mapping and AI are evolving in consumer navigation platforms like Google Maps’ Gemini-powered copilot, which hints at the kind of multimodal guidance future cars will require.
A larger auto strategy
This move fits into a broader Ford pivot: electrification, new software services and occasional headline‑grabbing product plays. Ford has been layering software features atop traditional products — even niche offerings like SEMA kits and aftermarket packs — to extract more value from vehicle platforms and enthusiast markets. For context on how the company balances mainstream EVs with enthusiast and aftermarket strategies, see Ford’s recent product activity around the Maverick and Raptor ecosystems.
Risks and the road ahead
Regulation, driver behavior, and the hard physics of real‑world driving remain big obstacles. Ford itself noted the need to prove the platform in the U.S. before expanding elsewhere — a tacit admission that while the tech may be ready in controlled conditions, public roads are another story. Broader debates about AI reliability and human‑machine handoffs also matter: the industry‑wide argument over when AI is “ready” for critical tasks is ongoing and sharp, and it will shape how quickly regulators and insurers embrace Level 3 at scale.
Ford’s promise is concrete in timing but cautious in scope: eyes off, for certain roads, by 2028, with lidar and a pay model still being decided. Whether that becomes a genuine leap over competitors or another step in a long, iterative race for safe autonomy will come down to how the company stitches sensors, software, regulation and consumer trust into one package — and whether drivers want to hand the wheel over at all.