At CES 2026, Hyundai Motor Group pulled back the curtain on an audacious plan: turn humanoid robots from lab curiosities into mass-produced coworkers. The company laid out a timeline, the industrial plumbing to make it happen, and a business model that treats robots more like utility services than one-off purchases.
A robot designed for work, not just stunts
What people saw this winter in Las Vegas and at Boston Dynamics’ facilities isn’t the cartwheeling demo bot of old. The new Atlas is a sleeker, largely electric humanoid engineered to do industrial tasks: lifting, sorting, manipulating parts. Reporters who visited Boston Dynamics and Hyundai factories described Atlas getting hands-on practice in parts warehouses and learning tasks autonomously: training that, until recently, required engineers to hand-code each move.
That shift comes thanks to two things: better hardware (lighter, repairable bodies and stronger actuators) and modern machine learning. Boston Dynamics now trains Atlas using human demonstrations, simulation and large-scale data so that when a new motion works, one updated model can be pushed to every machine. Nvidia chips power much of that on-device processing, giving Atlas the compute to react to the messy real world rather than a perfectly scripted studio set.
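The "train once, push everywhere" pattern described above can be sketched in a few lines. This is a hypothetical illustration, not Boston Dynamics' actual software; the `Robot` and `Fleet` names are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Robot:
    """One machine in the fleet; runs whatever policy model it was last given."""
    robot_id: str
    policy_version: str = "v0"

@dataclass
class Fleet:
    """Central registry (hypothetical): one trained model is distributed to every
    robot at once, instead of hand-coding each machine's motions individually."""
    robots: list = field(default_factory=list)

    def push_model(self, version: str) -> int:
        # A single validated update propagates to the whole fleet.
        for robot in self.robots:
            robot.policy_version = version
        return len(self.robots)

fleet = Fleet(robots=[Robot(f"atlas-{i:03d}") for i in range(5)])
updated = fleet.push_model("v1-warehouse-sort")  # a newly learned motion, named arbitrarily
```

The contrast with the old approach is the point: one model update replaces per-robot reprogramming.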
From factory floor to subscription service
Hyundai isn’t talking about selling robots the old-fashioned way. Instead it plans to manufacture Atlas at scale inside its automotive supply chain and to offer robotics-as-a-service (RaaS): customers subscribe, while Hyundai handles software updates, maintenance, spare parts and remote monitoring. The approach converts a one-time capital sale into recurring revenue and gives buyers lower up-front costs—an attractive pitch for businesses that need flexible automation.
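As a back-of-the-envelope illustration of why the subscription pitch lands, compare ownership against a subscription. Every figure below is hypothetical; neither Hyundai nor Boston Dynamics has published pricing.

```python
# Hypothetical figures chosen only to show the shape of the trade-off.
purchase_price = 150_000        # one-time capital sale ($)
annual_maintenance = 15_000     # buyer-borne upkeep under ownership ($/yr)
monthly_subscription = 4_000    # RaaS fee covering updates, parts, monitoring ($/mo)

def cost_to_own(years: float) -> float:
    return purchase_price + annual_maintenance * years

def cost_to_subscribe(years: float) -> float:
    return monthly_subscription * 12 * years

# The headline difference is the first-year outlay:
year_one_gap = cost_to_own(1) - cost_to_subscribe(1)  # 165_000 - 48_000
```

Under these invented numbers the subscriber pays a fraction of the owner's first-year cost, which is exactly the lower-up-front-cost pitch described above.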
That model also tightens the data loop. Robots deployed in Hyundai’s plants feed operational data back into training centers so models and controls improve with each lift, scan and recovery. Hyundai’s Software‑Defined Factory (SDF) idea treats factories as software-first environments where behavior can be updated quickly, and campuses like the upcoming Robot Metaplant Application Center (RMAC) are meant to simulate real assembly-line conditions for continuous retraining and validation.
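The data loop amounts to a simple aggregation step: field robots report task outcomes, a training center pools them, and some quality gate decides whether a retrained model goes back out. A minimal sketch, with all names and thresholds invented for illustration:

```python
from collections import Counter

def pool_telemetry(reports):
    """Pool per-robot task outcomes (e.g. lifts, scans, recoveries) into one counter."""
    totals = Counter()
    for report in reports:
        totals.update(report)
    return totals

def ready_to_redeploy(totals, task, threshold=0.95):
    """Hypothetical gate: only redeploy once observed field success clears a bar."""
    done = totals[f"{task}:ok"] + totals[f"{task}:fail"]
    return done > 0 and totals[f"{task}:ok"] / done >= threshold

reports = [
    {"lift:ok": 48, "lift:fail": 2},   # robot A's shift
    {"lift:ok": 50, "lift:fail": 0},   # robot B's shift
]
totals = pool_telemetry(reports)
```

In practice the "gate" would be far richer (simulation replays, safety review), but the loop's shape is the same: deploy, measure, retrain, redeploy.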
Smarter bodies meet smarter brains
Hardware and scale are one half of the equation; advanced reasoning is the other. At CES, Boston Dynamics announced a collaboration with Google DeepMind to run Gemini Robotics models on its machines. The aim: something closer to contextual common sense—letting Atlas identify unfamiliar objects, reason about how to pick them up, and adapt when the world isn’t tidy.
This is not just conjecture. WIRED reported that Gemini-powered models will be tested on Atlas (and Spot) in Hyundai’s automotive plants in the coming months. The idea is to pair a capable, generalist body with a powerful, multimodal AI brain so robots can generalize skills instead of needing painstaking, task-by-task programming. You can read more about Gemini’s expanding role across Google products in our coverage of its integration into workplace tools and search, “Gemini’s Deep Research May Soon Search Your Gmail and Drive — Google Docs Gains ‘Document Links’ Grounding,” and of the company’s move to bring conversational AI into mapping experiences, “Google Maps Gets Gemini: A Conversational AI Copilot for Navigation.”
A practical timeline — and the risks people notice
Hyundai’s public roadmap is deliberate. RMAC is slated to open in 2026; training and sequencing tasks are planned at some sites by 2028, with more complex assembly roles targeted around 2030. That sequence—test in demanding manufacturing environments first, then broaden applications to logistics, construction and facilities—reflects Hyundai’s insistence that robots should earn their stripes where precision and reliability matter.
Still, the rollout raises questions. Journalists who toured test facilities noticed real gains in dexterity and autonomy, but robotics leaders repeatedly stressed limitations: everyday human tasks like dressing or carrying a hot cup of coffee while navigating a cluttered home remain far harder than picking and placing factory parts. Company executives emphasize safeguards, human oversight, and the need to build systems that are guaranteed safe in shared environments.
Workers and labor economists are watching closely. Repetitive, back‑breaking work is the most immediate target for automation; those jobs will change fast. But the industry also needs technicians, AI trainers, maintenance crews and integration specialists—roles that don’t disappear so much as morph. Whether those transitions are smooth depends on training programs, corporate choices and public policy.
Why carmakers, chips and AI labs are converging
Hyundai’s bet makes sense when you look at the ingredients required to scale humanoids: supply chains, precision manufacturing, software operations and field servicing. Carmakers already have factories, logistics networks and expertise in mass production—assets that are awkward for a typical robotics startup to build overnight. Combine that with Google’s AI and Boston Dynamics’ mechanical know‑how and you get a vertically integrated push toward commercialization.
And there’s money in ongoing service contracts. As analysts noted, recurring revenue both stabilizes cash flow and deepens customer lock-in—especially when updates and repairs are handled by the vendor.
What this could mean for the next five years
Expect a stepwise rollout: tightly controlled deployments in Hyundai plants and logistics centers first, then expansion into other industrial sites where environmental variables are limited and benefits are clear. Over time, as models get better at handling unstructured tasks and as durability and repairability improve, we could see humanoids in broader commercial roles.
But ubiquity isn’t inevitable. Regulation, safety validation, workforce adjustments and competition—from Tesla to Chinese firms to nimble startups—will all shape the pace. For now, Atlas represents a concrete experiment in bringing humanoid robots out of labs and into the messy, repetitive work humans would rather not do. Whether it becomes an industrial staple or a niche tool depends on how well industry, policymakers and society manage the technical and social trade-offs.
If you want to follow how Gemini is being applied across software and services, we have more on that evolution in our coverage of Gemini’s Deep Research. And for a look at how conversational Gemini features are landing in navigation and assistant products, see our write-up of Google Maps’ Gemini copilot.