ETH Zurich spin-off Mimic Robotics is redefining industrial automation with AI-powered robotic hands that move, adapt, and react like humans, bridging the gap between people and machines in industrial robotics.


As global labour shortages intensify and industries rush to reshore manufacturing, the need for robots that can handle complex, human-like tasks has never been greater. Enter Mimic Robotics, a fast-emerging startup whose physical AI system promises to bring human-level dexterity to industrial robots.

The company has just announced a $16 million funding round, led by Elaia and Speedinvest, to accelerate deployment of its groundbreaking technology. The round also included support from Founderful, 1st kind, 10X Founders, 2100 Ventures, and Sequoia Scout Fund, pushing Mimic’s total funding past $20 million.

The fresh capital will go toward scaling the startup’s foundation AI model, developing next-generation humanoid robotic hands, and expanding pilot projects with global industry leaders.


From ETH Zurich to Factory Floors Worldwide

Founded in 2024 as a spin-off from ETH Zurich’s Soft Robotics Lab, Mimic Robotics is led by Stefan Weirich (CEO), Elvis Nava (CTO), Stephan-Daniel Gravert (CPO), and Benedek Forrai (Founding Engineer) — a team of researchers who have been advancing the intersection of AI and robotics under Professor Robert Katzschmann, now the company’s Scientific Advisor.

Their mission? To train AI-driven robots that can learn directly from humans — not through code, but through real-world demonstrations. Using proprietary motion-tracking devices worn by skilled operators on factory floors, Mimic captures high-fidelity motion data without interrupting production.

That data then fuels imitation learning models, enabling Mimic’s robotic hands to replicate human technique, self-correct, and adapt autonomously to different objects and environments.
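For readers curious what "imitation learning from demonstrations" looks like in practice, here is a minimal behavioral-cloning sketch in Python (PyTorch). It is a generic illustration only, not Mimic's actual pipeline: the observation and action dimensions, the network architecture, and the training loop are all assumptions made for the example.

```python
# Minimal behavioral-cloning sketch (generic illustration, not Mimic's pipeline).
# Assumption: demonstrations are (observation, action) pairs, where the observation
# might be an object pose plus current joint angles, and the action is the next
# hand joint configuration captured from a human operator by a motion tracker.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

OBS_DIM, ACT_DIM = 32, 20  # hypothetical sensor and joint dimensions

class HandPolicy(nn.Module):
    """Maps an observation vector to target joint angles."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(OBS_DIM, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, ACT_DIM),
        )

    def forward(self, obs):
        return self.net(obs)

def train_policy(observations, actions, epochs=10, lr=1e-3):
    """Fit the policy to human demonstrations by regressing recorded actions."""
    loader = DataLoader(TensorDataset(observations, actions),
                        batch_size=64, shuffle=True)
    policy = HandPolicy()
    optimizer = torch.optim.Adam(policy.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for obs, act in loader:
            optimizer.zero_grad()
            loss = loss_fn(policy(obs), act)  # imitate the demonstrated action
            loss.backward()
            optimizer.step()
    return policy

if __name__ == "__main__":
    # Stand-in for real motion-capture data: random tensors with the right shapes.
    demo_obs = torch.randn(1024, OBS_DIM)
    demo_act = torch.randn(1024, ACT_DIM)
    policy = train_policy(demo_obs, demo_act, epochs=2)
    print(policy(demo_obs[:1]))  # predicted joint command for one observation
```

In a real system, the random stand-in tensors would be replaced by the recorded human demonstration data, and the learned policy would run in a closed loop so the hand can correct itself as conditions change.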

“Mimic closes this automation gap by building frontier physical AI models trained on real-world human demonstrations,” said Stephan-Daniel Gravert, co-founder and CPO at Mimic Robotics. “Our robotic hands can react, adapt, and recover in environments built for humans.”


Beyond Traditional Automation

Unlike traditional industrial robots limited to repetitive, preprogrammed tasks, Mimic’s humanoid robotic hands can perform delicate and varied operations — from assembling intricate components to handling fragile materials — while operating safely alongside humans.

This scalable and flexible approach sets Mimic Robotics apart from competitors such as Tesla, Agility Robotics, and Figure AI, which focus on full humanoid robots that are often hindered by high costs and slow regulatory adoption.

According to Clément Vanden Driessche, Partner at Elaia, “The world-class team at Mimic is addressing one of the most challenging problems in physical AI: dexterous manipulation. Their breakthrough approach integrates a proprietary robotic hand, foundation models for robotics, and novel training methods.”


Europe’s Moment in AI and Robotics

Andreas Schwarzenbrunner, General Partner at Speedinvest, highlighted the broader impact:

“With Mimic, we see exactly what’s needed — a platform that unlocks human-level dexterity with frontier AI and solves billion-dollar problems on factory floors. This is Europe’s moment to lead the new era of AI and robotics.”

For Mimic Robotics, the long-term goal is clear: to make human-level dexterity accessible across industries, paving the way for intelligent automation that can finally match — and scale — the skills of human workers.

“We’re building the foundation for the next generation of industrial robots — ones that can finally do what people do, at the scale industry demands,” said Gravert.


The Future of Work Is Taking Shape

With physical AI, robotic dexterity, and automation converging, Mimic Robotics stands at the forefront of a technological shift that could redefine the future of work.
