
Tutor Intelligence, which makes AI-powered robots for warehouse work, has raised $34 million in new funding.
The company will use the new capital to speed commercialization of its robots, scale its consumer packaged goods (CPG) fleet and advance its central robot intelligence platform and research infrastructure, according to a Monday (Dec. 1) news release.
“Tutor stands out for its extraordinary speed of execution and its ability to balance cutting-edge product and model development with a clear commercial focus that quickly gets this functionality into customers’ hands,” Rebecca Kaden, managing partner at Union Square Ventures, which led the funding round, said in the release. “They’re not building for an abstract future; they’re transforming how CPG companies operate today.”
Founded out of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), Tutor Intelligence deploys robots that work alongside human operators to process goods for a “vast Fortune 50 supply chain network,” the company said in the release. It also works with multiple Fortune 500 packaged food companies and “global leaders” in the personal care, toys, home goods, beauty and consumer technology spaces.
“When we started Tutor Intelligence nearly five years ago as grad students at MIT, we saw that the robotics intelligence bottleneck was the key barrier to robotic worker viability,” said Josh Gruenstein, the company’s co-founder and CEO.
“We built a system that leverages on-the-job data to teach robots to navigate and understand the physical world with human-like intuition. This new capital enables us to expand our fleet, scale our robot training infrastructure, and empower our robots to tackle increasingly complex tasks, reshaping industrial work as we know it.”
The funding comes at a time when, as covered here last month, physical artificial intelligence (AI) is emerging as the next stage of robotics. Earlier robots followed fixed commands and worked only in predictable environments, struggling with the unpredictability of everyday operations, such as shifting layouts, mixed lighting and human movement.
“That is beginning to change as research groups show how simulation, digital twins and multimodal learning pipelines enable robots to learn adaptive behaviors and carry those behaviors into real facilities with minimal retraining,” PYMNTS wrote.
Amazon’s launch of its Vulcan robot is one of the clearest examples of physical AI moving from research to frontline operations. The robot uses both vision and touch to pick and stow items in the company’s fulfillment centers, allowing it to handle flexible fabric storage pods and unpredictable product shapes.
Source: https://www.pymnts.com/
