
Physical AI is emerging as the next stage of robotics as advances in sensing, perception and large AI models give machines capabilities that traditional automation never supported. Earlier robots followed fixed commands and worked only in tightly controlled environments, struggling with the variability of everyday operations: shifting layouts, varying item shapes, mixed lighting and human movement.
That is beginning to change as research groups show how simulation, digital twins and multimodal learning pipelines enable robots to learn adaptive behaviors and carry those behaviors into real facilities with minimal retraining.
Meet the New Workforce
Work from Fraunhofer IESE describes how digital engineering techniques, including virtual replicas of factory floors and logistics routes, help robots build robust skills before they ever enter a building, reducing deployment risks and giving operators far more control over performance. These simulation workflows also allow robots to learn to navigate dynamic spaces and handle variation safely and consistently.
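The article does not detail the tooling behind these workflows, but the core idea, training against many randomized virtual versions of a facility before any real deployment, can be sketched in a few lines. The following is a purely illustrative toy in Python; every name and parameter here is an assumption, not Fraunhofer IESE's actual pipeline.

```python
# Illustrative only: a toy domain-randomization loop in the spirit of the
# simulation-first workflows described above. All names, scenario parameters
# and the "caution" policy knob are hypothetical.
import random
from dataclasses import dataclass

@dataclass
class WarehouseScenario:
    aisle_width_m: float      # randomized layout parameter
    lighting_lux: float       # randomized sensing condition
    n_moving_people: int      # randomized dynamic obstacles

def sample_scenario(rng: random.Random) -> WarehouseScenario:
    """Draw one randomized virtual facility for a training episode."""
    return WarehouseScenario(
        aisle_width_m=rng.uniform(1.2, 3.0),
        lighting_lux=rng.uniform(100, 1000),
        n_moving_people=rng.randint(0, 8),
    )

def run_episode(scenario: WarehouseScenario, caution: float) -> float:
    """Toy stand-in for a physics simulation; returns a success score in [0, 1].

    Narrow aisles, dim light and more people make the task harder, while a
    more cautious policy copes better with clutter."""
    difficulty = ((3.0 - scenario.aisle_width_m) / 3.0
                  + (1000 - scenario.lighting_lux) / 2000
                  + scenario.n_moving_people / 10)
    return max(0.0, 1.0 - difficulty * (1.0 - caution))

def evaluate(caution: float, rng: random.Random, trials: int = 20) -> float:
    """Average success across a batch of randomized virtual facilities."""
    return sum(run_episode(sample_scenario(rng), caution) for _ in range(trials)) / trials

def train(rounds: int = 200, seed: int = 0) -> float:
    """Hill-climb a single 'caution' parameter against randomized scenarios."""
    rng = random.Random(seed)
    caution = 0.5
    best = evaluate(caution, rng)
    for _ in range(rounds):
        candidate = min(1.0, max(0.0, caution + rng.uniform(-0.05, 0.05)))
        score = evaluate(candidate, rng)
        if score > best:
            caution, best = candidate, score
    return caution  # parameter an operator would validate before real deployment

if __name__ == "__main__":
    print(f"trained caution parameter: {train():.2f}")
```

The point of the sketch is the workflow, not the math: because every episode is drawn from a different randomized layout, whatever the policy learns cannot depend on one specific building, which is what lets skills transfer to a real facility with minimal retraining.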
The World Economic Forum reported a similar shift underway in manufacturing. Their analysis explained how improvements in robot dexterity, machine perception, environment mapping and model-based reasoning are moving robots from isolated, fenced-off stations into shared work areas where they support production, inspection and transport tasks. According to their research, companies are beginning to treat robots as intelligent mobile systems that can adjust to new product lines, switch tasks and work next to people without requiring rigid constraints.
At the same time, researchers at Carnegie Mellon University highlighted how new sensor designs, tactile perception and simulation-to-reality training methods are enabling robots to operate reliably in crowded, multi-actor environments. Their work examines how robots trained on thousands of simulated scenarios can perform accurately in real workplaces, even as conditions shift hour by hour.
These developments are reshaping how companies think about robotics. Physical AI is becoming a dependable workforce layer that sits between human teams and digital systems. Robots equipped with physical AI can interpret high-level instructions, navigate corridors or warehouse aisles, adjust to real-time conditions and handle repetitive workflows at consistent speed.
Enterprises are beginning to use these systems to stabilize throughput, reduce error rates and maintain operational continuity during volatile staffing periods. The Wall Street Journal noted that companies in logistics, retail, healthcare and manufacturing are prioritizing robots that combine perception, reasoning and connected motion so physical workflows can run continuously and share real-time intelligence with planning platforms.
How Companies Are Putting Physical AI Into Practice
Amazon’s deployment of its Vulcan robot is one of the clearest examples of physical AI moving from research to frontline operations. Vulcan uses both vision and touch to pick and stow items in fulfillment centers, allowing it to handle flexible fabric storage pods and unpredictable product shapes. Amazon explained that Vulcan’s tactile systems let it respond to pressure, contact and motion in real time so it can complete tasks that previously depended on human dexterity. The robot is already deployed in U.S. and European sites and integrates directly into Amazon’s logistics software, allowing human associates and robots to coordinate assignments smoothly.
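Amazon has not published Vulcan's control code, so the sketch below is a generic, hypothetical illustration of the underlying idea of closing a grasp loop on force feedback rather than on vision alone; the force thresholds, step size and function names are all assumptions.

```python
# Hypothetical sketch of a force-feedback grasp loop; a generic illustration
# of tactile control, not Amazon's Vulcan implementation.
import random

MAX_FORCE_N = 8.0   # assumed upper bound before an item could be damaged
MIN_FORCE_N = 2.0   # assumed minimum force for a stable grip
STEP_MM = 0.5       # gripper moves in small increments

def read_fingertip_force() -> float:
    """Stand-in for a tactile sensor reading, in newtons."""
    return random.uniform(0.0, 10.0)

def close_gripper_by(mm: float) -> None:
    """Stand-in for a hardware command to close the gripper slightly."""

def open_gripper_by(mm: float) -> None:
    """Stand-in for a hardware command to open the gripper slightly."""

def grasp(max_steps: int = 100) -> bool:
    """Adjust until contact force sits inside the safe band, then stop."""
    for _ in range(max_steps):
        force = read_fingertip_force()
        if force > MAX_FORCE_N:      # squeezing too hard: back off
            open_gripper_by(STEP_MM)
        elif force < MIN_FORCE_N:    # not yet holding: keep closing
            close_gripper_by(STEP_MM)
        else:                        # stable grip achieved
            return True
    return False                     # escalate the item to a human associate
```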
Walmart is expanding physical AI systems across its distribution network, using automation platforms that reduce unit-handling costs and improve throughput.
The company is also scaling its robotics footprint through a 2025 agreement, reported by Reuters, under which Symbotic will acquire Walmart's advanced systems and robotics business and develop AI-enabled automation for Walmart's pickup and delivery centers, a deeper move toward physical-AI-based logistics.
Other operators are moving in a similar direction. GXO Logistics recently expanded its physical AI pilots after reporting strong results from its 2025 deployment of an AI-powered narrow-aisle inventory robot that scans pallets, tracks stock levels and feeds real-time data into warehouse systems. GXO said the system performed well enough to justify expansion across U.S. and European sites. These examples show how enterprises are deploying physical AI not as experimental projects but as core operational infrastructure that stabilizes throughput, lowers cost and provides real-time visibility into physical workflows.
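The "feeds real-time data into warehouse systems" part of that description is essentially a streaming integration. The snippet below is a hypothetical sketch of what such a feed could look like; the endpoint URL, payload fields and SKU values are placeholders, not GXO's actual integration.

```python
# Hypothetical sketch of a scanning robot streaming stock counts to a
# warehouse management system; endpoint and payload shape are assumptions.
import json
import time
from dataclasses import dataclass, asdict
from urllib import request

WMS_ENDPOINT = "https://wms.example.internal/api/stock-updates"  # placeholder URL

@dataclass
class PalletScan:
    location_id: str     # aisle/bay/level the robot scanned
    sku: str
    units_counted: int
    scanned_at: float    # unix timestamp

def post_scan(scan: PalletScan) -> None:
    """Send one scan result to the warehouse system as JSON."""
    body = json.dumps(asdict(scan)).encode("utf-8")
    req = request.Request(
        WMS_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req, timeout=5) as resp:
        resp.read()  # in practice, check the status code and retry on failure

if __name__ == "__main__":
    post_scan(PalletScan(location_id="A12-03-2", sku="SKU-48210",
                         units_counted=37, scanned_at=time.time()))
```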
Source: https://www.pymnts.com/
