[BUSINESS KOREA] MOVENSYS Presents Real-Time Execution Technology for Physical AI at NVIDIA GTC 2026

  • 2026.03.23

Approach to Closing the Sim-to-Real Gap through Integration of the NVIDIA Isaac Platform and a Real-Time Control Stack



MOVENSYS, a developer of real-time software-based motion control technology (Chairman Boo-Ho Yang), announced that its research on a real-time execution stack for Physical AI was presented at the poster session of NVIDIA GTC 2026, held March 16–19, 2026 in San Jose, California.

NVIDIA GTC is one of the world’s largest technology conferences covering AI, robotics, digital twins, and accelerated computing. The event brings together researchers, developers, and industry leaders to present the latest technologies and applications. MOVENSYS’s research was selected for the poster session, which NVIDIA described as having the most competitive selection process in GTC history.

The study focuses on addressing the Sim-to-Real Gap, a key challenge in deploying Physical AI systems powered by robot foundation models. While foundation models can generate robust motion plans in simulation, executing those plans reliably in real-world environments requires precise, low-latency control and tightly synchronized execution.

Most existing Physical AI systems use a separated architecture, in which a GPU-based computing module performs AI inference while a dedicated robot controller handles motion control. Because these components are connected through network communication such as Ethernet, communication and task-scheduling delays accumulate, introducing significant latency into the control loop. This structural limitation can cause timing mismatches between AI decision-making and robot actuation, leading to reduced stability and tracking accuracy in real-world environments.
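To make the argument concrete, the sketch below adds up the per-cycle delays in such a separated loop and compares them with a co-located execution path. All numbers are hypothetical, chosen only to illustrate how the delays accumulate; they are not measurements from the MOVENSYS study.

```python
# Illustrative latency budget for one AI-to-actuation control cycle.
# Values are assumed for illustration, not measured.

def loop_latency_ms(inference_ms, network_ms, scheduling_ms, actuation_ms):
    """Total time from AI decision to robot actuation in one cycle."""
    return inference_ms + network_ms + scheduling_ms + actuation_ms

# Separated architecture: AI inference on a GPU host, motion control on an
# external controller reached over Ethernet.
separated = loop_latency_ms(inference_ms=10.0, network_ms=2.0,
                            scheduling_ms=1.5, actuation_ms=0.5)

# Co-located architecture: the execution layer runs on the same machine,
# so network and scheduling overheads shrink.
integrated = loop_latency_ms(inference_ms=10.0, network_ms=0.1,
                             scheduling_ms=0.1, actuation_ms=0.5)

print(f"separated:  {separated:.1f} ms per cycle")
print(f"integrated: {integrated:.1f} ms per cycle")
```

The inference time itself is unchanged; what the co-located path removes is the communication and scheduling overhead that sits between decision and actuation.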

To address these challenges, MOVENSYS developed a Real-Time Motion Control Stack based on its software motion controller WMX. The stack functions as a Real-Time Execution Layer that connects the NVIDIA Isaac–based application layer with the robot control layer through EtherCAT-based real-time communication and a ROS2 interface. This architecture minimizes execution latency between AI inference and robot motion control while enabling tightly coupled integration between intelligence and control.
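A minimal sketch of what such an execution layer does each control cycle is shown below: consume a joint-space plan produced by the application layer and stream one setpoint per cycle toward the drives. The class and function names are hypothetical for illustration; the actual WMX API differs, and in a real system the cyclic EtherCAT timing is enforced by a real-time scheduler rather than `sleep()`.

```python
import time

# Conceptual execution-layer loop (names are illustrative, not the WMX API).
CYCLE_S = 0.001  # assumed 1 kHz control cycle

class ExecutionLayer:
    def __init__(self, plan):
        self.plan = plan   # joint-space waypoints from the AI application layer
        self.index = 0

    def next_setpoint(self):
        """Return the setpoint for the current cycle; hold the last one
        when the plan is exhausted."""
        if self.index < len(self.plan):
            sp = self.plan[self.index]
            self.index += 1
            return sp
        return self.plan[-1]

def run(layer, send_to_drive, cycles):
    """Cyclically push setpoints toward the drives (over EtherCAT in a
    real deployment)."""
    for _ in range(cycles):
        t0 = time.perf_counter()
        send_to_drive(layer.next_setpoint())
        # Sleep off the remainder of the cycle; a real-time kernel would
        # enforce this deadline instead.
        remaining = CYCLE_S - (time.perf_counter() - t0)
        if remaining > 0:
            time.sleep(remaining)
```

The point of the sketch is the structure, not the numbers: because plan consumption and setpoint output live in one deterministic loop, no network hop separates the AI layer's output from the motion command.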

In comparative experiments conducted using Isaac Manipulator on an NVIDIA Jetson Orin IPC, the MOVENSYS real-time control stack demonstrated significant improvements over a conventional system using an external robot controller. The MOVENSYS approach reduced tracking error (Mean Absolute Error, MAE) by approximately 85%, highlighting the importance of tightly coupled real-time execution when deploying foundation model–driven Physical AI systems in real-world environments.
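For readers unfamiliar with the metric, MAE is the mean absolute difference between the commanded and measured trajectories, and the percentage reduction is one minus the ratio of the two MAEs. The trajectories below are made-up illustrative data, not results from the study.

```python
import numpy as np

def mae(reference, measured):
    """Mean absolute error between commanded and measured trajectories."""
    return float(np.mean(np.abs(np.asarray(reference) - np.asarray(measured))))

ref = [0.0, 0.1, 0.2, 0.3]                       # commanded positions (made up)

baseline = mae(ref, [0.100, 0.200, 0.300, 0.400])  # external controller (made up)
tightly  = mae(ref, [0.015, 0.115, 0.215, 0.315])  # real-time stack (made up)

reduction = 1.0 - tightly / baseline
print(f"MAE reduction: {reduction:.0%}")
```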

MOVENSYS’s core technology, Soft Motion, originated from robotics research at MIT and represents a software-based motion control approach. The technology is currently used by global equipment manufacturers in industrial automation applications, including semiconductor manufacturing equipment. This research extends those industrial real-time control technologies by integrating them with robot foundation model–based software stacks, proposing an execution infrastructure for Physical AI systems.

The presentation also outlined future development directions based on the real-time execution stack, including a real-time adaptive fine-tuning framework that uses real-world sensor telemetry—such as joint states and force feedback—to adapt robot foundation models.

“Beyond AI models themselves, the Physical AI era will require real-time execution infrastructure capable of reliably operating in the physical world,” said MOVENSYS. “We will continue developing a Physical AI execution platform that can support next-generation robotic systems based on industrial real-time control technologies.”


Original Press Release: https://www.businesskorea.co.kr/news/articleView.html?idxno=265416