
R²D²: Improving Robot Manipulation with Simulation and Language Models

Robot manipulation systems struggle with changing objects, lighting, and contact dynamics when they move into dynamic real-world environments. On top of this, gaps between simulation and reality, along with non-optimized grippers or tools, often limit how reliably robots can generalize, execute long-horizon tasks, and achieve human-level dexterity across diverse tasks. This edition of NVIDIA Robotics…

Source

  •  

Upcoming Livestream: Build Visual AI Agents with NVIDIA Cosmos Reason and Metropolis

On November 18, learn how to fine-tune the NVIDIA Cosmos Reason VLM with your own data to create visual AI agents.

Source

  •  

R²D²: Perception-Guided Task & Motion Planning for Long-Horizon Manipulation

Traditional task and motion planning (TAMP) systems for robot manipulation operate on static models that often fail in new environments. Integrating perception with manipulation addresses this challenge, enabling robots to update plans mid-execution and adapt to dynamic scenarios. In this edition of the NVIDIA Robotics Research and Development Digest (R²D²)…
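
As a rough illustration of the perception-guided replanning idea described above, the sketch below shows a toy control loop that re-plans whenever perception reports that tracked objects have drifted beyond a tolerance. All names here (ScenePerception, plan_task, execute_step) and the drift model are hypothetical placeholders for illustration only, not part of any NVIDIA or TAMP API.

```python
import math
import random
from dataclasses import dataclass

Pose = tuple[float, float]


@dataclass
class ScenePerception:
    """Stand-in for a perception module that tracks 2D object positions."""
    poses: dict[str, Pose]

    def observe(self) -> dict[str, Pose]:
        # A real system would run cameras and pose estimation; here we add
        # small random drift so the scene occasionally invalidates the plan.
        self.poses = {
            name: (x + random.uniform(-0.02, 0.02), y + random.uniform(-0.02, 0.02))
            for name, (x, y) in self.poses.items()
        }
        return dict(self.poses)


def plan_task(goal: str, scene: dict[str, Pose]) -> list[str]:
    """Toy planner: visit every tracked object, then declare the goal achieved."""
    steps = [f"move_to {name} @ ({x:.2f}, {y:.2f})" for name, (x, y) in scene.items()]
    return steps + [f"finish: {goal}"]


def scene_changed(old: dict[str, Pose], new: dict[str, Pose], tol: float = 0.05) -> bool:
    """Trigger a replan if any object moved more than `tol` since planning time."""
    return any(math.dist(old[name], new[name]) > tol for name in old)


def execute_step(step: str) -> None:
    print(f"executing: {step}")


if __name__ == "__main__":
    perception = ScenePerception({"cup": (0.40, 0.10), "plate": (0.20, -0.30)})
    planned_scene = perception.observe()
    plan = plan_task("place cup on plate", planned_scene)

    for _ in range(50):  # bounded number of control cycles
        if not plan:
            break
        current_scene = perception.observe()  # perceive before every action
        if scene_changed(planned_scene, current_scene):
            planned_scene = current_scene     # re-anchor the plan to the new scene
            plan = plan_task("place cup on plate", planned_scene)
            print("scene drifted beyond tolerance; replanned mid-execution")
        execute_step(plan.pop(0))
```

The replan check runs every control cycle rather than only at fixed checkpoints, which is one simple way to keep the plan consistent with what the robot currently perceives.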

Source

  •