Distributionally Robust Safe Control of Robotic Manipulators in Dynamic Environments

This video demonstrates the obstacle-avoidance behavior of a robotic manipulator across six experiments: the nominal CBF method and the proposed DR-CBF method, each evaluated under measurement-noise standard deviations σ = 0.01, 0.05, and 0.1.

Abstract

In this paper, we investigate safe execution for robotic manipulators operating in environments with dynamic obstacles and perception uncertainty. A key challenge in this setting is that obstacle states must be inferred from noisy measurements, and estimation errors can render safety constraints overly optimistic, increasing the risk of violations. To address this, we propose a distributionally robust control framework that integrates Kalman-filter-based obstacle estimation with control barrier function (CBF) safety constraints, explicitly accounting for estimation uncertainty in safety-critical control. Using a Kalman filter, we maintain a Gaussian belief over obstacle positions and velocities, construct an ambiguity set capturing plausible deviations, and derive a robust CBF constraint that enforces safety under the worst case within this set. We validate the effectiveness and robustness of our approach through simulation studies, demonstrating safe manipulation under uncertainty and dynamic obstacles, and compare its performance against baseline methods such as standard point-estimate CBF controllers and fixed-margin safety constraints.
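The pipeline described in the abstract (Kalman-filter belief over the obstacle, then a CBF constraint tightened by a term derived from the belief covariance) can be illustrated with a minimal sketch. This is a hypothetical single-integrator example, not the paper's implementation: the constant-velocity obstacle model, the safety function h(x) = ||x − p_obs||² − r², the tightening coefficient `kappa`, and the closed-form min-norm correction are all illustrative assumptions.

```python
import numpy as np

def kalman_update(mu, Sigma, z, A, Q, H, R):
    """One predict+update step of a standard Kalman filter
    maintaining a Gaussian belief (mu, Sigma) over the obstacle state."""
    mu_p = A @ mu                            # predicted mean
    Sigma_p = A @ Sigma @ A.T + Q            # predicted covariance
    K = Sigma_p @ H.T @ np.linalg.inv(H @ Sigma_p @ H.T + R)
    mu_n = mu_p + K @ (z - H @ mu_p)         # measurement correction
    Sigma_n = (np.eye(len(mu)) - K @ H) @ Sigma_p
    return mu_n, Sigma_n

def dr_cbf_control(x, u_nom, mu_obs, Sigma_pos, r_safe, alpha=1.0, kappa=2.0):
    """Illustrative robust CBF filter for a single-integrator robot x' = u,
    with h(x) = ||x - p_obs||^2 - r_safe^2 evaluated at the belief mean.
    The constraint is tightened by kappa * sqrt(grad^T Sigma grad),
    a margin covering plausible deviations of the obstacle estimate."""
    d = x - mu_obs
    h = d @ d - r_safe**2
    grad = 2.0 * d                           # dh/dx (and -dh/dp_obs)
    margin = kappa * np.sqrt(grad @ Sigma_pos @ grad)
    # Robust constraint: grad^T u + alpha * h - margin >= 0, i.e. a^T u >= b.
    a, b = grad, margin - alpha * h
    if a @ u_nom >= b:
        return u_nom                         # nominal input already safe
    # Min-norm correction: closed-form projection onto the half-space.
    return u_nom + a * (b - a @ u_nom) / (a @ a)
```

Setting `kappa = 0` recovers a standard point-estimate CBF filter; larger `kappa` covers a larger ambiguity set around the Gaussian belief at the cost of more conservative motion.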

Video Presentation

Video Clips