
Getting a grip in zero-g

Autonomous robot manipulation in microgravity

Published on Nov 17, 2023

Project goals

In this project, we aim to develop a robot that solves a “capture-and-stow” task: autonomously identifying, grasping, and stowing a free-floating object, mimicking real-world docking and servicing tasks. This project aims to achieve two firsts:

  1. It will be the first (to our knowledge) fully autonomous robot manipulation system in microgravity. Previous projects have demonstrated teleoperation [1] and semi-autonomous manipulation [2, 3], but we are not aware of any demonstrations in microgravity with onboard perception, decision making, and control.

  2. It will be the first demonstration of open-source, low-cost robotic manipulation hardware in microgravity, with a target cost ~500x less than previously flown space robots like Robonaut [2] and ~10x less than commercial terrestrial robot arms.


Robot arms have been used extensively on Earth, the Moon, and Mars, but robot arms in low Earth orbit (LEO) have been mostly limited to non-autonomous “teleoperation” by astronauts or ground controllers [1]. While effective, the resource cost for teleoperation is significant, motivating the exploration of autonomous solutions. The ability to autonomously grasp and manipulate objects in microgravity would not only free up valuable crew time but also support applications in orbital servicing and manufacturing.

The constant motion of free-floating objects in microgravity means that an autonomous manipulator must be able to plan and move quickly in order to successfully grasp objects. Previously deployed space manipulators have been semi-autonomous, lacking the on-board autonomy needed to react quickly to the dynamics of free-floating objects [2, 3, 4]. This project will integrate recent advances in computer vision and robust control to build an on-board autonomy stack for the capture-and-stow task. To address the safety concerns posed by full onboard autonomy, we will draw on recent research by members of the project team on certifying the safety of autonomous systems [5].


The project team includes Charles Dawson (MIT AeroAstro, advised by Prof. Chuchu Fan) and Evan Palmer (Oregon State University, advised by Prof. Geoff Hollinger).

Update 1: Choosing a robot

Many factors can go into choosing a robot manipulator, but for this mission the three main considerations are reach (how far the arm can extend), payload (how much it can pick up), and cost.

Reach: Since we’ll need to deploy this arm on the zero-g flight, where space can be tight, we need a robot arm that is relatively compact. Ideally, we’d like to be able to fit within a 50x50 cm footprint; it’s OK if the robot can technically reach further than this, since we can constrain it using software to stay within the bounds of the experiment.
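As a sketch of what “constrain it using software” might look like, here is a minimal (hypothetical, not from our actual flight software) workspace check that rejects any commanded end-effector position outside a 50x50 cm footprint centered on the robot base:

```python
# Hypothetical software workspace limit: commands outside a 50 x 50 cm
# footprint centered on the robot base are rejected before execution.
# (Names and limits here are illustrative, not our flight code.)
HALF_FOOTPRINT_M = 0.25  # half of 50 cm, in meters

def within_footprint(x: float, y: float) -> bool:
    """Return True if the (x, y) target stays inside the allowed footprint."""
    return abs(x) <= HALF_FOOTPRINT_M and abs(y) <= HALF_FOOTPRINT_M

# A 300 mm reach arm can physically exceed the footprint in some directions,
# so a command like (0.28, 0.10) would be rejected in software:
assert within_footprint(0.20, 0.10)
assert not within_footprint(0.28, 0.10)
```

In practice a check like this would sit between the planner and the low-level controller, so the arm never receives a setpoint outside the experiment bounds.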

Payload: Luckily, we have the luxury of designing the manipulation task, so we can choose a relatively light object to grasp (e.g. < 100 g). Choosing a small payload helps us access lower-cost arms and removes any safety concerns from the robot swinging a heavy mass around.

Cost: The cost of a robot arm usually scales with the reach and payload, so it’s good that we’re targeting a small, low-payload arm for this experiment.

A robot that meets all of these requirements is the Trossen Robotics PincherX 100, which has a 100 g maximum payload and 300 mm reach, easily fitting within our deployment constraints. It’s also (relatively) low cost, especially compared to arms with greater reach and payload. One drawback of this arm is that it only has 4 degrees of freedom: it can reach any (x, z) coordinate in its workspace, and it can adjust the pitch of the gripper, but it cannot roll the gripper, and gripper yaw and the y coordinate are coupled (they cannot be chosen independently). This will require some additional mechanical design to add a y axis to the robot, but this extra complexity is justified by the significantly lower cost of this arm compared to a fully actuated arm.
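To see why gripper yaw and the y coordinate are coupled, note that with a single rotating base joint, both are determined by the same joint angle. A small sketch (illustrative geometry only, not the PincherX API):

```python
import math

# With a single base "waist" joint, the arm moves in a vertical plane
# through the base axis. Rotating that plane to reach a target's
# (x, y) position also fixes the gripper's yaw, so the two cannot be
# commanded independently. (Hypothetical helper, not the PincherX API.)
def base_yaw_for_target(x: float, y: float) -> float:
    """Base joint angle that swings the arm's plane onto the (x, y) target."""
    return math.atan2(y, x)

# Reaching y = 0.1 m at x = 0.2 m forces one specific gripper yaw:
yaw = base_yaw_for_target(0.2, 0.1)  # the arm cannot choose yaw separately
```

This is why approaching a tumbling object from an arbitrary direction requires either a fully actuated arm or, as in our case, an extra mechanically added y axis.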

Update 2: Simulation

While hardware will be the gold standard for testing our system, it’s helpful to have a software simulation environment for quick tests during development. We’ve chosen to use Gazebo as our simulation environment, which will allow us to simulate the physics of the robot interacting with a free-floating object, as well as simulate any cameras the robot uses to track the target object.
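As a sketch of how microgravity can be approximated in simulation, Gazebo worlds are described in SDF, and gravity can be zeroed out at the world level. The snippet below is a minimal, illustrative world file (element values are assumptions, not our actual configuration):

```xml
<?xml version="1.0"?>
<sdf version="1.9">
  <world name="zero_g">
    <!-- Zero gravity approximates free-floating dynamics in LEO -->
    <gravity>0 0 0</gravity>
    <physics name="default_physics" type="ode">
      <max_step_size>0.001</max_step_size>
      <real_time_factor>1.0</real_time_factor>
    </physics>
  </world>
</sdf>
```

With gravity disabled, an object given a small initial velocity will drift and tumble freely, which is exactly the behavior the capture-and-stow autonomy stack needs to react to.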

TODO: add videos of simulator when available.

  • [1] Hambuchen et al. “A Review of NASA Human-Robot Interaction in Space.” Current Robotics Reports (2021)

  • [2] Ahlstrom et al. “Robonaut 2 on the International Space Station: Status Update and Preparations for IVA Mobility.” AIAA Space 2013 Conference

  • [3] Farrell et al. “Supervisory Control of a Humanoid Robot in Microgravity for Manipulation Tasks.” IEEE IROS 2017

  • [4] Hirzinger. “ROTEX — The first space robot technology experiment.” Experimental Robotics III 2005

  • [5] Dawson et al. “Safe Control With Learned Certificates: A Survey of Neural Lyapunov, Barrier, and Contraction Methods for Robotics and Control.” IEEE T-RO 2023.
