Laboratory Validation of Vision Based Grasping, Guidance and Control with Two Nanosatellite Models

This work focuses on guidance and control of small satellites, such as CubeSats, performing proximity operations within a range of several meters of a target object. The main goal is to develop a principled methodology for handling the coupled effects of orbital dynamics, rotational and translational rigid-body dynamics, underactuation and control bounds, and obstacle-avoidance constraints. The work was initiated with the support of the Keck Institute for Space Studies (KISS).
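To make the coupling between orbital dynamics and relative motion concrete, the sketch below propagates the Clohessy-Wiltshire (Hill) equations, the standard linearized model for motion near a target in circular orbit. It is a minimal illustration only: the mean motion value and the fixed-step RK4 integrator are assumptions for the example, not the methods used in this work.

```python
import numpy as np

# Assumed mean motion of the target's orbit (roughly a ~500 km LEO), rad/s.
N_MEAN = 0.0011

def cw_derivs(state, accel, n=N_MEAN):
    """Time derivative of [x, y, z, vx, vy, vz] under Clohessy-Wiltshire
    dynamics (x radial, y along-track, z cross-track) plus thrust accel."""
    x, y, z, vx, vy, vz = state
    ax, ay, az = accel
    return np.array([
        vx, vy, vz,
        3 * n**2 * x + 2 * n * vy + ax,   # radial
        -2 * n * vx + ay,                 # along-track
        -n**2 * z + az,                   # cross-track
    ])

def rk4_step(state, accel, dt):
    """One fixed-step RK4 integration step of the CW dynamics."""
    k1 = cw_derivs(state, accel)
    k2 = cw_derivs(state + 0.5 * dt * k1, accel)
    k3 = cw_derivs(state + 0.5 * dt * k2, accel)
    k4 = cw_derivs(state + dt * k3, accel)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# The origin is an equilibrium: coasting there produces no drift.
s = np.zeros(6)
for _ in range(100):
    s = rk4_step(s, np.zeros(3), 1.0)
```

A nonzero radial offset, by contrast, induces the well-known along-track drift, which is exactly why proximity guidance cannot treat translation as simple double-integrator motion.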

<iframe width="320" height="240" src="https://www.youtube.com/embed/POqW0iDRBvU" frameborder="0" allowfullscreen></iframe>


We have developed 2-D experimental testbeds consisting of an air-bearing table and CubeSat engineering models for testing and integration. Simulated scenarios, such as reconfiguration maneuvers and asteroid surface sampling, are used to illustrate the approach. A simple cold-gas propulsion system, built using rapid prototyping, was developed for 6-DOF control.

The video below shows the JHU proximity navigation testbed for nanosatellites in action. The CubeSat engineering model uses on-board sensing and computing to autonomously navigate among obstacles, with the Robot Operating System (ROS) serving as the framework for the perception, planning, and control algorithms that enable the desired autonomous behavior. The model is mounted on an air-bearing base, which provides low-friction planar motion on a polished granite table, and is driven by a cold-gas propulsion system.
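One step such a control stack must perform is mapping a commanded planar force and torque onto individual cold-gas thrusters. The sketch below does this by clipped least squares over an assumed eight-thruster layout; the geometry, thrust level, and allocation scheme are illustrative assumptions, not the actual JHU hardware or software.

```python
import numpy as np

# Assumed layout: 8 thrusters on a planar model (positions in m, body frame).
# This geometry is hypothetical, chosen only to illustrate the allocation step.
POS = np.array([[ 0.1,  0.1], [ 0.1, -0.1], [-0.1,  0.1], [-0.1, -0.1],
                [ 0.1,  0.1], [ 0.1, -0.1], [-0.1,  0.1], [-0.1, -0.1]])
DIR = np.array([[-1, 0], [-1, 0], [ 1, 0], [ 1, 0],
                [ 0, -1], [ 0, 1], [ 0, -1], [ 0, 1]], dtype=float)
F_MAX = 0.05  # N per thruster at full duty cycle (assumed)

def allocation_matrix():
    """3x8 matrix whose column i is the (Fx, Fy, tau) produced by
    thruster i firing at full thrust."""
    cols = []
    for p, d in zip(POS, DIR):
        f = F_MAX * d
        tau = p[0] * f[1] - p[1] * f[0]  # planar cross product r x f
        cols.append([f[0], f[1], tau])
    return np.array(cols).T

def allocate(wrench):
    """Duty cycles in [0, 1] approximating the desired (Fx, Fy, tau).
    Clipping negative solutions makes the result approximate, since
    thrusters can only push, not pull."""
    B = allocation_matrix()
    u, *_ = np.linalg.lstsq(B, np.asarray(wrench, float), rcond=None)
    return np.clip(u, 0.0, 1.0)

u = allocate([0.02, 0.0, 0.0])  # request a pure +x body force
```

Only the two +x-facing thrusters fire, and they fire symmetrically so no parasitic torque is produced.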

Left: Point cloud generated from the depth sensor. Right: A mock-up obstacle environment on the JHU proximity testbed.

Perception and control components

Older related videos:


GNC algorithms applied to three conceptual studies. Motions are computed using optimal control that exploits the orbital and rigid-body dynamics, combined with global stochastic optimization.
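As a minimal sketch of the kind of global stochastic optimization mentioned above, the example below implements the cross-entropy method on a toy quadratic cost. The cost function and all parameters are illustrative assumptions, not the trajectory-optimization objective used in these studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def cem_minimize(cost, mean, std, iters=50, pop=200, elite_frac=0.1):
    """Cross-entropy method: sample from a Gaussian, keep the lowest-cost
    'elite' samples, and refit the Gaussian to them until it concentrates
    around a (hopefully global) minimizer."""
    mean, std = np.array(mean, float), np.array(std, float)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(pop, mean.size))
        costs = np.array([cost(s) for s in samples])
        elite = samples[np.argsort(costs)[:n_elite]]
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mean

# Toy stand-in cost: quadratic bowl with minimum at (2, -1).
best = cem_minimize(lambda s: (s[0] - 2)**2 + (s[1] + 1)**2, [0, 0], [5, 5])
```

In a trajectory-optimization setting, the decision vector would parameterize a candidate motion (e.g., control knot points) and the cost would penalize fuel, terminal error, and constraint violations.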