Postgraduate research project

Autonomous close-proximity spacecraft operations: hardware-in-the-loop validation of neuromorphic event cameras

Funding
Fully funded (UK only)
Type of degree
Doctor of Philosophy
Entry requirements
First-class honours degree
Faculty graduate school
Faculty of Engineering and Physical Sciences
Closing date

About the project

This project aims to overcome the challenges of testing spacecraft control algorithms on real hardware in Earth-based environments. Using our new Aerospace Robotics Control & Simulation (ARCS) facility, we will investigate how bio-inspired neuromorphic event cameras enable resilient, low-compute, autonomous spacecraft proximity operations, and we will experimentally validate the system under realistic space-like conditions using our KUKA KR10 robot arm.

Autonomous close-proximity operations (CPO) between spacecraft are critical for space situational awareness (SSA), inspection of uncooperative targets, and in-orbit servicing. However, spacecraft must operate in one of the most extreme environments encountered by autonomous systems: 

  • severe lighting transitions
  • high dynamic range glare
  • radiation-constrained computation
  • limited control authority
  • communication-denied conditions

While robust relative navigation and control algorithms exist in the literature, most assume idealised sensing and high-performance computing. A major scientific and technological gap remains between theoretical guidance, navigation and control (GNC) frameworks and experimentally validated, radiation-compatible hardware implementations. In particular, conventional frame-based cameras require significant computational resources for continuous feature extraction and pose estimation. 

This research focuses on: 

  • Resilient 6-DOF relative navigation and control: You'll develop a nonlinear relative motion framework for an autonomous chaser spacecraft approaching a non-cooperative target. High-order Extended/Unscented Kalman Filters will fuse inertial sensing with event-based visual pose estimation for an end-to-end on-board state estimation pipeline. Adaptive trajectory planning and constraint-aware attitude control will be implemented to ensure collision avoidance under state uncertainty. 
  • Neuromorphic perception integration: You'll integrate event cameras, which asynchronously detect pixel-level brightness changes, to enable high-speed, low-latency perception with dramatically reduced data rates. Algorithms for event-based feature tracking, pose estimation, and optical flow will be investigated and compared against conventional frame-based methods in terms of accuracy, power consumption, and computational load. 
  • Hardware-in-the-loop (HIL) validation: You'll build a representative spacecraft test model - including event camera, onboard embedded processor, and power unit - as a “partial” physical twin. Using our robotic arm, you will develop a simulation of 6-DOF orbital motion, covering realistic space conditions such as eclipse transitions, glare, sensor dropout, and processor throttling. You'll quantify system resilience, graceful degradation, and recovery performance as part of the verification.
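To illustrate the sensing principle behind the second theme: an event camera pixel fires asynchronously when the change in log-brightness at that pixel crosses a contrast threshold. A minimal simulation of this event-generation model (illustrative only; not tied to any particular sensor or SDK, and the threshold value is an assumption) might look like:

```python
import numpy as np

def generate_events(prev_log_frame, log_frame, threshold=0.2):
    """Emit (row, col, polarity) events wherever the log-brightness
    change between two frames exceeds the contrast threshold.
    A simplified model of an event camera's pixel-level change detection."""
    diff = log_frame - prev_log_frame
    rows, cols = np.where(np.abs(diff) >= threshold)
    polarity = np.sign(diff[rows, cols]).astype(int)  # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarity.tolist()))

# Example: a synthetic 4x4 scene where one pixel brightens sharply
prev = np.log(np.full((4, 4), 100.0))
curr = prev.copy()
curr[1, 2] += 0.5   # brightness step at pixel (1, 2)
events = generate_events(prev, curr)
# events == [(1, 2, 1)]
```

Because only changed pixels produce output, a mostly static scene yields almost no data, which is the source of the dramatically reduced data rates and low latency mentioned above.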
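For the relative-motion theme, one standard starting point for chaser-target dynamics is the linearised Clohessy-Wiltshire (Hill) equations, which often serve as the prediction model inside a navigation filter. The sketch below (a simplified in-plane model with RK4 integration; real GNC pipelines would use a higher-fidelity nonlinear model and a full 6-DOF state) propagates a relative state in the target's local-vertical frame:

```python
import numpy as np

def cw_derivative(state, n):
    """In-plane Clohessy-Wiltshire relative dynamics.
    state = [x, y, vx, vy]: radial (x) and along-track (y) position and
    velocity relative to the target; n = target's orbital mean motion (rad/s)."""
    x, y, vx, vy = state
    ax = 3.0 * n**2 * x + 2.0 * n * vy
    ay = -2.0 * n * vx
    return np.array([vx, vy, ax, ay])

def propagate(state, n, dt, steps):
    """Fixed-step RK4 propagation of the linearised relative motion."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        k1 = cw_derivative(s, n)
        k2 = cw_derivative(s + 0.5 * dt * k1, n)
        k3 = cw_derivative(s + 0.5 * dt * k2, n)
        k4 = cw_derivative(s + dt * k3, n)
        s = s + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return s

# A pure along-track offset with zero velocity is an equilibrium of this model
s = propagate([0.0, 100.0, 0.0, 0.0], n=0.0011, dt=1.0, steps=60)
```

In a filter such as an EKF, `propagate` would play the role of the prediction step, with event-based pose measurements supplying the update.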

The School of Engineering is committed to promoting equality, diversity and inclusion, as demonstrated by our Athena SWAN award. We welcome all applicants regardless of their gender, ethnicity, disability, sexual orientation or age, and will give full consideration to applicants seeking flexible working patterns and those who have taken a career break. The University has a generous maternity policy, onsite childcare facilities, and offers a range of benefits to help ensure employees’ well-being and work-life balance. The University of Southampton is committed to sustainability and has been awarded the Platinum EcoAward.