Close Proximity Space Domain Awareness On-Board Satellite Systems

The research topic aligns with the CRC’s Advanced Satellite Systems, Sensors and Intelligence research program, and is highly relevant to the defence and security industry focus. The proposed research focuses on developing intelligent satellite systems that can perceive and relay the semantics of any space situation by processing onboard sensor data. Such research therefore follows the theme of building advanced satellite systems that leverage Artificial Intelligence (AI) techniques to perform advanced analytics on collected sensor data. We currently call this capability “onboard Space Domain Awareness (SDA)”. Given that there is no universally recognised definition, SDA can be defined here as the capability to detect, track, identify and characterise objects in space.

Through the partnership and collaboration with Infinity Avionics, the immediate intention is to target defence applications. In this context, the research has potential application to tasks such as 3D reconstruction, pose estimation and motion characterisation of space objects. Further, the proposed research aligns with many of the planned future capabilities of existing platforms in the area of space logistics, such as the Mission Extension Vehicle (MEV) from Northrop Grumman. Future onboard-autonomy tasks such as satellite inspection, repair and in-orbit robotic assembly of space structures would benefit immensely from advances in close-proximity SDA.

Surveillance and tracking applications would also benefit from advances in SDA. Certain surveillance and tracking tasks, such as conjunction analysis (delivering collision alerts between two objects) and fragmentation tracking (surveying and characterising new debris emanating from a collision or explosion), rely on extracting semantics from a region of space to produce actionable intelligence, a function fulfilled by SDA. The purpose of the joint research and collaboration with Infinity Avionics is to develop cutting-edge machine learning and computer vision algorithms for novel SDA-oriented space applications. Specifically, a key focus of the partnership is the development of algorithms that leverage event cameras to achieve simultaneous localisation and mapping (SLAM), pose estimation and related tasks in a close-proximity space environment. These new capabilities are intended to extend onboard satellite perception systems by exploiting the unique operating characteristics of neuromorphic (event) cameras, namely their high dynamic range, high temporal resolution, low data rate and low power consumption.
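As a concrete illustration of the data such algorithms would consume, the minimal Python sketch below shows the asynchronous (x, y, t, polarity) event stream produced by a neuromorphic camera and one common way of accumulating it into an image-like frame for downstream vision tasks. The array layout and the accumulate_events helper are illustrative assumptions, not the project’s actual pipeline.

```python
import numpy as np

# Illustrative sketch only: each event is a tuple (x, y, t, p) giving
# pixel location, timestamp (seconds) and polarity (+1/-1).
events = np.array([
    (120, 64, 0.000012, +1),
    (121, 64, 0.000015, -1),
    (119, 65, 0.000021, +1),
], dtype=[("x", np.int16), ("y", np.int16), ("t", np.float64), ("p", np.int8)])

def accumulate_events(events, height, width, t_start, t_end):
    """Accumulate polarity-signed events over a time window into a 2D frame.

    This is one common way to convert the sparse, asynchronous event stream
    into an image-like representation that conventional vision algorithms
    (feature tracking, pose estimation) can consume.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    window = events[(events["t"] >= t_start) & (events["t"] < t_end)]
    np.add.at(frame, (window["y"], window["x"]), window["p"].astype(np.float32))
    return frame

frame = accumulate_events(events, height=128, width=256, t_start=0.0, t_end=0.001)
```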

Due to the unique constraints of operating within the space environment, sensors (and the algorithms that use them) must cope with challenging conditions: extreme lighting, high-velocity motion and tight power budgets. Given the end-user’s proven capability in developing space-ready hardware (e.g. Orion12MP, SelfieCam), we hope to combine their hardware expertise with our expertise in computer vision algorithms to develop a system that uses a monocular event camera to track the motion of a single (uncooperative) object in space. Towards this goal, the combined team will develop algorithms that take the constraints of the operating environment into account and fully utilise onboard hardware to achieve efficient operation.
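For illustration only, the sketch below shows how the relative rotation and translation direction of a target could be estimated between two accumulated event frames using standard two-view geometry via OpenCV. The camera intrinsic matrix K and the 8-bit event-frame inputs are placeholders; a flight-ready system would need event-specific feature handling, robust outlier rejection and metric scale recovery.

```python
import cv2
import numpy as np

def relative_motion(frame_prev, frame_curr, K):
    """Estimate relative rotation R and translation direction t of a target
    between two 8-bit event frames using standard two-view geometry.

    A hedged illustration only, not the project's algorithm.
    """
    # Detect corners in the previous frame and track them into the current one.
    pts_prev = cv2.goodFeaturesToTrack(frame_prev, maxCorners=300,
                                       qualityLevel=0.01, minDistance=5)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(frame_prev, frame_curr,
                                                   pts_prev, None)
    good_prev = pts_prev[status.ravel() == 1]
    good_curr = pts_curr[status.ravel() == 1]

    # Robustly estimate the essential matrix, then decompose it into R and t.
    E, inliers = cv2.findEssentialMat(good_prev, good_curr, K,
                                      method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good_prev, good_curr, K, mask=inliers)
    return R, t  # rotation matrix and unit translation direction (scale unknown)
```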

As technology in this area matures, the proposed joint research will pivot towards attaining real-time operation of the object motion estimation algorithm and exploring potential multi-sensor architectures (e.g. event and LiDAR) to further improve its efficacy. To address these new challenges, new techniques must be devised that fuse the additional information from the other sensors with the event stream, as sketched below.
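One possible fusion step, sketched below under assumed prior calibration, is to project LiDAR returns into the event-camera image plane so that metric depth can be attached to event-frame pixels, resolving the scale ambiguity of monocular motion estimates. The extrinsics T_cam_lidar and intrinsics K are hypothetical calibration inputs, not outputs of this project.

```python
import numpy as np

def project_lidar_to_event_frame(points_lidar, T_cam_lidar, K, height, width):
    """Project 3D LiDAR points (Nx3) into the event-camera image plane.

    A hedged sketch of one possible event/LiDAR fusion step: building a sparse
    depth map aligned with the event frame. T_cam_lidar (4x4 extrinsics) and
    K (3x3 intrinsics) are assumed to come from prior sensor calibration.
    """
    # Transform LiDAR points into the camera frame (homogeneous coordinates).
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep points in front of the camera and project with the pinhole model.
    pts_cam = pts_cam[pts_cam[:, 2] > 0]
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]

    # Scatter the depth values into a sparse depth map (NaN where no return).
    depth = np.full((height, width), np.nan, dtype=np.float32)
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    valid = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    depth[v[valid], u[valid]] = pts_cam[valid, 2]
    return depth
```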

P2.51s

Project Leader:
Professor Tat-Jun Chin, The University of Adelaide

PhD Student:
Ethan Elms, The University of Adelaide

Participants: