The SCARLET laboratory (lab) is a SmartSat initiative to develop innovative technologies across spacecraft autonomy, on-board Artificial Intelligence (AI) and data analytics.

The lab provides a collaborative platform to bring together researchers and industry to advance autonomy and produce tangible outcomes for Defence and Civil pursuits, enabling Australia’s next space missions.

Examples of challenges that the SCARLET lab aims to tackle include:

  1. Autonomous spacecraft and smarter, more efficient payload operations
  2. Resilient and sustainable operations in a congested Low Earth Orbit
  3. Accurate information extraction and timely analytics of terabytes of Earth Observation data

Spacecraft autonomy and AI is a technology area in which Australia can make a contribution on the international stage, enabling our future space endeavours and making a difference here on Earth.

To get involved, contact SCARLET Lab Director Dr Carl Seubert (Chief Research Officer, SmartSat CRC) at [email protected].

Programs

SCARLET-α

Spacecraft Autonomy and Onboard AI for Next Generation Space Systems

Project Leader: Professor Ryszard Kowalczyk, SmartSat Professorial Chair for Artificial Intelligence
(University of South Australia)

This project aims to address these challenges by developing novel concepts, methods and technologies that provide new AI-based spacecraft autonomy capabilities for next-generation space systems, such as dynamically networked formations of heterogeneous satellites. It focuses on high-impact areas of spacecraft autonomy and onboard AI, identified and prioritised with industry and defence partners, including:

  • WP1: Onboard processing and actionable intelligence
  • WP2: Small spacecraft and constellation resilience
  • WP3: Dynamic optimisation of constellation resources
  • WP4: Real-time tasking and resource allocation

The output of this project will be a set of autonomy algorithms, with their capabilities demonstrated through software simulations of use cases provided by the industry partners.
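
The work packages above are open research areas rather than fixed designs, but a small illustration may help make the WP4 theme of real-time tasking and resource allocation concrete. The Python sketch below is a hypothetical toy example, assuming a greedy assignment of imaging tasks to whichever satellite has the best predicted reward and enough remaining energy; the satellite names, task costs and rewards are invented and do not come from the project.

```python
# Illustrative only: a hypothetical greedy tasker for a small constellation.
# This is a toy sketch of the WP4 theme ("real-time tasking and resource
# allocation"), not the project's actual algorithms.
from dataclasses import dataclass, field

@dataclass
class Satellite:
    name: str
    energy: float                       # remaining energy budget (arbitrary units)
    assigned: list = field(default_factory=list)

@dataclass
class Task:
    name: str
    cost: float                         # energy needed to execute the task
    reward: dict                        # predicted value per satellite name

def greedy_assign(tasks, satellites):
    """Assign each task to the feasible satellite with the highest predicted reward."""
    for task in sorted(tasks, key=lambda t: -max(t.reward.values())):
        feasible = [s for s in satellites if s.energy >= task.cost]
        if not feasible:
            continue                    # no satellite can afford this task right now
        best = max(feasible, key=lambda s: task.reward.get(s.name, 0.0))
        best.assigned.append(task.name)
        best.energy -= task.cost
    return satellites

if __name__ == "__main__":
    sats = [Satellite("sat-A", energy=10.0), Satellite("sat-B", energy=6.0)]
    tasks = [
        Task("image-site-1", cost=4.0, reward={"sat-A": 0.9, "sat-B": 0.4}),
        Task("image-site-2", cost=5.0, reward={"sat-A": 0.3, "sat-B": 0.8}),
        Task("downlink-pass", cost=3.0, reward={"sat-A": 0.6, "sat-B": 0.7}),
    ]
    for sat in greedy_assign(tasks, sats):
        print(sat.name, sat.assigned, f"energy left: {sat.energy:.1f}")
```

In practice such simple heuristics would be replaced or augmented by the optimisation and learning methods the project is investigating.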

SCARLET-β

Spacecraft autonomy experiment via an optimal self-image against a backdrop of Australia

Project Leader: Professor Salah Sukkarieh, The University of Sydney

The aim of this project is to research, develop and test goal-oriented algorithms and software that give a spacecraft the autonomous capability to undertake its mission robustly and adaptively in real time. The activities will focus on coupling optimisation and machine learning techniques to orbital and sensing prediction models, so that as sensing data is obtained in real time the next best action can be determined.
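
As a minimal sketch of that pattern (couple a prediction model to an optimiser so that each new sensing sample updates the choice of the next action), the hypothetical Python example below performs a one-step lookahead over a handful of candidate actions. The state variables, action set and scoring model are invented stand-ins, not the project's algorithms.

```python
# Hypothetical illustration of "predict, then choose the next best action".
# The state, actions and scoring model are invented stand-ins, not the
# project's orbital or sensing prediction models.
import random

def predict_outcome(state, action):
    """Stand-in prediction model: score how promising `action` looks from `state`."""
    return state["sun_angle_quality"] * action["exposure_gain"] - action["slew_cost"]

def next_best_action(state, candidate_actions):
    """One-step lookahead: evaluate every candidate and pick the best-scoring one."""
    return max(candidate_actions, key=lambda a: predict_outcome(state, a))

def run_episode(steps=5):
    actions = [
        {"name": "image_now",  "exposure_gain": 1.0, "slew_cost": 0.2},
        {"name": "slew_first", "exposure_gain": 1.4, "slew_cost": 0.6},
        {"name": "wait",       "exposure_gain": 0.0, "slew_cost": 0.0},
    ]
    for step in range(steps):
        # New sensing data arrives; random here, but live telemetry onboard.
        state = {"sun_angle_quality": random.uniform(0.0, 1.0)}
        chosen = next_best_action(state, actions)
        print(f"step {step}: quality={state['sun_angle_quality']:.2f} -> {chosen['name']}")

if __name__ == "__main__":
    run_episode()
```

A real onboard system would replace the random "sensing" with live measurements and the hand-written score with learned orbital and sensing prediction models.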

The autonomy will be experimentally tested using the Defence Science and Technology Group (DSTG) Buccaneer Main Mission (BMM) spacecraft, scheduled to launch in 2023. BMM features the MANTIS payload with a controllable, deployable arm for self-inspection imaging. The project output will be a set of algorithms, methodologies and approaches that give the spacecraft the ability to take an optimal image of itself against a backdrop of Australia in real time. The outcomes will include insights into the relationship between on-board and off-board autonomy that are applicable to other spacecraft missions.

KANYINI ONBOARD-AI

A range of projects affiliated with the South Australian Space Services Mission satellite, Kanyini, are currently underway. These projects focus on payload autonomy experiments that use AI to process hyperspectral imagery onboard, covering efficient tools such as band registration and panoptic segmentation as well as applications such as early fire-smoke detection.

  • Onboard Hyperspectral AI: Calibration, Panoptic Segmentation, Estimation
    Project Leader: Professor Clinton Fookes, Queensland University of Technology

    This project will develop new capabilities for onboard AI processing and analysis of hyperspectral imagery on smart satellite platforms. In particular, it will tackle the key modules of calibration, segmentation, fine-grained analysis and joint space-ground inference for onboard AI processing of hyperspectral data. New capabilities in these areas will transform the ability of a satellite to automatically make sense of rich, multidimensional spectral data in an end-to-end manner onboard the satellite itself. This will create new opportunities for accurate, efficient and reliable automated detection and classification of natural phenomena and human activities over wide areas on Earth.
  • Small satellite energy-efficient on-board AI processing of hyperspectral imagery for early fire-smoke detection
    Project Leader: Dr Stefan Peters, University of South Australia

    This research aims to provide a solution for energy-efficient, AI-based onboard processing of hyperspectral imagery that supports automated early detection of fire smoke. We propose using modified and resampled MODIS imagery that emulates the swath as well as the spectral, spatial and radiometric resolution of HyperScout-2 channel 1 hyperspectral imagery (a generic sketch of this kind of sensor emulation follows the project list below). In doing so, we intend to provide a solution that meets the onboard processing limitations and up/downlink data-transfer restrictions of Kanyini's HyperScout-2 payload and its Intel Myriad X VPU chip.

    Read Technical Report 11: Energy efficient onboard AI for early fire smoke detection

  • Robust Predictive AI: Advanced Satellite Hyperspectral Band Registration and Reliable Event Prediction
    Project Leader: Professor Clinton Fookes, Queensland University of Technology

    This project will research a novel deep learning pipeline for robust and reliable forecasting of natural disaster events from hyperspectral satellite imagery. Most existing machine learning algorithms cannot forecast natural disasters in advance; they can only detect a disaster event once it has occurred.

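The fire-smoke detection project above describes resampling MODIS imagery to emulate the swath and the spectral, spatial and radiometric resolution of HyperScout-2. As a generic, hypothetical sketch of that kind of sensor emulation (the band mapping, scale factors and array sizes below are invented and do not reflect the project's actual workflow), spatial and spectral resampling of an image cube can be illustrated with NumPy:

```python
# Illustrative only: resampling one sensor's image cube to emulate another
# sensor's spatial and spectral resolution, in the spirit of the MODIS ->
# HyperScout-2 emulation described above. All numbers and mappings are invented.
import numpy as np

def spatial_resample(cube, factor):
    """Downsample a (rows, cols, bands) cube by block-averaging factor x factor pixels."""
    rows, cols, bands = cube.shape
    rows, cols = rows - rows % factor, cols - cols % factor
    blocks = cube[:rows, :cols].reshape(rows // factor, factor, cols // factor, factor, bands)
    return blocks.mean(axis=(1, 3))

def spectral_resample(cube, band_weights):
    """Mix source bands into target bands; band_weights has shape (src_bands, dst_bands)."""
    return cube @ band_weights          # (rows, cols, src) @ (src, dst) -> (rows, cols, dst)

if __name__ == "__main__":
    # Fake "MODIS-like" cube: 120 x 120 pixels, 8 source bands of reflectance.
    src = np.random.rand(120, 120, 8).astype(np.float32)
    # Invented mapping into 4 broader "target" bands.
    weights = np.zeros((8, 4), dtype=np.float32)
    for i in range(8):
        weights[i, i // 2] = 0.5        # average pairs of source bands into each target band
    emulated = spectral_resample(spatial_resample(src, factor=3), weights)
    print(emulated.shape)               # -> (40, 40, 4)
```

A real emulation would also account for sensor point-spread functions, radiometric calibration and noise, which this sketch omits.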