What capabilities does the first AI satellite have?
Though artificial intelligence (AI) plays a vital role in modern life, from deepening our understanding of the cosmos to surfacing entertaining videos on your phone, it had never found its way into orbit.
That changed on September 2, when PhiSat-1, an experimental satellite about the size of a cereal box, was launched. It is now soaring at more than 27,500 km/h in sun-synchronous orbit around 530 km overhead.
PhiSat-1 carries a new hyperspectral-thermal camera and on-board AI processing powered by an Intel Movidius Myriad 2 Vision Processing Unit (VPU), the same chip found inside many smart cameras and even a $99 selfie drone here on Earth. PhiSat-1 is one of a pair of satellites on a mission to monitor polar ice and soil moisture, and it also tests inter-satellite communication systems to create a future network of federated satellites.
How do you handle the massive amount of data generated by high-fidelity cameras such as the one on PhiSat-1? Gianluca Furano, data systems and on-board computing lead at the European Space Agency, says, "The capacity that sensors have to produce data is increasing, but only by a factor of three, four, five per generation."
Meanwhile, about two-thirds of the Earth's surface is covered in clouds at any given time, which means many useless cloud images are typically captured, saved, sent over precious downlink bandwidth to Earth, saved again, evaluated hours or days later by a scientist or an algorithm, and only then deleted.
Furano says, "AI at the edge came to rescue us, the cavalry in the Western movie." The idea was to use on-board processing to recognise and discard cloudy images, which can save almost 30% of the bandwidth.
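The discard-before-downlink idea can be sketched in a few lines. This is a minimal illustrative sketch, not ESA's or Ubotica's actual pipeline; the function names, the per-pixel cloud mask (which in a real system would come from the VPU's inference output), and the 70% threshold are all hypothetical.

```python
# Minimal sketch of on-board cloud filtering (illustrative only; names
# and the threshold are hypothetical, not the actual PhiSat-1 pipeline).

CLOUD_FRACTION_LIMIT = 0.7  # discard images that are mostly cloud

def cloud_fraction(cloud_mask):
    """Fraction of pixels classified as cloud.

    cloud_mask is a flat list of 0/1 values, one per pixel, as a
    stand-in for the on-board model's segmentation output."""
    return sum(cloud_mask) / len(cloud_mask)

def select_for_downlink(images):
    """Keep only images clear enough to be worth the downlink bandwidth.

    images is a list of (image_id, cloud_mask) pairs."""
    return [image_id for image_id, mask in images
            if cloud_fraction(mask) < CLOUD_FRACTION_LIMIT]

# Example: two usable frames and one fully clouded frame.
frames = [
    ("frame_001", [0, 0, 1, 0]),   # 25% cloud -> keep
    ("frame_002", [1, 1, 1, 1]),   # 100% cloud -> discard
    ("frame_003", [0, 1, 1, 0]),   # 50% cloud -> keep
]
print(select_for_downlink(frames))  # ['frame_001', 'frame_003']
```

The point of the design is that the expensive step, transmission to Earth, happens only after the cheap on-board check, which is where the bandwidth saving comes from.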
Aubrey Dunne, Chief Technology Officer at Ubotica, says, "Space is the ultimate edge." The Irish startup built and tested PhiSat-1's AI technology, working closely with camera maker Cosine, along with the University of Pisa and Sinergise, to deliver the complete solution. Dunne adds, "The Myriad was designed from the ground up to have an impressive compute capacity but in a very low power envelope, and that suits space applications."
However, the Myriad 2 was not designed for orbit. Spacecraft computers generally use specialised "radiation-hardened" chips, which can be "up to two decades behind state-of-the-art commercial technology," Dunne explains. Artificial intelligence has not been on the menu.
Dunne and the Ubotica team carried out "radiation characterisation," putting the Myriad chip through a series of tests to understand how to handle any resulting errors. ESA had never tested a chip of such complexity for radiation and doubted it could be tested properly; the team had to write the handbook on comprehensive testing and characterisation of this chip from scratch.
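The article does not say how Ubotica ultimately handles radiation-induced errors, but one generic software mitigation for transient bit flips is redundant execution with majority voting. The sketch below shows that general technique only; it is not a description of the PhiSat-1 design.

```python
# Illustrative software mitigation for radiation-induced bit flips:
# run the same computation several times and majority-vote the results.
# A generic technique, not Ubotica's actual error-handling strategy.

from collections import Counter

def vote(results):
    """Return the majority result; raise if no value wins a majority."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority: possible uncorrectable error")
    return value

def run_redundant(fn, *args, copies=3):
    """Execute fn several times and vote, masking a single corrupted run."""
    return vote([fn(*args) for _ in range(copies)])

# Example with a deliberately corrupted run injected:
outputs = [42, 42, 41]       # one run corrupted by a bit flip
print(vote(outputs))         # 42
```

The trade-off is the classic one for non-hardened commercial chips: extra compute time buys tolerance of occasional transient faults.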
Although the low-power, high-performance computer vision chip was ready to venture beyond the planet's atmosphere, another challenge remained.
AI algorithms are usually built, or trained, using enormous quantities of data; in this case, the model needed to learn what is cloud and what is not. Because the camera was new, the team had no data from it, so they had to train their application on synthetic data extracted from existing missions.
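The shape of that workaround can be illustrated with a toy example: generate labeled samples whose statistics mimic the expected sensor output, then fit a classifier to them. Everything below is invented for demonstration; the real PhiSat-1 model, bands, and data pipeline are not described in the article.

```python
# Toy sketch of training on synthetic data (values and labels are
# invented; this is not the actual PhiSat-1 training pipeline).

import random

random.seed(0)

def synth_sample(is_cloud):
    """Fake single-band reflectance: clouds are bright, ground is darker."""
    base = 0.8 if is_cloud else 0.2
    return base + random.uniform(-0.1, 0.1), int(is_cloud)

# Build a synthetic training set: 500 cloud and 500 non-cloud samples.
data = [synth_sample(i % 2 == 0) for i in range(1000)]

# "Train" the simplest possible classifier: put the decision threshold
# midway between the two class means.
cloud_mean = sum(x for x, y in data if y) / sum(y for _, y in data)
clear_mean = sum(x for x, y in data if not y) / sum(1 - y for _, y in data)
threshold = (cloud_mean + clear_mean) / 2

def is_cloud(reflectance):
    return reflectance > threshold

accuracy = sum(is_cloud(x) == bool(y) for x, y in data) / len(data)
print(f"threshold={threshold:.2f} accuracy={accuracy:.2f}")
```

A real model would of course be a neural network over many spectral bands, but the principle is the same: when no real sensor data exists yet, plausible synthetic samples let training proceed before launch.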
With the help of other organisations across Europe, the team completed the whole system, including software integration and testing, within four months. In late September, after all the on-ground cloud-detection verifications, Pastena proclaimed, "We have just entered the history of space."
ESA announced that the joint team was “happy to reveal the first-ever hardware-accelerated AI inference of Earth observation images on an in-orbit satellite.”
ESA and Ubotica are now working together on PhiSat-2, which is expected to identify wildfires in minutes, spot rogue ships or environmental mishaps, track soil moisture and crop growth, and monitor climate change.