Objects falling back to Earth from space are increasingly in the news, including decommissioned satellites and a balky Russian rocket stage that crashed in the Pacific Ocean January 5.
The recent launch of the James Webb Space Telescope also reminded observers how the Earth is continuously orbited by hundreds of thousands of large and small fragments of "space debris," each one a potential speeding bullet to any spacecraft it happens to strike.
To help predict these space jams and falling objects, Stevens Institute of Technology faculty researchers are on the case.
Professors Rajarathnam "Mouli" Chandramouli and K.P. "Suba" Subbalakshmi recently took top prize among all university entries in the U.S. Space Force's 2021 Hyperspace Challenge, winning $25,000 for their proposed artificial intelligence-based system to continuously optimize how Earth-based sensors (such as telescopes and radars) work together to track space objects and anticipate collisions in real time.
"Both accidental collisions and intentional, hostile acts are real threats in space," says Chandramouli. "Our 'smart' sensor-tasking system was more than 60% more efficient than the current state-of-the-art systems for observing objects, so we believe this early work is an important first step toward safer space environments for everyone."
More intelligent sky-watching
By 2050, an estimated 50,000 man-made satellites will have been launched and will be orbiting the Earth, a twentyfold increase over the roughly 2,500 satellites currently circling the planet.
There are also an estimated half-million more large, small and microscopic pieces of "space debris" orbiting Earth, at speeds of up to 16,000 miles per hour — everything from tiny meteorites and discarded gloves to paint chips and parts of satellites that have disintegrated or collided. More than 25,000 of these objects are larger than a softball.
Those objects also frequently crash into each other or are intentionally destroyed by their owners, creating even more pieces of high-speed trash that must be carefully avoided. In 2020 alone, spacecraft made three evasive maneuvers to steer clear of potential collisions with orbital objects.
The bottom line: in a space environment where even a coin-sized object can punch a lethal hole in a spacecraft — "hard impacts from small objects have already happened," says Chandramouli — it will be critically important to know where all space objects of interest are, at all times, and whether each is moving toward anything else.
For decades this has been done the old-fashioned way, by fixing telescopes and radars on some 20,000 individual known objects at various altitudes, tracking them through the sky, and then repeating the exercise — observing the same objects, with the same devices — the following evening.
With the forecasted rapid increase in satellite launches, plus increased concerns about space-based rogue actions, however, the U.S. government's thinking and strategy have recently changed.
"The Space Force and the Air Force Research Laboratory have decided they would like to inject more intelligence into the observation and prediction process," says Subbalakshmi. "They want to develop new systems that can optimize themselves automatically and enhance the observations."
The Stevens team's innovation was to repurpose technology that had originally been designed to assist emergency responders in crisis situations. During chaotic events such as the 2001 Twin Towers attacks or the 2013 Boston Marathon bombing, normal communications channels are quickly overwhelmed, preventing responders from talking to dispatchers or one another.
To address that deficiency, the Stevens duo began working on algorithms that can intelligently switch radio bandwidths when they sense congestion increasing — without dropping the calls.
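The article doesn't detail those algorithms, but the basic idea of congestion-aware switching can be illustrated with a toy sketch. The channel names, load values and threshold below are illustrative assumptions, not the Stevens design:

```python
# Toy sketch of congestion-aware channel switching: stay on the current
# band until its load crosses a threshold, then move to the least
# congested alternative. All values here are illustrative assumptions.

def choose_channel(loads, current, threshold=0.8):
    """Keep `current` unless it is congested; otherwise pick the lightest band."""
    if loads[current] < threshold:
        return current
    return min(loads, key=loads.get)

loads = {"band-A": 0.95, "band-B": 0.40, "band-C": 0.70}
print(choose_channel(loads, current="band-A"))  # band-A is congested: switch to band-B
print(choose_channel(loads, current="band-C"))  # band-C is below threshold: stay put
```

A real system would also have to hand off calls without dropping them, which is the hard part the researchers' algorithms address.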
"This is somewhat of an iteration of that other public safety idea," explains Subbalakshmi, "applied to the new and also highly challenging problem of space objects."
The duo's newly proposed system decides in real time which telescopes and radars should observe which regions of space and which objects. It continually adjusts the sensor assignments (in other words, where each device is pointing) according to the very latest information about the thousands of tracked objects: their motions, their predicted paths and any sudden deviations from those paths.
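The researchers' actual method isn't published in this article, but the core scheduling step can be sketched in miniature: give each tracked object a priority (here, a hypothetical score based on how far it has drifted from its predicted path) and greedily assign the available sensors to the highest-priority objects. The object names and scoring rule are illustrative assumptions only:

```python
# Toy sensor-tasking sketch: greedily point sensors at the tracked objects
# whose observed positions deviate most from their predicted paths.
# The priority rule and sample data are illustrative assumptions.

def task_sensors(objects, num_sensors):
    """Return a list of (sensor_index, object_id) assignments."""
    # Larger deviation from the predicted path -> observe sooner.
    ranked = sorted(objects, key=lambda o: o["deviation_km"], reverse=True)
    return [(s, obj["id"]) for s, obj in enumerate(ranked[:num_sensors])]

tracked = [
    {"id": "SAT-A", "deviation_km": 0.2},
    {"id": "DEBRIS-7", "deviation_km": 4.1},  # drifting off its predicted path
    {"id": "SAT-B", "deviation_km": 1.3},
]

print(task_sensors(tracked, num_sensors=2))
# -> [(0, 'DEBRIS-7'), (1, 'SAT-B')]: both sensors go to the most anomalous objects
```

In the real system this assignment would be recomputed continuously as fresh observations arrive.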
"The process of observing objects is somewhat similar to the way a self-driving car looks at all the objects on the road with its cameras and sensors," explains Chandramouli, "and finds a pedestrian, finds a stop sign, then treats those things differently as it moves for optimal safety to all."
"This proposed work is addressing the very first step of this process, which is deciding what camera or sensor looks at what."
By learning on the fly and "rewarding" itself for correctly sensing objects, reinforcing the patterns that led to successful predictions, the Stevens-developed algorithm continually optimizes itself in real time, rapidly increasing its effectiveness.
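That reward loop resembles a simple reinforcement-learning update. A minimal sketch, assuming a bandit-style learner that raises the weight of pointing choices that led to a successful observation (the scan names, success rates and learning rate are all hypothetical):

```python
import random

# Minimal reinforcement-style sketch of a self-optimizing pointing loop:
# choices that yield a successful observation are "rewarded" so they are
# picked more often. Illustrative assumption, not the Stevens algorithm.

class SensorPolicy:
    def __init__(self, choices, lr=0.1):
        self.values = {c: 0.5 for c in choices}  # estimated success rate per choice
        self.lr = lr

    def pick(self):
        # Point wherever the estimated success rate is currently highest.
        return max(self.values, key=self.values.get)

    def update(self, choice, success):
        # Nudge the estimate toward the observed reward (1 = success, 0 = miss).
        reward = 1.0 if success else 0.0
        self.values[choice] += self.lr * (reward - self.values[choice])

policy = SensorPolicy(["low-orbit scan", "geo-belt scan"])
random.seed(0)
for _ in range(200):
    choice = policy.pick()
    # Hypothetical ground truth: low-orbit scans succeed far more often.
    success = random.random() < (0.9 if choice == "low-orbit scan" else 0.4)
    policy.update(choice, success)

print(policy.pick())  # the learner settles on the more productive scan
```

A production system would add exploration and far richer state, but the feedback principle is the same: reinforce the patterns that led to successful prediction.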
Future iterations of their algorithmic system might go on to address the next steps, adds Chandramouli, such as real-time assessment of the various objects themselves and the characteristics of their paths through the sky.
Anand Santhanakrishnan of New York Institute of Technology also contributed research to the project proposal.