By: Robin Seemangal
The results from the launch of the Kepler* space telescope and the K2 mission to search for planets outside of our solar system have been nothing short of astounding. From Kepler data, thousands of exoplanet candidates have been identified, and 2,343 of those have been confirmed as exoplanets.
So far, 30 of those planets have the potential to be habitable.
One of those exoplanets, Kepler-90i, is a toasty, rocky destination 2,545 light-years away that orbits its host star every 14.4 days. This planet was found using artificial intelligence, specifically Google* machine learning infrastructure.
"In this case, computers learned to identify planets by finding in Kepler data instances where the telescope recorded signals from planets beyond our solar system, known as exoplanets," explained NASA.
In April, Elon Musk's SpaceX* launched NASA's Transiting Exoplanet Survey Satellite* (TESS) from Cape Canaveral so the new satellite could add to the growing list of exoplanets. During Kepler's initial four-year mission, which began in 2009, the satellite continuously scouted a small portion of the night sky, searching through 150,000 potential host stars.
The way Kepler and its successor TESS spot planets is complex and prone to false positives, which have indeed occurred.
The satellites don't actually search for worlds orbiting stars; they stare at stars and wait for them to periodically dim. Minute dips in light can indicate a large body making regular trips around that host star. A host star's dimming is, to say the least, not easy to spot, and it takes large teams thousands of hours to process the imaging data. Sometimes scientists miss something, and that's where AI can step in.
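The transit method described above can be sketched in a few lines of code. The example below is purely illustrative: it builds a synthetic light curve with a small, regularly repeating dip and then recovers the repeat interval, which is the planet's orbital period. The function names, dip depth, and sampling are all made up for this sketch; real pipelines work with far noisier photometry.

```python
def make_light_curve(n_samples, period, dip_depth=0.01, dip_width=2):
    """Synthetic light curve: baseline brightness 1.0 with a small,
    regularly repeating dip (the 'transit'). All values illustrative."""
    curve = []
    for t in range(n_samples):
        in_transit = (t % period) < dip_width
        curve.append(1.0 - dip_depth if in_transit else 1.0)
    return curve

def find_transit_period(curve, threshold=0.995):
    """Flag moments the star dims below a threshold, then estimate the
    repeat interval from the spacing between successive dip starts."""
    dip_starts = [
        t for t in range(1, len(curve))
        if curve[t] < threshold and curve[t - 1] >= threshold
    ]
    if len(dip_starts) < 2:
        return None  # fewer than two dips: no period to measure
    gaps = [b - a for a, b in zip(dip_starts, dip_starts[1:])]
    return sum(gaps) / len(gaps)

curve = make_light_curve(n_samples=200, period=29, dip_depth=0.01)
print(find_transit_period(curve))  # recovers the injected period, 29.0
```

On real Kepler data the dips are fractions of a percent, buried in instrumental noise and stellar variability, which is exactly why teams spend thousands of hours on the imaging data.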
"Just as we expected, there are exciting discoveries lurking in our archived Kepler data, waiting for the right tool or technology to unearth them," said Paul Hertz, director of NASA's Astrophysics Division in Washington. "This finding shows that our data will be a treasure trove available to innovative researchers for years to come."
When Kepler-90i was discovered by a deep learning program, its signal had already been missed by the project's principal researchers. Machine learning is geared toward data sets that must be re-analyzed as they grow. The AI uses what it learns to adapt and refine the parameters of its query, such as a given amount of dimming recurring every given number of hours in front of a star of a given brightness.
"The discovery came about after researchers Andrew Vanderburg and Christopher Shallue trained a computer to learn how to identify exoplanets in the light readings recorded by Kepler* – the minuscule change in brightness captured when a planet passed in front of, or transited, a star," explained NASA. "Inspired by the way neurons connect in the human brain, this artificial 'neural network' sifted through Kepler data and found weak transit signals from a previously-missed eighth planet orbiting Kepler-90, in the constellation Draco."
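The "artificial neuron" idea in NASA's description can be illustrated with a toy sketch. The real Vanderburg–Shallue model was a deep convolutional network trained on actual Kepler light curves; the single neuron below, trained on two made-up features (transit depth and period regularity), is only meant to show the underlying mechanism of learning weights from labeled examples.

```python
import math
import random

random.seed(0)

# Toy labeled examples: (transit_depth, period_regularity) -> is_planet.
# Planet-like signals have deeper, more regular dips; noise-like signals
# are shallow and irregular. These feature ranges are invented.
def sample(is_planet):
    if is_planet:
        return [random.uniform(0.6, 1.0), random.uniform(0.7, 1.0)], 1.0
    return [random.uniform(0.0, 0.4), random.uniform(0.0, 0.5)], 0.0

data = [sample(i % 2 == 0) for i in range(200)]

# One artificial "neuron": a weighted sum of features through a sigmoid.
w = [0.0, 0.0]
b = 0.0
lr = 0.5

def predict(x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))

# Plain stochastic gradient descent on log loss.
for _ in range(500):
    for x, y in data:
        err = predict(x) - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

print(predict([0.9, 0.9]))  # near 1.0: planet-like signal
print(predict([0.1, 0.2]))  # near 0.0: noise-like signal
```

A real network stacks thousands of such neurons and learns its features directly from the pixel-level light readings, rather than from hand-picked numbers like these.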
The more data NASA collects, especially from TESS*, the more finely tuned the entire infrastructure for cataloging exoplanets will become. The new satellite will conduct a rolling survey over the course of two years, attempting to view about 85% of the sky. TESS* will focus on the hottest-burning stars closest to our own solar system, and these observations will yield a staggering amount of data compared to the Kepler* mission.
More astronomical data means more possibilities, as well. Why are we searching for exoplanets in the first place? Because, if we're lucky, one or two may be habitable. With future missions like the James Webb Space Telescope, we must ask how we can take AI even further, not only to detect more of these worlds but to help us determine whether they are habitable.
The data coming from this new telescope will begin to shed light on the atmospheric makeup of these exoplanets, and AI would prove a powerful ally in finding the markers for oxygen, methane, and "other possible signs of life," according to NASA.
The problem with the data sets from Kepler, and the future ones from TESS and the James Webb Space Telescope, is the number of false positives. Given the volume of data, it is simply unsustainable for human observers to parse it all for detection and exploration. This creates the need for machine learning software that can interface with existing hardware infrastructure.
Companies like Intel are offering platforms to data scientists from any industry, including space, to house the data sets and research parameters that will eventually help scale up or speed up the processing of data. The Intel® AI DevCloud platform can accommodate all AI approaches to data comprehension and analysis, and is powered by Intel® Xeon® Scalable processors for machine learning and deep learning training, all of which researchers can familiarize themselves with at Intel's online AI Academy.
About the Author:
Robin Seemangal is a spaceflight journalist who spent three years at Kennedy Space Center covering SpaceX's breakthrough in rocket reusability. His work has been published in Wired Magazine, Popular Science, Popular Mechanics, and the New York Observer. https://twitter.com/nova_road
(Photo of TESS courtesy of NASA)