Searching for ET using AI on GCP

Click for: original source

Rob Harrand wrote this article about a project playing with open data from SETI. He says that the best way to learn data science is to create something, and some of the most interesting data publicly available in GitHub repositories comes from the SETI Institute (the Search for Extraterrestrial Intelligence).

The folks over at SETI were keen to help the author out, and made it clear that engaging “citizen scientists” was something they wanted to do more of in the future.

The article then describes:

  • SETI and Citizen Science
  • SETI’s use of Deep Learning
  • The ABACAD Approach to finding ET
  • Code on GCP Datalab
  • Simulating the data
  • Building a deep learning model
  • Making predictions from ABACAD filterbank files

… and more. The author started by processing data on Kaggle, thanks to its free data hosting and GPU support. He created a notebook (known as a “kernel” on Kaggle) that introduced typical SETI data and the filterbank file format, followed by one using deep learning to distinguish between different types of simulated data (as per the summer challenge).
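The simulated signals in that challenge are, broadly, narrowband tones that drift in frequency over time against a noisy background. As a rough illustration of the idea (not the article's actual code — all names and parameters here are made up for the sketch), one could generate such a spectrogram in NumPy like this:

```python
import numpy as np

def simulate_drifting_signal(n_time=128, n_freq=128, start_bin=30,
                             drift=0.5, snr=10.0, seed=0):
    """Toy spectrogram: a narrowband signal drifting linearly in
    frequency over time, added to an exponential noise background
    (roughly how power-spectrum noise behaves)."""
    rng = np.random.default_rng(seed)
    spec = rng.exponential(scale=1.0, size=(n_time, n_freq))
    for t in range(n_time):
        # Linear frequency drift, wrapping at the band edge
        f = int(start_bin + drift * t) % n_freq
        spec[t, f] += snr
    return spec

spec = simulate_drifting_signal()
```

A classifier is then trained to tell spectrograms like this apart from other signal classes (and from pure noise).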

Code for this project can be found here. Nice one!

[Read More]

Tags big-data analytics cloud machine-learning data-science