Crowdsourcing the search for aliens

The SETI Institute's Jill Tarter is looking for a few good filters.

Research from the Search for Extra-Terrestrial Intelligence (SETI) project is well known to most technologists because the SETI@Home initiative was one of the first widely distributed computing applications.

Although decades of listening and analyzing radio signals have yet to yield proof of alien intelligence, the pursuit has resulted in significant advances in signal processing technology, as well as serendipitous discoveries in radio astronomy. Now Jill Tarter, director of the Center for SETI Research at the SETI Institute, wants to take the distributed analysis of radio signals to the next level. Tarter, a speaker at the upcoming OSCON conference, discusses her new initiatives in the following Q&A.

How is your new project different from existing distributed computing projects, such as SETI@Home?

Jill Tarter: SETI@Home came on the scene a decade ago, and it was brilliant and revolutionary. It put distributed computing on the map with such a sexy application. But in the end, it’s been service computing. You could execute the SETI searches that were made available to you, but you couldn’t make them any better or change them.

We’d like to take the next step and invite all of the smart people in the world who don’t work for Berkeley or for the SETI Institute to use the new Allen Telescope to look for signals that nobody’s been able to look for before, because we haven’t had our own telescope and we haven’t had the computing power.

At the moment, we’re swamped with data. We can’t process it all in real-time. Ten years from now, Moore’s law will allow us to catch up. Ten years after that, we’ll probably be data-starved.
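The arithmetic behind that prediction is simple exponential scaling. A minimal sketch, assuming compute roughly doubles every two years (one common reading of Moore’s law; the doubling period is an illustrative assumption, not a figure from the interview):

```python
# Back-of-the-envelope scaling behind "swamped now, caught up in ten
# years, data-starved in twenty." The doubling period is an assumption.
DOUBLING_PERIOD_YEARS = 2.0

def compute_growth(years, doubling_period=DOUBLING_PERIOD_YEARS):
    """Factor by which available compute grows over `years`."""
    return 2 ** (years / doubling_period)

print(compute_growth(10))  # 32.0 -- roughly 32x the compute in ten years
print(compute_growth(20))  # 1024.0 -- compute eventually outruns the data rate
```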

Our searches have typically been done by analyzing the data in near real time with tools we’ve invented and custom-built over the years. We’re about to change that by going into a cluster environment for the first time, and not building any accelerator cards or any special-purpose hardware. That means anybody can help us write software to make this better. We’re trying to get our code cleaned up enough to publish as open source and then let anybody do what they want with it.

Once a week we capture about eight hours of data that we’re putting in the cloud. So far, people have been downloading those big datasets, which is a bummer, and operating on them in their own environments, using their own analysis tools, looking for different things. What we want to do, and what I’m hoping to demonstrate at OSCON, is release a new API, co-developed with a startup called Cloudant, that will allow people to compile and debug their code locally and then upload it and operate on the data using EC2 resources that Amazon has provided.
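To make that workflow concrete, here is a purely hypothetical sketch of the develop-locally, run-in-the-cloud loop Tarter describes. The base URL, endpoint paths, dataset name, and field names are invented for illustration; they are not the actual SETI Institute or Cloudant API:

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical workflow sketch -- every URL, path, and field name below
# is invented for illustration and is NOT the real SETI/Cloudant API.
BASE = "https://api.example-seti-data.org"

# 1. Pull a small slice of one weekly capture to compile and debug against.
resp = requests.get(f"{BASE}/datasets/week-28/slice", params={"seconds": 10})
resp.raise_for_status()
local_sample = resp.content  # raw samples to test a detector on locally

# 2. Once it works locally, upload the code and run it next to the full
#    dataset on the donated EC2 resources instead of downloading terabytes.
with open("my_detector.py", "rb") as code:
    job = requests.post(f"{BASE}/jobs",
                        files={"code": code},
                        data={"dataset": "week-28"})
print(job.json())  # hypothetical job handle to poll for results
```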

How can people who aren’t math wonks get involved?

JT: For people who don’t have black belts in digital signal processing, we want to take regions of the spectrum that are overloaded with signals, pull those out, and visualize them in different ways against different basis vectors. We’d like to see if people can use their pattern recognition capabilities to look, or maybe listen, and tease out patterns in the noise that we don’t know about.
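A standard way to put such spectrum data in front of human eyes is a waterfall plot: time on one axis, frequency on the other, power as brightness, so a steady narrowband carrier appears as a vertical line and a drifting signal as a slant. A minimal sketch, assuming a one-dimensional array of raw voltage samples (the FFT is just one choice of basis; the synthetic data and FFT size are illustrative):

```python
import numpy as np

def waterfall(samples, fft_size=1024):
    """Turn raw voltage samples into a time-frequency power grid (in dB).

    Each row is the windowed power spectrum of one chunk of `fft_size`
    samples; the FFT is one choice of basis vectors among many.
    """
    n_rows = len(samples) // fft_size
    chunks = samples[: n_rows * fft_size].reshape(n_rows, fft_size)
    spectra = np.fft.rfft(chunks * np.hanning(fft_size), axis=1)
    return 10 * np.log10(np.abs(spectra) ** 2 + 1e-12)

# Synthetic test: white noise with a faint, slowly drifting tone in it.
n = 2 ** 20
t = np.arange(n)
samples = np.random.randn(n) + 0.1 * np.sin(2 * np.pi * (0.1 + 1e-8 * t) * t)
grid = waterfall(samples)
print(grid.shape)  # (1024, 513): rows of time, columns of frequency bins
```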

That’ll be a big challenge, and there will be a lot of matching of visual patterns that are real or imagined by the observer against known patterns of interference. That can involve a lot more of the world. Perhaps we can make it into a game.

How is the Allen Telescope different from traditional radio telescopes such as the Very Large Array?

JT: The Allen Telescope is the first of what we call Large Number of Small Dishes (LNSD), a new way of building telescopes. It’s a radio interferometer. That isn’t new; we’ve had interferometers since the ’70s. But by creating the equivalent of a large telescope out of lots of small pieces, using consumer technologies wherever possible, and putting the complexity into computing, we’ve changed the paradigm and brought the cost down. I hope that we’ll use it to change the world by detecting evidence of another technology, or by discovering some new astrophysical phenomenon that no one has yet thought of.
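The “complexity into computing” lives largely in the correlator: the signals from every pair of small dishes are cross-correlated, and each pair contributes one baseline of the synthesized large aperture. A minimal sketch of that core operation for a single pair, using made-up noise data (a real correlator runs this continuously across many frequency channels and dozens of baselines):

```python
import numpy as np

def cross_correlate(x, y):
    """Frequency-domain circular cross-correlation of two voltage streams.

    The lag of the correlation peak is the geometric delay between the
    two dishes, which interferometry turns into angular information.
    """
    corr = np.fft.ifft(np.fft.fft(x) * np.conj(np.fft.fft(y)))
    return np.fft.fftshift(corr)  # zero lag moved to the array's center

# Toy baseline: the same sky noise reaching a second dish 5 samples later.
rng = np.random.default_rng(0)
sky = rng.standard_normal(4096)
dish_a = sky
dish_b = np.roll(sky, 5)
corr = cross_correlate(dish_a, dish_b)
lag = np.argmax(np.abs(corr)) - len(corr) // 2
print(lag)  # -5 in this convention: dish_b trails dish_a by 5 samples
```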

The Drake equation is a well-known estimate of the number of possible intelligent alien races that might exist in our galaxy. Does the recent discovery of planets around other stars shift the equation?

JT: Yes, in the sense that we’re reducing the error bars on the fraction of sun-like stars that have planets. Within a couple of years, thanks to the Kepler mission, I think we’ll have found the first Earth analog. That’s going to make a big difference in people’s perceptions about life elsewhere. So far, the planetary systems that we’re finding look a bit strange when compared to ours. But when we actually find analogs, people will begin to say, “Wow, maybe there are other technological civilizations out there.”
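For readers who want the equation itself: N = R* · fp · ne · fl · fi · fc · L, a product of seven estimated factors. Planet discoveries like Kepler’s mainly shrink the uncertainty on fp and ne. A minimal sketch with deliberately made-up inputs (the structure is the point, not the numbers):

```python
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Drake equation: expected number of detectable civilizations in
    the galaxy. Every argument is an estimate with large error bars;
    missions like Kepler mainly shrink the bars on f_p and n_e."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Illustrative values only -- not estimates from the interview.
n = drake(
    r_star=1.0,      # star formation rate (stars/year)
    f_p=0.5,         # fraction of stars with planets
    n_e=2.0,         # habitable planets per such system
    f_l=0.1,         # fraction of those where life arises
    f_i=0.01,        # fraction that evolve intelligence
    f_c=0.1,         # fraction that develop detectable technology
    lifetime=10000,  # years a civilization stays detectable
)
print(n)  # 1.0 under these made-up inputs
```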

We’re also moving from the other direction. We’re starting to give microbes the respect they deserve, and we’re getting blown away by the capabilities of extremophiles. Millions of years of evolution have made these things perfect for living and growing in battery acid and in all kinds of extreme conditions. So what we’re also doing is broadening the potentially habitable real estate in the universe. It might not actually have to be quite such a Goldilocks “just right” planet for life to originate and evolve into something that’s technological, although not humanlike. I think there’s a real estate boom going on out there.

OSCON will be held July 19-23 in Portland, Ore. Radar readers can save 20% on registration with the discount code OS10RAD.
