Are there better ways of studying the raw data from SETI? We may know soon, because Jill Tarter has announced that in a few months, the SETI Institute will begin to make this material available via the SETIQuest site. Those conversant with digital signal processing are especially welcome, but so are participants from the general public as the site gears up to offer options for all ages. Tarter speaks of a ‘global army’ of open-source code developers going to work on data collected by the Allen Telescope Array, along with students and citizen scientists eager to play a role in the quest for extraterrestrial life.
SETI@home has been a wonderful success, but as Tarter notes in this CNN commentary, the software has been limited. You took what you were given and couldn’t affect the search techniques brought to bear on the data. I’m thinking that scattering the data to the winds could lead to some interesting research possibilities. We need the telescope hardware gathered at the Array to produce these data, but the SETI search goes well beyond a collection of dishes.
Ponder that the sensitivity of an instrument is only partly dependent on the collecting area. We can gather all the SETI data we want from our expanding resources at the Allen Telescope Array, but the second part of the equation is how we analyze what we gather. Claudio Maccone has for some years now championed the Karhunen-Loève Transform (KLT), developed in 1946, as a way of improving sensitivity to an artificial signal by a factor of up to a thousand. Using the KLT could help SETI researchers find signals that are deliberately spread through a wide range of frequencies and undetectable by earlier methods.
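To see the intuition (and only the intuition; what follows is a toy sketch in Python with invented numbers, not Maccone’s implementation or anything from the Institute’s pipeline), the KLT expands a stretch of data on the eigenvectors of its own autocorrelation matrix. A coherent signal, even one that drifts in frequency, tends to pile up in a handful of dominant components, while white noise spreads evenly across all of them:

```python
# Toy sketch of the KLT idea: project noisy data onto the dominant
# eigenvectors of its estimated autocorrelation matrix.
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(0)
n = 512
t = np.arange(n)

# Made-up data: a weak, slowly drifting tone (so not strictly
# narrow-band) buried in unit-variance noise.
signal = 0.5 * np.cos(2 * np.pi * (0.05 + 2e-5 * t) * t)
x = signal + rng.standard_normal(n)

# Estimate the autocorrelation of the data and build its Toeplitz matrix.
acf = np.correlate(x, x, mode="full")[n - 1:] / n
R = toeplitz(acf)

# The KLT basis is the set of eigenvectors of R, ranked by eigenvalue.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]

# Reconstruct the series from only the top few components; the coherent
# signal concentrates there while the noise is spread over all of them.
k = 10
top = eigvecs[:, order[:k]]
x_klt = top @ (top.T @ x)
print("correlation with the true signal:", np.corrcoef(signal, x_klt)[0, 1])
```

The price, as Maccone acknowledges, is the eigendecomposition itself, which is far costlier than an FFT over the same stretch of data.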
Image: Dishes at the ATA. What new methods can we bring to bear on how the data they produce are analyzed? Credit: Dave DeBoer.
SETI researchers used a detection algorithm known as the Fourier Transform in early searches, working under the assumption that a candidate extraterrestrial signal would be narrow-band. By 1965, it had become clear that the new Fast Fourier Transform could speed up the analysis, and the FFT became the detection algorithm of choice. It was in 1982 that French astronomer and SETI advocate François Biraud pointed out that here on Earth, we were rapidly moving from narrow-band to wide-band telecommunications. Spread spectrum methods are more efficient because the information, broken into pieces, is carried on numerous low-powered carrier waves that change frequency and are hard to intercept.
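The appeal of the FFT method is easy to show. In the sketch below (again a toy, with invented numbers and a crude threshold rather than any project’s actual detector), a narrow-band carrier announces itself as a single bin towering over the noise floor of the power spectrum; a spread spectrum signal, its energy smeared across many bins, would sail under the same threshold:

```python
# Toy sketch of FFT-based narrow-band detection.
import numpy as np

rng = np.random.default_rng(1)
n = 4096
f0 = 504 / n  # exactly bin-centred so the tone lands in a single FFT bin
tone = 0.2 * np.sin(2 * np.pi * f0 * np.arange(n))  # weak narrow-band carrier
x = tone + rng.standard_normal(n)

# A narrow-band carrier stands out as one bin far above the noise floor;
# a drifting, spread-spectrum signal would not.
power = np.abs(np.fft.rfft(x)) ** 2
noise_floor = np.median(power)
candidates = np.flatnonzero(power > 20 * noise_floor)  # crude threshold
print("candidate bins:", candidates)  # expect bin 504
```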
What Biraud noticed, and what Maccone has been arguing for years, is that our current SETI methods using FFT cannot detect a spread spectrum signal. Indeed, despite the burden the KLT’s calculations place even on our best computers, Maccone has devised methods to make it work with existing equipment and argues that it should be programmed into the Low Frequency Array and Square Kilometer Array telescopes now under construction. The KLT, in other words, can dig out weak signals buried in noise that have hitherto been undetectable.
But wait, wouldn’t a signal directed at our planet most likely be narrow in bandwidth? Presumably so, but extraneous signals picked up by chance might not be. It makes sense to widen the radio search to include methods that could detect both kinds of signal, to make the search as broad as possible.
I bring all this up because it points to the need for an open-minded approach to how we process the abundant data that the Allen Telescope Array will be presenting to the world. By making these data available over the Web, the SETI Institute gives the field an enormous boost. We’re certainly not all digital signal analysts, but the more eyes we put on the raw data, the better our chance for developing new strategies. As Tarter notes:
This summer, when we openly publish our software detection code, you can take what you find useful for your own work, and then help us make it better for our SETI search. As I wished, I’d like to get all Earthlings spending a bit of their day looking at data from the Allen Telescope Array to see if they can find patterns that all of the signal detection algorithms may still be missing, and while they are doing that, get them thinking about their place in the cosmos.
And let me just throw in a mind-bending coda to the above story. KLT techniques have already proven useful for spacecraft communications (the Galileo mission employed KLT), but Maccone has shown how they can be used to extract a meaningful signal from a source moving at a substantial percentage of the speed of light. Can we communicate with relativistic spacecraft of the future when we send them on missions to the stars? The answer is in the math, and Maccone explains how it works in Deep Space Flight and Communications (Springer/Praxis, 2009), along with his discussion of using the Sun as a gravitational lens.
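For a feel of the kinematics involved (this is just the relativistic Doppler arithmetic, not Maccone’s KLT machinery, and the carrier frequency below is an assumed X-band value), consider how severely a receiver must retune for a source receding at a substantial fraction of the speed of light:

```python
# Relativistic Doppler shift for a receding transmitter.
import math

def doppler_factor(beta: float) -> float:
    """Received/emitted frequency ratio for a source receding at v = beta * c."""
    return math.sqrt((1 - beta) / (1 + beta))

f_emit = 8.4e9  # Hz; X-band, a plausible deep-space downlink (assumption)
for beta in (0.01, 0.1, 0.5):
    f_recv = f_emit * doppler_factor(beta)
    print(f"beta = {beta:>4}: received {f_recv / 1e9:.3f} GHz")
```

At half the speed of light the received carrier lands more than 3 GHz below where it was transmitted, and it keeps moving as the spacecraft accelerates, which is the sort of smeared, non-stationary signal the KLT is suited to pulling out of noise.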
“Can we communicate with relativistic spacecraft of the future when we send them on missions to the stars?”
Tongue slightly in cheek, but didn’t Mr. Heinlein figure this out 60 or 70 years ago? I would think Doppler-shifted RF would be easily accounted for, and messages could be stored, with playback speed increased or decreased as necessary to account for differences in the passage of time.
Of course a completed thought experiment isn’t the real deal. Am I, or Mr. Heinlein, missing something?
“Dishes at the ATA. What new methods can we bring to bear on how the data they produce are analyzed?”
Hopefully they’ll bear upon some ~real~ astronomy like mapping the distribution of ionized gases, masers, and pulsars in the Galaxy and beyond. >:-)
I do recommend that anyone living near or visiting the northeastern corner of California tour the ATA. The facility is fascinating and the docents there are enthusiastically welcoming; it gets lonely out there in Hat Creek.
Denver writes re KLT and relativistic communication:
Good question, and the answer is no, but I’ll let Claudio explain why, as he’s the expert; let me send your query to him, and I’ll publish his response. (Later): Email from Claudio says he’s getting ready for a transatlantic flight, so I may not have a quick answer, but I’ll post one as soon as it’s received.
Would it make sense to release test data with various types of signals embedded in it? It would give people who want to develop their own search methods a place to start.
SETIQuest – hmmm, that name sounds awfully familiar…
http://www.coseti.org/setiq_cv.htm
NS: “Would it make sense to release test data with various types of signals embedded in it?”
… and test data with *no* signals embedded (the scientists should know how). This is important for validating the approaches.
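A minimal sketch of what such a test set might look like; every parameter and file name here is invented:

```python
# Sketch of a validation set: files with a known injected signal plus
# signal-free controls, for exercising a detection pipeline.
import numpy as np

rng = np.random.default_rng(42)
n = 1 << 16  # samples per test file (arbitrary)

def make_test_file(inject: bool) -> np.ndarray:
    """Unit-variance noise, with or without a weak drifting tone injected."""
    noise = rng.standard_normal(n)
    if not inject:
        return noise  # control file: any 'detection' here is a false alarm
    t = np.arange(n)
    tone = 0.05 * np.cos(2 * np.pi * (0.2 + 1e-8 * t) * t)  # known, weak, drifting
    return noise + tone

for i in range(4):
    np.save(f"test_case_{i}.npy", make_test_file(inject=(i % 2 == 0)))  # invented naming
```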
I’m really curious to know what data and software we will get.
There are already many potential citizen scientists with powerful computers (more and more have multicore processors). There is the notorious problem of (not) having enough time.
Everybody is assuming the civilizations we contact will be more advanced!
They might be survivors of a cataclysm who can barely operate some of the systems without really understanding what is happening, or how.
A starship which enters our system might be a generation ship whose passengers have forgotten their origins.
That idea’s been done in novels with human characters. It might happen to aliens.
I wrote KLT software for detecting signals buried in noise, as R. Dixon describes in his paper, and tested it for several months on a Mercury AltiVec system for the SETI project. It seems to be a very promising tool, but it needs a great deal of computation and further study.
The search for alien intelligence: Why ET has nowhere left to hide
How supercomputing will help the Seti Institute search one million stars
By Nick Heath, 18 September 2010
It’s easy to be cynical about the Seti Institute and its search for extra-terrestrial (ET) intelligence.
After all, after years of listening out for alien radio broadcasts from the depths of space, the institute has found nothing but cosmic static.
Established in 1984, the Seti Institute supports scientific research into the existence of intelligent life outside of Earth and today runs one of the world’s biggest projects searching for extra-terrestrial intelligence (Seti).
Yet far from giving up the hunt, the institute is instead planning to broaden its search for alien signals to cover at least 500 times more stars than it has looked at to date.
According to Jill Tarter, the institute’s director of the Center for Seti Research, the institute is aiming to cover between one million and 10 million stars over the next decade.
So far, the institute has looked for alien communications in the vicinity of about 2,000 stars in the Milky Way, a fraction of the hundreds of billions of stars in our home galaxy.
“That’s a tiny, tiny portion of the sky. I like to use the analogy that all of the searching that we have done to date is the equivalent of taking an eight ounce glass of water out of the Earth’s oceans,” Tarter told silicon.com.
Full article here:
http://www.silicon.com/technology/hardware/2010/09/18/the-search-for-alien-intelligence-why-et-has-nowhere-left-to-hide-39746138/