It was a busy weekend for backchannel emails. I got off a Twitter post (OK, a ‘tweet’) on the centauri_dreams channel on Friday about the disturbing news from Kepler. The reaction was swift. The problem is caused by noisy amplifiers in the electronics of the space-borne telescope, which means the powers that be have to fiddle with the way data from Kepler is processed. This article in Nature News (thanks to all who forwarded links) spells it all out, saying that the planet hunt could be delayed.
The article has circulated widely but apparently has problems of its own. William Borucki, Kepler principal investigator, posted this on Ian O’Neill’s SpaceDisco site:
“There is a mistake in the Nature article. The Kepler Mission is actually doing very well and is producing planet discoveries that will be announced early next year. Data from 3 of the 84 channels that have more noise than the others will be corrected or the data flagged to avoid being mixed in with the low noise data prior to the time an Earth twin could be discovered.”
No word here on exactly how long that corrective process will take, so this is still a developing story. The Nature News article had said the Kepler team’s best response to the noise issue would be to rewrite the software, a process that might take two years or more. That’s not a fatal problem, but it would push back the time-frame for finding our first terrestrial world around another star, since that first find might well be an Earth-sized planet around an M-dwarf, where transits are frequent because the habitable zone is so much closer to the star.
That thought probably put new wind in the sails of planet-hunters on ground-based projects as well as CoRoT. This is a competitive business, make no mistake, and a slowdown at Kepler enhances the chances of the Earth-sized planet prize going to someone else. The Nature News article quotes Greg Laughlin (UCSC) on this, saying the delay makes it “more likely that the first Earth-mass planet is going to go to the radial-velocity observers.” But now we must learn more about how bad the current problem is, and wait for Kepler’s upcoming planetary announcements.
Well, after reading all of the material I could on the subject, I’m still confused. We really need a much better explanation; I’m seeing conflicting statements from different sources. For example, if one goes to the NASA Kepler website and looks at the status reports from the project manager, the spacecraft is operating nominally (except, of course, for the two power resets experienced last summer). There is absolutely no mention of the noisy amplifiers. But apparently the noise issue was known during the testing phase on the ground BEFORE Kepler was ever launched. So why was it never mentioned until now?
As a follow-up to my previous post, I went to the NASA Kepler website (http://kepler.nasa.gov/) and re-read William Borucki’s PowerPoint slides from the August 14th IAU meeting. Of particular interest is slide 15, showing the HAT-P-7 light curve. Beneath the light curve is the statement:
“Detection of the occultation proves that Kepler has the precision to find Earth-size planets.”
Hmmm. Interesting. So did the equipment suddenly get worse in the last 10 weeks or is there something else going on here?
“the team has to fix the software — it would be “too cumbersome” to remove the bad data manually — so that it accounts for the noise automatically.” (from the Nature News article)
I’ve had a shortwave-listening hobby, with homebrew antennas and front-end circuits. One of the recurring problems was pulling low-amplitude signals out of the noise. At the antenna, a ‘bucking’ circuit compared the inputs from the receiving antenna and a dedicated noise antenna and inductively coupled the two signals to null out the noise.
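The nulling idea can be sketched numerically. This is a toy model, not real RF engineering: the signal names and numbers are invented for illustration, and the least-squares gain here stands in for the inductive coupling adjustment.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
t = np.arange(n)

# Hypothetical setup: a weak desired signal plus local noise picked up
# by both the receiving antenna and a separate noise antenna.
signal = 0.05 * np.sin(2 * np.pi * t / 50)
noise = rng.normal(size=n)
main_antenna = signal + 0.8 * noise                # signal buried in noise
noise_antenna = noise + 0.1 * rng.normal(size=n)   # mostly the same noise

# "Bucking": fit the noise reference to the main input by least squares,
# then subtract (null) the fitted noise component.
gain = np.dot(main_antenna, noise_antenna) / np.dot(noise_antenna, noise_antenna)
cleaned = main_antenna - gain * noise_antenna

print(np.std(main_antenna - signal))  # residual noise before nulling
print(np.std(cleaned - signal))       # residual after nulling: far smaller
```

The same subtract-a-reference trick is the crude ancestor of what a software noise-calibration step does.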
Inherent receiver noise is another matter. Years ago I had a dipole antenna aimed north-south while listening to the twenty-meter band. Without the antenna connected, the amplifiers (an old Heathkit vacuum-tube set) gave a steady audible hiss. Flipping a double-pole knife switch added the antenna to the circuit, and the hiss became twice as loud: at noon the Sun was in the antenna lobe, right near the meridian.
The practical technologies of radio astronomy have recently been adapted for the arsenals of ground-based optical telescopes. Radio astronomy has long had to deal with the problems of amplifier noise, and the disciplines across professional astronomy have electronics and computers in common. There may be intermediate solutions until the intended upgrades can be implemented.
Okay — that doesn’t sound so bad. There are 42 CCDs (or at least there are 42 segments in the Kepler field of view) so if there are 84 channels and only three of them are affected by the amplifier noise, doesn’t that mean only 3 out of the 42 segments are affected? If that’s the case, I would have thought that only a small portion of Kepler’s field of view is affected by the amplifier problem (which would certainly explain why they would launch without trying to correct the problem on the ground).
If that’s the case, I’m not sure why discoveries of Earth-like planets would be delayed — it would just reduce the odds a little. If someone can explain the impact better, that would be great!
Either way, it’s clear from the initial results that the Kepler instrument is still exquisitely sensitive, and with the right software processing in place it doesn’t seem as though much data, if any, will be lost because of the problem.
No need to panic over this one.
Well, given that you ideally want three or more transits, you wouldn’t confirm an Earth twin for three years anyway. As far as I can make out, this is unfortunate but not a showstopper.
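The three-transit arithmetic is simple enough to spell out; the one-year period is just the defining assumption of an “Earth twin”:

```python
# Why an Earth twin takes roughly three years to confirm, assuming a
# 1-year orbital period and the usual three-transit detection criterion.
period_years = 1.0
transits_needed = 3

# Worst case: observing starts just after a transit, so each of the three
# transits costs up to one full orbital period of waiting.
years_to_confirm = transits_needed * period_years
print(years_to_confirm)  # → 3.0
```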
Well, does this noise cause a problem only for the detection of Earth-size planets and none for hot Jupiters, or is the problem more general? The “early 2010” announcement seems to support the first hypothesis. Nevertheless, I think even the first results will be of great importance, as they will provide information on the frequency of hot Jupiters in our galaxy, which indirectly constrains the chances of finding Earth-sized planets.
The Hubble Space Telescope went up in 1990 with a defective mirror that didn’t get fixed until almost four years later.
The Soviets sent their Mars 4 through 7 probes to the Red Planet in the early 1970s knowing they had computer defects but hoped they would last long enough to complete their missions and beat the US Vikings to the punch. Their success was mixed at best.
I am sure there are other examples. Not excusing Kepler, but it sounds like they knew there was a problem that could be fixed over time while it was in space, but if they tried to do it on the ground the satellite would still be sitting on Earth. Let us all hope that finding alien Earths does not suffer from human technical and red tape issues.
“Data from 3 of the 84 channels that have more noise than the others will be corrected or the data flagged to avoid being mixed in with the low noise data prior to the time an Earth twin could be discovered.”
So to reword that to be less misleading: “An Earth twin will only be discovered after the 3 noisy channels are corrected or flagged”.
The problem is significant because all the channels get processed together on the probe (I think), so it isn’t feasible to correct or flag those channels on the ground; new software needs to be uploaded to Kepler.
Additionally, they can’t just upload new software now, because that would muck up the existing observations which are due to continue until sometime next year.
Why didn’t they mention this before launch? And why did they suggest from the HAT-P-7 light curve that Earth-size planets were detectable? My guess is that they wanted to quell the competition. Not very honourable, but there you go.
Here is a quote from the Kepler mission press kit:
Combined with this correction from Kepler’s principal investigator:
It would seem that the original news story is short on facts and long on hype. If it is true that only 2 or 3 detectors out of 42 are affected by increased signal noise, then only between 4% and 7% of Kepler’s field of view is impacted by this problem, and there won’t be any significant delay in the detection of exoplanets at all — only a slight reduction in the volume of detectable planets until the noisy amplifiers can be characterized and counteracted through software processing. No data will be lost.
I think they made an eminently sensible decision. Not only would it have cost tens of millions of dollars (perhaps more) to delay the launch, it would likely have proved impossible to fix the problem on the ground without the risk of introducing other, worse problems into the system. As a software engineer I know all too well how easy it is to make further mistakes when trying to fix that last tiny bug before shipping a product. Better to live with minor issues that can be corrected later than to risk causing a major mission-ending calamity.
I am sure that they have every confidence that Kepler will be capable of meeting its mission objectives despite having a couple of gimpy amplifiers.
3 of 84 channels. That’s 3.57% of the data that is noisier than the others. And that 3.57% can be flagged, corrected, or not used.
3.57%…
Everyone, chill out.
As a certain major computer software company would say, it’s a feature not a bug. :^)
This is unfortunate, but hopefully it won’t turn into a fatal problem that prevents the mission from meeting its scientific objectives. According to the New Scientist article on this developing story, the good news is that the stars Kepler is observing are supposedly turning out to be less variable than expected.
As for other techniques bagging an Earth-sized world, I wouldn’t count microlensing out.
I’ve reread the quotes from the article a couple of times, and it seems this isn’t actually that bad, though there is some ambiguity. I believe they are saying that even though only a tiny amount of the data is affected by this problem, their software is currently not able to flag that data, and flagging it manually would be too difficult. I’m sure there are complexities here that can’t be appreciated without knowing the full details. My impression is that they need to modify their analysis software so that it can automatically flag this high-noise data, but making those modifications will take time (again, likely due to factors that outside observers can’t appreciate). However, it seems to me (and this may be incorrect, since it’s only implied) that this is just an analysis problem: they will still be able to use the data being collected today, most of which doesn’t have this problem, once they make their software changes.
An entertaining read is The Hubble Wars, Eric Chaisson’s account of denial, evasion, and bad behavior by the scientific community over Hubble’s troubles.
The “fix” for Hubble’s mis-figured primary mirror removed the spherical aberration, but at the cost of at least two magnitudes of sensitivity, something still ignored by the press.
It’s odd that Kepler’s handlers would not have started working on the data fix prior to launch, if it was a known problem.
If the amplifiers are degrading after launch, and three of the eighty-four have already deteriorated, then “Houston, we have a problem.”
We don’t know that they didn’t. It seems clear from the report that the problem was only found after assembly, so it must have been reasonably late in the project. They simply may not have had the time to begin working on it, given the tight schedules involved.
No, they detected the problem on the ground, so there is reason to hope that it’s due to faulty components from the start and not a degradation problem. Of course, it could be an indicator that the whole batch of amplifiers is not as reliable as expected, but unless there are other failures along the way, one can hope that it was an isolated problem.
If they are integrating the results from both channels for a detector, and the three faulty amps are on three separate detectors, then 7% of the results could be affected. Still not that bad.
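A quick back-of-the-envelope check of those fractions. The 42-CCD/84-channel layout comes from the discussion above; the worst-case assumption (each noisy amplifier on a different CCD, with both channels of a CCD combined) is the commenter’s, not confirmed:

```python
# Affected fractions, assuming 42 CCDs, 2 readout channels each (84 total),
# and 3 noisy amplifiers.
ccds = 42
channels = 84
noisy = 3

channel_fraction = noisy / channels      # fraction of channels affected
worst_case_ccd_fraction = noisy / ccds   # if each noisy amp sits on a
                                         # different CCD and both of that
                                         # CCD's channels are integrated

print(f"{channel_fraction:.2%}")         # 3.57%
print(f"{worst_case_ccd_fraction:.2%}")  # 7.14%
```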
Looks like the problem is global.
The detector is a sub 1/10,000 machine in Automatic mode.
See here for further details.
http://solar-flux.forumandco.com/extrasolar-news-and-discoveries-f2/kepler-results-t282-90.htm
I’m pretty mad they are not coming clean on this. Blowing smoke only makes everyone madder.
I just realized that, as the spacecraft must roll every 6 months, the noise problem will affect 4 times 3.5% of the data time series! Or even 4 times 7%, as tacitus noted above. This is not negligible at all!
Every 3 months, of course.
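A toy model can put rough numbers on the roll argument. The assumption that each quarterly roll drops a star onto an effectively random channel is mine, purely for illustration; the real roll is deterministic and the focal-plane geometry fixes which stars land on which channels:

```python
# Rough model of the quarterly roll: with 3 noisy channels out of 84 and
# 4 rolls per year, how much data gets touched?
noisy_channels = 3
total_channels = 84
rolls_per_year = 4

p_noisy_quarter = noisy_channels / total_channels  # chance any one quarter is noisy

# Fraction of stars that spend at least one quarter per year on a noisy channel:
p_star_affected = 1 - (1 - p_noisy_quarter) ** rolls_per_year
# Expected fraction of the year's time series that is noisy, averaged over stars:
expected_noisy_fraction = p_noisy_quarter

print(f"{p_star_affected:.1%}")          # 13.5% of stars touched at some point
print(f"{expected_noisy_fraction:.1%}")  # but only 3.6% of the data overall
```

So the roll spreads the problem across more stars without increasing the total amount of noisy data.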
A good article about Kepler (in Russian):
http://infox.ru/science/universe/2009/11/02/NoisyCCDamplifiersdelayKeplerplanetsearch.phtml
or Google translation of the article:
http://translate.google.ru/translate?u=http%3A%2F%2Finfox.ru%2Fscience%2Funiverse%2F2009%2F11%2F02%2FNoisyCCDamplifiersdelayKeplerplanetsearch.phtml&sl=ru&tl=en&hl=ru&ie=UTF-8
Everything with Kepler is worse than you suppose.
Let’s wait until we hear more from the Kepler team themselves. From the same forum you quoted:
It’s way too premature to start accusing the team of blowing smoke and being less than honorable. No one wants this mission to succeed more than the Kepler team itself. They would not have launched the spacecraft if they knew they were going to fail.
Definitions of failure are very different.
If Kepler bags ~1500 Jupiter/Neptune types, this would be a great success for the science team.
The absence of 50 rocky planets would be sad, but not missed too much.
TheoA November 4, 2009 at 1:46:
“Absence of 50 rocky planets sad but not missed too much.”
I strongly disagree. This is precisely one of the main objectives of Kepler and perhaps the greatest addition to our knowledge base: the prevalence of small rocky planets.
But again, as several people mentioned: let’s just wait for more reliable news from the experts themselves, things may turn out well.
Ok, so things aren’t so bad after all:
Read the whole article:
http://www.newscientist.com/article/dn18095-telescope-glitch-could-delay-discovery-of-alien-earths.html
So it looks as though there will be no loss of data (or at least minimal impact on a small proportion of the data once the error-correction algorithm has been applied post-download). I’m guessing that since all the stars being watched by an affected detector will share the glitchy signal, they will be able to distinguish between amplifier noise and the true variability of each star and mask out the noise even months after the data is downloaded from the spacecraft. (It’s probably not that simple, but you get the idea.)
So they remain confident that they will be able to complete their full mission, and that only a portion of the results will be delayed — in particular some exoplanet detections that might have been announced in 2010 will shift out to 2011. The detection of true Earth analogs should not be affected at all.
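That masking guess can be sketched numerically: if every star on a noisy channel shares the amplifier’s systematic signature, the common trend can be estimated across stars and subtracted. This is a minimal common-mode detrending illustration with invented numbers; it is emphatically not the actual Kepler pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
n_stars, n_samples = 50, 1000
t = np.arange(n_samples)

# Shared systematic from the noisy amplifier (identical for every star
# read out through the same channel).
systematic = 0.01 * np.sin(2 * np.pi * t / 123) + 0.005 * rng.normal(size=n_samples)

# Each star: flat light curve + small intrinsic variation + the shared systematic.
intrinsic = 0.002 * rng.normal(size=(n_stars, n_samples))
flux = 1.0 + intrinsic + systematic

# Estimate the common-mode trend as the median across stars, then remove it.
common_mode = np.median(flux, axis=0)
detrended = flux - common_mode + 1.0  # re-center around the nominal flux

print(np.std(flux[0]))       # scatter dominated by the shared systematic
print(np.std(detrended[0]))  # scatter after removing the common mode
```

Because the systematic is common to all stars on the channel while each star’s variability is its own, the median cleanly separates the two, which is why the correction can be applied long after download.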
Very interesting that people seem to want to believe the worst, and consider anything the Kepler team says to the contrary to be “blowing smoke”.
Sure, blowing smoke gets people madder. So does making baseless accusations of scientific fraud.
On the Kepler website they have announced that they’ve released light-curve data for 9,000 stars, winnowing out the variable chaff from the stable main-sequence wheat. Of course, for astronomers who study variables it will be welcome data too. I don’t think there is anything seriously defective with Kepler at this time, and I’m keeping my fingers crossed that Kepler will accomplish its mission completely before the inevitable failure occurs. Hopefully we will see a mission extension.
Too many exoplanet-related space missions have been delayed or cancelled, e.g. TESS, SIM, and TPF. We’re all relying on Kepler to succeed and inform us of the commonality of Earth-like planets. For many older people it may be their only chance to know. The results may also inspire a resurgence in space-based exoplanet telescopes.
I believe the Kepler team did the pragmatic thing and went ahead with the launch rather than disassembling delicate equipment and perhaps introducing faults worse than those already present. That said, the team should have disclosed this known pre-launch amplifier problem. To issue rosy performance press releases such as the HAT-P-7 light curve results without also mentioning the previously known amplifier problems does not reflect well on them. Remember, each quarterly roll exposes different stars to the bad amp channels, all but certainly missing terrestrial-size planetary transits during each three-month stretch of noisy data. Since Kepler doesn’t do onboard data processing, the data will need to be software-filtered on the ground where possible. Don’t forget that information-theoretic SNR limits apply.
I would expect the soon-due Kepler Manager’s Report to answer the performance and data issues explicitly and in detail.
http://kepler.nasa.gov/about/manager.html
Per the Kepler mission manager, even the noisy data will eventually be usable, but will take longer to calibrate.
Sequence of events as reported very curious.
– Kepler knows 3 amplifiers malfunctioning prior to launch ~ 1 year ago.
– Science team notified and worried but general public NOT notified. Told everything is perfect.
– Space craft launched.
– Data from calibration set manually overridden to get HAT-P-7 as clear as possible.
– Data unveiled to the world as instrument functioning wonderfully.
– “This early result shows the Kepler detection system is performing right on the mark,” said David Koch
“It bodes well for Kepler’s prospects to be able to detect Earth-size planets.”
-“Kepler photometer is working well
All 84 channels are operating & producing useful data
Photometric precision is about 1.5 times design values” Borucki IAU
– NASA review committee told of problem but not general public.
– “We’re not going to be able to find Earth-size planets in the
habitable zone — or it’s going to be very difficult — until that work
gets done,” says Kepler principal investigator William Borucki.
– Bombshell
– Very disturbing statement, esp. if known for some time.
– Claims of only 3.57% of data affected.
– Now claims of only 15% affected and only some of the time.
– Maybe fixable? Not said so earlier.
– Not sure what to think or believe.
– Full data still not disclosed.
– Which amplifiers are problematic, and are more going to fail? Why not say?
– Silence deafening. Esp. since problem known for some time.
New mission manager update at the Kepler web site today.
People should read it carefully.
Relevant sections of Managers’ report…
“Measurements taken in space confirm that Kepler meets its random noise requirements.
Systematic noise results from the imperfect nature of any measuring device. It represents the instrument’s “finger print” placed upon the measurement, and must be calibrated out of the data in post-processing on the ground. Because systematic noise depends on the specific characteristics of the instrument, the best calibration requires that the noise sources be characterized and modeled based on measurements made in space. The Kepler team has been developing the ground software to calibrate out the various systematic noise sources since launch, and this work will continue for a number of months. As each source of systematic noise is calibrated, fainter transit signals can be detected. Data collected from the spacecraft will be continually reprocessed as the ground software matures, revealing smaller and smaller planets. This is a normal process and has been part of the Kepler plan since before launch. Fortunately for Kepler, the worst sources of systematic noise affect only a small portion of the field of view, so the majority of the field of view will be calibrated earlier, enabling small planets to be detected sooner.”
This is the report with the detail I anticipated. Random noise to spec: excellent. Systematic noise (e.g. the noisy amps): the laborious ground process of software-filter development is underway, with increasingly improved results anticipated. Given this level of detailed explanation, I for one am a lot calmer than when I read the initial dour report.
Let the terrestrial planet hunt continue!
Looking forward to the papers at the January 2010 AAS meeting.
It’s really quite sad that some people seem to want there to be a conspiracy so badly, even taking the fact that the scientists aren’t acting like the entire experiment is over as evidence that there is some coverup going on.
Paul,
you may already be aware of this (in that case just ignore this post), but ESO just announced that analysis of HARPS data has led to the conclusion that low lithium content in solar-type stars is probably an indication of the presence of planets. This would be very exciting and promising indeed!
(However, others, such as Melendez, argue that this lithium depletion may just be an age effect.)
http://www.eso.org/public/outreach/press-rel/pr-2009/pr-42-09.html
http://www.nature.com/news/2009/091111/full/news.2009.1078.html