The fascination of the so-called ‘Pioneer anomaly’ is that it offers the possibility of new physics, an apparently constant acceleration on the Pioneer 10 and 11 probes with a value of (8.74 ± 1.33) × 10⁻¹⁰ m/s² being something that we can’t easily explain. Equally useful is the chance the Pioneer anomaly gives us to validate current physical models by figuring out how we might explain this acceleration through hitherto unsuspected processes, perhaps aboard the spacecraft itself. Either way you look at it, the Pioneer anomaly has deserved the attention it has received, and now a new paper emerges to take a crack at resolving the issue once and for all.
Frederico Francisco (Instituto Superior Técnico, Lisbon) and colleagues have revisited the question of whether heat that is emitted and reflected aboard the spacecraft could account for the anomalous acceleration. Francisco’s team had accounted for between 33% and 67% of the acceleration in a thermal model they developed in 2008. The new paper builds on this earlier work, with a methodology based on a distribution of point-like radiation sources that can model the thermal radiation emissions of the spacecraft. The authors then deploy a method called ‘Phong Shading’ that is commonly used to render the illumination of surfaces in 3D computer graphics. This allows them to study how heat effects can be reflected off the various parts of the spacecraft.
Image: An artist’s rendition of one of the Pioneer probes. Credit: NASA.
I referred to the acceleration as ‘apparently constant’ above, but the authors take pains to note that we haven’t fully characterized the acceleration. In fact, one analysis of the flight data shows that both a constant acceleration and one decaying linearly over a period greater than fifty years are compatible with the available data. This comes into play as the team tests for the constancy of the acceleration, as discussed in the paper:
… a so-called “jerk term” is found to be consistent with the expected temporal variation of a recoil force due to heat generated on board… This is essential if the hypothesis of a thermal origin for the Pioneer anomaly is to be considered, as such [a] source would inevitably lead to a decay with at least the same rate as the power available onboard. Possible causes for an enhanced decay include e.g. degradation of thermocouples, stepwise shutdown of some systems and instruments, etc.
With this in mind, the authors go to work looking at thermal radiation and the force it can bring to bear on a surface, using Phong Shading to model the reflection of this radiation off the various other surfaces of the Pioneer probes. Radiation facing outwards, for example, radiates directly into space with an effect that cancels out. But radiation emitted toward the center of the spacecraft is reflected by the high-gain antenna and the main equipment compartment. The trick is to weigh these effects in terms of the acceleration. The method gives “a simple and straightforward way of modeling the various components of reflection…,” according to the paper, and one that accounts for the effect of thermal radiation on different parts of the spacecraft.
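For readers unfamiliar with Phong shading, here is a minimal sketch of the idea in Python. The geometry and coefficients below are illustrative only, not values from the paper: reflected intensity is split into a diffuse (Lambertian) term that depends only on the angle of incidence and a specular term that peaks along the mirror direction.

```python
import numpy as np

def phong_reflected_intensity(to_source, normal, to_viewer,
                              k_d=0.6, k_s=0.3, shininess=4):
    """Phong-style reflected intensity at a surface point.

    to_source, normal, to_viewer: unit vectors pointing toward the radiation
    source, along the surface normal, and toward the observer. The
    coefficients k_d (diffuse) and k_s (specular) are illustrative values,
    not parameters from the paper.
    """
    # Diffuse (Lambertian) term: depends only on the angle of incidence.
    diffuse = k_d * max(np.dot(to_source, normal), 0.0)
    # Mirror the source direction about the normal to get the specular lobe axis.
    mirror = 2.0 * np.dot(to_source, normal) * normal - to_source
    # Specular term: peaks when the viewer lines up with the mirror direction.
    specular = k_s * max(np.dot(mirror, to_viewer), 0.0) ** shininess
    return diffuse + specular

# Radiation arriving at 45 degrees, observed exactly along the mirror direction:
n = np.array([0.0, 0.0, 1.0])
src = np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0)
view = np.array([-1.0, 0.0, 1.0]) / np.sqrt(2.0)
print(phong_reflected_intensity(src, n, view))  # diffuse plus full specular lobe
```

Summing contributions like this over a distribution of point sources and surface patches is, in outline, how the authors tally the net recoil on the spacecraft.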
The result: The Phong shading method confirms earlier work suggesting that the Pioneer anomaly results from heat effects aboard the spacecraft. It also offers a method with which to study similar effects aboard other spacecraft. The authors explain:
…the acceleration arising from thermal radiation effects has a similar order of magnitude to the constant anomalous acceleration reported [in a study of the anomaly published in 2002]. We believe that the chosen approach is most adequate for the study of this particular problem, taking into account all its specific characteristics. Moreover, this Phong shading method is well suited for future studies of radiation momentum transfer in other spacecraft.
And the paper concludes:
With the results presented here it becomes increasingly apparent that, unless new data arises, the puzzle of the anomalous acceleration of the Pioneer probes can finally be put to rest.
This is a useful result, and one that will now be scrutinized by the wider community. If its conclusions are accepted, we will have taken a step forward in identifying an effect that may need to be taken into account in future spacecraft operations. Just as important, we’ll have been able to rule out a line of investigation that seemed to open a door into new physics, meaning that the analysis of the Pioneer Anomaly, now more than a decade old, has borne fruit. This is exactly what good science should do, and while we might hope for breakthroughs into new theories, anomalies like these are just as valid as ways of testing and verifying accepted physical laws.
The paper is Francisco et al., “Modelling the reflective thermal contribution to the acceleration of the Pioneer spacecraft” (preprint).
The exact same conclusions had been reached by others already ten(!) years ago – but the ‘anomaly’ was kept alive artificially by certain interested parties from the physics to the space science to even the space advocacy world. There were even proposals to spend/waste hundreds of millions of bucks on a space mission to study the (non-)effect! One day sociologists of science may write papers about all that – certainly not the way science should work in an ideal world …
Which frequency of light is the high-gain antenna most reflective to? Radio for sure, but if it’s made of aluminium, solar UV can contribute to extra thrust.
Another option not discussed here is the damage the probe can accumulate from traveling through space at 11 km/s for several decades. A dust coating, even a very fine one, will modify its reflectivity on certain sides, and micrometeorites will pierce holes in machinery, damaging it (leading to premature shutdown of systems and energy not used up) or creating channels for thermal radiation to leak through in unpredictable directions.
These may add up to a significant degree, and cannot be accounted for by a deterministic model…
I take the opposite viewpoint of Paul and Daniel. Science will have suffered the worst sort of dysfunction if the Pioneer Anomaly gets swept under the convenient rug of “the plausible.” Even so, we will still have the Earth flyby anomalies and the so-called “A.U.” anomaly left uncovered. All three anomalies seem to be manifestations of a singular phenomenon — the latter two cannot be dismissed as heat radiation.
Heat-radiation models, like string theory, can be customized to fit any set of observational parameters. There is no limit on sophistication.
We should not be so easily impressed. Nothing has been resolved.
And it needn’t have cost 100’s of millions of dollars to do some authentic observational research. The New Horizons mission to Pluto could have been adapted to re-test the Anomaly if it was taken seriously enough in the first place. But Daniel is correct to point out that it wasn’t even then.
Still, that sum of money would have been a better investment than the 100’s of millions of dollars wasted on LIGO — which I expect future historians of science to gossip about as their white elephant of choice.
@Daniel. Thanks for posting this. I had no idea that this had already been fairly well explained. I think there was even a piece on this on NPR’s “Science Friday” not that long ago. I also seem to recall that The Planetary Society used this as part of its campaign for more donations.
Like most science, the “interesting” claims are reported, but never the disproofs, which cumulatively leads to a lot of misinformation that is hard to erase from the collective memes of our culture.
I expected a mundane explanation, but I must say I’m still a little disappointed. New Physics would have been more interesting. :)
This should remind us that thermal radiation can be used to convert heat to propulsion at 100% efficiency. I call this the flashlight drive, aka photon rocket. A high temperature heat source has all its radiation directed backwards by a reflector/sail. The drawback is that the Isp, being c, is quite a bit higher than optimal, leading to low thrust. However, if nuclear reactions provide the energy, optimal Isp is quite high, and the 100% energy efficiency could well compensate for the Isp mismatch and make the drive competitive with more complicated and less efficient schemes that rely on nuclear reaction products as reaction mass. I think someone working with Claudio Maccone once published on the method. Has this been seriously considered, or is there something wrong with it that I do not see?
Ah, well, Wikipedia, of course, has it all: http://en.wikipedia.org/wiki/Nuclear_photonic_rocket
If antimatter is the fuel, c is the optimal Isp, and a photonic rocket does not require reflecting gamma rays, just shielding and thermalizing them, which is not nearly as hard.
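The flashlight-drive thrust mentioned above is simply radiated power divided by the speed of light, F = P/c. A quick back-of-the-envelope sketch (the ~2.5 kW thermal output and ~250 kg spacecraft mass are rough Pioneer-like figures, assumed here for illustration rather than taken from mission specs):

```python
C = 299_792_458.0  # speed of light, m/s

def photon_thrust(power_w):
    """Ideal photon-rocket thrust: all radiated power beamed one way, F = P/c."""
    return power_w / C

# Rough Pioneer-like figures (assumptions, not mission specs):
thrust = photon_thrust(2500.0)   # newtons, for ~2.5 kW of thermal power
accel = thrust / 250.0           # m/s^2, for a ~250 kg spacecraft
print(f"thrust = {thrust:.2e} N, accel = {accel:.2e} m/s^2")
```

Beaming the full output one way gives an acceleration of a few times 10⁻⁸ m/s², several tens of times the anomalous 8.74 × 10⁻¹⁰ m/s², which is why only a modest anisotropy in the spacecraft’s thermal emission is needed to account for the anomaly.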
This is a very interesting and intriguing paper. Even though it’s late at night I just had to go through it. From a qualitative perspective it looks good.
The main thing I wondered about is where the additional radiative acceleration could be coming from since (as they describe in detail) the power and time evolution of the RTG is known to high accuracy. It seems that it all comes down to whether the IR from the RTG (almost all its 2.5 kW goes directly to heat) is reflected or absorbed by the various surfaces illuminated, and then secondary and tertiary (etc.) absorptions and reflections.
This matters since there is a large difference in the acceleration due to absorption versus reflection. When you illuminate a surface (mostly unpolished aluminum in this case), part of the IR is absorbed and part is reflected. If absorbed, the heat diffuses quickly through the metal, so the opposite surface stays close in temperature; the reradiated IR is then fairly equal on both sides and cancels. If reflected, the effect is twice as great as emission alone from the surface.
The trick is in figuring out the amount of reflection and whether it is specular or diffuse, with the latter quality determining the secondary etc. reflections and absorptions. Then they let the computer crunch the numbers.
They also did a time evolution study and a sensitivity analysis which further buttressed their conclusions. I can’t repeat their calculations (nor would I want to!) but this analysis looks top notch. As Paul says, it’ll be interesting to see how well it passes review.
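The factor-of-two difference between absorption and reflection described above can be made concrete with a minimal sketch (normal incidence only, a simplification of the real angular geometry):

```python
C = 299_792_458.0  # speed of light, m/s

def recoil_force(power_w, mode):
    """Recoil force from radiation of power `power_w` hitting a surface at
    normal incidence.

    Simplified illustration: absorbed photons deposit their momentum once;
    specularly reflected photons have it reversed, doubling the push.
    """
    if mode == "absorbed":
        return power_w / C
    if mode == "reflected":
        return 2.0 * power_w / C
    raise ValueError(f"unknown mode: {mode}")

ratio = recoil_force(100.0, "reflected") / recoil_force(100.0, "absorbed")
print(ratio)  # 2.0
```

At oblique incidence the reflected case picks up a cosine factor along the normal, which is part of what the Phong-based bookkeeping has to track.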
I’m all for mundane explanations about the anomaly but I don’t think this paper adds much certainty about the radiation pressure cause (which is the most likely in my layman opinion).
If they want to model the reflections, why not use a more recent, more physical model? It sounds like they chose Phong in order to get the expected results.
To some these anomalies are what constitute science. To us science is not a list of facts but a journey into the unknown. To us there should be great diligence taken before an established mystery should be labelled solved.
I can’t help thinking of Luis Alvarez who claimed that he drew early inspiration toward science from the ‘death of the dinosaurs’ that featured in many science books written for young children. When, as an adult, he showed the K-T layer was created by a large impact, he thought he had found the answer. Little did he know that this mystery was already thought solved and thus prematurely destroyed as a mechanism to inspire more generations of children.
To me the whole problem is best illustrated by the treatment of the ‘cold fusion’ episode. To this day it still annoys me if anyone mistakenly believes that I think Fleischmann and Pons were probably right. That is an unimportant side issue. What I care about is whether due care was paid in the examination of their results.
I would be happiest if we accepted the possibility that the actual real world may be too complex for the human mind to comprehend, and that science just keeps building ever more predictive models that we can understand. Only then would some stop imagining that every part should fit together. We should then forever continue seeking the exceptions to old well-established theories, and not just to new hypotheses.
This thermal model predicts that as the spacecraft cooled there should have been a decay in the anomaly by almost half over the 30 years of available data. This prediction disagrees with the data since no decay was seen.
Mike, from (16) in the paper the half-life of the RTG thermal output is 87.72 years. Given 30 years, the output decline is 21%, not 50%. Also, they did do a time-evolution analysis. As Paul noted in the passage he quoted, the available data fit a 50-year linear decay profile (among others, since the error bars allow some flexibility in the fit).
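The 21% figure follows directly from exponential decay with the 87.72-year half-life quoted from the paper:

```python
HALF_LIFE_YEARS = 87.72  # Pu-238 thermal half-life, as quoted from eq. (16)

def rtg_power_fraction(years):
    """Fraction of the initial RTG thermal output remaining after `years`."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

decline = 1.0 - rtg_power_fraction(30.0)
print(f"decline over 30 years: {decline:.0%}")  # about 21%
```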
panini, I expected to see some discussion about the suitability of Phong so I am unsure what to think of that. However, if my (modest) understanding of Phong is reasonably correct, the only important departure from reality is in the region of low-incidence angle reflection from a rough surface. In other words, the Fresnel zone where diffuse reflection becomes specular. My guess is that this difference does not bear much on the total result. They are not attempting to render an animated movie, just getting the reflection angles approximately right. I doubt they could do better since the texture of the spacecraft’s surfaces would not be too precisely known.
Predictably it seems the true believers are out in force, from panini’s insinuations that there is some kind of conspiracy to bury the anomaly to Erik Anderson’s post which appears to claim that if something can be explained by known physics then science must be suffering some kind of dysfunction.
I have no idea why so many people so desperately want this to be caused by new physics that they set the bar for evidence this low. Sorry, new physics is not a reasonable null hypothesis. Moving on…
True believers? I think not. Most of the comments here express a willingness to abandon interest in the problem already. For my own part, I too am willing to abandon my interest if the Anomaly is properly tested with new experiments yielding negative results. This has not yet been done. What I object to is the attitude that we needn’t bother.
Also, I do not believe that the nature of the Pioneer Anomaly is rooted in “new” physics per se. The acceleration is not real. It’s merely an artifact in the telemetry. With a better understanding of well-established physical principles, this would in fact be what we’d expect to find.
I’m with Andy above on this. The sociology of “anomalies” seems to be fairly predictable: a substantial number of people will always give the benefit of the doubt to “new physics” where precisely the opposite is warranted. Some subset of those can NEVER be convinced by any analysis, regardless of its merit, if “new physics” is excluded. Though I don’t have any evidence that is more than anecdotal, it seems to me there’s an inverse correlation between how much physics one knows and how willing a person is to believe “new physics” is warranted.
It was my previous understanding that the majority (but, of course, not all) of the Pioneer anomaly would have to be explained by other causes before MOND could be used to explain the residual. Superficially, this seems to be exactly what this paper claims to have done. This makes me even more puzzled as to why so many are claiming that this is a vindication of classical physics. Is this a case of human nature trumping scientific method or am I missing something?
Rob Henry: the question is whether there is any residual to explain. According to this paper, known physics can explain the anomaly to the point where there is no statistically-significant residual. Furthermore if there is new physics it is not necessarily MOND.
andy, this is not my field and my ignorance is thus great. Perhaps I can be forgiven if I have only previously come across work ruling out MOND, due to the scale mismatch of the effect. You could thus enlighten me as to all the other new-physics hypotheses that have been ruled out by this new residual scale.
I note that, to the casual observer, the statistical significance given for the new residual in the preprint means that you must be using the term ‘no significance’ in a nonstatistical way. Perhaps you meant to write ‘this is still a deep and stimulating mystery, but one that now offers more hope of solution without invoking new physics’. I could certainly agree to that.
Hi All
The “real anomalies” are more interesting IMO – the odd ephemeris errors of Saturn in particular, the slow expansion of Earth’s orbit, and the slow change in the Moon’s eccentricity. These are less easy to write off as telemetry issues. What do they mean?
Error bars are not mysterious. All measurements have them. I suppose if you listen to the noise long enough you’ll start hearing things.
Hi Ron
Not error bars. Those are much smaller. I’m talking real effects, measurable thanks to space probes and lunar laser reflectors.
Adam, I saw your comment but wasn’t thinking of it in my last comment; I was making a more general response to the tone of some other posts in this thread. I don’t even follow the items you mention, although I am slightly familiar with them.
However, I will say that the term “anomaly” should be used carefully. There is a trap waiting there for the unwary who (if you’ll excuse a theological analogy) could find themselves making the equivalent of the god-of-the-gaps argument. That is: here’s something we don’t fully understand or haven’t studied sufficiently to gain that understanding, therefore I will project my hopes and desires for novel physics into that gap.
This is particularly pertinent to matters of celestial dynamics since gravitation, including the Newtonian variety, is chaotic – I mean that in the mathematical sense. Good numerical simulations show some truly extraordinary effects in our own, familiar solar system. The Moon, for example, is one of the most difficult bodies to model with respect to its position and orbit. There are good reasons why this happens; it has been studied to death and is known to be ultimately intractable. More specifically, it is unknown whether the Solar System is stable; orbital eccentricities are known to pop up here and there among all bodies in the system, some of which damp out, reverse, and reappear with a different amplitude, and may in fact grow without bound and without predictability (i.e. a body may get ejected from the system).
Whether the particular effects you mention fall into this class of problems I can’t say without knowing more about your references. However, I don’t plan on looking into it. You may want to do so if that’s your interest. Do a paper search on authors such as Wisdom.
Cosmic Optical Background: the View from Pioneer 10/11
Y. Matsuoka, N. Ienaka, K. Kawara, S. Oyabu
(Submitted on 22 Jun 2011)
We present the new constraints on the cosmic optical background (COB) obtained from an analysis of the Pioneer 10/11 Imaging Photopolarimeter (IPP) data. After careful examination of data quality, the usable measurements free from the zodiacal light are integrated into sky maps at the blue (~0.44 um) and red (~0.64 um) bands. Accurate starlight subtraction is achieved by referring to all-sky star catalogs and a Galactic stellar population synthesis model down to 32.0 mag.
We find that the residual light is separated into two components: one component shows a clear correlation with thermal 100 um brightness, while another betrays a constant level in the lowest 100 um brightness region. Presence of the second component is significant after all the uncertainties and possible residual light in the Galaxy are taken into account, thus it most likely has the extragalactic origin (i.e., the COB). The derived COB brightness is (1.8 +/- 0.9) x 10^(-9) and (1.2 +/- 0.9) x 10^(-9) erg/s/cm2/sr/A at the blue and red band, respectively, or 7.9 +/- 4.0 and 7.7 +/- 5.8 nW/m2/sr.
Based on a comparison with the integrated brightness of galaxies, we conclude that the bulk of the COB is comprised of normal galaxies which have already been resolved by the current deepest observations. There seems to be little room for contributions of other populations including “first stars” at these wavelengths.
On the other hand, the first component of the IPP residual light represents the diffuse Galactic light (DGL) – scattered starlight by the interstellar dust. We derive the mean DGL-to-100 um brightness ratios of 2.1 x 10^(-3) and 4.6 x 10^(-3) at the two bands, which are roughly consistent with the previous observations toward denser dust regions. Extended red emission in the diffuse interstellar medium is also confirmed.
Comments: Accepted for publication in The Astrophysical Journal
Subjects: Cosmology and Extragalactic Astrophysics (astro-ph.CO); Galaxy Astrophysics (astro-ph.GA)
Cite as: arXiv:1106.4413v1 [astro-ph.CO]
http://arxiv.org/abs/1106.4413
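As a sanity check, the abstract’s nW/m²/sr figures follow from its erg/s/cm²/sr/Å values by multiplying by wavelength and converting units. The wavelengths 4400 Å and 6400 Å below are read off the abstract’s ~0.44 and ~0.64 um bands; the function name is ours:

```python
def band_brightness_nw(i_lambda_cgs, wavelength_angstrom):
    """Convert I_lambda [erg/s/cm^2/sr/A] to lambda*I_lambda [nW/m^2/sr].

    Conversion factors: erg/s -> W is 1e-7; /cm^2 -> /m^2 is 1e4; W -> nW is 1e9.
    """
    lambda_i_lambda = i_lambda_cgs * wavelength_angstrom  # erg/s/cm^2/sr
    return lambda_i_lambda * 1e-7 * 1e4 * 1e9

blue = band_brightness_nw(1.8e-9, 4400.0)  # ~7.9 nW/m^2/sr, matching the abstract
red = band_brightness_nw(1.2e-9, 6400.0)   # ~7.7 nW/m^2/sr, matching the abstract
print(blue, red)
```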
NASA Releases New Pioneer Anomaly Analysis
The mysterious force acting on the Pioneer spacecraft seems to be falling exponentially. That’s a strong clue that on-board heat is to blame, says NASA
kfc 07/20/2011
In the early 1970s, NASA sent two spacecraft on a roller coaster ride towards the outer Solar System. Pioneer 10 and 11 travelled past Jupiter (and Saturn in Pioneer 11’s case) and are now heading out into interstellar space. But in 2002, physicists at NASA’s Jet Propulsion Laboratory in Pasadena noticed a puzzling phenomenon. The spacecraft are slowing down. Nobody knows why, but NASA analysed 11 years of tracking data for Pioneer 10 and 3 years for Pioneer 11 to prove it.
This deceleration, the Pioneer anomaly, has become one of the biggest problems in astrophysics. One idea is that gravity is different at these distances (Pioneers 10 and 11 are now at 30 and 70 AU). That would be the most exciting conclusion. But before astrophysicists can accept this, other more mundane explanations have to be ruled out.
Chief among these is the possibility that the deceleration is caused by heat from the spacecraft’s radioactive batteries, which may radiate more in one direction than another.
Back in March, physicists from Europe claimed that a new computer model of heat emission from the spacecraft had finally nailed the problem. This proved that heat was to blame, they said. NASA, which has its own team looking at this, has kept quiet about this result and today we can see why.
Slava Turyshev at JPL and a few pals say they’ve trawled through JPL’s records looking for more data. And they’ve found it. These guys say they’ve been able to double the datasets for both spacecraft. That increases the tracking data for Pioneer 10 to 23 years and Pioneer 11 to 11 years. That’s a jump from 20,055 to 41,054 data points for Pioneer 10 and from 10,616 to 81,537 for Pioneer 11.
So what does it show? Firstly, the new data confirms that the anomalous deceleration exists. But it throws up something interesting. Turyshev and co say there appears to be an exponential drop in the size of the anomalous deceleration over time. It’s not easy to see in the data for sure, but there are certainly signs it is there.
Pioneer 10 and 11 are powered by the radioactive decay of plutonium-238, which of course decays exponentially. That’s an important clue. NASA is currently performing its own computer simulation of the way that heat is emitted by the spacecraft to see whether it can explain the new dataset.
All the clues point to the notion that heat can explain the Pioneer anomaly. As Turyshev and co put it: “The most likely cause of the Pioneer anomaly is the anisotropic emission of on-board heat.” So it looks as if NASA is set to agree with the European conclusion and that astronomers will soon be able to put this great mystery to rest once and for all.
Ref: http://arxiv.org/abs/1107.2886: Support For Temporally Varying Behavior Of The Pioneer Anomaly From The Extended Pioneer 10 and 11 Doppler Data Sets
http://www.technologyreview.com/blog/arxiv/27175/
GPS Satellites Could Solve Flyby Anomaly
Spacecraft flying past Earth undergo a puzzling change in speed and nobody knows why. The next generation of navigation satellites could help, say scientists
kfc 09/19/2011
Now that the Pioneer anomaly has been more or less laid to rest, the outstanding space-based puzzle of the moment is the flyby anomaly.
This is how the arXiv Blog described the phenomenon back in 2008:
“On 8 December 1990, something strange happened to the Galileo spacecraft as it flew past Earth on its way to Jupiter. As the mission team watched, the spacecraft’s speed suddenly jumped by 4 mm per second. Nobody took much notice — a few mm/s is neither here or there to mission planners.
Then on 23 January 1998, the same thing happened to NASA’s Near spacecraft as it swung past Earth. This time its speed jumped by 13 mm/s.
The following year, Cassini’s speed was boosted by 0.11mm/s during its Earth fly-by.
And people finally began to ask questions when the Rosetta spacecraft’s speed also jumped by 2 mm/s during its 2005 close approach.”
Nobody knows what causes these strange hiccups in spacecraft speed but there is no shortage of theories, some of which we’ve discussed here and here.
If scientists are ever to get to the root of this phenomenon, they need to have a way of measuring it repeatedly, unambiguously and in detail.
But flybys are few and far between. And even when they do occur, NASA’s Deep Space Network, which monitors spacecraft from the ground, is not designed to study the effect in detail.
The most serious problem is that the network cannot follow spacecraft when they are very close to Earth. This results in a gap in communications during a flyby lasting a few hours, just when the most interesting effect is happening.
As a result, the fly-by anomaly has never been caught in flagrante. Instead, it arises as the difference between the observed and expected velocity after a flyby.
Today, Orfeu Bertolami at the University of Porto in Portugal and a few buddies suggest a way out of this conundrum. They say the next generation of global navigation satellite systems ought to be able to help. These should be capable of detecting the expected change in speed of just a few millimetres per second.
These guys calculate that a microsatellite fitted with a device capable of receiving signals from any of the satellite navigation systems would cost less than $15 million. And it might be considerably less if the necessary gear were bolted onto an existing spacecraft intended for a flyby or the kind of highly elliptical orbit likely to demonstrate the anomaly.
That’s chickenfeed to most space agencies, which means we’re likely to see an attempt to measure the phenomenon in the not too distant future.
The agency most likely to take the bait is the European Space Agency which is about to deploy a GPS rival constellation called Galileo.
If it needs a scientific mission to raise the profile of Galileo and show off its potential, it need look no further.
Ref: http://arxiv.org/abs/1109.2779: Probing The Flyby Anomaly With The Galileo Constellation
The Planetary Society blog piece on the Pioneer Anomaly – yet another example of Occam’s Razor:
http://www.planetary.org/blog/article/00003459/