Centauri Dreams
Imagining and Planning Interstellar Exploration
We Are the Music: Reflections on Galactic Immensity
While I’m immersed in the mechanics of exoplanet detection and speculation about the worlds uncovered by Kepler, TESS and, soon, the Roman Space Telescope (not to mention what’s coming with the Extremely Large Telescopes), I’m daunted by a single fact. We keep producing great art showing what exoplanets in their multitudes look like, but we can’t actually see them. Or I should say that the few visual images we have captured thus far are less-than-satisfying blobs of light marking hot young worlds.
Please don’t interpret this as in any way downplaying the heroic work of scientists like Anne-Marie Lagrange (LESIA, Observatoire de Paris) on Beta Pictoris b and all the effort that has gone into producing the 70 or so images of exoplanets thus far found. I’m actually just pointing out how difficult seeing an exoplanet close up would be, for the goal of interstellar flight that animates our discussions remains hugely elusive. The work continues, and who knows, maybe in a century we’ll get a close-up of Proxima Centauri b. Until then, I need periodically to return to deep sky objects to refresh the part of me that needs sensual imagery (and also accounts for my love of Monet).
Herewith some images that even the greatest of the Impressionists would be hard pressed to equal. We’re looking at the Milky Way with new eyes thanks to two related projects, the VISTA Variables in the Vía Láctea (VVV) survey and the companion VVV eXtended (VVVX) survey. Roberto Saito (Universidade Federal de Santa Catarina, Brazil) is lead author of the paper introducing this work, a paper with close to 100 co-authors. VISTA is the European Southern Observatory’s Visible and Infrared Survey Telescope for Astronomy, run out of the Paranal Observatory in Chile. Its great tool is the infrared camera VIRCAM, which opens up areas otherwise hidden by dust and gas.
Image: This collage highlights a small selection of regions of the Milky Way imaged as part of the most detailed infrared map ever of our galaxy. Here we see, from left to right and top to bottom: NGC 3576, NGC 6357, Messier 17, NGC 6188, Messier 22 and NGC 3603. All of them are clouds of gas and dust where stars are forming, except Messier 22, which is a very dense group of old stars. The images were captured with ESO’s Visible and Infrared Survey Telescope for Astronomy (VISTA) and its infrared camera VIRCAM. The gigantic map to which these images belong contains 1.5 billion objects. The data were gathered over the course of 13 years as part of the VISTA Variables in the Vía Láctea (VVV) survey and its companion project, the VVV eXtended survey (VVVX). Credit: ESO/VVVX survey.
We’re dealing with some 200,000 images covering an area of sky equivalent to 8600 full moons, according to ESO, and 10 times more objects than the same team released in 2012. The observations behind the map began in 2010 and ended early in 2023. Working over that timeframe allowed scientists to chart brightness changes and movement that can be useful in calculating distances on this huge scale.
Image: This image shows the regions of the Milky Way mapped by the VISTA Variables in the Vía Láctea (VVV) survey and its companion project, the VVV eXtended survey (VVVX). The total area covered is equivalent to 8600 full moons. The Milky Way comprises a central bulge — a dense, bright and puffed-up conglomeration of stars — and a flat disc around it. Red squares mark the central areas of our galaxy originally covered by VVV and later re-observed by VVVX: most of the bulge and part of the disc at one side of it. The other squares indicate regions observed only as part of the extended VVVX survey: even more regions of the disc at both sides (yellow and green), areas of the disc above and below the plane of the galaxy (dark blue) and above and below the bulge (light blue). The numbers indicate the galactic longitude and latitude, which astronomers use to chart objects in our galaxy. The names of various constellations are also shown. Credit: ESO/VVVX survey.
The twin surveys have already spawned more than 300 scientific papers while producing a dataset too large to release as a single image, although the processed data and objects catalog can be found at the ESO Science Portal. More than 4000 hours of observation went into the work, and while the twin projects cover about 4 percent of the celestial sphere, the region covered contains the majority of the Milky Way’s stars and the largest concentration of gas and dust in the galaxy.
Clearly, a survey like this will be useful for observations from future instruments like the Vera Rubin Observatory, which will deploy an 8.4-meter mirror and the largest camera ever built for astronomy and astrophysics in a deep survey of the southern hemisphere at optical wavelengths. Instruments like the James Webb Space Telescope are obviously able to home in on objects with much higher resolution but cannot be used for broad area surveys of this kind. Next generation ground-based instruments will use the survey in compiling their target lists, and eventually the Roman Space Telescope will be able to produce deep infrared images of large regions with higher resolution.
As the paper notes:
…there are many more applications of this ESO Public Survey for the community to exploit for future studies of Galactic structure, stellar populations, variable stars, star clusters of all ages, among other exciting research areas, from stellar and (exo)planetary astrophysics to extragalactic studies. The image processing, data analysis and scientific exploitation will continue for the next few years, with many discoveries yet to come. The VVVX Survey will also be combined with future facilities to boost its scientific outcome in unpredictable ways: we are sure that this survey will remain a goldmine for MW studies for a long time.
But I fall back on sheer aesthetics this morning. As witness starbirth:
Image: A new view of NGC 3603 (left) and NGC 3576 (right), two stunning nebulas imaged with ESO’s Visible and Infrared Survey Telescope for Astronomy (VISTA). This infrared image peers through the dust in these nebulas, revealing details hidden in optical images. NGC 3603 and NGC 3576 are 22,000 and 9,000 lightyears away from us, respectively. Inside these extended clouds of dust and gas, new stars are born, gradually changing the shapes of the nebulas via intense radiation and powerful winds of charged particles. Given their proximity, astronomers have the opportunity to study the kind of intense star formation that is common in other galaxies but harder to observe there due to the vast distances. The two nebulas were catalogued by John Frederick William Herschel in 1834 during a trip to South Africa, where he wanted to compile stars, nebulas and other objects in the sky of the southern hemisphere. This catalogue was then expanded by John Louis Emil Dreyer in 1888 into the New General Catalogue, hence the NGC identifier in these and other astronomical objects. Credit: ESO/VVVX survey.
And a nebula inset into a riotous field of stars:
Image: This image shows a detailed infrared view of Messier 17, also known as the Omega Nebula or Swan Nebula, a stellar nursery located about 5500 light-years away in the constellation Sagittarius. This image is part of a record-breaking infrared map of the Milky Way containing more than 1.5 billion objects. ESO’s VISTA ― the Visible and Infrared Survey Telescope for Astronomy ― captured the images with its infrared camera VIRCAM. The data were gathered as part of the VISTA Variables in the Vía Láctea (VVV) survey and its companion project, the VVV eXtended survey (VVVX). Credit: ESO/VVVX survey.
The vistas opening up with our new technologies inspire a deep sense of humility. We are within and a part of what we are observing, which forces us continually to look with new eyes. I think of Carl Sagan’s frequent admonition that we are made of star-stuff. Or as T. S. Eliot put it in “The Dry Salvages” (from Four Quartets):
For most of us, there is only the unattended
Moment, the moment in and out of time,
The distraction fit, lost in a shaft of sunlight,
The wild thyme unseen, or the winter lightning
Or the waterfall, or music heard so deeply
That it is not heard at all, but you are the music
While the music lasts.
We are the music. The immense VISTA data-trove will advance further discovery while igniting and shaping our imagination. Perspective frames the seasoned mind.
The paper is Saito et al, “The VISTA Variables in the Vía Láctea extended (VVVX) ESO public survey: Completion of the observations and legacy,” Astronomy & Astrophysics Vol. 689, A148 (September 2024). Full text.
Habitability around F-class Stars
Some years back I read a science fiction story in which the planet where the action took place orbited an F-class star. That was sufficiently odd to catch my eye, and I began paying attention to these stars, which represent on the order of 3 percent of all stars in the galaxy. Stars like our G-class Sun weigh in at about 7 percent, while the vast majority of stars are M-dwarfs, still our best chance for life detection because of the advantages they offer to our observing technologies, including deep transits and lower stellar brightness for direct imaging purposes.
F-stars are intriguing despite the fact that they tend to be somewhat larger than the Sun (up to 1.4 times its mass) and also hotter (temperatures in the range of 6200-7200 K). Back in 2014, I looked at the work of Manfred Cuntz (University of Texas at Arlington), who had performed a study examining radiation levels in these stars and the damage that DNA would experience with an F-star in the sky at various stages of stellar evolution. We’re dealing here with a shorter life expectancy than the Sun, usually reckoned in the range of 2-8 billion years on the main sequence depending on mass.
We’re also dealing with a larger habitable zone, a width 1.5 to 4 times greater than in the case of the Sun, again depending on the mass of the star and the climate models used to calculate the HZ. So there are advantages, for in the 2014 work, Cuntz and team found that the outer regions of the HZ experience tolerable levels of UV radiation. Now Cuntz has pushed the F-star work forward with a new paper, working with lead author Shaan Patel, a UTA grad student, and colleague Nevin Weinberg. The new work embarks on a statistical analysis of planet-hosting F-class stars drawn from data in the NASA Exoplanet Archive, which is a resource I don’t link to often enough. Says Cuntz:
“F-type stars are usually considered the high-luminosity end of stars with a serious prospect for allowing an environment for planets favorable for life. However, those stars are often ignored by the scientific community. Although F-type stars have a shorter lifetime than our Sun, they have a wider HZ. In short, F-type stars are not hopeless in the context of astrobiology.”
Image: The habitable zone as visualized around different types of star. Credit: NASA.
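Before digging into the statistics, it helps to see why a hotter star has a wider habitable zone at all. Here is a minimal sketch of the underlying scaling: HZ edges lie where the stellar flux matches chosen thresholds, so their distances grow as the square root of luminosity. The S_eff thresholds and the 3-solar-luminosity example star are my illustrative assumptions, not values from the Patel paper.

```python
import math

# Habitable zone edges lie where the received stellar flux equals a chosen
# effective threshold S_eff (in units of Earth's insolation). For a star of
# luminosity L (solar units), that distance is d = sqrt(L / S_eff).
# The thresholds below are representative of the HZ literature; treat them
# as assumptions rather than the Patel et al. values.

def hz_edges_au(luminosity_solar, s_inner=1.1, s_outer=0.35):
    """Return (inner, outer) habitable zone distances in AU."""
    return (math.sqrt(luminosity_solar / s_inner),
            math.sqrt(luminosity_solar / s_outer))

for name, lum in [("Sun (G2 V)", 1.0), ("bright F star", 3.0)]:
    inner, outer = hz_edges_au(lum)
    print(f"{name}: {inner:.2f}-{outer:.2f} AU (width {outer - inner:.2f} AU)")
```

With these illustrative numbers the F-star zone comes out roughly 1.7 times wider than the Sun’s, at the conservative end of the 1.5 to 4 range cited above.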
The investigation yields 206 planetary systems, 18 of which offer a planet in the liquid water habitable zone for at least part of its orbit. The authors break these worlds down into categories based on the amount of time each spends in the HZ. It’s worth noting that all the currently known planets in the habitable zone of F stars are Jupiter-class worlds, so what we are thinking about here in terms of astrobiology is habitable moons, about which interesting new work continues to emerge. I also assume we’ll be finding terrestrial-class worlds around these stars with deeper investigation.
The exo-Jupiter orbiting 38 Virginis (HD 111998) is noteworthy for spending the entirety of its orbit in the habitable zone, something most of these worlds do not do. Now things get intriguing. There are reasons for including planets whose orbital eccentricity allows only partial passage through the HZ, drawing on previous research (citation below) on atmospheric conditions for Earth-class planets in extremely elliptical orbits. That 2002 study found that despite large variations in surface temperature, long-term climate depends on the average stellar flux over the entire orbit, meaning that planets not in but near the HZ may still be potentially habitable, at least for extremophiles.
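The flux-averaging argument can be put in a single line. The orbit-averaged flux for a Keplerian orbit is a textbook result (it is the quantity the Williams & Pollard argument turns on, though the snippet below is my illustration, not code from either paper):

```python
import math

# Stellar flux averaged over one full Keplerian orbit, in units of Earth's
# insolation, for semimajor axis a (AU), eccentricity e, and luminosity L
# (solar units):  <F> = L / (a**2 * sqrt(1 - e**2))

def mean_flux(l_solar, a_au, e):
    return l_solar / (a_au**2 * math.sqrt(1.0 - e**2))

# A hypothetical planet at 2 AU around a 3-Lsun F-class star:
for e in (0.0, 0.3, 0.6):
    print(f"e = {e:.1f}: <F> = {mean_flux(3.0, 2.0, e):.2f} x Earth")
```

Even at e = 0.6 the orbit-averaged flux is only 25 percent higher than for a circular orbit of the same semimajor axis, which is why long excursions beyond the HZ need not doom the long-term climate.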
And we can possibly extend our definition of habitable zone. From the paper:
As part of our study, we also consider cushions for both HZ limits. This approach is informed by previous studies given by Abe et al. (2011) and Wordsworth et al. (2013). The former work deals with climate simulations for “land planets” (i.e., desert worlds with limited surface water), which based on those models have a significantly extended inner HZ limit than planets with abundant surface water (akin to Earth). Moreover, Wordsworth et al. (2013) continued to explore the outer limit of HZs by considering the impact of CO2, including CO2 clouds. They found that in their models the outer HZ is notably extended, commensurate to the Martian orbit in the solar system.
Image: This is Figure 10 from the paper. Caption: Depiction of all 18 systems that spend at least part of their time within their respective HZs. Empty markers in panel (c) represent actual planetary mass values as opposed to minimum mass values, which are represented by filled in markers. Credit: Patel et al.
Consider that the lowest-mass planet currently found in a habitable zone in all these systems has an estimated mass 143 times that of Earth and you’ll agree with the need to probe further into potentially habitable exomoons, about which we know next to nothing. Overall, with projects like the Habitable Worlds Observatory on the table, we should consider F-class stars as targets for deeper study. As lead author Patel says, “In future studies, our work may serve to investigate the existence of Earth-mass planets and also habitable exomoons hosted by exo-Jupiters in F-type systems.”
The paper is Patel et al., “Statistics and Habitability of F-type Star–Planet Systems,” The Astrophysical Journal Supplement Series Vol. 274, No. 1 (12 September 2024), 20 (full text). The paper on habitability in eccentric orbits is Williams & Pollard, “Earth-like worlds on eccentric orbits: excursions beyond the habitable zone,” International Journal of Astrobiology Vol. 1, Issue 1 (January, 2002), 61-68 (abstract).
The Long Afternoon of Earth
Every time I mention a Brian Aldiss novel, I have to be careful to check the original title against the one published in the US. The terrific novel Non-Stop (1958) became Starship in the States, rather reducing the suspense of decoding its strange setting. Hothouse (1962) became The Long Afternoon of Earth when abridged in the US following serialization in The Magazine of Fantasy & Science Fiction. I much prefer the poetic US title with its air of brooding fin de siècle decline as Aldiss imagines our deep, deep future.
Imagine a far-future Earth orbiting a Sun far hotter than today’s, the planet now tidally locked to its star, which Aldiss describes as “paralyzing half the heaven.” The planet is choked with vegetation so dense and rapidly evolving that humans are on the edge of extinction, living within a continent-spanning tree. The memory of reading all this always stays with me when I think about distant futures, which by most accounts involve an ever-hotter Sun and the eventual collapse of our biosphere.
Image: The dust jacket of the first edition of Brian Aldiss’ novel Hothouse.
Indeed, warming over the next billion years will inevitably affect the carbon-silicate cycle. Its regulation of atmospheric carbon dioxide is a process that takes CO2 all the way from rainfall through ocean sediments, their subduction into the mantle and the eventual return of CO2 to the atmosphere by means of volcanism. Scientists have thought that the warming Sun will cause CO2 to be drawn out of the atmosphere at rates sufficient to starve out land plants, spelling an end to habitability. That long afternoon of Earth, though, may be longer than we have hitherto assumed.
A new study now questions whether CO2 starvation is in fact the greatest threat, and in the process extends the lifetime of a habitable Earth far beyond the generally cited one billion years. The scientists involved apply ‘global mean models,’ which help to analyze how vegetation affects the carbon cycle. Lead author Robert Graham (University of Chicago), working with colleagues at Israel’s Weizmann Institute of Science, is attempting to better understand the mechanisms of plant extinction. Their new constraints on silicate weathering lead to the conclusion that the terrestrial biosphere will eventually succumb to temperatures near runaway greenhouse conditions. The biosphere dies from simple overheating rather than CO2 starvation.
The implications are intriguing and offer fodder for a new generation of science fiction writers working far-future themes. For in the authors’ models, the lifespan of our biosphere may be almost twice as long as previously expected. Decreases in plant productivity act to slow and eventually (if only temporarily) reverse the future decrease in CO2 as the Sun continues to brighten.
Here’s the crux of the matter: Rocks undergo weathering as CO2-laden rainwater carrying carbonic acid reacts with silicate minerals, part of the complicated process of sequestering CO2 in the oceans. The authors’ models show that if this process of silicate weathering is only weakly dependent on temperature – so that even large temperature changes have comparatively little effect – or strongly CO2 dependent, then “…progressive decreases in plant productivity can slow, halt, and even temporarily reverse the expected future decrease in CO2 as insolation continues to increase.”
From the paper:
Although this compromises the ability of the silicate weathering feedback to slow the warming of the Earth induced by higher insolation, it can also delay or prevent CO2 starvation of land plants, allowing the continued existence of a complex land biosphere until the surface temperature becomes too hot. In this regime, contrary to previous results, expected future decreases in CO2 outgassing and increases in land area would result in longer lifespans for the biosphere by delaying the point when land plants overheat.
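To see how those two dependencies trade off, here is a toy weathering law of the kind common in the carbon-cycle literature (a generic Walker-style parameterization with representative parameter values; my sketch, not the model the authors actually run):

```python
import math

# A toy silicate-weathering law of the Walker (1981) type:
#   W / W0 = (pCO2 / pCO2_ref)**beta * exp((T - T_ref) / T_e)
# beta sets the CO2 dependence; T_e (an e-folding temperature) sets the
# temperature dependence. Defaults are representative assumptions.

def relative_weathering(pco2, temp_k, beta=0.3, t_e=13.7,
                        pco2_ref=280e-6, t_ref=288.0):
    """Weathering rate relative to the modern baseline (W/W0)."""
    return (pco2 / pco2_ref)**beta * math.exp((temp_k - t_ref) / t_e)

# The same +10 K of warming in two regimes:
print(relative_weathering(280e-6, 298.0))                      # ~2.1x: strong T feedback
print(relative_weathering(280e-6, 298.0, beta=0.9, t_e=50.0))  # ~1.2x: weak T feedback
```

In the first regime the thermostat works mostly through temperature and draws down CO2 rapidly as insolation climbs; in the second, rebalancing against volcanic outgassing requires only a modest CO2 drop, the kind of behavior that delays plant starvation until overheating takes over.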
How much heat can plants take? The paper cites a grass called Dichanthelium lanuginosum that grows in geothermal settings (with the aid of a symbiotic relationship with a fungus) as holding the record for survival, at temperatures as high as 338 K (about 65 °C). The authors take this as the upper temperature limit for plants, adding this:
Importantly, with a revised thermotolerance limit for vascular land plants of 338 K, these results imply that the biotic feedback on weathering may allow complex land life to persist up to the moist or runaway greenhouse transition on Earth (and potentially Earth-like exoplanets). (Italics mine)
The long afternoon of Earth indeed. The authors point out that the adaptation of land plants (Aldiss’ continent-spanning tree, for example) could push their extinction to even later dates, limited perhaps by the eventual loss of Earth’s oceans.
…an important implication of our work is that the factors controlling Earth’s transitions into exotic hot climate states could be a primary control on the lifespan of the complex biosphere, motivating further study of the moist and runaway greenhouse transitions with 3D models. Generalizing to exoplanets, this suggests that the inner edge of the “complex life habitable zone” may be coterminous with the inner edge of the classical circumstellar habitable zone, with relevance for where exoplanet astronomers might expect to find plant biosignatures like the “vegetation red edge” (Seager et al. 2005).
The paper is Graham, Halevy & Abbot, “Substantial extension of the lifetime of the terrestrial biosphere,” accepted at Planetary Science Journal (preprint).
Beamed Propulsion and Planetary Security
Power beaming to accelerate a ‘lightsail’ has been pondered since the days when Robert Forward became intrigued with nascent laser technologies. The Breakthrough Starshot concept has been to use a laser array to drive a fleet of tiny payloads to a nearby star, most likely Proxima Centauri. It’s significant that a crucial early decision was to place the laser array that would drive such craft on the Earth’s surface rather than in space. You would think that a space-based installation would have powerful advantages, but two immediate issues drove the choice, the first being political.
The politics of laser beaming can be complicated. I’m reminded of the obligations involved in what is known as the Treaty on Principles Governing the Activities of States in the Exploration and Use of Outer Space, including the Moon and Other Celestial Bodies (let’s just call it the Outer Space Treaty), a reminder spurred by a paper from Adam Hibberd that has just popped up on arXiv. The treaty, which falls under the purview of the United Nations Office for Outer Space Affairs, entered into force in 1967 and has 115 signatories globally.
Here’s the bit relevant for today’s discussion, as quoted by Hibberd (Institute for Interstellar Studies, London):
States Parties to the Treaty undertake not to place in orbit around the earth any objects carrying nuclear weapons or any other kinds of weapons of mass destruction, install such weapons on celestial bodies, or station such weapons in outer space in any other manner. The moon and other celestial bodies shall be used by all States Parties to the Treaty exclusively for peaceful purposes. The establishment of military bases, installations and fortifications, the testing of any type of weapons and the conduct of military manoeuvres on celestial bodies shall be forbidden. The use of military personnel for scientific research or for any other peaceful purposes shall not be prohibited. The use of any equipment or facility necessary for peaceful exploration of the moon and other celestial bodies shall also not be prohibited.
So we’re ruling out weaponry in orbit or elsewhere in space. Would that prohibit building an enormous laser array designed for space exploration? Hibberd believes a space laser would be permitted if its intention were for space exploration or planetary defense, but you can see the problem: Power beaming at this magnitude can clearly be converted into a weapon in the wrong hands. And what a weapon. A 10 km X 10 km installation as considered in Philip Lubin’s DE-STAR 4 concept generates 70 GW beams. You can do a lot with that beyond pushing a craft to deep space or taking an Earth-threatening asteroid apart.
Build the array on Earth and the political entanglements do not vanish but perhaps become manageable as attention shifts to how to avoid accidentally hitting commercial airliners and the like, including the effects on wildlife and the environment.
Image: Pushing a lightsail with beamed energy is a feasible concept capable of being scaled for a wide variety of missions. But where do we put the beamer? Credit: Philip Lubin / UC-Santa Barbara.
The second factor in the early Starshot discussions was time. Although the project has now slowed as its team looks at near-term applications for the technologies examined so far, Starshot was initially ramping up for deployment by mid-century. That’s pretty ambitious, and a space-based beamer could hardly be developed in time if that stretchiest of all stretch goals became a prerequisite.
So if we ease the schedule and assume we have the rest of the century or more to play with, we can again examine laser facilities off-planet. Moreover, Starshot is just one beamer concept, and we can back away from its specifics to consider an overall laser infrastructure. Hibberd’s choice is the DE-STAR framework (Directed Energy Systems for Targeting of Asteroids and Exploration) developed by Philip Lubin at UC-Santa Barbara and first described in a 2012 paper on planetary defense. The concept has appeared in numerous papers since, especially 2016’s “A Roadmap to Interstellar Flight.”
If the development of these ideas intrigues you, let me recommend Jim Benford’s A Photon Beam Propulsion Timeline, published here in 2016, as well as Philip Lubin’s DE-STAR and Breakthrough Starshot: A Short History, also from these pages.
What Hibberd is about in his new paper is to work out how far away various categories of laser systems would have to be to ensure the safety of our planet. This leads to a sequence of calculations defining different safe distances depending on the size of the installation. The DE-STAR concept is modular, a square phased array of lasers whose level number denotes its size: a DE-STAR n array is 10^n meters on a side. In other words, DE-STAR 0 is 1 meter to the side, DE-STAR 1 is 10 meters to the side, and so on. Here’s the chart Hibberd presents for the system (Table 1 in his paper).
Keep scaling up and you achieve arrays of stupendous size, and in fact an early news release from UC-Santa Barbara described a DE-STAR 6 as a propulsion system for a 10-ton interstellar craft. It’s hard to imagine the 1,000 kilometer array this would involve, although I’m sure Robert Forward would have enjoyed the idea.
So taking Lubin’s DE-STAR as the conceptual model (and sticking with the more achievable lower end of the DE-STAR scale), how can we lower the risks of this kind of array being used as a weapon? And that translates into: Where can we put an array so that even its largest iterations are too far from Earth to cause concern?
Hibberd’s calculations involve determining the minimum level of flux generated by an individual 1 meter aperture laser element (this is DE-STAR 0) – “the unphased flux of any DE-STAR n laser system” – and defining the theoretical minimum safe distance from Earth as the range at which this flux falls to a value on the order of 10 percent of the solar constant, meaning the average electromagnetic radiation per unit area received at Earth’s surface. The solar constant is 1361 watts per square meter (W/m²); Hibberd pares the allowance down to a maximum allowed flux of 100 W/m² and proceeds accordingly.
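For a feel of the scaling involved, here is a back-of-the-envelope sketch. It is not Hibberd’s calculation (he works from the unphased flux of the individual 1-meter elements); instead it takes the worst case of a fully phased beam pointed at Earth, with an assumed 1.06 μm laser line and a power density of roughly 700 W per square meter of aperture, scaled from the ~70 GW quoted for DE-STAR 4. The resulting distances are therefore more conservative than the paper’s.

```python
import math

# Worst-case (fully phased, Earth-pointing) safe-distance toy model.
# Far-field on-axis flux from an aperture of area A radiating power P:
#   F = P * A / (wavelength * R)**2
# Solve for the range R at which F drops to the allowed 100 W/m^2.

AU = 1.495978707e11    # meters per astronomical unit
WAVELENGTH = 1.06e-6   # m, assumed near-infrared laser line
POWER_DENSITY = 700.0  # W per m^2 of aperture (assumed, from DE-STAR 4's ~70 GW)
SAFE_FLUX = 100.0      # W/m^2, the paper's maximum allowed flux at Earth

def min_safe_distance_m(level):
    """Range at which a phased DE-STAR n beam dilutes to SAFE_FLUX."""
    side = 10.0 ** level                 # DE-STAR n is 10^n meters on a side
    power = POWER_DENSITY * side ** 2    # total radiated power, W
    return math.sqrt(power * side**2 / SAFE_FLUX) / WAVELENGTH

for n in range(1, 5):
    r = min_safe_distance_m(n)
    print(f"DE-STAR {n}: ~{r:.1e} m ({r / AU:.3g} AU)")
```

The takeaway is the scaling rather than the absolute numbers: each DE-STAR level multiplies this worst-case distance by 100 (a factor of 10 from the larger aperture and 10 from the square root of the hundredfold power increase), which is why the bigger arrays get pushed so far out into the Solar System.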
Now the problems of a space-based installation become strikingly apparent, for the calculations show that DE-STAR 1 (10 m X 10 m) would need to be positioned outside cis-lunar space to meet this standard, and even further away (beyond the Earth-Moon Lagrange 2 point) for ultraviolet wavelengths (λ ≲ 350 nm). That takes us out 450,000 kilometers from Earth. However, a position at the Sun-Earth L2 Lagrange location would be safe for a DE-STAR 1 array.
The distances keep mounting, and we also have to take orbital stability into account. The Sun/Earth Lagrange 4 and 5 points would allow a DE-STAR 2 laser installation to remain at a fixed location without on-board propulsion. DE-STAR 3 would have to be positioned beyond the asteroid belt, or even beyond Jupiter if we take ultraviolet wavelengths into account. The enormous DE-STAR 4 level array would need to be placed as far as 70 AU away.
All this assumes we are working with an array on direct line of sight with the Earth, but this does not have to be the case. Let me quote Hibberd on this, as it’s rather interesting:
Two such locations are the Earth/Moon Lagrange 2 point (on a line from the Earth to the Moon, extending beyond the Moon by ∼61,000 km) and the Sun/Earth Lagrange 3 point (at 1 au from the Sun and diametrically opposite the Earth as it orbits the Sun). In both cases, the instability of these points will result in the DE-STAR wandering away and potentially becoming visible from Earth, so an on-board propulsion would be needed to prevent this. One solution would be to use the push-back from the lasers to provide a means of corrective propulsion. However it would appear a DE-STAR’s placement at either of these points is not an entirely satisfactory solution to the problem.
So we can operate with on-board propulsion to achieve no direct line-of-sight to Earth, but the orbital instabilities involved make this problematic. Achieving the goal of a maximum safe flux at Earth isn’t easy, and we’re forced to place even DE-STAR 2 arrays at least 1 AU from the Sun at the Sun/Earth Lagrange 4 or 5 positions to achieve stable orbits. DE-STAR 3 demands movement beyond the asteroid belt at a minimum. DE-STAR levels beyond this will require new strategies for safety.
Back to the original surmise. Even if we had the technology to build a DE-STAR array in space in the near future, safety constraints dictate that it be placed at large distances from the Earth, making it necessary to have first developed an infrastructure within the Solar System that could support a project like this. As opposed to one-off missions from Earth launching before such an infrastructure is in place, we’ll need to have the ability to move freely at distances that ensure safety, unless other means of planetary protection can be ensured. Hibberd doesn’t speculate as to what these might be, but somewhere down the line we’re going to need solutions for this conundrum.
The paper is Hibberd, “Minimum Safe Distances for DE-STAR Space Lasers,” available as a preprint. Philip Lubin’s “A Roadmap to Interstellar Flight” appeared in Journal of the British Interplanetary Society 69, 40-72 (2016). Full text.
All the Light We Can See
I’ve reminisced before about crossing Lake George in the Adirondacks in a small boat late one night some years back, when I saw the Milky Way with the greatest clarity I had ever experienced. Talk about dark skies! That view was not only breathtaking on its own, but it also raised the point about what we can see where. Ponder the cosmic optical background (COB), which sums up everything that has produced light over the history of the universe. The sum of light can be observed with even a small telescope, but the problem is to screen out local sources. No telescope is better placed to do just this than the Long Range Reconnaissance Imager (LORRI) aboard the New Horizons spacecraft.
Deep in the Kuiper Belt almost 60 AU from the Sun, the craft has a one-way light time of over eight hours (Voyager 1, by comparison, shows a one-way light time of almost 23 hours at 165 AU). It’s heartening that we’re continuing to keep the Voyagers alive even as the options slowly diminish, but New Horizons is still robust and returning data from numerous instruments. No telescope anywhere sees skies as dark as LORRI. That makes measurements of the COB as authoritative as anything we’re likely to get soon.
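Those light times fall straight out of the distances, since the AU and the speed of light are defined constants:

```python
# One-way light travel time for the distances quoted above.

AU_M = 1.495978707e11  # meters per astronomical unit
C = 299_792_458.0      # speed of light, m/s

def one_way_light_time_h(distance_au):
    return distance_au * AU_M / C / 3600.0

print(f"New Horizons, 60 AU: {one_way_light_time_h(60):.1f} h")   # ~8.3 h
print(f"Voyager 1, 165 AU: {one_way_light_time_h(165):.1f} h")    # ~22.9 h
```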
Image: Not my view from the Adirondacks but close. The Milky Way is gorgeous when unobscured by city lights. Credit: Derek Rowley.
The issue of background light came to the fore in 2021, when scientists at the National Science Foundation’s NOIRLab put data from New Horizons’ 20.8 cm telescope to work. That effort involved measuring the light found in a small group of images drawn from deep in the cosmos. It suggested a universe that was brighter than it should be, as if there were uncounted sources of light. Now we have further analysis of observations made with LORRI in 2023, supplemented by data from ESA’s Planck mission, which aids in calibrating the dust density in the chosen fields of view. We learn that contamination from the Milky Way can explain the anomaly.
The new paper from lead author Marc Postman (Space Telescope Science Institute) studies light from 16 different fields carefully chosen to minimize the background light of our own galaxy which, of course, surrounds us and compromises our view. This new work, rather than using archival data made for other purposes, explicitly uses LORRI to create images minimizing foreground light sources. The conclusion is evidently air-tight, as laid out by Postman:
At the outset of this work we posed the question: Is the COB intensity as expected from our census of faint galaxies, or does the Universe contain additional sources of light not yet recognized? With our present result, it appears that these diverse approaches are converging to a common answer. Galaxies are the greatly dominant and perhaps even complete source of the COB. There does remain some room for interesting qualifications and adjustments to this picture, but in broad outline it is the simplest explanation for what we see.
And let me throw in this bit from the conclusion of the paper because it adds an interesting dimension to the study:
If our present COB intensity is correct, however, it means that galaxy counts, VHE γ-ray extinction, and direct optical band measurements of the COB intensity have finally converged at an interesting level of precision. There is still room to adjust the galaxy counts slightly, or to allow for nondominant anomalous intensity sources.
In other words, to fully analyze the COB, the scientists have included VHE (very high energy) gamma-ray extinction, meaning adjustments for the absorption of gamma rays as they travel to us. Although not visible at optical wavelengths, gamma rays interact with the photons of the COB in ways that can be measured, providing a check on the rest of the COB data. That analysis complements the count of known galaxies and the optical band measurements to produce the conclusion now achieved.
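A quick number shows why only very high energy gamma rays feel the COB. Pair production against a background photon switches on once the product of the two photon energies reaches (m_ec²)² for a head-on collision, a standard kinematic threshold (my illustration, not a calculation from the paper):

```python
# Threshold for gamma + gamma -> e+ + e- on a head-on background photon:
#   E_gamma * E_bg >= (m_e * c**2)**2

M_E_C2_EV = 0.511e6  # electron rest energy in eV

def threshold_gev(background_photon_ev):
    """Minimum gamma-ray energy (GeV) to pair-produce on a head-on photon."""
    return M_E_C2_EV ** 2 / background_photon_ev / 1e9

print(f"{threshold_gev(2.0):.0f} GeV")  # a ~2 eV visible photon absorbs >~131 GeV gammas
```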
I always find it interesting that there is both a deep satisfaction in solving a mystery and also a slight letdown, for let’s face it, odd things in the universe are fascinating, and they let our imaginations run wild. In this case, however, the issue seems resolved.
I don’t have to mention to this audience how much good science continues to get done by having a fully functioning probe this deep in the Kuiper Belt. From New Horizons’ vantage point, there is little to no effect from zodiacal light, which is the result of sunlight scattering off interplanetary dust. The latter is a key factor in the brightness of the sky in the inner Solar System and has made previous attempts to measure the COB from the inner system challenging. We now look ahead to New Horizons’ search for other Kuiper Belt Objects to explore and try to learn whether there is a second belt of debris beyond the known one, and thus between it and the inner Oort Cloud.
We’ll doubtless continue to find things that challenge our assumptions as we press on, a reminder that a successor to New Horizons and the Voyagers is still a matter of debate both in terms of mission design and funding. As to the cosmic optical background, we give up the unlikely but highly interesting prospect that any significant levels of light come from sources unknown to us. As the paper concludes: “…the simplest hypothesis appears to provide the best explanation of what we see: the COB is the light from all the galaxies within our horizon.”
The paper is Postman et al., “New Synoptic Observations of the Cosmic Optical background with New Horizons,” The Astrophysical Journal Vol. 972, No. 1 (28 August 2024), 95 (full text). The 2021 paper is Lauer et al., “New Horizons Observations of the Cosmic Optical Background,” The Astrophysical Journal Vol. 906, No. 2 (11 January 2021), 77 (full text).
Green Mars: A Nanotech Beginning
I want to return to Mars this morning because an emerging idea on how to terraform it is in the news. The idea is to block infrared radiation from escaping into space by releasing into the atmosphere engineered dust particles about half as long as the wavelength of this radiation, which is centered around 10 and 22 μm. Block those escape routes and the possibility emerges of warming Mars in a far more efficient way than has previously been suggested. The paper on this work even suggests a SETI implication (!), but more about that in a moment.
Grad student Samaneh Ansari (Northwestern University) is lead author of the paper, working with among others Ramses Ramirez (University of Central Florida), whose investigations into planetary habitability and the nature of the habitable zone have appeared frequently in these pages (see, for example, Revising the Classical ‘Habitable Zone’). The engineered ‘nanorods’ at the heart of the concept could raise the surface temperature enough to allow survivability of microbial life, which would at least be a beginning to the long process of making the Red Planet habitable.
As opposed to using artificial greenhouse gases, a method that would involve vast amounts of fluorine, which is scarce on the Martian surface, the nanorod approach takes advantage of the properties of the planet’s dust, which is lofted to high altitudes as an aerosol. The authors calculate, using the Mars Weather Research and Forecasting global climate model, that releasing 9-μm-long conductive nanorods made of aluminum, “not much smaller than commercially available glitter,” would provide the infrared blocking that natural dust cannot, and the rods, once lofted to high altitude, would settle back to the surface more slowly than natural dust.
What stands out in the authors’ modeling is that their method is over 5,000 times more efficient than other methods of terraforming, and relies on materials already available on Mars. Natural dust particles, you would think, should warm the planet if released in greater quantities, but the result of doing so is actually to cool the surface even more. Let me quote the paper on this counter-intuitive (to me at least) result:
Because of its small size (1.5-μm effective radius), Mars dust is lofted to high altitude (altitude of peak dust mass mixing ratio, 15 to 25 km), is always visible in the Mars sky, and is present up to >60 km altitude (14–15). Natural Mars dust aerosol lowers daytime surface temperature [e.g., (16)], but this is due to compositional and geometric specifics that can be modified in the case of engineered dust. For example, a nanorod about half as long as the wavelength of upwelling thermal infrared radiation should interact strongly with that radiation (17).
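That ‘half as long as the wavelength’ criterion is essentially the half-wave dipole rule from antenna theory: a thin conductive rod couples most strongly to radiation of roughly twice its length. A trivial sketch (the 9 μm rod length is from the paper; the factor-of-two resonance rule is an idealization):

```python
# Half-wave dipole rule of thumb: a thin conductive rod of length l
# interacts most strongly with radiation of wavelength ~2 * l.

def peak_wavelength_um(rod_length_um):
    return 2.0 * rod_length_um

rod = 9.0  # um, the nanorod length the authors model
print(f"A {rod:.0f}-um rod couples best near {peak_wavelength_um(rod):.0f} um,")
print("sitting between the ~10 and ~22 um windows through which Mars radiates heat.")
```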
Edwin Kite (University of Chicago) is a co-author on the work:
“You’d still need millions of tons to warm the planet, but that’s five thousand times less than you would need with previous proposals to globally warm Mars. This significantly increases the feasibility of the project… This suggests that the barrier to warming Mars to allow liquid water is not as high as previously thought.”
Image: This is Figure 3 from the paper. Caption: The proposed nanoparticle warming method. Figure credit: Aaron M. Geller, Northwestern, Center for Interdisciplinary Exploration and Research in Astrophysics + IT-RCDS.
Strikingly, the effects begin to emerge quite quickly. Within months of the beginning of the process, atmospheric pressure rises by 20 percent as CO2 ice sublimes, creating a positive warming feedback. Note this from the paper:
On a warmed Mars, atmospheric pressure will further increase by a factor of 2 to 20 as adsorbed CO2 desorbs (35), and polar CO2 ice (36) is volatilized on a timescale that could be as long as centuries. This will further increase the area that is suitable for liquid water (6).
That said, we’re still not in range for creating a surface habitable by humans. We have to deal with barriers to oxygenic photosynthesis, including the makeup of the Martian sands, which are laden with potentially toxic levels of nitrates, and an atmosphere with little oxygen. Toxic perchlorates in the soil would require ‘bioremediation’ involving perchlorate-reducing bacteria, which yield molecular oxygen as a byproduct. We’re a long way from creating an atmosphere humans can breathe, but we’re in range of the intermediate goal of warming the surface, possibly enough to sustain food crops.
Addendum: I made a mistake above, soon caught by Alex Tolley. Let me insert his comment here to straighten out my mistake:
“… which are laden with potentially toxic levels of nitrates,”
I think you misinterpreted the sentence from the paper:
“…is not sufficient to make the planet’s surface habitable for oxygenic photosynthetic life: barriers remain (7). For example, Mars’ sands have ~300 ppmw nitrates (37), and Mars’ air contains very little O2, as did Earth’s air prior to the arrival of cyanobacteria. Remediating perchlorate-rich soil…”
300 ppm nitrates is very low and will not support much plant or bacterial life. [You want ~ 10,000 ppm ] That is why N and P are added to simulated Mars regolith when testing plant growth for farming or terraforming. IIRC, there have been suggestions of importing nitrogen from Titan to meet its needs on Mars.
Thanks for catching this, Alex!
From the paper:

Although nanoparticles could warm Mars… both the benefits and potential costs of this course of action are now uncertain. For example, in the unlikely event that Mars’ soil contains irremediable compounds toxic to all Earth-derived life (this can be tested with Mars Sample Return), then the benefit of warming Mars is nil. On the other hand, if a photosynthetic biosphere can be established on the surface of Mars, perhaps with the aid of synthetic biology, then that might increase the Solar System’s capacity for human flourishing. On the cost side, if Mars has extant life, then study of that life could have great benefits that warrant robust protections for its habitat. More immediately, further research into nanoparticle design and manufacture coupled with modeling of their interaction with the climate could reduce the expense of this method.
That’s a robust way forward, one the authors suggest could involve wind tunnel experiments at Mars pressure to analyze how both dust and nanomaterials are released from modeled Mars surfaces, from dusty flat terrain to the ice of the poles. Large eddy simulations (LES) model larger flows such as winds and weather patterns; deploying these should be useful in learning how the proposed nanorods will disperse in the atmosphere, while local warming methods also demand consideration.
A question I had never thought to ask about terraforming was how long the effects can be expected to last, and indeed the authors point out how little is known about long-term sustainability. A 2018 paper on current loss rates in the Martian atmosphere suggests that it would take at least 300 million years to fully deplete the atmosphere. The big unknown here is the Martian ice, and what may lie beneath it:
…if the ground ice observed at meters to tens of meters depth is underlain by empty pore space, then excessive warming over centuries could allow water to drain away, requiring careful management of long-term warming. Subsurface exploration by electromagnetic methods could address this uncertainty regarding how much water remains on Mars deep underground.
Image: Will we ever get to this? The ‘nanorod’ approach could be the beginning. Credit: Daein Ballard, Wikimedia Commons CC BY-SA 3.0.
The SETI implication? Nanoparticle warming is efficient, so much so that we might expect other civilizations to use the technique. A potential technosignature emerges in polarized light: light scattering off nanoparticles, aerosols, or dust becomes polarized, and on a terrestrial world with a magnetic field, an atmosphere laden with the nanoparticles at work in terraforming could imprint a measurable polarization signal. This would be an elusive signature to spot, but not outside the range of possibility.
In the absence of an active geodynamo to drive a magnetic field, Mars would not be a candidate for this kind of remote observation. But an exoplanet of terrestrial class with a magnetic field should, by these calculations, be a candidate for this kind of study.
The paper is Ansari et al., “Feasibility of keeping Mars warm with nanoparticles,” Science Advances Vol. 10, No. 32 (7 August 2024). Abstract / Preprint. Thanks to Centauri Dreams reader Ivan Vuletich for the pointer to this paper.