Centauri Dreams

Imagining and Planning Interstellar Exploration

Getting Neptune into Focus

As a book-dazzled kid growing up in St. Louis, I had the good fortune to be surrounded by books from previous generations, and specifically those belonging both to my father and my half-brother, who had died long before I was born. Among these was a multi-volume encyclopedia from the 1920s I’ve never been able to identify. All I have is the memory of looking through its musty volumes and realizing that Pluto was not listed in it, as the publication date was a few years earlier than Clyde Tombaugh’s epic search for the world.

I do remember thinking that without Pluto, the Solar System only had eight planets, and musing in my teenage boy way about how odd this incomplete view of the Solar System was. Little did I know how much more was in store! As to that eighth planet, Neptune was a puzzler not only to the encyclopedia but to science fiction writers of the Gernsback era. Thus James Morgan Walsh’s “The Vanguard to Neptune,” published in Wonder Stories Quarterly in the Spring 1932 issue. On the cover by Frank R. Paul, that’s Neptune hanging in the sky, looking for all the world like a terrestrial planet, here seen from Triton. The explorers assume the blue areas are ice until they cross to the planet.

Image: Frank R. Paul’s cover illustration for J. M. Walsh’s “The Vanguard to Neptune.” Walsh (1897-1952) was an interesting figure in his own right for those of us who spent a career living off the printed word. Settling in the UK, the Australian novelist would pen an astounding 94 novels across a wide range of genres and under a variety of pseudonyms. It was possible to do that kind of thing in the pulp era.

Spurring these recollections are images of Neptune revealed in a new study of the planet’s cloud cover and its relation to the solar cycle. They’re so stunning that I wanted to reproduce them here, thinking about how our knowledge of the Solar System has advanced since my first acquaintance with the planet in that encyclopedia as no more than a speck of light amongst countless others. There’s also a bit of the thrill I felt as Voyager 2 approached Neptune back in 1989, deep in the summer night here. To see new worlds open before us. Astonishing.

I suppose one day we’ll get so completely accustomed to imaging exoplanets that such thrills will seem commonplace, or maybe not, given their sheer diversity. But the images below still work for me, the first set from Hubble.

Image: This sequence of Hubble Space Telescope images chronicles the waxing and waning of the amount of cloud cover on Neptune. This nearly 30-year-long set of observations shows that the number of clouds grows following a peak in the solar cycle, in which the Sun’s level of activity rhythmically rises and falls over an 11-year period. The Sun’s level of ultraviolet radiation is plotted on the vertical axis, with the 11-year cycle plotted along the bottom from 1994 to 2022. The Hubble observations along the top clearly show a correlation between cloud abundance and the peak of solar activity. The chemical changes are caused by photochemistry, which happens high in Neptune’s upper atmosphere and takes time to form clouds. Credit: NASA, ESA, LASP, Erandi Chavez (UC Berkeley), Imke de Pater (UC Berkeley).

I don’t mean to neglect the import of the paper that features these observations, which comes from astronomers at UC-Berkeley, or their conclusions, which use the numerous changes in the patterning of Neptune’s clouds to point to the connection with the flip in the Sun’s magnetic field every eleven years. It’s intriguing to learn that when the Sun emits more intense ultraviolet light, and in particular the strong hydrogen Lyman-alpha emission, there is increasing cloud cover on Neptune fully two years later.

Imke de Pater (UC-Berkeley) is senior author on the study:

“These remarkable data give us the strongest evidence yet that Neptune’s cloud cover correlates with the Sun’s cycle. Our findings support the theory that the Sun’s UV rays, when strong enough, may be triggering a photochemical reaction that produces Neptune’s clouds.”

We see 2.5 cycles of cloud activity on Neptune recorded over a 29-year period in observations not only from Hubble but also from the Keck and Lick Observatories, which also make clear a relationship between the number of clouds and the planet’s observed brightness. Below is the Keck imagery.

Image: A dramatic change in Neptune’s appearance was observed in late 2019 and has persisted through June 2023. As shown by this compilation of images at 1.63 µm (microns) obtained with the NIRC2 and adaptive optics system on the Keck II Telescope, Neptune had numerous cloud features organized in latitudinal bands from before 2002 through late 2019. Afterwards, clouds appeared almost absent except near the south pole. The images are displayed using an asinh function which, like a log-scale display, decreases the contrast between the features; if displayed on a linear scale, only the brightest features would be visible. Credit: Imke de Pater, Erandi Chavez, Erin Redwing (UC Berkeley)/W. M. Keck Observatory.

This is tricky analysis because, as the paper points out, clouds not related to photochemical reactions, such as those produced by storms rising up from the deep atmosphere, would complicate correlations with the solar cycle. More recent imagery from the summer of this year has begun to show more clouds in the northern latitudes and at high altitude, which de Pater says reflects the observed increase in solar ultraviolet flux over the past two years. It’s chastening to realize that even with 30 years of high resolution data covering almost three solar cycles, we have still covered only 20 percent of Neptune’s orbit. Oh for an ice-giant orbiter to show us up close the chaotic dynamics of a planet whose winds are the strongest known in the Solar System.
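The paper’s actual analysis is far more detailed, but the core statistical idea, finding the time lag at which a solar UV series and a cloud-cover index correlate best, can be sketched in a few lines. The series below are synthetic stand-ins, not the Hubble/Keck data:

```python
import numpy as np

def best_lag(uv_flux, cloud_index, max_lag=4):
    """Find the lag (in samples, here years) at which the correlation
    between a solar-UV series and a cloud-cover index peaks. A positive
    lag means the clouds respond AFTER the UV peak."""
    best = (0, -np.inf)
    for lag in range(max_lag + 1):
        a = uv_flux if lag == 0 else uv_flux[:-lag]
        b = cloud_index[lag:]
        r = np.corrcoef(a, b)[0, 1]
        if r > best[1]:
            best = (lag, r)
    return best

# Synthetic stand-ins: an 11-year sinusoidal "solar cycle" sampled
# yearly over 29 years, and a cloud index that follows it with a
# two-year delay plus observational noise.
rng = np.random.default_rng(42)
years = np.arange(29)
uv = np.sin(2 * np.pi * years / 11)
clouds = np.roll(uv, 2) + 0.1 * rng.normal(size=years.size)
lag, r = best_lag(uv, clouds)   # expect lag == 2
```

On real data one would also need uncertainty estimates and care with uneven sampling; the point here is only the shape of the lag search.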

The paper is Chavez et al., “Evolution of Neptune at near-infrared wavelengths from 1994 through 2022,” Icarus Vol. 404 (1 November 2023), 115667 (abstract).

LSST: Interstellar Interlopers and the Nature of Z

Interstellar studies toy with our expectations. Those of us who think about sending probes to other stars share the frustration of the long time-scales involved, not just in transit times but also in arriving at the technologies to make such missions happen. But the other half of interstellar studies, the observation and characterization of targets, is happening at a remarkable rate, with new instruments coming online and an entire class of extremely large telescopes in the pipeline. Exoplanet studies thrive.

In between, upcoming events are encouraging. Having identified two interstellar objects – 1I/ʻOumuamua and comet 2I/Borisov – in our own Solar System, we will shortly be able to expand the number of such confirmed interlopers enormously. That puts us in position to build intercept missions to study and sample material from another stellar system in relatively short order. The Legacy Survey of Space and Time (LSST), planned for the Vera C. Rubin Observatory now under construction in Chile, should be able to detect interstellar material passing through our system in abundance.

Image: An artist’s impression of a small, rocky interstellar object hurtling from the upper right toward the inner Solar System. The orbits of the four inner planets (Mercury, Venus, Earth, Mars) are fully visible, drawn as teal concentric circles around the bright ball of the Sun at the center. We see the orbits from a slightly elevated angle, so that the circular paths appear oval. The black background is sprinkled with points of starlight. The interstellar object looks like an elongated potato above the Sun, streaming toward the Sun from the upper right, with a short tail of gas and dust trailing behind. Credit: Rubin Observatory/NOIRLab/NSF/AURA/J. daSilva.

The LSST has crept into almost every discussion we’ve had in these pages about our two known interstellar visitors, along with the lament that had we found these objects sooner, we would have been able to collect much more data from them. A 10-year survey of the southern sky conducted from the El Peñon peak of Cerro Pachón in northern Chile, the LSST will use a large-aperture wide-field instrument called the Simonyi Survey Telescope (SST) to study half the sky every three nights in six optical bands. It will deploy the largest digital camera ever constructed, with a 9.6-square-degree field of view.

Using three refractive lenses, the LSST Camera will take a pair of 15-second exposures of each field, operating throughout the night. Astronomers plan over 5.2 million exposures in ten years, creating views that will be sensitive to redshifts up to z=3. Recall the terminology: redshift is defined as z = (λ_observed − λ_rest)/λ_rest, so z=3 means the shift in wavelength is three times the rest wavelength; the light arrives stretched by a factor of 1 + z = 4 compared to when it was emitted.

Because the z parameter represents the stretching of wavelength owing to the expansion of the universe, higher values of z represent more distant (and hence older) objects, receding from us at a significant percentage of the speed of light. I’ve seen a redshift of z=0.0043 for the galaxy M87, which is roughly 55 million light years from Earth. A redshift of z=3 implies an object whose light has been traveling 11 billion years to reach us. That would make the actual distance to the object today over 18 billion light years because of the continuing expansion of the universe as the light travels. Las Cumbres Observatory offers an excellent backgrounder on all this.
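The arithmetic behind these statements is simple enough to sketch, assuming only the definition of cosmological redshift. The Lyman-alpha wavelength below is a standard value, not a figure from this article:

```python
def observed_wavelength(rest_nm, z):
    """Cosmological redshift stretches wavelengths by a factor (1 + z)."""
    return rest_nm * (1 + z)

def redshift(observed_nm, rest_nm):
    """Definition: z = (lambda_observed - lambda_rest) / lambda_rest."""
    return observed_nm / rest_nm - 1

# Hydrogen Lyman-alpha is emitted at 121.6 nm. At z = 3 it arrives
# stretched by a factor of 4, into the blue end of the optical band.
lam = observed_wavelength(121.6, 3)   # 486.4 nm
z = redshift(486.4, 121.6)            # 3.0
```

Note that distances for a given z (lookback time, comoving distance) depend on the cosmological model and cannot be read off from the wavelength stretch alone.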

Forgive the digression – this is how I learn stuff. But the point is that what the LSST will create is what its planners call a ‘movie,’ one summing up that decade of observation and sensitive to extraordinarily distant objects. To get a sense of this, consider that the LSST project will collect 15 terabytes of data every night, yielding an uncompressed data set of 200 petabytes. And with this kind of sensitivity, interstellar objects moving into our own Solar System should appear with some regularity.

Michele Bannister (University of Canterbury, NZ), a member of the Rubin Observatory/LSST Solar System Science Collaboration, comments:

“Planetary systems are a place of change and growth, of sculpting and reshaping. And planets are like active correspondents in that they can move trillions of little tiny planetesimals out into galactic space. A rock from another solar system is a direct probe of how planetesimal formation took place at another star, so to actually have them come to us is pretty neat. We calculate that there are a whole lot of these little worlds in our Solar System right now. We just can’t find them yet because we aren’t seeing faint enough.”

Image: This image captures not only Vera C. Rubin Observatory, a Program of NSF’s NOIRLab, but one of the celestial specimens Rubin Observatory will observe when it comes online: the Milky Way. The bright halo of gas and stars on the left side of the image highlights the very center of the Milky Way galaxy. The dark path that cuts through this center is known as the Great Rift, because it gives the appearance that the Milky Way has been split in half, right through its center and along its radial arms. In fact, the Great Rift is caused by a shroud of dust, which blocks and scatters visible light. This dust makes the Great Rift a difficult space to observe. Fortunately, Rubin is being built to conduct the Legacy Survey of Space and Time (LSST). This survey will observe the entire visible southern sky every few nights over the course of a decade, capturing about 1000 images of the sky every night and giving us a new view of our evolving Universe. Rubin Observatory is a joint initiative of the National Science Foundation and the Department of Energy (DOE). Once completed, Rubin will be operated jointly by NSF’s NOIRLab and DOE’s SLAC National Accelerator Laboratory to carry out the Legacy Survey of Space and Time. Credit: RubinObs/NOIRLab/NSF/AURA/B. Quint.

The LSST has uses far beyond interstellar interlopers, of course, with implications for the study of dark energy and dark matter as well as the formation of the Milky Way and the trajectories of potentially hazardous asteroids. But its emergence, beginning with first operations in late 2024, puts us on the cusp of studying planet formation using materials from other stellar systems. That brings intercept missions into the discussion, a topic we’ve considered in these pages before through the work of my friend Andreas Hein (University of Luxembourg). On a broader level, consider that expansion into the Solar System already has interstellar aspects, as I’ll discuss soon with a look at what we are learning about interstellar dust, and how missions beyond the heliosphere can inform our views of the local bubble in which we move.

Administrative Leave

“It seems that destiny has taken a hand.” Thus Humphrey Bogart, in a pivotal scene from the iconic 1942 film Casablanca. In Bogart’s case, destiny had to do with the sudden arrival of Claude Rains and the gendarmerie at Rick’s Café Américain, with profound implications for his relationship with Ilsa. In my case, fate was more jejune, involving the failure of my PC’s power supply just as I was asking myself whether it was now time for my August vacation. The power supply left little doubt. Surely a sign from the cosmos that after all the recent work reconfiguring the site’s software, I should take some time off?

That’s how I plan to interpret it, in any case. In the meantime, I’ll get the PC problem resolved. As to the still developing work on the site, a couple of things to note:

1) I am all too aware that the mobile experience is problematic, depending on what phone you use. I find this bewildering, as many people see the site correctly on their phones, whereas people like me see a very skinny column of text with huge side margins. So that is right up there on the list, and I plan to work on it during my time off. Since I can’t travel anywhere right now, I will be staying put. I will be on the site each day, moderating comments and also trying to work out glitches. But I will not be posting new material under my byline for approximately two weeks.

2) Speaking of glitches, this one came out of nowhere. I’ve learned that older posts have comment sections that are not formatting correctly. This too needs to be fixed, and part of my time off will be spent in the quest for answers on that matter.

Otherwise, in the next couple of weeks, I plan to watch old movies and read the novels now at the top of my fiction reading stack. These include books by Richard Ford, Alastair Reynolds, Graham Greene and Emily St. John Mandel, although I have to finish up Alan Furst’s wonderful The World at Night before proceeding to the first of these.

One last thing: Every now and then I get a message from someone who has had trouble trying to leave a comment with the new interface. Here’s the method: To comment, what you need to do is click on the title of the post, which will open the same post with the comment section in place at the end of the text.

As I say, keep the comments coming, as I’ll be here to put them through. Thanks to all for your patience and suggestions re the site changes. More to come.

Nucleic Acid Stability in the Venusian Clouds

How to approach finding life on other worlds will continue to be a challenging issue, but how useful it is that even as we work out strategies for studying exoplanet atmospheres, we have planets we can actually reach right here in our own Solar System. And if the hunt for life has so far come up empty on Mars, we can keep searching there even as we consider the exotic possibility of life in the clouds of Venus. We’ve looked at Venus Life Finder before in these pages. This series of missions is now known as Morning Star, all designed to probe the clouds for signs of a kind of life that would have to endure the most hellish conditions we can imagine. In today’s post, Alex Tolley examines the Morning Star missions and how they might proceed, depending on the results of that all-important first sampling of the atmosphere.

by Alex Tolley

“To boldly seek life, where no terrestrial life has gone before”

The “Morning Star Missions” (formerly Venus Life Finder) group had previously outlined their plans for early life-detecting missions in the possibly habitable, temperate, but highly acidic Venusian clouds, at altitudes of 48-60 km above the searingly hot surface. The first mission, now slated for a 2025 launch, includes an Autofluorescing Nephelometer (AFN) that can detect organic materials, a prerequisite for living organisms. [1] The instrument emits laser light that causes certain carbon bonds to fluoresce and be detected (in this case, 440 nm is the selected detection wavelength). If no organic material is detected in the cloud droplets, that would eliminate life as we know it. However, a negative result would remain ambiguous, as not all organic matter fluoresces when stimulated by light. Typically aromatic carbon ring structures fluoresce, whilst linear carbon chains do not. A double-membraned cell wall containing a prebiotic metabolic system would probably fail to register. This might well be considered a false negative for what could be a very interesting finding.

It is well known that sulfuric acid (H2SO4) has a deleterious effect on organic matter, and the highly concentrated sulfuric acid (CSA) expected in the Venusian clouds will rapidly break down organic matter; it would therefore appear that terrestrial life would rapidly succumb to such acidic conditions. [An acid bath is a traditional means by which murderers dispose of a victim’s body.] This would seem to rule out life of a terrestrial nature even in the Venusian clouds.

What about a positive result? Carbon aromatic rings that readily fluoresce may be very common in the clouds, as simpler carbon molecules are converted while the compounds fall towards the hotter surface. Polyaromatic hydrocarbons [PAH] are common in space and it has been hypothesized that they may be common in Venus’ clouds [2].

Apart from the outright destruction of living organisms such as plants when CSA is poured onto them, prior work [3] has shown that the organic compounds of terrestrial core metabolism are somewhat more stable in CSA than naturally occurring organic compounds as a whole, but far less stable than the space of manufactured organic compounds, as shown in figure 1.

A database of organic compounds and their reactivity to H2SO4 shows that the most resistant compounds are those with ring structures, especially those with unsaturated carbon-carbon bonds [5]. This implies that organic compounds with these structural features, PAHs among them, will be more prevalent in the clouds. If abiotic aromatic ring compounds are the most likely to resist reaction with CSA, these abiotic molecules may create a false positive result. The search for life must therefore establish that some biological molecules are also resistant to CSA and could theoretically be part of a positive organic molecule detection; otherwise this search approach would be futile. That about 10% of the core metabolism compounds examined are resistant to CSA for more than 3 years provides support for the possibility that biology may exist in the Venusian clouds.

Which biotic molecules are resistant to CSA and therefore could be present in the clouds? An answer is provided in a new paper by Seager et al in Proceedings of the National Academy of Sciences [4], which examines whether any core biological molecules can survive the acid conditions. Information polymers such as DNA and RNA are a central component of terrestrial life. They are built from nucleobases of two types: purines (adenine, guanine) and pyrimidines (cytosine, thymine, uracil), linked by a sugar (ribose in RNA, deoxyribose in DNA) and phosphate. As shown in figure 2, the core structures have unsaturated bonds and very few exposed bonds that could be attacked by CSA. The purpose of the paper was to determine whether these nucleic acid bases are resistant to CSA and therefore possibly detectable molecules on Venus.

The researchers performed several tests, including changes in UV spectra, Nuclear Magnetic Resonance (NMR) to detect changes in C-H bonds, NMR to detect the replacement of the hydrogen atoms with deuterium, and NMR to detect the donation of hydrogen ions, H+, to the nucleic acid molecules by the CSA (protonation).

The first series of tests placed these nucleic acids in CSA and tested how the UV spectrum changed over a period of up to 2 weeks in acid concentrations up to 98%. The spectra for the treated nucleic acids were very similar to those in aqueous solutions, indicating that there was no fundamental change in structures or breakdown of the compounds.

The next series of experiments ran NMR tests on the nucleic acid bases. NMR probes the state of the carbon atoms and their bonds, which show up as chemical shifts (in ppm) of the attached hydrogens. Once again, the sharp spectral peaks were very similar to the controls, indicating that the structures and the identified carbon and nitrogen bonds had not changed. Figure 4 shows the results.

The last series of experiments used deuterated sulfuric acid [D2SO4] as well as 13C and 15N isotopes in the nucleic acid bases to determine whether any of the bound hydrogen atoms had been replaced, which would indicate that the C-H bonds could be broken. Again, there was no evidence of bond breaking or hydrogen replacement.

“Taken together the NMR data confirms that the purine ring structure remains intact in 98% w/w D2SO4 in D2O.”

As the nucleic acids were in CSA where H+ ions were abundant, there is the question of whether these ions protonate the compounds. This protonation of the nitrogen and oxygen atoms was detected by NMR. As hydrogen bonds are important in biological functions, most notably the base pairing between purines and pyrimidines in DNA, and pairing of bases in the same RNA strands, protonation would impact these interactions. Figure 5 shows how protonation disrupts this pairing.

Figure 5. The base pairings in aqueous solutions and the impact of H2SO4 protonation that breaks the hydrogen bonds and pairing. Source: Seager et al 2023 [4]

The conclusion is that the purines and pyrimidines of terrestrial information molecules will remain stable in the Venusian clouds in the habitable region. As these molecules will fluoresce, a positive result of organic molecule detection could include these molecules, but follow-on missions would be needed to determine whether these molecules are present.

In summary, these experiments demonstrate that terrestrial information molecules using the core purine and pyrimidine structures are stable in CSA and therefore could potentially be present in the Venusian clouds. If organic carbon is detected in the first mission, a second mission to characterize the carbon compounds is therefore warranted, as that organic carbon could include biological molecules.

While the detection of these nucleic acid bases would be very interesting, it is important to note that to serve as information molecules they must polymerize in a way that preserves their informational function. Otherwise the bases are like an alphabet that cannot be composed into text, since the sugar-phosphate links between them would not be stable in CSA. Other linker molecules would be needed; none have yet been identified, and this remains an area of ongoing work.

We already know that amino acids are not stable in sulfuric acid, which rules out proteins, the main functional molecules of terrestrial life, existing in the Venusian clouds without some mechanism to neutralize the acid.

If amino acids were stable, could the first mission detect them? Amino acids with cyclic rings such as tryptophan fluoresce, albeit with a peak well below the 440 nm detection wavelength of the AFN to be included in the first mission. If subsequently confirmed by other instruments on later missions, it would indicate that the cloud droplet environment is not as unfavorable as assumed. As a side note, the somewhat controversial detection of phosphine suggests the known rapid oxidation by CSA is at least partially avoided, perhaps by either avoiding the cloud droplets or the droplets having a higher pH, or both.

What are the implications for life if nucleic acids are confirmed and in polymer form? The authors offer three scenarios:

    1. Life may have emerged during the early wet age in Venus’ oceans. As the planet became the hot dry world it is today, that life could have evolved to adapt to the new cloud-borne, temperate, but concentrated sulfuric acid conditions. The DNA/RNA would have had to change links between the nucleic acids to retain their function.

    2. During its evolution to the current conditions, life may have evolved the ability to neutralize the acid by excreting ammonia. This would allow it to retain the existing nucleic acid sugar-phosphate links in DNA and RNA, as well as allow proteins to remain stable.

    3. Lastly, the abiogenesis of new life in the clouds. Perhaps this is limited to a pre-biotic state with nucleobases only.

In my opinion, scenario 2 seems most likely if there is evidence that terrestrial-analog cellular life exists in the cloud droplets, using polymerized nucleic acids as their information molecules. This is because we know from the evolution of terrestrial life that core metabolism, information storage, and transcription and translation to functional proteins, have remained almost unchanged over billions of years. Extremophiles have been unable to change their core replication and growth biology, despite adapting to their current environments. What they do instead is tinker with the relative production of certain proteins, and evolve new enzymes and pathways to produce new molecules to adapt to the new conditions. Therefore being able to produce ammonia to neutralize the CSA seems a more likely evolutionary path.

If, however, nucleic acids are found in a polymerized, functional state, but without accompanying amino acids and functional proteins, is it possible that Venus is in the equivalent condition of the hypothetical pre-biotic RNA World? In this scenario, RNA acts as both the information and functional molecule. We see evidence for its metabolic function in that RNA can act as a catalyst and can autocatalyze its own replication. On Venus, the RNA analog may be pre-biotic or possibly degenerate, the remaining functional mechanism in a hostile pH environment. This speculation raises the question of how these nucleic acid bases would be formed in the clouds. While these bases have been shown to form from simple molecules like HCN and formamide in aqueous conditions, there is as yet no evidence that they can form in CSA. Unless they can, this would seem to rule out this pre/post-biotic scenario. (See also the Bains paper on H2SO4 as a solvent [3].)

In summary, the nucleic acid bases used in the information molecules DNA and RNA are stable in the acid conditions expected in the Venusian clouds. However, they would not be functional as information molecules unless they can polymerize into a stable form that allows natural selection to operate, and that would require different linker molecules than the sugar-phosphate ones used on Earth. Furthermore, protonation of the nitrogens in the nucleic acid bases would disrupt the hydrogen bonding mechanism for the base pairings. This is another important issue that further constrains the possibility of life on Venus unless it can neutralize the pH of the cloud droplets, perhaps with a metabolism that, like that of terrestrial archaea, relies on methanogenesis of CO2, or on organic molecules produced in the atmosphere.

If organic molecules are detected in the first 2025 scheduled mission, the stability of nucleic acids in CSA indicates that there is potential for their direct detection in a follow-up mission, holding out the possibility of some sort of life or pre-biotic chemistry on Venus.

References

[1] Tolley, A. (2022) “Venus Life Finder: Scooping Big Science,” Centauri Dreams. https://centauri-dreams.org/2022/06/03/venus-life-finder-scooping-big-science/

[2] Špaček, J. (2021) “Organic Carbon Cycle in the Atmosphere of Venus,” arXiv preprint arXiv:2108.02286.

[3] Bains, W., Petkowski, J. J., Zhan, Z., Seager, S. (2021) “Evaluating Alternatives to Water as Solvents for Life: The Example of Sulfuric Acid,” Life (Basel) 11(5):400. doi: 10.3390/life11050400.

[4] Seager, S. et al. (2023) “Stability of nucleic acid bases in concentrated sulfuric acid: Implications for the habitability of Venus’ clouds,” PNAS Vol. 120 No. 25, e2220007120. https://doi.org/10.1073/pnas.2220007120

[5] Database of H2SO4 effects on molecules: Reactivity V4.1 release. https://zenodo.org/record/4467868/files/Reactivity%20V4.1-%20release.xlsx

SETI: New Tools for Screening Out Radio Interference

Two new techniques for examining interesting SETI signals come into view this morning, one out of Breakthrough Listen work at UC-Berkeley, the other from researchers working with the Five-hundred-meter Aperture Spherical radio Telescope (FAST), the so-called ‘Heaven’s Eye’ instrument located in southwest China. In both cases, the focus is on ways to screen SETI observations from disruptive radio frequency interference (RFI), which can appear at first glance to flag a signal from another star.

The Chinese work relies upon FAST’s array of receiving instruments, each acting as a separate ‘beam’ to cover slightly different portions of the sky. FAST’s currently operational L-band receiver array consists of 19 beams, to which researchers led by Bo-lun Huang (Beijing Normal University) apply a technique called MultiBeam Point-source Scanning (MBPS). Here the instrument scans the target star sequentially with different beams of the instrument, setting up the possibility of cross-verification and allowing researchers to identify local interference quickly and accurately.

The paper on this work points to the SETI ON-OFF strategy as a more conventional way to analyze a target star. In this case, the star is observed for a short time, followed by a different target six or more beamwidths away from the primary. These become the ‘ON’ and ‘OFF’ of the method, the assumption being that an authentic signal from another civilization would appear only in the ON set of observations. MBPS, on the other hand, can be used by any radio telescope with a multibeam receiver and requires the telescope to slew during the observation periods to provide ongoing comparisons between each beam.
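The MBPS pipeline adds parameters such as signal duration per beam, intensity variation, and per-beam central frequencies, but the underlying coincidence logic, that a genuine sky source fits within a single beam while local interference floods many beams at once, can be caricatured as follows. This is a toy sketch, not the authors’ code:

```python
def classify_candidates(detections):
    """Toy multibeam RFI screen. `detections` maps a candidate signal id
    to the set of beams (of, say, FAST's 19) in which it was seen. A true
    point source on the sky should land in only one beam at a time,
    while terrestrial interference typically appears in many."""
    return {
        sig: ("RFI" if len(beams) > 1 else "candidate")
        for sig, beams in detections.items()
    }

hits = {
    "sig_A": {3},          # confined to a single beam: keep for follow-up
    "sig_B": {1, 5, 12},   # seen across many beams: almost certainly local
}
verdicts = classify_candidates(hits)
```

The real strategy is subtler, since a slewing telescope moves the source from beam to beam over time, but the single-beam-at-a-time expectation is what gives the cross-verification its power.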

Let me quote the paper on this:

…we are effectively adding new parameters and the observation data can thus be interpreted from different perspectives. The additional parameters introduced by the MBPS strategy include the duration of signals in a single beam, intensity variation of signals, and the difference in central frequencies of different beams which are the results of the observation method of the MBPS. With the three newly introduced parameters, we are then able to put in the most rigorous restrictions on the RFI/ETI identifications by confining the characteristics of an ETI/RFI signal in a new multi-parameter space.

Having run a re-observation campaign on TRAPPIST-1 using this strategy (this followed a set of observations taken in 2021), the team was able to identify all 16,645 received signals (!) as RFI. The authors’ confidence level in the technique is high:

We speculate that it would be exceedingly rare for the MBPS strategy to return any suspicious signals, even over the course of several years, because the types of false positives found by other strategies are easily identifiable with the MBPS strategy. However, when a genuine narrowband ETI signal does arrive on Earth, the MBPS strategy is capable of identifying it even amidst a substantial influx of RFI.

Image: An illustration shows how FAST receives radio waves emitted by distant pulsars, the rapidly rotating cores of dead stars. At left, a photo shows the huge telescope in Guizhou province. Can the new methods in the Bo-lun Huang paper help us weed radio interference out of signals from another civilization? Credit: China Daily.

At UC-Berkeley, Bryan Brzycki and team have been analyzing interstellar ‘scintillation,’ the refraction or bending of electromagnetic waves that pass through cold plasma in interstellar space. Rising and falling in amplitude, the waves interfere when they reach Earth by different paths. The phenomenon has been well studied through analysis of pulsars and other distant radio sources, and an obvious analog occurs in the twinkling of starlight created by Earth’s atmosphere. In the case of interstellar scintillation, Brzycki has come up with algorithms that can analyze narrowband signals for this effect, quickly selecting for those that show the phenomenon and thus are not local.
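The intuition behind scintillation screening can be illustrated with toy statistics. This is not Brzycki et al.’s actual pipeline — their work fits the full spectral and temporal structure of the scintles — but it captures the idea: interstellar scintillation imposes slow, correlated intensity variations on a narrowband signal, while local RFI is typically either steady or varies incoherently. The thresholds below are invented for illustration.

```python
import math

def modulation_index(intensity):
    """std/mean of an intensity time series."""
    n = len(intensity)
    mean = sum(intensity) / n
    var = sum((x - mean) ** 2 for x in intensity) / n
    return math.sqrt(var) / mean

def lag1_autocorr(intensity):
    """Lag-1 autocorrelation: near 1 for slow, correlated variation,
    near 0 (or negative) for noise-like flicker."""
    n = len(intensity)
    mean = sum(intensity) / n
    num = sum((intensity[i] - mean) * (intensity[i + 1] - mean)
              for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in intensity)
    return num / den

def looks_scintillated(intensity, m_min=0.1, r_min=0.5):
    """Flag series showing strong *and* correlated modulation."""
    return (modulation_index(intensity) > m_min
            and lag1_autocorr(intensity) > r_min)
```

A slowly varying signal passes the check; a steady carrier (no modulation) and rapidly flickering interference (no correlation) do not.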

At first glance, this appears extraordinarily useful, as co-author (and Brzycki thesis adviser) Imke de Pater (UC-Berkeley) points out:

“This implies that we could use a suitably tuned pipeline to unambiguously identify artificial emission from distant sources vis-a-vis terrestrial interference. Further, even if we didn’t use this technique to find a signal, this technique could, in certain cases, confirm a signal originating from a distant source, rather than locally. This work represents the first new method of signal confirmation beyond the spatial reobservation filter in the history of radio SETI.”

Image: The Green Bank Telescope, nestled in a radio-quiet valley in West Virginia, is a major listening post for Breakthrough Listen. Credit: Steve Croft, Breakthrough Listen.

A useful tool indeed, though bear in mind that it proves useful only for signals originating more than 10,000 light years from Earth, for to produce the needed scintillation, the signal must do a lot of traveling. If we do make a SETI detection with the aid of scintillation, in other words, it will not be of a civilization we’ll be likely to converse with (unless, of course, we find a way someday to actually visit it).

The Brzycki paper dovetails nicely with the FAST work, as witness its discussion of the ON-OFF strategy discussed above. The italics below are mine:

…RFI can also appear in only ON observations. For example, RFI signals could exhibit intensity modulations that follow the observational cadence of 5 minutes per pointing, a false positive that would pass the directional filter. While we observe false positives like this in practice, having directional requirements still serves as an interpretable basis for determining candidates, which would induce follow-up observations for potential re-detection.

This begs the question: can we differentiate narrowband signals as RFI based on morphology alone? Since ETI signals must travel to us through interstellar space, are there effects that would be observable and sufficiently unique compared to RFI modulations?

Thus the important result: The effect of scintillation does indeed provide a way to single out RFI simply because no local interference will produce the effect. Indeed, as the authors note, ETI might well consider the presence of scintillation in an artificial, narrowband signal as an announcement: ‘we are here.’ Where this work points is to further analysis of radio emission – the authors single out polarization, which they say is only beginning to be studied in the SETI context. Who can doubt their conclusion?

Whether it is because certain effects are stochastic or because human radio emission exploits every facet of radio light possible for communication, extracting non trivial information from a radio signal’s detailed morphology has been and will remain difficult. We may need to push the limits of detectability along hitherto unexplored axes to discover the first technosignature.

The paper from FAST is Bo-lun Huang et al., “A solution to persistent RFI in narrowband radio SETI: The MultiBeam Point-source Scanning strategy,” currently available as a preprint. The paper on scintillation is Brzycki et al., “On Detecting Interstellar Scintillation in Narrowband Radio SETI,” Astrophysical Journal 17 July 2023 (full text).

The “Habitability” of Worlds (Part II)

If we ever thought it would be easy to tell whether a planet was ‘habitable’ or not, Stephen Dole quickly put the idea to rest when he considered all the factors involved in his study Habitable Planets for Man (1964). In this second part of his essay on habitability, Dave Moore returns to Dole’s work and weighs these factors in light of our present knowledge. What I particularly appreciate about this essay in addition to Dave’s numerous insights is the fact that he has brought Dole’s work back into focus. The original Habitable Planets for Man was a key factor in firing my interest in writing about interstellar issues. And Centauri Dreams reader Mark Olson has just let me know that Dole appears as a major character in a novel by Harry Turtledove called Three Miles Down. It’s now in my reading stack.

by Dave Moore

In Part I of this essay, I listed the requirements for human habitability in Stephen Dole’s report, Habitable Planets for Man. Now I’ll go over what we’ve subsequently learned and see how this has changed our perspective.

Dole, in calculating the likelihood of a star having a habitable planet, produced his own ‘Drake equation.’

Image: Dole’s ‘Drake Equation.’

Dole assigns the following probabilities to his equation, PHP = NS Pp Pi PD PM Pe PB PR PA PL:

Pp = 1.0, Pi = 0.81, PM = 0.19, Pe = 0.94, PR = 0.9, PL = 1.0, PB = 0.95 for a star taken at random, 1.0 if there is no interference with the other star in a binary system. He calculates that for stars around solar mass there is a 5.4% chance of having a habitable planet.

I’ll only summarize his calculations as this is not the primary thrust of this essay. Some of his estimates such as Pp = 1.0, the number of stars with planets, have held up well. Others need adjusting, but by far the biggest factors that determine the likelihood of a planet being habitable for humans are those he didn’t consider in depth.
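As a quick check on the arithmetic, the factors quoted above (plus PD = 0.63, given below) multiply out as follows. Note that this partial product (~7.8%) is not Dole’s final 5.4%, which also depends on terms, such as the star-mass factor NS, that are not assigned numerical values in the list above; the numbers are Dole’s, the multiplication is only illustrative.

```python
factors = {
    "Pp": 1.0,   # star has planets
    "Pi": 0.81,  # favorable inclination
    "PD": 0.63,  # planet in the habitable zone
    "PM": 0.19,  # suitable mass
    "Pe": 0.94,  # low enough eccentricity
    "PB": 0.95,  # binary-companion interference
    "PR": 0.9,   # suitable rotation rate
    "PA": 1.0,   # suitable age
    "PL": 1.0,   # life has arisen
}

p_partial = 1.0
for p in factors.values():
    p_partial *= p
print(f"partial product (excluding NS): {p_partial:.3f}")  # 0.078
```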

Since Dole’s report, we’ve learned a lot more about the carbonate-silicate cycle and atmospheric circulation. The carbonate-silicate cycle provides a stronger negative feedback loop over a wider range of insolation than was thought at the time of his report. Atmospheric and oceanic heat transport have also been shown to work more efficiently. This leads to a more positive assessment of the range of habitability. Planets with high axial tilts and eccentricities, which Dole had excluded, are now considered potentially habitable; and more importantly, there’s the possibility that tidally-locked planets around M-dwarf stars may be habitable. M-dwarf stars being the most common in the galaxy, this makes a big difference to the number of potentially habitable planets. NS, the mass range of stars, is now opened up. Pi, the range of inclination, is probably 1.0, and PD, the probability that there is a planet in the habitable zone, which he gave as 0.63 and which is still a good estimate, is now extended to M dwarfs. And given that tidally locked planets are no longer excluded, PR, the rate of rotation, is not a limiting factor.

On PM, Dole’s assumptions for the size of a habitable Earth-like world have held up well. His calculations on atmospheric retention and escape conclude that planets between 0.4 Earth mass and 2.35 Earth mass could be Earth-like. Planets below 0.4 Earth mass would lose their atmospheres. Planets above 2.35 Earth mass would retain their primordial Hydrogen and Helium atmospheres and become what we now call Hycean planets or Super-Earths.

This gives a range of surface gravities, assuming a composition similar to Earth’s, of between 0.68 and 1.5 G, which would mean from a gravitational perspective most of the range is within what humanity could handle. Dole puts the upper limit at 1.25 G based on mobility measurements made in centrifuges from that time. I would agree with him even though there are a lot of people walking around today with one and a half times their ideal weight. The limiting factor for high G is heart failure at an early age, a condition extremely tall people here on Earth suffer from. If you are a six-foot person on a 1.5 G world, your heart is pumping blood equivalent to that of a nine-foot person. In this case, people of short stature have a distinct advantage. A five-foot person would have the blood pressure equivalent of being seven foot six on a 1.5 G world and six foot three on a 1.25 G world.
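Dole’s gravity range can be checked with a short calculation. The mass-radius scaling used here, R ∝ M^0.27 for rocky planets of Earth-like composition, is my assumption (a common empirical fit, not something from Dole’s report); it gives g = M/R² ∝ M^0.46 in Earth units and lands close to his figures.

```python
def surface_gravity(mass_earth, beta=0.27):
    """Surface gravity in g for an Earth-composition planet, using the
    assumed mass-radius relation R = M**beta (Earth units), so that
    g = M / R**2 = M**(1 - 2*beta)."""
    return mass_earth ** (1 - 2 * beta)

print(f"{surface_gravity(0.4):.2f} g")   # ~0.66 g  (Dole: 0.68)
print(f"{surface_gravity(2.35):.2f} g")  # ~1.48 g  (Dole: 1.5)
```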

However, when it comes to the frequency of Earth-sized worlds in the habitable zone, Dole’s guess of PM = 0.19 is probably too high even when we now include tidally-locked planets around red-dwarf stars. He, like the rest of us until recently, had no clue that sub-Neptunes and super-Earths would be the most common sizes of planet in the habitable zone of a roughly Sol-mass star.

From our observations, Dole’s guess on orbital eccentricity, Pe, looks like it’s in the ballpark, again due to the inclusion of red-dwarf stars with their tidally circularized orbits. With a lot of these factors, though, slight changes in probability do not make a big difference in the frequency of habitable planets. The big differences come from those he didn’t consider.

Dole noted that water coverage on a planet could determine its habitability. He did not go over this in any detail, however, mainly I suspect because he had no information to go on, and he didn’t include a term for it in his calculations. But we do know from density determinations of transiting Earth-sized planets that there’s a significant possibility that a large percentage of them may be excluded for being covered by deep oceans. This would mean that even if they had breathable atmospheres, they would not meet Dole’s criteria for habitability.

While Dole went carefully over the range of breathable atmospheres humans could tolerate, he essentially assigned a probability of 1.0 to the formation of this atmosphere once life appears on the planet, PL, and sufficient time has passed, PA, to which he arbitrarily assigns a period of 3 billion years. He made no consideration of how likely it would be for this process to go off the rails.

Yet, if you consider the range of possible atmospheric compositions and pressures on Earth-like planets, the portion that meets the requirements of human habitability is narrow. This is the one factor most likely to winnow the field, with the possible exception of average water composition.

When considering what percentage of Earth-like planets could have a breathable atmosphere: Oxygen between 100 and 400 millibars, Nitrogen less than 2.3 bar, CO2 less than 10 millibars, and no poisonous gasses, we are helped by a natural connection of these parameters. Oxygen destroys most poisonous gasses. The Carbonate-Silicate cycle will draw down CO2 to low levels. With Nitrogen we note that Venus has 3 bars of Nitrogen. Earth has a similar stock, but most of it is either dissolved in the oceans or mineralized as nitrates. Mars still has a 2.6% by volume trace of its primordial Nitrogen atmosphere. This points to a certain consistency for terrestrial planets with regard to their Nitrogen stock; however, Oxygen to Nitrogen ratios do vary from star to star. Getting the level of Oxygen within breathable parameters is more problematic, though. It’s a reactive gas that disappears with time. I can see two possible pathways that can lead to a breathable atmosphere, one abiotic and one biotic.
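The breathability window just described can be expressed as a simple check. The numeric limits are the ones quoted in the text; the `toxic_ppm` parameter is an invented stand-in for Dole’s per-gas toxicity limits.

```python
def breathable(o2_mbar, n2_bar, co2_mbar, toxic_ppm=0.0):
    return (100 <= o2_mbar <= 400   # Oxygen partial pressure window
            and n2_bar < 2.3        # Nitrogen below narcotic levels
            and co2_mbar < 10       # CO2 below toxic levels
            and toxic_ppm == 0)     # no poisonous gasses

print(breathable(o2_mbar=212, n2_bar=0.78, co2_mbar=0.4))  # Earth today: True
print(breathable(o2_mbar=212, n2_bar=3.0, co2_mbar=0.4))   # Venus-like N2: False
```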

On the abiotic front, there’s a robust mechanism available for generating Oxygen. If the planet is warm enough to have significant quantities of water vapor in the upper atmosphere or has a steam atmosphere, then photolysis and subsequent Hydrogen escape will result in the build-up of Oxygen.

Planets less massive than the Earth-like range lose their atmospheres. Planets more massive retain their primordial Hydrogen, which means any Oxygen resulting from photolysis will recombine to form water. Intermediate-sized planets, however, can build up Oxygen via Hydrogen escape.

How much it builds up depends on the balance of production and removal. The amount produced depends on stratospheric water vapor and UV levels. The rate of removal is determined by three main processes: Oxygen escape, which is dependent on planetary mass, magnetic field strength and the strength of plasma wind from its primary; chemical reaction with reducing gasses, which is proportional to the level of volcanic emissions; and the oxidation of exposed regolith due to volcanism and weathering, the first being proportional to the level of volcanism and the second being proportional to the planet’s temperature.
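The production-versus-removal balance can be caricatured with a toy budget, every number in it invented for illustration. Production from photolysis is taken as constant while water lasts; removal splits into a constant sink (reducing volcanic gasses, fresh regolith) and an escape term that grows with the O2 inventory, so the inventory relaxes toward a balance point.

```python
def o2_inventory(production, const_sink, escape_rate, steps=10_000, dt=1.0):
    """Integrate dO2/dt = production - const_sink - escape_rate * O2
    (arbitrary units), clamping the inventory at zero."""
    o2 = 0.0
    for _ in range(steps):
        o2 += dt * (production - const_sink - escape_rate * o2)
        o2 = max(o2, 0.0)
    return o2

# Production comfortably above the constant sink: O2 settles at
# (production - const_sink) / escape_rate = 60 units.
print(round(o2_inventory(1.0, 0.4, 0.01), 1))  # 60.0
# Reducing sinks exceed production: no O2 accumulates.
print(o2_inventory(0.3, 0.4, 0.01))            # 0.0
```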

Abiotic Oxygen atmospheres are probably transitory over geological time periods, but I do see sufficient Oxygen being generated at various stages in an Earth-like planet’s history. The first is the period when a planet’s red-dwarf primary is sliding down its Hayashi track towards its position on the main sequence. Due to the star’s greater luminosity at this time, an Earth-like planet destined for the habitable zone will spend 100 million to a billion years with a steam atmosphere. Models of this process indicate it could lose up to several Earth oceans of water through photolysis and Hydrogen loss. The loss of an Earth ocean translates into roughly 300 bar of Oxygen, most of which, as with Venus, will finish up oxidizing the crust. If, however, the various factors balance out, so that when the planet’s steam atmosphere condenses as the star arrives at its main sequence position the water fraction is sufficient to provide both oceans and continents, and Oxygen production and removal have balanced out to leave a breathable but non-toxic level of Oxygen, then we should get a habitable planet, albeit one with a highly oxidizing surface chemistry like Mars.

If this all sounds highly unlikely, you are probably right, but there are a lot of red dwarf stars in our galaxy.

Image: Artist’s impression of the ultracool dwarf star TRAPPIST-1 from the surface of one of its planets. We’re beginning to learn whether the inner worlds here have atmospheres, but will we find that any of the seven are habitable? Credit: ESO.

Oxygen generation through photolysis occurs anytime an Earth-like planet has a high level of water loss. Mars is thought to have lost an ocean of water corresponding to 1.4% of Earth’s ocean early in its history, which translates into a total partial pressure of 4.2 bar of Oxygen (under 1 G.) This Oxygen generation would have occurred over a long period, so the partial pressure at any given time was probably low; but you’ll notice that the mineralogy of Mars from around 4 billion years ago is highly oxidizing whereas Earth’s surface didn’t become oxidizing until 2.2 billion years ago.

An Earth-like planet suffering a runaway greenhouse, as Venus did some two billion years ago, would also experience a build-up in Oxygen.

If the presence of life in the galaxy is sparse, then this mechanism may result in more planets having Oxygen in their atmospheres than those that get it through biotic means, so Oxygen lines in the spectra of a planet’s atmosphere would not be a good indication that it harbors life.

We are familiar, from accounts of the history of life on Earth, with how the biotic process leads to a breathable atmosphere. This has implications, however. To frame them, I’ll use a model in which planets become habitable at the rate of one per million stars, starting nine billion years ago. (The figure I selected is arbitrary. You are welcome to adjust it and see what sort of results you get.) Given that star formation in our galaxy runs at about one star per year (the rate has varied over time, but an average of one per year will suffice for this model), this results in a total of 9,000 planets that will be habitable to humans at some point in their lifetimes. There may well be many more life-bearing planets than this, but this model is only interested in the ones that become habitable to humans.

If we assume these planets have a similar evolutionary track to Earth, then the youngest 5% of these will be at the prebiotic stage. Until about 2.2 billion years ago Earth was dominated by anaerobic life, so the next 20% will have anaerobic atmospheres full of toxic gasses. Hydrogen Sulfide in particular is lethal, killing at 1000 ppm. Intrepid explorers will have to live in sealed habitats with airlocks and go around on the surface in spacesuits. Does this meet your definition of habitable?

About 2.2 billion years ago on Earth, photosynthetic aerobes got the upper hand in Earth’s chemistry and the surface became oxidized under an atmosphere of 1-2% oxygen. If their timeline is similar to Earth’s, then 20% of these planets would fit this condition.

These planets would be far more pleasant places to explore. Toxic gasses would be removed by the Oxygen. You could probably go around with just an oxygen concentrator on your back feeding a tube to your nose. Habitats wouldn’t need airlocks; double doors would do. How would you classify these planets?

Then 500 million years ago Earth became fully habitable when the Oxygen concentration crossed 15% and the air became breathable. This period represents 5% of the sample. However, there’s a side effect to this. Oxygen is not very soluble in water and O2 concentrations fall off rapidly with distance. This is why the macroscopic lifeforms from the Pre-Cambrian age (>500 mya) were either flat leaf-like shapes or sponges, both of which give short diffusion distances throughout the organism. Once the oxygen concentration rose, however, lifeforms could develop thickness, and with thickness, they could develop organs such as hearts and circulatory systems, which could then circulate an oxygenated fluid throughout their bodies. A breathable atmosphere allows for the development of complex macroscopic life.
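The staging argument above reduces to simple arithmetic (this is the essay’s toy model, not population synthesis). Planets become habitable at one per million stars, stars form at roughly one per year over nine billion years, so planet ages spread evenly across 0-9 Gyr and each stage’s share of the sample is just its duration divided by 9 Gyr. The stage durations below are chosen to match the percentages in the text.

```python
TOTAL_PLANETS = 9_000_000_000 // 1_000_000   # 9 Gyr at 1 star/yr -> 9000

stages_gyr = {
    "prebiotic": 0.45,            # ~5% of the sample
    "anaerobic": 1.80,            # ~20%
    "oxidized (1-2% O2)": 1.80,   # ~20%
    "breathable": 0.45,           # ~5%
}

for stage, duration in stages_gyr.items():
    share = duration / 9.0
    print(f"{stage}: {share:.0%}, about {share * TOTAL_PLANETS:.0f} planets")
```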

And, over time, complex macroscopic life gives rise to the second side effect of breathable Oxygen levels – sapience. This has often been considered a rare possibility, a fortuitous combination of circumstance, and in the Drake equation it is assigned a low fractional value, but the idea that intelligent life is rare and unique derives from our historical and religious concept that mankind is something unique and apart from the animal kingdom. However, studies show a steady increase in encephalization over time and its widespread occurrence in different phyla and classes: octopi in the mollusks, parrots and corvids in the birds, and dolphins, elephants and apes in the mammals.

Varying levels of communication signaling have been found in numerous species. Just recently, a troop of Chimpanzees has been found to have a 390-word vocabulary constructed by combining grunts and chirps in various sequences. It therefore seems that our ability with language is merely a development of existing trends rather than something that came out of nowhere. And language is the abstract representation of an object or action, so the manipulation of language leads to abstract reasoning.

Encephalization is a tradeoff between the energy consumption of neurons and the benefits they produce in reproductive fitness. Increasing the number of neurons in an organism is easy. A simple mutation in the precursor cells allowing them to divide one more time will do this; however, organizing those extra neurons into something useful enough to justify their extra metabolic cost is a lot more difficult. But increases in neural complexity can lead to more complex behaviors, which can increase fitness or allow the creature to colonize new niches. In addition, neurons, over time, have evolved to become more efficient. Moore’s law operates, but with a doubling time on the order of 100 million years. Parrots’ neurons are both smaller and three times more energy efficient than human ones. So, not only does encephalization increase with time, but the tradeoff moves in its favor. However, like any increases in biological complexity and sophistication, this does take time.

This points to the conclusion that on planets habitable to humans, the evolution of sentience is not so much a case of if, but when.

An atmosphere breathable to humans is also flammable over most of its range, so a good proportion of these sapients would have access to fire, allowing smelting technology to develop. What the model I used implies is that 50% of habitable planets will by now have had intelligent life forms evolve on them, a majority of which could develop technology.

I would support this argument by applying the Law of Universality, which states that no matter where you are in the universe, the laws of nature operate in the same way. This means that a planet like Earth would produce intelligent life forms. There is a certain contingent element in evolution, so the timing and the resulting life forms would not be identical; however, the broad driving forces of evolution would produce something similar. This can be seen in the many cases of convergent evolution that have occurred on Earth. How different from Earth a planet has to be before it stops producing intelligent life forms is a matter of conjecture, but if such changes cripple the evolution of intelligent lifeforms, there’s a good chance they also cripple the formation of a breathable atmosphere.

What these intelligent life forms would do to their planet over the eons is a matter of speculation, but if for some reason intelligent life did not arise, then complex life could thrive and the planet would be habitable for another billion years or more – depending on the star’s spectral type – before the star’s increasing luminosity sets off a runaway greenhouse. This means that of the planets that are habitable for humans at some stage in their life approximately 15-25% will be habitable at any given time. (The upper bound assumes that there are a high proportion of them around lower mass stars with longer lifetimes.)

If, however, intelligent life develops on planets as a matter of course, then the model indicates that for every habitable planet we have now (5% of the total) approximately ten planets had intelligent lifeforms at some stage in their history (50% of the total.) And if intelligent life is a side effect of habitability, then there will be a correlation between the number of habitable planets and the number of exosolar technological civilizations in our galaxy. So, in an inversion of the usual order of things, we can estimate the number of planets habitable for humans from the number of alien civilizations in the galaxy. The model I’ve been using points to them being within an order of magnitude of each other.

Given that we have no information on the evolution of intelligent life on non-habitable planets, a count of habitable planets calculated from evidence of alien civilizations is an upper bound. On the other side of the scales, there’s the number of planets that are habitable through abiotic means; planetary atmospheric spectra within the next couple of decades may give us some indication of this. If, however, we use Hanson’s estimate, in which he deduces from the lack of evidence of alien civilizations in our galaxy that the number of technological life forms is just one – us – then this would also point to the number of habitable planets in our galaxy being just one: Earth.

As a final point, I would like to add that while I have not done a full literature search, I have read widely in this field and have not come across as rigorous a consideration as Dole’s work in defining habitability for humans and weighing the likelihood of finding planets that match those criteria. The field’s general mindset seems to focus on finding the conditions under which life arises; it then assumes evolution will automatically lead to a planet habitable for humans. We have learned a lot since Dole wrote his report, but there does not seem to have been much reexamination of the topic. It is perhaps time we applied our minds to it.

References

Stephen Dole, Habitable Planets for Man, The RAND Corporation, R-414-PR (1964)
https://www.rand.org/content/dam/rand/pubs/reports/2005/R414.pdf

Dave Moore, “’If Loud Aliens Explain Human Earliness, Quiet Aliens Are Also Rare’: A review”
https://centauri-dreams.org/2022/05/20/if-loud-aliens-explain-human-earliness-quiet-aliens-are-also-rare-a-review/

Robin Hanson, Daniel Martin, Calvin McCarter, Jonathan Paulson, “If Loud Aliens Explain Human Earliness, Quiet Aliens Are Also Rare,” The Astrophysical Journal 922(2) (2021)

Charter

In Centauri Dreams, Paul Gilster looks at peer-reviewed research on deep space exploration, with an eye toward interstellar possibilities. For many years this site coordinated its efforts with the Tau Zero Foundation. It now serves as an independent forum for deep space news and ideas. In the logo above, the leftmost star is Alpha Centauri, a triple system closer than any other star, and a primary target for early interstellar probes. To its right is Beta Centauri (not a part of the Alpha Centauri system), with Beta, Gamma, Delta and Epsilon Crucis, stars in the Southern Cross, visible at the far right (image courtesy of Marco Lorenzi).
