Centauri Dreams
Imagining and Planning Interstellar Exploration
SPARCS: Zeroing in on M-dwarf Flares
Although we’ve been talking this week about big telescopes, from extremely large designs like the Thirty Meter Telescope and the European Extremely Large Telescope to the space-based HabEx/LUVOIR descendant prioritized by Astro2020, small instruments continue to do interesting work around the edges. I just noticed a tiny one called the Star-Planet Activity Research CubeSat (SPARCS) that fills a gap in our study of M-dwarfs, those small stars whose flares are so problematic for habitability.
Under development at Arizona State University, the space-based SPARCS is about halfway through its development phase, but let’s take a look at it in light of ongoing work on M-dwarf planets, because it bodes well for turning theories about flare activity into data that can firm up our understanding. The problem is that while theoretical studies delve into ultraviolet flaring on these stars, the longest continuous UV monitoring of an M-dwarf thus far has been a thirty-hour campaign with Hubble.
We need more, which is why the SPARCS idea emerged. A team of researchers led by ASU’s Evgenya Shkolnik has produced an overview of the NASA-funded mission’s science drivers and its intention of deepening our understanding of star-planet interactions. “Know thy star, know thy planet… especially in the ultraviolet (UV),” comments the team in their abstract, which also points to the necessity of data collection for these intensely studied stars, ubiquitous in the galaxy and known to host interesting planets like Proxima Centauri b.
Image: An example of M-dwarf flaring. DG CVn, a binary consisting of two red dwarf stars shown here in an artist’s rendering, unleashed a series of powerful flares seen by NASA’s Swift. At its peak, the initial flare was brighter in X-rays than the combined light from both stars at all wavelengths under typical conditions. Credit: NASA’s Goddard Space Flight Center/S. Wiessinger.
Can such a world be habitable? Recent observations have shown that flare events produce a far more severe flux increase in the ultraviolet than in the optical; a flare peaking at roughly 0.01x the star’s quiescent flux in the optical, write the authors, can brighten at UV wavelengths by a factor of 14,000. UV ‘superflare’ events — as much as 10,000 times more energetic than the flares produced by our G-class Sun — can produce 200x flux increases, and such events are expected to occur daily on young, active M-dwarfs.
Thus habitability can be compromised, with UV radiation damaging planetary atmospheres, eroding ozone and producing lethal levels of radiation at the surface. An Earth-like planet in the habitable zone can likewise be subject to methane depletion under the kind of flaring Proxima Centauri has been known to produce. The composition of an M-dwarf planet’s atmosphere is thus subject to interactions with its star that may prevent life from ever arising, or drastically affect its development.
SPARCS is a CubeSat observatory carrying a 9-cm telescope and the associated gear to perform photometric monitoring of M-dwarf flare activity in the near (258–308 nm) and far ultraviolet (153–171 nm). The target: 20 M-dwarfs in a range of ages from 10 million to 5 billion years old, examined during a mission lifetime of one year. Planned for launch in 2023 into a heliosynchronous orbit that offers “decent thermal stability and optimized continuity in target monitoring,” SPARCS will track flare color, energies, occurrence rate and duration on active as well as inactive M-dwarfs.
The authors believe the observatory will also improve our atmospheric models for M-dwarf planets, useful information as we look toward future biosignature investigations, and helpful as we fill an obvious gap in our data on this class of star. The software onboard is interesting in itself:
The payload software is able to run monitoring campaigns at constant detector exposure time and gain, but due to the expected high amplitudes of M dwarf UV flares, observations throughout the nominal mission will be conducted using a feature of the software that autonomously adjusts detector exposure times and gains to mitigate the occurrence of pixel saturation during observations of flaring events. SPARCS will be the first space-based stellar astrophysics observatory that adopts such an onboard autonomous exposure control.
So we have a small space telescope that will be able to monitor its targets in both near- and far-ultraviolet wavelengths simultaneously, managed by a dedicated onboard payload processor that allows the observatory to adjust for pixel saturation during flare events. This “autonomous dynamic exposure control algorithm” is a story in itself, adding depth to a mission to investigate the most extremely variable stars in the Hertzsprung–Russell diagram. SPARCS should help us learn whether these long-lived stars can allow planetary habitability as they age into a less dramatic maturity.
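The paper doesn’t publish the flight algorithm itself, but the feedback loop it describes is easy to sketch. Here is a minimal illustration in Python of how autonomous exposure scaling of this kind might work; the saturation target, limits and camera interface are hypothetical stand-ins, not the SPARCS payload software:

```python
# Illustrative exposure-control loop for a flare monitor. The thresholds
# and camera API are assumptions for this sketch, not SPARCS values.

SATURATION_FRACTION = 0.8    # aim the brightest pixel at 80% of full well
MIN_EXPOSURE_S = 0.1
MAX_EXPOSURE_S = 60.0

def next_exposure(peak_counts: float, full_well: float, exposure_s: float) -> float:
    """Scale the next exposure so the brightest pixel stays below saturation."""
    if peak_counts <= 0:
        return MAX_EXPOSURE_S                 # nothing detected; open up fully
    target = SATURATION_FRACTION * full_well
    scaled = exposure_s * target / peak_counts
    return max(MIN_EXPOSURE_S, min(MAX_EXPOSURE_S, scaled))

# When a flare drives the peak pixel toward full well, exposures shorten:
print(next_exposure(peak_counts=60000, full_well=65535, exposure_s=10.0))   # ~8.7 s
print(next_exposure(peak_counts=655350, full_well=65535, exposure_s=10.0))  # ~0.8 s
```

Apply the same logic to detector gain and you have the gist: when a target brightens by orders of magnitude in minutes, the loop backs off before the pixels clip.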
The paper is Ramiaramanantsoa et al., “Time-Resolved Photometry of the High-Energy Radiation of M Dwarfs with the Star-Planet Activity Research CubeSat (SPARCS),” accepted for publication in Astronomische Nachrichten (preprint).
The Exoplanet Pipeline
Looking into Astro2020’s recommendations for ground-based astronomy, I was heartened by the emphasis on ELTs (Extremely Large Telescopes), as found within the US-ELT project to develop the Thirty Meter Telescope and the Giant Magellan Telescope, both now in development. Such instruments represent our best chance for studying exoplanets from the ground, even rocky worlds that could hold life. An Astro2020 with different priorities could have spelled the end of both these ELT efforts in the US even as the European Extremely Large Telescope, with its 39-meter mirror, moves ahead, with first light at Cerro Armazones (Chile) projected for 2027.
So the ELTs persist in both US and European plans for the future, a context within which to consider how planet detection continues to evolve. So much of what we know about exoplanets has come from radial velocity methods. These in turn rely critically on spectrographs like HARPS (High Accuracy Radial Velocity Planet Searcher), which is installed at the European Southern Observatory’s 3.6m telescope at La Silla in Chile, and its successor ESPRESSO (Echelle Spectrograph for Rocky Exoplanet and Stable Spectroscopic Observations). We can add the NEID spectrometer on the WIYN 3.5m telescope at Kitt Peak to the mix, now operational and in the hunt for ever tinier Doppler shifts in the light of host stars.
We’re measuring the tug a planet puts on its star by looking radially — how is the star pulled toward us, then away, as the planet moves along its orbit? Given that the Earth induces a wobble of a mere 9 centimeters per second in the Sun, it’s heartening to see that astronomers are closing in on that range right now. NEID has demonstrated a precision of better than 25 centimeters per second in the tests that led up to its commissioning, giving us another tool for exoplanet detection and confirmation.
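That 9-centimeter figure falls straight out of the standard radial-velocity semi-amplitude formula, and it’s worth seeing how. A quick sanity check in Python, assuming a circular, edge-on orbit and a planet far less massive than its star (rounded constants, my arithmetic):

```python
import math

# Radial-velocity semi-amplitude for a circular, edge-on orbit, m_p << M_star:
#   K = (2*pi*G / P)**(1/3) * m_p / M_star**(2/3)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
P = 3.156e7          # orbital period: one year, in seconds
M_sun = 1.989e30     # kg
m_earth = 5.972e24   # kg

K = (2 * math.pi * G / P) ** (1 / 3) * m_earth / M_sun ** (2 / 3)
print(f"K = {K * 100:.1f} cm/s")   # prints K = 8.9 cm/s
```

So an instrument holding 25 cm/s precision is within a factor of three of an Earth-Sun analog signal, before averaging over many observations.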
But this is a story that also reminds us of the vast amount of data being generated in such observations, and the methods needed to get this information distributed and analyzed. On an average night, NEID will collect about 150 gigabytes of data that is sent to Caltech, and from there via a data management network called Globus to the Texas Advanced Computing Center (TACC) for analysis and processing. TACC, in turn, extracts metadata and returns the data to Caltech for further analysis. The results are made available by the NASA Exoplanet Science Institute via its NEID Archive.
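Globus publishes a Python SDK for precisely this kind of scripted site-to-site transfer. A minimal sketch of a nightly hand-off might look like the following; the token, endpoint UUIDs and paths are placeholders, not the actual NEID pipeline configuration:

```python
import globus_sdk

# Placeholder credentials and endpoint UUIDs, not the real NEID/TACC setup.
TRANSFER_TOKEN = "..."
CALTECH_ENDPOINT = "source-endpoint-uuid"
TACC_ENDPOINT = "destination-endpoint-uuid"

tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN)
)

# Describe one night's worth of raw frames and submit the transfer task.
tdata = globus_sdk.TransferData(
    tc, CALTECH_ENDPOINT, TACC_ENDPOINT,
    label="NEID nightly ingest",
    sync_level="checksum",       # re-copy only files that changed
)
tdata.add_item("/raw/neid/20211108/", "/scratch/neid/20211108/", recursive=True)
task = tc.submit_transfer(tdata)
print("Submitted Globus task:", task["task_id"])
```

Once submitted, Globus handles retries and integrity checking on its own, which is what makes a no-human-in-the-loop pipeline practical in the first place.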
Image: The NEID instrument is shown mounted on the 3.5-meter WIYN telescope at the Kitt Peak National Observatory. Credit: NSF’s National Optical-Infrared Astronomy Research Laboratory/KPNO/NSF/AURA.
What a contrast with the now ancient image of the astronomer on a mountaintop coming away with photographic plates that would be analyzed with instruments like the blink comparator Clyde Tombaugh used to discover Pluto in 1930. The data now come in avalanche form, with breakthrough work occurring not only on mountaintops but in the building of data pipelines like these that can be generalized for analysis on supercomputers. The vast caches of data contain the seeds of future discovery.
Joe Stubbs leads the Cloud & Interactive Computing group at TACC:
“NEID is the first of hopefully many collaborations with the NASA Jet Propulsion Laboratory (JPL) and other institutions where automated data analysis pipelines run with no human-in-the-loop. Tapis Pipelines, a new project that has grown out of this collaboration, generalizes the concepts developed for NEID so that other projects can automate distributed data analysis on TACC’s supercomputers in a secure and reliable way with minimal human supervision.”
NEID also makes a unique contribution to exoplanet detection by being given over, part of the time, to the analysis of activity on our own star. Radial velocity measurements are vulnerable to confusion from stellar activity: starspots and granulation driven by convection on the surfaces of exoplanet host stars can be mistaken for planetary signatures. The plan is to use NEID during daylight hours with a smaller solar telescope developed for the purpose to track this activity. Eric Ford (Penn State) is an astrophysicist at the university where NEID was designed and built:
“Thanks to the NEID solar telescope, funded by the Heising-Simons Foundation, NEID won’t sit idle during the day. Instead, it will carry out a second mission, collecting a unique dataset that will enhance the ability of machine learning algorithms to recognize the signals of low-mass planets during the nighttime.”
Image: A new instrument called NEID is helping astronomers scan the skies for alien planets. TACC supports NEID with supercomputers and expertise to automate the data analysis of distant starlight, which holds evidence of new planets waiting to be discovered. WIYN telescope at the Kitt Peak National Observatory. Credit: Mark Hanna/NOAO/AURA/NSF.
Modern astronomy in a nutshell. We’re talking about data pipelines operational without human intervention, and machine-learning algorithms that are being tuned to pull exoplanet signals out of the noise of starlight. In such ways does a just-commissioned spectrograph contribute to exoplanetary science through an ever-flowing data network now indispensable to such work. Supercomputing expertise is part of the package that will one day extract potential biosignatures from newly discovered rocky worlds. Bring on the ELTs.
Two Takes on the Extraterrestrial Imperative
Topping the list of priorities for the Decadal Survey on Astronomy and Astrophysics 2020 (Astro2020), just released by the National Academies of Sciences, Engineering, and Medicine, is the search for extraterrestrial life. Entitled Pathways to Discovery in Astronomy and Astrophysics for the 2020s, the report can be downloaded as a free PDF here. At 614 pages, this is not light reading, but it does represent an overview in which to place continuing work on exoplanet discovery and characterization.
In the language of the report:
“Life on Earth may be the result of a common process, or it may require such an unusual set of circumstances that we are the only living beings within our part of the galaxy, or even in the universe. Either answer is profound. The coming decades will set humanity down a path to determine whether we are alone.”
A ~6-meter diameter space telescope capable of spotting exoplanets 10 billion times fainter than their host stars, thought to be feasible by the 2040s, leads the observatory priorities. As forwarded to me by Centauri Dreams regular John Walker, the survey recommends an instrument covering infrared, optical and ultraviolet wavelengths with high-contrast imaging and spectroscopy. Its goal: searching for biosignatures in the habitable zone. Cost is estimated at an optimistic $11 billion.
I say ‘optimistic’ because of the cost overruns we’ve seen in past missions, particularly JWST. But perhaps we’re learning how to rein in such problems, according to Joel Bregman (University of Michigan), chair of the AAS Committee on Astronomy and Public Policy. Says Bregman:
“The Astro2020 report recommends a ‘technology development first’ approach in the construction of large missions and projects, both in space and on the ground. This will have a profound effect in the timely development of projects and should help avoid budgets getting out of control.”
Time will tell. It should be noted that a number of powerful telescopes, both ground- and space-based, have been built following the recommendations of earlier decadal surveys, of which this is the seventh.
Suborbital Building Blocks
We’re a long way from the envisioned instrument in terms of both technology and time, but the building blocks are emerging and the characterization of habitable planets is ongoing. What a difference between a flagship level space telescope like the one described by Astro2020 and the small, suborbital instrument slated for launch from the White Sands Missile Range in New Mexico on Nov. 8. SISTINE (Suborbital Imaging Spectrograph for Transition region Irradiance from Nearby Exoplanet host stars) is the second of a series of missions homing in on how the light of a star affects biosignatures on its planets.
False positives will likely bedevil biosignature searches as our technology improves. Principal investigator Kevin France (University of Colorado Boulder) points particularly to ultraviolet levels and their role in breaking down carbon dioxide, which frees oxygen atoms to form molecular oxygen, made of two oxygen atoms, or ozone, made of three. These oxygen levels can easily be mistaken for possible biosignatures. Says France: “If we think we understand a planet’s atmosphere but don’t understand the star it orbits, we’re probably going to get things wrong.”
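Schematically, the abiotic pathway France describes is textbook photolysis chemistry (my summary, not a reaction network from the SISTINE team):

CO2 + hν → CO + O
O + O + M → O2 + M
O2 + O + M → O3 + M

where hν is an ultraviolet photon and M is any third body that carries away excess energy. No biology is required at any step, which is exactly why the resulting oxygen can masquerade as a biosignature.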
Image: A sounding rocket launches from the White Sands Missile Range, New Mexico. Credit: NASA/White Sands Missile Range.
It’s a good point considering that early targets for atmospheric biosignatures will be M-dwarf stars. Now consider the early Earth, laden with perhaps 200 times more carbon dioxide than today, its atmosphere likewise augmented with methane and sulfur from volcanic activity in the era not long after its formation. It took a billion and a half years for molecular oxygen to emerge, and then only as a waste product of photosynthesis, eventually leading to the Great Oxygenation Event.
Oxygen becomes a biomarker on Earth, but it’s an entirely different question around other stars. M-dwarf stars like Proxima Centauri generate extreme levels of ultraviolet light, making France’s point that simple photochemistry can produce oxygen in the absence of living organisms. Bearing in mind that M-dwarfs make up as many as 80 percent of the stars in the galaxy, we may find ourselves with a number of putative biosignatures that turn out to be a reflection of these abiotic reactions. The payload carries a telescope and a spectrograph that will home in on ultraviolet light from 100 to 160 nanometers, a range that includes the wavelengths known to produce false-positive biomarkers. The UV output in this range varies with the mass of the star; thus the need to sample widely.
SISTINE-2’s target is Procyon A. The craft will have a brief window of about five minutes from its estimated altitude of 280 kilometers to observe the star, with the instrument returning by parachute for recovery.
An F-class star larger and hotter than the Sun, Procyon A has no known planets, but what is at stake here is accurate determination of its ultraviolet spectrum. The goal is a reference spectrum for F-class stars, built from these observations of Procyon A and incorporating existing X-ray, extreme ultraviolet and visible light data on other F-class stars. France says the next SISTINE target will be Alpha Centauri A and B.
Image: A size comparison of main sequence Morgan-Keenan classifications. Main sequence stars are those that fuse hydrogen into helium in their cores. The Morgan-Keenan system shown here classifies stars based on their spectral characteristics. Our Sun is a G-type star. SISTINE-2’s target is Procyon A, an F-type star. Credit: NASA GSFC.
Launch is to be aboard a Black Brant IX sounding rocket. And although it sounds like a small mission, SISTINE-2 will be working at wavelengths the Hubble Space Telescope cannot observe. Likewise, the James Webb Space Telescope will work at visible to mid-infrared wavelengths, making the SISTINE observations useful for wavelengths that Webb cannot see. The mission also experiments with new optical coatings and what NASA describes as ‘novel UV detector plates’ for better reflection of extreme UV.
Image: SISTINE’s third mission, to be launched in 2022, will target Alpha Centauri A and B. Here we see the system in optical (main) and X-ray (inset) light; only the two largest stars, Alpha Cen A and B, are visible. Credit: Zdenek Bardon/NASA/CXC/Univ. of Colorado/T. Ayres et al.
White Dwarf Clues to Unusual Planetary Composition
The surge of interest in white dwarfs continues. We’ve known for some time that these remnants of stars like the Sun, having been through the red giant phase and finally collapsing into a core about the size of the Earth, can reveal a great deal about objects that have fallen into them. That would be rocky material from planetary objects that once orbited the star, just as the planets of our Solar System orbit the Sun in our halcyon, pre-red-giant era.
The study of atmospheric pollution in white dwarfs rests on the fact that white dwarfs that have cooled below 25,000 K have atmospheres of pure hydrogen or helium. Heavier elements sink rapidly toward the core at these temperatures, so the only source of elements heavier than helium — metals in astronomy parlance — is the accretion of orbiting materials that cross the Roche limit and fall into the atmosphere.
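For a rough sense of that geometry, the rigid-body Roche limit gives the distance inside which a rocky body is tidally shredded:

d ≈ R_WD × (2 ρ_WD / ρ_obj)^(1/3)

Plugging in illustrative round numbers (a white dwarf density near 10⁹ kg/m³ and a rocky-body density near 3×10³ kg/m³, my estimate rather than a figure from the paper) puts disruption at roughly 90 white dwarf radii. For an Earth-sized remnant that is on the order of a solar radius, close enough that the resulting debris readily rains into the stellar atmosphere.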
These contaminants of stellar atmospheres are now the subject of a new investigation led by astronomer Siyi Xu (NSF NOIRLab), partnering with Keith Putirka (California State University, Fresno). Putirka is a geologist, and thus a good fit for this study. Together they examined 23 white dwarfs whose atmospheres are polluted by such materials, taking advantage of existing measurements of calcium, silicon, magnesium and iron from the Keck Observatory’s HIRES instrument (High-Resolution Echelle Spectrometer) in Hawai’i, along with data from the Hubble Space Telescope’s Cosmic Origins Spectrograph.
Their focus is on the abundance of elements that make up the major part of rock on an Earth-like planet, especially silicon, from which they infer the composition of rocks that would have existed on white dwarf planets before their disintegration and accretion. The variety of rock types that emerges is wider than that found among the rocky planets of our inner Solar System. Some are unusual enough that the authors coin new terms to describe them: thus “quartz pyroxenites” and “periclase dunites.” None have analogs in our own system.
The finding has implications for planetary development, as Putirka explains:
“Some of the rock types that we see from the white dwarf data would dissolve more water than rocks on Earth and might impact how oceans are developed. Some rock types might melt at much lower temperatures and produce thicker crust than Earth rocks, and some rock types might be weaker, which might facilitate the development of plate tectonics.”
The paper goes into greater detail:
…while PWDs [polluted white dwarfs] might record single planets that have been destroyed and assimilated piecemeal, the pollution sources might also represent former asteroid belts, in which case the individual objects of these belts would necessarily be more mineralogically extreme. If current petrologic models may be extrapolated, though, PWDs with quartz-rich mantles…might create thicker crusts, while the periclase-saturated mantles could plausibly yield, on a wet planet like Earth, crusts made of serpentinite, which may greatly affect the kinds of life that might evolve on the resulting soils. These mineralogical contrasts should also control plate tectonics, although the requisite experiments on rock strength have yet to be carried out.
Image: Rocky debris, the pieces of a former rocky planet that has broken up, spiral inward toward a white dwarf in this illustration. Studying the atmospheres of white dwarfs that have been polluted by such debris, a NOIRLab astronomer and a geologist have identified exotic rock types that do not exist in our Solar System. The results suggest that nearby rocky exoplanets must be even stranger and more diverse than previously thought. Credit: NOIRLab/NSF/AURA/J. da Silva.
High levels of magnesium and low levels of silicon are found in the sample white dwarfs, suggesting to the authors that the source debris came from a planetary interior, the mantle rather than the crust. That contradicts some earlier papers reporting signs of crustal rocks as the original polluters, but Xu and Putirka believe that such rock occurs as no more than a small fraction of core and mantle components.
Adds Putirka:
“We believe that if crustal rock exists, we are unable to see it, probably because it occurs in too small a fraction compared to the mass of other planetary components, like the core and mantle, to be measured.”
The paper is Putirka & Xu, “Polluted white dwarfs reveal exotic mantle rock types on exoplanets in our solar neighborhood,” Nature Communications 12, 6168 (2 November 2021). Full text.
Going After Sagittarius A*
Only time will tell whether humanity has a future beyond the Solar System, but if we do have prospects among the stars — and I fervently hope that we do — it’s interesting to speculate on what future historians will consider the beginning of the interstellar era. Teasing out origins is tricky. You could label the first crossing of the heliopause by a functioning probe (Voyager 1) as a beginning, but neither the Voyagers nor the Pioneers (nor, for that matter, New Horizons) were built as interstellar missions.
I’m going to play the ‘future history’ game by offering my own candidate. I think the image of the black hole in the galaxy M87 marks the beginning of an era, one in which our culture begins to look more and more at the universe beyond the Solar System. I say that not because of what we found at M87, remarkable as it was, but because of the instrument used. The creation of a telescope that, through interferometry, can create an aperture the size of our planet speaks volumes about what a small species can accomplish. An entire planet is looking into the cosmos.
So will some future historian look back on the M87 detection as the beginning of the ‘interstellar era’? No one can know, but from the standpoint of symbolism — and that’s what this defining of eras is all about — the creation of a telescope like this is a civilizational accomplishment. I think its cultural significance will only grow with time.
Image: Composite image showing how the M87 system looked, across the entire electromagnetic spectrum, during the Event Horizon Telescope’s April 2017 campaign to take the iconic first image of a black hole. Requiring 19 different facilities on the Earth and in space, this image reveals the enormous scales spanned by the black hole and its forward-pointing jet, launched just outside the event horizon and spanning the entire galaxy. Credit: the EHT Multi-Wavelength Science Working Group; the EHT Collaboration; ALMA (ESO/NAOJ/NRAO); the EVN; the EAVN Collaboration; VLBA (NRAO); the GMVA; the Hubble Space Telescope, the Neil Gehrels Swift Observatory; the Chandra X-ray Observatory; the Nuclear Spectroscopic Telescope Array; the Fermi-LAT Collaboration; the H.E.S.S. collaboration; the MAGIC collaboration; the VERITAS collaboration; NASA and ESA. Composition by J.C. Algaba.
Into the Milky Way’s Heart
The Event Horizon Telescope (EHT) is not a single physical installation but a collection of telescopes around the world that use Very Long Baseline Interferometry to produce a virtual observatory with, as mentioned above, an aperture the size of our planet. Heino Falcke’s book Light in the Darkness (HarperOne, 2021) tells this story from the inside, and it’s as exhilarating an account of scientific research as any I’ve read.
M87 seemed in some ways an ideal target, with a black hole thought to mass well over 6 billion times more than the Sun. In terms of sheer size, M87’s black hole dwarfed estimates of the Milky Way’s supermassive black hole (Sgr A*), which weighs in at 4.3 million solar masses, but it’s also 2,000 times farther away. Even so, it was the better target, for M87 is well off the galactic plane, whereas astronomers hoping to study the Milky Way’s black hole have to contend with shrouds of gas and dust and the fact that, while average quasars consume a sun’s worth of mass per year, Sgr A* pulls in a million (10⁶) times less.
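A bit of arithmetic shows why both are nonetheless viable targets (my back-of-the-envelope numbers, not figures from the book). The diffraction limit of an Earth-sized interferometer at the EHT’s 1.3 mm observing wavelength is

θ ≈ λ / D ≈ 1.3 mm / 12,742 km ≈ 1.0×10⁻¹⁰ radians ≈ 21 microarcseconds

and since the apparent size of a black hole’s shadow scales as mass over distance, taking M87’s black hole at roughly 6.5 billion solar masses gives a size ratio of

(6.5×10⁹ / 4.3×10⁶) / 2000 ≈ 0.75.

In other words, M87’s enormous mass almost exactly compensates for its 2,000-fold greater distance: the two shadows subtend comparable angles on the sky, both within reach of a ~20 microarcsecond beam.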
But the investigation of Sgr A* continues as new technologies come into play, with the James Webb Space Telescope now awaiting launch in December and already on the scene in French Guiana. Early in JWST’s observing regime, Sgr A* is to be probed at infrared wavelengths, adding the new space-based observatory to the existing Event Horizon Telescope. Farhad Yusef-Zadeh, principal investigator on the Webb Sgr A* program, points out that JWST will allow data capture at two different wavelengths simultaneously and continuously, further enhancing the EHT’s powers.
Among other reasons, a compelling driver for looking hard at Sgr A* is the fact that it produces flares in the dust and gas surrounding it. Yusef-Zadeh (Northwestern University) notes that the Milky Way’s supermassive black hole is the only one yet observed with this kind of flare activity, which makes it more difficult to image the black hole but also adds considerably to the scientific interest of the investigation. The flares are thought to be the result of particles accelerating around the object, but details of the mechanism of light emission here are not well understood.
Image: An enormous swirling vortex of hot gas glows with infrared light, marking the approximate location of the supermassive black hole at the heart of our Milky Way galaxy. This multiwavelength composite image includes near-infrared light captured by NASA’s Hubble Space Telescope, and was the sharpest infrared image ever made of the galactic center region when it was released in 2009. While the black hole itself does not emit light and so cannot be detected by a telescope, the EHT team is working to capture it by getting a clear image of the hot glowing gas and dust directly surrounding it. Credit: NASA, ESA, SSC, CXC, STScI.
Thus we combine radio data from the Event Horizon Telescope with JWST’s infrared data. How different wavelengths can tease out more information is evident in the image above. Here we have a composite showing Hubble near-infrared observations in yellow, and deeper infrared observations from the Spitzer Space Telescope in red, while light detected by the Chandra X-Ray Observatory appears in blue and violet. Flare detection and better imagery of the region, enabled by adding JWST to an EHT campaign that will also include X-ray and other observatories, should make for the most detailed look at Sgr A* ever attempted.
What light we detect associated with a black hole is from the accreting material surrounding it, with the event horizon being its inner edge — this is what we saw in the famous M87 image. The early JWST observations, expected in its first year of operation, are to be supplemented by further work to build up our knowledge of the flare activity and enhance our understanding of how Sgr A* differs from other supermassive black holes.
Image: Heated gas swirls around the region of the Milky Way galaxy’s supermassive black hole, illuminated in near-infrared light captured by NASA’s Hubble Space Telescope. Released in 2009 to celebrate the International Year of Astronomy, this was the sharpest infrared image ever made of the galactic center region. NASA’s upcoming James Webb Space Telescope, scheduled to launch in December 2021, will continue this research, pairing Hubble-strength resolution with even more infrared-detecting capability. Of particular interest for astronomers will be Webb’s observations of flares in the area, which have not been observed around any other supermassive black hole and the cause of which is unknown. The flares have complicated the Event Horizon Telescope (EHT) collaboration’s quest to capture an image of the area immediately surrounding the black hole, and Webb’s infrared data is expected to help greatly in producing a clean image. Credit: NASA, ESA, STScI, Q. Daniel Wang (UMass).
Whether we’re entering an interstellar era or not, we’re going to be learning a lot more about the heart of the Milky Way, assuming we can get JWST aloft. How many hopes and plans ride on that Ariane 5!
Talking to the Lion
Extraterrestrial civilizations, if they exist, would pose a unique challenge in comprehension. With nothing in common other than what we know of physics and mathematics, we might conceivably exchange information. But could we communicate our cultural values and principles to them, or hope to understand theirs? It was Ludwig Wittgenstein who said “If a lion could speak, we couldn’t understand him.” True?
One perspective on this is to look not into space but into time. Traditional SETI is a search through space and only indirectly, through speed of light factors, a search through time. But new forms of SETI that look for technosignatures — and this includes searching our own Solar System for signs of technology like an ancient probe, as Jim Benford has championed — open up the chronological perspective in a grand way.
Now we are looking for conceivably ancient signs of a civilization that may have perished long before our Sun first shone. A Dyson shell, gathering most of the light from its star, could be an artifact of a civilization that died billions of years ago.
Image: Philosopher Ludwig Wittgenstein (1889-1951), whose Tractatus Logico-Philosophicus was written during military duty in the First World War. It has been confounding readers like me ever since.
Absent aliens to study, ponder ourselves as we look into our own past. I’ve spent most of my life enchanted with the study of the medieval and ancient world, where works of art, history and philosophy still speak to our common humanity today. But how long will we connect with that past if, as some predict, we will within a century or two pursue genetic modifications to our physiology and biological interfaces with computer intelligence? It’s an open question because these trends are accelerating.
What, in short, will humans in a few hundred years have in common with us? The same question will surface if we go off-planet in large numbers. Something like an O’Neill cylinder housing a few thousand people, for example, would create a civilization of its own, and if we ever launch ‘worldships’ toward other stars, it will be reasonable to consider that their populations will dance to an evolutionary tune of their own.
The crew that boards a generation ship may be human as we know the term, but will it still be five thousand years later, upon reaching another stellar system? Will an interstellar colony create a new branch of humanity each time we move outward?
Along with this speculation comes the inevitable issue of artificial intelligence, because it could be that biological evolution has only so many cards to play. I’ve often commented on the need to go beyond the conventional mindset of missions as being limited to the lifetime of their builders. The current work called Interstellar Probe at Johns Hopkins, in the capable hands of Voyager veteran Ralph McNutt, posits data return continuing for a century or more after launch. So we’re nudging in the direction of multi-generational ventures as a part of the great enterprise of exploration.
But what do interstellar distances mean to an artilect, a technological creation that operates by artificial intelligence that eclipses our own capabilities? For one thing, these entities would be immune to travel fatigue because they are all but immortal. These days we ponder the relative advantages of crewed vs. robotic missions to places like Mars or Titan. Going interstellar, unless we come up with breakthrough propulsion technologies, favors computerized intelligence and non-biological crews. Martin Rees has pointed out that the growth of machine intelligence should happen much faster away from Earth as systems continually refine and upgrade themselves.
It was a Rees essay that reminded me of the Wittgenstein quote I used above. And it leads me back to SETI. If technological civilizations other than our own exist, it’s reasonable to assume they would follow the same path. Discussing the Drake Equation in his recent article Why extraterrestrial intelligence is more likely to be artificial than biological, Lord Rees points out there may be few biological beings to talk to:
Perhaps a starting point would be to enhance ourselves with genetic modification in combination with technology—creating cyborgs with partly organic and partly inorganic parts. This could be a transition to fully artificial intelligences.
AI may even be able to evolve, creating better and better versions of itself on a faster-than-Darwinian timescale for billions of years. Organic human-level intelligence would then be just a brief interlude in our “human history” before the machines take over. So if alien intelligence had evolved similarly, we’d be most unlikely to “catch” it in the brief sliver of time when it was still embodied in biological form. If we were to detect extraterrestrial life, it would be far more likely to be electronic than flesh and blood—and it may not even reside on planets.
Image: Credit: Breakthrough Listen / Danielle Futselaar.
I don’t think we’ve really absorbed this thought, even though it seems to be staring us in the face. The Drake Equation’s factor for the lifetime of a civilization is usually interpreted in terms of cultures directed by biological beings. An inorganic, machine-based civilization spawned by biological forebears could engineer away the vulnerabilities that limit the lifetimes of civilizations like ours. It could last for billions of years.
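For reference, the Drake Equation in its usual form is

N = R* · fp · ne · fl · fi · fc · L

where N, the number of detectable civilizations in the galaxy, scales linearly with L, the average lifetime of a communicating civilization. Stretch L from, say, ten thousand years for a fragile biological culture to a billion years for a durable machine successor (purely illustrative numbers) and the expected count of detectable civilizations rises by five orders of magnitude. That is the force of Rees’s point.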
It’s an interesting question indeed how we biological beings would communicate with a civilization that has perhaps existed since the days when the Solar System was nothing more than a molecular cloud. We often use human logic to talk about what an extraterrestrial civilization would want, what its motives would be, and tell ourselves the fable that ‘they’ would certainly act rationally as we understand rationality.
But we have no idea whatsoever how a machine intelligence honed over thousands of millennia would perceive reality. As Rees points out, “we can’t assess whether the current radio silence that Seti are experiencing signifies the absence of advanced alien civilisations, or simply their preference.” Assuming they are there in the first place.
And that’s still a huge ‘if.’ For along with our other unknowns, we have no knowledge whatsoever about abiogenesis on other worlds. To get to machine intelligence, you need biological intelligence to evolve to the point where it can build the machines. And if life is widespread — I suspect that it is — that says nothing about whether it is likely to result in a technological civilization. We may be dealing with a universe teeming with lichen and pond scum, perhaps enlivened with the occasional tree.
A SETI reception would be an astonishing development, and I believe that if we ever receive a signal, likely as a byproduct of some extraterrestrial activity, we will be unlikely to decode it or even begin to understand its meaning and motivation. Certainly that seems true if Rees is right and the likely sources are machines. A SETI ‘hit’ is likely to remain mysterious, enigmatic, and unresolved. But let’s not stop looking.