Technosignatures and the Search for Extraterrestrial Intelligence

A rendering of a potential Dyson sphere, named after the physicist and astronomer Freeman A. Dyson. As he proposed decades ago, such structures would collect stellar energy on a solar-system-wide scale for highly advanced civilizations. (SentientDevelopments.com)

The word “SETI” pretty much brings to mind the search for radio signals coming from distant planets, the movie “Contact,” Jill Tarter, Frank Drake and perhaps the SETI Institute, where the effort lives and breathes.

But there was a time when SETI — the Search for Extraterrestrial Intelligence — was a significantly broader concept, one that brought in other ways to look for intelligent life beyond Earth.

In the late 1950s and early 1960s — a time of great interest in UFOs, flying saucers and the like — scientists not only came up with the idea of searching for distant intelligent life via unnatural radio signals, but also by looking for signs of unexpectedly elevated heat signatures and for optical anomalies in the night sky.

The history of this search has seen many sharp turns, with radio SETI at one time embraced by NASA, subsequently de-funded because of congressional opposition, and then developed into a privately and philanthropically funded project of rigor and breadth at the SETI Institute.  The other modes of SETI went pretty much underground and SETI became synonymous with radio searches for ET life.

But this history may be about to take another sharp turn as some in Congress and NASA have become increasingly interested in what are now called “technosignatures,” potentially detectable signatures and signals of the presence of distant advanced civilizations.  Technosignatures are a subset of the larger and far more mature search for biosignatures — evidence of microbial or other primitive life that might exist on some of the billions of exoplanets we now know exist.

And as a sign of this renewed interest, a technosignatures conference was scheduled by NASA at the request of Congress (and especially retiring Republican Rep. Lamar Smith of Texas).  The conference took place in Houston late last month, and it was most interesting in terms of the new and increasingly sophisticated ideas being explored by scientists involved with broad-based SETI.

“There has been no SETI conference this big and this good in a very long time,” said Jason Wright, an astrophysicist and professor at Pennsylvania State University and chair of the conference’s science organizing committee.  “We’re trying to rebuild the larger SETI community, and this was a good start.”

 

At this point, the search for technosignatures is often likened to looking for a needle in a haystack. But what scientists are trying to do is define their haystack, determine its essential characteristics, and learn how best to explore it. (Wiki Commons)

 

During the three-day meeting in Houston, scientists and interested private and philanthropic representatives heard talks that ranged from the trials and possibilities of traditional radio SETI to quasi-philosophical discussions about what potentially detectable planetary transformations and by-products might be signs of an advanced civilization. (An agenda and videos of the talks are here.)

The subjects ranged from surveying the sky for potential millisecond infrared emissions from distant planets that could be purposeful signals, to how the presence of certain unnatural, pollutant chemicals in an exoplanet atmosphere could be a sign of civilization.  From the search for thermal signatures coming from megacities or other by-products of technological activity, to the possible presence of “megastructures” built by highly evolved beings to collect a star’s energy.

Michael New is Deputy Associate Administrator for Research within NASA’s Science Mission Directorate. He was initially trained in chemical physics. (NASA)

All but the near-infrared SETI are for the distant future — or perhaps are on the science fiction side — but astronomy and the search for distant life do tend to move forward slowly, with theory and inference most often coming well before observation and detection.

So thinking about the basic questions about what scientists might be looking for, Wright said, is an essential part of the process.

Indeed, it is precisely what Michael New, Deputy Associate Administrator for Research within NASA’s Science Mission Directorate, told the conference. 

He said that he, NASA and Congress wanted the broad sweep of ideas and research out there regarding technosignatures, from the current state of the field to potential near-term findings, and known limitations and possibilities.

“The time is really ripe scientifically for revisiting the ideas of technosignatures and how to search for them,” he said.

He offered the promise of NASA help  (admittedly depending to some extent on what Congress and the administration decide) for research into new surveys, new technologies, data-mining algorithms, theories and modelling to advance the hunt for technosignatures.

 

Crew members aboard the International Space Station took this nighttime photograph of much of the Atlantic coast of the United States. The ability to detect the heat and light from this kind of activity on distant exoplanets does not exist today, but some day it might and could potentially help discover an advanced extraterrestrial civilization. (NASA)

 

Among the several dozen scientists who discussed potential signals to search for were the astronomer Jill Tarter, former director of the Center for SETI Research, Planetary Science Institute astrobiologist David Grinspoon and University of Rochester astrophysicist Adam Frank.  They all looked at the big picture: what artifacts in atmospheres, on surfaces and perhaps in space advanced civilizations would likely produce by dint of their being “advanced.”

All spoke of the harvesting of energy to perform work as a defining feature of a technological planet, with that “work” describing transportation, construction, manufacturing and more.

Beings that have reached the high level of, in Frank’s words, exo-civilization produce heat, pollutants, changes to their planets and surroundings in the process of doing that work.  And so a detection of highly unusual atmospheric, thermal, surface and orbital conditions could be a signal.

One example mentioned by several speakers is the family of chemicals called chlorofluorocarbons (CFCs), which are used as commercial refrigerants, propellants and solvents.

Astronomer Jill Tarter is an iconic figure in the SETI world and led the SETI Institute for 30 years. (AFP)

These CFCs are a hazardous and unnatural pollutant on Earth because they destroy the ozone layer, and they could be doing something similar on an exoplanet.  And as described at the conference, the James Webb Space Telescope — once it’s launched and working — could most likely detect such an atmospheric compound if it’s present in high concentration and the project were given sufficient telescope time.

A similar single finding described by Tarter that could be revolutionary is the radioactive isotope tritium, a by-product of the nuclear fusion process.  It has a short half-life, and so any distant detection would point to a recent use of nuclear energy (as long as it’s not associated with a recent supernova event, which can also produce tritium).
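Tritium’s value as a “recent activity” marker follows directly from exponential decay. A minimal sketch of the arithmetic, using the isotope’s roughly 12.3-year half-life (the starting quantity is arbitrary; only the fraction remaining matters):

```python
T_HALF_YEARS = 12.3  # approximate half-life of tritium

def fraction_remaining(years: float) -> float:
    """Fraction of an initial tritium quantity left after `years`."""
    return 2.0 ** (-years / T_HALF_YEARS)

# After a century -- an instant on astronomical timescales -- almost
# nothing survives, so any tritium detected today was made recently.
print(f"after  50 yr: {fraction_remaining(50):.3f}")   # ~6% left
print(f"after 100 yr: {fraction_remaining(100):.4f}")  # well under 1% left
```

That steep fall-off is why a tritium signature, unlike a stable pollutant, would time-stamp the activity that produced it.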

But there were many other, less precise ideas put forward.

Glints on the surface of planets could be the product of technology, as might be weather on an exoplanet that has been extremely well stabilized, modified planetary orbits, and chemical disequilibria in the atmosphere created by the by-products of life and work.  (These disequilibria are a well-established feature of biosignature research, but Frank presented the idea of a technosphere, which would process energy and create by-products at a greater level than its supporting biosphere.)

Another unlikely but most interesting example of a possible technosignature put forward by Tarter and Grinspoon involved the seven planets of the Trappist-1 solar system, all tidally locked and so lit on only one side.  Tarter said that the planets could potentially be found to be remarkably similar in their basic structure, alignment and dynamics, which she suggested could be a sign of highly advanced solar engineering.

 

Artist’s rendering of the imagined Trappist-1 solar system as if it had been terraformed to make the planets similar and habitable.  The system is one of the closest found to our own — about 40 light-years away.

 

Grinspoon seconded that notion about Trappist-1, but in a somewhat different context.

He has worked a great deal on the question of today’s Anthropocene era — when humans actively change the planet — and he extended his thinking about Earth out into the galaxy.

Grinspoon said that he had just come back from Japan, where he had visited Hiroshima and its atomic bomb sites, and came away with doubts that we are the “intelligent” civilization we often describe ourselves as in SETI terms.  A civilization that may well self-destruct — a fate he sees as potentially common throughout the cosmos — might be considered “proto-intelligent,” but not smart enough to keep itself going over a long time.

Projecting that into the cosmos, Grinspoon argued that there may well be many such doomed civilizations, and then perhaps a far smaller number of those civilizations that make it through the biological-technological bottleneck that we seem to be facing in the centuries ahead.

These civilizations, which he calls semi-immortal, would develop inherently sustainable methods of continuing, including modifying major climate cycles, developing highly sophisticated radars and other tools for mitigating risks, terraforming nearby planets, and even finding ways to evolve the planet as its place in the habitable zone of its host star becomes threatened by the brightening or dulling of that star.

The trick to trying to find such truly evolved civilizations, he said, would be to look for technosignatures that reflect anomalous stability and not rampant growth. In the larger sense, these civilizations would have integrated themselves into the functioning of the planet, just as oxygen, and then first primitive and later complex life, integrated themselves into the essential systems of Earth.

And returning to the technological civilizations that don’t survive: they could have produced physical artifacts that now permeate the galaxy.

 

MeerKAT, originally the Karoo Array Telescope, is a radio telescope consisting of 64 antennas now being tested and verified in the Northern Cape of South Africa. When fully functional it will be the largest and most sensitive radio telescope in the southern hemisphere until the Square Kilometre Array is completed in approximately 2024. (South African Radio Astronomy Observatory)

 

This is exciting – the next-phase Square Kilometre Array (SKA2) will be able to detect Earth-level radio leakage from nearby stars. (South African Radio Astronomy Observatory)

 

While the conference focused on technosignature theory, models, and distant possibilities, news was also shared about two concrete developments involving research today.

The first involved the radio telescope array in South Africa now called MeerKAT, a prototype of sorts for what will eventually become the gigantic Square Kilometre Array.

Breakthrough Listen, the global initiative to seek signs of intelligent life in the universe, would soon announce the commencement of a major new program with the MeerKAT telescope, in partnership with the South African Radio Astronomy Observatory (SARAO).

Breakthrough Listen’s MeerKAT survey will examine a million individual stars – 1,000 times the number of targets in any previous search – in the quietest part of the radio spectrum, monitoring for signs of extraterrestrial technology. With the addition of MeerKAT’s observations to its existing surveys, Listen will operate 24 hours a day, seven days a week, in parallel with other surveys.

This clearly has the possibility of greatly expanding the amount of SETI listening being done.  Radio SETI efforts — including the SETI Institute, with its radio astronomy array in northern California, and various partners — have been listening for almost 60 years without detecting a signal from our galaxy.

That might seem like a disappointing intimation that nothing or nobody else is out there, but not if you listen to Tarter explain how much listening has actually been done.  Almost ten years ago, she calculated that if the Milky Way galaxy and everything in it were an ocean, then SETI would have listened to a cupful of water from that ocean.  Jason Wright and his students did an updated calculation recently, and now the radio listening amounts to a small swimming pool within that enormous ocean.
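The analogies above are just ratios of volumes. A toy calculation makes the scale concrete — the specific liter figures here are rough illustrative assumptions (Earth’s oceans, a cup, a backyard pool), not the numbers from Tarter’s or Wright’s actual multidimensional “haystack” analyses:

```python
# Rough, assumed volumes in liters -- for illustration only.
OCEAN_LITERS = 1.3e21   # approximate volume of Earth's oceans
CUP_LITERS = 0.25       # a cup of water (the circa-2010 analogy)
POOL_LITERS = 6.0e4     # a small swimming pool (the updated analogy)

cup_fraction = CUP_LITERS / OCEAN_LITERS
pool_fraction = POOL_LITERS / OCEAN_LITERS

print(f"cup  / ocean: {cup_fraction:.1e}")
print(f"pool / ocean: {pool_fraction:.1e}")
print(f"growth factor: {pool_fraction / cup_fraction:.0e}")
```

Even the updated “swimming pool” leaves the searched fraction vanishingly small, which is why a null result so far says little about whether anyone is out there.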

 

The NIROSETI team with their new infrared detector inside the dome at Lick Observatory. Left to right: Remington Stone, Dan Werthimer, Jérome Maire, Shelley Wright, Patrick Dorval and Richard Treffers. (Laurie Hatch)

The other news came from Shelley Wright of the University of California, San Diego, who has been working on an optical SETI instrument for the Lick Observatory.

The Near-Infrared Optical SETI (NIROSETI) instrument she and her colleagues have developed is the first of its kind designed to search for signals from extraterrestrials at near-infrared wavelengths. The near-infrared regime is an excellent spectral region for such a search, since it offers a unique window for interstellar communication.

The NIROSETI instrument uses two near-infrared photodiodes to detect artificial, very fast (nanosecond) pulses of infrared radiation.

The NIROSETI instrument, which is mounted on the Nickel telescope at Lick Observatory, splits the incoming near-infrared light onto two channels and then checks for coincident events — signals registered by both detectors simultaneously.
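The coincidence idea can be sketched in a few lines of code. This is a conceptual illustration, not NIROSETI’s actual processing: the timestamps and the one-nanosecond window are invented for the example.

```python
def coincident_events(ch1, ch2, window_ns=1.0):
    """Return (t1, t2) pairs where an event on channel 1 has a partner
    on channel 2 within `window_ns`.  A spike seen by only one detector
    (e.g. sensor noise) has no partner and is rejected."""
    hits = []
    j = 0
    for t1 in sorted(ch1):
        # Skip channel-2 events that are already too early to match t1.
        while j < len(ch2) and ch2[j] < t1 - window_ns:
            j += 1
        if j < len(ch2) and abs(ch2[j] - t1) <= window_ns:
            hits.append((t1, ch2[j]))
    return hits

# Hypothetical nanosecond timestamps: only the ~100 ns event is seen by both.
print(coincident_events([100.2, 250.0, 400.7], [100.6, 310.3]))
# -> [(100.2, 100.6)]
```

Requiring both channels to fire at once is what lets the instrument hunt for nanosecond flashes without being swamped by single-detector noise.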

Jason Wright is an assistant professor of astronomy and astrophysics at Penn State. His reading list is here.

Wright of Penn State was especially impressed by the project, which he said can look at much of the sky at once and was put together on a very limited budget.

Wright, who teaches a course on SETI at Penn State and is a co-author of a recent paper trying to formalize SETI terminology, said his own take-away from the conference is that it may well represent an important and positive moment in the history of technosignatures.

“Without NASA support, the whole field has lacked the normal structure by which astronomy advances,” he said.  “No teaching of the subject, no standard terms, no textbook to formalize findings and understandings.

“The SETI Institute carried us through the dark times, and they did that outside of normal, formal structures. The Institute remains essential, but hopefully that reflex identification will start to change.”

 

Participants in the technosignatures conference in Houston last month, the largest SETI gathering in years.  This one was sponsored by NASA and put together by the Nexus for Exoplanet System Science (NExSS), an interdisciplinary agency initiative. (Delia Enriquez)

Human Space Travel, Health and Risk

Astronauts in a mock-up of the Orion space capsule, which NASA plans to use in some form as a deep-space vehicle. (NASA)

 

We all know that human space travel is risky. Always has been and always will be.

Imagine, for a second, that you’re an astronaut about to be sent on a journey to Mars and back, and you’re in a capsule on top of NASA’s second-generation Space Launch System designed for that task.

You will be 384 feet in the air waiting to launch (as tall as a 38-floor building), the rocket system will weigh 6.5 million pounds (equivalent to almost nine fully loaded 747 jets), and you will take off with 9.2 million pounds of thrust (34 times the total thrust of one of those 747s).

Given the thrill and power of such a launch and the later descent, everything else might seem to pale in terms of both drama and riskiness.  But as NASA has been learning more and more, the risks continue in space and perhaps even increase.

We’re not talking here about a leak or a malfunctioning computer system; we’re talking about absolutely inevitable risks from cosmic rays and radiation generally — as well as from micro-gravity — during a long journey in space.

Since no human has been in deep space for more than a short time, the task of understanding those health risks is very tricky and utterly dependent on testing creatures other than humans.

The most recent results are sobering.  A NASA-sponsored team at Georgetown University Medical Center in Washington looked specifically at what could happen to a human digestive system on a long Martian venture, and the results were not reassuring.

Their results, published in the Proceedings of the National Academy of Sciences (PNAS), suggest that deep-space bombardment by galactic cosmic radiation and solar particles could significantly damage gastrointestinal tissue, leading to long-term functional changes and problems. The study also raises concern about a high risk of tumor development in the stomach and colon.

 

Galactic cosmic rays are a variable shower of charged particles coming from supernova explosions and other events extremely far from our solar system. The sun is the other main source of energetic particles: it spews electrons, protons and heavier ions in “solar particle events” fed by solar flares and ejections of matter from the sun’s corona. Magnetic fields around Earth protect the planet from most of these heavy particles, but astronauts do not have that protection beyond low-Earth orbit. (NASA)

 

Kamal Datta, an associate professor in the Department of Biochemistry, is project leader of the NASA Specialized Center of Research (NSCOR) at Georgetown and has been studying the effects of space radiation on humans for more than a decade.

He said that heavy ions (electrically charged atoms and molecules) of elements such as iron and silicon are damaging to humans because of their greater mass compared with the mass-less photons, such as X-rays and gamma rays, prevalent on Earth.  These heavy ions, as well as low-mass protons, are ubiquitous in deep space.

Kamal Datta of Georgetown University Medical Center works with NASA to understand the potential risks from galactic cosmic radiation to astronauts who may one day travel in deep space.

“With the current shielding technology, it is difficult to protect astronauts from the adverse effects of heavy ion radiation. Although there may be a way to use medicines to counter these effects, no such agent has been developed yet,” says Datta, also a member of Georgetown Lombardi Comprehensive Cancer Center.

“While short trips, like the times astronauts traveled to the moon, may not expose them to this level of damage, the real concern is lasting injury from a long trip, such as a Mars or other deep-space mission, which would be much longer,” he said in a release.

Datta’s team has also published on the potentially harmful effects of galactic cosmic radiation on the brain, and other teams are looking at potential deep-space travel dangers to the human cardiovascular system.  Researchers are also concerned about the known weakening of bone and muscle tissue, harm to vision, and speeded-up aging during long stays in space.

With current technology, it would take about three years to travel from Earth to Mars, orbit the planet until it is in the right place for a sling-shot boost home, and then to travel back.

A radiation detection instrument on the Mars Science Laboratory (MSL), which carried the rover Curiosity to Mars in 2011-2012, measured an estimated overall human radiation exposure for a Mars trip that would be two-thirds of the agency’s allowed lifetime limits.  That was based on the high-energy radiation hitting the capsule, but NASA later detected radiation bursts from solar flares on Mars far higher than anything detected during the MSL transit.
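A back-of-the-envelope dose budget shows how the transit measurement translates into a fraction of a career limit. The daily rate and trip length below are rounded illustrative assumptions consistent with published MSL cruise figures, and the 1-sievert lifetime limit is a round number for illustration (NASA’s actual limits vary with an astronaut’s age and sex):

```python
# Assumed, rounded figures -- a sketch, not mission planning numbers.
TRANSIT_DOSE_MSV_PER_DAY = 1.8   # roughly what MSL's detector saw in cruise
DAYS_EACH_WAY = 180              # an assumed one-way transit time
LIFETIME_LIMIT_MSV = 1000.0      # illustrative ~1 sievert career limit

round_trip_msv = TRANSIT_DOSE_MSV_PER_DAY * DAYS_EACH_WAY * 2
print(f"round-trip transit dose: ~{round_trip_msv:.0f} mSv, "
      f"about {round_trip_msv / LIFETIME_LIMIT_MSV:.0%} of the assumed limit")
```

The result lands in the neighborhood of two-thirds of the assumed limit, before counting any time spent on the surface or any solar-flare bursts.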

All of this seems, and is, quite daunting when thinking about human travel to Mars and other deep space destinations.  And Datta is clearly sensitive about how the new results are conveyed to the public.

“I am in no way saying that people cannot travel to Mars,” he told me. “What we are doing is trying to understand the health risks so proper mitigation can be devised, not to say this kind of travel is impossible.”

“We don’t have medicines now to protect astronauts from heavy particle radiation, and we don’t have the technology now to shield them while they’re in space.  But many people are working on these problems.”

 

The Orion spacecraft in flight, as drawn by an artist. The capsule has an attached habitat module. (NASA)

 

On the medical research side, scientists have to rely on data gained from exposing mice to radiation and extrapolating those results to humans.  It would, of course, be unethical to do the testing on people.

While this kind of animal testing is accepted as generally accurate, it certainly could hide either increased protections or increased risks in humans.

Datta said that another testing issue so far is that the mice have had to be irradiated in one large dose rather than in much smaller doses over time.  It is unclear how that affects the potential damage to human organs and the breaking of DNA bonds (which can result in the growth of cancers).  But Datta said that new instruments at NASA’s Space Radiation Laboratory (NSRL) at the Brookhaven National Laboratory on Long Island, New York, will allow for a more gradual, lower-dose experiment.

While Datta’s work has been focused on the health risks of deep space travel, galactic cosmic radiation and solar heavy particles also bombard the moon — which has no magnetic field and only a very thin atmosphere to protect it.  Apollo astronauts could safely stay on the moon for several days in their suits and their lander, but months or years of living in a colony there would pose far greater risks.

NASA has actually funded projects to shield small areas on the moon from radiation, but the issue remains very much unresolved.

Shielding also plays a major role in thinking about how to protect astronauts traveling into deep space.  The current aluminum skin of space capsules allows much of the harmful radiation to pass through, and so something is needed to block it.

 

The goal of building an inhabited colony on the moon has many avid supporters in government and the private sector. The health risks for astronauts are similar to those in deep space. (NASA/Pat Rawlings)

 

Experts have concluded that perhaps the best barrier to radiation would be a layer of water, but water is too heavy to carry in the needed amounts.  Other possibilities include organic polymers (large macromolecules with repeating subunits) such as polyethylene.

It seems clear that issues such as these — the effects of more hazardous space radiation on astronauts in deep space and on the moon, and how to minimize those dangers — will be coming to the front burner in the years ahead.  And assuming that progress can be made, it’s a thrilling time.

What this means for space science, however, is less clear.

On one hand I recall hearing former astronaut extraordinaire and then head of the NASA Science Mission Directorate John Grunsfeld talk about how an astronaut on Mars could gather data and understandings in one week that the Curiosity rover would need a full year to match.

On the other, human space exploration is much more expensive than anything without people — yes, even including the long-delayed and ever-more-costly James Webb Space Telescope — and NASA budgets are limited.

So the question arises whether human exploration will, when it gets into high gear, swallow up the resources needed for the successors to the Hubble, Curiosity, Cassini and the other missions that have helped create what I consider to be a golden age of space science.  Risks come in many forms.

 


A National Strategy for Finding and Understanding Exoplanets (and Possibly Extraterrestrial Life)

The National Academies of Sciences, Engineering and Medicine took an in-depth look at what NASA, the astronomy community and the nation need to grow the burgeoning science of exoplanets — planets outside our solar system that orbit a star. (NAS)

 

An extensive, congressionally-directed study of what NASA needs to effectively learn how exoplanets form and whether some may support life was released today, and it calls for major investments in next-generation space and ground telescopes.  It also calls for the adoption of an increasingly multidisciplinary approach for addressing the innumerable questions that remain unanswered.

While the recommendations were many, the top-line calls were for a sophisticated new space-based telescope for the 2030s that could directly image exoplanets, for approval and funding of the long-delayed and debated WFIRST space telescope, and for the National Science Foundation to help fund two of the very large ground-based telescopes now under development.

The study of exoplanets has seen remarkable discoveries in the past two decades.  But the in-depth study from the private, non-profit National Academies of Sciences, Engineering and Medicine concludes that there is much more that we don’t understand than that we do: our understandings are “substantially incomplete.”

So the two overarching goals for future exoplanet science are described as these:

 

  • To understand the formation and evolution of planetary systems as products of star formation and characterize the diversity of their architectures, composition, and environments.
  • To learn enough about exoplanets to identify potentially habitable environments and search for scientific evidence of life on worlds orbiting other stars.

 

Given the challenge, significance and complexity of these science goals, it’s no wonder that young researchers are flocking to the many fields included in exoplanet science.  And reflecting that, it is perhaps no surprise that the NAS survey of key scientific questions, goals, techniques, instruments and opportunities runs over 200 pages. (A webcast of a 1:00 pm NAS talk on the report can be accessed here.)

 


Artist’s concept showing a young sun-like star surrounded by a planet-forming disk of gas and dust.
(NASA/JPL-Caltech/T. Pyle)

These ambitious goals and recommendations will now be forwarded to the arm of the National Academies putting together the 2020 Astronomy and Astrophysics Decadal Survey — a community-informed blueprint of priorities that NASA usually follows.

This priority-setting is probably most crucial for the two exoplanet direct imaging missions now being studied as possible Great Observatories for the 2030s — the paradigm-changing space telescopes NASA has launched almost every decade since the 1970s.

HabEx (the Habitable Exoplanet Observatory) and LUVOIR (the Large UV/Optical/IR Surveyor) are two direct-imaging exoplanet projects in conception phase that would indeed significantly change the exoplanet field.

Both would greatly enhance scientists’ ability to detect and characterize exoplanets. But the more ambitious LUVOIR, in particular, would not only find many exoplanets in all stages of formation, but could readily read the chemical components of their atmospheres and thereby get clear data on whether a planet is habitable or even supports life.  LUVOIR would provide either an 8-meter or a record-breaking 15-meter space telescope, while HabEx would send up a 4-meter mirror.

HabEx and LUVOIR are competing with two other astrophysics projects for that Great Observatory designation, and so NAS support now and prioritizing later is essential if they are to become a reality.

 

An artist’s notional rendering of an approximately 15-meter telescope in space. This image was created for an earlier large space telescope feasibility project called ATLAST, but it is similar to what is being discussed inside and outside of NASA as a possible great observatory after the James Webb Space Telescope and the Wide-Field Infrared Survey Telescope. (NASA)

These two potential Great Observatories will be costly and would take many years to design and build.  As the study acknowledges and explains, “While the committee recognized that developing a direct imaging capability will require large financial investments and a long time scale to see results, the effort will foster the development of the scientific community and technological capacity to understand myriad worlds.”

So a lot is at stake.  But with budget and space priorities in flux, the fate of even the projects given the highest priority in the Decadal Survey remains unclear.

That’s apparent in the fact that one of the top recommendations of today’s study is the funding of the number one priority put forward in the 2010 Astronomy and Astrophysics Decadal Survey — the Wide Field Infrared Survey Telescope (WFIRST).

The project — which would use microlensing to boost the search for exoplanets farther from their stars than earlier survey missions — was cancelled in the administration’s proposed 2019 federal budget.  Congress has continued funding some development of this one-time top priority, but its future nonetheless remains in doubt.

WFIRST could have the capability of directly imaging exoplanets if it were built with technology to block out the blinding light of the star the exoplanets orbit — doing so either with an internal coronagraph or a companion starshade.  This would be novel technology for a space-based telescope, and the NAS survey recommends it as well.

 

An artist’s rendering of a possible “starshade” that could be launched to work with WFIRST or another space telescope and allow the telescope to take direct pictures of other Earth-like planets. (NASA/JPL-Caltech)

The list of projects the study recommends is long, with these important additions:

“Ground-based astronomy – enabled by two U.S.-led telescopes – will also play a pivotal role in studying planet formation and potentially terrestrial worlds,” the report says. “The future Giant Magellan Telescope (GMT) and proposed Thirty Meter Telescope (TMT) would allow profound advances in imaging and spectroscopy – absorption and emission of light – of entire planetary systems. They also could detect molecular oxygen in temperate terrestrial planets in transit around close and small stars.”

The committee concluded that the technology road map to enable the full potential of GMT and TMT in the study of exoplanets is in need of investments, and should leverage the existing network of U.S. centers and laboratories. To that end, the report recommends that the National Science Foundation invest in both telescopes and their exoplanet instrumentation to provide all-sky access to the U.S. community.

And for another variety of ground-based observing, the study called for the funding of a project to substantially increase the precision of instruments that find and measure exoplanets using the detected “wobble” of the host star.  But stars are active with or without a nearby exoplanet, and so it has been difficult to achieve the precision that astronomers using this “radial velocity” technique need to find and characterize smaller exoplanets.
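The precision problem is easy to quantify with the standard radial-velocity semi-amplitude formula. A sketch for the hardest textbook case — an Earth-mass planet on a one-year, circular, edge-on orbit around a Sun-like star, neglecting the planet’s mass next to the star’s:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_STAR = 1.989e30    # one solar mass, kg
M_PLANET = 5.972e24  # one Earth mass, kg
PERIOD = 3.156e7     # one year, seconds

# K = (2*pi*G / P)^(1/3) * m_planet / M_star^(2/3)
# (circular orbit, sin i = 1, m_planet << M_star)
K = (2 * math.pi * G / PERIOD) ** (1 / 3) * M_PLANET / M_STAR ** (2 / 3)
print(f"stellar wobble: ~{K * 100:.0f} cm/s")
```

The answer is on the order of 10 centimeters per second — a stellar motion far smaller than the “jitter” an active star produces on its own, which is why finding true Earth analogs by this method demands both better instruments and better ways to model stellar activity.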

Several smaller efforts to increase this precision are under way in the U.S., and the European Southern Observatory has a much larger project in development.

Additionally, the report recommends that the administrators of the James Webb Space Telescope give significant amounts of observing time to exoplanet study, especially early in its time aloft (now scheduled to begin in 2021).  The atmospheric data that JWST can potentially collect could be used in conjunction with results coming from other telescopes, and to further the study of exoplanet targets that are already promising based on existing theories and findings.

 

Construction has begun on the Giant Magellan Telescope at the Carnegie Institution’s Las Campanas Observatory in Chile. This artist’s rendering shows what the 24.5 meter (80 foot) segmented mirror and observatory will look like when completed, estimated to be in 2024. (Mason Media Inc.)

 

While the NAS report gives a lot of attention to instruments and ways to use them, it also focuses as never before on astrobiology — the search for life beyond Earth.

Much work has been done on how to determine whether life exists on a distant planet through modeling and theorizing about biosignatures.  The report encourages scientists to expand that work and embraces it as a central aspect of exoplanet science.

The study also argues that interdisciplinary science — bringing together researchers from many disciplines — is the necessary way forward.  It highlights the role of the Nexus for Exoplanet System Science (NExSS), a NASA initiative that since 2015 has brought together a broad though limited number of science teams from institutions across the country to learn about each other’s work and collaborate whenever possible.

The initiative itself has not required much funding, instead bringing in teams that were already supported by other grants.  However, that may be changing. One of the study co-chairs, David Charbonneau of Harvard University, said after the release of the study that the “promise of NExSS is tremendous…We really want that idea to grow and have a huge impact.”

The NAS study itself recommends that “building on the NExSS model, NASA should support a cross-divisional exoplanet research coordination network that includes additional membership opportunities via dedicated proposal calls for interdisciplinary research.”

The initiative, I’m proud to say, sponsors this interdisciplinary column in addition to all that interdisciplinary science.


Large Reservoir of Liquid Water Found Deep Below the Surface of Mars

Artist impression of the Mars Express spacecraft probing the southern hemisphere of Mars, superimposed on a radar cross section of the southern polar layered deposits. The leftmost white line is the radar echo from the Martian surface, while the light blue spots are highlighted radar echoes along the bottom of the ice.  Those highlighted areas measure very high reflectivity, interpreted as being caused by the presence of water. (ESA, INAF. Graphic rendering by Davide Coero Borga )

Far beneath the frigid surface of the South Pole of Mars is probably the last place where you might expect the first large body of Martian liquid water to be found.  It’s -170 F on the surface, there are no known geothermal sources that could warm the subterranean ice to make a meltwater lake, and the liquid water is calculated to be more than a mile below the surface.

Yet signs of that liquid water are what a team of Italian scientists detected — a finding that they say strongly suggests that there are other underground lakes and streams below the surface of Mars.  In a Science journal article released today, the scientists described the subterranean lake they found as being about 20 kilometers in diameter.

The detection adds significantly to the long-studied and long-debated question of how much surface water was once on Mars, a subject that has major implications for the question of whether life ever existed on the planet.

Finding the subterranean lake points not only to a wetter early Mars, said co-author Enrico Flamini of the Italian space agency, but also to a Mars that had a water cycle that collected and delivered the liquid water.  That would mean the presence of clouds, rain, evaporation, rivers, lakes, and water seeping through surface cracks to pool underground.

Scientists have found many fossil waterways on Mars, minerals that can only be formed in the presence of water, and what might be the site of an ancient ocean.

But in terms of liquid water now on the planet, the record is thin.  Drops of water collected on the leg of NASA’s Phoenix Lander after it touched down in 2008, and what some have described as briny water appears to be flowing down some steep slopes in summertime.  Called recurrent slope lineae or RSLs, they appear at numerous locations when the temperatures rise and disappear when they drop.

This lake is different, however, and its detection is a major step forward in understanding the history of Mars.

Color photo mosaic of a portion of Planum Australe on Mars.  The subsurface reflective echo power is color coded and deep blue corresponds to the strongest reflections, which are interpreted as being caused by the presence of water. (USGS Astrogeology Science Center, Arizona State University, INAF)

The discovery was made by analyzing echoes captured by the radar instrument on the European Space Agency’s Mars Express, a satellite that has been orbiting the planet since 2003.  The data for this discovery was collected from observations made between 2012 and 2015.

 

A schematic of how scientists used radar to find what they interpret to be liquid water beneath the surface of Mars. (ESA)

Antarctic researchers have long used radar on aircraft to search for lakes beneath the thick glaciers and ice layers,  and have found several hundred.  The largest is Lake Vostok, which is the sixth largest lake on Earth in terms of volume of water.  And it is two miles below the coldest spot on Earth.

So looking for a liquid lake below the southern pole of Mars wasn’t so peculiar after all.  In fact, lead author Roberto Orosei of the Institute of Radioastronomy of Bologna, Italy said that it was the ability to detect subsurface water beneath the ice of Antarctica and Greenland that helped inspire the team to look at Mars.

There are a number of ways to keep water liquid in the deep subsurface even when it is surrounded by ice.  As described by the Italian team and an accompanying Science Perspective article by Anja Diez of the Norwegian Polar Institute, the enormous pressure of the ice lowers the freezing point of water substantially.

Added to that pressure on Mars is the known presence of many salts, which the authors propose mix with the water to form a brine that lowers the freezing point further.
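To put rough numbers on this, here is a short back-of-the-envelope sketch in Python. The densities and coefficients are assumed, textbook-style values, not figures from the paper, but they show why pressure alone is not enough and the salts matter so much:

```python
# Illustrative check: how much do ice overburden pressure and salts
# each lower the melting point at the base of the Martian polar cap?
# All numbers here are assumptions for illustration, not from the paper.

RHO_ICE = 1200.0   # kg/m^3, assumed density of dusty polar ice
G_MARS = 3.71      # m/s^2, Martian surface gravity
DEPTH = 1500.0     # m, roughly "more than a mile" below the surface

# Overburden pressure at the base of the ice
pressure_mpa = RHO_ICE * G_MARS * DEPTH / 1e6   # ~6.7 MPa

# The melting point of water ice drops by roughly 0.074 K per MPa
# (the Clausius-Clapeyron slope of the ice/water transition).
dT_pressure = -0.074 * pressure_mpa             # ~ -0.5 K

print(f"Overburden pressure: {pressure_mpa:.1f} MPa")
print(f"Melting-point drop from pressure alone: {dT_pressure:.1f} K")
# Pressure alone buys only about half a degree -- far short of what is
# needed at the Martian poles. Perchlorate brines, by contrast, can stay
# liquid down to roughly 200 K (about -75 C), which is why the authors
# invoke dissolved salts to keep the lake from freezing.
```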

So the conditions are present for additional lakes and streams on Mars.  And according to Flamini, solar system exploration manager for the Italian space agency, the team is confident there are more, some of them larger than the one detected.  Finding them, however, is a difficult process and may be beyond the capabilities of the radar equipment now orbiting Mars.

 

Subsurface lakes and rivers in Antarctica. Now at least one similar lake has been found under the southern polar region of Mars. (NASA/JPL)

The view that subsurface water is present on Mars is hardly new.  Stephen Clifford, for many years a staff scientist at the Lunar and Planetary Institute, even wrote in 1987 that there could be liquid water at the base of the Martian poles due to the kind of high pressure environments he had studied in Greenland and Antarctica.

So you can imagine how gratifying it might be to learn, as he put it, “of some evidence that shows that early theoretical work has some actual connection to reality.”

He considers the new findings to be “persuasive, but not definitive” — needing confirmation with other instruments.

Clifford’s wait has been long, indeed.  Many observations by teams using myriad instruments over the years did not produce the results of the Italian team.

Their discovery of liquid water is based on receiving particularly strong radar echoes from the base of the southern polar ice — echoes consistent with the higher radar reflectivity of water (as opposed to ice or rock).
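The physics behind those strong echoes can be sketched with the normal-incidence Fresnel reflectivity formula. The permittivity values below are typical textbook figures, not numbers from the team’s analysis:

```python
import math

# Normal-incidence power reflectivity at an interface between two media
# depends on the contrast in their relative permittivities -- a rough
# sketch of why a water layer under ice stands out so sharply in radar.

def reflectivity(eps1: float, eps2: float) -> float:
    """Normal-incidence power reflectivity at an eps1 -> eps2 interface."""
    n1, n2 = math.sqrt(eps1), math.sqrt(eps2)
    return ((n1 - n2) / (n1 + n2)) ** 2

EPS_ICE = 3.1     # water ice (assumed typical value)
EPS_ROCK = 5.0    # dry basaltic rock (assumed typical value)
EPS_WATER = 80.0  # liquid water; brines are comparable or higher

r_rock = reflectivity(EPS_ICE, EPS_ROCK)    # ice sitting on rock
r_water = reflectivity(EPS_ICE, EPS_WATER)  # ice sitting on liquid water

print(f"ice/rock reflectivity:  {r_rock:.3f}")
print(f"ice/water reflectivity: {r_water:.3f}")
# The ice/water echo comes out tens of times stronger than ice/rock --
# the kind of "particularly strong radar echo" the team looked for.
```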

After analyzing the data in some novel ways and going through the many possible explanations other than the presence of a lake, Orosei said that none fit the results they had.  The explanation, then, was clear:  “We have to conclude there is liquid water on Mars.”

The depth of the lake — the distance from top to bottom — was impossible to measure, though the team concluded it was at least one meter and perhaps in the tens of meters.
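Treating the lake as a flat disk about 20 kilometers across, those depth bounds translate into a rough volume range. This is an illustrative back-of-the-envelope calculation, not a figure from the paper:

```python
import math

# Rough volume bounds for the detected lake, modeled as a flat cylinder
# ~20 km in diameter with depth bracketed between one meter and a few
# tens of meters, as described in the reporting above.

RADIUS_M = 10_000.0  # half of the ~20 km diameter

def lake_volume_km3(depth_m: float) -> float:
    """Volume of a flat cylindrical lake, converted from m^3 to km^3."""
    return math.pi * RADIUS_M**2 * depth_m / 1e9

low = lake_volume_km3(1.0)    # ~0.3 km^3 at the one-meter minimum
high = lake_volume_km3(30.0)  # ~9.4 km^3 at a few tens of meters

print(f"Volume range: {low:.1f} to {high:.1f} km^3")
# Even at the high end this is a small fraction of Lake Vostok's
# roughly 5,000+ km^3 -- but on Mars, any standing liquid water is news.
```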

Might the lake be habitable?  Orosei said that because of the high salt levels “this is not a very pleasant environment for life.”

But who knows?  As he pointed out, Lake Vostok and other subglacial Antarctic lakes are known to be home to single-cell organisms that not only survive in their very salty world, but use the salt as part of their essential metabolism.

 

 


A New Frontier for Exoplanet Hunting

The spectrum from the newly-assembled EXtreme PREcision Spectrometer (EXPRES)  shines on Yale astronomy professor Debra Fischer, who is principal investigator of the project. The stated goal of EXPRES is to find many Earth-size planets via the radial velocity method — something that has never been done. (Ryan Blackman/Yale)

The first exoplanets were all found using the radial velocity method of measuring the “wobble” of a star — movement caused by the gravitational pull of an orbiting planet.

Radial velocity has been great for detecting large exoplanets relatively close to our solar system, for assessing their mass and for finding out how long it takes for the planet to orbit its host star.

But so far the technique has not been able to identify and confirm many Earth-sized planets, a primary goal of much planet hunting.  The wobble caused by the presence of a planet that size has been too faint to be detected by current radial velocity instruments and techniques.

However, a new generation of instruments is coming on line with the goal of bringing the radial velocity technique into the small planet search.  To do that, the new instruments, together with their telescopes, must be able to detect a stellar wobble of 10 to 20 centimeters per second.  That’s quite an improvement on the current detection limit of about one meter per second.
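The scale of that challenge can be checked with the standard radial-velocity formula for a circular, edge-on orbit. This is a quick sketch with textbook constants, not part of any instrument’s pipeline:

```python
import math

# The reflex "wobble" a planet induces on its star, for a circular,
# edge-on orbit, neglecting the planet's mass next to the star's:
#   K = (2*pi*G / P)**(1/3) * m_planet / M_star**(2/3)

G = 6.674e-11         # m^3 kg^-1 s^-2, gravitational constant
M_SUN = 1.989e30      # kg
M_EARTH = 5.972e24    # kg
M_JUPITER = 1.898e27  # kg
YEAR = 3.156e7        # seconds in one Earth year

def rv_semi_amplitude(m_planet: float, period_s: float,
                      m_star: float = M_SUN) -> float:
    """Stellar reflex velocity in m/s for a circular, edge-on orbit."""
    return (2 * math.pi * G / period_s) ** (1 / 3) * m_planet / m_star ** (2 / 3)

k_jupiter = rv_semi_amplitude(M_JUPITER, 11.86 * YEAR)  # ~12 m/s
k_earth = rv_semi_amplitude(M_EARTH, 1.0 * YEAR)        # ~9 cm/s

print(f"Jupiter's tug on the Sun: {k_jupiter:.1f} m/s")
print(f"Earth's tug on the Sun:   {k_earth * 100:.0f} cm/s")
# Jupiter-class wobbles sit comfortably above today's ~1 m/s floor;
# an Earth analog demands the ~10 cm/s precision these instruments target.
```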

At least three of these ultra-high-precision spectrographs (sometimes called spectrometers) are now being developed or deployed.  The European Southern Observatory’s ESPRESSO instrument has begun work in Chile; Pennsylvania State University’s NEID spectrograph (with NASA funding) is in development for installation at the Kitt Peak National Observatory in Arizona; and the just-deployed EXPRES spectrograph put together by a team led by Yale University astronomers (with National Science Foundation funding) is in place at the Lowell Observatory outside of Flagstaff, Arizona.

The principal investigator of EXPRES, Debra Fischer, attended the recent University of Cambridge Exoplanets2 conference with some of her team, and there I had the opportunity to talk with them. We discussed the decade-long history of the instrument, how and why Fischer thinks it can break that 1-meter-per-second barrier, and what it took to get it attached and working.

 

This animation shows how astronomers use very precise spectrographs to find exoplanets. As the planet orbits, its gravitational pull causes the parent star to move back and forth. This tiny radial motion shifts the observed spectrum of the star by a correspondingly small amount because of the Doppler shift. With super-sensitive spectrographs the shifts can be measured and used to infer details of a planet’s mass and orbit. (ESO/L. Calçada)

One of the earliest and most difficult obstacles to the development of EXPRES, Fischer told me, was that many in the astronomy community did not believe it could work.

Their view is that precision below that one meter per second of host star movement cannot be measured accurately.  Stars have flares, sunspots and a generally constant churning, and many argue that the turbulent nature of stars creates too much “noise” for a precise measurement below that one-meter-per-second level.

Yet European scientists were moving ahead with their ESPRESSO ultra high precision instrument aiming for that 10-centimeter-per-second mark, and they had a proven record of accomplishing what they set out to do with spectrographs.

In addition to the definite competition going on, Fischer also felt that radial velocity astronomers needed to make that leap to measuring small planets “to stay in the game” over the long haul.

She arrived at Yale in 2009 and led an effort to build a spectrograph so stable and precise that it could find an Earth-like planet.  To make clear that goal, the instrument is at the center of a project called “100 Earths.”

Building on experience gained from developing two earlier spectrographs, Fischer and colleagues began the difficult and complicated process of getting backers for EXPRES, of finding a telescope observatory that would house it (The Discovery Channel Telescope at Lowell) and in the end adapting the instrument to the telescope.

And now comes the actual hard part:  finding those Earth-like planets.

As Fischer described it:  “We know from [the Kepler telescope mission] that most stars have small rocky planets orbiting them.  But Kepler looked at stars very far away, and we’ll be looking at stars much, much closer to us.”

Nonetheless, those small planets will still be extremely difficult to detect due to all that activity on the host suns.

 

EXPRES in its vacuum-sealed chamber at the Lowell Observatory, where it will help detect Earth-sized planets in neighboring solar systems. (Ryan Blackman/Yale)

 

 

The 4.3 meter Discovery Channel Telescope in the Lowell Observatory in Arizona.  The photons collected by the telescope are delivered via optical fiber to the EXPRES instrument. (Boston University)

Spectrographs such as EXPRES are instruments astronomers use to study light emitted by planets, stars, and galaxies.

They are connected to either a ground-based or orbital telescope, and they stretch out or split a beam of light into a spectrum of frequencies.  That spectrum is then analyzed to determine an object’s speed, direction, chemical composition, or mass.  With planets, the work involves determining (via the Doppler shift seen in the spectrum) whether and how much a sun is moving toward and away from Earth due to the pull of a planet.
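A quick calculation shows just how small the spectral shift is that these instruments must resolve. The 550 nm line is an arbitrary illustrative choice:

```python
# A non-relativistic Doppler shift moves a spectral line by
#   d_lambda = lambda * v / c
# Numbers below are illustrative, chosen to match the precision
# targets discussed in the text.

C = 2.998e8        # m/s, speed of light
LAMBDA_NM = 550.0  # a visible-band spectral line, in nanometers

def doppler_shift_nm(velocity_m_s: float) -> float:
    """Wavelength shift of the line, in nanometers, for a given stellar velocity."""
    return LAMBDA_NM * velocity_m_s / C

shift_1ms = doppler_shift_nm(1.0)   # today's ~1 m/s detection floor
shift_10cm = doppler_shift_nm(0.1)  # the ~10 cm/s goal

print(f"1 m/s wobble:   {shift_1ms:.2e} nm")
print(f"10 cm/s wobble: {shift_10cm:.2e} nm")
# A 10 cm/s wobble moves a 550 nm line by less than two ten-millionths
# of a nanometer -- a tiny fraction of a single detector pixel, which is
# why extreme stability and resolving power are the whole game.
```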

As Fischer and EXPRES postdoctoral fellow John Brewer explained it, the signal (noise) coming from the turbulence of the star is detectably different from the signal made by the wobble of a star due to the presence of an orbiting planet.

While these differences — imprinted in the spectrum captured by the spectrograph — have been known for some time, current spectrographs haven’t had sufficient resolving power to actually detect the difference.

If all works as planned for the EXPRES, ESPRESSO and NEID spectrographs, they will have that necessary resolving power and so can, in effect, filter out the noise from the star and identify what can only come from a planet-caused wobble.  If they succeed, they will provide a major new pathway for astronomers to search for Earth-sized worlds.

“This is my dream machine, the one I always wanted to build,” Fischer said. “I had a belief that if we went to higher resolution, we could disentangle (the stellar noise from the planet-caused wobble).

“I could still be wrong, but I definitely think that trying was the right choice to make.”

This image shows spectral data from the first light last December of the ESPRESSO instrument on ESO’s Very Large Telescope in Chile. The light from a star has been dispersed into its component colors. This view has been colorised to indicate how the wavelengths change across the image, but these are not exactly the colors that would be seen visually. (ESO/ESPRESSO)

While Fischer and others have very high hopes for EXPRES, it is not the sort of big-ticket project that is common in astronomy.  Instead, it was developed and built primarily with a $6 million grant from the National Science Foundation.

It was completed on schedule by the Yale team, though the actual delivering of EXPRES to Arizona and connecting it to the telescope turned out to be a combination of hair-raising and edifying.

Twice, she said, she drove from New Haven to Flagstaff with parts of the instrument; each trip in a Penske rental truck and with her son Ben helping out.

And then, when the installation was in progress late last year, Fischer and her team learned that funds for the scientists and engineers working on it had come to an end.

Francesco Pepe of the University of Geneva. He is the principal scientist for the ESPRESSO instrument and gave essential aid to the EXPRES team when they needed it most.

She was desperate, and sent a long-shot email to Francesco Pepe of the University of Geneva, the lead scientist and wizard builder of several European spectrographs, including ESPRESSO. In theory, he and his instrument — which went into operation late last year at the ESO Very Large Telescope in Chile — will be competing with EXPRES for discoveries and acknowledgement.

Nonetheless, Pepe heard Fischer out and understood the predicament she was in.  ESPRESSO had been installed and so he was able to contact an associate who freed up two instrumentation specialists who flew to Flagstaff to finish the work.  It was, Fischer said, an act of collegial generosity and scientific largesse that she will never forget.

Fischer is at the Lowell Observatory now, using the Arizona monsoon as a time to clean up many details before the team returns to full-time observing.  She writes about her days in an EXPRES blog.  Earlier, in March, after the instrumentation had been completed and observing had commenced, she wrote this:

“Years of work went into EXPRES and as I look at this instrument, I am surprised that I ever had the audacity to start this project. The moment of truth starts now. It will take us a few more months of collecting and analyzing data to know if we made the right design decisions and I feel both humbled and hopeful. I’m proud of the fact that our design decisions were driven by evidence gleaned from many years of experience. But did I forget anything?”
