The Kepler Space Telescope Mission Is Ending, But Its Legacy Will Keep Growing

An illustration of the Kepler Space Telescope, which is on its very last legs.  As of October 2018, the planet-hunting spacecraft has been in space for nearly a decade. (NASA via AP)

 

The Kepler Space Telescope is dead.  Long live the Kepler.

NASA officials announced on Tuesday that the pioneering exoplanet survey telescope — which had led to the identification of almost 2,700 exoplanets — had finally reached its end, having essentially run out of fuel.  The end came after nine years of observing, after a malfunctioning steering system required a complex fix and a change of plans, and after the hydrazine fuel levels reached empty.

While the sheer number of exoplanets discovered is impressive, the telescope did substantially more:  it proved once and for all that the galaxy is filled with planets orbiting distant stars.  Before Kepler this was speculation; thanks to Kepler's run, it is now firmly established.

It also provided data for thousands of papers exploring the nature and characteristics of exoplanets.  And that’s why Kepler will indeed live long in the world of space science.

“As NASA’s first planet-hunting mission, Kepler has wildly exceeded all our expectations and paved the way for our exploration and search for life in the solar system and beyond,” said Thomas Zurbuchen, associate administrator of NASA’s Science Mission Directorate in Washington.

“Not only did it show us how many planets could be out there, it sparked an entirely new and robust field of research that has taken the science community by storm. Its discoveries have shed a new light on our place in the universe, and illuminated the tantalizing mysteries and possibilities among the stars.”

 

 


The Kepler Space Telescope was focused on hunting for planets in this patch of the Milky Way. After two of its four spinning reaction wheels failed, it could no longer remain steady enough to stare at those distant stars, but it was reconfigured to look elsewhere and at a different angle for the K2 mission. (Carter Roberts/NASA)

 

Kepler was the unlikely brainchild of William Borucki, its founding principal investigator, who is now retired from NASA’s Ames Research Center in California’s Silicon Valley.

When he began thinking of designing and proposing a space telescope that could potentially tell us how common distant exoplanets were — and especially smaller terrestrial exoplanets like Earth — the science of extrasolar planets was at a very different stage.

William Borucki, originally the main champion for the Kepler idea and later the principal investigator of the mission. His work at NASA went back to the Apollo days. (NASA)

“When we started conceiving this mission 35 years ago we didn’t know of a single planet outside our solar system,” Borucki said.  “Now that we know planets are everywhere, Kepler has set us on a new course that’s full of promise for future generations to explore our galaxy.”

The space telescope was launched in 2009.  While Kepler did not find the first exoplanets — that required the work of astronomers using a different technique of observing based on the “wobble” of stars caused by orbiting planets — it did change the exoplanet paradigm substantially.

Not only did it prove that exoplanets are common, it found that planets outnumber stars in our galaxy (which has hundreds of billions of stars).

In addition, it found that small, terrestrial-size planets are common as well, with some 20 to 50 percent of stars likely to have planets of that size and type.  And what a menagerie of planets it found out there.

Astrophysicist Natalie Batalha was the Kepler project and mission scientist for a decade. She left NASA recently for the University of California at Santa Cruz “to carry on the Kepler legacy” by creating an interdisciplinary center for the study of planetary habitability.

Among the greatest surprises:  The Kepler mission provided data showing that the most common size of planet in the galaxy falls somewhere between Earth and Neptune, a type of planet that isn’t present in our solar system.

It found solar systems of all sizes as well, including some with many planets (as many as eight) orbiting close to their host star.

The discovery of these compact systems, generally orbiting a red dwarf star, raised questions about how solar systems form: Are these planets “born” close to their parent star, or do they form farther out and migrate in?

So far, more than 2,500 peer-reviewed papers have been published using Kepler data, with substantial amounts of that data still unmined.

Natalie Batalha was the project and mission scientist for Kepler for much of its run, and I asked her about its legacy.

“When I think of Kepler’s influence across all of astrophysics, I’m amazed at what such a simple experiment accomplished,” she wrote in an email. “You’d be hard-pressed to come up with a more boring mandate — to unblinkingly measure the brightnesses of the same stars for years on end. No beautiful images. No fancy spectra. No landscapes. Just dots in a scatter plot.

“And yet time-domain astronomy exploded. We’d never looked at the Universe quite this way before. We saw lava worlds and water worlds and disintegrating planets and heart-beat stars and supernova shock waves and the spinning cores of stars and planets the age of the galaxy itself… all from those dots.”

 

The Kepler-62 system is but one of many solar systems detected by the space telescope. The planets within the green discs are in the habitable zones of their stars — where water could be liquid at times. (NASA)

 

While Kepler provided remarkable answers to questions about the overall planetary makeup of our galaxy, it did not identify smaller planets that can be directly imaged (the evolving gold standard for characterizing exoplanets).  The 150,000 stars that the telescope was observing were very distant, in the range of a few hundred to a few thousand light-years away. One light-year is about 6 trillion (6,000,000,000,000) miles.

Nonetheless, Kepler was able to detect the presence of a handful of Earth-sized planets in the habitable zones of their stars.  The Kepler-62 system held one of them, and it is 1,200 light-years away.  In contrast, the four Earth-sized planets in the habitable zone of the much-studied Trappist-1 system are 39 light-years away.

Kepler made its observations using the transit technique, which looks for tiny dips in the amount of light coming from a star caused by a planet passing in front of it.  While the inference that exoplanets are ubiquitous came from Kepler results, the telescope was actually observing only a small patch of the sky.  It has been estimated that it would take around 400 space telescopes like Kepler to cover the whole sky.

What’s more, only planets whose orbits are seen edge-on from Earth can be detected via the transit method, and that rules out a vast number of exoplanets.
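
To make those numbers concrete: the transit method rests on two simple relations, the depth of the dip and the odds of the alignment.  Below is a minimal sketch using rough Sun and Earth values; the constants are illustrative assumptions, not Kepler pipeline figures.

```python
# Two textbook transit-method relations, with rough Sun/Earth values.
# Illustrative only -- not Kepler pipeline code.

R_SUN = 6.957e8    # solar radius, meters
R_EARTH = 6.371e6  # Earth radius, meters
AU = 1.496e11      # Earth-sun distance, meters

# Transit depth: the fractional dip in starlight equals the ratio of
# the planet's disk area to the star's disk area, (Rp / Rs)**2.
depth = (R_EARTH / R_SUN) ** 2
print(f"Earth-size transit depth: {depth:.2e} (~{depth * 1e6:.0f} parts per million)")

# Geometric transit probability: a randomly oriented orbit happens to
# cross our line of sight with probability of roughly Rs / a.
probability = R_SUN / AU
print(f"Chance an Earth-like orbit transits as seen from here: {probability:.3%}")
```

An Earth-size planet dims a sun-like star by less than 0.01 percent, and a randomly oriented Earth-like orbit lines up for us less than once in 200 tries, which is why Kepler had to stare at 150,000 stars at once.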

The bulk of the stars selected for close Kepler observation were more or less sun-like, but a sampling of other stars occurred as well. One of the most important factors was brightness: detecting minuscule changes in brightness caused by a transiting planet is impossible if the star is too dim.

 

The artist’s concept depicts Kepler-186f, the first validated Earth-size planet to orbit a distant star in the habitable zone. (NASA Ames/SETI Institute/JPL-Caltech)

 

Four years into the mission, after the primary mission objectives had been met, mechanical failures temporarily halted observations. The mission team was able to devise a fix, switching the spacecraft’s field of view roughly every three months. This enabled an extended mission for the spacecraft, dubbed K2, which lasted as long as the first mission and bumped Kepler’s count of surveyed stars up to more than 500,000.

But it was inevitable that the mission would come to an end sooner rather than later because of that dwindling fuel supply, needed to keep the telescope properly pointed.

Kepler cannot be refueled because NASA decided to place the telescope in an orbit around the sun that is well beyond the influence of the Earth and moon — to simplify operations and ensure an extremely quiet, stable environment for scientific observations.  So Kepler was beyond the reach of any refueling vessel.  The Kepler team compensated by flying considerably more fuel than was necessary to meet the mission objectives.

The video below explains what will happen to the Kepler spacecraft once it is decommissioned.  A NASA release explains that the final commands “will be to turn off the spacecraft transmitters and disable the onboard fault protection that would turn them back on. While the spacecraft is a long way from Earth and requires enormous antennas to communicate with it, it is good practice to turn off transmitters when they are no longer being used, and not pollute the airwaves with potential interference.”

 

 

And so Kepler will actually continue orbiting for many decades, just as its legacy will continue long after operations cease.

Kepler’s follow-on exoplanet surveyor — the Transiting Exoplanet Survey Satellite or TESS — was launched this year and has begun sending back data.  Its primary mission objective is to survey the brightest stars near the Earth for transiting exoplanets. The TESS satellite uses an array of wide-field cameras to survey some 85% of the sky, and is planned to last for two years.


Technosignatures and the Search for Extraterrestrial Intelligence

A rendering of a potential Dyson sphere, named after Freeman A. Dyson. As proposed by the physicist and astronomer decades ago, such structures would collect solar energy on a solar-system-wide scale for highly advanced civilizations. (SentientDevelopments.com)

The word “SETI” pretty much brings to mind the search for radio signals coming from distant planets, the movie “Contact,” Jill Tarter, Frank Drake and perhaps the SETI Institute, where the effort lives and breathes.

But there was a time when SETI — the Search for Extraterrestrial Intelligence — was a significantly broader concept, one that brought in other ways to look for intelligent life beyond Earth.

In the late 1950s and early 1960s — a time of great interest in UFOs, flying saucers and the like — scientists not only came up with the idea of searching for distant intelligent life via unnatural radio signals, but also by looking for signs of unexpectedly elevated heat signatures and for optical anomalies in the night sky.

The history of this search has seen many sharp turns, with radio SETI at one time embraced by NASA, subsequently de-funded because of congressional opposition, and then developed into a privately and philanthropically funded project of rigor and breadth at the SETI Institute.  The other modes of SETI went pretty much underground and SETI became synonymous with radio searches for ET life.

But this history may be about to take another sharp turn as some in Congress and NASA have become increasingly interested in what are now called “technosignatures,” potentially detectable signatures and signals of the presence of distant advanced civilizations.  Technosignatures are a subset of the larger and far more mature search for biosignatures — evidence of microbial or other primitive life that might exist on some of the billions of exoplanets we now know exist.

And as a sign of this renewed interest, a technosignatures conference was scheduled by NASA at the request of Congress (and especially retiring Republican Rep. Lamar Smith of Texas).  The conference took place in Houston late last month, and it was most interesting in terms of the new and increasingly sophisticated ideas being explored by scientists involved with broad-based SETI.

“There has been no SETI conference this big and this good in a very long time,” said Jason Wright, an astrophysicist and professor at Pennsylvania State University and chair of the conference’s science organizing committee.  “We’re trying to rebuild the larger SETI community, and this was a good start.”

 

At this point, the search for technosignatures is often likened to looking for a needle in a haystack. But what scientists are trying to do is define their haystack, determine its essential characteristics, and learn how best to explore it. (Wiki Commons)

 

During the three-day meeting in Houston, scientists and interested private and philanthropic representatives heard talks that ranged from the trials and possibilities of traditional radio SETI to quasi-philosophical discussions about what potentially detectable planetary transformations and by-products might be signs of an advanced civilization. (An agenda and videos of the talks are here.)

The subjects ranged from surveying the sky for potential millisecond infrared emissions from distant planets that could be purposeful signals, to how the presence of certain unnatural, pollutant chemicals in an exoplanet atmosphere could be a sign of civilization.  From the search for thermal signatures coming from megacities or other by-products of technological activity, to the possible presence of “megastructures” built by highly evolved beings to collect a star’s energy.

Michael New is Deputy Associate Administrator for Research within NASA’s Science Mission Directorate. He was initially trained in chemical physics. (NASA)

All but the near-infrared SETI are for the distant future — or perhaps are on the science fiction side — but astronomy and the search for distant life do tend to move forward slowly.  Theory and inference most often come well before observation and detection.

So thinking about the basic questions about what scientists might be looking for, Wright said, is an essential part of the process.

Indeed, it is precisely what Michael New, Deputy Associate Administrator for Research within NASA’s Science Mission Directorate, told the conference. 

He said that he, NASA and Congress wanted the broad sweep of ideas and research out there regarding technosignatures, from the current state of the field to potential near-term findings, and known limitations and possibilities.

“The time is really ripe scientifically for revisiting the ideas of technosignatures and how to search for them,” he said.

He offered the promise of NASA help (admittedly depending to some extent on what Congress and the administration decide) for research into new surveys, new technologies, data-mining algorithms, theories and modeling to advance the hunt for technosignatures.

 

Crew members aboard the International Space Station took this nighttime photograph of much of the Atlantic coast of the United States. The ability to detect the heat and light from this kind of activity on distant exoplanets does not exist today, but some day it might, and it could potentially help discover an advanced extraterrestrial civilization. (NASA)

 

Among the several dozen scientists who discussed potential signals to search for were the astronomer Jill Tarter, former director of the Center for SETI Research, Planetary Science Institute astrobiologist David Grinspoon and University of Rochester astrophysicist Adam Frank.  They all looked at the big picture: what artifacts in atmospheres, on surfaces and perhaps in space advanced civilizations would likely produce by dint of their being “advanced.”

All spoke of the harvesting of energy to perform work as a defining feature of a technological planet, with that “work” describing transportation, construction, manufacturing and more.

Beings that have reached the level of what Frank calls an exo-civilization produce heat, pollutants, and changes to their planets and surroundings in the process of doing that work.  And so a detection of highly unusual atmospheric, thermal, surface or orbital conditions could be a signal.

One example mentioned by several speakers is the family of chemicals called chlorofluorocarbons (CFCs), which are used as commercial refrigerants, propellants and solvents.

Astronomer Jill Tarter is an iconic figure in the SETI world and led the SETI Institute for 30 years. (AFP)

These CFCs are a hazardous and unnatural pollutant on Earth because they destroy the ozone layer, and they could be doing something similar on an exoplanet.  And as described at the conference, the James Webb Space Telescope — once it’s launched and working — could most likely detect such an atmospheric compound if it is present in high concentrations and the project is given sufficient telescope time.

Another single finding that could be revolutionary, described by Tarter, is the radioactive isotope tritium, which is a by-product of the nuclear fusion process.  It has a short half-life, and so any distant discovery would point to a recent use of nuclear energy (as long as it is not associated with a recent supernova event, which can also produce tritium).
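
The “recent use” inference follows from simple exponential decay.  Here is a small sketch, assuming tritium’s roughly 12.3-year half-life:

```python
# How quickly a tritium signature fades. The ~12.3-year half-life is
# the standard figure for tritium; illustrative arithmetic only.

HALF_LIFE_YEARS = 12.3

def fraction_remaining(years: float) -> float:
    """Fraction of the original tritium left after `years` years."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

for years in (12.3, 50, 100, 250):
    print(f"after {years:5.1f} years: {fraction_remaining(years):.2e}")
```

After a couple of centuries essentially nothing remains, so any detection (absent a supernova) would point to quite recent nuclear activity.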

But there were many other, less precise ideas put forward.

Glints on the surface of planets could be the product of technology, as might be weather on an exoplanet that has been extremely well stabilized, modified planetary orbits, and chemical disequilibria in an atmosphere arising from the by-products of life and work.  (These disequilibria are a well-established feature of biosignature research, but Frank presented the idea of a technosphere, which would process energy and create by-products at a greater level than its supporting biosphere.)

Another unlikely but most interesting example of a possible technosignature, put forward by Tarter and Grinspoon, involved the seven planets of the Trappist-1 solar system, all tidally locked and so lit on only one side.  The planets could potentially be found to be remarkably similar in their basic structure, alignment and dynamics, which, as Tarter suggested, could be a sign of highly advanced solar engineering.

 

An artist’s rendering of an imagined Trappist-1 solar system that has been terraformed to make the planets similar and habitable.  The system is one of the closest found to our own — about 40 light-years away.

 

Grinspoon seconded that notion about Trappist-1, but in a somewhat different context.

He has worked a great deal on the question of today’s anthropocene era — when humans actively change the planet — and he expanded his thinking about Earth out into the galaxy.

Grinspoon said that he had just come back from Japan, where he had visited Hiroshima and its atomic bomb sites, and came away with doubts that we are the “intelligent” civilization we often describe ourselves as in SETI terms.  A civilization that may well self-destruct — a fate he sees as potentially common throughout the cosmos — might be considered “proto-intelligent,” but not smart enough to keep itself going over a long time.

Projecting that into the cosmos, Grinspoon argued that there may well be many such doomed civilizations, and then perhaps a far smaller number of those civilizations that make it through the biological-technological bottleneck that we seem to be facing in the centuries ahead.

These civilizations, which he calls semi-immortal, would develop inherently sustainable ways of continuing, including modifying major climate cycles, developing highly sophisticated radars and other tools for mitigating risks, terraforming nearby planets, and even finding ways to adapt their planet as its place in the habitable zone becomes threatened by the brightening or dimming of its host star.

The trick to finding such truly evolved civilizations, he said, would be to look for technosignatures that reflect anomalous stability rather than rampant growth. In the larger sense, these civilizations would have integrated themselves into the functioning of the planet, just as oxygen, and then first primitive and later complex life, integrated themselves into the essential systems of Earth.

And returning to the technological civilizations that don’t survive: they could have produced physical artifacts that still permeate the galaxy.

 

MeerKAT, originally the Karoo Array Telescope, is a radio telescope consisting of 64 antennas now being tested and verified in the Northern Cape of South Africa. When fully functional it will be the largest and most sensitive radio telescope in the southern hemisphere until the Square Kilometre Array is completed in approximately 2024. (South African Radio Astronomy Observatory)

 

This is exciting — the next-phase Square Kilometre Array (SKA2) will be able to detect Earth-level radio leakage from nearby stars. (South African Radio Astronomy Observatory)

 

While the conference focused on technosignature theory, models, and distant possibilities, news was also shared about two concrete developments involving research today.

The first involved the radio telescope array in South Africa now called MeerKAT, a prototype of sorts for what will eventually become the gigantic Square Kilometre Array.

Breakthrough Listen, the global initiative to seek signs of intelligent life in the universe, would soon announce the commencement of a major new program with the MeerKAT telescope, in partnership with the South African Radio Astronomy Observatory (SARAO).

Breakthrough Listen’s MeerKAT survey will examine a million individual stars – 1,000 times the number of targets in any previous search – in the quietest part of the radio spectrum, monitoring for signs of extraterrestrial technology. With the addition of MeerKAT’s observations to its existing surveys, Listen will operate 24 hours a day, seven days a week, in parallel with other surveys.

This clearly has the possibility of greatly expanding the amount of SETI listening being done.  The SETI Institute, with its radio astronomy array in northern California, and various partners have been listening for almost 60 years without detecting a signal from our galaxy.

That might seem like a disappointing intimation that nothing or nobody else is out there, but not if you listen to Tarter explain how much listening has actually been done.  Almost ten years ago, she calculated that if the Milky Way galaxy and everything in it were an ocean, then SETI had listened to a cupful of water from that ocean.  Jason Wright and his students updated the calculation recently, and now the radio listening amounts to a small swimming pool’s worth of that enormous ocean.
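
The metaphor is easy to put in rough numbers.  Here is a sketch with illustrative volumes; the ocean figure is approximately right for Earth’s oceans, while the cup and pool are plausible stand-ins rather than Tarter’s or Wright’s exact values.

```python
# Rough scale of the ocean metaphor. Earth's oceans hold about
# 1.3 billion cubic kilometers of water; the cup and pool volumes are
# illustrative stand-ins, not the published figures.

OCEAN_M3 = 1.3e18  # ~1.3e9 km^3 of ocean water, in cubic meters
CUP_M3 = 2.5e-4    # a ~250 milliliter cup
POOL_M3 = 50.0     # a small swimming pool

print(f"cup / ocean  = {CUP_M3 / OCEAN_M3:.1e}")   # ~2e-22 of the whole
print(f"pool / ocean = {POOL_M3 / OCEAN_M3:.1e}")  # ~4e-17 of the whole
```

Five or so orders of magnitude of progress, in other words, and still an almost unimaginably small fraction of the search space.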

 

The NIROSETI team with their new infrared detector inside the dome at Lick Observatory. Left to right: Remington Stone, Dan Werthimer, Jérome Maire, Shelley Wright, Patrick Dorval and Richard Treffers. (Laurie Hatch)

The other news came from Shelley Wright of the University of California, San Diego, who has been working on an optical SETI instrument for the Lick Observatory and beyond.

She has developed a Near-Infrared Optical SETI (NIROSETI) instrument designed to search for signals from extraterrestrials at near-infrared wavelengths — a first. The near-infrared is an excellent spectral region in which to search for signals from extraterrestrials, since it offers a unique window for interstellar communication.  NIROSETI is now operating 8 to 12 nights per month, overseen remotely by students.

In addition, Wright and Harvard University’s Paul Horowitz have been working on a novel instrument for searching the full sky all the time for very short pulses of light — an idea that came out of a Breakthrough Listen meeting in 2016. The pulses they are searching for are nanosecond-to-one-second bursts that could only come from technological civilizations.

This PANOSETI (Pulsed All-sky Near-infrared Optical SETI) uses a most unusual light-collection method that features some 100 compact, wide-viewing Fresnel lenses mounted on two small geodesic domes connected to the telescope at the Lick Observatory.

Jason Wright is an assistant professor of astronomy and astrophysics at Penn State. His reading list is here.

Jason Wright of Penn State was especially impressed by the project, which he said will eventually be able to look at much of the sky at once and was put together on a very limited budget.

Wright, who teaches a course on SETI at Penn State and is a co-author of a recent paper trying to formalize SETI terminology, said his own take-away from the conference is that it may well represent an important and positive moment in the history of technosignatures.

“Without NASA support, the whole field has lacked the normal structure by which astronomy advances,” he said.  “No teaching of the subject, no standard terms, no textbook to formalize findings and understandings.

“The SETI Institute carried us through the dark times, and they did that outside of normal, formal structures. The Institute remains essential, but hopefully that reflex identification will start to change.”

 

Participants in the technosignatures conference in Houston last month, the largest SETI gathering in years.  This one was sponsored by NASA and put together by the Nexus for Exoplanet System Science (NExSS), an interdisciplinary agency initiative. (Delia Enriquez)

Human Space Travel, Health and Risk

Astronauts in a mock-up of the Orion space capsule, which NASA plans to use in some form as a deep-space vehicle. (NASA)

 

We all know that human space travel is risky. Always has been and always will be.

Imagine, for a second, that you’re an astronaut about to be sent on a journey to Mars and back, and you’re in a capsule on top of NASA’s second-generation Space Launch System designed for that task.

You will be 384 feet in the air waiting to launch (as tall as a 38-floor building), the rocket system will weigh 6.5 million pounds (equivalent to almost nine fully loaded 747 jets), and you will take off with 9.2 million pounds of thrust (34 times the total thrust of one of those 747s).
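
Those comparisons hold up to a back-of-the-envelope check.  Here is a sketch using commonly cited Boeing 747 figures; the 747 numbers are assumptions for illustration, not NASA’s.

```python
# Back-of-the-envelope check of the launch comparisons, using commonly
# cited Boeing 747 figures (assumed, not NASA's numbers).

SLS_WEIGHT_LB = 6.5e6   # fueled rocket system weight, pounds
SLS_THRUST_LBF = 9.2e6  # liftoff thrust, pounds of force

B747_LOADED_LB = 735_000      # a fully loaded 747, pounds (assumed)
B747_THRUST_LBF = 4 * 66_500  # four engines at ~66,500 lbf each (assumed)

print(f"SLS weight in loaded 747s: {SLS_WEIGHT_LB / B747_LOADED_LB:.1f}")   # ~8.8
print(f"SLS thrust in 747s:        {SLS_THRUST_LBF / B747_THRUST_LBF:.1f}") # ~34.6
```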

Given the thrill and power of such a launch and the later descent, everything else might seem to pale in terms of both drama and riskiness.  But as NASA has been learning more and more, the risks continue in space and perhaps even increase.

We’re not talking here about a leak or a malfunctioning computer system; we’re talking about absolutely inevitable risks from cosmic rays and radiation generally — as well as from microgravity — during a long journey in space.

Since no human has been in deep space for more than a short time, the task of understanding those health risks is very tricky and utterly dependent on testing creatures other than humans.

The most recent results are sobering.  A NASA-sponsored team at Georgetown University Medical Center in Washington looked specifically at what could happen to a human digestive system on a long Martian venture, and the results were not reassuring.

Their results, published in the Proceedings of the National Academy of Sciences (PNAS), suggest that deep-space bombardment by galactic cosmic radiation and solar particles could significantly damage gastrointestinal tissue, leading to long-term functional changes and problems. The study also raises concern about a high risk of tumor development in the stomach and colon.

 

Galactic cosmic rays are a variable shower of charged particles coming from supernova explosions and other events extremely far from our solar system. The sun is the other main source of energetic particles, spewing electrons, protons and heavier ions in “solar particle events” fed by solar flares and ejections of matter from the sun’s corona. Magnetic fields around Earth protect the planet from most of these heavy particles, but astronauts do not have that protection beyond low-Earth orbit. (NASA)

 

Kamal Datta, an associate professor in the Department of Biochemistry, is project leader of the NASA Specialized Center of Research (NSCOR) at Georgetown and has been studying the effects of space radiation on humans for more than a decade.

He said that heavy ions (electrically charged atoms and molecules) of elements such as iron and silicon are damaging to humans because of their great mass, compared with the massless photons, such as X-rays and gamma rays, that are prevalent on Earth.  These heavy ions, as well as low-mass protons, are ubiquitous in deep space.

Kamal Datta of Georgetown University Medical Center works with NASA to understand the potential risks from galactic cosmic radiation to astronauts who may one day travel in deep space.

“With the current shielding technology, it is difficult to protect astronauts from the adverse effects of heavy ion radiation. Although there may be a way to use medicines to counter these effects, no such agent has been developed yet,” says Datta, also a member of Georgetown Lombardi Comprehensive Cancer Center.

“While short trips, like the times astronauts traveled to the moon, may not expose them to this level of damage, the real concern is lasting injury from a long trip, such as a Mars or other deep-space mission, which would be much longer,” he said in a release.

Datta’s team has also published on the potentially harmful effects of galactic cosmic radiation on the brain, and other teams are looking at potential deep-space travel dangers to the human cardiovascular system.  Researchers are also concerned about the known weakening of bone and muscle tissue, harm to vision, and accelerated aging during long stays in space.

With current technology, it would take about three years to travel from Earth to Mars, orbit the planet until it is in the right place for a sling-shot boost home, and then to travel back.

A radiation detection instrument on the Mars Science Laboratory (MSL), which carried the rover Curiosity to Mars in 2011-2012, measured an estimated overall human radiation exposure for a Mars trip that would be two-thirds of the agency’s allowed lifetime limits.  That was based on the high-energy radiation hitting the capsule, but NASA later detected radiation bursts from solar flares on Mars far higher than anything detected during the MSL transit.
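
The arithmetic behind a number like that is straightforward.  Here is a sketch using the roughly 1.8 millisieverts per day that the MSL instrument measured during its cruise, and a round 1-sievert career limit; both figures are treated as rough assumptions, since NASA’s actual limits vary with an astronaut’s age and sex.

```python
# Rough mission-dose arithmetic. The cruise dose rate and the career
# limit below are round assumed values, for illustration only.

DOSE_RATE_SV_PER_DAY = 1.8e-3  # deep-space galactic cosmic ray dose rate
TRANSIT_DAYS = 2 * 180         # ~6 months out plus ~6 months back
CAREER_LIMIT_SV = 1.0          # a commonly cited career exposure limit

transit_dose = DOSE_RATE_SV_PER_DAY * TRANSIT_DAYS
print(f"round-trip transit dose:  {transit_dose:.2f} sieverts")
print(f"fraction of career limit: {transit_dose / CAREER_LIMIT_SV:.0%}")
```

And that is the transit alone; time spent on the surface, or solar flare events like the ones detected later, would add to the total.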

All of this seems, and is, quite daunting when thinking about human travel to Mars and other deep space destinations.  And Datta is clearly sensitive about how the new results are conveyed to the public.

“I am in no way saying that people cannot travel to Mars,” he told me. “What we are doing is trying to understand the health risks so proper mitigation can be devised, not to say this kind of travel is impossible.”

“We don’t have medicines now to protect astronauts from heavy particle radiation, and we don’t have the technology now to shield them while they’re in space.  But many people are working on these problems.”

 

The Orion spacecraft in flight, as drawn by an artist. The capsule has an attached habitat module. (NASA)

 

On the medical research side, scientists have to rely on data gained from exposing mice to radiation and extrapolating those results to humans.  It would, of course, be unethical to do the testing on people.

While this kind of animal testing is accepted as generally accurate, it certainly could hide either increased protections or increased risks in humans.

Datta said that another testing issue so far is that the mice have had to be irradiated in one large dose rather than in much smaller doses over time.  It is unclear how that affects the potential damage to human organs and the breaking of DNA bonds (which can result in the growth of cancers).  But Datta said that new instruments at NASA’s Space Radiation Laboratory (NSRL) at the Brookhaven National Laboratory on Long Island, New York, will allow for more gradual, lower-dose experiments.

While Datta’s work has been focused on the health risks of deep space travel, galactic cosmic radiation and solar heavy particles also bombard the moon — which has no magnetic field and only a very thin atmosphere to protect it.  Apollo astronauts could safely stay on the moon for several days in their suits and their lander, but months or years of living in a colony there would pose far greater risks.

NASA has actually funded projects to shield small areas on the moon from radiation, but the issue remains very much unresolved.

Shielding also plays a major role in thinking about how to protect astronauts traveling into deep space.  The current aluminum skin of space capsules allows much of the harmful radiation to pass through, and so something is needed to block it.

 

The goal of building an inhabited colony on the moon has many avid supporters in government and the private sector. The health risks for astronauts are similar to those in deep space. (NASA/Pat Rawlings)

 

Experts have concluded that perhaps the best barrier to radiation would be a layer of water, but water is too heavy to carry in the needed amounts.  Other possibilities include organic polymers (large macromolecules with repeating subunits) such as polyethylene.

It seems clear that issues such as these — the effects of more hazardous space radiation on astronauts in deep space and on the moon, and how to minimize those dangers — will be coming to the front burner in the years ahead.  And assuming that progress can be made, it’s a thrilling time.

What this means for space science, however, is less clear.

On one hand I recall hearing former astronaut extraordinaire and then head of the NASA Science Mission Directorate John Grunsfeld talk about how an astronaut on Mars could gather data and understandings in one week that the Curiosity rover would need a full year to match.

On the other, human space exploration is much more expensive than anything without people — yes, even including the long-delayed and ever-more-costly James Webb Space Telescope — and NASA budgets are limited.

So the question arises whether human exploration will, when it gets into high gear, swallow up the resources needed for the successors to the Hubble, Curiosity, Cassini and the other missions that have helped create what I consider to be a golden age of space science.  Risks come in many forms.

 


A National Strategy for Finding and Understanding Exoplanets (and Possibly Extraterrestrial Life)

The National Academies of Sciences, Engineering, and Medicine took an in-depth look at what NASA, the astronomy community and the nation need to grow the burgeoning science of exoplanets — planets outside our solar system that orbit a star. (NAS)

 

An extensive, congressionally directed study of what NASA needs to effectively learn how exoplanets form, and whether some may support life, was released today, and it calls for major investments in next-generation space and ground telescopes.  It also calls for the adoption of an increasingly multidisciplinary approach to the innumerable questions that remain unanswered.

While the recommendations were many, the top-line calls were for a sophisticated new space-based telescope for the 2030s that could directly image exoplanets, for approval and funding of the long-delayed and much-debated WFIRST space telescope, and for the National Science Foundation to help fund two of the very large ground-based telescopes now under development.

The study of exoplanets has seen remarkable discoveries in the past two decades.  But the in-depth study from the private, non-profit National Academies of Sciences, Engineering, and Medicine concludes that there is much more that we don’t understand than that we do, that our understandings are “substantially incomplete.”

So the two overarching goals for future exoplanet science are described as these:

 

  • To understand the formation and evolution of planetary systems as products of star formation and characterize the diversity of their architectures, composition, and environments.
  • To learn enough about exoplanets to identify potentially habitable environments and search for scientific evidence of life on worlds orbiting other stars.

 

Given the challenge, significance and complexity of these science goals, it’s no wonder that young researchers are flocking to the many fields included in exoplanet science.  And reflecting that, it is perhaps no surprise that the NAS survey of key scientific questions, goals, techniques, instruments and opportunities runs over 200 pages. (A webcast of a 1:00 pm NAS talk on the report can be accessed here.)

 


Artist’s concept showing a young sun-like star surrounded by a planet-forming disk of gas and dust.
(NASA/JPL-Caltech/T. Pyle)

These ambitious goals and recommendations will now be forwarded to the arm of the National Academies putting together the 2020 Astronomy and Astrophysics Decadal Survey — a community-informed blueprint of priorities that NASA usually follows.

This priority-setting is probably most crucial for the two exoplanet direct imaging missions now being studied as possible Great Observatories for the 2030s — the paradigm-changing space telescopes NASA has launched almost every decade since the 1970s.

HabEx (the Habitable Exoplanet Observatory) and LUVOIR (the Large UV/Optical/IR Surveyor) are two direct-imaging exoplanet projects in conception phase that would indeed significantly change the exoplanet field.

Both would greatly enhance scientists’ ability to detect and characterize exoplanets. But the more ambitious LUVOIR, in particular, would not only find many exoplanets in all stages of formation but could readily read the chemical components of their atmospheres, and thereby get clear data on whether a planet is habitable or even inhabited.  LUVOIR would provide either an 8-meter or a record-breaking 15-meter space telescope, while HabEx would send up a 4-meter mirror.
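
Mirror size matters for direct imaging because of the diffraction limit.  Here is a minimal sketch of what 4-, 8- and 15-meter apertures can separate, with the observing wavelength and target distance assumed for illustration.

```python
# Why aperture matters for direct imaging: the diffraction limit is
# theta ~ 1.22 * wavelength / diameter. Wavelength and distance are
# assumed values, for illustration.

WAVELENGTH_M = 550e-9  # visible light
DISTANCE_PC = 10.0     # a nearby star, 10 parsecs away
RAD_TO_ARCSEC = 206_265

for diameter_m in (4, 8, 15):
    theta_arcsec = 1.22 * WAVELENGTH_M / diameter_m * RAD_TO_ARCSEC
    # 1 arcsecond of separation at 1 parsec corresponds to 1 AU, so
    # separation in AU = angle in arcsec * distance in parsecs.
    separation_au = theta_arcsec * DISTANCE_PC
    print(f"{diameter_m:2d} m mirror: {theta_arcsec * 1000:5.1f} mas "
          f"-> ~{separation_au:.2f} AU at {DISTANCE_PC:.0f} pc")
```

At 10 parsecs, a 15-meter mirror can in principle resolve scales well inside an Earth-like orbit of 1 astronomical unit, while a 4-meter aperture is already near its limit there; real coronagraphs work at a few times the diffraction limit, so the extra aperture buys crucial margin.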

HabEx and LUVOIR are competing with two other astrophysics projects for that Great Observatory designation, and so NAS support now and prioritizing later is essential if they are to become a reality.

 

An artist’s notional rendering of an approximately 15-meter telescope in space. This image was created for an earlier large space telescope feasibility project called ATLAST, but it is similar to what is being discussed inside and outside of NASA as a possible great observatory after the James Webb Space Telescope and the Wide-Field Infrared Survey Telescope. (NASA)

These two potential Great Observatories will be costly and would take many years to design and build.  As the study acknowledges and explains, “While the committee recognized that developing a direct imaging capability will require large financial investments and a long time scale to see results, the effort will foster the development of the scientific community and technological capacity to understand myriad worlds.”

So a lot is at stake.  But with budget and space priorities in flux, the fate of even the projects given the highest priority in the Decadal Survey remains unclear.

That’s apparent in the fact that one of the top recommendations of today’s study is the funding of the number one priority put forward in the 2010 Astronomy and Astrophysics Decadal Survey — the Wide Field Infrared Survey Telescope (WFIRST).

The project — which, using microlensing, would extend the search to exoplanets farther from their stars than earlier survey missions could reach — was canceled in the administration’s proposed 2019 federal budget.  Congress has continued funding some development of this once top priority, but its future nonetheless remains in doubt.

WFIRST could have the capability of directly imaging exoplanets if it were built with technology to block out the blinding light of the star around which the exoplanets orbit — doing so either with an internal coronagraph or a companion starshade.  This would be novel technology for a space-based telescope, and the NAS survey recommends it as well.

 

An artist’s rendering of a possible “starshade” that could be launched to work with WFIRST or another space telescope and allow the telescope to take direct pictures of other Earth-like planets. (NASA/JPL-Caltech)

The list of projects the study recommends is long, with these important additions:

That “ground-based astronomy – enabled by two U.S.-led telescopes – will also play a pivotal role in studying planet formation and potentially terrestrial worlds, the report says. The future Giant Magellan Telescope (GMT) and proposed Thirty Meter Telescope (TMT) would allow profound advances in imaging and spectroscopy – absorption and emission of light – of entire planetary systems. They also could detect molecular oxygen in temperate terrestrial planets in transit around close and small stars, the report says.”

The committee concluded that the technology road map to enable the full potential of GMT and TMT in the study of exoplanets is in need of investments, and should leverage the existing network of U.S. centers and laboratories. To that end, the report recommends that the National Science Foundation invest in both telescopes and their exoplanet instrumentation to provide all-sky access to the U.S. community.

And for another variety of ground-based observing, the study called for the funding of a project to substantially increase the precision of instruments that find and measure exoplanets using the detected “wobble” of the host star.  But stars are active with or without a nearby exoplanet, and so it has been difficult to achieve the precision that astronomers using this “radial velocity” technique need to find and characterize smaller exoplanets.
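
The scale of that challenge falls out of the textbook radial-velocity relation.  Here is a sketch computing the star’s reflex velocity for a circular, edge-on orbit, with rough solar-system masses as the assumed inputs:

```python
# Stellar "wobble" amplitude for a circular, edge-on orbit:
#   K = (2*pi*G / P)**(1/3) * Mp / (Ms + Mp)**(2/3)
# Textbook relation with rough solar-system values, for illustration.

import math

G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30      # kg
M_EARTH = 5.972e24    # kg
M_JUPITER = 1.898e27  # kg
YEAR_S = 3.156e7      # one year, seconds

def semi_amplitude(planet_mass_kg: float, period_s: float) -> float:
    """Radial-velocity semi-amplitude in m/s (circular, edge-on orbit)."""
    return ((2 * math.pi * G / period_s) ** (1 / 3)
            * planet_mass_kg / (M_SUN + planet_mass_kg) ** (2 / 3))

print(f"Jupiter (11.9 yr orbit): {semi_amplitude(M_JUPITER, 11.86 * YEAR_S):.1f} m/s")
print(f"Earth   (1 yr orbit):    {semi_amplitude(M_EARTH, YEAR_S):.3f} m/s")
```

Jupiter swings the sun by roughly 12 meters per second; Earth, by about 9 centimeters per second. With the best current spectrographs reaching roughly 1 meter per second, and stellar activity adding noise on top, finding true Earth analogs requires exactly the order-of-magnitude precision gains the report calls for.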

Several smaller efforts to increase this precision are under way in the U.S., and the European Southern Observatory has a much larger project in development.

Additionally, the report recommends that the administrators of the James Webb Space Telescope give significant amounts of observing time to exoplanet study, especially early in its time aloft (now scheduled to begin in 2021).  The atmospheric data that JWST can potentially collect would be used in conjunction with results coming from other telescopes, and would further the study of exoplanet targets that already look promising based on existing theories and findings.

 

Construction has begun on the Giant Magellan Telescope at the Carnegie Institution’s Las Campanas Observatory in Chile. This artist’s rendering shows what the 24.5-meter (80-foot) segmented mirror and observatory will look like when completed, estimated to be in 2024. (Mason Media Inc.)

 

While the NAS report gives a lot of attention to instruments and ways to use them, it also focuses as never before on astrobiology — the search for life beyond Earth.

Much work has been done on how to determine whether life exists on a distant planet through modeling and theorizing about biosignatures.  The report encourages scientists to expand that work and embraces it as a central aspect of exoplanet science.

The study also argues that interdisciplinary science — bringing together researchers from many disciplines — is the necessary way forward.  It highlights the role of the Nexus for Exoplanet System Science (NExSS), a NASA initiative that since 2015 has brought together a broad though limited number of science teams from institutions across the country to learn about each other’s work and collaborate whenever possible.

The initiative itself has not required much funding, instead bringing in teams that had been supported with other grants.   However, that may be changing. One of the study co-chairs, David Charbonneau of Harvard University, said after the release of the study that the “promise of NExSS is tremendous…We really want that idea to grow and have a huge impact.”

The NAS study itself recommends that “building on the NExSS model, NASA should support a cross-divisional exoplanet research coordination network that includes additional membership opportunities via dedicated proposal calls for interdisciplinary research.”

The initiative, I’m proud to say, sponsors this interdisciplinary column in addition to all that interdisciplinary science.


Large Reservoir of Liquid Water Found Deep Below the Surface of Mars

An artist’s impression of the Mars Express spacecraft probing the southern hemisphere of Mars, superimposed on a radar cross section of the southern polar layered deposits. The leftmost white line is the radar echo from the Martian surface, while the light blue spots are highlighted radar echoes along the bottom of the ice.  Those highlighted areas measure very high reflectivity, interpreted as being caused by the presence of water. (ESA, INAF. Graphic rendering by Davide Coero Borga)

Far beneath the frigid surface of the South Pole of Mars is probably the last place where you might expect the first large body of Martian liquid water would be found.  It’s -170 F on the surface, there are no known geothermal sources that could warm the subterranean ice to make a meltwater lake, and the liquid water is calculated to be more than a mile below the surface.

Yet signs of that liquid water are what a team of Italian scientists detected — a finding that they say strongly suggests that there are other underground lakes and streams below the surface of Mars.  In a Science journal article released today, the scientists described the subterranean lake they found as being about 20 kilometers in diameter.

The detection adds significantly to the long-studied and long-debated question of how much surface water was once on Mars, a subject that has major implications for the question of whether life ever existed on the planet.

Finding the subterranean lake points not only to a wetter early Mars, said co-author Enrico Flamini of the Italian space agency, but also to a Mars that had a water cycle to collect and deliver the liquid water.  That would mean the presence of clouds, rain, evaporation, rivers, lakes, and water seeping through surface cracks to pool underground.

Scientists have found many fossil waterways on Mars, minerals that can only be formed in the presence of water, and what might be the site of an ancient ocean.

But in terms of liquid water now on the planet, the record is thin.  Drops of water collected on the leg of NASA’s Phoenix Lander after it touched down in 2008, and what some have described as briny water appears to flow down some steep slopes in summertime.  Called recurring slope lineae, or RSLs, these features appear at numerous locations when temperatures rise and disappear when they drop.

This lake is different, however, and its detection is a major step forward in understanding the history of Mars.

Color photo mosaic of a portion of Planum Australe on Mars.  The subsurface reflective echo power is color coded and deep blue corresponds to the strongest reflections, which are interpreted as being caused by the presence of water. (USGS Astrogeology Science Center, Arizona State University, INAF)

The discovery was made by analyzing echoes captured by the radar instrument on the European Space Agency’s Mars Express, a satellite that has been orbiting the planet since 2003.  The data for this discovery came from observations made between 2012 and 2015.

 

A schematic of how scientists used radar to find what they interpret to be liquid water beneath the surface of Mars. (ESA)

Antarctic researchers have long used radar on aircraft to search for lakes beneath the thick glaciers and ice layers, and have found several hundred.  The largest is Lake Vostok, which is the sixth-largest lake on Earth in terms of volume of water.  And it is two miles below the coldest spot on Earth.

So looking for a liquid lake below the southern pole of Mars wasn’t so peculiar after all.  In fact, lead author Roberto Orosei of the Institute of Radioastronomy of Bologna, Italy said that it was the ability to detect subsurface water beneath the ice of Antarctica and Greenland that helped inspire the team to look at Mars.

There are a number of ways to keep water liquid in the deep subsurface even when it is surrounded by ice.  As described by the Italian team and an accompanying Science Perspective article by Anja Diez of the Norwegian Polar Institute, the enormous pressure of the ice lowers the freezing point of water substantially.

Added to that pressure on Mars is the known presence of many salts, which the authors propose mix with the water to form a brine that lowers the freezing point further.
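
A rough sketch shows why the salts must do most of the work.  It uses the approximate pressure-melting slope for water ice and assumed Martian values for gravity, ice density and depth:

```python
# How much the overburden pressure alone lowers the melting point of
# ice ~1.5 km down on Mars. Density, depth and slope are rough assumed
# values, for illustration.

ICE_DENSITY = 917.0       # kg/m^3, pure water ice
G_MARS = 3.71             # m/s^2, Martian surface gravity
DEPTH_M = 1500.0          # ~1.5 km of overlying ice
SLOPE_K_PER_MPA = -0.074  # approximate melting-point shift for water ice

pressure_mpa = ICE_DENSITY * G_MARS * DEPTH_M / 1e6
delta_t = SLOPE_K_PER_MPA * pressure_mpa
print(f"overburden pressure:  {pressure_mpa:.1f} MPa")
print(f"melting-point shift:  {delta_t:.2f} K")
```

Pressure alone buys only a fraction of a degree, nowhere near the roughly 100 degrees of depression needed at the Martian pole; dissolved salts such as perchlorates, which can keep brines liquid to somewhere around -70 C, would have to supply the rest.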

So the conditions are present for additional lakes and streams on Mars.  And according to Flamini, solar system exploration manager for the Italian space agency, the team is confident there are more, some of them larger than the one detected.  Finding them, however, is a difficult process and may be beyond the capabilities of the radar equipment now orbiting Mars.

 

Subsurface lakes and rivers in Antarctica. Now at least one similar lake has been found under the southern polar region of Mars. (NASA/JPL)

The view that subsurface water is present on Mars is hardly new.  Stephen Clifford, for many years a staff scientist at the Lunar and Planetary Institute, even wrote in 1987 that there could be liquid water at the base of the Martian poles due to the kind of high pressure environments he had studied in Greenland and Antarctica.

So you can imagine how gratifying it might be to learn, as he put it, “of some evidence that shows that early theoretical work has some actual connection to reality.”

He considers the new findings to be “persuasive, but not definitive” — needing confirmation with other instruments.

Clifford’s wait has been long, indeed.  Many observations by teams using myriad instruments over the years did not produce the results of the Italian team.

Their discovery of liquid water is based on receiving particularly strong radar echoes from the base of the southern polar ice — echoes consistent with the higher radar reflectivity of water (as opposed to ice or rock).
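
The physical basis of that brightness contrast is the jump in dielectric permittivity at the bottom of the ice.  Here is a sketch of the normal-incidence reflection coefficient, with typical textbook permittivity values assumed for illustration:

```python
# Normal-incidence power reflectivity at the boundary between ice and
# the material beneath it:
#   R = ((sqrt(e1) - sqrt(e2)) / (sqrt(e1) + sqrt(e2)))**2
# Relative permittivities are typical textbook values (assumed).

import math

EPS_ICE = 3.1  # water ice
BENEATH = {"dry rock": 5.0, "liquid water": 80.0}

for material, eps in BENEATH.items():
    r = ((math.sqrt(EPS_ICE) - math.sqrt(eps))
         / (math.sqrt(EPS_ICE) + math.sqrt(eps))) ** 2
    print(f"ice over {material:12s}: reflectivity = {r:.2f}")
```

An ice-over-water interface returns tens of times more echo power than ice over rock, which is why the bright patches stood out so clearly.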

After analyzing the data in some novel ways and going through the many possible explanations other than the presence of a lake, Orosei said that none fit the results they had.  The explanation, then, was clear:  “We have to conclude there is liquid water on Mars.”

The depth of the lake — the distance from top to bottom — was impossible to measure, though the team concluded it was at least one meter and perhaps in the tens of meters.

Might the lake be habitable?  Orosei said that because of the high salt levels, “this is not a very pleasant environment for life.”

But who knows?  As he pointed out, Lake Vostok and other subglacial Antarctic lakes are known to be home to single-celled organisms that not only survive in their very salty world but use the salt as part of their essential metabolism.

 

 
