The Moon-Forming Impact And Its Gifts


 

Rice University petrologists have found Earth most likely received the bulk of its carbon, nitrogen and other life-essential volatile elements from the planetary collision that created the moon more than 4.4 billion years ago. (Rice University)

 

The question of how life-essential elements such as carbon, nitrogen and sulfur came to our planet has long been debated, and it remains an important and slippery scientific subject.

Did these volatile elements accrete onto the proto-Earth from the sun’s planetary disk as the planet was being formed?  Did they arrive substantially later via meteorite or comet?  Or was it the cataclysmic moon-forming impact of the proto-Earth and another Mars-sized planet that brought in those essential elements?

Piecing this story together is definitely challenging, but now there is vigorous support for one hypothesis — that the giant impact brought us the elements that would later be used to enable life.

Based on high pressure-temperature experiments, modeling and simulations, a team at Rice University’s Department of Earth, Environmental and Planetary Sciences makes that case in Science Advances for the central role of the proto-planet called Theia.

“From the study of primitive meteorites, scientists have long known that Earth and other rocky planets in the inner solar system are volatile-depleted,” said study co-author Rajdeep Dasgupta. “But the timing and mechanism of volatile delivery has been hotly debated. Ours is the first scenario that can explain the timing and delivery in a way that is consistent with all of the geochemical evidence.”

“What we are saying is that the impactor definitely brought the majority supply of life-essential elements that we see at the mantle and surface today,” Dasgupta wrote in an email.

 

A schematic depicting the formation of a Mars-sized planet (left) and its differentiation into a body with a metallic core and an overlying silicate reservoir. The sulfur-rich core expels carbon, producing silicate with a high carbon to nitrogen ratio. The moon-forming collision of such a planet with the growing Earth (right) can explain Earth’s abundance of both water and major life-essential elements like carbon, nitrogen and sulfur, as well as the geochemical similarity between Earth and the moon. (Rajdeep Dasgupta; background photo of the Milky Way galaxy is by Deepayan Mukhopadhyay)

 

Some of their conclusions are based on the finding of a similarity between the isotopic compositions of nitrogen and hydrogen in lunar glasses and in the bulk silicate portions of the Earth.  The Earth and moon volatiles, they conclude, “have a common origin.”

Carbon, nitrogen and sulfur are deemed “volatile” elements because they have a relatively low boiling point and can easily fly off into space from planets and moons in their early growing stages.  A number of other life-important chemicals, including water, are volatiles as well.

The recent findings are grounded in a series of experiments by study lead author and graduate student Damanveer Grewal, who works in the Dasgupta lab. Grewal gathered evidence to test the theory that Earth’s volatiles arrived when the embryonic planet Theia — which had a sulfur-rich core — crashed into the very early Earth.

The sulfur content of the donor planet’s core matters because of the puzzling pattern of the carbon, nitrogen and sulfur that exist in all parts of the Earth other than the core. The team needed to test the conditions under which a sulfur-rich core could, in effect, exclude other volatiles, thus making them more common in the planet’s mantle and above — and as a result more available to a planet it might crash into.

The high temperature and pressure tests fed into a computer simulation designed to find the most likely scenario that produced Earth’s volatiles. Finding the answer involved varying the starting conditions, running approximately 1 billion scenarios and comparing them against the known conditions in the solar system today.
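As a rough illustration of how such a search over starting conditions works, here is a minimal Monte Carlo sketch in Python. The parameter ranges, the toy sulfur-carbon relationship and the acceptance test are hypothetical stand-ins, not the Rice team’s actual model.

```python
import random

def simulate_scenario():
    """Draw random starting conditions for a differentiating impactor."""
    core_sulfur = random.uniform(0.0, 0.4)  # core sulfur mass fraction (toy range)
    carbon = random.uniform(0.0, 1.0)       # volatile budgets in arbitrary units
    nitrogen = random.uniform(0.01, 0.1)
    # Toy proxy for the experiments' key result: a sulfur-rich core retains
    # less carbon, leaving the silicate portion with a higher C/N ratio.
    silicate_carbon = carbon * (0.2 + 2.0 * core_sulfur)
    return silicate_carbon, nitrogen

TARGET_C_TO_N = 40.0  # approximate C/N ratio of the bulk silicate Earth
TOLERANCE = 0.05      # accept scenarios within 5 percent of the target

trials = 1_000_000    # the actual study ran on the order of a billion
matches = 0
for _ in range(trials):
    carbon, nitrogen = simulate_scenario()
    if abs(carbon / nitrogen - TARGET_C_TO_N) / TARGET_C_TO_N < TOLERANCE:
        matches += 1
print(f"{matches:,} of {trials:,} scenarios match the observed ratio")
```

The real work, of course, lies in replacing the toy proxy with the measured partitioning behavior of carbon, nitrogen and sulfur at core-forming pressures and temperatures.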

 

Carbonaceous chondrites include some of the most primitive known meteorites, including the iconic Allende meteorite. Some scientists have proposed they delivered volatiles to Earth, but this latest paper disputes that conclusion.

“What we found is that all the evidence — isotopic signatures, the carbon-nitrogen ratio and the overall amounts of carbon, nitrogen and sulfur in the bulk silicate Earth — are consistent with a moon-forming impact involving a volatile-bearing, Mars-sized planet with a sulfur-rich core,” Grewal said.

Another often-cited explanation about how Earth received its volatiles is the “late veneer” theory, which holds that volatile-rich meteorites, leftover chunks of primordial matter from the outer solar system, arrived after Earth’s core formed.

And while the isotopic signatures of Earth’s volatiles match these primordial objects, known as carbonaceous chondrites, the elemental ratio of carbon to nitrogen is off. Earth’s non-core material, which geologists call the bulk silicate Earth, has about 40 parts carbon to each part nitrogen, approximately twice the 20-1 ratio seen in carbonaceous chondrites. 

This led to their conclusion that the late veneer theory could not explain the volatile makeup they had found.

Although the Rice team’s paper does not go in depth into the question of how water got to Earth, Dasgupta wrote that given the team’s conclusion that the moon-forming impact brought other volatiles, “it is likely that the impactor would contain and bring some water too. This is especially likely because this impactor needs to form in part from oxidized carbonaceous chondritic materials (that is the condition our experiments simulated as well).

“So although we did not factor in matching the water budget in our model, it is entirely possible that this impactor brought Earth’s water budget too, if the proto-Earth was water-poor.”

 

A study by Rice University scientists (from left) Gelu Costin, Chenguang Sun, Damanveer Grewal (sitting), Kyusei Tsuno, and Rajdeep Dasgupta found Earth most likely received the bulk of its carbon, nitrogen, and other life-essential elements from the planetary collision that created the moon more than 4.4 billion years ago. The findings appear in the journal Science Advances. (Jeff Fitlow/Rice University)

 

Dasgupta, the principal investigator on a NASA-funded effort called CLEVER Planets that is exploring how life-essential elements might come together on distant rocky planets, said better understanding the origin of Earth’s life-essential elements has implications beyond our solar system.

“This study suggests that a rocky, Earth-like planet gets more chances to acquire life-essential elements if it forms and grows from giant impacts with planets that have sampled different building blocks, perhaps from different parts of a protoplanetary disk,” Dasgupta said.

“This removes some boundary conditions,” he said. “It shows that life-essential volatiles can arrive at the surface layers of a planet, even if they were produced on planetary bodies that underwent core formation under very different conditions.”

Dasgupta said it does not appear that the Earth’s bulk silicate, on its own, could have attained the concentrations of those life-essential volatiles needed to produce our atmosphere, hydrosphere and biosphere.

This all has great implications for exoplanet studies, he said.  It means that “we can broaden our search for pathways that lead to volatile elements coming together on a planet to support life as we know it.”

CLEVER Planets is part of the Nexus for Exoplanet System Science, or NExSS, a NASA astrobiology research coordination network that is dedicated to the study of planetary habitability. CLEVER Planets involves more than a dozen research groups from Rice, NASA’s Johnson Space Center, UCLA, the University of Colorado Boulder and the University of California, Davis.

 

 


The Kepler Space Telescope Mission Is Ending But Its Legacy Will Keep Growing.

An illustration of the Kepler Space Telescope, which is on its very last legs.  As of October 2018, the planet-hunting spacecraft has been in space for nearly a decade. (NASA via AP)

 

The Kepler Space Telescope is dead.  Long live the Kepler.

NASA officials announced on Tuesday that the pioneering exoplanet survey telescope — which had led to the identification of almost 2,700 exoplanets — had finally reached its end, having essentially run out of fuel. This is after nine years of observing, after a malfunctioning steering system required a complex fix and change of plans, and after the hydrazine fuel levels reached empty.

While the sheer number of exoplanets discovered is impressive, the telescope did substantially more: it proved once and for all that the galaxy is filled with planets orbiting distant stars. Before Kepler this was speculation; now, thanks to the Kepler run, it is firmly established.

It also provided data for thousands of papers exploring the nature and characteristics of exoplanets. And that’s why Kepler will indeed live long in the world of space science.

“As NASA’s first planet-hunting mission, Kepler has wildly exceeded all our expectations and paved the way for our exploration and search for life in the solar system and beyond,” said Thomas Zurbuchen, associate administrator of NASA’s Science Mission Directorate in Washington.

“Not only did it show us how many planets could be out there, it sparked an entirely new and robust field of research that has taken the science community by storm. Its discoveries have shed a new light on our place in the universe, and illuminated the tantalizing mysteries and possibilities among the stars.”

 

 


The Kepler Space Telescope was focused on hunting for planets in this patch of the Milky Way. After two of its four spinning reaction wheels failed, it could no longer remain steady enough to stare at those distant stars, but was reconfigured to look elsewhere and at a different angle for the K2 mission. (Carter Roberts/NASA)

 

Kepler was initially the unlikely brainchild of William Borucki, its founding principal investigator who is now retired from NASA’s Ames Research Center in California’s Silicon Valley.

When he began thinking of designing and proposing a space telescope that could potentially tell us how common distant exoplanets were — and especially smaller terrestrial exoplanets like Earth — the science of extrasolar planets was at a very different stage.

William Borucki, originally the main champion for the Kepler idea and later the principal investigator of the mission. His work at NASA went back to the Apollo days. (NASA)

“When we started conceiving this mission 35 years ago we didn’t know of a single planet outside our solar system,” Borucki said.  “Now that we know planets are everywhere, Kepler has set us on a new course that’s full of promise for future generations to explore our galaxy.”

The space telescope was launched in 2009.  While Kepler did not find the first exoplanets — that required the work of astronomers using a different technique of observing based on the “wobble” of stars caused by orbiting planets — it did change the exoplanet paradigm substantially.

Not only did it prove that exoplanets are common, it found that planets outnumber stars in our galaxy (which has hundreds of billions of stars).

In addition it found that small, terrestrial-size planets are common as well, with some 20 to 50 percent of stars likely to have planets of that size and type. And what a menagerie of planets it found out there.

Astrophysicist Natalie Batalha was the Kepler project and mission scientist for a decade. She left NASA recently for the University of California at Santa Cruz “to carry on the Kepler legacy” by creating an interdisciplinary center for the study of planetary habitability.

Among the greatest surprises:  The Kepler mission provided data showing that the most common sized planets in the galaxy fall somewhere between Earth and Neptune, a type of planet that isn’t present in our solar system.

It found solar systems of all sizes as well, including some with many planets (as many as eight) orbiting close to their host star.

The discovery of these compact systems, generally orbiting a red dwarf star, raised questions about how solar systems form: Are these planets “born” close to their parent star, or do they form farther out and migrate in?

So far, more than 2,500 peer-reviewed papers have been published using Kepler data, with substantial amounts of that data still unmined.

Natalie Batalha was the project and mission scientist for Kepler for much of its run, and I asked her about its legacy.

“When I think of Kepler’s influence across all of astrophysics, I’m amazed at what such a simple experiment accomplished,” she wrote in an email. “You’d be hard-pressed to come up with a more boring mandate — to unblinkingly measure the brightnesses of the same stars for years on end. No beautiful images. No fancy spectra. No landscapes. Just dots in a scatter plot.

“And yet time-domain astronomy exploded. We’d never looked at the Universe quite this way before. We saw lava worlds and water worlds and disintegrating planets and heart-beat stars and supernova shock waves and the spinning cores of stars and planets the age of the galaxy itself… all from those dots.”

 

The Kepler-62 system is but one of many solar systems detected by the space telescope. The planets within the green discs are in the habitable zones of their stars — where water could be liquid at times. (NASA)

 

While Kepler provided remarkable answers to questions about the overall planetary makeup of our galaxy, it did not identify smaller planets that can be directly imaged, the evolving gold standard for characterizing exoplanets. The 150,000 stars that the telescope was observing were very distant, in the range of a few hundred to a few thousand light-years away. One light-year is about 6 trillion (6,000,000,000,000) miles.

Nonetheless, Kepler was able to detect the presence of a handful of Earth-sized planets in the habitable zones of their stars. The Kepler-62 system held one of them, and it is 1,200 light-years away. In contrast, the four Earth-sized planets in the habitable zone of the much-studied Trappist-1 system are 39 light-years away.

Kepler made its observations using the transit technique, which looks for tiny dips in the amount of light coming from a star caused by a planet passing in front of it. While the inference that exoplanets are ubiquitous came from Kepler results, the telescope was actually observing but a small patch of the sky. It has been estimated that it would require around 400 space telescopes like Kepler to cover the whole sky.

What’s more, only planets whose orbits are seen edge-on from Earth can be detected via the transit method, and that rules out a vast number of exoplanets.
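For a sense of the numbers behind both points, here is a back-of-the-envelope sketch in Python; the stellar and orbital values are standard textbook figures rather than anything from the Kepler team.

```python
# Transit depth: the fractional dimming is roughly the ratio of the
# planet's disk area to the star's, (R_planet / R_star)**2.
R_SUN = 696_000.0     # km
R_EARTH = 6_371.0     # km
R_JUPITER = 69_911.0  # km
AU_KM = 1.496e8       # km, the Earth-sun distance

def transit_depth(r_planet, r_star=R_SUN):
    """Fraction of starlight blocked when the planet crosses the star."""
    return (r_planet / r_star) ** 2

print(f"Earth-size dip:   {transit_depth(R_EARTH):.4%}")    # ~0.0084% dimming
print(f"Jupiter-size dip: {transit_depth(R_JUPITER):.2%}")  # ~1% dimming

# Geometry: for a circular orbit, the chance that a randomly oriented
# orbit appears edge-on enough to transit is roughly R_star / a.
print(f"Transit chance for an Earth analog: {R_SUN / AU_KM:.2%}")  # ~0.47%
```

An Earth-size transit of a sun-like star dims it by less than one part in ten thousand, and only about one such system in two hundred is oriented so we can see it — which is why Kepler had to watch so many stars so steadily.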

The bulk of the stars selected for close Kepler observation were more or less sun-like, but a sampling of other stars occurred as well. One of the most important factors was brightness: detecting the minuscule changes in brightness caused by a transiting planet is impossible if the star is too dim.

 

The artist’s concept depicts Kepler-186f, the first validated Earth-size planet to orbit a distant star in the habitable zone. (NASA Ames/SETI Institute/JPL-Caltech)

 

Four years into the mission, after the primary mission objectives had been met, mechanical failures temporarily halted observations. The mission team was able to devise a fix, switching the spacecraft’s field of view roughly every three months. This enabled an extended mission for the spacecraft, dubbed K2, which lasted as long as the first mission and bumped Kepler’s count of surveyed stars up to more than 500,000.

But it was inevitable that the mission would come to an end sooner rather than later because of that dwindling fuel supply, needed to keep the telescope properly pointed.

Kepler cannot be refueled because NASA decided to place the telescope in an orbit around the sun that is well beyond the influence of the Earth and moon — to simplify operations and ensure an extremely quiet, stable environment for scientific observations.  So Kepler was beyond the reach of any refueling vessel.  The Kepler team compensated by flying considerably more fuel than was necessary to meet the mission objectives.

The video below explains what will happen to the Kepler capsule once it is decommissioned.  But a NASA release explains that the final commands “will be to turn off the spacecraft transmitters and disable the onboard fault protection that would turn them back on. While the spacecraft is a long way from Earth and requires enormous antennas to communicate with it, it is good practice to turn off transmitters when they are no longer being used, and not pollute the airwaves with potential interference.”

 

 

And so Kepler will actually continue orbiting for many decades, just as its legacy will continue long after operations cease.

Kepler’s follow-on exoplanet surveyor — the Transiting Exoplanet Survey Satellite or TESS — was launched this year and has begun sending back data.  Its primary mission objective is to survey the brightest stars near the Earth for transiting exoplanets. The TESS satellite uses an array of wide-field cameras to survey some 85% of the sky, and is planned to last for two years.


Technosignatures and the Search for Extraterrestrial Intelligence

A rendering of a potential Dyson sphere, named after physicist and astronomer Freeman A. Dyson. As he proposed decades ago, such structures would collect solar energy on a solar-system-wide scale for highly advanced civilizations. (SentientDevelopments.com)

The word “SETI” pretty much brings to mind the search for radio signals coming from distant planets, the movie “Contact,” Jill Tarter, Frank Drake and perhaps the SETI Institute, where the effort lives and breathes.

But there was a time when SETI — the Search for Extraterrestrial Intelligence — was a significantly broader concept, one that brought in other ways to look for intelligent life beyond Earth.

In the late 1950s and early 1960s — a time of great interest in UFOs, flying saucers and the like — scientists not only came up with the idea of searching for distant intelligent life via unnatural radio signals, but also by looking for signs of unexpectedly elevated heat signatures and for optical anomalies in the night sky.

The history of this search has seen many sharp turns, with radio SETI at one time embraced by NASA, subsequently de-funded because of congressional opposition, and then developed into a privately and philanthropically funded project of rigor and breadth at the SETI Institute.  The other modes of SETI went pretty much underground and SETI became synonymous with radio searches for ET life.

But this history may be about to take another sharp turn as some in Congress and NASA have become increasingly interested in what are now called “technosignatures,” potentially detectable signatures and signals of the presence of distant advanced civilizations.  Technosignatures are a subset of the larger and far more mature search for biosignatures — evidence of microbial or other primitive life that might exist on some of the billions of exoplanets we now know exist.

And as a sign of this renewed interest, a technosignatures conference was scheduled by NASA at the request of Congress (and especially retiring Republican Rep. Lamar Smith of Texas.)  The conference took place in Houston late last month, and it was most interesting in terms of the new and increasingly sophisticated ideas being explored by scientists involved with broad-based SETI.

“There has been no SETI conference this big and this good in a very long time,” said Jason Wright, an astrophysicist and professor at Pennsylvania State University and chair of the conference’s science organizing committee.  “We’re trying to rebuild the larger SETI community, and this was a good start.”

 

At this point, the search for technosignatures is often likened to looking for a needle in a haystack. But what scientists are trying to do is define their haystack, determine its essential characteristics, and learn how best to explore it. (Wiki Commons)

 

During the three-day meeting in Houston, scientists and interested private and philanthropic representatives heard talks that ranged from the trials and possibilities of traditional radio SETI to quasi-philosophical discussions about what potentially detectable planetary transformations and by-products might be signs of an advanced civilization. (An agenda and videos of the talks are here.)

The subjects ranged from surveying the sky for potential millisecond infrared emissions from distant planets that could be purposeful signals, to how certain unnatural, pollutant chemicals in an exoplanet atmosphere could be a sign of civilization. From the search for thermal signatures coming from megacities or other by-products of technological activity, to the possible presence of “megastructures” built by highly evolved beings to collect a star’s energy.

Michael New is Deputy Associate Administrator for Research within NASA’s Science Mission Directorate. He was initially trained in chemical physics. (NASA)

All but the near-infrared SETI are for the distant future — or perhaps on the science fiction side — but astronomy and the search for distant life do tend to move forward slowly, with theory and inference most often coming well before observation and detection.

So thinking about the basic questions about what scientists might be looking for, Wright said, is an essential part of the process.

Indeed, it is precisely what Michael New, Deputy Associate Administrator for Research within NASA’s Science Mission Directorate, told the conference. 

He said that he, NASA and Congress wanted the broad sweep of ideas and research out there regarding technosignatures, from the current state of the field to potential near-term findings, and known limitations and possibilities.

“The time is really ripe scientifically for revisiting the ideas of technosignatures and how to search for them,” he said.

He offered the promise of NASA help (admittedly depending to some extent on what Congress and the administration decide) for research into new surveys, new technologies, data-mining algorithms, theories and modeling to advance the hunt for technosignatures.

 

Crew members aboard the International Space Station took this nighttime photograph of much of the Atlantic coast of the United States. The ability to detect the heat and light from this kind of activity on distant exoplanets does not exist today, but some day it might, and it could potentially help discover an advanced extraterrestrial civilization. (NASA)

 

Among the several dozen scientists who discussed potential signals to search for were the astronomer Jill Tarter, former director of the Center for SETI Research, Planetary Science Institute astrobiologist David Grinspoon and University of Rochester astrophysicist Adam Frank. They all looked at the big picture: what artifacts in atmospheres, on surfaces and perhaps in space advanced civilizations would likely produce by dint of being “advanced.”

All spoke of the harvesting of energy to perform work as a defining feature of a technological planet, with that “work” describing transportation, construction, manufacturing and more.

Beings that have reached the high level of, in Frank’s words, exo-civilization produce heat, pollutants, changes to their planets and surroundings in the process of doing that work.  And so a detection of highly unusual atmospheric, thermal, surface and orbital conditions could be a signal.

One example mentioned by several speakers is the family of chemicals called chlorofluorocarbons (CFCs), which are used as commercial refrigerants, propellants and solvents.

Astronomer Jill Tarter is an iconic figure in the SETI world and led the SETI Institute for 30 years. (AFP)

These CFCs are a hazardous and unnatural pollutant on Earth because they destroy the ozone layer, and they could be doing something similar on an exoplanet. And as described at the conference, the James Webb Space Telescope — once it is launched and working — could most likely detect such an atmospheric compound if it is present in high concentration and the project is given sufficient telescope time.

A similar single finding described by Tarter that could be revolutionary is the radioactive isotope tritium, a by-product of the nuclear fusion process. It has a short half-life, and so any distant detection would point to a recent use of nuclear energy (as long as it is not associated with a recent supernova event, which can also produce tritium).

But there were many other, less precise ideas put forward.

Glints on the surface of planets could be the product of technology, as might be an exoplanet’s weather that has been extremely well stabilized, modified planetary orbits, and chemical disequilibria in the atmosphere arising from the by-products of life and work. (These disequilibria are a well-established feature of biosignature research, but Frank presented the idea of a technosphere, which would process energy and create by-products at a greater level than its supporting biosphere.)

Another unlikely but most interesting example of a possible technosignature, put forward by Tarter and Grinspoon, involved the seven planets of the Trappist-1 system, all tidally locked and so lit on only one side. Tarter said the planets could potentially be found to be remarkably similar in their basic structure, alignment and dynamics — which, she suggested, could be a sign of highly advanced solar-system engineering.

 

Artist’s rendering of an imagined Trappist-1 system that has been terraformed to make its planets similar and habitable. The system is one of the closest found to our own — about 40 light-years away.

 

Grinspoon seconded that notion about Trappist-1, but in a somewhat different context.

He has worked a great deal on the question of today’s anthropocene era — in which humans actively change the planet — and he extended his thinking about Earth out into the galaxy.

Grinspoon said that he had just come back from Japan, where he had visited Hiroshima and its atomic bomb sites, and came away with doubts that we are the “intelligent” civilization we often describe ourselves as in SETI terms. A civilization that may well self-destruct — a fate he sees as potentially common throughout the cosmos — might be considered “proto-intelligent,” but not smart enough to keep itself going over a long time.

Projecting that into the cosmos, Grinspoon argued that there may well be many such doomed civilizations, and then perhaps a far smaller number of those civilizations that make it through the biological-technological bottleneck that we seem to be facing in the centuries ahead.

These civilizations, which he calls semi-immortal, would develop inherently sustainable methods of continuing, including modifying major climate cycles, developing highly sophisticated radars and other tools for mitigating risks, terraforming nearby planets, and even finding ways to evolve the planet as its place in the habitable zone of its host star becomes threatened by the brightening or dulling of that star.

The trick to finding such truly evolved civilizations, he said, would be to look for technosignatures that reflect anomalous stability rather than rampant growth. In the larger sense, these civilizations would have integrated themselves into the functioning of their planet, just as oxygen, and then primitive and complex life, integrated themselves into the essential systems of Earth.

And returning to the technological civilizations that don’t survive: they could have produced physical artifacts that now permeate the galaxy.

 

MeerKAT, originally the Karoo Array Telescope, is a radio telescope consisting of 64 antennas now being tested and verified in the Northern Cape of South Africa. When fully functional it will be the largest and most sensitive radio telescope in the southern hemisphere until the Square Kilometre Array is completed in approximately 2024. (South African Radio Astronomy Observatory)

 

This is exciting — the next-phase Square Kilometre Array (SKA2) will be able to detect Earth-level radio leakage from nearby stars. (South African Radio Astronomy Observatory)

 

While the conference focused on technosignature theory, models, and distant possibilities, news was also shared about two concrete developments involving research today.

The first involved the radio telescope array in South Africa now called MeerKAT, a prototype of sorts that will eventually become the gigantic Square Kilometre Array.

Breakthrough Listen, the global initiative to seek signs of intelligent life in the universe, would soon announce the commencement of a major new program with the MeerKAT telescope, in partnership with the South African Radio Astronomy Observatory (SARAO).

Breakthrough Listen’s MeerKAT survey will examine a million individual stars – 1,000 times the number of targets in any previous search – in the quietest part of the radio spectrum, monitoring for signs of extraterrestrial technology. With the addition of MeerKAT’s observations to its existing surveys, Listen will operate 24 hours a day, seven days a week, in parallel with other surveys.

This clearly has the possibility of greatly expanding the amount of SETI listening being done. The SETI Institute — with its radio astronomy array in northern California — and various partners have been listening for almost 60 years, without detecting a signal from our galaxy.

That might seem like a disappointing intimation that nothing or nobody else is out there, but not if you listen to Tarter explain how much listening has actually been done. Almost ten years ago, she calculated that if the Milky Way galaxy and everything in it were an ocean, then SETI had listened to the equivalent of a cup of water from that ocean. Jason Wright and his students updated the calculation recently, and the radio listening done to date now amounts to a small swimming pool’s worth of that enormous ocean.
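The metaphor is easy to put into rough numbers. Here is a back-of-the-envelope sketch, with assumed volumes for the cup, the pool and the ocean; the published “cosmic haystack” calculations are, of course, far more sophisticated.

```python
# Assumed volumes, in liters -- illustrative stand-ins only.
OCEAN = 1.3e21   # approximate volume of Earth's oceans
CUP = 0.25       # Tarter's earlier estimate of how much had been sampled
POOL = 6.0e4     # a small swimming pool, per the updated estimate

print(f"A cup of the ocean:  {CUP / OCEAN:.0e}")   # ~2e-22 of the whole
print(f"A pool of the ocean: {POOL / OCEAN:.0e}")  # ~5e-17 of the whole
print(f"Improvement factor:  {POOL / CUP:.0e}")    # ~2e+05
```

Even with that factor-of-hundreds-of-thousands improvement, the fraction of the haystack actually searched remains vanishingly small — which is the point of the metaphor.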

 

The NIROSETI team with their new infrared detector inside the dome at Lick Observatory. Left to right: Remington Stone, Dan Werthimer, Jérome Maire, Shelley Wright, Patrick Dorval and Richard Treffers. (Laurie Hatch)

The other news came from Shelley Wright of the University of California, San Diego, who has been working on an optical SETI instrument for the Lick Observatory and beyond.

She has developed a Near-Infrared Optical SETI (NIROSETI)  instrument designed to search for signals from extraterrestrials at near-infrared wavelengths — a first. The near-infrared is an excellent spectral region to search for signals from extraterrestrials, since it offers a unique window for interstellar communication.  NIROSETI is now operating 8 to 12 nights per month, overseen by students at a remote location.

In addition, Wright and Harvard University’s Paul Horowitz have been working on a novel instrument for searching the full sky all the time for very short pulses of light — an idea that came out of a Breakthrough Listen meeting in 2016. The pulses they are searching for are nanosecond-to-one-second bursts that could only come from technological civilizations.

This PANOSETI (Pulsed All-sky Near-infrared Optical SETI) uses a most unusual light-collection method that features some 100 compact, wide-viewing Fresnel lenses mounted on two small geodesic domes, connected to the telescope at the Lick Observatory.

Jason Wright is an assistant professor of astronomy and astrophysics at Penn State. His reading list is here.

Jason Wright of Penn State was especially impressed by the project, which he said will in the future be able to look at much of the sky at once and was put together on a very limited budget.

Wright, who teaches a course on SETI at Penn State and is a co-author of a recent paper trying to formalize SETI terminology, said his own take-away from the conference is that it may well represent an important and positive moment in the history of technosignatures.

“Without NASA support, the whole field has lacked the normal structure by which astronomy advances,” he said.  “No teaching of the subject, no standard terms, no textbook to formalize findings and understandings.

“The SETI Institute carried us through the dark times, and they did that outside of normal, formal structures. The Institute remains essential, but hopefully that reflex identification will start to change.”

 

Participants in the technosignatures conference in Houston last month, the largest SETI gathering in years. This one was sponsored by NASA and put together by the Nexus for Exoplanet System Science (NExSS), an interdisciplinary agency initiative. (Delia Enriquez)

Human Space Travel, Health and Risk

Astronauts in a mock-up of the Orion space capsule, which NASA plans to use in some form as a deep-space vehicle. (NASA)

 

We all know that human space travel is risky. Always has been and always will be.

Imagine, for a second, that you’re an astronaut about to be sent on a journey to Mars and back, and you’re in a capsule on top of NASA’s second-generation Space Launch System designed for that task.

You will be 384 feet in the air waiting to launch (as tall as a 38-floor building), the rocket system will weigh 6.5 million pounds (equivalent to almost nine fully loaded 747 jets) and you will take off with 9.2 million pounds of thrust (34 times the total thrust of one of those 747s).
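Those comparisons are easy to sanity-check. Below is a quick sketch using rough public figures for a 747-400 — the assumed airliner weight and engine thrust are mine, not NASA’s, so the ratios come out slightly differently.

```python
# NASA's stated figures for the Mars-class SLS configuration.
SLS_HEIGHT_FT = 384
SLS_WEIGHT_LB = 6.5e6
SLS_THRUST_LB = 9.2e6

# Rough, assumed 747-400 figures: max takeoff weight and four ~63,000 lbf engines.
B747_WEIGHT_LB = 735_000
B747_THRUST_LB = 4 * 63_000

print(f"Stories tall:     {SLS_HEIGHT_FT / 10:.0f}")             # ~38
print(f"747s of weight:   {SLS_WEIGHT_LB / B747_WEIGHT_LB:.1f}") # ~8.8
print(f"747s of thrust:   {SLS_THRUST_LB / B747_THRUST_LB:.1f}") # ~36.5
print(f"Thrust-to-weight: {SLS_THRUST_LB / SLS_WEIGHT_LB:.2f}")  # ~1.42
```

A liftoff thrust-to-weight ratio comfortably above 1 is, of course, the whole point of those 9.2 million pounds.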

Given the thrill and power of such a launch and the later descent, everything else might seem to pale in terms of both drama and riskiness. But as NASA has been learning, the risks continue in space and perhaps even increase.

We’re not talking here about a leak or a malfunctioning computer system; we’re talking about absolutely inevitable risks from cosmic rays and radiation generally — as well as from micro-gravity — during a long journey in space.

Since no human has been in deep space for more than a short time, the task of understanding those health risks is very tricky and utterly dependent on testing creatures other than humans.

The most recent results are sobering.  A NASA-sponsored team at Georgetown University Medical Center in Washington looked specifically at what could happen to a human digestive system on a long Martian venture, and the results were not reassuring.

Their results, published in the Proceedings of the National Academy of Sciences (PNAS), suggest that deep space bombardment by galactic cosmic radiation and solar particles could significantly damage gastrointestinal tissue, leading to long-term functional changes and problems. The study also raises concern about a high risk of tumor development in the stomach and colon.

 

Galactic cosmic rays are a variable shower of charged particles coming from supernova explosions and other events extremely far from our solar system. The sun is the other main source of energetic particles, spewing electrons, protons and heavier ions in “solar particle events” fed by solar flares and ejections of matter from the sun’s corona. Magnetic fields around Earth protect the planet from most of these heavy particles, but astronauts do not have that protection beyond low-Earth orbit. (NASA)

 

Kamal Datta, an associate professor in the Department of Biochemistry, is the project leader of the NASA Specialized Center of Research (NSCOR) at Georgetown and has been studying the effects of space radiation on humans for more than a decade.

He said that heavy ions (electrically charged atoms and molecules) of elements such as iron and silicon are damaging to humans because of their great mass, in contrast to the massless photons of the x-rays and gamma rays prevalent on Earth. These heavy ions, as well as low-mass protons, are ubiquitous in deep space.

Kamal Datta of Georgetown University Medical Center works with NASA to understand the potential risks from galactic cosmic radiation to astronauts who may one day travel in deep space.

“With the current shielding technology, it is difficult to protect astronauts from the adverse effects of heavy ion radiation. Although there may be a way to use medicines to counter these effects, no such agent has been developed yet,” says Datta, also a member of Georgetown Lombardi Comprehensive Cancer Center.

“While short trips, like the times astronauts traveled to the moon, may not expose them to this level of damage, the real concern is lasting injury from a long trip, such as a Mars or other deep space missions, which would be much longer,” he said in a release.

Datta’s team has also published on the potentially harmful effects of galactic cosmic radiation on the brain, and other teams are looking at the potential dangers of deep-space travel to the human cardiovascular system. Researchers are also concerned about the known weakening of bone and muscle tissue, harm to vision, and speeded-up aging during long stays in space.

With current technology, it would take about three years to travel from Earth to Mars, orbit the planet until it is in the right place for a sling-shot boost home, and then to travel back.

A radiation detection instrument on the Mars Science Laboratory (MSL), which carried the rover Curiosity to Mars in 2011-2012, measured an estimated overall human radiation exposure for a Mars trip that would be two-thirds of the agency’s allowed lifetime limits. That was based on the high-energy radiation hitting the capsule, but NASA later detected radiation bursts from solar flares on Mars far higher than anything detected during the MSL transit.

All of this seems, and is, quite daunting when thinking about human travel to Mars and other deep space destinations.  And Datta is clearly sensitive about how the new results are conveyed to the public.

“I am in no way saying that people cannot travel to Mars,” he told me. “What we are doing is trying to understand the health risks so proper mitigation can be devised, not to say this kind of travel is impossible.”

“We don’t have medicines now to protect astronauts from heavy particle radiation, and we don’t have the technology now to shield them while they’re in space.  But many people are working on these problems.”

 

The Orion spacecraft in flight, as drawn by an artist. The capsule has an attached habitat module. (NASA)

 

On the medical research side, scientists have to rely on data gained from exposing mice to radiation and extrapolating those results to humans.  It would, of course, be unethical to do the testing on people.

While this kind of animal testing is accepted as generally accurate, it certainly could hide either increased protections or increased risks in humans.

Datta said that another testing issue so far is that the mice have had to be irradiated in one large dose rather than in much smaller doses over time. It is unclear how that affects the potential damage to human organs and the breaking of DNA bonds (which can result in the growth of cancers). But Datta said that new instruments at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory on Long Island, New York, will allow for more gradual, lower-dose experiments.

While Datta’s work has been focused on the health risks of deep space travel, galactic cosmic radiation and solar heavy particles also bombard the moon — which has no magnetic field and only a very thin atmosphere to protect it.  Apollo astronauts could safely stay on the moon for several days in their suits and their lander, but months or years of living in a colony there would pose far greater risks.

NASA has actually funded projects to shield small areas on the moon from radiation, but the issue remains very much unresolved.

Shielding also plays a major role in thinking about how to protect astronauts traveling into deep space.  The current aluminum skin of space capsules allows much of the harmful radiation to pass through, and so something is needed to block it.

 

The goal of building an inhabited colony on the moon has many avid supporters in government and the private sector. The health risks for astronauts are similar to those in deep space. (NASA/Pat Rawlings)

 

Experts have concluded that perhaps the best barrier to radiation would be a layer of water, but water is too heavy to carry in the needed amounts. Other possibilities include organic polymers (large macromolecules with repeating subunits) such as polyethylene.

It seems clear that issues such as these — the effects of more hazardous space radiation on astronauts in deep space and on the moon, and how to minimize those dangers — will be coming to the front burner in the years ahead.  And assuming that progress can be made, it’s a thrilling time.

What this means for space science, however, is less clear.

On one hand I recall hearing former astronaut extraordinaire and then head of the NASA Science Mission Directorate John Grunsfeld talk about how an astronaut on Mars could gather data and understandings in one week that the Curiosity rover would need a full year to match.

On the other, human space exploration is much more expensive than anything without people — yes, even including the long-delayed and ever-more-costly James Webb Space Telescope — and NASA budgets are limited.

So the question arises whether human exploration will, when it gets into high gear, swallow up the resources needed for the successors to the Hubble, Curiosity, Cassini and the other missions that have helped create what I consider to be a golden age of space science.  Risks come in many forms.

 


A National Strategy for Finding and Understanding Exoplanets (and Possibly Extraterrestrial Life)

The National Academies of Sciences, Engineering, and Medicine took an in-depth look at what NASA, the astronomy community and the nation need to grow the burgeoning science of exoplanets — planets outside our solar system that orbit a star. (NAS)

 

An extensive, congressionally-directed study of what NASA needs to effectively learn how exoplanets form and whether some may support life was released today, and it calls for major investments in next-generation space and ground telescopes.  It also calls for the adoption of an increasingly multidisciplinary approach for addressing the innumerable questions that remain unanswered.

While the recommendations were many, the top-line calls were for a sophisticated new space-based telescope for the 2030s that could directly image exoplanets, for approval and funding of the long-delayed and much-debated WFIRST space telescope, and for the National Science Foundation to help fund two of the very large ground-based telescopes now under development.

The study of exoplanets has seen remarkable discoveries in the past two decades. But the in-depth study from the private, non-profit National Academies of Sciences, Engineering, and Medicine concludes that there is much more that we don’t understand than that we do — that our understandings are “substantially incomplete.”

So the two overarching goals for future exoplanet science are described as these:

 

  • To understand the formation and evolution of planetary systems as products of star formation and characterize the diversity of their architectures, composition, and environments.
  • To learn enough about exoplanets to identify potentially habitable environments and search for scientific evidence of life on worlds orbiting other stars.

 

Given the challenge, significance and complexity of these science goals, it’s no wonder that young researchers are flocking to the many fields included in exoplanet science.  And reflecting that, it is perhaps no surprise that the NAS survey of key scientific questions, goals, techniques, instruments and opportunities runs over 200 pages. (A webcast of a 1:00 pm NAS talk on the report can be accessed here.)

 


Artist’s concept showing a young sun-like star surrounded by a planet-forming disk of gas and dust.
(NASA/JPL-Caltech/T. Pyle)

These ambitious goals and recommendations will now be forwarded to the arm of the National Academies putting together 2020 Astronomy and Astrophysics Decadal Survey — a community-informed blueprint of priorities that NASA usually follows.

This priority-setting is probably most crucial for the two exoplanet direct imaging missions now being studied as possible Great Observatories for the 2030s — the paradigm-changing space telescopes NASA has launched almost every decade since the 1970s.

HabEx (the Habitable Exoplanet Observatory) and LUVOIR (the Large UV/Optical/IR Surveyor) are two direct-imaging exoplanet projects in conception phase that would indeed significantly change the exoplanet field.

Both would greatly enhance scientists’ ability to detect and characterize exoplanets. But the more ambitious LUVOIR, in particular, would not only find many exoplanets in all stages of formation, but could also read the chemical components of their atmospheres and thereby get clear data on whether a planet is habitable or even inhabited. LUVOIR would provide either an 8-meter or a record-breaking 15-meter space telescope, while HabEx would send up a 4-meter mirror.

HabEx and LUVOIR are competing with two other astrophysics projects for that Great Observatory designation, and so NAS support now and prioritizing later is essential if they are to become a reality.

 

An artist’s notional rendering of an approximately 15-meter telescope in space. This image was created for an earlier large space telescope feasibility project called ATLAST, but it is similar to what is being discussed inside and outside of NASA as a possible great observatory after the James Webb Space Telescope and the Wide-Field Infrared Survey Telescope. (NASA)

These two potential Great Observatories will be costly and would take many years to design and build.  As the study acknowledges and explains, “While the committee recognized that developing a direct imaging capability will require large financial investments and a long time scale to see results, the effort will foster the development of the scientific community and technological capacity to understand myriad worlds.”

So a lot is at stake.  But with budget and space priorities in flux, the fate of even the projects given the highest priority in the Decadal Survey remains unclear.

That’s apparent in the fact that one of the top recommendations of today’s study is the funding of the number one priority put forward in the 2010 Astronomy and Astrophysics Decadal Survey — the Wide Field Infrared Survey Telescope (WFIRST).

The project — which would use microlensing to boost the search for exoplanets further from their stars than earlier survey missions — was cancelled in the administration’s proposed 2019 federal budget. Congress has continued funding some development of this onetime top priority, but its future nonetheless remains in doubt.

WFIRST could have the capability of directly imaging exoplanets if it were built with technology to block out the blinding light of the star around which the exoplanets orbit — doing so either with an internal coronagraph or a companion starshade. This would be novel technology for a space-based telescope, and the NAS survey recommends it as well.

 

An artist’s rendering of a possible “starshade” that could be launched to work with WFIRST or another space telescope and allow the telescope to take direct pictures of other Earth-like planets. (NASA/JPL-Caltech)

The list of projects the study recommends is long, with these important additions:

That “ground-based astronomy – enabled by two U.S.-led telescopes – will also play a pivotal role in studying planet formation and potentially terrestrial worlds, the report says. The future Giant Magellan telescope (GMT) and proposed Thirty Meter Telescope (TMT) would allow profound advances in imaging and spectroscopy – absorption and emission of light – of entire planetary systems. They also could detect molecular oxygen in temperate terrestrial planets in transit around close and small stars, the report says.”

The committee concluded that the technology road map to enable the full potential of GMT and TMT in the study of exoplanets is in need of investments, and should leverage the existing network of U.S. centers and laboratories. To that end, the report recommends that the National Science Foundation invest in both telescopes and their exoplanet instrumentation to provide all-sky access to the U.S. community.

And for another variety of ground-based observing, the study called for funding a project to substantially increase the precision of instruments that find and measure exoplanets using the detected “wobble” of the host star. But stars are active with or without a nearby exoplanet, and so it has been difficult to achieve the precision that astronomers using this “radial velocity” technique need to find and characterize smaller exoplanets.
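To see why precision is the crux, consider the size of the signal. Below is a minimal sketch of the standard radial-velocity semi-amplitude formula for a circular orbit; the constants are rounded, and the example is mine rather than the report’s.

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # kg
M_EARTH = 5.972e24  # kg
YEAR = 3.156e7      # seconds

def rv_semi_amplitude(m_planet, m_star, period, sin_i=1.0):
    """Peak line-of-sight 'wobble' speed (m/s) for a circular orbit."""
    return ((2 * math.pi * G / period) ** (1 / 3)
            * m_planet * sin_i / (m_star + m_planet) ** (2 / 3))

# An Earth twin orbiting a Sun-like star once a year tugs it at only ~0.09 m/s.
print(f"{rv_semi_amplitude(M_EARTH, M_SUN, YEAR):.3f} m/s")
```

That nine-centimeters-per-second wobble sits well below the roughly meter-per-second precision of most current spectrographs, which is why the report calls for investment in next-generation instruments.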

Several smaller efforts to increase this precision are under way in the U.S., and the European Southern Observatory has a much larger project in development.

Additionally, the report recommends that the administrators of the James Webb Space Telescope give significant amounts of observing time to exoplanet study, especially early in its time aloft (now scheduled to begin in 2021). The atmospheric data that JWST can collect would be used in conjunction with results coming from other telescopes, and to further the study of exoplanet targets that are already promising based on existing theories and findings.

 

Construction has begun on the Giant Magellan Telescope at the Carnegie Institution’s Las Campanas Observatory in Chile. This artist rendering shows what the 24.5 meter (80 foot) segmented mirror and observatory will look like when completed, estimated to be in 2024. (Mason Media Inc.)

 

While the NAS report gives a lot of attention to instruments and ways to use them, it also focuses as never before on astrobiology — the search for life beyond Earth.

Much work has been done on how to determine whether life exists on a distant planet through modeling and theorizing about biosignatures.  The report encourages scientists to expand that work and embraces it as a central aspect of exoplanet science.

The study also argues that interdisciplinary science — bringing together researchers from many disciplines — is the necessary way forward. It highlights the role of the Nexus for Exoplanet System Science (NExSS), a NASA initiative which since 2015 has brought together a broad though limited number of science teams from institutions across the country to learn about each other’s work and collaborate whenever possible.

The initiative itself has not required much funding, instead bringing in teams already supported by other grants. However, that may be changing. One of the study co-chairs, David Charbonneau of Harvard University, said after the release of the study that the “promise of NExSS is tremendous… We really want that idea to grow and have a huge impact.”

The NAS study itself recommends that “building on the NExSS model, NASA should support a cross-divisional exoplanet research coordination network that includes additional membership opportunities via dedicated proposal calls for interdisciplinary research.”

The initiative, I’m proud to say, sponsors this interdisciplinary column in addition to all that interdisciplinary science.
