Over the last century, our understanding of the evolution of stars has improved dramatically. By observing stars across the cosmos at every stage of their lives, we have built a pretty good theoretical model of how stars change over their lifetimes.
But when we look at our own Sun, a G-type main-sequence star fusing hydrogen into helium, and compare it to our best models, we are left with a little conundrum. In the early life of the Sun, when the Earth was newly formed, scientists believe the Sun gave off less energy than it does today.
“According to standard solar models, when nuclear fusion ignited in the core of the Sun at the time of its arrival on what is called the zero-age main sequence (ZAMS) 4.57 Ga (1 Ga = 10⁹ years ago), the bolometric luminosity of the Sun (the solar luminosity integrated over all wavelengths) was about 30 percent lower as compared to the present epoch,” a paper on the topic explains.
While this may not sound like a problem, it’s actually one that has puzzled scientists for decades. Why? Well, if the only variable was the Sun’s luminosity, we would expect Earth’s climate to be pretty darn cold back in this early epoch.
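To see roughly where such frigid estimates come from, here is a minimal Python sketch of two back-of-the-envelope calculations: the bare radiative-equilibrium temperature of an Earth with no greenhouse effect, and a cruder scaling that keeps roughly today's greenhouse warming by multiplying the present mean surface temperature by (L/L_now)^(1/4). The solar constant, albedo, and 288 K surface temperature are assumed round values rather than figures from the article; the point is simply that a Sun 30 to 50 percent dimmer pushes the estimates well below freezing, into the same ballpark as the numbers quoted below.

```python
# Back-of-the-envelope estimates only; all constants are round, assumed values.
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S_NOW = 1361.0      # present-day solar constant, W/m^2
ALBEDO = 0.3        # Earth's Bond albedo
T_SURF_NOW = 288.0  # present-day global mean surface temperature, K

def bare_equilibrium_temp(lum_fraction):
    """Greenhouse-free equilibrium temperature for a Sun at the given
    fraction of its present luminosity."""
    absorbed = lum_fraction * S_NOW * (1.0 - ALBEDO) / 4.0  # averaged over the sphere
    return (absorbed / SIGMA) ** 0.25

def scaled_surface_temp(lum_fraction):
    """Scale today's mean surface temperature by (L/L_now)**0.25,
    i.e. keep roughly the present greenhouse warming."""
    return T_SURF_NOW * lum_fraction ** 0.25

for frac in (1.0, 0.7, 0.5):
    print(f"L/L_now = {frac:.1f}: bare T ~ {bare_equilibrium_temp(frac):.0f} K, "
          f"greenhouse-scaled T ~ {scaled_surface_temp(frac):.0f} K")
# L/L_now = 1.0: bare T ~ 255 K, greenhouse-scaled T ~ 288 K
# L/L_now = 0.7: bare T ~ 233 K, greenhouse-scaled T ~ 263 K
# L/L_now = 0.5: bare T ~ 214 K, greenhouse-scaled T ~ 242 K
```

Either way you estimate it, a 30 to 50 percent dimmer Sun leaves the early Earth well below the freezing point of seawater, which is the heart of the paradox.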
“We see that the global temperature of Earth dropped below the freezing point of seawater less than 2.3 aeons ago (1 aeon is 10⁹ years); 4.0 to 4.5 aeons ago global temperatures were about 263°K,” Carl Sagan and George Mullen wrote, turning their attention to the topic in 1972.
“Had we used 50 percent for ΔL, the freezing point of seawater would have been reached about 1.4 aeons ago, and temperatures 4.0 to 4.5 aeons ago would have been about 245°K. Because of albedo instabilities […] it is unlikely that extensive liquid water could have existed anywhere on Earth with such global mean temperatures.”
And yet, evidence preserved in ancient rocks around the globe shows that the Earth had abundant flowing water 3.2 aeons ago. On top of this, Sagan and Mullen point out, we have algae fossils dating back to around the same time, “which would be very difficult to imagine on a frozen Earth.”
In short, judging by the Sun’s luminosity alone, Earth should have been a snowball billions of years ago, and yet it wasn’t. Over the decades, scientists have proposed a number of potential explanations for the paradox.
“An extreme atmospheric greenhouse effect, an initially more massive Sun, release of heat acquired during the accretion process of protoplanetary material, and radioactivity of the early Earth material have been proposed as reservoirs or traps for heat,” one study explains.
According to that study, the Moon could have played a role in heating the Earth during that early epoch. Since the Moon was much closer back then (yes, the Moon is slowly drifting away), it could have heated the Earth through tidal forces.
“As a bonus, tidal heating as a geothermal heat source might have helped to sustain enhanced mantle temperatures, for instance by driving hydrothermal fluid circulation in early Earth’s crust,” the paper continues.
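To get a feel for why a closer Moon matters so much, tidal (differential) acceleration scales with the inverse cube of the Earth-Moon separation, so shrinking the distance ramps up tidal forcing very quickly. The sketch below uses the standard 2GMR/d³ estimate with a purely hypothetical early separation of half today's value; the paper's actual figures for the early Earth-Moon distance and the resulting heating rates are not reproduced here.

```python
# Illustrative only: how tidal (differential) acceleration scales with lunar distance.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_MOON = 7.35e22   # lunar mass, kg
R_EARTH = 6.371e6  # Earth's mean radius, m
D_NOW = 3.84e8     # present mean Earth-Moon distance, m

def tidal_accel(distance):
    """Approximate peak tidal acceleration across Earth, ~2*G*M*R/d**3."""
    return 2.0 * G * M_MOON * R_EARTH / distance ** 3

d_early = 0.5 * D_NOW  # hypothetical early distance, chosen only for illustration
ratio = tidal_accel(d_early) / tidal_accel(D_NOW)
print(f"Halving the Earth-Moon distance makes tides roughly {ratio:.0f}x stronger")
# -> roughly 8x stronger, since the effect grows as 1/d^3
```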
A stronger solar wind in the Sun’s early days, greenhouse warming from carbon dioxide, and high concentrations of ammonia gas (as suggested by Sagan) are other favored possibilities, perhaps acting in combination.
To add to this puzzle, Mars, too, appears to have had liquid water on its surface around 3.6 billion years ago, perhaps stretching as far back as 4.45 billion years ago. This could be the result of carbon dioxide buildup or methane outgassing, but we might not be able to solve the Mars paradox without getting a closer look at the planet’s rocks. For this, we may now have to wait until 2040. While we may home in on the exact cause of the Faint Young Sun Paradox on Earth, Mars’s puzzle may take another 15 years at least.
Source Link: The "Faint Young Sun Paradox" That Puzzled Carl Sagan