CO2 is not causing changes of climate


meteorologist

Things were very different 600 million years ago so be careful about making comparisons to today's climate. One consideration is the uncertainty in your figure of 7,000 ppm. Can you share with us where you got that figure? If, as I suspect, it came from Berner's GEOCARB model, then there is a tremendous uncertainty in the older values. Berner himself says in the conclusion of his paper "Thus, exact values of CO2, as shown by the standard curve, should not be taken literally".

Another factor is that the Sun wasn't as bright back then. Check out the 'Faint Young Sun Paradox'. Early in Earth's history, about 4 billion years ago, the TSI was only around 70% of today's value. I don't know the TSI value 600 million years ago, but it was less than today's 1365 W/m2. Higher GHG levels were needed to prevent the Earth from freezing.
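
If anyone wants a rough number for times in between, one widely quoted approximation is Gough's (1981) formula for solar luminosity as a function of the Sun's age. The short Python sketch below (ballpark figures only) gives roughly 75% of today's output 4 billion years ago and about 95% at 600 million years ago, i.e. a TSI deficit of roughly 70 W/m2 in the early Phanerozoic.

# Rough sketch only. Uses the widely quoted Gough (1981) approximation
# L(t) = L_now / (1 + 0.4 * (1 - t / t_now)), where t is the Sun's age
# and t_now ~= 4.57 Gyr. All numbers are approximate.

T_NOW_GYR = 4.57     # present age of the Sun, Gyr
TSI_NOW = 1365.0     # present-day total solar irradiance, W/m2

def relative_luminosity(age_gyr):
    """Solar luminosity relative to today at a given solar age (Gough 1981)."""
    return 1.0 / (1.0 + 0.4 * (1.0 - age_gyr / T_NOW_GYR))

for years_ago in (4.0, 0.6):
    frac = relative_luminosity(T_NOW_GYR - years_ago)
    print(f"{years_ago} Gyr ago: ~{frac * 100:.0f}% of today's luminosity, "
          f"TSI ~= {frac * TSI_NOW:.0f} W/m2")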

Definitely will look into the dimness factor. I did not believe 600 million years would amount to much on a star with a lifespan closer to 10 billion plus, but I will definitely read more about this, thanks.

Have you taken a look at Nir Shaviv's solar variation hypothesis, and if so, what thoughts do you have? This question is posed to anyone with more knowledge about the Faint Young Sun Paradox. Thanks

I just reread Shaviv's paper and I confess I'm not convinced by his hypothesis. He does a lot of curve fitting, a lot of ad hoc adjustments to periods to make the correlations look better, and used the GEOCARB results in ways its author explicitly said weren't legit. I'll rate that paper interesting, but with issues.

As for the Faint Young Sun hypothesis - I feel that it is more plausible because it is in agreement with what has been observed and theorized about Main Sequence stars like our Sun. If the Sun's 30% brightening from 4 billion years ago to today was linear (purely an assumption on my part) then the TSI 600 million years ago would have been 4% - 5% less than today. Call it 55 - 70 W/m2 dimmer than today. That's a lot when compared to the historic measured TSI fluctuations of about 0.1%.
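
To spell out that arithmetic (same assumptions as above: a purely linear 30% brightening over 4 billion years and today's 1365 W/m2):

TSI_NOW = 1365.0          # W/m2
BRIGHTENING = 0.30        # assumed total fractional brightening over the period
PERIOD_GYR = 4.0

fraction_dimmer = BRIGHTENING * (0.6 / PERIOD_GYR)            # 600 Myr ago
print(f"Dimming 600 Myr ago: {fraction_dimmer:.1%}")          # ~4.5%
print(f"TSI deficit: {fraction_dimmer * TSI_NOW:.0f} W/m2")   # ~61 W/m2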

As I said, I've read elsewhere that the solar output of the sun 600 million years ago was 3-4% lower.

Despite CO2 dropping rapidly over the last 600 million years, as seen in the first graph, the radiative forcing, once the Sun is included, shows no trend, just ups and downs.

[Attached images: Phanerozoic_CO2.gif (Phanerozoic CO2 levels) and Phanerozoic_Forcing.gif (Phanerozoic combined radiative forcing)]

Another factor to consider is that while 6,000 to 7,000 parts per million sounds like a huge and dramatic increase over today's level of nearly 400 ppm, in reality it only represents about 4 doublings of CO2 over today's levels. At 3.7 W/m2 of forcing per doubling, the forcing from CO2 under those conditions would have been about 15 W/m2 higher than today's. That forcing would have had about a 5C warming influence over that of today, before feedback.

If the Sun were 3% less luminous, Earth would receive about 41 W/m2 less energy at its orbital distance. Averaged over the sphere and allowing for albedo, that works out to roughly 7 W/m2 of forcing, or a bit less than the forcing from 2 doublings of CO2. Everything else being equal, the Earth might have been only 2C - 3C warmer than today even with CO2 levels near 7,000 ppm.
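
Putting the two paragraphs above into a quick sketch (same approximate figures, plus an assumed planetary albedo of 0.3 to convert the TSI change into a forcing):

import math

# Sketch of the comparison above; the numbers are the approximate ones used
# in the post (400 ppm now, ~7,000 ppm then, 3.7 W/m2 per doubling,
# 1365 W/m2 TSI) plus an assumed planetary albedo of 0.3.

CO2_NOW_PPM = 400.0
CO2_THEN_PPM = 7000.0
FORCING_PER_DOUBLING = 3.7    # W/m2
TSI_NOW = 1365.0              # W/m2
ALBEDO = 0.3                  # assumed

doublings = math.log2(CO2_THEN_PPM / CO2_NOW_PPM)        # ~4.1 doublings
co2_forcing = doublings * FORCING_PER_DOUBLING           # ~15 W/m2

# A 3% dimmer Sun means ~41 W/m2 less at Earth's distance; spreading that
# over the sphere (/4) and removing the reflected fraction (x 0.7) expresses
# it as a forcing comparable to the CO2 number.
tsi_deficit = 0.03 * TSI_NOW                              # ~41 W/m2
solar_forcing = tsi_deficit / 4.0 * (1.0 - ALBEDO)        # ~7 W/m2

print(f"CO2: {doublings:.1f} doublings -> ~{co2_forcing:.0f} W/m2 of forcing")
print(f"3% dimmer Sun: ~{solar_forcing:.1f} W/m2, i.e. roughly "
      f"{solar_forcing / FORCING_PER_DOUBLING:.1f} CO2 doublings")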

The trouble with your 'hypothesis' is that it fails on many, many levels, far too many for me to want to take the time to debunk, especially since you've shown no interest in the actual science or in expanding your knowledge. So I'll keep this short and point out just one, of many, errors. The increase in OLR isn't due to albedo changes, because the OLR reaching space isn't being radiated from the Earth's surface. GHGs absorb most surface OLR before it reaches space. The OLR being observed by satellites is being radiated from high in the atmosphere, where the mean free path for LW photons is long enough to reach space. If you want the real reason for increasing OLR, then look into the Stefan-Boltzmann Law. As the Earth warms up in response to increasing CO2 levels, OLR is supposed to go up. That is one of many observations that support the mainstream AGW theories.
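
For reference, a minimal Stefan-Boltzmann sketch of that last point (the ~255 K effective emission temperature is used purely as an example value):

SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(temp_k):
    """Flux radiated by a blackbody at temperature temp_k, in W/m2."""
    return SIGMA * temp_k ** 4

# A 1 K warming of the emitting layers adds a few W/m2 of OLR.
for temp in (255.0, 256.0):
    print(f"T = {temp:.0f} K -> OLR ~= {blackbody_flux(temp):.1f} W/m2")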

This is another crackpot theory that defies all the laws of physics possible. When the Earth warms, the surface emits more OLR so that it can equilibrate to the amount of ISR that is reaching Earth's surface. With a decrease in albedo, you get fewer clouds to trap OLR, so you get an increase in OLR. In addition, since decreasing cloud cover is increasing the amount of ISR that reaches Earth's surface, the OLR would continually increase. To claim that when the surface warms it will not emit more OLR is just bizarre. With GHGs, they would produce an initial decrease in OLR, and then Earth would warm until the OLR has equilibrated to the amount of ISR reaching Earth's surface. This would lead to no change in OLR at all.

Your statement is wrong in so many ways that it is hard to decide where to begin. First, clouds (and GHGs) don't 'trap' OLR. If clouds actually trapped that much energy they would heat to the point that they would dissipate. GHGs (including water vapor, CO2, CH4, and O3) are opaque to most frequencies in the LW portion of the spectrum. That's the definition of a GHG. When a GHG molecule encounters a photon of LW radiation it absorbs it, which raises the molecule's energy level. Several microseconds later it drops to a lower energy level by emitting a photon of LW radiation. That new LW photon is emitted isotropically and travels until it either is absorbed by another GHG molecule or reaches space. Near the Earth's surface, where the air density is the greatest, the mean free path between absorptions is very short, on the order of meters. Each time the new photon encounters another GHG molecule, the absorption/re-emission cycle repeats. The successive LW photon paths form a 3-D random walk, gradually moving higher. With altitude, and the corresponding decrease in density, the mean free path increases. In the stratosphere, where CO2 is the dominant GHG due to the low levels of water vapor, the rarefied air gives OLR the opportunity to reach space.

An important point to understand is that the GHG molecule can also drop to a lower energy level by losing energy through collisions with other gas molecules instead of emitting a LW photon. This collisional transfer of energy is how GHGs warm non-GHGs.

Clouds are simply volumes of atmosphere where the conditions cause some of the water vapor (itself a GHG) to condense into droplets. Clear skies are only slightly less opaque to OLR. The difference, as it relates to OLR, is that the mean free path within a cloud is very short. Nevertheless, most of the OLR that enters the bottom of a cloud eventually leaves the top.
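
To make that random-walk picture concrete, here is a toy one-dimensional sketch. It is not a radiative-transfer calculation; the surface mean free path is inflated (real values are meters to tens of meters) just to keep the toy fast, and the exponential growth with altitude only mimics thinning air.

import math
import random

SURFACE_MFP_M = 100.0         # assumed mean free path near the surface (inflated)
SCALE_HEIGHT_M = 8000.0       # rough density scale height
TOP_OF_ATMOSPHERE_M = 60000.0 # arbitrary 'top' for this toy

def mean_free_path(z_m):
    """Photon mean free path between absorptions, growing as the air thins."""
    return SURFACE_MFP_M * math.exp(z_m / SCALE_HEIGHT_M)

def photon_walk():
    """Follow one LW photon from the surface until it escapes to space.
    Returns the number of absorption/re-emission events along the way."""
    z = 0.0
    absorptions = 0
    while True:
        mu = random.uniform(-1.0, 1.0)            # cosine of a random direction
        z_next = z + mu * mean_free_path(z)
        if z_next >= TOP_OF_ATMOSPHERE_M:
            return absorptions                     # the photon reaches space
        z = max(z_next, 0.0)                       # the surface absorbs and re-emits
        absorptions += 1

counts = sorted(photon_walk() for _ in range(100))
print("median absorption/re-emission events before escape:", counts[len(counts) // 2])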

And, no, it is not a 'crackpot theory', it's basic radiative physics. I'm sure that you're too wilfully ignorant to learn anything, but perhaps you'll prove me wrong. You seem to be a big fan of Lindzen, so I'll quote from his paper Taking Greenhouse Warming Seriously (the title, at least, is something he and I agree on):

The main greenhouse gas, water vapor, generally maximizes at the surface in the tropics and sharply decreases with both altitude and latitude. There is so much greenhouse opacity immediately above the ground that the surface cannot effectively cool by the emission of thermal radiation. Instead, heat is carried away from the surface by fluid motions ranging from the cumulonimbus towers of the tropics to the weather and planetary scale waves of the extratropics. These motions carry the heat upward and poleward to levels where it is possible for thermal radiation emitted from these levels to escape to space. We will refer to this level (which varies with the amount of water vapor at any given location) as the characteristic emission level. Crudely speaking, the emission from this level is proportional to the 4th power of the temperature at this level. Figure 3a offers a simplified one dimensional picture of the situation. Largely because of the motions of the atmosphere, the temperature decreases with altitude to some level known as the tropopause. The height of the tropopause varies with latitude. In the tropics, the tropopause height is about 16 km. Near 30° latitude, the tropopause height drops to about 12 km, and near the poles it is around 8 km. Below the tropopause, we have what is called the troposphere. The characteristic emission level is referred to as τ = 1. τ is a non-dimensional measure of infrared absorption measured from the top of the atmosphere looking down. Crudely speaking, radiation is attenuated as e^(-τ). The level at which τ = 1 is one optical depth into the atmosphere, and radiation emitted from this level is proportional to the 4th power of the temperature at this new level. When the earth is in radiative balance with space, the net incoming solar radiation is balanced by the outgoing longwave radiation (OLR or thermal radiation or infrared radiation; these are all commonly used and equivalent terms) from the characteristic emission level, τ = 1. When greenhouse gases are added to the atmosphere, the level at which τ = 1 is raised in altitude, and, because the temperature of the atmosphere decreases with altitude (at the rate of approximately 6.5°C per kilometer), the new characteristic emission level is colder than the previous level.

In practice, the τ = 1 level is typically in the neighborhood of 7-8 km in the tropics and at lower levels in the extratropics.

So how does your albedo/cloud feedback hypothesis hold up when measured against physical reality? Not very well, I'm afraid. But I'm sure you'll keep cutting and pasting the same error-filled nonsense you've posted multiple times already.
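
As a rough numerical illustration of the τ = 1 argument in the quote above (the 255 K emission temperature and the 150 m lift of the emission level are made-up example values, not figures from Lindzen's paper):

SIGMA = 5.67e-8              # Stefan-Boltzmann constant, W m^-2 K^-4
LAPSE_RATE_K_PER_KM = 6.5    # approximate tropospheric lapse rate
T_EMISSION_K = 255.0         # rough temperature at the tau = 1 level

def olr(temp_k):
    """OLR from an emission level at temperature temp_k, in W/m2."""
    return SIGMA * temp_k ** 4

lift_km = 0.15                                  # hypothetical rise of the tau = 1 level
t_new = T_EMISSION_K - LAPSE_RATE_K_PER_KM * lift_km

print(f"OLR from the old emission level: {olr(T_EMISSION_K):.1f} W/m2")
print(f"OLR right after the level rises: {olr(t_new):.1f} W/m2")
print(f"Deficit to be closed by warming: {olr(T_EMISSION_K) - olr(t_new):.1f} W/m2")

Raising the emission level makes the emitting layer colder, OLR initially drops, and the whole column then warms until balance at the new τ = 1 level is restored.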
