Leitwolf

About Leitwolf

  1. The adiabatic lapse rate is a key component of the GHE. Regrettably I do not seem to fully understand it, so I need to ask for help.

The average lapse rate on Earth is about 6.5 K/km. The "dry" lapse rate, however, is said to be 9.8 K/km. So water vapor (and the latent heat it transports) reduces the lapse rate significantly; in this way it also serves as a kind of anti-GHG. I mean this in the sense that the GHE can be understood as the combination of an elevated emission layer (a "photosphere" at 255 K) and the adiabatic lapse rate, which then produces higher temperatures at the surface.

Anyhow, the exact impact of vapor on the lapse rate is a key question. Now I found this slightly confusing chart on the subject on Wikipedia. There are some things I understand, and some things I don't. For instance, there are lines for altitude (scale on the right) which are sloped. That is to be understood relative to the left scale of pressure: in cold air, pressure decreases faster with altitude, as cold air has a higher density.

https://en.wikipedia.org/wiki/Lapse_rate#/media/File:Emagram.GIF

Now if I look at the bold lines for the dry adiabats, for instance the one starting at 20°C, it intersects the 5 km line at roughly -34°C. That is 54 K over 5 km, or 10.8 K/km, significantly more than the 9.8 K/km quoted before. Assuming the chart is right, what causes this difference? Is it because the troposphere is naturally unstable and heated at the surface, so to speak?

Then the "wet" adiabat starting at +15°C intersects the 5 km line at about -14.5°C, a delta of 29.5 K or 5.9 K/km. +15°C roughly corresponds to the actual average surface temperature on Earth, yet 5.9 K/km is significantly less than the quoted 6.5 K/km. Why is that? Does it represent a more theoretical, perfectly wet atmosphere at 100% H2O saturation?
  2. Hello to the community. I wanted to find out how clouds affect surface temperature. Regrettably I could not find a lot of useful information on this issue, except for sources I do not fully trust. So I downloaded some raw data from the NOAA that include both cloud condition (although only up to 12,000 ft) and temperature, about 100 million records in all. Then I did some programming in good old C to extract the data I was looking for.

What I found, after some proper filtering (location, season, time of day), is a distinct pattern in the correlation between cloudiness and temperature. Of the five basic cloud conditions (CLR, FEW, SCT, BKN, OVC), CLR and OVC are about equally cold, while the intermediate scenarios are warmer. In fact, the chart describes a nice little curve. So far there is no specific trend, as the curve is largely symmetric. Of course this result is a bit odd, since clouds are expected to cool Earth in the context of climatology. If that were true we should see some effect on the micro level, if you will, and have lower temperatures with clouds.

Now here is the real problem. On the one hand I want to understand why there is this curvy shape, and on the other hand I have yet to allow for another bias, which is rain. Rain sharply reduces surface temperatures, and it is of course strongly correlated with cloudiness. In fact you get a logarithmically shaped curve if you plot the amount of rain against those five cloud conditions (there is some rain reported with CLR skies though, which is due to the 12,000 ft reporting ceiling). So since rain has an increasing chilling effect with stronger cloudiness, this is a perfect explanation for the described curve. But then this rain bias masks a correlation which otherwise seems quite linear in nature: the more clouds, the warmer. But if that is so, and clouds actually warm the surface, the whole GHE (due to GHGs) is shattered.