Posts posted by bdgwx

  1. 34 minutes ago, bobjohnsonforthehall said:

    Per Chang et al, the average annual change in ocean heat content is 5.5 zettajoules per year...about 0.1% of the energy entering and leaving the ocean.

    Yeah. That's pretty close.

    35 minutes ago, bobjohnsonforthehall said:

    Yet all we hear like a constant drum beat is that this tiniest of imbalances is due to human causes.

    That's because it is. At least post-WWII it is. In fact, the anthropogenic forcing has been so large during this period that it accounts for nearly 100% of the net forcing.

    38 minutes ago, bobjohnsonforthehall said:

    We never hear about IPCC or others noting other possible causes.

    The IPCC considers all agents that are modulating the climate. They have to, because the energy imbalance is modulated by the net effect of all energy fluxes, not just one of them. I posted a link to IPCC AR5 WGI chapter 8 above that provides a brief summary of the agents that have contributed to the modulation of Earth's energy budget.

    39 minutes ago, bobjohnsonforthehall said:

    It is always humans.

    Not always. Humans either did not exist or were not capable of influencing the climate in the distant past. But here's the cool thing about the laws of physics: they stipulate that the radiative forcing induced by a perturbation in GHGs is invariant to the actor that caused it. In other words, GHG molecules have exactly the same radiative behavior regardless of whether they are emitted by natural agents or by human agents. That's why GHGs are crucial pieces of the puzzle in solving many paleoclimate mysteries like the PETM, glacial cycles, the faint young Sun problem, etc.

     

    • Like 2
  2. 1 hour ago, bobjohnsonforthehall said:

    Argo in situ calibration experiments reveal measurement errors of about ±0.6 C.

    Hadfield, et al., (2007), J. Geophys. Res., 112, C01009, doi:10.1029/2006JC003825

    False precision is fun.

    This paper is not relevant to the error of the annual OHC anomaly or the annual global temperature. Nor is the quoted ±0.6 C figure the measurement error that can be expected from an ARGO float, which is said to be approximately ±0.002 C for individual measurements. The ±0.6 C figure is the RMS error of ARGO-derived hydrographic section temperature fields. These sections are computed even without an ARGO float occupying each grid cell.

    Using the WOA (World Ocean Atlas) dimensions we can estimate 75 x 30 = 2250 grid cells along the cruise line of the hydrographic section used in the paper. If you then ask what the error of the computed mean temperature of this hydrographic section is, you might expect it to be 0.6/sqrt(2250) = ±0.01 C using the standard error of the mean formula. In reality I suspect the actual error is a bit different for a variety of reasons; I'm just giving an order-of-magnitude estimate based on trivial statistical principles using the RMS error of the temperature field on the single hydrographic section mentioned in the publication.

    Note that this hydrographic section represents only an infinitesimally small part of a much larger 3D volume containing vastly more grid cells, which would significantly reduce the error in a global mean temperature estimate if such a method were used. I do not see anything in this publication that is inconsistent with Cheng's OHC 2σ envelope. I do see that ARGO reduces the error by a factor of 2 relative to the non-ARGO era. Perhaps this is why the 2σ envelope appears to be significantly reduced in later years on Cheng's graph?
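    A minimal sketch of the standard-error-of-the-mean arithmetic above, assuming the grid cells contribute independent errors (a simplification; real errors are spatially correlated):

    ```python
    import math

    # Assumptions: ~2250 grid cells (75 x 30, WOA-like dimensions) along
    # the section, and the paper's 0.6 C RMS error per grid-cell value.
    rms_error_per_cell = 0.6   # deg C, RMS error of the section temperature field
    n_cells = 75 * 30          # assumed grid-cell count along the cruise line

    # Standard error of the mean, assuming independent cell errors
    sem = rms_error_per_cell / math.sqrt(n_cells)
    print(f"{sem:.3f} C")  # ~0.013 C, i.e. about +/-0.01 C
    ```

    This is only an order-of-magnitude estimate; correlated errors would shrink the effective sample size and inflate the result.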

    • Like 1
  3. 4 hours ago, bobjohnsonforthehall said:

    Finally, even if one assumes their problematic findings are indeed accurate, it will take some 300+ years for the oceans to warn by a single degree.

    Fun with math. The ocean has a mass of about 1.4e21 kg. The specific heat capacity of seawater is about 4 kJ/(kg·K). This means it would require 5.6e24 joules of energy to increase the mean temperature of the ocean by 1.0 C. That would require an EEI of +1.2 W/m^2 to persist for 300 years. A smaller +0.7 W/m^2 imbalance is causing the GMST to increase by about +0.2 C/decade. I doubt the relationship would be linear, but you can certainly do an order-of-magnitude estimate of what might happen if we added 5.6 yottajoules of energy to the geosphere over a 300-year period. Hint: hothouse Earth might be something worth researching.
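    The back-of-envelope numbers above can be checked in a few lines; the mass, specific heat, and Earth-area figures are round approximations, not measured values:

    ```python
    # Round approximations used in the post above
    ocean_mass_kg = 1.4e21      # approximate mass of the ocean
    specific_heat = 4000.0      # J/(kg K), approximate for seawater
    earth_area_m2 = 510e12      # total surface area of Earth
    seconds_per_year = 31.56e6

    # Energy needed to raise the mean ocean temperature by 1.0 C
    energy_joules = ocean_mass_kg * specific_heat * 1.0   # ~5.6e24 J

    # Persistent imbalance (W/m^2) needed to supply that energy over 300 years
    eei = energy_joules / (earth_area_m2 * seconds_per_year * 300)
    print(f"{eei:.2f} W/m^2")  # ~1.16, consistent with the +1.2 W/m^2 above
    ```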

    4 hours ago, bobjohnsonforthehall said:

    And to say with certainty that the change in temperature is 100% due to human causes as opposed to cloud cover or El Nino frequency, or any number of other slight variances in a highly complex climate system seems to me very definition of arrogance.

    It's a good thing then that no reputable scientific work claims that climate forcing agents are known with perfect certainty, nor is it claimed that anthropogenically modulated forcing agents are the only contributors to changes in temperature. I highly recommend reading IPCC AR5 WGI chapter 8 on radiative forcings for a brief summary of the agents in play and estimates of their magnitudes and uncertainties. ENSO, clouds, and likely a bunch of things you haven't even thought of are all actively researched. I think what you'll find when looking at the academic literature is that the exact opposite of arrogance is happening in climate science.

     

    • Like 2
  4. 1 hour ago, bobjohnsonforthehall said:

    And the amount of heat being added to the oceans BY THE SUN is what? A thousand times that? Every second? Yawn.

    Heat is the transfer of thermal energy. For the body being warmed/cooled it requires a net positive/negative flow of energy. If I interpret your question in precise terms then it is equivalent to asking: of the Earth Energy Imbalance (EEI), how much does the Sun contribute? The answer is effectively nothing. The reason is that total solar irradiance is not increasing. In fact, if anything it has actually been decreasing, albeit by a small amount, over the last few decades. I'm going to estimate the RF of the Sun over this period at about -0.01 W/m^2 as compared to the EEI of +0.70 W/m^2. In order-of-magnitude terms you might say it is 1/100th and in the opposite direction.

    Now if the question were how much energy is being added by the Sun, then the answer is about 240 W/m^2. This is the effective solar ingress flux near the surface, about 300x the EEI. Don't forget about the near-surface egress fluxes though!

    • Like 1
  5. 20 minutes ago, TriPol said:

    What if all of the thermometers had bad data back then? Or not the thermometers, but the people reporting the temps weren't the most accurate? And we weren't even measuring ocean temps back then, just temps at stations, which were probably in cities. We had no idea if there was an El Nino or La Nina in 1885. We can only guess based off of observations. 

    There are many potential sources of error including instrument bias, station moves, station siting changes, time-of-day of observation, the urban heat island effect, the human element, etc.

    Ocean temperature datasets like ERSST and HadSST also go back well into the 1800s. They are used as inputs into the global mean surface temperature (GMST) datasets.

    We have reconstructions of the ENSO cycle going back hundreds of years. The ENSO phase was positive (El Nino) in 1885 and into 1886.

    Speaking of cities and the potential for the urban heat island effect: Berkeley Earth concluded that the UHI bias has been flat to even negative since 1950, the period in which the anthropogenic influence is most acute.

    33 minutes ago, TriPol said:

    But, let's assume you're right. Let's assume that the average temperature, globally has increased by 2C since 1880 and that we're hotter than we've been in a long time. Well, what were the conditions like thousands of years ago? Seems to me that human beings survived this pretty well. 

    Berkeley Earth shows that the GMST increased about 1.0C since 1880.

    Temperature proxies like those derived from ice cores, tree ring analysis, ocean sediments, etc. can provide insights into the GMST going back hundreds of thousands to millions of years.

  6. 2 hours ago, TriPol said:

    But how accurate were those thermometers back then in the 1,000 stations in 1880? 

    Certainly not as accurate as today. But here's the cool thing about math: trivially, if a thermometer has an error of, say, 2 C and there are 1000 thermometers, then the error of the mean is only 2/sqrt(1000) = 0.06 C. In reality it's far more complicated than that. After it is all said and done, surface datasets like GISTEMP publish an error of about 0.10 C before WWII and 0.05 C after for the global mean temperature. The error envelope expands the further back into the 1800s you go.
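    The "trivial" version of that argument is just the standard error of the mean again; a quick sketch, with the caveat stated above that real datasets use far more sophisticated methods (spatial correlation, homogenization, coverage gaps):

    ```python
    import math

    # Naive assumption: 1000 independent thermometers, each with 2 C error
    per_station_error = 2.0   # deg C, assumed single-thermometer error
    n_stations = 1000

    # Error of the mean shrinks as 1/sqrt(N) when errors are independent
    naive_error_of_mean = per_station_error / math.sqrt(n_stations)
    print(f"{naive_error_of_mean:.2f} C")  # ~0.06 C
    ```

    The 1/sqrt(N) shrinkage only holds for random, uncorrelated errors; systematic biases shared across stations do not average away, which is why the published uncertainties are larger than this naive figure.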

    Berkeley Earth actually has some in depth papers on how it works and how the uncertainty is determined. Fair warning...it's thick stuff.

    http://berkeleyearth.org/static/papers/Methods-GIGS-1-103.pdf

    http://berkeleyearth.org/static/pdf/methods-paper-supplement.pdf

    • Like 2
  7. 5 hours ago, etudiant said:

    Powerful stuff! 

    Can anyone help put the heat content change into perspective? The increase of roughly 300 Zetta Joules since the 1980s is what percentage of the annual global heat budget?

     

    Using my +0.7 W/m^2 figure above and dividing by 240 W/m^2 yields 0.7 / 240 = 0.3% of the surface budget. However, keep in mind that +0.7 W/m^2 is just the imbalance that still needs to equilibrate. The energy imbalance from the past that has caused 1.1 C of warming has already equilibrated, so it is not included in my +0.7 W/m^2 value. This additional energy is probably in the 1.5-2.0 W/m^2-ish range (just guessing right now). If you include that then we're probably close to 1% of the surface budget.

  8. So to put that in perspective the change from 1985 to 2019 is about 350e21 joules (eyeballing for now). The average uptake is thus...

    350e21 (OHC-joules) / (31.56e6 (seconds in year) * 34 (years))  / 510e12 (Earth area-m^2) = 0.64 W/m^2.

    The ocean takes up about 90% of the imbalance so we can probably estimate the average imbalance as 0.64 / 0.9 = +0.7 W/m^2.

    That is a pretty large imbalance and is in line with expectations: the net radiative forcing from all agents minus what has already equilibrated to raise the global mean temperature.

    Over the last 10 years this imbalance actually works out to about +1.0 W/m^2. The implication: even if GHG emissions were to cease instantly, there is still a lot of warming waiting in the pipeline that needs to equilibrate.
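    The uptake arithmetic above, written out; the 350e21 J change is eyeballed from the OHC graph and the 90% ocean fraction is an approximation:

    ```python
    # Eyeballed / approximate inputs from the post above
    ohc_change_joules = 350e21   # OHC change, 1985-2019 (eyeballed)
    years = 34
    seconds_per_year = 31.56e6
    earth_area_m2 = 510e12

    # Average ocean heat uptake expressed as a flux over Earth's surface
    uptake = ohc_change_joules / (seconds_per_year * years) / earth_area_m2
    print(f"uptake = {uptake:.2f} W/m^2")   # ~0.64

    # The ocean absorbs ~90% of the imbalance, so scale up for the full EEI
    eei = uptake / 0.9
    print(f"EEI    = {eei:.2f} W/m^2")      # ~0.71
    ```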

     

     

    • Like 1
  9. 5 hours ago, LibertyBell said:

    I think it's accelerating and will continue to accelerate as we will see the Antarctic ice start to melt at a faster pace too- there are hints that might already be underway.  Australia might be uninhabitable within a few decades.

     

     

    It's interesting what is happening in the SH as well. According to the IPCC the expectation was for mostly flat trends and possibly even an increase through the 2020s. So to see the SH decline and even drop to record lows tells us that at least some sea ice predictions have underestimated the decline down there too. It seems as though there is a long history of sea ice predictions being too conservative, sometimes shockingly so. For example, back in 2001, inferring from a graphic in AR3, the IPCC predicted that NH sea ice extent wouldn't drop below an annual mean of 10.5e6 km^2 until about 2040. It first happened in 2007 and then 6 times after that, including the last 4 years in a row. So by taking a more conservative stance and hinting that the declines may moderate in the 2020s, I do so fully aware that I could end up getting burned. But I also understand that trendline reversion is a powerful concept, and I'm trying to stay pragmatic and not come across as overly alarmist either.

    • Like 2
    So if I remember correctly some of the recent computer modeling studies showed that the 2020s might be characterized as a period of stalling out on the declines before picking back up again in the 2030s. What do you guys think? Are we going to see the same dramatic declines or will there be a hiatus? I think I'm more in favor of a moderation in the decline rates. But it's not lost on me that those who have made similar conservative predictions in the past have gotten burned. So I'm prepared to be wrong.

  11. According to NSIDC...

    For 2019 the NH (Arctic) ended with an annual mean of 10.186e6 km^2 of extent. This is the 2nd lowest after 2016 which ended with 10.163e6.

    For 2019 the SH (Antarctic) ended with an annual mean of 10.826e6 km^2 of extent. This is the 2nd lowest after 2017 which ended with 10.749e6.

    • Like 2
  12. 6 hours ago, BillT said:
    to make the claim REQUIRES there be no margin of error......because to say one year was .01 degree warmer than another REQUIRES the margin of error to be much less than .01....the claim itself excludes there being any margin of error.......and i must ask who is the "we all" you posted of?  "all" includes many people.

     

    Annual global mean temperatures from most datasets are accompanied with a margin of error. This makes annual rankings probabilistic. Berkeley Earth has a good visualization of this in their 2018 report. You can see how the temperature distribution curve for each year peaks at the reported value and how the tails can overlap with other years.

    http://berkeleyearth.org/2018-temperatures/

    They have an excellent paper describing how the averaging process works and how uncertainties are dealt with and reported.

    http://berkeleyearth.org/static/papers/Methods-GIGS-1-103.pdf 

    Most other datasets post their uncertainties and make annual rankings in a similar manner.
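    The probabilistic-ranking idea above can be sketched directly: if two annual anomalies each carry a normal uncertainty, the probability one year was actually warmer follows from the distribution of their difference. The anomaly and uncertainty values below are illustrative placeholders, not real dataset numbers:

    ```python
    import math

    def prob_a_warmer(mean_a, sigma_a, mean_b, sigma_b):
        """P(A > B) assuming independent normal errors on each annual mean."""
        diff_mean = mean_a - mean_b
        diff_sigma = math.sqrt(sigma_a**2 + sigma_b**2)
        # Normal CDF evaluated at 0 for the difference distribution
        return 0.5 * (1 + math.erf(diff_mean / (diff_sigma * math.sqrt(2))))

    # Two hypothetical years 0.01 C apart, each with 0.05 C (1-sigma) uncertainty
    p = prob_a_warmer(0.90, 0.05, 0.89, 0.05)
    print(f"{p:.2f}")  # ~0.56: barely better than a coin flip
    ```

    This is why a 0.01 C difference between years yields only a weak probabilistic ranking, exactly as the overlapping distribution tails in the Berkeley Earth visualization show.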

     

     

    • Like 1
  13. One model that gets a lot of attention is that done by Hansen and published in 1988. See H88 here.

    He models three different scenarios: A, B, and C. A is a high-emissions scenario, B is a medium-emissions scenario, and C is a low-emissions scenario. Scenario A assumes exponential growth of GHG emissions plus hypothetical new emissions. B assumes modest GHG emissions, most similar to what has been experienced so far. And C assumes drastic curtailment of all GHG emissions by 2000. Scenarios B and C inject volcanic eruptions in the years 1995 and 2015 similar in magnitude to El Chichon and Agung; Scenario A has no such eruptions. Scenario A is claimed to produce an equivalent forcing of 2xCO2 by 2030, B by 2060, and never for C. Scenario B was said to be the most realistic and likely future trajectory.

    The actual trajectory of forcing lies between B and C. The HDAS2019 publication referenced in this thread estimates B's forcing as 27% higher than what actually transpired. Today we know this is partially due to the Montreal Protocol and probably also the somewhat higher volcanic aerosol forcing from Pinatubo in 1991 and several VEI 4+ eruptions in the early 2000s. Had H88 used correct inputs in a hypothetical scenario that closely matched reality, that scenario would have exhibited very high skill in predicting the global mean surface temperature. This tells us the problem was more with the assumed inputs than with the model physics.

     

    • Like 2
  14. A new study appeared yesterday evaluating the skill of past climate model simulations.

    https://www.sciencemag.org/news/2019/12/even-50-year-old-climate-models-correctly-predicted-global-warming

    The news article contains a link for free access to the publication. Here is the paywall link though.

    https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2019GL085378

    And here is the materials available on github.

    https://github.com/hausfath/OldModels

    Abstract

    Retrospectively comparing future model projections to observations provides a robust and independent test of model skill. Here we analyse the performance of climate models published between 1970 and 2007 in projecting future global mean surface temperature (GMST) changes. Models are compared to observations based on both the change in GMST over time and the change in GMST over the change in external forcing. The latter approach accounts for mismatches in model forcings, a potential source of error in model projections independent of the accuracy of model physics. We find that climate models published over the past five decades were skillful in predicting subsequent GMST changes, with most models examined showing warming consistent with observations, particularly when mismatches between model‐projected and observationally‐estimated forcings were taken into account.

    Plain Language Summary

    Climate models provide an important way to understand future changes in the Earth's climate. In this paper we undertake a thorough evaluation of the performance of various climate models published between the early 1970s and the late 2000s. Specifically, we look at how well models project global warming in the years after they were published by comparing them to observed temperature changes. Model projections rely on two things to accurately match observations: accurate modeling of climate physics, and accurate assumptions around future emissions of CO2 and other factors affecting the climate. The best physics‐based model will still be inaccurate if it is driven by future changes in emissions that differ from reality. To account for this, we look at how the relationship between temperature and atmospheric CO2 (and other climate drivers) differs between models and observations. We find that climate models published over the past five decades were generally quite accurate in predicting global warming in the years after publication, particularly when accounting for differences between modeled and actual changes in atmospheric CO2 and other climate drivers. This research should help resolve public confusion around the performance of past climate modeling efforts, and increases our confidence that models are accurately projecting global warming.

    • Like 2
  15. From the early arrivals...CFSR and UAH for November came in 0.10 and 0.09 higher than October respectively. I would imagine the conventional surface station datasets like GISTEMP and Berkeley Earth and the like will be at least as warm as October if not warmer. This would position 2019 with very high odds of being the 2nd warmest on record.

    • Like 2
  16. 5 hours ago, banshee8 said:

    Well, what is interesting is that trends in the U.S. don't always follow the global ones. For the lower 48, since 2007 we've seen a shift to colder continental patterns during the cold season that has been opposite of global trends.

    Compare the winters since 2007 to the decade that preceded. Complete anomaly reversal, and definitely colder overall.

    That's pretty typical. Regional trends often do not align with global trends. Another example...Siberia cooled even more than the upper great plains in the CONUS during this same period. Yet...the planet is warmer overall now than in the previous decade. I asked the question above...is this a result of a systematic shift via WACKy, quasi resonant amplification of the polar jet, or something else? Or is it just another chaotic artifact that will disappear in the following decade? Lots of questions...

    • Like 1
  17. 45 minutes ago, banshee8 said:

    I mean, the NSIDC is a pretty mainstream source. 

    I think some predictions got a lot more aggressive after 2007, but most toned it down after about 2013.

    NSIDC isn't the source though. Their "NSIDC in the News" section is just links to various articles and blogs that mention NSIDC. They have hundreds of links in this section every year. It's not even clear if these articles (which are dead links now) were in reference to a bona-fide peer reviewed study or some random blogger's opinion. Note the disclaimer in the section.

    The following items link to media coverage of NSIDC in various news outlets, online magazines, editorial pieces, and blogs. The content of these articles and blog posts does not necessarily reflect the views of NSIDC, our collaborators, or our funding agencies.

    Like I said, I can't see the articles anymore so I have no idea what the details of these "predictions" are. But based on the timing of when the articles appeared I can speculate a bit. There were two fellows during this period who made some very aggressive predictions that got widespread media attention. The first was Maslowski and the second was Wadhams. Neither was characterized by broad acceptance in the academic community. In Wadhams's case he was pretty much entirely ignored. Maslowski was a legit researcher, and in his defense his work was frequently taken out of context. His 2016±3 date (which was often erroneously cited as 2013) was statistical and appeared in a publication that I believe used many methods to arrive at many different estimates, with 2016 being the lowest, making it a cherry pick, and a really bad one at that. Maslowski even warned against taking his work out of context and specifically chided Al Gore for doing just that.

    The point...be careful about linking media popularity with the mainstream views of bona-fide scientists. They are often at odds with each other.

    • Like 1