bdgwx

Members
  • Content Count: 784
  • Joined
  • Last visited
About bdgwx

  • Birthday 10/19/1977

Profile Information

  • Gender: Male
  • Location: St. Louis

Recent Profile Visitors

1,952 profile views
  1. One model that gets a lot of attention is the one by Hansen published in 1988. See H88 here. He models three scenarios: A, B, and C. A is a high-emissions scenario, B a medium-emissions scenario, and C a low-emissions scenario. Scenario A assumes exponential growth of GHG emissions plus hypothetical new emissions. B assumes modest GHG emissions, most similar to what has been experienced so far. And C assumes drastic curtailment of all GHG emissions by 2000. Scenarios B and C inject volcanic eruptions in 1995 and 2015 similar in magnitude to El Chichon and Agung; scenario A has no such eruptions injected. Scenario A is claimed to produce a forcing equivalent to 2xCO2 by 2030, B by 2060, and C never. Scenario B was said to be the most realistic and likely future trajectory. The actual trajectory of forcing lies between B and C. The HDAS2019 publication referenced in this thread estimates B's forcing as 27% higher than what actually transpired. Today we know this is partially due to the Montreal Protocol and probably also the somewhat higher volcanic aerosol forcing from Pinatubo in 1991 and several VEI 4+ eruptions in the early 2000s. Had H88 used correct inputs in a hypothetical scenario that closely matched reality, that scenario would have exhibited very high skill in predicting the global mean surface temperature. This tells us the problem was more with the assumed inputs than with the model physics.
  2. A new study appeared yesterday evaluating the skill of past climate model simulations. https://www.sciencemag.org/news/2019/12/even-50-year-old-climate-models-correctly-predicted-global-warming The news article contains a link for free access to the publication. Here is the paywall link though. https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2019GL085378

Abstract: Retrospectively comparing future model projections to observations provides a robust and independent test of model skill. Here we analyse the performance of climate models published between 1970 and 2007 in projecting future global mean surface temperature (GMST) changes. Models are compared to observations based on both the change in GMST over time and the change in GMST over the change in external forcing. The latter approach accounts for mismatches in model forcings, a potential source of error in model projections independent of the accuracy of model physics. We find that climate models published over the past five decades were skillful in predicting subsequent GMST changes, with most models examined showing warming consistent with observations, particularly when mismatches between model‐projected and observationally‐estimated forcings were taken into account.

Plain Language Summary: Climate models provide an important way to understand future changes in the Earth's climate. In this paper we undertake a thorough evaluation of the performance of various climate models published between the early 1970s and the late 2000s. Specifically, we look at how well models project global warming in the years after they were published by comparing them to observed temperature changes. Model projections rely on two things to accurately match observations: accurate modeling of climate physics, and accurate assumptions around future emissions of CO2 and other factors affecting the climate. The best physics‐based model will still be inaccurate if it is driven by future changes in emissions that differ from reality. To account for this, we look at how the relationship between temperature and atmospheric CO2 (and other climate drivers) differs between models and observations. We find that climate models published over the past five decades were generally quite accurate in predicting global warming in the years after publication, particularly when accounting for differences between modeled and actual changes in atmospheric CO2 and other climate drivers. This research should help resolve public confusion around the performance of past climate modeling efforts, and increases our confidence that models are accurately projecting global warming.
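The paper's two comparison approaches can be sketched in a few lines. This is a hedged illustration, not the authors' code, and the function names and all numeric inputs below are made-up illustrative values:

```python
# Sketch of the two skill metrics described in the abstract: comparing a
# model to observations by the raw warming rate, and by warming per unit
# forcing, which factors out a wrong emissions assumption.
# All numbers are hypothetical, not from the paper.

def trend_ratio(model_trend, obs_trend):
    """Metric 1: ratio of modeled to observed warming rate (K/decade)."""
    return model_trend / obs_trend

def implied_tcr_ratio(model_trend, model_forcing_trend,
                      obs_trend, obs_forcing_trend):
    """Metric 2: ratio of warming-per-unit-forcing, insensitive to an
    incorrect assumed forcing trajectory."""
    model_response = model_trend / model_forcing_trend
    obs_response = obs_trend / obs_forcing_trend
    return model_response / obs_response

# A hypothetical model that assumed too much forcing (like H88 scenario B)
# overshoots on raw temperature but scores ~1.0 per unit forcing.
print(trend_ratio(0.25, 0.18))                    # > 1: apparent overshoot
print(implied_tcr_ratio(0.25, 0.50, 0.18, 0.36))  # ~1.0: physics is fine
```

The point of the second metric is exactly the H88 story above: when the assumed forcing is wrong, dividing the temperature response by the forcing isolates the model physics from the scenario assumptions.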
  3. From the early arrivals...CFSR and UAH for November came in 0.10 and 0.09 higher than October, respectively. I would imagine the conventional surface station datasets like GISTEMP and Berkeley Earth will be at least as warm as October, if not warmer. That would give 2019 very high odds of being the 2nd warmest on record.
  4. That's pretty typical. Regional trends often do not align with global trends. Another example...Siberia cooled even more than the upper great plains in the CONUS during this same period. Yet...the planet is warmer overall now than in the previous decade. I asked the question above...is this a result of a systematic shift via WACKy, quasi resonant amplification of the polar jet, or something else? Or is it just another chaotic artifact that will disappear in the following decade? Lots of questions...
  5. NSIDC isn't the source though. Their "NSIDC in the News" section is just links to various articles and blogs that mention NSIDC. They have hundreds of links in this section every year. It's not even clear if these articles (which are dead links now) were in reference to a bona-fide peer reviewed study or some random blogger's opinion. Note the disclaimer in the section. The following items link to media coverage of NSIDC in various news outlets, online magazines, editorial pieces, and blogs. The content of these articles and blog posts does not necessarily reflect the views of NSIDC, our collaborators, or our funding agencies. Like I said, I can't see the articles anymore so I have no idea what the details of these "predictions" are. But based on the timing of when the articles appeared I can speculate a bit. There were two fellows during this period who made some very aggressive predictions that got widespread media attention. The first was Maslowski and the second was Wadhams. Neither was characterized by broad acceptance in the academic community. In Wadhams' case he was pretty much entirely ignored. Maslowski was a legit researcher, but in his defense his work was frequently taken out of context. His 2016±3 date (which was often erroneously cited as 2013) was statistical and appeared in a publication that I believe used many methods to arrive at many different estimates, with 2016 being the lowest, making its citation a cherry-pick, and a bad one at that. Maslowski even warned against taking his work out of context and specifically chided Al Gore for doing just that. The point...be careful about linking media popularity with the mainstream views of bona-fide scientists. They are often at odds with each other.
  6. Broadly speaking the predicted first "ice-free" year has been getting pushed up. You'll find select studies here and there that have really aggressive predictions, but those are either few in number or not well received enough to influence the consensus much. In the 1990s the prevailing prediction was around 2100 or thereafter. And in the IPCC's Third Assessment Report from 2001 it was stated (via a chart) that the first annual mean extent of 10.5e6 km^2 would not occur until about 2040. In reality it occurred in 2007. Even today many sea-ice models continue to struggle with the rapid pace of sea ice declines in both the NH and SH. Today it seems as though the consensus lands somewhere in the 2040-2060 range. So we still have a good wait ahead of us before we see < 1e6 km^2 of extent at the minimum. It's certainly possible that it could occur prior to 2040. Some on this forum and the ASIF believe we'll be lucky to make it to 2040. I'm in the more conservative camp and believe it will be after 2040. I'm prepared to be proven wrong though.
  7. Here is the average ONI for each of the last 6 years. Note that because the global mean temperature response tends to lag ENSO by a few months I have computed each year's average ONI using an offset. It seems as though this lag is typically in the range of 3 to 6 months so I've included two values. The first is the 3-month lag and the second is the 6-month lag. For example the 6-month lag value of +1.7 in 2016 is computed from 2015-07 to 2016-06.

2019 = +0.6, +0.6
2018 = -0.4, -0.5
2017 = -0.1, -0.2
2016 = +1.2, +1.7
2015 = +1.0, +0.6
2014 = -0.1, -0.2
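The lagged-average calculation described above is easy to get off by a month, so here is a small sketch of it. The `oni` dict and its keying are my own assumed layout, not any official data format:

```python
# Sketch of the lagged annual ONI average described above: the mean of the
# 12 monthly ONI values ending `lag` months before December of the target
# year. `oni` is assumed to be a dict keyed by (year, month).

def lagged_annual_oni(oni, year, lag):
    """Mean ONI over the 12 months ending `lag` months before Dec of `year`.
    E.g. lag=6 for 2016 averages 2015-07 through 2016-06."""
    total = 0.0
    m = 12 - lag                       # last month of the window
    y = year
    for _ in range(12):
        if m < 1:                      # roll back into the prior year
            y_, m_ = y - 1, m + 12
        else:
            y_, m_ = y, m
        total += oni[(y_, m_)]
        m -= 1
    return total / 12.0
```

With `lag=6` and `year=2016` the window runs 2015-07 through 2016-06, matching the example in the post.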
  8. Here are the YTD rankings (Jan - Oct) from Berkeley Earth.

2016 = +0.99
2019 = +0.91
2017 = +0.86
2015 = +0.80
2018 = +0.79
2014 = +0.70

And here are the annual rankings (Jan - Dec) from Berkeley Earth.

2016 = +0.97
2017 = +0.86
2015 = +0.83
2018 = +0.79
2014 = +0.70

A top-3 finish looks like a good bet at this point. Nov and Dec need to average about +0.62 for the annual mean to rank above 2017's +0.86 finish and capture 2nd place. Based on the latest GFS monthly analysis + 7-day forecast (up to Nov 26th) it appears that Nov should finish strong and end well above the +0.62 threshold.
http://berkeleyearth.lbl.gov/auto/Global/Land_and_Ocean_complete.txt
http://www.karstenhaustein.com/climate.php
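The +0.62 threshold is simple arithmetic from the numbers in the post; here is the back-of-envelope check:

```python
# Check of the Nov/Dec threshold quoted above: given a Jan-Oct 2019 mean
# of +0.91 over 10 months, what must Nov and Dec average for the 12-month
# mean to beat 2017's +0.86 annual value?

ytd_2019 = 0.91   # Jan-Oct mean (10 months), Berkeley Earth
target   = 0.86   # 2017 annual mean to beat

# (10*ytd + 2*x) / 12 > target  =>  x > (12*target - 10*ytd) / 2
nov_dec_avg = (12 * target - 10 * ytd_2019) / 2
print(round(nov_dec_avg, 2))
```

This works out to about +0.61, consistent with the quoted +0.62 once rounding of the monthly inputs is allowed for.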
  9. Maybe a bit of WACC (Warm Arctic, Cold Continents) or perhaps the hypothesized quasi-resonant amplification of the jet stream starting to show itself?
  10. Copernicus says 2019 was the warmest (albeit by a narrow margin) on record for the month of October. They also cite the last 12 months as being 1.2C above the pre-industrial temperature. And with an energy imbalance that is still +0.6 W/m^2 it is reasonable to assume that more warming is likely already baked in. https://climate.copernicus.eu/surface-air-temperature-october-2019
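The "baked in" warming from the imbalance can be roughed out with one multiplication. The sensitivity parameter below is my own assumed textbook-range value, not something from the Copernicus page:

```python
# Rough sketch of the committed-warming claim: an unrealized energy
# imbalance implies further warming even at constant forcing. The climate
# sensitivity parameter here (~0.8 K per W/m^2) is an illustrative
# assumption, not a number from the post or from Copernicus.

LAMBDA = 0.8      # assumed sensitivity parameter, K per (W/m^2)
imbalance = 0.6   # current Earth energy imbalance, W/m^2 (from the post)

committed = LAMBDA * imbalance
print(round(committed, 2))  # roughly half a degree still in the pipeline
```

Under that assumption, the +0.6 W/m^2 imbalance corresponds to roughly 0.5 C of additional equilibrium warming beyond the 1.2 C already observed.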
  11. So in an effort to steer this thread back on track...it looks like the refreeze has really ramped up lately. We are still in record territory for this time of year, but it looks like 2019 might jump ahead of 2016 in the next week or so.
  12. CO2 does not add heat. It traps heat. In this context "trap" means to slow the egress transmission of heat without slowing the ingress transmission of heat. The insulation in your home acts as a thermal barrier to trap heat. The furnace adds energy to your home. Because the insulation has changed the rate at which heat is lost your home will achieve a higher equilibrium temperature with the insulation than it would otherwise. But the furnace is still the energy source. ...similarly... The GHGs in Earth's atmosphere act as a thermal barrier to trap heat. The Sun adds energy to the Earth. Because the GHGs have changed the rate at which heat is lost the Earth will achieve a higher equilibrium temperature with the GHGs than it would otherwise. But the Sun is still the energy source.
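The insulation analogy above maps directly onto the standard zero-dimensional energy-balance picture: the Sun is the energy source, and an effective emissivity below 1 stands in for GHGs slowing the egress of heat. The parameter values below are common textbook numbers, not figures from the post:

```python
# Minimal zero-dimensional energy-balance sketch of the trapping argument:
# equilibrium is where absorbed sunlight equals emitted longwave. Lowering
# the effective emissivity (the "insulation") raises the equilibrium
# temperature without adding any energy source.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.3       # planetary albedo

def equilibrium_temp(emissivity):
    """Temperature (K) where absorbed solar flux = emitted longwave flux."""
    absorbed = S0 * (1 - ALBEDO) / 4          # per unit surface area
    return (absorbed / (emissivity * SIGMA)) ** 0.25

print(equilibrium_temp(1.0))    # no greenhouse "insulation": ~255 K
print(equilibrium_temp(0.612))  # with greenhouse trapping: ~288 K
```

Note the Sun's input is identical in both calls; only the rate of heat loss changes, yet the equilibrium temperature rises by ~33 K, which is the furnace-and-insulation point exactly.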
  13. Just understand that natural CO2 molecules have the same radiative behavior as anthropogenic CO2 molecules. So an anthropogenic pulse of CO2 (as with fossil fuel combustion and cement production) will lead to the same amount of warming as a natural pulse of CO2 (like the one that occurred during the PETM) given the same magnitude of the pulse. In that manner the laws of physics don't really care how the CO2 got into the atmosphere. I also sense a bit of the logical fallacy of affirming a disjunct. Just because CO2 was naturally modulated in the past doesn't mean that it can't be anthropogenically modulated today and cause warming. And yes, CO2 levels were much higher in the past. This is an essential piece of the puzzle in solving the faint young Sun problem. Remember, solar output is about 1% weaker for every 120 million years in the past. 600 mya the solar radiative forcing was about -12 W/m^2 relative to today (see Gough 1981). So it would take ~9.5x the amount of CO2 just to offset the reduced solar forcing of the past relative to today (note that 5.35 * ln(9.5) = +12 W/m^2).
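That last arithmetic step can be checked with the common simplified CO2 forcing expression F = 5.35 * ln(C/C0) (Myhre et al. 1998), which is the formula the 5.35 coefficient comes from:

```python
# Checking the post's arithmetic with the simplified CO2 forcing formula
# F = 5.35 * ln(C/C0): what CO2 multiple offsets a -12 W/m^2 solar deficit?

import math

def co2_forcing(ratio):
    """Radiative forcing (W/m^2) for a CO2 concentration ratio C/C0."""
    return 5.35 * math.log(ratio)

# Invert the formula for a required forcing of +12 W/m^2
ratio = math.exp(12.0 / 5.35)
print(round(ratio, 1))              # ~9.4x, close to the ~9.5x in the post
print(round(co2_forcing(9.5), 1))   # ~12.0 W/m^2, matching the post
```

So the ~9.5x figure checks out: roughly an order of magnitude more CO2 was needed 600 mya just to compensate for the dimmer young Sun.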
  14. I believe the point of the graph was to test your comprehension of noisy information. This is actually a carefully studied topic in academia, especially in the context of climate data, where a disproportionate number of analyses in the blogosphere suffer from various cognitive biases. I want you to get started with this paper. Daron et al., 2015: Interpreting climate data visualisations to inform adaptation decisions. There are many well documented cognitive biases that influence an individual's comprehension of a graph. They include but are not limited to anchoring, framing, etc. In a nutshell, when individuals are presented with a plot of noisy data, some of them are incapable of mentally forming a linear or exponential regression trendline.
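What "mentally forming a regression trendline" amounts to formally is an ordinary least-squares fit, which reliably recovers a small underlying trend that the eye can lose in the noise. The data below are synthetic, not any real climate record:

```python
# Sketch of trend recovery from noisy data: a small linear trend buried in
# noise of much larger amplitude, recovered by an OLS fit. Synthetic data
# only -- the trend and noise levels are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(42)
x = np.arange(120)                 # e.g. 120 months of data
true_slope = 0.02                  # small underlying trend per step
noise = rng.normal(0.0, 1.0, size=x.size)   # noise 50x the per-step trend
y = true_slope * x + noise

slope, intercept = np.polyfit(x, y, 1)
print(f"fitted slope: {slope:.3f} (true: {true_slope})")
```

Even with per-step noise fifty times the per-step trend, the fit lands close to the true slope because the regression pools information across the whole record, which is exactly what casual visual inspection fails to do.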
  15. Also per the NSIDC 5-day average, 10/17/2019 marks the largest negative anomaly on record. We are 3.065 million sq km below the 1981-2010 climatological average. This breaks the 3.048 million sq km record set on 10/9/2012. In other words, we have less sea ice (in terms of extent) relative to average than at any point in the satellite era.