Everything posted by bdgwx

  1. So if I remember correctly some of the recent computer modeling studies showed that the 2020's might be characterized as a period of stalling out on the declines before picking back up again in the 2030's. What do you guys think? Are we going to see the same dramatic declines or will there be a hiatus? I think I'm more in favor of a moderation in the decline rates. But, it's not lost on me that those who have made similar conservative predictions in the past have gotten burned. So I'm prepared to be wrong.
  2. According to NSIDC... For 2019 the NH (Arctic) ended with an annual mean of 10.186e6 km^2 of extent. This is the 2nd lowest after 2016 which ended with 10.163e6. For 2019 the SH (Antarctic) ended with an annual mean of 10.826e6 km^2 of extent. This is the 2nd lowest after 2017 which ended with 10.749e6.
  3. Annual global mean temperatures from most datasets are accompanied by a margin of error. This makes annual rankings probabilistic. Berkeley Earth has a good visualization of this in their 2018 report. You can see how the temperature distribution curve for each year peaks at the reported value and how the tails can overlap with other years. http://berkeleyearth.org/2018-temperatures/ They have an excellent paper describing how the averaging process works and how uncertainties are dealt with and reported. http://berkeleyearth.org/static/papers/Methods-GIGS-1-103.pdf Most other datasets post their uncertainties and make annual rankings in a similar manner.
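A minimal sketch of why overlapping uncertainties make rankings probabilistic (this is not Berkeley Earth's method; the anomalies and uncertainties below are made-up numbers for illustration):

```python
import numpy as np

# Hypothetical annual anomalies (deg C) with 95% uncertainty intervals.
# These values are illustrative only and do not come from any dataset.
year_a_mean, year_a_95 = 0.91, 0.05
year_b_mean, year_b_95 = 0.86, 0.05

rng = np.random.default_rng(42)
n = 100_000

# Assume normally distributed errors; convert 95% intervals to standard deviations.
a = rng.normal(year_a_mean, year_a_95 / 1.96, n)
b = rng.normal(year_b_mean, year_b_95 / 1.96, n)

# The ranking is probabilistic: A is most likely warmer, but not certainly so.
print(f"P(year A warmer than year B) = {np.mean(a > b):.2f}")
```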
  4. According to the NSIDC the 5 day average is well outside the interdecile range and just barely inside the 2σ envelope. It is 870,000 sq km below the 1981-2010 mean. Also, I counted 6 other years in which the sea ice extent increase from 12/1 to 12/23 was higher.
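For anyone unfamiliar with the two yardsticks above, here is a small synthetic illustration (the baseline values are made up, not NSIDC data) of how a value can fall outside the interdecile range while still sitting inside the ±2σ envelope:

```python
import numpy as np

# Synthetic stand-in for a 1981-2010 baseline of extents on a given date,
# in millions of km^2. Real values would come from NSIDC.
rng = np.random.default_rng(1)
baseline = rng.normal(12.5, 0.45, 30)

p10, p90 = np.percentile(baseline, [10, 90])      # interdecile range
mean, sd = baseline.mean(), baseline.std(ddof=1)
lo2, hi2 = mean - 2 * sd, mean + 2 * sd           # 2-sigma envelope

current = mean - 1.9 * sd   # a value 1.9 sigma below the baseline mean
print(f"interdecile range: {p10:.2f} to {p90:.2f}")
print(f"2-sigma envelope:  {lo2:.2f} to {hi2:.2f}")
print("outside interdecile range:", current < p10 or current > p90)
print("inside 2-sigma envelope:  ", lo2 <= current <= hi2)
```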
  5. And with reanalysis running warmer through the first half of December I think it's a near certainty that 2019 is going to clinch 2nd place in most datasets.
  6. One model that gets a lot of attention is the one done by Hansen and published in 1988. See H88 here. He models three different scenarios: A, B, and C. A is a high emissions scenario, B is a medium emissions scenario, and C is a low emissions scenario. Scenario A assumes exponential growth of GHG emissions plus hypothetical new emissions. B assumes modest GHG emissions most similar to what has been experienced so far. And C assumes drastic curtailment of all GHG emissions by 2000. Scenarios B and C inject volcanic eruptions in the years 1995 and 2015 similar in magnitude to El Chichon and Agung. Scenario A has no such volcanic eruptions injected. Scenario A is claimed to produce an equivalent forcing of 2xCO2 by 2030, B by 2060, and never for C. Scenario B was said to be the most realistic and likely future trajectory. The actual trajectory of forcing lies between B and C. The HDAS2019 publication referenced in this thread estimates B's forcing as 27% higher than what actually transpired. Today we know this is partially due to the Montreal Protocol and probably also the somewhat higher volcanic aerosol forcing that occurred due to Pinatubo 1991 and several VEI 4+ eruptions in the early 2000s. Had H88 used correct inputs in a hypothetical scenario that closely matched reality then that scenario would have exhibited very high skill in predicting the global mean surface temperature. This tells us the problem was more with the assumed inputs than with the model physics.
  7. A new study appeared yesterday evaluating the skill of past climate model simulations. https://www.sciencemag.org/news/2019/12/even-50-year-old-climate-models-correctly-predicted-global-warming The news article contains a link for free access to the publication. Here is the paywall link though. https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2019GL085378 And here are the materials available on GitHub. https://github.com/hausfath/OldModels

     Abstract: Retrospectively comparing future model projections to observations provides a robust and independent test of model skill. Here we analyse the performance of climate models published between 1970 and 2007 in projecting future global mean surface temperature (GMST) changes. Models are compared to observations based on both the change in GMST over time and the change in GMST over the change in external forcing. The latter approach accounts for mismatches in model forcings, a potential source of error in model projections independent of the accuracy of model physics. We find that climate models published over the past five decades were skillful in predicting subsequent GMST changes, with most models examined showing warming consistent with observations, particularly when mismatches between model‐projected and observationally‐estimated forcings were taken into account.

     Plain Language Summary: Climate models provide an important way to understand future changes in the Earth's climate. In this paper we undertake a thorough evaluation of the performance of various climate models published between the early 1970s and the late 2000s. Specifically, we look at how well models project global warming in the years after they were published by comparing them to observed temperature changes. Model projections rely on two things to accurately match observations: accurate modeling of climate physics, and accurate assumptions around future emissions of CO2 and other factors affecting the climate. The best physics‐based model will still be inaccurate if it is driven by future changes in emissions that differ from reality. To account for this, we look at how the relationship between temperature and atmospheric CO2 (and other climate drivers) differs between models and observations. We find that climate models published over the past five decades were generally quite accurate in predicting global warming in the years after publication, particularly when accounting for differences between modeled and actual changes in atmospheric CO2 and other climate drivers. This research should help resolve public confusion around the performance of past climate modeling efforts, and increases our confidence that models are accurately projecting global warming.
  8. From the early arrivals...CFSR and UAH for November came in 0.10 and 0.09 higher than October respectively. I would imagine the conventional surface station datasets like GISTEMP and Berkeley Earth and the like will be at least as warm as October if not warmer. This would position 2019 with very high odds of being the 2nd warmest on record.
  9. That's pretty typical. Regional trends often do not align with global trends. Another example...Siberia cooled even more than the upper great plains in the CONUS during this same period. Yet...the planet is warmer overall now than in the previous decade. I asked the question above...is this a result of a systematic shift via WACKy, quasi resonant amplification of the polar jet, or something else? Or is it just another chaotic artifact that will disappear in the following decade? Lots of questions...
  10. NSIDC isn't the source though. Their "NSIDC in the News" section is just links to various articles and blogs that mention NSIDC. They have hundreds of links in this section every year. It's not even clear if these articles (which are dead links now) were in reference to a bona-fide peer reviewed study or some random blogger's opinion. Note the disclaimer in the section: "The following items link to media coverage of NSIDC in various news outlets, online magazines, editorial pieces, and blogs. The content of these articles and blog posts does not necessarily reflect the views of NSIDC, our collaborators, or our funding agencies." Like I said, I can't see the articles anymore so I have no idea what the details of these "predictions" are. But based on the timing of when the articles appeared I can speculate a bit. There were two fellows during this period who made some very aggressive predictions that got widespread media attention. The first was Maslowski and the second was Wadhams. Neither was characterized by broad acceptance in the academic community. In Wadhams' case he was pretty much entirely ignored. Maslowski was a legit researcher, and in his defense his work was frequently taken out of context. His 2016±3 date (which was often erroneously cited as 2013) was statistical and appeared in a publication that, I believe, used many methods to arrive at many different estimates with 2016 being the lowest, so citing it alone is a cherry pick, and a really bad one at that. Maslowski even warned against taking his work out of context and specifically chided Al Gore for doing just that. The point...be careful about linking media popularity with the mainstream views of bona-fide scientists. They are often at odds with each other.
  11. Broadly speaking the first "ice-free" year has been getting pushed up. You'll find select studies here and there that have really aggressive predictions, but those are either few in number or not well received enough to influence the consensus much. In the 1990's the prevailing prediction was around 2100 or thereafter. And in the IPCC's AR3 report from 2001 it was stated (via a chart) that the annual mean extent would not drop to 10.5e6 km^2 until about 2040. In reality it actually occurred in 2007. Even today many sea-ice models continue to struggle with the rapid pace of sea ice declines in both the NH and SH. Today it seems as though the consensus lands somewhere in the 2040-2060 range. So we still have a good wait ahead of us before we see < 1e6 km^2 of extent at the minimum. It's certainly possible that it could occur prior to 2040. Some on this forum and the ASIF believe we'll be lucky to make it to 2040. I'm in the more conservative camp and believe it will be after 2040. I'm prepared to be proven wrong though.
  12. Here is the average ONI for each of the last 6 years. Note that because the global mean temperature response tends to lag ENSO by a few months I have computed each year's average ONI using an offset. It seems as though this lag is typically in the range of 3 to 6 months so I've included two values. The first is the 3-month lag and the second is the 6-month lag. For example the 6-month lag value of +1.7 in 2016 is computed from 2015-07 to 2016-06.

      2019 = +0.6, +0.6
      2018 = -0.4, -0.5
      2017 = -0.1, -0.2
      2016 = +1.2, +1.7
      2015 = +1.0, +0.6
      2014 = -0.1, -0.2
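In case anyone wants to reproduce that kind of lagged average, here is a minimal sketch; the monthly values are placeholders (the real series is NOAA CPC's ONI table), and lagged_annual_oni is just a name made up for the example:

```python
# Minimal sketch of the lagged annual ONI average described above.
# The monthly values are placeholders; the real series is NOAA CPC's ONI table.
oni = {(2015, m): 1.0 for m in range(1, 13)}   # placeholder ONI values
oni.update({(2016, m): 1.5 for m in range(1, 13)})

def lagged_annual_oni(oni, year, lag_months):
    """Average 12 monthly ONI values shifted back by lag_months.

    A 6-month lag for 2016 averages 2015-07 through 2016-06, matching the
    example in the post above.
    """
    dec_index = year * 12 + 11                 # month index of December of `year`
    vals = []
    for k in range(lag_months, lag_months + 12):
        y, m = divmod(dec_index - k, 12)
        vals.append(oni[(y, m + 1)])
    return sum(vals) / len(vals)

print(lagged_annual_oni(oni, 2016, 6))   # averages 2015-07 .. 2016-06
```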
  13. Here are the YTD rankings (Jan - Oct) from Berkeley Earth.

      2016 = +0.99
      2019 = +0.91
      2017 = +0.86
      2015 = +0.80
      2018 = +0.79
      2014 = +0.70

      And here are the annual rankings (Jan - Dec) from Berkeley Earth.

      2016 = +0.97
      2017 = +0.86
      2015 = +0.83
      2018 = +0.79
      2014 = +0.70

      A top 3 finish looks like a good bet at this point. Nov and Dec need to average about +0.62 for 2019 to rank above 2017's +0.86 finish and capture 2nd place. Based on the latest GFS monthly analysis + 7 day forecast (up to Nov 26th) it appears that Nov should finish strong and end well above the +0.62 threshold.

      http://berkeleyearth.lbl.gov/auto/Global/Land_and_Ocean_complete.txt
      http://www.karstenhaustein.com/climate.php
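The threshold is just a weighted-average calculation. A quick back-of-the-envelope check (with the rounded values listed above it works out to roughly +0.61, so the +0.62 presumably reflects the unrounded monthly data):

```python
# Rounded values quoted above; the unrounded data will shift this slightly.
ytd_jan_oct_2019 = 0.91   # 10-month mean (Jan-Oct)
target_annual = 0.86      # 2017's annual mean, the value to beat for 2nd place

# Mean anomaly Nov and Dec must average for the 12-month mean to exceed the target.
needed = (12 * target_annual - 10 * ytd_jan_oct_2019) / 2
print(f"Nov and Dec must average about +{needed:.2f} C")   # ~+0.61 with rounded inputs
```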
  14. Maybe a bit of WACKy (Warm Arctic Cold Continents) or perhaps the hypothesized Quasi-Resonant Amplification of the jet stream starting to show itself?
  15. Copernicus says 2019 was the warmest (albeit by a narrow margin) on record for the month of October. They also cite the last 12 months as being 1.2C above the pre-industrial temperature. And with an energy imbalance that is still +0.6 W/m^2 it is reasonable to assume that more warming is likely already baked in. https://climate.copernicus.eu/surface-air-temperature-october-2019
  16. So in an effort to steer this thread back on track...it looks like the refreeze has really ramped up lately. We are still in record territory for this time of year, but it looks like 2019 might jump ahead of 2016 in the next week or so.
  17. CO2 does not add heat. It traps heat. In this context "trap" means to slow the egress transmission of heat without slowing the ingress transmission of heat. The insulation in your home acts as a thermal barrier to trap heat. The furnace adds energy to your home. Because the insulation has changed the rate at which heat is lost your home will achieve a higher equilibrium temperature with the insulation than it would otherwise. But the furnace is still the energy source. ...similarly... The GHGs in Earth's atmosphere act as a thermal barrier to trap heat. The Sun adds energy to the Earth. Because the GHGs have changed the rate at which heat is lost the Earth will achieve a higher equilibrium temperature with the GHGs than it would otherwise. But the Sun is still the energy source.
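A zero-dimensional energy-balance sketch of that point (the numbers are standard textbook values and the emissivity is a crude stand-in for greenhouse trapping, not a climate model): hold the energy source fixed, slow the rate of heat loss, and the equilibrium temperature rises.

```python
# Zero-dimensional energy balance: absorbed solar = emitted longwave at equilibrium.
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0          # solar constant, W m^-2 (the energy source, unchanged)
ALBEDO = 0.30

def equilibrium_temp(effective_emissivity):
    """Solve S*(1-albedo)/4 = eps * sigma * T^4 for T."""
    absorbed = S * (1 - ALBEDO) / 4
    return (absorbed / (effective_emissivity * SIGMA)) ** 0.25

# Weak "insulation" (heat escapes freely) vs. stronger "insulation" (more GHGs).
print(equilibrium_temp(1.00))   # ~255 K with no greenhouse trapping
print(equilibrium_temp(0.61))   # ~288 K: same Sun, slower heat loss, warmer equilibrium
```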
  18. Just understand that natural CO2 molecules have the same radiation behavior as anthropogenic CO2 molecules. So an anthropogenic pulse of CO2 (like with fossil fuel combustion and cement production) will lead to the same amount of warming as a natural pulse of CO2 (like the one that occurred during the PETM) given the same magnitude of the pulse. In that manner the laws of physics don't really care how the CO2 got into the atmosphere. I also sense a bit of the logical fallacy of affirming a disjunct. Just because CO2 was naturally modulated in the past doesn't mean that it can't be anthropogenically modulated today and cause warming. And yes, CO2 levels were much higher in the past. This is an essential piece of the puzzle in solving the faint young Sun problem. Remember, solar output is about 1% weaker for every 120 million years in the past. 600 mya the solar radiative forcing was about -12 W/m^2 relative to today (see Gough 1981). So it would take ~9.5x the amount of CO2 just to offset the reduced solar forcing of the past relative to today (note that 5.35 * ln(9.5) = +12 W/m^2).
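A quick check of that last bit of arithmetic, using the simplified CO2 forcing expression dF = 5.35 * ln(C/C0) from Myhre et al. 1998:

```python
import math

# Simplified CO2 forcing: dF = 5.35 * ln(C / C0) in W m^-2 (Myhre et al. 1998).
solar_deficit = 12.0   # approximate solar forcing deficit 600 mya, W m^-2

# CO2 multiplier needed to offset that deficit:
multiplier = math.exp(solar_deficit / 5.35)
print(f"CO2 multiplier needed: {multiplier:.1f}x")              # ~9.4x, i.e. roughly the ~9.5x quoted

# And the forward check quoted above:
print(f"5.35 * ln(9.5) = {5.35 * math.log(9.5):.1f} W/m^2")     # ~+12.0
```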
  19. I believe the point of the graph was to test your comprehension of noisy information. This is actually a carefully studied topic in academia, especially in the context of climate data, where there is a disproportionate number of analyses out there in the blogosphere that suffer from various cognitive biases. I want you to get started with this paper. Daron et al, 2015: Interpreting climate data visualisations to inform adaptation decisions. There are many well documented cognitive biases that influence an individual's comprehension of a graph. They include but are not limited to anchoring, framing, etc. In a nutshell, when individuals are presented with a plot of noisy data some of them are incapable of mentally forming a linear or exponential regression trendline.
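To make the trendline point concrete, here is a purely synthetic example: a known linear trend buried in noise that ordinary least squares recovers even though casual visual inspection often misjudges it.

```python
import numpy as np

# Purely synthetic data: a known linear trend plus noise (values chosen for illustration).
rng = np.random.default_rng(0)
years = np.arange(1980, 2020)
true_trend = 0.018                                        # deg C per year
noisy = true_trend * (years - years[0]) + rng.normal(0, 0.15, years.size)

# An ordinary least-squares fit recovers the underlying trend despite the noise.
slope, intercept = np.polyfit(years, noisy, 1)
print(f"fitted trend: {slope:.3f} C/yr (true value {true_trend} C/yr)")
```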
  20. Also per the NSIDC 5-day average, 10/17/2019 marks the largest negative anomaly on record. We are 3.065 million sq km below the 1981-2010 climatological average. This breaks the record of 3.048 million set on 10/9/2012. In other words, we have less sea ice (in terms of extent) relative to average than at any point in the satellite era.
  21. Per NSIDC the daily extent on 10/17 was 5.374e6. On this date in 2012 and 2016 it was 6.082e6 and 5.954e6 respectively, and the climatological average is 8.470e6. Obviously 2019 is yet another year among recent years with lackluster sea ice extents in the NH. And the SH isn't picking up the slack like it was prior to 2016. Globally sea ice extents are at record lows. In fact, globally sea ice extents have spent more time below -4σ than they have above -2σ since 2016. That is certainly noteworthy.
  22. Yes. UAH's TLT product samples too high in the atmosphere to be a reliable proxy for the surface temperature. Obviously that raises concerns with contamination from the cooling stratosphere, but it's been suggested that there are other methodological problems that may be partly to blame for their significantly lower warming trend estimate.
  23. I think the flag made sense at the time. I have to be honest...my first thought was that something may have been wrong with the data. That was before I had learned of the record breaking SSW event in the SH and the RSS data. Looking back through commentary, it appears that a few experts had already expected UAH and RSS to record these unusually large regional anomalies and modestly large global anomalies. Until this event I had no idea an SSW event in the SH could cause such a dramatic change in the global mean temperature. Fascinating stuff. BTW...Spencer and Christy have confirmed that their data is correct. This statement now appears in the data files: "*****UPDATE 4 Oct 2019***** After further analysis, September 2019 values are credible. see https://www.drroyspencer.com/2019/10/record-antarctic-stratospheric-warming-causes-sept-2019-global-temperature-update-confusion/"
  24. I'll go ahead and get this kicked off since I have something interesting to talk about. So UAH is usually super quick at publishing monthly numbers. They posted a +0.61 for September 2019 which was an unexpectedly large increase. https://www.nsstc.uah.edu/data/msu/v6.0/tlt/uahncdc_lt_6.0.txt The data file is even flagged with: *****CAUTION****** SEPTEMBER 2019 DATA APPEAR TO BE ERRONEOUSLY WARM. WE ARE INVESTIGATING. The warming seems to be the result of unusually large anomalies in both the stratosphere and troposphere in the southern hemisphere with a whopping +13.65 at the south pole on the TLS product. https://www.nsstc.uah.edu/data/msu/v6.0/tls/uahncdc_ls_6.0.txt But...RSS just released their September data and they too show the unusual warming. http://images.remss.com/data/msu/graphics/TLT_v40/time_series/RSS_TS_channel_TLT_Global_Land_And_Sea_v04_0.txt http://images.remss.com/data/msu/graphics/TLS_v40/time_series/RSS_TS_channel_TLS_Southern Polar_Land_And_Sea_v04_0.txt It seems as though there was a sudden stratospheric warming event in the SH. And it was record breaking at least according to UAH and RSS. I'm thinking the UAH (and RSS) data may be legit. UAH may be justified in removing the warning message without any changes to the data.