Posts posted by bdgwx

  1. You mentioned UHI @Roger Smith. I want to address that because it is interesting and because it is one of the most misunderstood topics in the climate debate. First, some definitions.

    UHI Effect - This is an increase in localized temperatures in urban areas relative to the surrounding region as a result of land use changes. It is a phenomenon that has a physical impact.

    UHI Bias - This is a high or low bias in the measured regional average temperatures caused by defective spatial averaging techniques. It is a phenomenon that has an unphysical impact.

    It is important that these concepts not be conflated. Ideally we want to keep the UHI effect in regional and global average temperature datasets because it is real. What we don't want to keep in the datasets are any UHI biases. The biases are caused by grid meshing strategies, station moves, station commissioning or decommissioning, urbanization or deurbanization, etc.

    An example of a high bias is when you have a grid cell that is 25/75 urban/rural but the station ratio is flipped such that it is 75/25 urban/rural. In this case you are overweighting the urban stations and effectively using them as a proxy for the rural area.

    An example of a low bias is when the station ratio for a grid cell starts out at 75/25 urban/rural and later changes to 50/50 urban/rural even though urbanization in that cell has stalled. In this case you create an artificial low bias even though you may still actually be overweighting the urban area. A minimal worked example of the high-bias case is sketched below.
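
    Here is a minimal sketch of the high-bias weighting problem with invented numbers (the anomalies and station mix are purely illustrative, not real data):

    ```python
    # Invented numbers, for illustration only.
    # The grid cell is 25% urban / 75% rural by area, but the station network
    # inside the cell is 75% urban / 25% rural.
    urban_anomaly = 1.2   # C, assumed anomaly at urban stations (includes UHI effect)
    rural_anomaly = 0.8   # C, assumed anomaly at rural stations

    # Area-weighted cell average (what we actually want): 25% urban, 75% rural.
    true_cell_avg = 0.25 * urban_anomaly + 0.75 * rural_anomaly    # 0.90 C

    # Naive station average with the flipped 75/25 urban/rural station mix.
    naive_cell_avg = 0.75 * urban_anomaly + 0.25 * rural_anomaly   # 1.10 C

    print(f"true = {true_cell_avg:.2f} C, naive = {naive_cell_avg:.2f} C, "
          f"bias = {naive_cell_avg - true_cell_avg:+.2f} C")       # +0.20 C high bias
    ```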

    The timing of the urban/rural station ratio and the rate of urbanization (including stalls) affect how the bias plays out in that grid cell. Most people don't take the time to consider that the UHI bias can cause us to underestimate temperature trends. I'm not saying that low biases outnumber high biases. I'm just saying that the sword cuts both ways, and if you want a rigorous analysis then you need to consider how UHIs could cause low biases as well.

    A lot of datasets deal with the bias by simply removing the UHI effect altogether. That does mitigate the bias, but at the expense of underestimating the surface warming a bit. I think the reason this is such an appealing approach is because it is simple and because the UHI effect isn't that large considering only 2% of the planet is urbanized. 

    At the end of the day UHI is such a small part of the global average that it just doesn't matter either way. But I'll leave readers with Berkeley Earth's analysis. They found that the UHI bias on the global average temperature trend is statistically equivalent to 0.00 C/decade, but that if anything it is more likely to bias the trend too low, albeit by an insignificant amount. I think this shocks a lot of people because the thought process is that since the UHI effect is always positive the bias must always be positive as well, which isn't true. [Rohde et al. 2013]

  2. I was just made aware today that NOAAGlobalTemp was upgraded last week from 5.0 to 5.1. The trend from 1979/01 to 2022/12 increased from 0.172 C/decade to 0.180 C/decade.

    https://www.ncei.noaa.gov/products/land-based-station/noaa-global-temp

    The big change is the implementation of full spatial coverage.

    See Vose et al. 2021 for details on the changes.

    Here is the updated graphic based on preliminary data I had to download manually. 

    [Image: updated NOAAGlobalTemp v5.1 trend graphic]

  3. On 2/17/2023 at 9:19 AM, Wxdood said:
    The arctic can go ice free all year? Based on what evidence?

    Paleoclimate and paleontology records.

     

    On 2/17/2023 at 9:19 AM, Wxdood said:
    Al gore said it would be ice free in 2008. Why isn’t it ice free now?

    The latest evidence suggests "ice-free" (defined in the literature as < 1e6 km2) will first occur around 2050 (IPCC AR6 WG1 SPM pg. 16). This is the most aggressive prediction I've seen based on the broad evidence. It is more aggressive than the IPCC's previous predictions of 2070 in the early 2000s and 2100 in the 1990s. The IPCC's predictions are based on a consilience-of-evidence approach and so adequately represent the scientific expectation. 


    Regarding Al Gore...his prediction of 2013 (made in 2008) and then later 2016 was based on a single cherry-picked source that never said what Gore claimed. That source is Maslowski, specifically Maslowski et al. 2008 and later Maslowski et al. 2013. I encourage you to read the publications yourself. In fact, Maslowski goes to great lengths warning his audience to be careful with predictions based on extrapolation of recent trends and even warns against taking dynamically modeled predictions (like those from his NAME model) verbatim due to the large uncertainty and limitations of sea ice modeling in general at the time. And note that Maslowski said in response to Gore's statement, "It's unclear to me how this figure was arrived at. I would never try to estimate likelihood at anything as exact as this." Let that be a cautionary tale to 1) discount scientific predictions coming from non-peer-reviewed sources and 2) always check the sources provided.
     

  4. On 2/15/2023 at 7:17 PM, fujiwara79 said:

    Of course core samples from thousands of years ago will show natural forces exceeding anthropogenic forces - because there was no anthropogenic forcing back then.

    Correct. This fallacy is common enough that it has a name: affirming a disjunct. I see it all of the time. Two options are presented: A (natural) or B (anthropogenic). Then the argument is made that because A is true, B must not be true. This is essentially how the null hypothesis test I discussed just above plays out as well. Two options are presented: A (breakpoint analysis doesn't matter) and B (breakpoint analysis does matter). A test is performed showing A is true for an isolated case and then the erroneous conclusion that B is therefore not true follows. Both arguments (that it is only ever natural and that breakpoint analysis never matters) are absurd. They are absurd because in both cases A does not preclude B. 

  5. 2 hours ago, bdgwx said:

    I must have missed it. Can you post details on that null hypothesis test or link to where you did it? I'd like to review it.

    Nevermind. I found it. I'm not sure of the exact details of the test, but it only covered the overlap period between your PWS and Coatesville 2W from 2003/12 to 2007/12. That is important because Coatesville 2W (USC00361591) did not have any documented changes during that period, so the expectation is that it would not differ from the region and thus from your PWS. Using a period in which two stations are expected to be equivalent and then finding that they are equivalent is not proof of equivalency for other stations and other time periods. BTW...Berkeley Earth's analysis did not find any breakpoints for Coatesville 2W during this period. However, their analysis did find two breakpoints prior to this period. Both breakpoints biased the observations higher, so the breakpoint adjustment reduced the warming in this case [1].

    [1] Note that although Berkeley Earth performs a breakpoint adjustment for each station they actually don't use it for the spatial averaging step. It is only provided for informational purposes. They actually use what they call the "scalpel" method. When a breakpoint is found they split the station timeseries and treat it as if it were another station. This is quite clever because it addresses the concern of adjustments and the impact it may have on the final global average temperature product. See Rohde et al. 2013 and Rohde & Hausfather 2020 for details.
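
    To make the scalpel idea concrete, here is a minimal sketch of the splitting step. This is my own illustration, not Berkeley Earth's code, and it assumes the breakpoint dates have already been identified:

    ```python
    import pandas as pd

    def scalpel_split(series, breakpoints):
        """Split a station anomaly series (indexed by date) at breakpoint dates,
        returning each segment as if it were a separate station record.
        Illustration of the scalpel idea only, not Berkeley Earth's implementation."""
        labels = pd.Series(0, index=series.index)
        for cut in sorted(pd.to_datetime(breakpoints)):
            labels[series.index >= cut] += 1   # new segment id after each breakpoint
        return [segment for _, segment in series.groupby(labels)]

    # Hypothetical usage with made-up breakpoint dates:
    # monthly = pd.Series(data, index=pd.date_range("1950-01-01", periods=600, freq="MS"))
    # segments = scalpel_split(monthly, ["1978-06-01", "1994-03-01"])
    ```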

    Arctic sea ice was likely lower during the Holocene Climate Optimum and almost certainly lower prior to the current ice age. It is a testament to the fact that given a big enough nudge not only can Arctic sea ice go lower, but it can entirely disappear. And I don't mean go "ice-free" in the summer with < 1e6 km2. I mean literally go to 0 km2 year round.

    Anyway, per Walsh et al. 2016 (see also Walsh et al. 2019) Arctic sea ice is lower than at any point since 1850 AD.

    [Image: Arctic sea ice extent since 1850, per Walsh et al. 2016]

    And per Kinnard et al. 2011 Arctic sea ice is lower than at any point since 600 AD.

    [Image: Arctic sea ice reconstruction since 600 AD, per Kinnard et al. 2011]

  7. 3 hours ago, dseagull said:

    However, I stand by my statement and scientific fact that natural forces will always be greater than anthropogenic.  This has been proven through core samples.  This is not a hypothesis.

    Say what? Core samples prove that natural forces will always be greater than anthropogenic? In all my years of studying the climate I have never heard that. And considering that the two biggest contributors to climate forcing today are GHGs and aerosols, both of which have been implicated in numerous climatic shifts in the past and are supported by core sample evidence, I am left perplexed by this statement. Look at what happened during the PETM when large quantities of carbon got released into the atmosphere. Look at what happened when Tambora, or even more recently Pinatubo, released large quantities of SO2 into the atmosphere. Remember, the laws of physics say that anthropogenically modulated GHGs and aerosols work in exactly the same way, in regards to their radiative properties, as naturally modulated GHGs and aerosols.

  8. 4 hours ago, ChescoWx said:

    Except the statistical analysis proves there is no bias or by chance relationships between the various data sets and observations in the data. If there was the statistical p-value in the data would not have delivered the level of significance against the null hypothesis. It's simple scientific data testing and analysis.

    I must have missed it. Can you post details on that null hypothesis test or link to where you did it? I'd like to review it.

  9. 27 minutes ago, ChescoWx said:

    Of course as the actual real world data is not aligned with the popular global story....I get it

    First, I think you are deflecting and diverting here. The topic is the temperature in Chester County, PA. You are claiming that the observations show no increase in temperature in that region while not addressing the fact that the claim is based on biased observations. That is what I'm asking you to address.

    Second, if by "popular global story" you mean the global average temperature (GAT), then know that the actual real world data in Chester County, PA is aligned with the GAT (aka "popular global story"). It is one of the inputs to the GAT after all. The temperature in Chester County, PA has the same effect on the GAT as any other region of equal area.

  10. This post is part 2 regarding the question of how ENSO and specifically the current La Nina is affecting the global average temperature (GAT) trend. 

    Using the model we created and trained above we can see that the ENSO modulation term is [0.11 * ONIlag4]. We can visualize the ENSO effect by plotting the ENSO residuals from our model against the detrended GAT. As can be seen, ENSO explains a lot of the GAT variability with R^2 = 0.32. One quite noticeable problem is the Pinatubo 1991 eruption.

    [Image: ENSO residuals plotted against the detrended GAT]

    If we then apply the ENSO residuals to the GAT we get an ENSO-adjusted GAT timeseries. Each dataset has been ENSO-adjusted and presented with new trend (C/decade) figures. The graph below is in the exact same format as the one above so it can be easily compared. Notice that applying the ENSO residual adjustment reduces the monthly variability and causes all of the trends to increase by about 0.01 C/decade, including the composite trend, which is +0.195 ± 0.049 C/decade vs +0.186 ± 0.058 C/decade without the adjustment. Because we are removing a significant autocorrelation component in the process, our AR(1) corrected uncertainty drops as well. So that is the answer: ENSO, and specifically the current La Nina, has reduced the trend by 0.01 C/decade. 
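
    For anyone who wants to reproduce the adjustment step, here is a minimal sketch. It assumes the monthly GAT anomalies and the 4-month-lagged ONI values are already loaded into aligned arrays; the coefficient comes from the fitted model above and the example numbers are made up:

    ```python
    import numpy as np

    def enso_adjust(gat, oni_lag4, coef=0.11):
        """Remove the modeled ENSO contribution (coef * ONI lagged 4 months)
        from the GAT anomaly series. coef is the fitted ENSO term from above."""
        return np.asarray(gat) - coef * np.asarray(oni_lag4)

    # Made-up example values (La Nina conditions, so ONI is negative):
    gat = np.array([0.45, 0.52, 0.60])        # monthly GAT anomalies (C)
    oni_lag4 = np.array([-1.0, -0.8, -0.5])   # ONI lagged 4 months
    print(enso_adjust(gat, oni_lag4))          # ENSO-adjusted anomalies
    ```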

    Another thing that may jump out at you is the Pinatubo 1991 eruption era. The discrepancy between CMIP5 and observations may come down to the handling of the eruption and its effects. It looks to me like CMIP5 underestimates the effect and, as a result, does not cool the planet enough in the succeeding months. Had CMIP5 pulled the temperature down another 0.2 C it looks like the prediction would have been a much better match to observations later in the period.

    [Image: ENSO-adjusted GAT timeseries with updated trends]

     

  11. 2 hours ago, ChescoWx said:

    But of course the unadjusted data for all of the available Chester County sites does not show the above trend lines.....

    You saw the breakpoint analysis clearly showing the low bias later in the period...

    2 hours ago, ChescoWx said:

    Of course Tip...but the actual unadjusted data should follow suit...there should be consistent data without adjustment for the 1930's and 1940's that support this supposition....correct?? How can we find entire counties in the USA that do not follow the same warming we should expect to see?

    No. That is not correct. First, the warming is not homogeneous. There is no expectation that it will be the same everywhere on the planet or that every location will even experience warming at all. It is the opposite, actually. We expect the temperature trends to differ from location to location. We even expect some locations to cool. Second, as I already pointed out, land stations are notorious for having breakpoints that bias their trends downward due to time-of-observation changes and instrument package changes. See Vose et al. 2003 and Hubbard & Lin 2006 for details on these particular issues and Menne et al. 2009 for details on how they are handled.

    I want to address the observation @HailMan06 made that the current La Nina may be pulling the trend down. This will be a two part topic. The first part will focus on determining how we can model the global average temperature (GAT). We will take the result from this post to answer the question directly in the next post.

    The goal is to find a model that explains and predicts the GAT and minimizes the root mean squared error (RMSE). We will use the following rules.

    - Components of the model must be based on physical laws and known to directly force changes in atmospheric temperature. This rule eliminates things like population growth which may be correlated with atmospheric temperature but does not directly force it.

    - Autocorrelation will not be considered. While autocorrelation is incredibly powerful in predicting how a phenomenon evolves in time it does not help in explaining why it evolved or more precisely why it persisted in the first place.

    - The components should be as independent as possible. For example, since MEI and ONI are different metrics of the same phenomenon we should not use both.

    - ENSO must be considered. This is necessary because in part 2 we will use the information from the model regarding ENSO to see how it affects the GAT trend.

    - It must use a simple linear form so that it is easy to compute and interpret. 

    With these rules in mind here is a list of components that I felt were easy to obtain and would adequately model the GAT.

    - Oceanic Nino Index (ONI) https://www.cpc.ncep.noaa.gov/data/indices/oni.ascii.txt

    - Atlantic Multi-Decadal Oscillation (AMO) https://psl.noaa.gov/data/correlation/amon.us.data

    - Volcanic Aerosols (AOD) https://data.giss.nasa.gov/modelforce/strataer/tau.line_2012.12.txt

    - Total Solar Irradiance (TSI) https://lasp.colorado.edu/lisird/data/nrl2_tsi_P1M/

    - Atmospheric Carbon Dioxide Concentration (CO2) https://gml.noaa.gov/webdata/ccgg/trends/co2/co2_mm_mlo.txt

    The model will be in the form T = Σ_i [λ_i * f_i(D_i)] for i = 1..n, where λ serves as both a tuning parameter and a unit translation factor to convert the component's units into degrees K (or C), D is the component data, and f is a simple function that acts upon the component data. In most cases the function f just returns the raw value of the dataset. The model is trained with multiple passes where each pass focuses on tuning both the λ parameter and the lag period for a single component one-by-one. After the 1st pass initializes the λ parameter with a 1-month lag for each component, a 2nd pass perturbs the parameter and lag period up/down to see if the model output skill improves. This continues for as many iterations as needed until the model skill stops improving. It is important to understand that the λ parameters and lag periods are not chosen or selected. They emerge organically from the training. A rough sketch of the procedure follows.
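
    Here is a rough sketch of that coordinate-wise tuning procedure. The component data, the transformation functions f, and the perturbation step sizes are placeholders; this illustrates the idea rather than the exact code used for the model below:

    ```python
    import numpy as np

    def model_output(components, params):
        """Evaluate T = sum_i [lambda_i * f_i(D_i, lag_i)] for one parameter set.
        `components` maps a name to (data, f), where f(data, lag) returns the
        lagged/transformed monthly series; `params` maps a name to (lam, lag)."""
        return sum(params[name][0] * f(data, params[name][1])
                   for name, (data, f) in components.items())

    def rmse(components, params, target):
        """Root mean squared error of the model against the observed GAT series."""
        return float(np.sqrt(np.nanmean((model_output(components, params) - target) ** 2)))

    def tune(components, target, params, max_passes=100):
        """Coordinate-wise tuning: perturb each component's lambda and lag up and
        down, keep any change that lowers the RMSE, and stop when a full pass
        finds no improvement. Step sizes here are arbitrary placeholders."""
        best = rmse(components, params, target)
        for _ in range(max_passes):
            improved = False
            for name in components:
                for d_lam, d_lag in [(0.01, 0), (-0.01, 0), (0, 1), (0, -1)]:
                    lam, lag = params[name]
                    trial = dict(params, **{name: (lam + d_lam, max(0, lag + d_lag))})
                    score = rmse(components, trial, target)
                    if score < best:
                        best, params, improved = score, trial, True
            if not improved:
                break
        return params, best
    ```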

    Without further ado here is the model.

    GAT = 0.09 + [2.3 * log2(CO2lag1 / CO2initial)] + [0.11 * ONIlag4] + [0.20 * AMOlag3] + [-2.0 * AODlag3] + [0.05 * (TSIlag1 - TSIavg)]

    This model has an RMSE of 0.091 C. Considering that the composite mean GAT has an uncertainty of around σ = 0.05 C that is an incredible match to the GAT observations leaving maybe 0.04 C of skill on the table.
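
    For clarity, the fitted model above can be written out as a simple function. The example input values are made up and are only meant to show the units each term expects:

    ```python
    import numpy as np

    def gat_model(co2_lag1, co2_initial, oni_lag4, amo_lag3, aod_lag3, tsi_lag1, tsi_avg):
        """Evaluate the fitted model quoted above for one month (inputs may also be
        aligned numpy arrays). CO2 in ppm, ONI and AMO in their native index units,
        AOD dimensionless, TSI in W/m^2; coefficients are the fitted values above."""
        return (0.09
                + 2.3 * np.log2(co2_lag1 / co2_initial)
                + 0.11 * oni_lag4
                + 0.20 * amo_lag3
                - 2.0 * aod_lag3
                + 0.05 * (tsi_lag1 - tsi_avg))

    # Made-up example values, for illustration only:
    print(gat_model(co2_lag1=415.0, co2_initial=338.0, oni_lag4=-1.0,
                    amo_lag3=0.1, aod_lag3=0.0, tsi_lag1=1361.0, tsi_avg=1361.0))
    ```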

    The astute reader will notice that the λ parameter on the CO2 component is λ = 2.3 C/log2(PPM). Care needs to be taken when interpreting this value. It is neither the equilibrium climate sensitivity (ECS) nor the transient climate sensitivity (TCS). However, it is most closely related to the TCS which would imply an ECS of about 3.0 C per 2xCO2. And again...I did not pick this value. I did not manipulate the training of the model so that it would appear. I had no idea that the machine learning algorithm would hone in on it. It just appears organically. Anyway, that is neither here nor there. The important point here is that we now have an estimate for how ENSO modulates the GAT which can be used later to see how the current La Nina may have affected the trend.

    [Image: model output compared to the observed GAT]

     

     

  13. 1 hour ago, ChescoWx said:

    CSNavy waiting as usual for any actual non NOAA adjusted long term weather data verification ...more than since 1970 to support with any scientific rigor any such warming claims.

    Let me get this straight. You want to use data that has not been adjusted for known biases caused by instrument package changes, time-of-observation changes, station siting changes, and measurement procedure changes (the transition from bucket to ship intake to buoy)? Yes/No?

    If that is your bar for acceptance then no thank you. Using data that is known to be contaminated with biases without making any attempt to address those biases is unethical at best and fraudulent at worst. I'm going to have no part of it.

    And BTW...don't think I didn't notice that you completely ignored the fact that the "non NOAA adjusted long term weather data" shows more warming than the adjusted data. It's worth repeating in all caps and bold...the unadjusted/raw data shows MORE warming than the adjusted data. So if your bar for acceptance includes the unethical (and potentially fraudulent) analysis of data known to be contaminated with biases then you are going to have to accept that the warming is MORE than what scientists have been reporting since 1880.

     

     

    In the chart above the composite trend is labeled as +0.186 ± 0.058 C/decade (2σ). The question might be: why is the uncertainty on that trend so high? The reason is that the global average temperature exhibits high autocorrelation. Autocorrelation is the correlation of a timeseries with a time-lagged copy of itself. What this means is that the value 1 month (or 2 months, etc.) ago partially determines (acts as a predictor or estimator of) the value of the current month. This can be interpreted as: if the value is high/low it tends to stay high/low. That in turn causes the trend to nudge up/down significantly as the monthly values stay high/low for extended periods of time. In other words, the trend is variable as new monthly values are added to the timeseries. This creates uncertainty on top of the already existing linear regression trend standard uncertainty. 

    The way I dealt with it is to use the Foster & Rahmstorf 2011 method. In this method you calculate the ordinary linear regression standard uncertainty via the canonical formula σ_w = sqrt[ Σ(y_actual - y_predicted)^2 / ( (n-2) * Σ(x_actual - x_mean)^2 ) ]. That results in σ_w = 0.0048 C/decade. We then multiply that by the square root of the correction value v such that σ_c = σ_w * sqrt(v). The correction value v is defined as v = (1+p) / (1-p) and can be interpreted as the number of data points per degree of freedom. The p parameter is the autoregressive coefficient from an AR(1) model of the timeseries. For the composite timeseries in the graph above our AR(1) coefficient is p = 0.948. This makes the correction factor v = (1+0.948) / (1-0.948) = 37.5, so sqrt(v) = 6.15. Thus the expanded/corrected standard uncertainty of the trend is σ_c = 0.0048 * 6.15 = 0.03 C/decade and, of course, 2σ_c = 0.06 C/decade. There are some rounding errors here; the full computation is 0.058 C/decade. Anyway, that is how we get ±0.058 C/decade for the trend over the period 1979-2022. Note that F&R say the AR(1) correction still underestimates the trend uncertainty. They recommend applying an ARMA correction as well, which factors in the n-month lagged autoregressive decay rate. Based on the calculations I've seen, that additional correction is insignificant when analyzing long duration trends like 1979 to 2022, so I'll ignore it for now.
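
    Here is a minimal sketch of that calculation, assuming the monthly composite anomalies are already in a numpy array. In this sketch I compute the AR(1) coefficient from the detrended residuals, and the results will differ slightly from the figures above due to rounding:

    ```python
    import numpy as np

    def ar1_corrected_trend_uncertainty(y):
        """Trend and AR(1)-corrected trend uncertainty in the spirit of
        Foster & Rahmstorf (2011) as described above. y is a monthly anomaly
        series; x is expressed in decades so the slope comes out in C/decade."""
        n = len(y)
        x = np.arange(n) / 120.0                      # months -> decades
        slope, intercept = np.polyfit(x, y, 1)
        resid = y - (slope * x + intercept)
        # ordinary least-squares standard error of the slope (sigma_w)
        sigma_w = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum((x - x.mean())**2))
        # lag-1 autocorrelation of the detrended residuals (the AR(1) coefficient p)
        p = np.corrcoef(resid[:-1], resid[1:])[0, 1]
        v = (1 + p) / (1 - p)                         # correction factor
        sigma_c = sigma_w * np.sqrt(v)                # AR(1)-corrected uncertainty
        return slope, sigma_c

    # Hypothetical usage with a monthly composite anomaly array covering 1979-2022:
    # slope, sigma_c = ar1_corrected_trend_uncertainty(anoms)
    # print(f"{slope:.3f} ± {2 * sigma_c:.3f} C/decade (2 sigma)")
    ```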

    To demonstrate just how wildly the trend can behave over short periods of time I plotted the trend from all starting months from 1979 to 2022, ending in 2022/12, with the confidence intervals computed from the method above. For example, the trend starting at 2014/09 is -0.005 C/decade. It happens to be the oldest trend that is <= 0 C/decade. But using the F&R method above we see that the uncertainty on it is a massive ±0.305 C/decade. The AR(1) corrected trend uncertainties get smaller and stabilize as you go back further in time toward 1979. 

    The point...the global average temperature trend is highly variable over short periods of analysis. This is why the infamous Monckton Pause, which starts in 2014/09, has an uncertainty of ±0.305 C/decade and so is statistically insignificant. You simply can't look at short recent trends and draw any meaningful conclusion. The same is true when selecting 2010/12 as the starting month and reporting the very high warming rate of +0.316 C/decade, since the uncertainty there is ±0.273 C/decade. What we can definitively conclude is that all trends starting earlier than 2011/08 show statistically significant warming.

    [Image: trends by starting month (ending 2022/12) with AR(1)-corrected confidence intervals]

     

  15. 17 hours ago, HailMan06 said:

    Our current multi-year La Niña is probably affecting the global temperature trend post-2019. Just eyeballing the chart it makes the trend appear lower than it otherwise would be.

    Correct. I'll post more information about this when I get time. Another thing that is affecting the composite trend is the fact that only HadCRUT, GISS, BEST, and ERA are full sphere measurements. This is pulling the observed trend down because the highest rates of warming are occurring in the Arctic. Another possible (and probable, IMHO) factor is that UAH is an outlier not because it holds a monopoly on truth, but because it may have one or more significant defects. If UAH showed the same warming rate as the next lowest dataset (NOAA) then the composite trend would jump up to +0.190 C/decade.
