30 Year "Normals" changing - How does this affect historical data?



Since what is "normal" just changed with data through 2010, how does this affect historical data from before the current climate normal period?

For example: let's pretend that a given location averaged +2 degrees for December 1950 under the normals in effect at the time. Then the new normals get published after 2010, and the December normal for that location increases by 0.2 degrees.

Does the December 1950 departure from normal now change to +1.8, since the December normal increased when the new normals were calculated?


Relative to the new 30-year dataset, yes. Normals are redefined every decade, which accounts for the variability of climate; climate is not static.


That's an interesting question. Because normals are updated every ten years, there is always overlap with the preceding 30-year period. For example, the immediately preceding normals were for 1971-2000, and so both the new and immediately preceding normals encompass 1981-2000.

Perhaps the most relevant consideration is what question you are trying to answer. For example, suppose you are trying to decide whether July 1911 or July 2011 had more abnormal temperatures at a particular location. You might attempt to calculate the July normal temperature for 1911 by averaging July 1891-July 1920 temperatures or July 1901-July 1930 temperatures. Suppose both averages came out 75 degrees, and so you use that as your normal for July 1911. Further suppose that, according to the 1981-2010 normals, the new normal July temperature at that location is 80 degrees.

An average temperature of 80 degrees in July 1911 would therefore have been 5 degrees above normal, whereas an average temperature of 84 degrees in July 2011 would have been only 4 degrees above normal. So, by that measure, July 1911 would have been more abnormal than July 2011, even though July 2011 was four degrees warmer.
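Here's a minimal sketch of that comparison in Python, using the hypothetical numbers from the example above (these values are the post's illustrations, not real station data):

```python
# Departure from normal depends entirely on which baseline you choose.
# All values below are the hypothetical ones from the example, not real data.

def departure(observed: float, normal: float) -> float:
    """Departure from normal: positive means warmer than normal."""
    return observed - normal

normal_1911_era = 75.0   # hypothetical July normal centered on 1911
normal_1981_2010 = 80.0  # hypothetical "new" 1981-2010 July normal

july_1911 = 80.0
july_2011 = 84.0

print(departure(july_1911, normal_1911_era))   # +5.0 vs. its own era
print(departure(july_2011, normal_1981_2010))  # +4.0 vs. 1981-2010
# July 1911 comes out as the larger anomaly even though July 2011 was warmer.
```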


This is why there is a standardized 30-year normal period, updated as a running average: researchers and operational scientists worldwide need a common baseline against which to compare and contrast studies and projects.


Thanks for the responses. To me, what's interesting about this is that a period that was marginally above normal 60 years ago could later be viewed as marginally below normal should the "normals" increase, or vice versa.

Right, and this whole area can be very tricky. For example, consider these January normal mean temperatures at Reagan National Airport in Washington, DC (DCA):

1941-1970: 35.7

1951-1980: 35.2

1961-1990: 34.6

1971-2000: 34.9

1981-2010: 36.1

The January 1962 mean temperature at DCA was 34.6 degrees, and so that was 1.1 degrees below the 1941-1970 normal, 0.6 degrees below the 1951-1980 normal, equal to the 1961-1990 normal, 0.3 degrees below the 1971-2000 normal, and 1.5 degrees below the 1981-2010 normal. Further, if you do a more granular analysis, the decade encompassing 1962 was 1961-1970, and the January mean temperature for that decade at DCA was 33.5 degrees. If you treat that mean temperature as the most relevant "normal", January 1962 was 1.1 degrees above the 1961-1970 normal. So, was January 1962 1.1 degrees below normal, 0.6 degrees below normal, exactly normal, 0.3 degrees below normal, 1.5 degrees below normal, or 1.1 degrees above normal?
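As a quick sketch, here is the same arithmetic in Python (all values are the DCA figures quoted above):

```python
# January 1962 at DCA measured against each baseline mentioned above.
normals = {
    "1941-1970": 35.7,
    "1951-1980": 35.2,
    "1961-1990": 34.6,
    "1971-2000": 34.9,
    "1981-2010": 36.1,
    "1961-1970 (decadal mean)": 33.5,  # not an official normal
}

jan_1962 = 34.6
for period, normal in normals.items():
    print(f"{period}: {jan_1962 - normal:+.1f}")
# -1.1, -0.6, +0.0, -0.3, -1.5, +1.1 -- the same month reads as below normal,
# exactly normal, or above normal depending on the baseline chosen.
```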


At this point I would not even consider the normal temps for a 30-year period. If you want to consider averages, I would use the last 10 to 15 years of temps; that would be closer to our present-day reality.

Temperatures (the population) are assumed to be normally distributed. For the mean of a random sample to be approximately normally distributed, the sample size should be about 30 or more (the usual rule of thumb from the Central Limit Theorem). Smaller samples might not adequately represent the population. Hence, for statistical purposes, 30-year base periods are utilized.
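To illustrate (my own sketch, not from the thread): the spread of sample means shrinks as the sample grows toward the conventional n = 30, which is one statistical motivation for the 30-year window.

```python
# How stable is an "average" computed from n years? Synthetic data only.
import random
import statistics

random.seed(42)
# A made-up population of winter lows: mean 35F, standard deviation 8F.
population = [random.gauss(35.0, 8.0) for _ in range(10_000)]

for n in (5, 10, 30):
    means = [statistics.fmean(random.sample(population, n)) for _ in range(1_000)]
    print(n, round(statistics.stdev(means), 2))
# The scatter of the sample mean falls roughly as 1/sqrt(n), so a 30-year
# average is noticeably more stable than a 10-year one.
```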


What are your specific climate needs?


Thanks, Don. I was just saying that the trends seem to indicate we have been mostly above normal in recent years, and I don't see that changing for at least a few years out.


  • 7 months later...

Raising this from the dead here: does anyone know of an easy way to pull daily averages by major city? I used to use The Weather Channel for daily normals by city, but I'm wary of trusting the data since it appears to have changed over the last month. Do any other sites provide this besides the heavy-duty NCDC downloads?

Thanks.


You can find averages under "Today's Almanac" on any forecast page at http://www.wunderground.com.

There's a little "View more history data" link inside that box where you can find more info, too.


Thanks, this is helpful. Do you know which set of normals they are using?

I'm looking at DC, and weather.com used to show a normal high of 88 degrees (which is what wunderground still shows); now weather.com shows a normal high of 90.


I have always had problems with the concept of taking only the last 30 years to represent what is "normal" for a given area. Granted, climate is variable, but 30-year-only data can be very misleading. I have done extensive research on this for DFW's climate. DFW's normal low at the coldest point in winter has risen steadily from 32°F (1961-1990) to 33°F (1971-2000) to 35°F (1981-2010).

In fact, during the winter months we have anywhere from 14 to 64 days where the temperature reaches 32°F or colder, and we are supposed to average about 34 such days. There has never been a winter where we did not see at least 14 days of 32°F or colder, and there has never been a winter where we did not dip to at least 25°F. We have been as cold as 8°F below zero. On average (taking the whole weather record, 1898 to present, into account) we should see a temperature below 10°F one out of every 5 years and a temperature at or below zero one out of every 25 years.

So tell me how the normal low of 35°F implied by the current 30-year dataset (1981-2010) is anywhere near accurate for our coldest temperatures during the winter. If you take the whole period of record (1898 to present), that coldest normal low should actually be 31°F. That, to me, is more indicative of a normal temperature at the coldest point in our winter. The 35°F figure implies that we rarely hit freezing, or are not supposed to hit freezing, and that simply is not the case. To me, the 30-year period really skews the data inaccurately.
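For what it's worth, the whole-record frequencies described above amount to empirical return periods. A minimal sketch (the annual minimums below are made up; the real series would be DFW's 1898-to-present record):

```python
# Empirical return period: average years between winters whose annual
# minimum temperature satisfies some condition. Values are invented.
annual_mins = [-8, 2, 12, 7, 31, 9, -1, 15, 22, 5]  # degrees F, one per winter

def return_period_years(series, hit):
    """Average number of years between winters where `hit` is true."""
    hits = sum(1 for t in series if hit(t))
    return len(series) / hits if hits else float("inf")

print(return_period_years(annual_mins, lambda t: t < 10))   # below 10F
print(return_period_years(annual_mins, lambda t: t <= 0))   # at or below zero
```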


We need to have some sort of baseline that doesn't depend too much on climate trends that are no longer relevant. The last 30 years is large enough to be a valid sample, but not so big as to pull in data from a different climate period and make the normals way out of whack.


I think a 30-year period is probably about as good as can be done for normal monthly temperatures, but it does not work as well for normal monthly precipitation, especially for summer precipitation. For example, August has historically been one of the wettest months in Washington, DC (average 4.12 inches of precipitation, 1871-2012), but the recent 30-year normal suggests that August has now become a relatively dry month (average 2.92 inches of precipitation, 1981-2010). However, that recent "normal" period may well have been an aberration, as August storms just happened to miss DC during 1981-2010. Relying on that "normal" would have been counterproductive last year, as 8.92 inches of precipitation were recorded at DCA in August 2011.
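One way to see the aberration is to compare a 30-year window against the full record. A minimal sketch with synthetic data (the real series would be DCA's August precipitation, 1871-2012):

```python
# A 30-year "normal" can drift well away from the long-term mean when a few
# wet or dry Augusts cluster. Synthetic data standing in for the DCA record.
import random
import statistics

random.seed(7)
years = list(range(1871, 2013))
precip = {y: max(0.0, random.gauss(4.12, 2.0)) for y in years}  # inches

full_mean = statistics.fmean(precip.values())
normal_1981_2010 = statistics.fmean(precip[y] for y in range(1981, 2011))

print(round(full_mean, 2), round(normal_1981_2010, 2))
# With the real data the gap is 4.12 vs. 2.92 inches, and a single August
# (8.92 inches in 2011) can blow past either "normal."
```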


Yes, we need a baseline, but that was the point of my whole post: 35°F is too warm for the lowest normal temp at DFW when we average more than a third of winter at or below freezing. We are supposed to see about 34 days of at-or-below-freezing temps in winter. The 31°F average lowest temp over the whole period of record is a much more accurate depiction of our climate during the coldest part of winter. To me, that 30-year figure is not a good baseline.


You seem to be conflating the ideas of "lowest average low temperature" and "average lowest low temperature". The two are very, very different. Just because the average low never dips below 35F doesn't mean you aren't expected to see lows below that from time to time. It would be silly if your low always stayed right at the "average low" every night. It's the same reason the "first freeze" occurs well before the average low ever reaches freezing (if it even does).

The 35F represents a mean low. However, there is always going to be variability around that mean, and depending on the standard deviation (which at DFW in winter would probably be relatively high), you'd expect a significant chunk of the lower tail of the distribution to fall below freezing. Hence your average of ~30 days with lows at or below freezing.
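A minimal sketch of that tail argument, assuming winter daily lows are roughly normal with a mean of 35F; the 8F standard deviation is my own illustrative guess, not a number from this thread:

```python
# Expected freezing days per winter if daily lows ~ Normal(mean=35F, sd=8F).
from statistics import NormalDist

lows = NormalDist(mu=35.0, sigma=8.0)
p_freeze = lows.cdf(32.0)             # P(daily low <= 32F) on a given day
winter_days = 90                      # roughly Dec-Feb

print(round(p_freeze, 3))             # ~0.354
print(round(p_freeze * winter_days))  # ~32 expected freezing days
# So a mean low a few degrees above freezing is entirely consistent with
# averaging ~30 freezing days per winter.
```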


I've been talking about the average lowest temperature. The 35°F temp represents the current 30-year period (1981-2010) average lowest temp, which occurs in January at DFW. This number has risen from 32°F in the 1961-1990 climate period, to 33°F in 1971-2000, to the 35°F it is at now. If you calculated the same number over the entire climate record here, which runs from September 1898 to present, that temperature would actually be 31°F. I am saying that the current 35°F temp is skewed warmer by the last 30 years, but there really has not been a significant change in the average number of freezes seen at DFW airport over the same period. I still think 31°F is a more realistic and accurate average lowest temp for this area and represents the climate better than the 35°F number. I think that number is a little misleading.


35F is not the average lowest low temperature at DFW, it's the lowest average low temperature. That is to say, on those days of the year, the average low temperature is 35F. Add substantial variability on top of that and you get plenty of days that are much colder. If this is what you're saying, sorry... it's hard to tell. Haha.

If the standard deviation is large (much greater than 4F, which I would expect for a station like DFW in winter), a 4F change in the lowest average low temperature would not necessarily translate to a significant change in the average number of days below freezing. The distribution would be wide enough that the shift in the mean would not move the tail very substantially.


Has there been a change in the number of very high minimum temps? I could imagine a scenario where there may be just as many freezes, but now there are more days where the low never drops below 50. Would 31° then still be a better measure in your opinion?


I can't say for sure whether there has been an increase in the number of days where the low doesn't drop below 50°F, but that is excessively warm for a winter low here. It's close to our normal high during the coldest part of winter, which is 56°F (it used to be 54°F).

My whole point was that the entire climate record is a better depiction of climatology than just 30 years. The climate has not changed significantly enough to be that different, and the entire record period gives a bit better data with more accurate means, in my opinion. Specifically, the last 30 years definitely skews things toward the warmer, and I'm not in the camp that believes in "Global Warming." To me, that's the same as averaging only the most recent 30 grades out of a series of 100 projects to determine someone's grade. It's arbitrary.


By definition, the mean of the last 30 years is a more "accurate" depiction of the modern state of the climate than the mean of the whole climate period, assuming climate changes over multi-decadal periods (naturally or via anthropogenic sources). So just because you feel that 31F is a more representative lowest average low doesn't make it actually the case... the value isn't just some number pulled out of thin air, it's based on actual data, and the data doesn't lie. :P

And your analogy is actually a good one... specifically, if your grades on your last 30 projects are higher than your grades over the full series of 100, in a statistically significant way, then you can say that there is a trend in your grades, and that the mean of the last 30 projects is more representative of your current average grade than the 100-project mean would be.
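A minimal sketch of that significance check, with made-up grades (a simple Welch-style t statistic; nothing here comes from the thread):

```python
# Do the last 30 "grades" differ significantly from the earlier 70?
import random
import statistics

random.seed(1)
grades = [random.gauss(80, 5) for _ in range(70)] + \
         [random.gauss(85, 5) for _ in range(30)]  # upward drift at the end

earlier, recent = grades[:70], grades[70:]
m1, m2 = statistics.fmean(recent), statistics.fmean(earlier)
v1, v2 = statistics.variance(recent), statistics.variance(earlier)
t = (m1 - m2) / (v1 / len(recent) + v2 / len(earlier)) ** 0.5

print(round(m1 - m2, 2), round(t, 2))
# |t| well above ~2 indicates the recent mean really is different, i.e. the
# last 30 values describe "now" better than the full 100-value mean does.
```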

