Everything posted by TheClimateChanger

  1. Yeah, I don't understand either. But look at the example I posted, and check others, and you'll see that, in a number of cases, it's not a simple arithmetic average. And in some cases, it diverges significantly (almost always warmer than the mean).
  2. Looking at Newark, New Jersey, it's not as clear-cut. It's teetering on the edge of summertime, as there is only one colder meteorological summer month in the expanded history (June 1903), although the mean high in June 1859 was only 0.1F warmer than the first two weeks of October 2025.
  3. I would say this is endless summer. At least it is in Ohio; not sure about New York City. This is one thing you often overlook by focusing on the anomalous past warm years: there were a lot of historic years that were VERY cold. Looking at Cleveland, the first two weeks of October are running warmer than several historic Junes and even some of the coldest Augusts. Nobody would say June or August aren't summer months just because they are colder than average. So if those months were colder than this October, and yet still considered summer, I would say you must accept that it is still summer. The mean high temperature for October so far is 72.7F. We can see there were 18 historic Junes with mean high temperatures for the month at or below that temperature, most recently in 1972. And, for August, we can find one cooler year (1927) and one just 0.1F warmer than this October to date (1915). Nobody would say August 1927 wasn't a summer month. (A rough sketch of this kind of count follows the list below.)
  4. A "normal" year would be the 10th hottest ever (excluding 2025's partial data). Note, as well, that despite this "fib" factor on the normals, every single year since 2020 has come in above even these artificially elevated norms. Prior to 2020, only four years were at or above 52.5F (1921, 1931, 1998, and 2012), and yet they want us to believe/accept that this is the new normal? What game are they playing?
  5. The normals aren't simple arithmetic averages. In fact, with the 1991-2020 normals, it looks like they were piloting some sort of new method of calculating them in Ohio, where the normals are significantly above the means. This is especially prevalent at Toledo, where the normal is an unbelievable 1.9F above the calculated mean. It appears to be a new methodology to factor in the warming trend, so that more months come in near or below average instead of exclusively above average. Because of this change, even a "normal" year would be among the warmest ever recorded at Toledo. Just another way for them to hide the incline, I guess. (See the sketch after this list for the kind of trend adjustment I suspect.)
  6. Definitely had some solid frost each of the last two mornings in my area, especially heavy on the rooftops. Did not drop below freezing though on my thermometer.
  7. I wonder how cold it got in November 1950. I know there were a lot of single-digit and subzero readings in Ohio during the great blizzard, which dropped a widespread 20-30+ inches of snow in the Ohio Valley and Appalachians.
  8. Quite a different picture for our friend in Dayton, where a daily record rainfall of nearly 2 1/2 inches converted a small annual deficit into a surplus of nearly 2 inches. This was also the 67th-wettest day on record, dating back to the 19th century! It may always be sunny in Philadelphia, but it's always cloudy in Dayton.
  9. Water withdrawals have essentially zero impact on Lake Champlain's level. They're a minute fraction of the total volume, and most of the water is treated and returned to the lake system.
  10. I'd say so with the yearly deficit nearing one foot.
  11. Increased Siberian snow cover is, in part, a consequence of a warming Arctic: Enhanced Arctic moisture transport toward Siberia in autumn revealed by tagged moisture transport model experiment | npj Climate and Atmospheric Science
  12. I think looking at precipitation alone is insufficient to determine dryness without also considering the impact of elevated temperatures on the hydrological cycle. Ground and surface waters may tell a different story. Lake Champlain, for instance, is teetering on the edge of its all-time record low level. Per this September 4th article, the all-time low is 92.4', set in 1908. Source: Lake Champlain approaching record low levels | WAMC. In recent days, the lake has dropped to within about 7" of that record, although it has recovered a bit today in response to recent rains.
  13. Wow! After coming up just 1F shy of the monthly record yesterday, CAR set a new monthly record today. They had a rough stretch in August when three consecutive days (8/11-8/13) all came up one shy of the monthly record, so this avenges that. They also tied the June (and all-time) monthly record just last year (2024).
  14. We used to be destined for greatness, but it was stolen. In 1837, the St. Clair River, north of Detroit, was still closed to navigation on June 1, with ice harvested from the river on the 4th of July! From Special Bulletin No. 149 of the Michigan Agricultural College [now Michigan State University] dated February 1926:
  15. Take a look at 1836. Tri-daily mean of 38.2F for October, and 30.8F for November, at Allegheny Arsenal in Pittsburgh (Lawrenceville). Not looking for a debate here... the 1820s data probably reflects substandard exposure and reads too high. Not that it matters much: last year was 56.4F at the airport, which, for context, is 500 feet higher in elevation. Despite the lower elevation, the 46.9F mean observed in 1837 is more than 1F lower than any year in the official record, and the 48.2F in 1836 is the second lowest. This is what was stolen from us!
  16. No, because they don't get upslope snows, which are what pile up on the western slopes of the Appalachians. It looks like DC used to average around 2 feet per year, which is still substantially more than today. They haven't had 2 feet of snow in any winter in the last 11 years. And the early numbers are probably an undercount, as they were mostly just daily observations taken after the storm had ended, versus today's systematized practice of reporting peak accumulation in each six-hour period. UCAR estimates a bias of 15-20 percent, which could be even greater if there is missing/bad data. See: Snowfall measurement: a flaky history | NCAR & UCAR News So it's reasonable to believe that, in the absence of human intervention, DC should be receiving 25-30+ inches of snow each winter (the rough arithmetic is sketched after this list). It's hard to imagine the depth of snow that should have fallen over the last 100 years but failed to fall due to human interventions.
  17. Lol, I know it sounds crazy, but it's true. Look it up. Here is another example... winters in DC used to be colder than modern winters in Elkins, West Virginia, which averages 70" of snow a year (elev: ~2k feet). And the old DC numbers are probably artificially elevated due to UHI and substandard siting/exposure (rooftop, and for a time, northern window well). By contrast, there are only like 20,000 people in all of Randolph County, West Virginia, which is a big, sprawling county full of national forestland, so there is no UHI at all.
  18. It's a general observation. The October climatology for New York used to be the same as modern Scranton, PA (elev: ~1,000 feet, and inland). Mean temp (1869-1900): 55.0F in NYC; mean temp (2015-2024): 55.1F at Scranton. And there have been several 1"+ snows in Scranton in October. Plus, the NYC figures are probably artificially elevated due to UHI, which is minimal in Scranton. The actual climate was probably colder.
  19. I can assure you that New York was screwed out of at least 5 or 6 October 1"+ snowfalls that would have occurred in a natural atmosphere.
  20. If not for the incredible warming during that period, there probably would have been several more October snowfalls that were never allowed to occur. And the lack of a 1" snowfall was more bad luck than anything, considering even DC had snowfalls in excess of an inch in October 1925 and October 1940.
  21. And it's kind of surprising too, when a lot of empirical data supports warming in that era. This blog suggests a systematic warm bias in the instrumental temperature records that predate the adoption of the Stevenson screen, along with questionable SST data... although I'm not sure how that would impact proxy-based reconstructions. It looks to me like a lot of proxy data (e.g., the above) would support warming, but I'm not sure if empirical data like this enters into the proxies, since such data does not exist for earlier years. Early global warming
  22. I think the weirdest thing is how there was a general consensus among period scientists that the late 19th century and turn of the 20th century were unusually warm, yet a lot of bogus "reconstructions" today try to tell us it was the coldest since the early Holocene. See, e.g., below: that period is shown as nearly the coldest in the past 1,000 years, surpassed only by a cooler interval in the 15th century. That makes a big difference in how the present looks compared to the past. If there was already, say, 0.2 or 0.3C of warming by 1900, then we're even further off the charts than commonly supposed. It makes VERY little sense to say these very smart people were investigating the cause of an observed warming trend when, in fact, it was unusually cold. But if you buy the reconstruction below, that's exactly the implication to be drawn.
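
A minimal sketch of the counting comparison in post #3, assuming monthly mean highs are already in hand. The June values below are placeholders, not actual Cleveland records; the 72.7F threshold is the October-to-date mean high quoted in the post.

```python
# Hedged sketch for post #3: count the historic Junes whose monthly mean high
# came in at or below this October's mean high to date.
# The June values are PLACEHOLDERS for illustration, not real Cleveland data.

october_2025_mean_high = 72.7  # F, October-to-date mean high quoted in the post

june_mean_highs = {            # hypothetical {year: mean high (F)} entries
    1903: 70.4, 1915: 73.1, 1927: 72.0, 1958: 74.6, 1972: 72.3, 1991: 79.0,
}

cooler_junes = sorted(year for year, mean_high in june_mean_highs.items()
                      if mean_high <= october_2025_mean_high)

print(f"Junes with mean high <= {october_2025_mean_high}F: {cooler_junes}")
print(f"Count: {len(cooler_junes)}")
```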
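On the normals question in posts #1 and #5: a minimal sketch, with made-up numbers rather than actual Toledo data, of how a trend-aware estimate evaluated at the end of a warming 30-year period lands above the plain arithmetic mean. This is only a guess at the flavor of adjustment involved, not NOAA's documented procedure.

```python
# Sketch for posts #1/#5: compare a plain 30-year arithmetic mean against a
# simple trend-line estimate evaluated at the end of the period. The series
# below is synthetic (a steady warming trend), NOT actual Toledo data, and
# this is a guess at the kind of adjustment, not NOAA's actual method.

annual_means = [49.0 + 0.06 * i for i in range(30)]  # stand-in for 1991-2020, F

n = len(annual_means)
simple_mean = sum(annual_means) / n

# Ordinary least-squares trend line, evaluated at the final year (2020).
x_mean = (n - 1) / 2
slope = (sum((i - x_mean) * (t - simple_mean) for i, t in enumerate(annual_means))
         / sum((i - x_mean) ** 2 for i in range(n)))
trend_at_2020 = simple_mean + slope * ((n - 1) - x_mean)

print(f"Simple 30-year mean:       {simple_mean:.2f}F")
print(f"Trend line evaluated 2020: {trend_at_2020:.2f}F")
print(f"Difference:                {trend_at_2020 - simple_mean:+.2f}F")
```

In a warming series, any estimate pegged to the end of the period will sit above the period's arithmetic mean, which is the shape of the gap described at Toledo.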
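And the back-of-envelope arithmetic behind the 25-30+ inch figure in post #16, assuming the UCAR-cited 15-20% bias means early once-daily totals captured only 80-85% of what modern six-hourly practice would record; that interpretation is mine, and the ~2 feet historical average is the figure quoted in the post.

```python
# Back-of-envelope for post #16: scale the early-record DC average up by the
# UCAR-estimated 15-20% measurement undercount. The interpretation (old totals
# = 80-85% of modern-practice totals) is an assumption, not UCAR's wording.

historical_avg_in = 24.0                 # ~2 feet/season, early DC average from the post
undercount_low, undercount_high = 0.15, 0.20

adjusted_low = historical_avg_in / (1 - undercount_low)    # about 28.2 in
adjusted_high = historical_avg_in / (1 - undercount_high)  # 30.0 in

print(f"Bias-adjusted seasonal average: {adjusted_low:.1f}-{adjusted_high:.1f} in")
```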