All Activity
-
Losing the 3k is unfortunate. In the few times I've looked at the RRFFFSSSS, the 3k was much more "correct". Losing the accuracy of mixing will be a real forecasting crunch.
-
Central PA Spring 2026 Discussion/Obs Thread
ChescoWx replied to Voyager's topic in Upstate New York/Pennsylvania
I have compiled the climate averages based on the NOAA calculation of Climate Normals for the Philly burbs of Chester County PA for a full year, with 30-year averages updated every 10 years. The first base period was 1901-1930, with our most recent being the current climate base of 1991-2020. You can see our warmest base climate normal occurred back in the 1921-1950 and 1931-1960 periods.
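For anyone curious what that calculation actually amounts to, here's a minimal Python sketch of the bookkeeping - the temperature series below is made up, a stand-in for the real Chester County record:

```python
import numpy as np

# Minimal sketch of 30-year climate normals stepped forward every 10
# years (1901-1930 through 1991-2020). The annual temperatures are
# randomly generated placeholders, NOT the actual Chester County data.
rng = np.random.default_rng(42)
years = np.arange(1901, 2021)                       # 1901..2020
annual_mean_temp = 52.0 + rng.normal(0.0, 1.0, years.size)

for start in range(1901, 1992, 10):                 # 1901, 1911, ..., 1991
    in_period = (years >= start) & (years <= start + 29)
    normal = annual_mean_temp[in_period].mean()     # one 30-year normal
    print(f"{start}-{start + 29} normal: {normal:.2f} F")
```

-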
It better come with the wet or my yard is going to be Arizona by July.
-
And besides losing the joy of a happy hour NAMing, the NAM is/was the best for mixing lines as we saw last winter. We’re gonna get surprised by pingers.
-
Another big source of error comes from rounding. We only touched upon this briefly, but it's an interesting concept to think about - thinking out loud here, I guess you can consider something like rounding a form of human error. At first glance, rounding may not seem like a huge deal, but when you're dealing with the computation of tens of thousands (maybe even hundreds of thousands or millions?) of equations, errors due to rounding are going to add up quickly, and this could very well be a large source of error by the time you get out to, say, 3 days. You could also argue that modeling is rather accurate out to 15-16 days, because when you're getting that far out in time, what are your two options really? Modeling and climatology. I never knew this until it was discussed briefly in my class as well, but the reason the GFS is run to 384 hours is because that is about the cut-off of when modeling has an advantage over climatology. After that time climatology tends to outweigh modeling. Interesting stuff
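If you want to see the rounding piece in action, here's a toy Python sketch (my own illustration, not anything from actual model code) where one tiny single-precision addition is repeated a million times:

```python
import numpy as np

# Toy illustration of rounding drift: 0.001 has no exact binary
# representation, so every single-precision add rounds a little.
steps = 1_000_000
increment = np.float32(0.001)

total32 = np.float32(0.0)
for _ in range(steps):
    total32 += increment            # rounds to ~7 decimal digits each time

exact = 0.001 * steps               # double-precision reference

print(f"float32 sum: {float(total32):.6f}")
print(f"reference:   {exact:.6f}")
print(f"accumulated rounding error: {abs(float(total32) - exact):.6f}")
```

The two totals visibly disagree even in this trivial case; a model doing this across millions of grid-point equations per time step compounds it much faster.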
-
You could almost predict that the separate layer of interpretive modeling wouldn't even be necessary. A lot of the reason behind their creation was/is to cure known error, plus anticipation and suspicion, on each model run. But therein lies the source of the human error... right? In a future where QC modeling cores are like... terrifyingly accurate out to 17-some-odd days, the interpretations become just a veneer technology layer at the output end of each cycle - more for readability. Yeah... that would, soup to nuts, be a bad day for weather forecasters as a profession, wouldn't it?
-
2026-2027 Strong/Super El Nino
GaWx replied to Stormchaserchuck1's topic in Weather Forecasting and Discussion
Latest CDAS suggests a resumption of warming has started: this is an approximation of ONI but with a slight cold bias (this implies a RONI of ~+0.5).
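For anyone following the arithmetic, a quick sketch of how those numbers could hang together - every value below is a made-up placeholder, not real CDAS or CPC data:

```python
# Hypothetical placeholders only - not real CDAS/CPC values.
nino34_cdas = 0.75          # stand-in CDAS-based Nino 3.4 anomaly (deg C)
cold_bias = 0.10            # the "slight cold bias" mentioned above
tropical_mean_anom = 0.35   # stand-in tropical-mean SST anomaly

oni_estimate = nino34_cdas + cold_bias   # CDAS reads a touch colder than ONI
# RONI discounts broad tropical warming so background warmth doesn't
# inflate the apparent ENSO signal (CPC's scaling factor omitted here).
roni_estimate = oni_estimate - tropical_mean_anom
print(f"ONI ~ {oni_estimate:+.2f}, RONI ~ {roni_estimate:+.2f}")
```

-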
I know you have stated this over the years and it's something I believe in as well, but introduce quantum computing and improved model initialization schemes and we will see forecast accuracy improve drastically - and not just accuracy, but we'll probably see less wavering too. But who knows... maybe this is just a nice theory lol
-
I take everything with a grain of salt, but I think I heard someone mention it in here in the past as well.... I'm gonna miss being NAM'd.... in both good ways and bad lol
-
2026-2027 Strong/Super El Nino
GaWx replied to Stormchaserchuck1's topic in Weather Forecasting and Discussion
-
Yeah, but those are interpretive tools, after the fact. Ha! Interpretive tools like that compendium you gave there are created by humans. You know what that means... it means we're likely to introduce a whole new layer of potential error. LOL Droll humor aside... one might suspect general error improves in step with the raw data source, if/when that improves, because obviously the interpretive algorithms are tapping into less error... etc.
-
So now what do we say when the RRFS crushes us with wind?
-
This week has been beautiful. Highs in the 60s and lows in the 40s.
-
Part of the degradation in forecasting, too, is all of these stupid products that exist: model snowfall maps, Freezing Rain Accumulation maps, Supercell Composite Parameter, Significant Tornado Parameter, the stupid hazard type at the bottom right of the SHARPpy forecast soundings... maybe throw updraft helicity swaths in here too.
-
Prismshine Productions started following New England
-
Looks like tomorrow's rain event is fizzling again. Maybe .25"
-
The dry period has been nice after weeks of rain, but after 2+ weeks of nothing we need rain again. Fortunately, we should get it this weekend into early next week.
-
As long as it's not 50 degrees in the daytime soon thereafter, I'll be okay with this.
-
Today won't be terrible here once we get this warm front north. 65-70 most of CT.
-
Not that anyone asked, but imho this sort of evolution was both inevitable and quite possibly necessary to keep up. It kind of goes along with arguing against those with AI paranoia (in general...), posturing all these dystopian fears over a future guided by x-y-z plausible consequence... There may be value in those concerns, but unfortunately, where was all this improving computing power over the generations going to go in the first place? The advent in technological history of Artificial Intelligence, at least to the scale of an Asimovian simulacrum, was always going to eventually result - and it's not impossible that actual self-awareness (currently dubbed 'the singularity' in popular culture) will arrive soon enough. A digression, yes, but in the same sense an evolution of these modeling systems is just as inevitable.

Beyond that concept, these rapid-update species make more sense to have in place with the inevitability of more ubiquitous Quantum Computing "coming on line." That latter aspect is going to be the biggest game changer in forecast modeling since Fourier Transform theory was integrated with Navier-Stokes fluid mechanics - the fundamental principle of how wave mechanics propagate through the atmospheric medium... caressed by thermodynamics and 3-D integral vector calculus, and parameterized by the Coriolis Parameter. One can perhaps intuitively see how/why this becomes exponentially more blown up by chaos out in time, because all those tiny wave functions, both initially and emergent along the processing way, feed back and cripple the outlook by corruption. Quantum Computing offers a vastly improved ability to predict where and, most importantly, when those errors are fabricated; it can also tell something about whether they are fictitious or real. Keep the real; toss the sci-fi. What remains is... a panacea for random contamination if you will, a fix for what the original modeler forefathers warned: 'because we can't assess the actual quantum states of every particle in space and time, the resulting chaos means that no weather model will be sufficiently accurate beyond a certain number of days.'

In the years since, that limit has been extended considerably just by coming up with more physically proven mechanics, combined with denser empirical input data (initialization of grids and so forth). A lot of error production during processing comes from interpolation to "fill" gaps in data sparseness... etc., because interpolation is estimating. In other words, probability guess-work. Thus, input density and increased conventional computing power did/does extend the outlook by reducing the uncertainty cost. But there's ultimately still a theoretical limit; if we are not sampling at the discrete-est levels, and then computing in the same scalar space, we are about as far as we can go with conventional tech. With ever-improving sensory technology combined with arriving Quantum Computing... that sounds quite a bit like closing the gap on that quantum blindness. Sorry... just some op-ed'ing.
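On the interpolation point specifically, a toy sketch (made-up setup, not any operational data-assimilation scheme) of how gap-filling sparse observations leaves an initialization error, and how that error shrinks as observation density grows:

```python
import numpy as np

# Reconstruct a smooth 1-D "atmospheric" field from sparse observations
# by linear interpolation and measure the initialization error. All of
# this is a hypothetical illustration, not an operational scheme.
x_true = np.linspace(0.0, 2.0 * np.pi, 2000)
field_true = np.sin(3.0 * x_true) + 0.5 * np.sin(7.0 * x_true)

for n_obs in (10, 40, 160):
    x_obs = np.linspace(0.0, 2.0 * np.pi, n_obs)    # sparse "stations"
    obs = np.sin(3.0 * x_obs) + 0.5 * np.sin(7.0 * x_obs)
    field_guess = np.interp(x_true, x_obs, obs)     # estimate between gaps
    rmse = np.sqrt(np.mean((field_guess - field_true) ** 2))
    print(f"{n_obs:4d} obs -> initialization RMSE {rmse:.4f}")
```

Denser input cuts the interpolation penalty, which is exactly the 'reducing the uncertainty cost' trade-off described above - right up until you hit the sampling floor.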
-
Yeah, the last NAM upgrade was in March of 2017. It nailed the January 2016 event. Sad to see the SPC HREF go, as its ensemble max snowfall was actually pretty good with the split bands west and east of NYC and 20”+ amounts for the February 2026 event. https://www.noaa.gov/media-release/review-of-jan-2016-blizzard-preliminary-snow-totals-validates-dc-measurement

"The preliminary Central Park measurement will be adjusted upward to 27.5 inches, which will become an all-time snowfall record for New York City when certified by NOAA's National Centers for Environmental Information. A communication error between the weather forecast office in Upton, New York, and the Central Park Conservancy, which volunteers to take official snow measurements in Central Park, led to an inaccurate preliminary total of 26.8 inches. The snow team found the mistake when reviewing the Conservancy's logbook."
-
People swear it's not windy here. It's always windy.
