
May 2026 Obs/Discussion


weatherwiz

Recommended Posts

Min 35°

Maybe the last of the 30s here? Climo says no, but there's none in the extended currently. I'm sure we'll sneak some upper 30s in somewhere even if it's early June.
 

Euro is pushing 100° in Philly for a couple of days next week. :lol:


If everyone thinks it's dry now, just wait until we move into fall and winter. With a super, super, super strong El Niño coming, the Walker Circulation is going to get so expansive it swallows the entire U.S. There will be no storm track...just a massive high pressure from coast to coast. We'll be seeing highs in the 80s into November and highs in the 50s and 60s through the winter. No precip.


15 hours ago, WxWatcher007 said:

 

Not that anyone asked but imho, this sort of evolution was both inevitable and quite possibly necessary to keep up.

It kind of goes along with arguing against those with AI paranoia (in general...), posturing all these dystopian fears over a future guided by x-y-z plausible consequence... There may be value in those concerns, but unfortunately, where was all this improving computing power over the generations going to go in the first place? The advent in technological history of Artificial Intelligence, at least to the scale of an Asimovian simulacrum, was eventually going to arrive - and it's not impossible that actual self-awareness (currently dubbed 'the singularity' in popular culture) will soon enough arrive as well.

Digression there, but in the same sense an evolution of these modeling systems is just as inevitable. Beyond that concept, these rapid-update species make more sense to have in place given the inevitability of Quantum Computing becoming more ubiquitous as it "comes on line." That latter aspect is going to be the biggest game changer in forecast modeling since Fourier Transform theory was integrated with Navier-Stokes fluid mechanics - the fundamental principle of how wave mechanics are propagated through the atmospheric medium, caressed by thermodynamics and 3-D integral vector calculus and parameterized by the Coriolis Parameter. One can perhaps intuitively see how/why this becomes exponentially more blown up by chaos out in time, because all those tiny wave functions, both in the initial state and emergent along the processing way, feed back and cripple the outlook through corruption.

Quantum Computing offers such a vastly improved ability to predict where and, most importantly, whence those errors are fabricated that it can also tell something about whether they are fictitious or real. Keep the real; toss the sci-fi. What remains is a panacea for random contamination, if you will - a fix for what the original modeler forefathers warned: 'because we can't assess the actual quantum states of every particle in space and time, the chaos that results means that no weather model will be sufficiently accurate beyond a certain number of days.'
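If it helps, here's a toy sketch of that chaotic blow-up, using the Lorenz-63 equations as a stand-in for a forecast model. Purely illustrative - the parameters, time step, and the 1e-8 perturbation are arbitrary choices, not anything from an operational core:

```python
import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (a toy chaotic 'atmosphere')."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

# Two runs that differ only by a 1e-8 perturbation in the initial x value.
a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0 + 1e-8, 1.0, 1.0])

for step in range(1, 3001):
    a = lorenz63_step(a)
    b = lorenz63_step(b)
    if step % 500 == 0:
        # The separation grows roughly exponentially until it saturates.
        print(f"t = {step * 0.01:5.1f}   separation = {np.linalg.norm(a - b):.3e}")
```

Two runs that start essentially identical end up nowhere near each other once that tiny difference has had time to feed back on itself.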

In the years since, that had been extended considerably just by coming up with more physically proven mechanics, combined with denser empirical input data (initialization of grids and so forth). A lot of error production while processing comes from interpolation to "fill" gaps in data sparseness...etc., because interpolation is estimating. In other words, probability guesswork. Thus, input density and increased conventional computing power did/does extend the range outward by reducing the uncertainty cost. But there's ultimately still a theoretical limit; if we are not sampling at the most discrete levels, and then computing in that same scalar space, we are about as far as we can go using conventional tech. With ever-improving sensor technology combined with arriving Quantum Computing... that sounds quite a bit like we're closing that gap on Quantum blindness.
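A quick sketch of the interpolation point, with made-up numbers: pretend a smooth sine wave is the "real" field, sample it at a handful of stations, and linearly interpolate to fill the gaps. The observation counts are arbitrary; the point is only that the fill error shrinks as the input gets denser.

```python
import numpy as np

# "Truth": a smooth field we pretend is the real atmosphere along one line.
grid_x = np.linspace(0.0, 2.0 * np.pi, 2001)
truth = np.sin(3.0 * grid_x)

for n_obs in (8, 16, 64):
    # Sparse "observations" of the truth, then linear interpolation to fill the gaps.
    obs_x = np.linspace(0.0, 2.0 * np.pi, n_obs)
    obs = np.sin(3.0 * obs_x)
    filled = np.interp(grid_x, obs_x, obs)
    print(f"{n_obs:3d} observations -> max fill error {np.max(np.abs(filled - truth)):.3f}")
```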

Sorry...just some op ed'ing. 


8 minutes ago, Typhoon Tip said:

Not that anyone asked but imho, this sort of evolution was both inevitable and quite possibly necessary to keep up.

It kind of goes along with arguing against those with AI paranoia (in general...), posturing all these dystopian fears over a future guided by x-y-z plausible consequence... There may be value in those concerns, but unfortunately, where was all this improving computing power over the generations going to go in the first place? The advent in technological history of Artificial Intelligence, at least to the scale of an Asimovian simulacrum, was eventually going to arrive - and it's not impossible that actual self-awareness (currently dubbed 'the singularity' in popular culture) will soon enough arrive as well.

Digression there, but in the same sense an evolution of these modeling systems is just as inevitable. Beyond that concept, these rapid-update species make more sense to have in place given the inevitability of Quantum Computing becoming more ubiquitous as it "comes on line." That latter aspect is going to be the biggest game changer in forecast modeling since Fourier Transform theory was integrated with Navier-Stokes fluid mechanics - the fundamental principle of how wave mechanics are propagated through the atmospheric medium, caressed by thermodynamics and 3-D integral vector calculus and parameterized by the Coriolis Parameter. One can perhaps intuitively see how/why this becomes exponentially more blown up by chaos out in time, because all those tiny wave functions, both in the initial state and emergent along the processing way, feed back and cripple the outlook through corruption.

Quantum Computing offers such a vastly improved ability to predict where and, most importantly, whence those errors are fabricated that it can also tell something about whether they are fictitious or real. Keep the real; toss the sci-fi. What remains is a panacea for random contamination, if you will - a fix for what the original modeler forefathers warned: 'because we can't assess the actual quantum states of every particle in space and time, the chaos that results means that no weather model will be sufficiently accurate beyond a certain number of days.'

In the years since, that had been extended considerably just by coming up with more physically proven mechanics, combined with denser empirical input data (initialization of grids and so forth). A lot of error production while processing comes from interpolation to "fill" gaps in data sparseness...etc. Thus, input density and increased conventional computing power did/does extend the range outward. But there's ultimately still a theoretical limit if we are not sampling at the most discrete levels, and then computing in that same scalar space. With ever-improving sensor technology combined with Quantum Computing... that sounds quite a bit like we're closing that gap on Quantum blindness.

Sorry...just some op ed'ing. 

Part of the degradation in forecasting, too, is all of these stupid products that exist.

Model snowfall maps, Freezing Rain Accumulation maps, Supercell Composite Parameter, Significant Tornado Parameter, the stupid hazard type at the bottom right of the SHARPpy forecast soundings, maybe throw updraft helicity swaths in here too


43 minutes ago, weatherwiz said:

Part of the degradation in forecasting, too, is all of these stupid products that exist.

Model snowfall maps, Freezing Rain Accumulation maps, Supercell Composite Parameter, Significant Tornado Parameter, the stupid hazard type at the bottom right of the SHARPpy forecast soundings, maybe throw updraft helicity swaths in here too

Yeah, but those are interpretive tools, after the fact.

Ha! Interpretive tools like that compendium you gave there are created by humans. You know what that means... it means we're likely to introduce a whole new layer of potential error. LOL

Droll humor aside... one might suspect general error improves, in step, with the raw data source improving, because obviously the interpretive algorithms are then tapping into less error to begin with...etc.
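A rough sketch of that inheritance, using a made-up SCP-style composite (normalized ingredients multiplied together - not the real Supercell Composite Parameter formula, and the "true" values are invented): perturb the raw ingredients and watch the spread of the derived product grow with the input error.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_composite(cape, srh, shear):
    """Made-up SCP-style composite: normalized ingredients multiplied together.
    (Illustrative only -- not the real Supercell Composite Parameter formula.)"""
    return (cape / 1000.0) * (srh / 50.0) * (shear / 20.0)

true_cape, true_srh, true_shear = 2500.0, 300.0, 25.0   # invented "truth"
truth = toy_composite(true_cape, true_srh, true_shear)

for noise_frac in (0.05, 0.20):
    # Perturb each raw ingredient with 5% and then 20% relative error.
    cape = true_cape * (1.0 + noise_frac * rng.standard_normal(10_000))
    srh = true_srh * (1.0 + noise_frac * rng.standard_normal(10_000))
    shear = true_shear * (1.0 + noise_frac * rng.standard_normal(10_000))
    spread = np.std(toy_composite(cape, srh, shear)) / truth
    print(f"input error {noise_frac:.0%} -> composite spread ~{spread:.1%} of truth")
```

Multiplying the ingredients together means the derived product carries more relative error than any single input, so cleaner raw fields pay off down the whole chain.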

 

 


1 minute ago, Typhoon Tip said:

Yeah, but those are interpretive tools, after the fact.

Ha! Interpretive tools like that compendium you gave there are created by humans. You know what that means... it means we're likely to introduce a whole new layer of potential error. LOL

Droll humor aside... one might suspect general error improves, in step, with the raw data source improving, because obviously the interpretive algorithms are then tapping into less error...etc.

 

 

I know you have stated this over the years and it's something I believe in as well, but introduce quantum computing and improved model initialization schemes and we will see forecast accuracy improve drastically - and not just accuracy, but we'd probably also see less wavering. But who knows...maybe this is just a nice theory lol


6 minutes ago, weatherwiz said:

I know you have stated this over the years and it's something I believe in as well, but introduce quantum computing and improved model initialization schemes and we will see forecast accuracy improve drastically - and not just accuracy, but we'd probably also see less wavering. But who knows...maybe this is just a nice theory lol

You could almost predict that the separate layer of interpretive modeling wouldn't even be necessary.

A lot of the cause behind their creation was/is to cure both known error and anticipated or suspected error, upon each model run. But therein lies the source of the human error...right?

In a future where QC modeling cores are, like, terrifyingly accurate out to 17-some-odd days, the interpretations become just a veneer technology layer at the output end of each cycle - more for readability. Yeah... that would, soup to nuts, be a bad day for weather forecasters as a profession, wouldn't it?


1 minute ago, Typhoon Tip said:

You could almost predict that the separate layer of interpretive modeling wouldn't even be necessary.

A lot of the cause behind their creation was/is to cure both known error and anticipated or suspected error, upon each model run. But therein lies the source of the human error...right?

In a future where QC modeling cores are, like, terrifyingly accurate out to 17-some days, the interpretations can be a veneer layer at the output end of each cycle -

Another big source of error comes from rounding. We only touched upon this briefly, but it's an interesting concept to think about - thinking out loud here, I guess you could consider rounding a form of human error too. At first glance, rounding may not seem like a huge deal, but when you're dealing with the computation of tens of thousands (maybe even hundreds of thousands or millions?) of equations, errors due to rounding are going to add up quickly, and this could very well be a large source of error by the time you get out to, say, 3 days.
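A small illustration of how that accumulation happens (single precision versus an effectively exact sum; the 0.01 increment and the 100,000-step count are arbitrary, not anything model-specific):

```python
import math
import numpy as np

# Sum the same 100,000 small increments in single precision and "exactly".
increments = np.full(100_000, 0.01, dtype=np.float32)

running32 = np.float32(0.0)
for v in increments:                      # float32 running sum: rounds at every step
    running32 += v

exact = math.fsum(float(v) for v in increments)   # exact sum of the stored values

print(f"float32 running sum       : {running32:.4f}")
print(f"exact sum of same values  : {exact:.4f}")
print(f"accumulated rounding error: {abs(float(running32) - exact):.4f}")
```

Each individual rounding is tiny, but a long chain of them drifts - and a model is doing this across every grid point, every variable, every time step.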

You could also argue that modeling holds its own out to 15-16 days, because when you're getting that far out in time, what are your two options really? Modeling and climatology. I never knew this until it was discussed briefly in my class as well, but the reason the GFS is run to 384 hours is that that's about the cutoff where modeling still has an advantage over climatology. After that point climatology tends to outweigh modeling. Interesting stuff.
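One way to picture that cutoff, with entirely synthetic numbers (the error-growth curve below is invented just so the skill crosses zero near day 16, echoing the 384-hour point above - it is not the GFS's actual skill curve): compare the model's mean squared error to that of a climatology forecast, and call the model useful only while its skill stays above zero.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic verification: temperature anomalies relative to climatology.
truth = rng.normal(0.0, 5.0, size=5000)   # the "observed" anomalies (deg)
climo = np.zeros_like(truth)               # climatology always forecasts zero anomaly
mse_climo = np.mean((climo - truth) ** 2)

print("lead (days)   skill vs climatology (1 = perfect, <= 0 = no better than climo)")
for lead in (1, 3, 5, 8, 12, 16, 20):
    # Toy error growth, tuned so skill crosses zero near day 16; illustrative only.
    err_sd = 5.0 * lead / 16.0
    forecast = truth + rng.normal(0.0, err_sd, size=truth.size)
    skill = 1.0 - np.mean((forecast - truth) ** 2) / mse_climo
    print(f"{lead:6d}        {skill:6.2f}")
```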

