Jump to content
  • Member Statistics

    18,657
    Total Members
    25,819
    Most Online
    Donut Hole
    Newest Member
    Donut Hole
    Joined

May 2026 Obs/Discussion


weatherwiz
 Share

Recommended Posts

Min 35°

Maybe the last of the 30s here? Climo says no, but there’s none in the extended currently. I’m sure we’ll sneak some upper 30s in somewhere even if it’s early  June. 
 

Euro is pushing 100° in Philly for a couple of days next week. :lol:

Link to comment
Share on other sites

If everyone thinks its dry now just wait until we move into Fall and Winter. With a super, super, super strong EL Nino coming the Walker Circulation is going to get so expansive and swallow the entire U.S. There will be no storm track...just a massive high pressure from coast to coast. We'll be seeing highs in the 80's into November and highs in the 50's and 60's through the winter. No precip 

  • Haha 1
  • Crap 1
  • clap 1
Link to comment
Share on other sites

15 hours ago, WxWatcher007 said:

 

Not that anyone asked but imho, this sort of evolution was both inevitable, but quite possibly necessary to keep up.

It kind of goes along with arguing against those with AI paranoia ( in general...), posturing all these dystopian fears over a future guided by x-y-z plausible consequence... There may be value in those concerns, but unfortunately, where was all this improving computing power over the generations going to go in the first place?   The advent in technological history of Artificial Intelligence, at least to the scale of an Asimovian simulacra, was going to eventually result - and it's not impossible that actual self-awareness ( currently dubbed 'the singularity' in popular culture ) will one soon enough arrive. 

Digression there, but in the same sense an evolution of these modeling systems is just as inevitable.   Beyond that concept, these rapid update species make more sense to have in place with the inevitability of a more ubiquitous as Quantum Computing "comes on line".  This latter aspect is going to be the biggest game changer in forecast modeling since Fourier Transform theory was integrated by Navier-Stokes fluid mechanics - the fundamental principle of how wave mechanics are propagated through the atmospheric medium.  .. caressed by thermodynamics, 3-D integral vector calculus and parameterized by the Coriolis Parameter... one can perhaps intuitively see how/why this becomes exponentially more blown up by chaos out in time, because all those tiny wave functions both initially, and emergent along the processing way, feed back on crippling the outlook by corruption.    Quantum Computing does such a vastly improved ability at predicting where and most importantly whence those area fabricated, it can also tell something about whether they are fictitious or real.   Keep the real; toss the sci-fi.  What remains is ...panacea for random contamination if you will, a fix for what the original modeler forefathers warned, 'because we can't asses the actual quantum states of every particle in space and time, chaos that results means that no weather model will be sufficiently accurate beyond a certain number of days'  

In the years since ...that had been extended considerably just by coming up with more physically proven mechanics, combined with denser empirical input data ( initialization of grids and so forth).  A lot of error production while processing comes from interpolation to "fill" gaps in data sparseness...etc, because interpolation is estimating. In other word, probability guess-work. Thus, input density and increased conventional computing power did/does extend outward by reducing the uncertainty cost.  But there's ultimately still a theoretical limit; if we are not sampling at the discrete-est levels, and then computing in the same scalar space, we are about as far as we can go to improve using conventional tech.  With ever improving sensory technology combined with arriving Quantum Computing ... that sounds quite a bit like we closing that gap on Quantum blindness.

Sorry...just some op ed'ing. 

  • Confused 1
Link to comment
Share on other sites

8 minutes ago, Typhoon Tip said:

Not that anyone asked but imho, this sort of evolution was both inevitable, but quite possibly necessary to keep up.

It kind of goes along with arguing against those with AI paranoia ( in general...), posturing all these dystopian fears over a future guided by x-y-z plausible consequence... There may be value in those concerns, but unfortunately, where was all this improving computing power over the generations going to go in the first place?   The advent in technological history of Artificial Intelligence, at least to the scale of an Asimovian simulacra, was going to eventually result - and it's not impossible that actual self-awareness ( currently dubbed 'the singularity' in popular culture ) will one soon enough arrive. 

Digression thee, but in the same sense an evolution of these modeling systems is just as inevitable.   Beyond that concept, these rapid update species make more sense to have in place with the inevitability of a more ubiquitous as Quantum Computing "comes on line".  This latter aspect is going to be the biggest game changer in forecast modeling since Fourier Transform theory was integrated by Navier-Stokes fluid mechanics - the fundamental principle of how wave mechanics are propagated through the atmospheric medium.  .. caressed by thermodynamics, 3-D integral vector calculus and parameterized by the Coriolis Parameter... one can perhaps intuitively see how/why this becomes exponentially more blown up by chaos out in time, because all those tiny wave functions both initially, and emergent along the processing way, feed back on crippling the outlook by corruption.    Quantum Computing does such a vastly improved ability at predicting where and most importantly whence those area fabricated, it can also tell something about whether they are fictitious or real.   Keep the real; toss the sci-fi.  What remains is ...panacea for random contamination if you will, a fix for what the original modeler forefathers warned, 'because we can't asses the actual quantum states of every particle in space and time, chaos that results means that no weather model will be sufficiently accurate beyond a certain number of days'  

In the years since ...that had been extended considerably just by coming up with more physically proven mechanics, combined with denser empirical input data ( initialization of grids and so forth).  A lot of error production while processing comes from interpolation to "fill" gaps in data sparseness...etc  Thus, input density and increased conventional computing power did/does extend outward.  But there's ultimately still a theoretical limit if we are not sampling at the discrete-est levels, and then computing in the same scalar space.  With ever improving sensory technology combined with Quantum Computing ... that sounds quite a bit like we closing that gap on Quantum blindness.

Sorry...just some op ed'ing. 

Part of the degrading in forecasting too is all of these stupid products that exist.

Model snowfall maps, Freezing Rain Accumulation maps, Supercell Composite Parameter, Significant Tornado Parameter, the stupid hazard type at the bottom right of the SHARPpy forecast soundings, maybe throw updraft helicity swaths in here too

Link to comment
Share on other sites

43 minutes ago, weatherwiz said:

Part of the degrading in forecasting too is all of these stupid products that exist.

Model snowfall maps, Freezing Rain Accumulation maps, Supercell Composite Parameter, Significant Tornado Parameter, the stupid hazard type at the bottom right of the SHARPpy forecast soundings, maybe throw updraft helicity swaths in here too

Yeah, but those are interpretive tools, after that fact. 

Ha! Interpretative tools like that compendium you gave there, they are created by humans. You know what that means ... it means we're likely to introduce a whole new layer of potential error. LOL

Drool humor aside ... one might suspect general error improves, in step, with/if the raw data source improvement; because obviously interpretive algorithms are tapping into less error to begin with...etc.    

 

 

  • Like 1
Link to comment
Share on other sites

1 minute ago, Typhoon Tip said:

Yeah, but those are interpretive tools, after that fact. 

Ha! Interpretative tools like that compendium you gave there, they are created by humans. You know what that means ... it means we're likely to introduce a whole new layer of potential error. LOL

Drool human aside ... one might suspect general error improves, in step, with/if the raw data source improves; because obviously interpretive algorithms are tapping into less error...etc.    

 

 

I know you have stated this over the years and it's something I believe in as well, but introduce quantum computing and improve model initialization schemes and we will see forecast accuracy improve drastically - and not just accuracy but we probably see less wavering. But who knows...maybe this is a nice theory lol

  • Like 1
Link to comment
Share on other sites

6 minutes ago, weatherwiz said:

I know you have stated this over the years and it's something I believe in as well, but introduce quantum computing and improve model initialization schemes and we will see forecast accuracy improve drastically - and not just accuracy but we probably see less wavering. But who knows...maybe this is a nice theory lol

You could almost predict that the separate layer of interpretive modeling wouldn't even be necessary.

A lot of the cause behind their creation was/is to cure both error that's known, but anticipation and suspicion, upon each model run.   But therein is the source of the human error...right?  

In a future whence QC modeling cores are like ...terrifyingly accurate out to 17 some odd days, the interpretations become just a veneer technology layer at the output end of each cycle - more for readability.  Yeah ..that would soup to nuts be a bad day for weather forecasters as a profession, wouldn't it?  

Link to comment
Share on other sites

1 minute ago, Typhoon Tip said:

You could almost predict that the separate layer of interpretive modeling wouldn't even be necessary.

A lot of the cause behind their creation was/is to cure both error that's known, but anticipation and suspicion, upon each model run.   But therein is the source of the human error...right?  

In a future whence QC modeling cores are like ...terrifyingly accurate out to 17 some days, the interpretations can be a veneer layer at the output end of each cycle - 

Another big source for error too comes from rounding. We only touched upon this briefly but it's an interesting concept to think about - but thinking out loud here, I guess you can consider something like rounding human error. At first glance, something like rounding may not seem like a huge deal but when you're dealing with the computation of tens of thousands (maybe even in the hundredths of thousands or millions?) of equations, errors due to rounding are going to add up quickly and this could very well be a large source for error when you start getting out to say 3 days. 

You could also argue that modeling is rather accurate out to 15-16 days because when you're getting that far out in time what are your two options really? Modelling and climatology. I never knew this until this was discussed briefly as well in my class, but the reason the GFS is ran to 384 hours is because that is about the cut-off of when modeling has an advantage over climatology. After that time climatology tends to outweigh modelling. Interesting stuff

Link to comment
Share on other sites

34 minutes ago, weatherwiz said:

Another big source for error too comes from rounding. We only touched upon this briefly but it's an interesting concept to think about - but thinking out loud here, I guess you can consider something like rounding human error. At first glance, something like rounding may not seem like a huge deal but when you're dealing with the computation of tens of thousands (maybe even in the hundredths of thousands or millions?) of equations, errors due to rounding are going to add up quickly and this could very well be a large source for error when you start getting out to say 3 days. 

You could also argue that modeling is rather accurate out to 15-16 days because when you're getting that far out in time what are your two options really? Modelling and climatology. I never knew this until this was discussed briefly as well in my class, but the reason the GFS is ran to 384 hours is because that is about the cut-off of when modeling has an advantage over climatology. After that time climatology tends to outweigh modelling. Interesting stuff

You know it just occurred to me...

Think how much work is going to have to go into the Internet technology bases, pan-systemically, for this up-coming August change.   All industry reliance - star there and start thinking about 'turning off' all these guidance'

I wonder...  hm.   One way to do it is to leave the other product suite ( old ) as is, while giving a time and chance for the source provider-ship to catch up. Running in parallel might be expensive, but provided a chance for deep, deep product integration to be root-canalled.   NAM/MOS web-pages, ... crip filing and FTP automations... Companies that may rely on those automations...  hard to know where to begin.  Graphics engines?  my god.  massive, massive overhaul in the wholesale industry.  

The other way to handle ( if smart ) is to induct Claude or Gemini CODEX AI ...like real fast, and start drafting up whole new web architectures - like head start it.  Expedience being the objective.  Because in tech parlance, August is in ten minutes from May. When the existing took the last 20+ years of human engineering to create all that product suite and deeply rooted sourcing, and product reliances ..et al - yikes.   The whole system can't really just be stopped on a dime because NOAA clicks a mouse to fire up these new tools.  

I mean what am I missing... ?  

As an afterthought, you wonder if the vendors may have already been working background with NCEP - perhaps developing against a beta system...

Link to comment
Share on other sites

6 minutes ago, Typhoon Tip said:

You know it just occurred to me...

Think how much work is going to have to go into the Internet technology bases, pan-systemically, for this up-coming August change.   All industry reliance - star there and start thinking about 'turning off' all these guidance'

I wonder...  hm.   One way to do it is to leave the other product suite ( old ) as is, while giving a time and chance for the source provider-ship to catch up. Running in parallel might be expensive, but provided a chance for deep, deep product integration to be root-canalled.   NAM/MOS web-pages, ... crip filing and FTP automations... Companies that may rely on those automations...  hard to know where to begin.  Graphics engines?  my god.  massive, massive overhaul in the wholesale industry.  

The other way to handle ( if smart ) is to induct Claude or Gemini CODEX AI ...like real fast, and start drafting up whole new web architectures - like head start it.  Because the existing took the last 20+ years of human engineering to create all that product suite and deeply rooted sourcing, and product reliances. The whole system can't really just be stopped on a dime because NOAA clicks a mouse to fire up these new tools.  

I mean what am I missing... ?  

As an afterthought, you wonder if the vendors have worked with NCEP already - perhaps developing against a beta system...

This is a great point - you would have to figure it is going to take some time for vendors to make the necessary changes and adjustments needed and I would have to imagine this is not going to be an easy task and this is going to require a ton of OT hours. Now, it's also possible many vendors have already been preparing for this as this has been known for a while but I guess the question is how much work could have been completed in preparation for this? 

Link to comment
Share on other sites

10 minutes ago, weatherwiz said:

This is a great point - you would have to figure it is going to take some time for vendors to make the necessary changes and adjustments needed and I would have to imagine this is not going to be an easy task and this is going to require a ton of OT hours. Now, it's also possible many vendors have already been preparing for this as this has been known for a while but I guess the question is how much work could have been completed in preparation for this? 

Yeah...I guess what I was in part dancing around is whether or not we should expect general product outages.  We may go through a "bug" period where modeling gets black out times.  

Prooobably why they chose August?  Since that is the most quiescent time of the year, if you're going to do a huge disruptive product overhaul, down to the software level - which I also don't have a lot of faith is likely to be properly hardship-estimated ... - that makes sense to do it during dog-day doldrums.  

I've been in several different software capacities over the last 25 years ... I can tell you... when the issue is code-base level, the time estimates are almost always between 2 and 5 times longer in reality then is planned. 

Any other software engineers in here will know what I am talking about. 

  • Like 1
Link to comment
Share on other sites

2 hours ago, Typhoon Tip said:

Not that anyone asked but imho, this sort of evolution was both inevitable, but quite possibly necessary to keep up.

It kind of goes along with arguing against those with AI paranoia ( in general...), posturing all these dystopian fears over a future guided by x-y-z plausible consequence... There may be value in those concerns, but unfortunately, where was all this improving computing power over the generations going to go in the first place?   The advent in technological history of Artificial Intelligence, at least to the scale of an Asimovian simulacra, was going to eventually result - and it's not impossible that actual self-awareness ( currently dubbed 'the singularity' in popular culture ) will one soon enough arrive. 

Digression there, but in the same sense an evolution of these modeling systems is just as inevitable.   Beyond that concept, these rapid update species make more sense to have in place with the inevitability of a more ubiquitous as Quantum Computing "comes on line".  This latter aspect is going to be the biggest game changer in forecast modeling since Fourier Transform theory was integrated by Navier-Stokes fluid mechanics - the fundamental principle of how wave mechanics are propagated through the atmospheric medium.  .. caressed by thermodynamics, 3-D integral vector calculus and parameterized by the Coriolis Parameter... one can perhaps intuitively see how/why this becomes exponentially more blown up by chaos out in time, because all those tiny wave functions both initially, and emergent along the processing way, feed back on crippling the outlook by corruption.    Quantum Computing does such a vastly improved ability at predicting where and most importantly whence those area fabricated, it can also tell something about whether they are fictitious or real.   Keep the real; toss the sci-fi.  What remains is ...panacea for random contamination if you will, a fix for what the original modeler forefathers warned, 'because we can't asses the actual quantum states of every particle in space and time, chaos that results means that no weather model will be sufficiently accurate beyond a certain number of days'  

In the years since ...that had been extended considerably just by coming up with more physically proven mechanics, combined with denser empirical input data ( initialization of grids and so forth).  A lot of error production while processing comes from interpolation to "fill" gaps in data sparseness...etc, because interpolation is estimating. In other word, probability guess-work. Thus, input density and increased conventional computing power did/does extend outward by reducing the uncertainty cost.  But there's ultimately still a theoretical limit; if we are not sampling at the discrete-est levels, and then computing in the same scalar space, we are about as far as we can go to improve using conventional tech.  With ever improving sensory technology combined with arriving Quantum Computing ... that sounds quite a bit like we closing that gap on Quantum blindness.

Sorry...just some op ed'ing. 

Wow-- even denser than usual! My thesaurus just spontaneously ignited.

  • Haha 1
Link to comment
Share on other sites

Create an account or sign in to comment

You need to be a member in order to leave a comment

Create an account

Sign up for a new account in our community. It's easy!

Register a new account

Sign in

Already have an account? Sign in here.

Sign In Now
 Share

×
×
  • Create New...