dendrite Posted 4 hours ago
Min 35° Maybe the last of the 30s here? Climo says no, but there’s none in the extended currently. I’m sure we’ll sneak some upper 30s in somewhere even if it’s early June. Euro is pushing 100° in Philly for a couple of days next week.
dendrite Posted 3 hours ago
Growing consensus for 1-3” S NH/ME into NE MA…euro, gfs, 3k. The soaker I’ve been waiting for?
Damage In Tolland Posted 3 hours ago
9 hours ago, dendrite said: June July May August September April December October March November February January
The spring months are the worst. Just a terrible season. Julorch Augdewst Junorch January February December Septorcher November Octornace Mayorch Morch Napril
dendrite Posted 3 hours ago
Wow, Feb above Dec despite the holiday and low sun angle? Winter-wise…does anything beat a snowy Dec?
dendrite Posted 3 hours ago
And the summer months above the winter months. We old now.
Damage In Tolland Posted 3 hours ago
3 minutes ago, dendrite said: And the summer months above the winter months. We old now.
I thought hard about that. But we’ve mainly lost good winters (I know we just had one). But they are so rare anymore that I’d rather have HHH and storms and possible canes if a rifle is pointed at my knee.
weatherwiz Posted 3 hours ago
If everyone thinks it's dry now, just wait until we move into fall and winter. With a super, super, super strong El Niño coming, the Walker Circulation is going to get so expansive it will swallow the entire U.S. There will be no storm track...just a massive high pressure from coast to coast. We'll be seeing highs in the 80s into November and highs in the 50s and 60s through the winter. No precip.
Damage In Tolland Posted 3 hours ago
Euro/NAM are relative Steins east of the River. Both .25-.40 or so. Only the GFS has significant rains there.
SouthCoastMA Posted 2 hours ago
Summer/deep spring starts Saturday. Good riddance to the extended late-winter/early-spring meh.
weatherwiz Posted 2 hours ago
NBMv5 became operational last week. Curious to see how this performs.
Typhoon Tip Posted 1 hour ago
15 hours ago, WxWatcher007 said:
Not that anyone asked, but imho this sort of evolution was both inevitable and quite possibly necessary to keep up. It kind of goes along with arguing against those with AI paranoia (in general...), posturing all these dystopian fears over a future guided by x-y-z plausible consequence... There may be value in those concerns, but unfortunately, where was all this improving computing power over the generations going to go in the first place? The advent in technological history of Artificial Intelligence, at least to the scale of an Asimovian simulacrum, was eventually going to result - and it's not impossible that actual self-awareness (currently dubbed 'the singularity' in popular culture) will soon enough arrive. Digression there, but in the same sense an evolution of these modeling systems is just as inevitable. Beyond that concept, these rapid-update species make more sense to have in place with the inevitability of a more ubiquitous Quantum Computing "coming on line". This latter aspect is going to be the biggest game changer in forecast modeling since Fourier Transform theory was integrated with Navier-Stokes fluid mechanics - the fundamental principle of how wave mechanics are propagated through the atmospheric medium... caressed by thermodynamics and 3-D integral vector calculus, and parameterized by the Coriolis Parameter... One can perhaps intuitively see how/why this becomes exponentially more blown up by chaos out in time, because all those tiny wave functions, both initially and emergent along the processing way, feed back on crippling the outlook by corruption. Quantum Computing offers such a vastly improved ability at predicting where and, most importantly, whence those are fabricated that it can also tell something about whether they are fictitious or real. Keep the real; toss the sci-fi.
What remains is ...a panacea for random contamination if you will, a fix for what the original modeler forefathers warned: 'because we can't assess the actual quantum states of every particle in space and time, the chaos that results means that no weather model will be sufficiently accurate beyond a certain number of days.' In the years since, that had been extended considerably just by coming up with more physically proven mechanics combined with denser empirical input data (initialization of grids and so forth). A lot of error production while processing comes from interpolation to "fill" gaps in data sparseness...etc., because interpolation is estimating. In other words, probability guess-work. Thus, input density and increased conventional computing power did/does extend outward by reducing the uncertainty cost. But there's ultimately still a theoretical limit; if we are not sampling at the discretest levels, and then computing in the same scalar space, we are about as far as we can go to improve using conventional tech. With ever-improving sensory technology combined with arriving Quantum Computing ... that sounds quite a bit like we're closing that gap on Quantum blindness. Sorry...just some op-ed'ing.
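The point above about interpolation being estimation can be made concrete with a toy sketch (the field, grid spacing, and function names here are invented for illustration, not from any actual model): linearly interpolating a curved "true" field from sparse samples is always off, and the error is worst midway between the sample points.

```python
import math

def true_field(x: float) -> float:
    # A hypothetical smoothly curving "true" field standing in for
    # an under-sampled atmospheric quantity.
    return math.sin(x)

def lerp(x0: float, y0: float, x1: float, y1: float, x: float) -> float:
    # Linear interpolation between two known samples.
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Sparse sampling: one observation every 1.0 units.
coarse = [(float(i), true_field(float(i))) for i in range(8)]

# Check the interpolated estimate at each interval midpoint,
# where we are farthest from any real sample.
worst = 0.0
for i in range(len(coarse) - 1):
    (x0, y0), (x1, y1) = coarse[i], coarse[i + 1]
    xm = (x0 + x1) / 2
    est = lerp(x0, y0, x1, y1, xm)
    worst = max(worst, abs(est - true_field(xm)))

print(f"worst midpoint interpolation error: {worst:.3f}")
```

The error shrinks roughly with the square of the grid spacing, which is the sense in which denser input data "extends outward by reducing the uncertainty cost" as the post puts it.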
CT Valley Snowman Posted 1 hour ago
Today won't be terrible here once we get this warm front north. 65-70 most of CT.
PhiEaglesfan712 Posted 1 hour ago
2 hours ago, dendrite said: Euro is pushing 100° in Philly for a couple of days next week.
As long as it's not 50 degrees in the daytime soon thereafter, I'll be okay with this.
weatherwiz Posted 1 hour ago
8 minutes ago, Typhoon Tip said: Not that anyone asked, but imho this sort of evolution was both inevitable and quite possibly necessary to keep up. ...
Part of the degradation in forecasting, too, is all of these stupid products that exist: model snowfall maps, freezing rain accumulation maps, Supercell Composite Parameter, Significant Tornado Parameter, the stupid hazard type at the bottom right of the SHARPpy forecast soundings. Maybe throw updraft helicity swaths in here too.
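For readers unfamiliar with the parameters being griped about here: they are "interpretive" composites computed from raw model fields. As a sketch, this is the Supercell Composite Parameter in the commonly cited SPC form as I recall it (treat the exact terms and caps as illustrative rather than authoritative): SCP = (MUCAPE/1000) × (effective SRH/50) × a shear term that is 0 below 10 m/s of effective bulk wind difference, EBWD/20 between 10 and 20 m/s, and capped at 1 above 20 m/s.

```python
def scp(mucape_jkg: float, esrh_m2s2: float, ebwd_ms: float) -> float:
    """Supercell Composite Parameter, sketched from the usual SPC form.

    mucape_jkg: most-unstable CAPE (J/kg)
    esrh_m2s2:  effective storm-relative helicity (m^2/s^2)
    ebwd_ms:    effective bulk wind difference (m/s)
    """
    if ebwd_ms < 10.0:
        shear_term = 0.0        # too little shear: parameter zeroes out
    elif ebwd_ms > 20.0:
        shear_term = 1.0        # capped above 20 m/s
    else:
        shear_term = ebwd_ms / 20.0
    return (mucape_jkg / 1000.0) * (esrh_m2s2 / 50.0) * shear_term

# A strongly sheared, high-CAPE environment scores well above 1:
print(scp(2500.0, 300.0, 25.0))  # 2.5 * 6.0 * 1.0 = 15.0
```

Note how every input is itself a model-derived quantity, which is exactly Tip's point below: the composite can only be as good as the raw fields feeding it, plus whatever assumptions the humans baked into the thresholds.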
Typhoon Tip Posted 1 hour ago
43 minutes ago, weatherwiz said: Part of the degradation in forecasting, too, is all of these stupid products that exist. ...
Yeah, but those are interpretive tools, after the fact. Ha! Interpretive tools like that compendium you gave there are created by humans. You know what that means ... it means we're likely to introduce a whole new layer of potential error. LOL Drool humor aside ... one might suspect general error improves in step with raw data source improvement, because obviously the interpretive algorithms are then tapping into less error to begin with...etc.
weatherwiz Posted 1 hour ago
1 minute ago, Typhoon Tip said: Yeah, but those are interpretive tools, after the fact. ...
I know you have stated this over the years and it's something I believe in as well, but introduce quantum computing and improved model initialization schemes and we will see forecast accuracy improve drastically - and not just accuracy; we'd probably see less wavering too. But who knows...maybe this is a nice theory lol
Typhoon Tip Posted 1 hour ago
6 minutes ago, weatherwiz said: ...introduce quantum computing and improved model initialization schemes and we will see forecast accuracy improve drastically...
You could almost predict that the separate layer of interpretive modeling wouldn't even be necessary. A lot of the cause behind their creation was/is to cure both known error and the anticipation and suspicion of error, upon each model run. But therein is the source of the human error...right? In a future whence QC modeling cores are like ...terrifyingly accurate out to 17-some-odd days, the interpretations become just a veneer technology layer at the output end of each cycle - more for readability. Yeah ...that would, soup to nuts, be a bad day for weather forecasters as a profession, wouldn't it?
weatherwiz Posted 1 hour ago
1 minute ago, Typhoon Tip said: You could almost predict that the separate layer of interpretive modeling wouldn't even be necessary. ...
Another big source of error comes from rounding. We only touched upon this briefly in class but it's an interesting concept to think about - thinking out loud here, I guess you can consider something like rounding to be human error. At first glance rounding may not seem like a huge deal, but when you're dealing with the computation of tens of thousands (maybe even hundreds of thousands, or millions?) of equations, errors due to rounding are going to add up quickly, and this could very well be a large source of error once you start getting out to, say, 3 days. You could also argue that modeling is rather accurate out to 15-16 days, because when you're getting that far out in time what are your two options, really? Modeling and climatology. I never knew this until it was discussed briefly in my class, but the reason the GFS is run to 384 hours is that that is about the cut-off of when modeling has an advantage over climatology. After that time climatology tends to outweigh modeling. Interesting stuff.
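The rounding point is easy to demonstrate outside of any weather model (this is a generic floating-point sketch, not anything NWP-specific): each operation can be off by a tiny amount, and over enough operations the drift compounds. 0.1 has no exact binary representation, so adding it repeatedly wanders away from the exact decimal answer.

```python
# Sum 0.1 a million times; exactly, this is 100000.
n = 1_000_000
s = 0.0
for _ in range(n):
    s += 0.1  # each add rounds to the nearest representable double

print(f"accumulated sum: {s!r}")
print(f"drift from exact 100000: {abs(s - 100000.0):.2e}")
```

The per-operation error is around one part in 10^16, yet a million operations leave a measurable residue; model cores execute vastly more operations per forecast hour, which is why precision choices (and compensated-summation tricks) matter out at day 3 and beyond.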
weatherwiz Posted 1 hour ago
Wouldn't be surprised to see some small hailers tomorrow.
Typhoon Tip Posted 52 minutes ago
34 minutes ago, weatherwiz said: Another big source of error comes from rounding. ...
You know, it just occurred to me... Think how much work is going to have to go into the Internet technology bases, pan-systemically, for this upcoming August change. All industry reliance - start there, and start thinking about 'turning off' all these guidance. I wonder... hm. One way to do it is to leave the other (old) product suite as-is while giving time and a chance for the source provider-ship to catch up. Running in parallel might be expensive, but it provides a chance for deep, deep product integration to be root-canalled. NAM/MOS web pages, ... GRIB filing and FTP automations... companies that may rely on those automations... hard to know where to begin. Graphics engines? My god. Massive, massive overhaul in the wholesale industry.
The other way to handle it (if smart) is to induct Claude or Gemini CODEX AI ...like real fast, and start drafting up whole new web architectures - like head-start it. Expedience being the objective. Because in tech parlance, August is ten minutes from May. When the existing took the last 20+ years of human engineering to create all that product suite and deeply rooted sourcing, and product reliances ...et al - yikes. The whole system can't really just be stopped on a dime because NOAA clicks a mouse to fire up these new tools. I mean, what am I missing...? As an afterthought, you wonder if the vendors may have already been working in the background with NCEP - perhaps developing against a beta system...
weatherwiz Posted 43 minutes ago
6 minutes ago, Typhoon Tip said: Think how much work is going to have to go into the Internet technology bases, pan-systemically, for this upcoming August change. ...
This is a great point - you would have to figure it is going to take some time for vendors to make the necessary changes and adjustments, and I would have to imagine this is not going to be an easy task; it's going to require a ton of OT hours. Now, it's also possible many vendors have already been preparing, since this has been known for a while, but I guess the question is how much work could have been completed in preparation?
SouthCoastMA Posted 38 minutes ago
May get relatively steined here. Looks like the outer Cape has a better shot, and models are locking in on DE Maine/NE Mass.
Typhoon Tip Posted 34 minutes ago
10 minutes ago, weatherwiz said: ...you would have to figure it is going to take some time for vendors to make the necessary changes and adjustments...
Yeah...I guess what I was in part dancing around is whether or not we should expect general product outages. We may go through a "bug" period where modeling gets blackout times. Prooobably why they chose August? Since that is the most quiescent time of the year, if you're going to do a huge disruptive product overhaul, down to the software level - which I also don't have a lot of faith is likely to be properly hardship-estimated... - it makes sense to do it during the dog-day doldrums. I've been in several different software capacities over the last 25 years ... I can tell you... when the issue is code-base level, the real time costs are almost always between 2 and 5 times longer than planned. Any other software engineers in here will know what I'm talking about.
OceanStWx Posted 16 minutes ago
3 hours ago, dendrite said: Growing consensus for 1-3” S NH/ME into NE MA…euro, gfs, 3k. The soaker I’ve been waiting for?
This is just the pre-Christmas forecast.
weatherwiz Posted 5 minutes ago
10 minutes ago, OceanStWx said: This is just the pre-Christmas forecast.
Maybe this Christmas we'll be ripping severe weather, but if we're swallowed by the Walker cell we may not even get any fronts.