Typhoon Tip
Meteorologist
-
Mm... he may partially be right... I've been watching this and comparing it to various guidance. When this region pivots N - and it's not 'filling' in or redeveloping back as it's been pushing coherently N - it's likely to shut off the spigot pretty abruptly. We lull... question is, would there then be a rejuvenation as a kind of weak easterly-anomaly conveyor temporarily sets up while the filling closed low rolls underneath...? Maybe. The NAM is paltry with that, though. And given the weakening kinematics... that's not necessarily tossed. In other words, I can see a pathway where your neck of the woods gets kind of shafted here https://weather.cod.edu/satrad/?parms=subregional-Mid_Atlantic-02-24-1-100-1&checked=map&colorbar=undefined
-
That's a good point. I didn't consider that. I tell you what... I bet now that we're in the solar max we may still do better. I'll let it ride and take the hit if I'm wrong. I'm still thinking 77's doable, but yeah. Part of my thinking is that we're facing a pretty significant built-in correction vector, one with explosive room to run, too. As an aside, this really is an impressive wholesale change coming after tomorrow. Just sayin', the NAM's 2-m temps at 18z are 69 at ORH and 71 at BED. That's garbage under near-full sun and a superb wind direction in a summer mixing profile. The BL is gonna get tall. We all know FIT/ASH/BED run 6 F warmer than ORH in that setting. Granted, MET MOS is 75, so... it's a nerd's quibble.
-
Also, entertaining a low-grade heat wave Mon-Wed. The 89-92 variety... close. Subtle yet crucially amplifying ridge heights are situated nicely for deep-layer heat transport in recent multi-source guidance. It could tick more pronounced yet, too. Also, no BD or N-door frontal sag until perhaps too late on Wed to matter. Meanwhile, 850s steadily improve from 10 all the way to 16 C through the prior weekend, and the general cloud/RH fields are < 60%, so the sun soaks in, elevating successive launch temperatures. Not big heat, but very warm. Civility may be caught a little off guard... despite a technically above-average April, it really has not sensibly felt that way.
-
I'd use the MAV for Saturday. The MET's not high enough for those synoptic params. If you look at the 850s warming from 15 through 21z in a pocket just E, over the water, that's actually the mixed diurnal plume leaving the coast - which means the mixing height actually reached that level. It's 10 C, and the adiabatic extrapolation to 1000 mb is 22.5, so the customary 2-to-3 C slope addition for the real surface yields closer to 25 C (77 F), which is what the MAV sports. That could also be a degree shy, too.
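( For anyone who wants the arithmetic: a minimal sketch of that extrapolation, conserving potential temperature down the dry adiabat via Poisson's equation. The 22.5 quoted above is consistent with a thickness-times-9.8 C/km estimate; the Poisson version lands about a degree warmer. The only inputs are the numbers from the post itself. )

```python
# Minimal sketch of the mixed-layer extrapolation described above:
# bring the 850 mb temperature down a dry adiabat (conserve potential
# temperature), then add the empirical 2-3 C "slope addition" for the
# superadiabatic surface layer.
RD_OVER_CP = 287.05 / 1004.0  # kappa = R_d / c_p for dry air

def dry_adiabatic_descent(t_c, p_from_mb=850.0, p_to_mb=1000.0):
    """Extrapolate a temperature down a dry adiabat (Poisson's equation)."""
    return (t_c + 273.15) * (p_to_mb / p_from_mb) ** RD_OVER_CP - 273.15

t_1000 = dry_adiabatic_descent(10.0)   # ~23.5 C (vs. the 22.5 quoted)
t_sfc_c = t_1000 + 2.5                 # the 2-3 C slope addition
t_sfc_f = t_sfc_c * 9.0 / 5.0 + 32.0   # ballpark of the MAV's 77 F
print(f"1000 mb: {t_1000:.1f} C | surface: {t_sfc_c:.1f} C ({t_sfc_f:.0f} F)")
```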
-
Wow... an impressive 6-hour change Saturday morning in this new NAM grid. Walking through a summer doorway.
-
The heaviest rain this whole system ordeal has to offer may in fact come from this introductory band that Brian just showed there. The whole system is closing off, but that behavior is happening over pallid gradients/weak baroclinicity, while the mid and upper heights are really filling as it closes... not deepening further. As it pulls away, it even opens the trough back up and ends up almost washed out of the hemisphere by the next day as its corpse smears up over the Maritimes. [edit: it actually does go through another deepening phase up there, but then right after, it disappears] Systems tend not to score well when they're limping into the final. The NAM's lower QPF all along may end up verifying better here. The other models "might" be overproducing QPF given the weakening structure. It'll be interesting to see what happens with this. It is still a closed circulation, though maintaining progressive movement... but closed circulations may get a window of easterly anomaly, which adds... So both a lower and a higher outcome are in play.
-
Oh geez - I personally have no experience yet working with these tools set to replace the existing ones... I was talking to Wiz' yesterday; just the roll-out in August could end up a clusterfuck. So many tech suites have the present modeling tools deeply integrated, soup to nuts, and all of that has to be reworked. Graphical processing alone - eesh. So if, on top of all that headache, it's to put piece-of-shit bad models into play? haha. Sounds like a governmental operation, huh
-
Occasional Thoughts on Climate Change
Typhoon Tip replied to donsutherland1's topic in Climate Change
Mm... FWIW, my own interactions with AI tools, and the impressions therefrom, are more favorable than that. It comes down to one's own responsibility to ask the question in the right way.

One aspect I will fault AI on is that it sometimes hooks its teeth into an adjective/verb one has chosen, as gospel - perhaps the user had the 2nd or 3rd preferential definition in mind when they chose it. It appears as though the AI probabilistically leans on the first, most proper usage? Speculation. Either way, it doesn't offer suggestions from suspicion over what the user really meant - probably because it's not a human being in that sense. AI isn't yet at the level of "what were they really thinking?" This can tint the context when it's subtle, and at other times outright divert conversations down paths users weren't really intending. Thing is, it was always because of the user's word choice. Cobalt says "...the vibes are off," and Dan mentions nuances... etc. You know, those strike me as the state of the art of the technology not really getting the "spirit" of the exchange, in lieu of its tendency to run to the most concise meaning of a turn of phrase and/or word choice. I've gone back along the exchange history, found inflection points, and said, "I didn't mean to imply x, I meant more y," etc... and after a brief pause, the AI admits to a course correction, so to speak.

It can also help if you use the custom-instruction option in the settings to color the type of experience you want. Mine says: no flattery. Don't be obsequious. This actually helps... because if you use a word that's a little off, the AI will be less likely to just ignore/accept it - it might even ask me how what I just said relates.

I could almost see a future where a new type of job req emerges in industries that have adopted/bought into AI, called "AICE" employees - pronounced "acer." These are "AI Configuration Engineers." What do you do? "I'm an Acer" for x-y-z. The job entails a full, intimate understanding of the tech/circumstance, such that the engagement with the AI teases out the best solution without those distractions. Which, believe it or not, are hugely costly - even small ones; the deviation and the expenditure in recovery add to the growing data-center push-back concern over resource piggery. It is expensive for a lot of reasons. This could all just be generational, too.

It's important to bear in mind, this tech is like the Wright Brothers' first 90 feet of successful flight... well, proportionally, maybe a little farther along. But we're nowhere close to flying high-altitude international flight routes in that metaphor just yet. There are advances, lots of them. That ambit of research is definitely not going to stop, for better or worse! Plus, imagine when Quantum Computing comes online: a computing core that finds all possible definitions that can exist, at the same instant, and chooses the top probability - now... plug Gemini into that. Hm?

So far, by keeping tabs on my own concision when dealing with AI, I've come to find it the most advantageous accompaniment to both problem solving and the creative process since either the invention of the scientific calculator or biology's ability to dream. Emphasis on accompanying helper - we're a ways yet from it landing soup-to-nuts solutions in isolation.
-
Like... some in here will lose it if any kind of weather happens? Not sure what was so distracting, warm or cold, about any model run this morning, for that matter.
-
Weird warm sector... PHL-NWR and all points surrounding and in between have DPs between 43 and 50 F... whilst clearly, by other obs, a warm front has soared N... One might have thunk DPs of 60 in the warm sector, but that's a dry warm dose punching up the coast. In fact, said warm boundary is nearing or just passing through HFD as you're reading this. Winds bump S and go gusty SW across CT, and the sat loops show day-glow skies if not sun splashing there. Moving rapidly NE. We're likely to get that even up to Rt 2 and SE NH soon... We won't make 70, but I could see it surging into the mid 60s.
-
Yeah... I guess what I was in part dancing around is whether or not we should expect general product outages. We may go through a "bug" period where modeling gets blackout times. Prooobably why they chose August? Since that is the most quiescent time of the year: if you're going to do a huge, disruptive product overhaul, down to the software level - which I also don't have a lot of faith will be properly hardship-estimated... - it makes sense to do it during the dog-day doldrums. I've been in several different software capacities over the last 25 years... I can tell you, when the issue is code-base level, the time estimates in reality run 2 to 5 times longer than planned. Any other software engineers in here will know what I'm talking about.
-
You know, it just occurred to me... Think how much work is going to have to go into the Internet technology bases, pan-systemically, for this upcoming August change. All that industry reliance - start there and start thinking about 'turning off' all this guidance. I wonder... hm. One way to do it is to leave the other ( old ) product suite as is, while giving time and a chance for the source providership to catch up. Running in parallel might be expensive, but it provides a chance for the deep, deep product integration to be root-canalled. NAM/MOS web pages... GRIB filing and FTP automations... companies that may rely on those automations... hard to know where to begin. Graphics engines? My god. A massive, massive overhaul across the whole industry. The other way to handle it ( if smart ) is to enlist Claude or Gemini or Codex AI... like, real fast, and start drafting whole new web architectures - get a head start on it. Expedience being the objective. Because in tech parlance, August is ten minutes from May. When it took the last 20+ years of human engineering to create the existing product suite and its deeply rooted sourcing and product reliances, et al. - yikes. The whole system can't really just be stopped on a dime because NOAA clicks a mouse to fire up these new tools. I mean, what am I missing...? As an afterthought, you wonder if the vendors may have already been working in the background with NCEP - perhaps developing against a beta system...
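( To make the run-in-parallel idea concrete - a minimal sketch, with purely hypothetical placeholder URLs, not real NCEP/NOMADS paths: downstream automations try the new suite's feed first and quietly fall back to the legacy product, so nothing downstream has to stop on a dime during the cutover. )

```python
# Hypothetical parallel-run fallback: poll the new feed, fall back to legacy.
import urllib.error
import urllib.request

SOURCES = [
    "https://example.com/new-suite/latest.grib2",   # hypothetical new feed
    "https://example.com/legacy-nam/latest.grib2",  # hypothetical old feed
]

def fetch_first_available(urls):
    """Return bytes from the first source that answers."""
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                return resp.read()
        except (urllib.error.URLError, TimeoutError):
            continue  # feed not live, or blacked out -- try the next one
    raise RuntimeError("no model source available")
```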
-
You could almost predict that the separate layer of interpretive modeling wouldn't even be necessary. A lot of the cause behind their creation was/is to cure error in each model run - both known error, and anticipated/suspected error. But therein is the source of the human error... right? In a future where QC modeling cores are, like... terrifyingly accurate out to 17-some-odd days, the interpretations become just a veneer technology layer at the output end of each cycle - more for readability. Yeah... that would, soup to nuts, be a bad day for weather forecasters as a profession, wouldn't it?
-
Yeah, but those are interpretive tools, after the fact. Ha! Interpretive tools like that compendium you gave there are created by humans. You know what that means... it means we're likely to introduce a whole new layer of potential error. LOL. Droll humor aside... one might suspect general error improves in step with any improvement in the raw data source, because then the interpretive algorithms are obviously tapping into less error to begin with... etc.
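( A toy version of that "interpretive layer," just to make the point: a one-predictor, MOS-style least-squares correction of raw 2-m temps against verifying obs. The numbers are invented for illustration - real MOS regresses many predictors over long training periods. If the raw model's bias shrinks, the same regression simply has less error left to correct, i.e. improvement "in step." )

```python
# Toy MOS-style interpretive layer: fit corrected = a + b * raw.
def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

raw_fcst = [69, 71, 74, 68, 72]   # invented raw 2-m model temps (F)
observed = [75, 76, 80, 73, 78]   # invented verifying obs (F)

a, b = fit_linear(raw_fcst, observed)
print(f"corrected = {a:.1f} + {b:.2f} * raw")  # the human-built "layer"
```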
-
Not that anyone asked, but IMHO this sort of evolution was both inevitable and quite possibly necessary to keep up. It kind of goes along with arguing against those with AI paranoia ( in general... ), posturing all these dystopian fears over a future guided by x-y-z plausible consequence... There may be value in those concerns, but unfortunately, where was all this improving computing power over the generations going to go in the first place? The advent in technological history of Artificial Intelligence, at least at the scale of an Asimovian simulacrum, was eventually going to result - and it's not impossible that actual self-awareness ( currently dubbed 'the singularity' in popular culture ) will soon enough arrive. Digression there, but in the same sense, an evolution of these modeling systems is just as inevitable.

Beyond that concept, these rapid-update species make more sense to have in place with the inevitability of a more ubiquitous Quantum Computing "coming online." That latter aspect is going to be the biggest game changer in forecast modeling since Fourier transform theory was integrated with Navier-Stokes fluid mechanics - the fundamental principle of how wave mechanics propagate through the atmospheric medium... caressed by thermodynamics and 3-D vector calculus, and parameterized by the Coriolis parameter... One can perhaps intuitively see how/why this gets exponentially blown up by chaos out in time: all those tiny wave functions, both initial and emergent along the processing way, feed back and cripple the outlook through corruption. Quantum Computing offers a vastly improved ability to predict where - and, most importantly, whence - those are fabricated; it can also tell something about whether they are fictitious or real. Keep the real; toss the sci-fi. What remains is... a panacea for random contamination, if you will - a fix for what the original modeler forefathers warned: 'because we can't assess the actual quantum states of every particle in space and time, the chaos that results means no weather model will be sufficiently accurate beyond a certain number of days.'

In the years since, that limit has been extended considerably just by coming up with more physically proven mechanics, combined with denser empirical input data ( initialization of grids and so forth ). A lot of error production during processing comes from interpolation to "fill" gaps in data sparseness... etc., because interpolation is estimating. In other words, probability guesswork. Thus, input density and increased conventional computing power did/does extend the range by reducing the uncertainty cost. But there's ultimately still a theoretical limit: if we are not sampling at the discretest levels, and then computing in that same scalar space, we are about as far as we can go using conventional tech. With ever-improving sensory technology combined with arriving Quantum Computing... that sounds quite a bit like closing the gap on quantum blindness. Sorry... just some op-ed'ing.
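( On the chaos point: the classic toy illustration - not anything from the modeling suites above, just the Lorenz-63 system integrated twice from initial states one part in a million apart. The runs track together for a while, then diverge completely, which is exactly the growth of tiny initial uncertainty that caps deterministic forecast range. )

```python
# Lorenz-63 sensitivity demo: two runs, initial x differing by 1e-6.
def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (illustrative only)."""
    x, y, z = s
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a, b = (1.0, 1.0, 1.0), (1.000001, 1.0, 1.0)
for step in range(1, 5001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        sep = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
        print(f"t = {step * 0.01:5.1f}   separation = {sep:.6f}")
```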
