Will We Ever Get To The Point Where Models Are 100% Accurate?


Recommended Posts

If anything, I'm concerned that NWP models are being degraded by attempts to make them too perfect. For example, at my WFO we've been wondering whether parameterizations etc. that worked well at coarser resolutions are not working as well at finer resolutions. IMO the NAM has never been the same since becoming the NMM version of the WRF. And some SREF members have shown really poor skill of late, skewing our experimental snowfall probabilities to the high side. To paraphrase Dr. McCoy, I'm a forecaster, not a modeler, and don't know the answer myself, but would sure love some guidance here.

1.  The NAM is no longer "WRF-based".  It runs a version of the NMM under something called NEMS, which is a NOAA modeling software framework based on ESMF.

2.  The SREF system probably needs a complete redesign.  There is ongoing work to develop a convection-scale ensemble based on the HRRR, and probably something else from EMC (a NAM-based hourly updating ensemble).

3.  I'm not sure what you mean by "making them too perfect".  Models continue to improve but much work remains.  You make a good point about parameterizations, as developers still have a ton of work to do in this arena.  For example, there is no guarantee that moving to higher resolution will improve the model, unless parallel improvements are made to the sub-grid scale parameterizations.  Even at cloud-resolving scales we have to worry about microphysics, etc.  We still have a long way to go in terms of data assimilation as well.

 

Most of the remainder of this thread just makes me sad.  Perfect forecasts?  Really? 

 

Coach, let me know when you have your supercomputer set up.  I'll make sure to point you where to get all of the observations in real time to set up your system.  Thanks.

Link to comment
Share on other sites

I will leave you with a quote from my Master's thesis advisor: "The meteorologists who work on numerical weather prediction cannot believe how good their models are; the operational meteorologists wonder how they can be this bad."

So, so true. It has been interesting for me to go from growing up hearing forecasters (and weenies) talk about model performance to hearing it from the research side. Two different worlds, for sure.

Great post as always, dtk. The issue of fixing parameterizations for increasingly small grid sizes is interesting and should keep plenty of people employed for the foreseeable future. I think more often than not I left my Parameterizations class distressed at some of the prospects and ensuing difficulties while seriously impressed that our forecasts are as good as they are.

Link to comment
Share on other sites

  • 2 weeks later...

Yes, of course we will. One day we will probably control the weather and use it to maximize the Earth's natural efficiency.

I am talking like 2200 onwards, assuming we continue on a somewhat exponential path of ascension.

This is false.

It's physically impossible to know both the exact position and momentum of a particle simultaneously, and we'd need perfect information on every photon, phonon, atom, and molecule in the system. Basic quantum mechanics here: the uncertainty principle simply doesn't allow it. Therefore, we'll never get a perfect initialization, and chaos will take over from there very quickly.
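The "chaos takes over" part is easy to see even without quantum mechanics. A minimal sketch, using the logistic map (a standard toy chaotic system, standing in here for the atmosphere) with an arbitrary 1e-10 initialization error:

```python
# Sensitivity to initial conditions: two runs of the chaotic logistic
# map (x -> r*x*(1-x), r = 4) that start 1e-10 apart. The perturbation
# size and starting value are arbitrary choices for illustration.

def logistic_trajectory(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.4, 100)
b = logistic_trajectory(0.4 + 1e-10, 100)  # "imperfect initialization"

# First step where the two runs disagree by more than 0.1 --
# i.e., where the forecast has effectively lost the signal.
diverged = next(i for i, (x, y) in enumerate(zip(a, b)) if abs(x - y) > 0.1)
print(diverged)
```

Because the error roughly doubles each step, even a 1e-10 initialization error swamps the "forecast" within a few dozen iterations.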

Link to comment
Share on other sites

Small improvements in the precision of initial data make huge differences after a few days, as suggested by this cumulative-probability table (initial accuracy, then accuracy after each of three forward steps):

 

0.80 _ 0.64 _ 0.512 _ 0.4096

 

0.90 _ 0.81 _ 0.729 _ 0.6561

 

0.95 _ 0.9025 _ 0.8574 _ 0.8145

 

Notice that the 15-point improvement in initial accuracy (0.80 to 0.95) works out to nearly a doubling of accuracy after three forward steps (0.8145 vs. 0.4096). I'm not saying this is exactly the case with model refinement, but it is suggestive of the kind of improvement that can develop from relatively small initial improvements.
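The table is just repeated multiplication: if each forward step retains accuracy with probability p, then after n steps only p**n of the signal survives. A quick sketch:

```python
# Cumulative accuracy p**n: the whole table reduces to one expression.

def accuracy_after(p, steps):
    """Accuracy remaining if each step retains accuracy p."""
    return p ** steps

for p in (0.80, 0.90, 0.95):
    row = [round(accuracy_after(p, n), 4) for n in range(1, 5)]
    print(p, row)

# The 0.15 improvement in initial accuracy (0.80 -> 0.95) nearly
# doubles the accuracy remaining three steps later.
ratio = accuracy_after(0.95, 4) / accuracy_after(0.80, 4)
```

The ratio comes out around 1.99, which is the "more than small" payoff the table is illustrating.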

 

Having said that, I stick to my initial premise: if there are processes, energy cycles, or other structural considerations that are not being programmed in, then at some point their absence from the computer models will overwhelm the models' ability to keep up with what will otherwise appear to be random chaos.

By analogy, suppose you noticed that traffic was heavier on certain days than others, but had no theory of human culture to point you in the direction of a seven-day week. You might get the general impression from your data of a seven-day cycle, but it would have less than 100% predictive ability, because you would not have known about statutory holidays, annual work cycles, school holidays, etc. If you knew those things, you could make your traffic prediction model a lot more accurate without having much better initial data.
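The traffic analogy can be sketched in a few lines. Everything here (the weekly pattern, the holiday days, the traffic numbers) is made up purely for illustration:

```python
# A model that knows the 7-day cycle but not the "holiday" process.
# All data below is synthetic, invented for this toy example.
import statistics

WEEK = [100, 100, 100, 100, 100, 60, 50]   # weekday vs. weekend traffic
HOLIDAYS = {10, 45, 80}                     # day indices with holiday traffic

def actual_traffic(day):
    return 40 if day in HOLIDAYS else WEEK[day % 7]

def weekly_model(day):
    """Prediction using only the weekly cycle -- no holiday knowledge."""
    return WEEK[day % 7]

errors = [abs(weekly_model(d) - actual_traffic(d)) for d in range(91)]
# The model is exact except on holidays, where the missing structural
# process shows up as unexplained error no amount of traffic data fixes.
print(statistics.mean(errors), max(errors))
```

The point matches the paragraph above: the residual error isn't noise, it's a process the model was never told about, and adding that knowledge beats refining the initial data.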

Link to comment
Share on other sites

  • 2 weeks later...

I have two comments on this:

1) If the 10-16 day forecast is only slightly better than a shot in the dark, then why does the GFS model run that far out?

2) If nanotechnology continues to grow, I can conceive of billions of "nano-sensors" being released into the atmosphere. Conceivably, the entire globe at all levels could be covered. Each nano-bot would need to be programmed for a specific region of the atmosphere, to communicate with the group it belongs to, and then to relay the data it has collected either to Earth stations or to satellites for relay to ground stations.

 

This would give us the data we need - then all we have to do is build the model that can run with all the data.

 

Of course - with all those nano-bots flying around, our view of the sky might change - and they might even interfere with sunlight reaching the ground - thus changing the weather patterns. If the bots can be made small enough, and have enough power to fly a large enough area to collect data, then the number required might be low enough so as not to cause interference.

 

FW

Link to comment
Share on other sites

I think even if we had a sensor every mile we would be fine. If you realistically had something in the air every 5 square miles, you would have a deadly accurate read on storms. Either way, these are such large-scale systems.

Link to comment
Share on other sites

So I guess the most important question - beyond the technology itself - is whether anyone (gov't or private) would be willing to spend the money to build and deploy the nano-bots.

Link to comment
Share on other sites

This still doesn't account for the amount of supercomputing we'd need, or the imperfect models we have. You have to truncate everything eventually. Plus there will always be measurement error, even with "nanobots". The larger point is what we mean by "perfect"/100% accurate forecasts. By definition, we'd have to get everything right out to an infinite number of decimal places, which will of course never happen. So what counts as "acceptably" perfect for this hypothetical? Within 1F? 0.1F? 0.01F? The question itself is flawed regardless of future technology, since it sets an impossible goal.
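The truncation point can be made concrete. In a chaotic system, each extra digit of initial-condition precision buys only a roughly constant amount of extra forecast range, so no finite precision ever gets you to "perfect". A sketch using the logistic map as a toy stand-in for the atmosphere (the starting value and thresholds are arbitrary choices):

```python
# How far a "forecast" stays useful when the initial measurement is
# truncated to a given number of decimal places. Logistic map (r = 4)
# as a toy chaotic system; values chosen only for illustration.

def divergence_step(x_true, decimals, steps=200, r=4.0):
    """Step at which the truncated run drifts > 0.1 from the true run."""
    x = x_true
    y = round(x_true, decimals)   # the "measured" initial condition
    for i in range(steps):
        if abs(x - y) > 0.1:
            return i
        x = r * x * (1.0 - x)
        y = r * y * (1.0 - y)
    return steps

for d in (2, 4, 6, 8):
    print(d, divergence_step(0.123456789, d))
```

Going from 2 to 8 decimal places of measurement precision buys only a few dozen extra steps before chaos wins again, which is why "just measure better" never converges to a perfect forecast.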

Link to comment
Share on other sites

I don't think you're taking the nanobots completely seriously, Mister.

Link to comment
Share on other sites

Is there any need for 100% accuracy in weather forecasting? Who would benefit if we knew exactly what the temperature, wind speed, precip amount, etc. were going to be on a particular day?

I think what we would benefit from most is more accurate forecasting for severe storms, like tornadoes, hurricanes, and blizzards.

Link to comment
Share on other sites

But don't the two go hand in hand?  In order to have the most accurate forecasts for severe weather, we will need to have more accurate forecasts for temps, winds, etc., as they are necessary parameters in determining severe weather threats.

Link to comment
Share on other sites

Yes. 100% accuracy includes everything. Theoretically. If you're high.

Link to comment
Share on other sites
