
Introducing the AmericanWx.com WRF



We're proud to announce that we are now producing initial public runs of our own AmericanWx.com WRF model. Cory has been working painstakingly hard on getting some runs available in preparation for this storm. For now we are running on an eastern 2/3 CONUS domain until Cory gets the machines clustered. This is only running off of "The Beast" (as it has come to be known) donated by member SP. Once clustered it will be full CONUS.

We have hourly forecast progs available out to 30hr. With time there will be more variables and maps added and some of the existing maps will be tweaked (scales, variables, etc).

The current maps available...

2m Temp/10m Wind/SLP (link is clickable and images can be "looped" with mouseovers like e-wall)

Simulated Reflectivity

Hourly QPF

Hourly Snowfall

Total QPF

Total Snowfall

Simulated IR Satellite

850mb Temp/Height/Wind

700mb RH/Height/Wind

500mb Temp/Height/Wind

300mb Wind

And forecast soundings for various US cities

Here is the link to our WRF page: http://models.americanwx.com/

For now, the current run is written over the previous run so be sure to check your times and make sure you're not viewing old data.

Thanks again to Cory, SP, and all of the people who donated to make this happen.

As always, questions, comments, and suggestions are encouraged.


After a bit of browsing, the very first suggestion I have for you: a loop feature for each variable you currently have would be nice. A simple loop through the available frames. (JavaScript... :X)

But it looks good right now!

I did have one file that could not be found, though: the total 1-hour precip at hour 30.

Edit: Also the hour-30 1-hour snowfall.

The page you tried to access does not exist on this server. This page may not exist due to the following reasons:

1. You are the owner of this web site and you have not uploaded (or incorrectly uploaded) your web site. For information on uploading your web site using FTP client software or web design software, see http://ftphelp.secureserver.net/index.html for FTP Upload Information.
2. The URL that you have entered in your browser is incorrect. Please re-enter the URL and try again.
3. The Link that you clicked on incorrectly points to this page. Please contact the owner of this web site to inform them of this situation.


Some of the bugs we've identified are:

1) Missing the last frame on hourly precipitation and hourly snowfall. All of the graphics are there, it's just misaligned with the links (hour 30 is at hour 29).

2) The QPF scale on the hourly precip/snowfall isn't static. It jumps around based on the maximum amount in each frame, so you have to be careful.

3) The temperature scale at 850 has the same issue as above...you see some jumping around as you scroll through it.

I'll be working on fixes for them tomorrow.



:thumbsup: Nice quick response awesome!


For those interested, here are the nuts and bolts of the model:

Primary Time Step : Adaptive

Step to Output Time : Yes

Grid dimensions (NX x NY) : 301 x 255

Vertical Layers (NZ) : 40

Grid Spacing : 12.00km

Top of Model Atmosphere : 50mb

Parent Domain : NA

Timing Information

Simulation Length : 30 Hours

Boundary Update Freq : 03 Hours

Restart File Interval : 06 Hours

Initialization : GFS

Boundary conditions: GFS

Model Physics

Dynamics : Non-Hydrostatic

Cumulus Scheme : Kain-Fritsch (I prefer to use Grell-3D at higher resolutions but it adds a lot of run-time, so for now we're using this for the current 12 km)

Microphysics Scheme : WRF Single-Moment 6-Class (WSM6) Graupel

PBL Scheme : Yonsei University

Land Surface Scheme : Noah 4-Layer LSM

Number Soil Layers : 4

Surface Layer Physics : Monin-Obukhov

M-O Heat and Moisture : Surface Fluxes On

M-O Snow-cover Effects : Included

Long Wave Radiation : RRTM

Short Wave Radiation : Dudhia Scheme

Cloud Effects : Cloud Effects On

Topographic Shading : Shading Effects Off

ARW Core Model Dynamics

Dynamics : Non-Hydrostatic

Gravity Wave Drag : Off

Time-Integration Scheme : Runge-Kutta 3rd Order

Diffusion Scheme : Simple Diffusion

6th-order Diffusion : 6th-Order W/O Up-Gradient

6th-order Diffusion Rate : 0.12

Eddy Coefficient Scheme : 2D 1st Order Closure

Damping Option : W-Rayleigh

Damping Depth from Top : 5 Km

Damping Coefficient : 0.2

W Damping : W Damping On

Horiz Momentum Advection : 5th Order

Horiz Scalar Advection : 5th Order

Vert Momentum Advection : 3rd Order

Vert Scalar Advection : 3rd Order

Sound Time Step Ratio : Automatic

Moisture Advection Option : Positive-Definite

Scalar Advection Option : Positive-Definite

TKE Advection Option : Positive-Definite
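To put the grid configuration above in concrete terms, here is a quick back-of-the-envelope sketch. The 1/dx³ cost scaling (two horizontal dimensions plus the shorter stable timestep) is a standard rough rule of thumb, not a benchmark of this particular setup:

```python
# Domain size implied by the configuration listed above.
nx, ny, dx_km = 301, 255, 12.0

extent_x = (nx - 1) * dx_km    # east-west extent in km
extent_y = (ny - 1) * dx_km    # north-south extent in km
points = nx * ny               # horizontal grid points per level

# Rough relative cost of running the same domain at 5 km instead:
# work scales approximately as (dx_old / dx_new)**3.
cost_ratio = (dx_km / 5.0) ** 3
```

For this domain that works out to roughly 3600 km x 3048 km and about 77,000 horizontal points per level, and a 5 km rerun of the same area would cost on the order of 14x the compute.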

I also have it using Digital Filter Initialization (DFI) using the Twice DFI setting and the Dolph filter. DFI helps reduce model spin-up time during the early stages of integration due to mass/momentum imbalance in the initial conditions. It'll use this until we can get LAPS up and running and get the model initializing off our own real-time "hot start" analysis system.
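For reference, the DFI options described above map onto WRF's namelist.input roughly as follows. This is a sketch based on the standard WRF-ARW namelist options, not the actual AmWx configuration file:

```
&dfi_control
 dfi_opt     = 3,   ! 3 = Twice DFI (TDFI)
 dfi_nfilter = 7,   ! 7 = Dolph filter
/
```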

It's also using adaptive time-stepping. Basically, the model auto-throttles its time-step to generate the fastest run possible while maintaining numerical stability. If it determines that something is getting out of hand--like what appears to be a CFL violation off the Southeast US coast in the latest run--it shrinks its time-step to prevent a crash. The usual method is a fixed time-step of 6 * grid spacing (in seconds per km), but the adaptive technique cut our run time in half.
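The throttling logic can be sketched roughly like this. The function name, tuning constants, and clamping values are hypothetical illustrations, not WRF internals:

```python
def adaptive_dt(dt, max_courant, target_courant=0.8,
                grow=1.05, shrink=0.75, dt_min=6.0, dt_max=120.0):
    """Illustrative sketch: shrink the timestep hard when the domain-max
    Courant number nears the stability limit (avoiding a CFL violation),
    and grow it slowly when there is headroom."""
    if max_courant > target_courant:
        dt *= shrink   # back off to keep the model from crashing
    else:
        dt *= grow     # creep back toward the fastest stable run
    return min(max(dt, dt_min), dt_max)

# Fixed-step rule of thumb from the post, for comparison:
# 6 s per km of grid spacing on a 12 km grid.
fixed_dt = 6 * 12   # 72 s
```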

The model also ingests high-res, near-real-time sea surface and Great Lakes temperatures from NASA SPORT.


What are the times it runs/finishes?

We're doing 12Z and 00Z runs. The model run time varies a bit, but most of the images should be uploaded by the next synoptic time (12Z is available by around 18Z, 00Z is available around 06Z).

Can you explain more about how you initialize the model? What are you using for boundary conditions? NAM? GFS? Open?

Right now it's initialized from GFS and uses the same for boundary conditions. Later plans are to initialize from LAPS and possibly use our own coarse-resolution cycling WRF for boundary conditions.


Several questions:

1. Why are you only using 12km resolution? Combined with the fact you're only using 2/3 of the US, the full 5km version would take over 8x the power to use in the same amount of time (not to mention the different convective model's additional time)... is the server it's being upgraded to really that powerful?

2. What exactly does adaptive time-stepping do? From what I gather it changes the timing to not use unneeded processing power on areas where nothing's going on, and zooms in on the right locations in the temporal dimension... How do you generate maps for a static time-step (such as 1-hr precip) if the time-step changes?

3. What would be the stats on the coarse-res boundary model (just the basics... how often it's run, what resolution, what areas it covers, how it's initialized, and what it uses for boundary conditions)?

4. How is the snowfall computed? Does it use the model internals or is an algorithm like Kuchera or Cobb used on the output?

5. Any chance of getting GRIB output (or NetCDF, but GRIB is always preferred for model output) with server-side regional/per-variable subsetting on this anytime soon (or even GRIB for a fixed 22N-53N 129W-64W or slightly larger region with per-variable subsetting, even if it has to be the Canadian style with each variable being a separate file)?

6. Are you going to nest inside the model a super-high-res version (1-2 km or so) for the Northeast (I'm not from there, but many here are)?

7. Where and when will the verification statistics be made available (might as well provide the degradation curves or whatever they're called for the AmWx WRF compared to the GFS and NAM)?

8. What mapping program do you use?

9. What exactly are the 40 levels of the atmosphere (as in, how many mb for each... is it simply 50mb-1000mb at 25mb intervals plus surface, or is it more complicated)?

10. Where is 500mb vorticity?

Question 5 is by far the most important of the bunch.


Ugh...the 12Z run will be late. I just got home from working all night and found that my internet connection had flaked out (that doesn't happen often, though). The model wasn't able to fetch its input files and run itself, so I've started it manually. It will probably be ready around 21Z. Sorry for the delay.


This is frakin awesome. Thanks for all the hard work in getting this together. It'll be interesting to see how it performs and how trustworthy it is.

One thing that'd be cool is an archive of model output, and then a page where you can compare forecasts from different runs for the same verification time. That's definitely a down-the-road kind of thing, but it'd be really neat, I think.


Great job on the WRF being one of the only models to capture the near-bomb surface pressure falls early on. The HPC surface analysis had the coastal low down to 998.

[attached image: post-999-0-09296200-1294820980.png]

One of the more amped models, the SUNY SB MM5 only had 1003:

[attached image: post-999-0-08911000-1294820948.gif]

The 0Z NAM was around 1001-1002, and the GFS around 1003. The only other model that was close was the NCEP WRF-ARW at around 998/999, though I couldn't tell exactly based on its use of contours.



1) The current form is really only a beta release, I guess you could say. It's only 12 km to save crunch time. I don't know if full 5 km will be possible, because when I quoted that figure I wasn't taking into account the time the machine needs for post-processing. With the current set of graphics and soundings, it takes roughly 1 hour 15 minutes to generate and upload them all (and we have plans to add more), which is a few minutes longer than the model run itself. If we get too crazy with high-res, even with both machines running at full overclocked throttle we'll be doing a lot of waiting. The final resolution will depend on test results once I get them clustered.

2) With adaptive time-stepping, the model essentially tries to give you the fastest run possible while maintaining stability. It constantly recalculates the timestep by comparing the maximum Courant number in the domain to the maximum stable Courant number, and adjusts accordingly to keep things from going unstable and crashing the model (i.e., it prevents a Courant–Friedrichs–Lewy (CFL) violation: http://en.wikipedia....3Lewy_condition ). The precipitation maps are generated from the output files the WRF dumps at the end of each model hour, so they aren't dependent on the timestep.

3) I'm assuming you're referring to a possible future cycling WRF model? I don't have any info on that yet. The current model is bounded by the ~27km GFS.

4) Snowfall comes directly from what's produced inside the model. It doesn't use any ratios or other schemes outside of that.

5) There has been some talk of making GRIB available but that would be up to the administrators. There are plans to release BUFKIT sounding files later as well, as soon as I clear some bugs.

6) There probably will be no nesting, especially at that type of resolution because of the crunch time.

7) I can't speak to verification since I don't know when or if it will be done.

8) Graphics are generated with the NCAR Command Language (NCL).

9) Vertical levels are more complicated than that...they're not evenly distributed. There are more levels in the lowest layers of the model atmosphere. Usually any plots that are done on isobaric surfaces are interpolated from the nearest model levels.
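A toy illustration of what unevenly distributed vertical levels look like. The quadratic stretching used here is purely an assumption for illustration; WRF's actual level placement is more sophisticated:

```python
import numpy as np

# Eta runs from 1.0 (surface) down to 0.0 (model top, 50 mb here),
# with levels packed toward the ground.
nz = 40
k = np.arange(nz)
eta = 1.0 - (k / (nz - 1)) ** 2    # quadratic: dense near the surface

# Approximate pressure of each level, interpolating between the
# surface and the model top.
p_top, p_surf = 50.0, 1000.0       # mb
p = p_top + eta * (p_surf - p_top)
```

With this stretching, the spacing between the two lowest levels is far smaller than between the two highest, matching the "more levels in the lowest layers" behavior described above.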

10) I haven't gotten the vorticity scripts working quite right yet. The current NCL plots I have make that field look like a chaotic mess. Once we drop the horizontal resolution down further, it might not be possible to use vorticity at all because it's too noisy, unless I find a way to drastically smooth it.


10) I haven't gotten the vorticity scripts working quite right yet. The current NCL plots I have make that field look like a chaotic mess. Once we drop the horizontal resolution down further, it might not be possible to use vorticity at all because it's too noisy, unless I find a way to drastically smooth it.

I'm pretty sure that NCL has a standard spatial 9-point smoother, but you're probably going to have to call several passes of it to get anything reasonably smooth-looking. If that's not enough, there must be a way to utilize spline interpolation (which has smoothing properties) in combination with a grid-to-grid utility.
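A minimal version of such a multi-pass 9-point smoother might look like the sketch below. This is a generic weighted 3x3 neighborhood average, not NCL's own smth9 implementation:

```python
import numpy as np

def smooth9(field, passes=1, center_weight=1.0):
    """Each interior point becomes a weighted average of itself and its
    8 neighbors; several passes can be stacked for heavier smoothing.
    Boundary points are left unchanged."""
    f = field.astype(float)
    for _ in range(passes):
        out = f.copy()
        out[1:-1, 1:-1] = (
            center_weight * f[1:-1, 1:-1]
            + f[:-2, 1:-1] + f[2:, 1:-1]    # N and S neighbors
            + f[1:-1, :-2] + f[1:-1, 2:]    # W and E neighbors
            + f[:-2, :-2] + f[:-2, 2:]      # diagonal neighbors
            + f[2:, :-2] + f[2:, 2:]
        ) / (center_weight + 8.0)
        f = out
    return f
```

Applied to a noisy vorticity field, each pass spreads isolated spikes across their neighborhoods, so a few passes should tame most of the 12 km noise.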


With the current set of graphics and soundings, it takes roughly 1 hour 15 minutes to generate and upload them all (and we have plans to add more), which is a few minutes longer than the model run itself.... Graphics are generated with NCAR Command Language (NCL).

Use GrADS instead. It's much faster. You shouldn't be taking any more than 1/2 second per map for generation (assuming you already have made GRIB files for GrADS to read), and that's even pretty slow (I can generate 5 or so 800x600 maps per second from 0.5 deg GFS on a VirtualBox Linux server running on a single core of an ordinary laptop... decrease image size to 400x300 and it's around 20 per second).

