MegaMike

Meteorologist
  • Content Count
    329
  • Joined
  • Last visited

About MegaMike

  • Birthday 09/09/1993

Profile Information

  • Four Letter Airport Code For Weather Obs (Such as KDCA)
    KHFD
  • Gender
    Male
  • Location
    Storrs, CT


  1. I stumbled upon this thread by chance while looking for a precipitation analysis method to evaluate multiple (44) winter weather events. The idea was that by running MODE analysis of Stage IV precipitation against modeled hourly precipitation, I could assess how accurately several modeling systems (ICLAMS/WRF) simulate banded precipitation. Unfortunately, the developers strongly recommend avoiding this methodology due to poorly ingested liquid water equivalent observations under snowy conditions: rain gauges struggle to measure liquid water equivalent while snow is falling. I'm now considering the RTMA, URMA, or possibly a satellite-derived product instead. From the developers:

     "Each River Forecast Center (RFC) has the ability to manually quality control the Multisensor Precipitation Estimates (MPE) precipitation data in its region of responsibility before it is sent to be included in the Stage IV mosaic. The precipitation values, however, are not intentionally modified downwards during snow events. Rather, due to inaccurate measuring of liquid equivalents at many gauge locations (e.g., a lack of the equipment to melt and measure the frozen precip), zero or very low values are reported at these locations. These "bad" gauge values then go into the MPE algorithm, resulting in liquid precip estimates that are too low during winter storms. There are also problems with zero or too low precipitation values at many RFC gauge locations even outside of heavy snowfall events."

     "There are problems with the RFC precip data in the eastern U.S. during heavy snow events. While ASOS stations have the equipment to melt the snow and derive the liquid equivalent precip, the RFC stations in the East do not. So when there are big snowfall events such as the January 2016 blizzard, the snow accumulations get recorded, but the corresponding liquid equivalents often come in as zero or near zero amounts, which are incorrect."

     If you're curious, the Model Evaluation Tools (MET) User's Guide covers MODE analysis: https://dtcenter.org/sites/default/files/community-code/met/docs/user-guide/MET_Users_Guide_v8.1.2.pdf A sketch of how I'd drive MODE over many events is below.
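     For reference, each MODE run pairs one forecast field with one observation field plus a configuration file. A minimal sketch of looping it over events from Python; the directory layout, file names, and config name here are placeholders I made up, not anything shipped with MET:

         import subprocess
         from pathlib import Path

         FCST_DIR = Path("fcst")       # model QPF GRIB files, one per event (hypothetical layout)
         OBS_DIR = Path("obs")         # matching Stage IV (or RTMA/URMA) files
         CONFIG = "MODEConfig_precip"  # a MODE configuration file you've prepared

         for fcst in sorted(FCST_DIR.glob("*.grb2")):
             obs = OBS_DIR / fcst.name  # assumes matching file names per event
             # documented usage: mode fcst_file obs_file config_file [-outdir path]
             subprocess.run(["mode", str(fcst), str(obs), CONFIG, "-outdir", "mode_out"],
                            check=True)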
  2. The difference between the NAM3km and the NAM12km/32km at 500hPa is pretty extreme. By now, you'd expect some consistency or at least a realistic simulation (from the NAM3km). I definitely agree with you... Toss the NAM3km.
  3. I really like your graphics, The 4 Seasons! Are your interpolations done manually? Also, what program/software do you use to create your graphics?

     Some recommendations/suggestions (in case you erased your inbox): If you're looking for pre-1993 observations, try getting your data via the Global Historical Climatology Network (GHCN). You can get snowfall totals (and more) from 1973 to today for in-situ/surface stations. If you want snowfall totals for specific events, you'll have to sum daily snowfall accumulations over 't' days for each of 'n' stations; a sketch of that step is below. Conveniently, the data is quality checked! Here's the URL: ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/daily/ Personally, I like working with the yearly ('by_year') .csv files.

     Last thing: if you're using crowd-sourced data, I'd recommend running it through a spatial outlier test. The Anselin Local Moran's I algorithm can be used to statistically eliminate suspect/faulty observations. Michael Squires (who created the Regional Snowfall Index, the regional analog of NESIS) used this method in his research. Algorithm details: https://pro.arcgis.com/en/pro-app/tool-reference/spatial-statistics/h-how-cluster-and-outlier-analysis-anselin-local-m.htm If you're interested, I wrote a script for Moran's algorithm in Python.
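     The event-total step with pandas; this sketch assumes the documented by_year column order (station ID, YYYYMMDD date, element code, value, then M/Q/S flags and observation time) and that SNOW values are in millimeters. The file name and dates are only examples:

         import pandas as pd

         # by_year files ship without a header row
         cols = ["id", "date", "element", "value", "mflag", "qflag", "sflag", "obs_time"]
         df = pd.read_csv("2016.csv", names=cols, dtype={"date": str})

         # keep daily snowfall (SNOW) that passed quality checks (blank QFLAG)
         snow = df[(df["element"] == "SNOW") & (df["qflag"].isna())]

         # sum over a 't'-day event window for every station, then mm -> inches
         event = snow[snow["date"].between("20160122", "20160124")]
         totals_in = event.groupby("id")["value"].sum() / 25.4
         print(totals_in.sort_values(ascending=False).head())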
  4. Mainly Fortran (a great language for numerical computation) on a Linux OS. For post-processing data: NCL, REVU, MATLAB, and/or Python.
  5. Pinging is getting louder (Pawtucket/Providence). I'll call it a night here in the "Bucket." I don't expect much additional snowfall accumulation for my area. Enjoy the storm, everyone!
  6. Yea... Starting to mix now in Pawtucket/Providence... Visibility at KPVD increased to 0.75 miles. I went outside and there's a mix of snow, sleet, and mist; I'd put the ratio at 80/10/10. I have roughly 8 inches of snow otg.
  7. Unfortunately, snow ratios haven't been impressive over the past hour in Pawtucket (near Providence). Visibility increased since earlier this morning as well. I'll go outside to make an observation soon.
  8. Mixing hasn't reached the Pawtucket/Providence area so far. KPVD actually observed 0.05 inches of LWE (assuming it's all SWE) in 5 minutes from 3:35 to 3:40 am, and 0.17 inches of LWE since 3:00 am, with no observations of mixed precipitation.
  9. Roughly about 3 inches in Pawtucket. Sticky snow with good ratios. Not much accumulation on the roadways though... Maybe about an inch.
  10. I thought she was looking for how the units cancel out and a simple function for the weight of snow/water on a flat plane, which is what I did. One last point and I'll stop discussing this topic: if you're looking for the weight of snow/water on an incline/roof, take the weight of the snow times the cosine of theta (the angle of the roof). So:

      weight_perp_roof(SWE[inches], theta[degrees]) = (997 kg/m^3) * (SWE)^3 * (2.54 cm/inch)^3 * (1 m/100 cm)^3 * (2.20462 lb/kg) * cos(theta)

      I'm sure the younger audience on this forum would find it helpful to see how the units cancel out, so there's a sketch below.
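      The cancellation chain in Python. Note this follows the formula as written, which treats SWE as the edge of a cube of water SWE inches on a side; setting theta = 0 recovers the flat-plane version in the next post:

          import math

          WATER_DENSITY = 997.0   # kg/m^3
          KG_TO_LB = 2.20462      # lb/kg
          IN_TO_M = 2.54 / 100.0  # inch -> cm -> m

          def weight_perp_roof(swe_in, theta_deg):
              """Weight (lb) of a cube of water swe_in inches on a side,
              resolved perpendicular to a roof pitched at theta_deg degrees."""
              volume_m3 = (swe_in * IN_TO_M) ** 3               # in^3 -> m^3
              weight_lb = WATER_DENSITY * volume_m3 * KG_TO_LB  # m^3 -> kg -> lb
              return weight_lb * math.cos(math.radians(theta_deg))

          print(round(weight_perp_roof(2.0, 30.0), 2))  # illustrative numbers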
  11. Adding a SWE term: if the snow water equivalent (SWE) is known (in inches)...

      weight(SWE) = (997 kg/m^3) * (SWE)^3 * (2.54 cm/inch)^3 * (1 m/100 cm)^3 * (2.20462 lb/kg)
  12. I think I see what she's asking for... the weight of water in lbs as a function of snow depth and/or volume (length, width, height). Density of water = 997 kg/m^3. Assuming a volume of 1 cubic yard:

      weight = (997 kg/m^3) * (1 yard)^3 * (3 ft/yard)^3 * (12 inch/ft)^3 * (2.54 cm/inch)^3 * (1 m/100 cm)^3 * (2.20462 lb/kg) ≈ 1680 lb

      You can also generalize it (assuming length/width/height are measured in yards):

      weight(length, width, height) = (997 kg/m^3) * (length*width*height) * (3 ft/yard)^3 * (12 inch/ft)^3 * (2.54 cm/inch)^3 * (1 m/100 cm)^3 * (2.20462 lb/kg)

      Just be careful with the units: both conversions assume depth/length/width/height measured in yards, so you'll have to drop some terms if you use a different unit of measurement. A quick check of the arithmetic is sketched below.
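      The same chain in Python, with comments tracking the unit cancellation; the constants are exactly the ones in the formula above:

          WATER_DENSITY = 997.0  # kg/m^3
          KG_TO_LB = 2.20462     # lb/kg
          YD_TO_M = 3 * 12 * 2.54 / 100.0  # yard -> ft -> inch -> cm -> m

          def water_weight_lb(length_yd, width_yd, height_yd):
              """Weight in pounds of a length x width x height block of water (yards)."""
              volume_m3 = (length_yd * width_yd * height_yd) * YD_TO_M ** 3
              return WATER_DENSITY * volume_m3 * KG_TO_LB  # m^3 * kg/m^3 * lb/kg = lb

          print(round(water_weight_lb(1, 1, 1)))  # ~1680 lb for one cubic yard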
  13. No problem, Wiz! I'm glad to help. I may keep my process in this thread for the sake of reference.
  14. I've been working on extracting operational variables myself for modeling purposes. I believe the first URL ('https://www.ncdc.noaa.gov/data-access/model-data/model-datasets/global-forcast-system-gfs') archives only analysis data (is this what you want?). Regardless, the second URL ('http://www.nco.ncep.noaa.gov/pmb/products/gfs/') stores/archives both analysis and operational (forecast/predicted) data.

     When you write "choose which parameters and data," I'm assuming you mean NWP variables such as temperature, wind speed, etc. That being stated: NWP data is typically stored in GRIB files, and very rarely are those files stored separately for a specific variable such as temperature or wind speed. If they are, all you'd need to do is connect to a particular server, download a variable-specific GRIB file for each of your 'x' files, and convert each file to .txt/.csv/etc. afterwards in order to manipulate the data as you wish. What I think you'll actually need to do, however, is download a GRIB file (which contains all variables) for each timestamp from a server and extract the specific variable from the GRIB file itself. Therefore you'll need to a) connect to a server to obtain GRIB files, likely by FTP, b) extract a variable from each GRIB file, and c) store the extracted variable as a .txt/.csv/etc. file to manipulate its data.

     This is how I'm planning to accomplish the task for my project:

     a) I'd recommend this server ('ftp://tgftp.nws.noaa.gov/SL.us008001/ST.opnl/'). It hosts multiple NWP models (analysis/operational) in a user-friendly, organized manner, so it's easy to cd and loop through directories to obtain multiple models and variables. You can use 'https://www.weather.gov/mdl/ndfd_data_grid' to decode the abbreviations and the formatting of the directories.

     b) For extracting certain variables, I'd recommend NCAR Command Language (NCL). It runs on Linux (recommended) or a Linux Bash shell via Windows 10, and it takes only a couple of lines of code to extract data. Alternatively, you can use MATLAB with nctoolbox or MeteoLab (likely more functions than this); both accomplish the same procedure in a slightly more involved way. If you want a more interactive program and have the patience of a Jets or Browns fan (~half a century worth), you can use the NOAA Weather and Climate Toolkit. It is far more tedious and slower than simply using NCL or MATLAB: you'd have to download the GRIB files manually (or automatically, with a loop in a program of your choice) before being able to process the data through the Toolkit. I first used it as an undergraduate, and I wouldn't recommend it now except for case studies of one or two events.

     You can also use Python (and probably R) to obtain, extract, and manipulate GRIB files, though my knowledge of those is more limited; a rough Python sketch is below. I hope this helps, and I welcome anyone else who has other methods, ideas, and/or corrections!
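     A rough end-to-end sketch in Python covering a), b), and c). The FTP subdirectory, file name, and variable name are placeholders to adapt to the model you want, and pygrib is just one of several Python GRIB readers (cfgrib/xarray is another):

         import csv
         import ftplib

         import pygrib  # pip install pygrib

         # a) fetch one GRIB file by anonymous FTP; cd deeper to the model/cycle you want
         ftp = ftplib.FTP("tgftp.nws.noaa.gov")
         ftp.login()
         ftp.cwd("SL.us008001/ST.opnl")
         with open("model.grb", "wb") as f:
             ftp.retrbinary("RETR some_grib_file", f.write)  # placeholder file name
         ftp.quit()

         # b) extract one variable from the GRIB file
         grbs = pygrib.open("model.grb")
         msg = grbs.select(name="2 metre temperature")[0]  # field name varies by model
         lats, lons = msg.latlons()
         data = msg.values

         # c) store the extracted field as a CSV for later manipulation
         with open("t2m.csv", "w", newline="") as f:
             writer = csv.writer(f)
             writer.writerow(["lat", "lon", "value"])
             for la, lo, v in zip(lats.ravel(), lons.ravel(), data.ravel()):
                 writer.writerow([la, lo, v])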
  15. Hello,

    Please read the comment I posted in your 'stock market' thread! :thumbsup: