MegaMike

Meteorologist
  • Content Count

    336
  • Joined

  • Last visited

About MegaMike

  • Birthday 09/09/1993

Profile Information

  • Four Letter Airport Code For Weather Obs (Such as KDCA)
    KHFD
  • Gender
    Male
  • Location:
    Storrs, CT


  1. Yup! That sounds like you already have it installed. You can install/configure certain libraries and modules for Jupyter Notebook using conda or pip from a terminal (or the Anaconda Prompt). I faced similar issues when I first started scripting; Basemap took a lot of time to install/configure... Overall, it took me ~1 year of constant scripting to understand Python pretty well. I started with MATLAB but decided to move to Python since its scripts are compatible w/Linux (plus all of the utilities!). I'm sure you can find a Python class as it relates to meteorology; however, I learned Python best by writing scripts I was interested in. The first script I created 1) obtained GHCN-D snowfall observations, then 2) identified suspicious observations using a spatial outlier algorithm. The intent was to use it on observations provided by members of this forum so I could label weenies w/statistical support.
  2. Anything is possible with Python! That's what my script does (.csv output), but I only consider liquid water equivalent, snow water equivalent, and snowfall. The good thing about Python is that utilities/modules/etc. already exist that you can install and run pretty easily for specific datasets. For example, for GHCN: https://github.com/aaronpenne/get_noaa_ghcn_data... Avoid any utility or archive that requires you to pay; this data is public. I'd suggest installing Jupyter Notebook: https://jupyter.org/ It's basically a (somewhat) interactive interface that runs Python scripts, and it comes with a lot of pre-compiled modules and libraries too. You can message me if you'd like more details or run into any problems.
  3. If you're looking for maxes/mins (temperature, wind speed, wind gust, etc.) and/or precipitation, GHCN-D is great. I'm not sure if there's an interactive website that hosts this data, though... I use a Python script to extract daily observations by station and date. https://www.ncdc.noaa.gov/data-access/land-based-station-data/land-based-datasets/global-historical-climatology-network-ghcn
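A minimal sketch of that kind of extraction, using the GHCN-Daily "by_year" CSV layout (ID, YYYYMMDD, ELEMENT, VALUE, MFLAG, QFLAG, SFLAG, OBS-TIME). The station ID is real (Hartford/Bradley), but the observation values below are made-up illustrative rows, not actual data:

```python
import csv
import io

# GHCN-Daily "by_year" rows: ID, YYYYMMDD, ELEMENT, VALUE, MFLAG, QFLAG, SFLAG, OBS-TIME.
# Units: PRCP is tenths of mm; SNOW/SNWD are mm; TMAX/TMIN are tenths of deg C.

def extract(rows, station, date, elements=("SNOW", "PRCP")):
    """Return {element: value} for one station and one YYYYMMDD date,
    skipping any row that failed NOAA's quality checks (non-empty QFLAG)."""
    out = {}
    for row in csv.reader(rows):
        sid, ymd, elem, val, _mflag, qflag = row[:6]
        if sid == station and ymd == date and elem in elements and qflag.strip() == "":
            out[elem] = int(val)
    return out

# Tiny in-memory sample standing in for a downloaded by_year .csv file:
sample = io.StringIO(
    "USW00014740,20130209,SNOW,560,,,W,\n"
    "USW00014740,20130209,PRCP,243,,,W,\n"
    "USW00014740,20130208,SNOW,150,,,W,\n"
)
obs = extract(sample, "USW00014740", "20130209")
print(obs)  # SNOW in mm, PRCP in tenths of mm
```

The same `extract` call works on a real by_year file opened with `open(...)`; summing the returned SNOW values over a range of dates gives a storm total per station.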
  4. I've been asked to evaluate the National Blend of Models (NBM) for its accuracy in terms of snowfall. As a side task, I wrote a Python script to plot snowfall accumulations for the most recent NBM cycles (00z, 01z, 02z, and 03z). They look pretty bullish compared to other models, but spatially they look pretty similar (higher impacts in western MA and the Worcester hills). I'm looking forward to seeing how well they perform. I hold the NBM to high standards, although its snowfall accumulations (in my opinion) look too high for this event. If anyone's interested, most operational NWP products are available via the NOMADS server <nomads.ncep.noaa.gov>. In the future, I'll make my plots nicer, but for the time being, they're not being used by a third party, so :shrugs:
  5. I stumbled upon this thread by chance while looking for a precipitation analysis method to evaluate multiple (44) winter weather events. The idea was that if I ran MODE analysis comparing Stage IV precipitation against modeled hourly precipitation, I could determine the accuracy of banded precipitation across several modeling systems (ICLAMS/WRF). Unfortunately, the developers strongly recommend avoiding such a methodology due to poorly ingested liquid water equivalent observations under snowy conditions... The rain gauges struggle to observe liquid water equivalent when snow is falling... I'm now considering the RTMA, URMA, or possibly a satellite-derived product instead.
     "Each River Forecast Center (RFC) has the ability to manually quality control the Multisensor Precipitation Estimates (MPE) precipitation data in its region of responsibility before it is sent to be included in the Stage IV mosaic. The precipitation values, however, are not intentionally modified downwards during snow events. Rather, due to inaccurate measuring of liquid equivalents at many gauge locations (e.g., a lack of the equipment to melt and measure the frozen precip), zero or very low values are reported at these locations. These "bad" gauge values then go into the MPE algorithm, resulting in liquid precip estimates that are too low during winter storms. There are also problems with zero or too-low precipitation values at many RFC gauge locations even outside of heavy snowfall events."
     "There are problems with the RFC precip data in the eastern U.S. during heavy snow events. While ASOS stations have the equipment to melt the snow and derive the liquid equivalent precip, the RFC stations in the East do not. So when there are big snowfall events such as the January 2016 blizzard, the snow accumulations get recorded, but the corresponding liquid equivalents often come in as zero or near-zero amounts, which are incorrect."
If you're curious (Model Evaluation Tool for MODE analysis): https://dtcenter.org/sites/default/files/community-code/met/docs/user-guide/MET_Users_Guide_v8.1.2.pdf
  6. The difference between the NAM3km and the NAM12km/32km at 500hPa is pretty extreme. By now, you'd expect some consistency or at least a realistic simulation (from the NAM3km). I definitely agree with you... Toss the NAM3km.
  7. I really like your graphics, The 4 Seasons! Are your interpolations done manually? Also, what program/software do you use to create your graphics? Some recommendations/suggestions (in case you erased your inbox): If you're looking for pre-1993 observations, try getting your data via the Global Historical Climatology Network (GHCN). You can get snowfall totals (and whatnot) from 1973 to today (for in-situ/surface stations). If you want snowfall totals for specific events, you'll have to sum daily snowfall accumulations over 't' days and 'n' stations. Conveniently, the data is quality checked! Here's the URL: ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/daily/ Personally, I like working with the yearly ('by_year') .csv files. Last thing: if you're using crowd-sourced data, I'd recommend processing it through a spatial outlier test. The Anselin Local Moran's I algorithm can be used to scientifically eliminate suspect/faulty observations. Michael Squires (who created the Regional Snowfall Index scale, =~ NESIS) used this method for his research. Algorithm details: https://pro.arcgis.com/en/pro-app/tool-reference/spatial-statistics/h-how-cluster-and-outlier-analysis-anselin-local-m.htm If you're interested, I wrote a Python script for Moran's algorithm.
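Not the author's script, but a self-contained sketch of the idea behind Local Moran's I for flagging spatial outliers: standardize the values, compute each site's spatially lagged neighbor average with row-standardized k-nearest-neighbor weights, and multiply. A strongly negative I_i means a value unlike its neighbors. The coordinates and snowfall reports below are made up for illustration:

```python
import math

def local_morans_i(points, values, k=3):
    """Local Moran's I for each site: I_i = z_i * sum_j(w_ij * z_j),
    with row-standardized k-nearest-neighbor weights. A strongly
    negative I_i flags a value unlike its neighbors (spatial outlier)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    z = [(v - mean) / math.sqrt(var) for v in values]
    scores = []
    for i in range(n):
        # k nearest neighbors of site i by Euclidean distance
        dists = sorted((math.dist(points[i], points[j]), j) for j in range(n) if j != i)
        neighbors = [j for _, j in dists[:k]]
        lag = sum(z[j] for j in neighbors) / k  # row-standardized spatial lag
        scores.append(z[i] * lag)
    return scores

# Illustrative (made-up) snowfall reports: four ~10" totals at the corners,
# with a 2" report sitting right in the middle of them -- a likely bad ob.
coords = [(0, 0), (1, 0), (0, 1), (1, 1), (0.5, 0.5)]
snow = [10.0, 11.0, 9.5, 10.5, 2.0]
scores = local_morans_i(coords, snow, k=3)
print(scores)  # the center site gets the most negative I_i
```

A production version would also need significance testing (e.g., conditional permutations) before discarding an observation, which is what the ArcGIS tool linked above does.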
  8. Mainly FORTRAN (a great language for numerical computation) on a Linux OS. When it comes to post-processing data: NCL, REVU, MATLAB, and/or Python.
  9. Pinging is getting louder (Pawtucket/Providence). I'll call it a night here in the "Bucket." I don't expect much additional snowfall accumulation for my area. Enjoy the storm, everyone!
  10. Yea... Starting to mix now in Pawtucket/Providence... Visibility at KPVD increased to 0.75 miles. I went outside and there's a mix of snow, sleet, and mist; I'd put the ratio at 80/10/10. I roughly have 8 inches of snow otg.
  11. Unfortunately, snow ratios haven't been impressive over the past hour in Pawtucket (near Providence). Visibility increased since earlier this morning as well. I'll go outside to make an observation soon.
  12. Mixing hasn't reached the Pawtucket/Providence area so far. KPVD actually observed 0.05 inches of LWE (assuming it's SWE) in 5 minutes from 3:35 to 3:40am, and 0.17 inches of LWE since 3:00am, with no observations of mixed precipitation.
  13. Roughly about 3 inches in Pawtucket. Sticky snow with good ratios. Not much accumulation on the roadways though... Maybe about an inch.
  14. I thought she was looking for how the units cancel out and a simple function for the weight of snow/water on a flat plane. This is what I did. But, one last point and I'll stop discussing this topic... If you're looking for the weight of snow/water on an incline/roof, you can take the weight of the snow *times* the cosine of theta (the angle of the roof). So, weight_perp_roof(SWE[inches], theta[degrees]) = (997 kg/m^3) * (SWE)^3 * (2.54 cm/inch)^3 * (1 m/100 cm)^3 * (2.20462 lb/kg) * cos(theta). I'm sure the younger audience on this forum would find it helpful to see how the units cancel out.
  15. Adding a SWE term: if snow water equivalent (SWE) is known (in inches)... weight(SWE) = (997 kg/m^3) * (SWE)^3 * (2.54 cm/inch)^3 * (1 m/100 cm)^3 * (2.20462 lb/kg)
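A short sketch that mirrors the formula from the two posts above, with each conversion factor as a named constant so the unit cancellation (inches^3 -> m^3 -> kg -> lb) is explicit; the function names are mine, not from the posts:

```python
import math

WATER_DENSITY = 997.0      # kg / m^3
M_PER_INCH = 2.54 / 100.0  # (2.54 cm/inch) * (1 m / 100 cm) = 0.0254 m/inch
LB_PER_KG = 2.20462        # lb / kg

def weight_lb(swe_inches):
    """Weight in pounds per the posted formula:
    (inches * m/inch)^3 -> m^3, * kg/m^3 -> kg, * lb/kg -> lb."""
    volume_m3 = (swe_inches * M_PER_INCH) ** 3
    return WATER_DENSITY * volume_m3 * LB_PER_KG

def weight_perp_roof_lb(swe_inches, theta_degrees):
    """Component of that weight perpendicular to a roof pitched at theta."""
    return weight_lb(swe_inches) * math.cos(math.radians(theta_degrees))

print(weight_lb(10))               # flat plane, 10" of SWE
print(weight_perp_roof_lb(10, 30)) # same SWE on a 30-degree roof
```

Note that at theta = 0 the roof formula reduces to the flat-plane one, since cos(0) = 1.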