Welcome to American Weather

JonathanW

Members
  • Content count

    119

About JonathanW

  • Rank
    Engineer
  • Birthday 08/02/1968

Profile Information

  • Four Letter Airport Code For Weather Obs (Such as KDCA)
    kgai
  • Gender
    Male
  • Location
    Montgomery Village, MD
  • Interests
    Electronics, telecommunications, weather (metrology and prediction), science and technology in general
  1. I will watch this with interest.
  2. One problem I ran into -- on high-DPI screens (e.g. the Surface Pros, Yogas, and other recent PCs), the location map can be hard to access, as it's simply too small. As with other programs that don't yet take such screens into account, it can be fixed by including a "manifest" file in the same directory as the executable. You'll also need to modify the registry as described here: http://www.danantonielli.com/adobe-app-scaling-on-high-dpi-displays-fix/

     The file should be named "sharpy.exe.manifest":

     <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
     <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0" xmlns:asmv3="urn:schemas-microsoft-com:asm.v3">
       <dependency>
         <dependentAssembly>
           <assemblyIdentity type="win32" name="Microsoft.Windows.Common-Controls" version="6.0.0.0" processorArchitecture="*" publicKeyToken="6595b64144ccf1df" language="*">
           </assemblyIdentity>
         </dependentAssembly>
       </dependency>
       <dependency>
         <dependentAssembly>
           <assemblyIdentity type="win32" name="Microsoft.VC90.CRT" version="9.0.21022.8" processorArchitecture="amd64" publicKeyToken="1fc8b3b9a1e18e3b">
           </assemblyIdentity>
         </dependentAssembly>
       </dependency>
       <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
         <security>
           <requestedPrivileges>
             <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
           </requestedPrivileges>
         </security>
       </trustInfo>
       <asmv3:application>
         <asmv3:windowsSettings xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">
           <ms_windowsSettings:dpiAware xmlns:ms_windowsSettings="http://schemas.microsoft.com/SMI/2005/WindowsSettings">false</ms_windowsSettings:dpiAware>
         </asmv3:windowsSettings>
       </asmv3:application>
     </assembly>
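     If memory serves, the registry change described at that link boils down to telling Windows to prefer external manifests over embedded ones; a .reg file along these lines does it (double-check it against the article before importing):

     Windows Registry Editor Version 5.00

     [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\SideBySide]
     "PreferExternalManifest"=dword:00000001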
  3. If it's helpful, here's the script I use for downloading and processing grib data. I make no claims as to the beauty of the program, and at some point I plan on going over it with a fine-toothed comb for efficiency and minor errors. But it works fine as is. Feel free to make use of it, in whole or in part. This is my original work.

<?php
// This is a PHP file which, when run automatically (for example, as a cron job every 10 minutes), checks for updated HRRR model data
// covering the current and next forecast hour, and if it's available, downloads and processes it.
// The processing includes the creation of:
//   1) geographically resampled and chronologically interpolated geotiff raster files created for every 10 minutes between forecast hours
//   2) derived products calculated from HRRR products
//   3) shapefiles containing level contour polygons
//   4) ascii raster files containing the geotiff data
//
// This script relies upon the GDAL library. This library is freely available for most platforms.
// Files are broken out by type (of course), forecast period coverage (every 10 minutes between current hour and next hour),
// and product. One file covers one forecast period and one product.
//
// Additional scripts, called on demand via grlevelx products, create placefiles and associated imagery for given locales
// on-the-fly from these data files.
//
// Needless to say, there are other applications for the downloaded and processed data.
// This script was written to work in Ubuntu Linux, but will run on other platforms with minor modifications.
//
// The subdirectory structure is as follows:
//   /var/www/gribs/hrrr/gribs: temporary downloaded grib files
//   /var/www/gribs/hrrr/grid:  derived ascii raster files, in AAIGrid format, average about 30 MB each
//   /var/www/gribs/hrrr/img:   image files covering a geographic subset of model data, created by further scripts on an
//                              as-requested basis
//   /var/www/gribs/hrrr/shp:   shape files containing level contour polygons, average about 2 MB each product
//   /var/www/gribs/hrrr/temp:  lockfiles
//   /var/www/gribs/tif:        geotiff files, about 20.5 MB each
//
// The script will run fine on older hardware (it's currently running on an Intel E5200 dual-core processor), but the computations
// the GDAL library functions use are intensive; more products processed means more processing time. As-is, it will complete
// processing of its configured data set in about 8 minutes.
//
// Jonathan Williams, 2015

date_default_timezone_set('Zulu');
chdir('/var/www/gribs/hrrr');
#chdir('c:\apache24\htdocs\gribs\hrrr');

// Array of grib products to download. File inventories may be found here:
// http://www.nco.ncep.noaa.gov/pmb/products/hrrr/hrrr.t00z.wrfprsf00.grib2.shtml
// Array is of the following format:
//   [[Parameter]_[Level/Layer]]_[Output file name prefix]_[Level Flag] = [Contour Flag]_[$Offset]_[$Interval]_[$Extra]_[$PreCalc]
// where
//   [Parameter] & [Level/Layer]: identifying characteristics of the product subset in the grib file
//   Level Flag: 1 = product is not height above surface, 0 = subtract product value from surface geopotential height for height above surface
//   Contour Flag: 1 = level contours desired for this product, 0 = do not create contours
//   $Offset: value applied in contour creation. From gdal_contour documentation: "Offset from zero relative to which to interpret intervals"
//   $Interval: value applied in contour creation.
//              gdal_contour doc: "elevation interval between contours"
//   $Extra: additional gdal_contour parameter (future expansion)
//   $PreCalc: uniform multiplier to apply to all grid points prior to other calculations
$ProdArray = array(
    "CAPE_surface_CAPE_1" => "1_0_500_ _1",
    "CIN_surface_CIN_1" => "1_0_50_ _1",
    "LFTX_500-1000 mb_LFTX_1" => "1_0_20_ _10",
    "TMP_2 m above ground_TMP_1" => "1_-1.11111111_2.7777778_ _1",
    "TMP_surface_STMP_1" => "1_-1.11111111_2.7777778_ _1",
    "PRMSL_mean sea level_PRMSL_1" => "1_0_200_ _1",
    "DPT_2 m above ground_DPT_1" => "1_-1.11111111_2.7777778_ _1",
    "HLCY_3000-0 m above ground_HLCY_1" => "1_0_250_ _1",
    "MXUPHL_5000-2000 m above ground_MXUPHL_1" => "1_0_25_ _1",
    "PWAT_entire atmosphere (considered as a single layer)_PWAT_1" => "1_0_5_ _1",
);

// Products that will be calculated/derived from downloaded products.
// Array is of the following format:
//   [Output file name prefix] = [Operand Prod A]_[Operand Prod B]_[Equation]_[Contour Flag]_[$Offset]_[$Interval]_[$Extra]_[$PreCalc]
$CalcArray = array(
    "EHI" => "CAPE_HLCY_(A*B)/160000_1_0_1_ _1"
);

// Reference surface geopotential height grib product
$SurfaceLevel = "HGT_surface_SLVL";

// Initialize a few variables
date_default_timezone_set("GMT");
$ModelUnixTime = time(); // Current time (Unix time, in seconds). This variable will be used
                         // to find the model run to acquire.
$ForecastOffset = 0;     // The model forecast hour. Start at 0 (current hour).
$LastFile = 0;

// File for keeping track of the most recent grib file accessed. Load that into "$LastFile".
$trackhandle = @fopen('track.txt','r');
if($trackhandle){
    if(($buffer = fgets($trackhandle))!==false){
        $LastFile = rtrim($buffer);
    }
    fclose($trackhandle);
}

// Loop from the current hour to six hours behind, to find the most recent HRRR model run available
// on the NCEP FTP site with grib files covering both the current forecast hour and the next
// hour. We're interested in acquiring both the data for the current forecast hour and the data
// for the next forecast hour (both from the same, most recent model run), and interpolating
// between the two in 10-minute intervals.
echo "Downloading grib files...\n";
for($i = 0; $i < 6; $i++){
    $url = 'http://www.ftp.ncep.noaa.gov/data/nccf/nonoperational/com/hrrr/prod/hrrr.'.date('Ymd',$ModelUnixTime).'/hrrr.t'.date('H',$ModelUnixTime).'z.wrfprsf'.str_pad($ForecastOffset+1,2,"0",STR_PAD_LEFT).'.grib2';
    $idx = curl_init($url.".idx");
    curl_setopt($idx, CURLOPT_NOBODY, true);
    curl_setopt($idx, CURLOPT_CONNECTTIMEOUT,30);
    curl_setopt($idx, CURLOPT_TIMEOUT,30);
    curl_exec($idx);
    $retcode = curl_getinfo($idx, CURLINFO_HTTP_CODE); // $retcode >= 400 -> not found; $retcode = 200 -> found.
    curl_close($idx);
    if($retcode < 350) break; // We've found the most current model run; move on.
    $ModelUnixTime -= 3600;   // Current run not yet found, so search for a model run one hour earlier, and...
    $ForecastOffset += 1;     // increment the forecast hour so that we're still covering the same forecast time
}
if($i == 6) exit("Couldn't find a recent grib file");

#-------------------------------------------------------
// Set "$FileFound" to the grib file for the forecast hour *before* what was found; this will be
// stored in our "track.txt" tracker file.
$FileFound = date('YmdH',$ModelUnixTime).str_pad($ForecastOffset,2,"0",STR_PAD_LEFT);

// If we have a new model file, proceed with acquiring the data
if($FileFound != $LastFile){

    // Begin with downloading the most recently available
    $idx = curl_init($url.".idx");
    curl_setopt($idx, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($idx, CURLOPT_CONNECTTIMEOUT,30);
    curl_setopt($idx, CURLOPT_TIMEOUT,30);
    $idxfile = curl_exec($idx); // Download grib file index
    curl_close($idx);
    if($idxfile===FALSE) exit("Couldn't access index file 1");
    $idxlines = explode("\n",$idxfile); // Store index file contents in array "$idxlines"

    // Loop to find in the index file the start and end points of each desired product within the grib file.
    // As these are listed in the index files, we don't have to download the entire grib file.
    foreach($ProdArray as $ProdLine => $tmp){
        list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
        foreach($idxlines as $key => $line){
            $linearray = explode(":",rtrim($line));
            if(($linearray[3]==$Prod)&&($linearray[4]==$Level)){
                $start = $linearray[1];
                unset($linearray);
                if(isset($idxlines[$key+1])){
                    $linearray = explode(":",rtrim($idxlines[$key+1]));
                    $stop = $linearray[1]-1;
                }else{
                    $stop = '';
                }
                break;
            }
        }
        if(!isset($start)){
            exit("Product not found!");
        }
        $Range[$FName] = $start."-".$stop; // Range of bytes in the grib file containing the product we want
        unset($start);
        unset($stop);
    }

    // Now down to business -- use the $Range value for each product to download data from the
    // appropriate grib file, and store it in a temporary, local grib file
    foreach($ProdArray as $ProdLine => $tmp){
        list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
        $gribhandle = @fopen('./grib/temp_'.$FName."_1.grib2","w+");
        $grib = curl_init($url);
        curl_setopt($grib, CURLOPT_CONNECTTIMEOUT,30);
        curl_setopt($grib, CURLOPT_TIMEOUT,30);
        curl_setopt($grib, CURLOPT_FILE, $gribhandle);
        curl_setopt($grib, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($grib, CURLOPT_RANGE,$Range[$FName]);
        if(curl_exec($grib)===FALSE) exit("Couldn't download ".$url);
        curl_close($grib);
        fclose($gribhandle);
    }

    // Download the reference surface geopotential height
    list($Prod,$Level,$FName) = explode("_",$SurfaceLevel);
    foreach($idxlines as $key => $line){
        $linearray = explode(":",rtrim($line));
        if(($linearray[3]==$Prod)&&($linearray[4]==$Level)){
            $start = $linearray[1];
            unset($linearray);
            if(isset($idxlines[$key+1])){
                $linearray = explode(":",rtrim($idxlines[$key+1]));
                $stop = $linearray[1]-1;
            }else{
                $stop = '';
            }
            break;
        }
    }
    if(!isset($start)){
        exit("Product not found!");
    }
    $Range[$FName] = $start."-".$stop;
    unset($start);
    unset($stop);
    $gribhandle = @fopen('./grib/temp_'.$FName."_1.grib2","w+");
    $grib = curl_init($url);
    curl_setopt($grib, CURLOPT_CONNECTTIMEOUT,30);
    curl_setopt($grib, CURLOPT_TIMEOUT,30);
    curl_setopt($grib, CURLOPT_FILE, $gribhandle);
    curl_setopt($grib, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($grib, CURLOPT_RANGE,$Range[$FName]);
    if(curl_exec($grib)===FALSE) exit("Couldn't download ".$url);
    curl_close($grib);
    fclose($gribhandle);
    unset($idxlines);

    // Now, do the whole thing over again for the previous (current) forecast hour
    $url = 'http://www.ftp.ncep.noaa.gov/data/nccf/nonoperational/com/hrrr/prod/hrrr.'.date('Ymd',$ModelUnixTime).'/hrrr.t'.date('H',$ModelUnixTime).'z.wrfprsf'.str_pad($ForecastOffset,2,"0",STR_PAD_LEFT).'.grib2';
    $idx = curl_init($url.".idx");
    curl_setopt($idx, CURLOPT_RETURNTRANSFER, true);
    $idxfile = curl_exec($idx);
    curl_close($idx);
    if($idxfile===FALSE) exit("Couldn't access index file 0");
    $idxlines = explode("\n",$idxfile);
    foreach($ProdArray as $ProdLine => $tmp){
        list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
        foreach($idxlines as $key => $line){
            $linearray = explode(":",rtrim($line));
            if(($linearray[3]==$Prod)&&($linearray[4]==$Level)){
                $start = $linearray[1];
                unset($linearray);
                if(isset($idxlines[$key+1])){
                    $linearray = explode(":",rtrim($idxlines[$key+1]));
                    $stop = $linearray[1]-1;
                }else{
                    $stop = '';
                }
                break;
            }
        }
        if(!isset($start)){
            exit("Product not found!");
        }
        $Range[$FName] = $start."-".$stop;
        unset($start);
        unset($stop);
    }
    foreach($ProdArray as $ProdLine => $tmp){
        list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
        $gribhandle = @fopen('./grib/temp_'.$FName."_0.grib2","w+");
        $grib = curl_init($url);
        curl_setopt($grib, CURLOPT_CONNECTTIMEOUT,30);
        curl_setopt($grib, CURLOPT_TIMEOUT,30);
        curl_setopt($grib, CURLOPT_FILE, $gribhandle);
        curl_setopt($grib, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($grib, CURLOPT_RANGE,$Range[$FName]);
        if(curl_exec($grib)===FALSE) exit("Couldn't download ".$url);
        curl_close($grib);
        fclose($gribhandle);
    }
    list($Prod,$Level,$FName) = explode("_",$SurfaceLevel);
    foreach($idxlines as $key => $line){
        $linearray = explode(":",rtrim($line));
        if(($linearray[3]==$Prod)&&($linearray[4]==$Level)){
            $start = $linearray[1];
            unset($linearray);
            if(isset($idxlines[$key+1])){
                $linearray = explode(":",rtrim($idxlines[$key+1]));
                $stop = $linearray[1]-1;
            }else{
                $stop = '';
            }
            break;
        }
    }
    if(!isset($start)){
        exit("Product not found!");
    }
    $Range[$FName] = $start."-".$stop;
    unset($start);
    unset($stop);
    $gribhandle = @fopen('./grib/temp_'.$FName."_0.grib2","w+");
    $grib = curl_init($url);
    curl_setopt($grib, CURLOPT_CONNECTTIMEOUT,30);
    curl_setopt($grib, CURLOPT_TIMEOUT,30);
    curl_setopt($grib, CURLOPT_FILE, $gribhandle);
    curl_setopt($grib, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($grib, CURLOPT_RANGE,$Range[$FName]);
    if(curl_exec($grib)===FALSE) exit("Couldn't download ".$url);
    curl_close($grib);
    fclose($gribhandle);
    unset($idxlines);

    //-------------------------------------------------------
    // Convert surface geopotential height grib files into geotiff files, using gdalwarp to map
    // geographical coordinates onto something we can work with
    // http://www.gdal.org/gdalwarp.html
    echo "Finding surface level...\n";
    list($Prod,$Level,$LevelBase) = explode("_",$SurfaceLevel);
    $ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -t_srs EPSG:4326 ./grib/temp_'.$LevelBase.'_0.grib2 ./tif/'.$LevelBase."_00m.tif";
    exec($ExecText);
    $ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -t_srs EPSG:4326 ./grib/temp_'.$LevelBase.'_1.grib2 ./tif/'.$LevelBase."_60m.tif";
    exec($ExecText);
    unlink('./grib/temp_'.$FName."_0.grib2");
    unlink('./grib/temp_'.$FName."_1.grib2");

    //-------------------------------------------------------
    // Convert remaining product grib files into geotiff files, using gdalwarp to map geographical
    // coordinates onto something we can work with
    foreach($ProdArray as $ProdLine => $CData){
        list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
        echo $FName."\n";
        list($GetContours,$Offset,$Interval,$Extra,$PreCalc) = explode("_",$CData);
        $ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -t_srs EPSG:4326 ./grib/temp_'.$FName.'_0.grib2 ./tif/temp_'.$FName."_00m.tif";
        exec($ExecText);
        $ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -t_srs EPSG:4326 ./grib/temp_'.$FName.'_1.grib2 ./tif/temp_'.$FName."_60m.tif";
        exec($ExecText);

        // wipe temporary grib files no longer needed, and delete old geotiff files
        unlink('./grib/temp_'.$FName."_0.grib2");
        unlink('./grib/temp_'.$FName."_1.grib2");
        for ($i = 0; $i < 7; $i++){
            if(file_exists('./tif/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT).'m.tif')){
                unlink('./tif/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT).'m.tif');
            }
        }

        // use gdal_calc to apply precalc values to data, and subtract surface geopotential height
        // from products as appropriate
        // http://www.gdal.org/gdal_calc.html
        if($LFlag == 1){
            $ExecText = '/usr/bin/gdal_calc.py -A ./tif/temp_'.$FName.'_00m.tif --A_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_00m.tif --calc="'.$PreCalc.'*A"';
            exec($ExecText);
            $ExecText = '/usr/bin/gdal_calc.py -A ./tif/temp_'.$FName.'_60m.tif --A_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_60m.tif --calc="'.$PreCalc.'*A"';
            exec($ExecText);
        }else{
            $ExecText = '/usr/bin/gdal_calc.py -A ./tif/temp_'.$FName.'_00m.tif --A_band=1 -B ./tif/'.$LevelBase.'_00m.tif --B_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_00m.tif --calc="'.$PreCalc.'*(A-B)"';
            exec($ExecText);
            $ExecText = '/usr/bin/gdal_calc.py -A ./tif/temp_'.$FName.'_60m.tif --A_band=1 -B ./tif/'.$LevelBase.'_60m.tif --B_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_60m.tif --calc="'.$PreCalc.'*(A-B)"';
            exec($ExecText);
        }

        // wipe temporary geotiff files no longer needed
        unlink('./tif/temp_'.$FName."_00m.tif");
        unlink('./tif/temp_'.$FName."_60m.tif");

        // use gdal_calc to calculate linearly interpolated data between this forecast hour and the
        // next, in 10 minute intervals, and write the output as geotiff files
        for ($i = 1; $i < 6; $i++){
            $j = round($i/6,6);
            $ExecText = '/usr/bin/gdal_calc.py -A ./tif/'.$FName.'_00m.tif --A_band=1 -B ./tif/'.$FName.'_60m.tif --B_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_'.str_pad($i,2,"0",STR_PAD_RIGHT).'m.tif --calc="(A+('.$j.'*(B-A)))"';
            exec($ExecText);
        }
    }

    //---------------------------------------------------------------
    // Produce calculated geotiff files from product files
    foreach($CalcArray as $FName => $CData){
        echo $FName."\n";
        list($FName1,$FName2,$COp,$GetContours,$Offset,$Interval,$Extra,$PreCalc) = explode("_",$CData);
        for ($i = 0; $i < 7; $i++){
            $ExecText = '/usr/bin/gdal_calc.py -A ./tif/'.$FName1.'_'.$i.'0m.tif --A_band=1 -B ./tif/'.$FName2.'_'.$i.'0m.tif --B_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_'.str_pad($i,2,"0",STR_PAD_RIGHT).'m.tif --calc="'.$COp.'"';
            exec($ExecText);
        }
    }

    //---------------------------------------------------------------
    // So, now we've downloaded grib data and produced geotiff raster files for all
    // of our data. These files cover the entire geographic extent of the model type
    // in question, for the current forecast hour to the next, in 10 minute intervals.
    // The geotiff files are broken out by product.
    // They are found in the following form:
    //   ./tif/[FName]_XXm.tif
    // where [FName] is from the ProdArray/CalcArray lists, and "XX" is the 10 minute
    // time period in question (from "00" to "60")
    //---------------------------------------------------------------
    // Next, we need to produce resampled geotiff files (cubic spline interpolation at 0.03 degree
    // points) to improve map appearance at high zoom levels, create level contour shape files
    // from the data, and create ascii files from the geotiff files for further script processing.
    foreach($ProdArray as $ProdLine => $CData){
        list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
        list($GetContours,$Offset,$Interval,$Extra,$PreCalc) = explode("_",$CData);
        echo "Contouring, Resampling, ASCII Grids...".$FName."\n";
        for ($i = 0; $i < 7; $i++){

            // create a lock file while we're working on the output geotiff files
            $temphandle = @fopen('./temp/'.$FName.'lockfile','w');
            if($temphandle){
                if(flock($temphandle, LOCK_EX)){
                    if(file_exists('./tif/'.$FName."_".$i."0m_res.tif")){
                        unlink('./tif/'.$FName."_".$i."0m_res.tif");
                    }
                    $ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -tr 0.03 0.03 -r cubicspline ./tif/'.$FName."_".$i.'0m.tif ./tif/'.$FName."_".$i."0m_res.tif";
                    exec($ExecText);
                    flock($temphandle, LOCK_UN);
                }
                fclose($temphandle); // close the lock file
            }
            unlink('./tif/'.$FName."_".$i."0m.tif"); // wipe non-resampled geotiff files

            // if product calls for contours, calculate here
            if($GetContours == 1){
                $temphandle = @fopen('./temp/'.$FName.'lockfile','w');
                if($temphandle){
                    if(flock($temphandle, LOCK_EX)){
                        if(file_exists('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp")){ // wipe old shape files
                            unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp");
                            unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shx");
                            unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.prj");
                            unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.dbf");
                        }
                        // use gdal_contour to produce shape (vector) files containing contour level polygons from the geotiff
                        // files
                        // http://www.gdal.org/gdal_contour.html
                        $ExecText = "gdal_contour -b 1 -a ".$FName." -snodata 9999 -i ".$Interval." -off ".$Offset.$Extra.'./tif/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m_res.tif .\/shp\/".$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp";
                        exec($ExecText);
                    }
                    flock($temphandle, LOCK_UN);
                }
                fclose($temphandle);
            }
            $temphandle = @fopen('./temp/'.$FName.'lockfile','w');
            if($temphandle){
                if(flock($temphandle, LOCK_EX)){
                    if (file_exists('./grid/'.$FName.'_'.$i.'0m.asc')){
                        unlink('./grid/'.$FName.'_'.$i.'0m.asc');
                        unlink('./grid/'.$FName.'_'.$i.'0m.prj');
                        unlink('./grid/'.$FName.'_'.$i.'0m.asc.aux.xml');
                    }
                    // Use gdal_translate to take the geotiff files and create ascii files of the data.
                    // This will allow derivation of, for example, grids of text values for placefiles.
                    // http://www.gdal.org/gdal_translate.html
                    // http://www.gdal.org/frmt_various.html
                    $ExecText = 'gdal_translate -b 1 -of AAIGrid -ot Float32 ./tif/'.$FName.'_'.$i.'0m_res.tif ./grid/'.$FName.'_'.$i.'0m.asc';
                    exec($ExecText);
                    flock($temphandle, LOCK_UN);
                }
                fclose($temphandle);
            }
        }
    }

    // Do the same as above for calculated data (could be combined with above)
    foreach($CalcArray as $FName => $CData){
        list($FName1,$FName2,$COp,$GetContours,$Offset,$Interval,$Extra,$PreCalc) = explode("_",$CData);
        echo "Contouring, Resampling, ASCII Grids...".$FName."\n";
        for ($i = 0; $i < 7; $i++){
            $temphandle = @fopen('./temp/'.$FName.'lockfile','w');
            if($temphandle){
                if(flock($temphandle, LOCK_EX)){
                    if(file_exists('./tif/'.$FName."_".$i."0m_res.tif")){
                        unlink('./tif/'.$FName."_".$i."0m_res.tif");
                    }
                    $ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -tr 0.03 0.03 -r cubicspline ./tif/'.$FName."_".$i.'0m.tif ./tif/'.$FName."_".$i."0m_res.tif";
                    exec($ExecText);
                    flock($temphandle, LOCK_UN);
                }
                fclose($temphandle);
            }
            unlink('./tif/'.$FName."_".$i."0m.tif");
            if($GetContours == 1){
                $temphandle = @fopen('./temp/'.$FName.'lockfile','w');
                if($temphandle){
                    if(flock($temphandle, LOCK_EX)){
                        if(file_exists('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp")){
                            unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp");
                            unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shx");
                            unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.prj");
                            unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.dbf");
                        }
                        $ExecText = "gdal_contour -b 1 -a ".$FName." -snodata 9999 -i ".$Interval." -off ".$Offset.$Extra.'./tif/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m_res.tif .\/shp\/".$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp";
                        exec($ExecText);
                    }
                    flock($temphandle, LOCK_UN);
                }
                fclose($temphandle);
            }
            $temphandle = @fopen('./temp/'.$FName.'lockfile','w');
            if($temphandle){
                if(flock($temphandle, LOCK_EX)){
                    if (file_exists('./grid/'.$FName.'_'.$i.'0m.asc')){
                        unlink('./grid/'.$FName.'_'.$i.'0m.asc');
                        unlink('./grid/'.$FName.'_'.$i.'0m.prj');
                        unlink('./grid/'.$FName.'_'.$i.'0m.asc.aux.xml');
                    }
                    $ExecText = 'gdal_translate -b 1 -of AAIGrid -ot Float32 ./tif/'.$FName.'_'.$i.'0m_res.tif ./grid/'.$FName.'_'.$i.'0m.asc';
                    exec($ExecText);
                    flock($temphandle, LOCK_UN);
                }
                fclose($temphandle);
            }
        }
    }

    // Clean up
    if(file_exists('./tif/'.$LevelBase."_00m.tif")){
        unlink('./tif/'.$LevelBase."_00m.tif");
    }
    if(file_exists('./tif/'.$LevelBase."_60m.tif")){
        unlink('./tif/'.$LevelBase."_60m.tif");
    }

    // Update our "track.txt" file with the latest model run
    $trackhandle = @fopen('track.txt','w');
    if($trackhandle){
        fwrite($trackhandle,$FileFound);
        fclose($trackhandle);
    }else{
        exit("Problem creating tracking file");
    }
}
?>
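To run it on a schedule, a crontab entry along these lines will kick the script off every 10 minutes (the script file name and log path here are just placeholders -- use whatever you've saved it as):

*/10 * * * * /usr/bin/php /var/www/gribs/hrrr/get_hrrr.php >> /var/log/hrrr_update.log 2>&1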
  4. It seems to me what's being described below is essentially random noise in the measurements. Of course, that's going to be the case. What's being discussed with respect to DCA's measurements is a consistent, statistically significant bias.
  5. If you're still looking for information on this (or if others are), grib files can range up into the hundreds of megabytes, but it's possible to grab only the parts you need using functions like cURL (in PHP, for example). If you want to get into downloading your own grib files via FTP, the accompanying .idx files will tell you where in the respective grib file each parameter starts. As for processing the files, the free GDAL suite of utilities can manipulate and transform grib files into other formats, such as tif files and shapefiles, and further render them into things like .png image files. For example: gdalwarp can project a grib file onto a desired coordinate system and take a geographic subset of it; gdal_calc can run simple or complex calculations on grib file points; gdal_contour can generate shapefiles from grib files; gdaldem can generate image files; and ogr2ogr can convert shapefiles into other formats for easier script processing, like .kml files. I run PHP scripts that check for NAM and HRRR grib file updates every 10 minutes for the purpose of making GRLevel3 placefiles, and download/process the relevant subsets when they're available. The scripts turn them into .tif and shapefile intermediate files (covering the CONUS), interpolated to every 10 minutes between hourly forecast steps. When a placefile is loaded for a given radar site, further scripts process the area of the .tif and shapefiles around that site into .png files and text placefiles. Those scripts run on a 2.5 GHz dual-core server (Intel E5200 -- most certainly NOT cutting edge). It's capable of processing and calculating 16 parameters (.tif and contour shapefiles) from grib files, which just about maxes out what it can do in 10 minutes when new grib files are available. I will likely be upgrading that system sometime soon, but it's a useful reference. Also, FORTRAN's not that bad. But you can do amazing things in Perl and PHP these days.
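     As a rough illustration of the byte-range trick (the URL, parameter, and level below are just placeholders; adjust them to whatever file and product you actually want), the whole thing in PHP looks something like this:

     <?php
     // Rough sketch (not production code): pull a single parameter out of an HRRR grib2
     // file using its .idx sidecar and an HTTP byte-range request.
     $url = 'http://www.ftp.ncep.noaa.gov/data/nccf/nonoperational/com/hrrr/prod/hrrr.20150123/hrrr.t00z.wrfprsf00.grib2';

     // 1) Grab the index; each line looks like "msg#:start_byte:date:PARAM:LEVEL:fcst:"
     $idx = curl_init($url.'.idx');
     curl_setopt($idx, CURLOPT_RETURNTRANSFER, true);
     $idxdata = curl_exec($idx);
     curl_close($idx);
     if($idxdata === false) exit("Couldn't fetch index\n");
     $lines = explode("\n", $idxdata);

     // 2) Find the byte range covering the desired parameter/level
     $range = '';
     foreach($lines as $key => $line){
         $f = explode(':', rtrim($line));
         if(isset($f[4]) && $f[3] == 'TMP' && $f[4] == '2 m above ground'){
             $stop = '';
             if(isset($lines[$key+1])){               // next record's start byte marks our end
                 $n = explode(':', rtrim($lines[$key+1]));
                 $stop = $n[1]-1;
             }
             $range = $f[1].'-'.$stop;
             break;
         }
     }
     if($range == '') exit("Product not found\n");

     // 3) Download only that slice of the grib file
     $out = fopen('tmp2m.grib2', 'w');
     $grib = curl_init($url);
     curl_setopt($grib, CURLOPT_FILE, $out);
     curl_setopt($grib, CURLOPT_RANGE, $range);
     curl_exec($grib);
     curl_close($grib);
     fclose($out);
     ?>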
  6. I'm an electronics engineer with the U.S. Department of Commerce. I don't work for NOAA, though I know people who do. And yes, there are EE/CprE grads who work for NOAA. They work on hardware such as NEXRAD upgrades (phased-array systems are up-and-coming), satellite system design, experimental/system testing and analysis, and the computer systems needed to process the massive amounts of data and advanced algorithms NOAA scientists develop. They also work in areas such as telecommunications and radio spectrum engineering and management (my area of expertise). As for where such jobs are and when they open up, I'd recommend keeping an eye on usajobs.gov. You might also do some research into the work various NOAA labs and organizational divisions do (Severe Storms, Earth Systems Research, NESDIS, etc.) and perhaps contact people there to 1) ask what sorts of jobs might be available and 2) express your interest. Government hiring is an interesting thing -- there are standards hiring authorities have to follow -- but it never hurts to develop contacts where you'd like to work.
  7. While combating the effects of solar radiation is one of the purposes of a fan, remember that radiative heating/cooling doesn't just take place during the day: note the dew that can form on surfaces open to the cold sky at night, even though the air temperature is above the dew point -- the temperature of those surfaces can fall below the air temperature because they radiate their heat away. Additionally, forced air decreases the response time of a thermometer to changes in air temperature. For consistency, the fan should run at a constant rate at all times. To be clear, I'm not saying it's required, but to my mind I'd like to see a constant forced-air rate.
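     To put the response-time point in rough terms (this is just the textbook first-order lag model, with illustrative symbols, not numbers from any particular sensor spec):

     T_s(t) = T_a + (T_s(0) - T_a) e^{-t/\tau}, \qquad \tau = \frac{m c}{h A}

     where T_s is the sensor temperature, T_a the air temperature, m c the sensor's heat capacity, A its surface area, and h the convective heat-transfer coefficient. h rises with airspeed, so more airflow means a smaller time constant \tau (faster response); a fan whose speed wanders gives you a \tau that wanders with it, which is exactly the consistency problem.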
  8. I agree. And I've really appreciated the expertise and insight I've seen on this board (especially in the modeling threads) from the beginning to the end of this amazing storm. It's been like taking a seminar.
  9. This isn't the first time the Euro has miscalled a winter storm. At any rate, a nice lesson from this (if it needs to be learned yet again) is that no one model has a lock on calling all forecasts correctly, and assuming one is all you need is not wise. Come to think of it, that's a good general principle to follow for numerical modeling overall.
  10. Out of curiosity: do any of the model gurus here plan on doing any sort of correlation between model predictions and what actually transpired? Is that normally done? (I'd imagine NOAA does it, but I'm just curious about folks here.) To a first approximation (going from memory), I agree that for the DC area the GFS seems to have outperformed the Euro in terms of stability, storm totals, and locations.
  11. It's one heck of a storm that drops 16-18" in the snow hole
  12. Agree. It looks like there's still some decent moisture/bands coming down from PA.