WRF Tutorial

  1. WRF Tutorial For Version 2.2 / Synoptic Lab 10/3/2007 Robert Fovell rfovell@ucla.edu http://tinyurl.com/2uagrc or http://www.atmos.ucla.edu/~fovell/WRF/wrf_tutorial_2007.ppt.htm

  2. Background on WRF model • “Weather Research and Forecasting” • Co-developed by the research and operational communities • ARW core: “Advanced Research WRF” • NMM core: “Nonhydrostatic Mesoscale Model” • Supersedes the MM5 and Eta models • Current version: 2.2 • Platforms include Linux and Mac OS X

  3. WRF advantages • Better numerics than MM5 • Arakawa C grid, Runge-Kutta (R-K) time scheme, odd-order advection with implicit diffusion • Much less diffusive, larger effective resolution, permits longer time steps • Better handling of topography than Eta (the original NAM) • The NAM model is now WRF-NMM • Fortran 95 (MM5 was F77) • NetCDF, GRIB1, and GRIB2

  4. Further advantages • MPI from the ground up • Allows real-data and idealized simulations in the same framework • Plug-in architecture (different groups will supply WRF “cores”) • Recently added: moving nests and nudging • NetCDF output works with many great tools, such as the NetCDF Operators (NCO): http://nco.sourceforge.net/ (see the example below)
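  As a hypothetical illustration of the NetCDF Operators (not from the slides; the output file name is an assumption), ncks can extract a few surface fields from a large wrfout file, and ncdump -h lists just the header of the result:
  > ncks -v T2,U10,V10 wrfout_d01_2007-09-02_00:00:00 felix_sfc.nc
  > ncdump -h felix_sfc.nc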

  5. WRF disadvantages • Bleeding edge • Smaller range of physics choices (though more modern) • Software design is unintuitive for physical scientists • Can take hours to compile • But does not need frequent recompiling • Comparatively slower than MM5 • NetCDF files can be huge

  6. WRF and related software • WRF Preprocessing System (WPS) • Replaces/supersedes the WRF SI • WRF-ARW model • Single node and MPI • WRF postprocessing software • RIP (read/interpolate/plot) • GrADS • Specific to the “hurricane” Synoptic Lab environment • Neglecting for now: GRIB2, ARWpost

  7. Web resources • WRF model users site http://www.mmm.ucar.edu/wrf/users/user_main.html • ARW users’ guide http://www.mmm.ucar.edu/wrf/users/docs/user_guide/contents.html • WRF-ARW/WPS online tutorial http://www.mmm.ucar.edu/wrf/OnLineTutorial/index.htm • WRF namelist description http://www.mmm.ucar.edu/wrf/users/docs/user_guide/users_guide_chap5.html#Nml • Tutorial presentations http://www.mmm.ucar.edu/wrf/users/tutorial/tutorial_presentation.htm

  8. My resources • This presentation (PPT format) http://www.atmos.ucla.edu/~fovell/WRF/wrf_tutorial_2007.ppt • WRF on Mac OS X http://www.atmos.ucla.edu/~fovell/WRF/WRF_ports.html http://macwrf.blogspot.com

  9. Setup on “hurricane” machines Presumed: • tcsh environment • Intel Fortran compiler (64-bit) • my environment setup is employed • precompiled versions of WRF, WPS, RIP, and wrf_to_grads are used

  10. Environment setup (“>” is the command-line prompt)
  • If you don’t have a .cshrc file worth saving (recommended):
  > cp /home/fovell/.cshrc .
  > source .cshrc
  • If you want to keep your present .cshrc:
  > cp /home/fovell/cshrc_fovell.csh .
  > ./cshrc_fovell.csh
  [you need to have the compiler environment set up already]

  11. # RGF additions [abridged]
  setenv RIP_ROOT /home/fovell/RIP4
  setenv GADDIR /home/fovell/lib/grads
  setenv GASCRP /home/fovell/gradslib
  #
  alias lsm 'ls -alt | more'
  alias rm 'rm -i'
  alias cp 'cp -i'
  alias mv 'mv -i'
  alias trsl 'tail -f rsl.out.0000'
  alias mpirun 'nohup time /home/fovell/mpich-1.2.7p1/bin/mpirun'
  alias w2g '/home/fovell/WRF2GrADS/wrf_to_grads'
  setenv P4_GLOBMEMSIZE 4096000
  setenv P4_SOCKBUFSIZE 65536
  unlimit
  limit coredumpsize 0
  This environment uses my versions of netcdf, mpich, grads, RIP

  12. Set up a run directory
  > cd
  > mkdir FELIX
  > cd FELIX
  > cp /home/fovell/WRFtutorial/make_all_links.csh .
  > make_all_links.csh
  > cp /home/fovell/WRFtutorial/namelist.* .
  [copies namelist.input, namelist.wps]

  13. WRF for a real-data run: Hurricane Felix (2007) [This example uses data that will not remain online]

  14. WPS overview • Tasks • (1) set up a domain (can be reused): geogrid.exe • (2) unpack parent model data (e.g., from GFS, NAM, etc.): ungrib.exe • (3) prepare the unpacked data for WRF: metgrid.exe • Controlled by namelist.wps • (see the command summary below)
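  In outline, the three WPS tasks reduce to the commands below; each is shown in context on the following slides, and the paths are the tutorial’s:
  > geogrid.exe
  [task 1: creates geo_em.d01.nc for the domain]
  > link_grib.csh /home/fovell/2007090200/gfs
  > ln -sf /home/fovell/WRFtutorial/Vtable.GFS Vtable
  > ungrib.exe
  [task 2: links and unpacks the parent model data]
  > metgrid.exe
  [task 3: writes the met_em.d01.* files that real.exe will read]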

  15. namelist.wps
  &share
   wrf_core = 'ARW',
   max_dom = 1,
   start_date = '2007-09-02_00:00:00','2007-09-02_00:00:00',
   end_date   = '2007-09-03_12:00:00','2007-09-03_12:00:00',
   interval_seconds = 10800,
   io_form_geogrid = 2,
  /
  • start_date and end_date need one column for each domain
  • interval_seconds is the parent model data frequency (here, 3 h)

  16. namelist.wps (cont.)
  &geogrid
   parent_id         = 1, 1,
   parent_grid_ratio = 1, 3,
   i_parent_start    = 1, 53,
   j_parent_start    = 1, 65,
   e_we              = 70, 259,
   e_sn              = 40, 199,
   geog_data_res     = '2m','2m',
   dx = 36000,
   dy = 36000,
   map_proj  = 'lambert',
   ref_lat   = 15.0,
   ref_lon   = -75.0,
   truelat1  = 29.6,
   truelat2  = 29.6,
   stand_lon = -75.0,
   geog_data_path = '/home/fovell/WPS_GEOG/geog'
  /
  there is more…

  17. geogrid - create the domain
  > geogrid.exe
  * creates geo_em.d01.nc (a NetCDF file)
  * look for “Successful completion of geogrid.”
  > plotgrids.exe
  * creates gmeta
  > idt gmeta
  * uses the NCAR Graphics idt tool to view the domain

  18. ‘gmeta’ file

  19. > ncdump geo_em.d01.nc | more
  netcdf geo_em.d01 {
  dimensions:
        Time = UNLIMITED ; // (1 currently)
        DateStrLen = 19 ;
        west_east = 69 ;
        south_north = 39 ;
        south_north_stag = 40 ;
        west_east_stag = 70 ;
        land_cat = 24 ;
        soil_cat = 16 ;
        month = 12 ;
  variables:
        char Times(Time, DateStrLen) ;
        float XLAT_M(Time, south_north, west_east) ;
                XLAT_M:FieldType = 104 ;
                XLAT_M:MemoryOrder = "XY " ;
                XLAT_M:units = "degrees latitude" ;
                XLAT_M:description = "Latitude on mass grid" ;
                XLAT_M:stagger = "M" ;

  20. Parent model data issues • Sources include GFS, NAM, NARR reanalysis data, etc. • Need a different Vtable (variable table) for each source • e.g., Vtable.GFS, Vtable.AWIP (NAM), Vtable.NARR, etc. • Look in /home/fovell/WRFtutorial
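  For example, if the parent data came from NAM rather than GFS, one would presumably link the matching table before running ungrib.exe (a hypothetical variation on the GFS command shown on the next slide):
  > ln -sf /home/fovell/WRFtutorial/Vtable.AWIP Vtable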

  21. Accessing parent model data
  > link_grib.csh /home/fovell/2007090200/gfs
  * links to where the parent model data for this case reside
  ** data files start with ‘gfs*’
  > ln -sf /home/fovell/WRFtutorial/Vtable.GFS Vtable
  * specifies the appropriate Vtable
  > ungrib.exe
  * extracts the parent model data
  * look for “Successful completion of ungrib.”

  22. Next step: metgrid
  > metgrid.exe
  ...hopefully you see...
  “Successful completion of metgrid.”
  ...Output looks like...
  met_em.d01.2007-09-02_00:00:00.nc
  met_em.d01.2007-09-02_03:00:00.nc
  met_em.d01.2007-09-02_06:00:00.nc
  met_em.d01.2007-09-02_09:00:00.nc
  met_em.d01.2007-09-02_12:00:00.nc
  met_em.d01.2007-09-02_15:00:00.nc
  met_em.d01.2007-09-02_18:00:00.nc
  met_em.d01.2007-09-02_21:00:00.nc
  met_em.d01.2007-09-03_00:00:00.nc
  met_em.d01.2007-09-03_03:00:00.nc
  met_em.d01.2007-09-03_06:00:00.nc
  met_em.d01.2007-09-03_09:00:00.nc
  met_em.d01.2007-09-03_12:00:00.nc

  23. ncdump on a metgrid file
  netcdf met_em.d01.2007-09-02_00:00:00 {
  dimensions:
        Time = UNLIMITED ; // (1 currently)
        DateStrLen = 19 ;
        west_east = 69 ;
        south_north = 39 ;
        num_metgrid_levels = 27 ;
        num_sm_levels = 4 ;
        num_st_levels = 4 ;
        south_north_stag = 40 ;
        west_east_stag = 70 ;
        z-dimension0012 = 12 ;
        z-dimension0016 = 16 ;
        z-dimension0024 = 24 ;
  This data source has 27 vertical levels. This will vary with the source.

  24. WRF model steps • Tasks • Run real.exe (to finish creating the WRF model input data) • Run wrf.exe • Both use namelist.input, which is configured separately from namelist.wps but contains overlapping information • (see the two-command outline below)
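  In outline (both commands appear in full, with their output, on the slides that follow; the core counts are the tutorial’s):
  > mpirun -np 2 real.exe
  [creates wrfinput_d01 and wrfbdy_d01 from the met_em.d01.* files]
  > mpirun -np 4 wrf.exe &
  [runs the forecast and writes the wrfout_d01* history files]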

  25. namelist.input
  &time_control
   run_days    = 0,
   run_hours   = 36,
   run_minutes = 0,
   run_seconds = 0,
   start_year   = 2007, 2007,
   start_month  = 09,   09,
   start_day    = 02,   02,
   start_hour   = 00,   00,
   start_minute = 00,   00,
   start_second = 00,   00,
   end_year   = 2007, 2007,
   end_month  = 09,   09,
   end_day    = 03,   03,
   end_hour   = 12,   12,
   end_minute = 00,   00,
   end_second = 00,   00,
  For start_*, end_*: one column per domain

  26. namelist.input (cont.)
   interval_seconds   = 10800,
   input_from_file    = .true., .true.,
   history_interval   = 60, 60,
   frames_per_outfile = 6, 6,
   restart            = .false.,
   restart_interval   = 5000,
  • interval_seconds matches namelist.wps
  • input_from_file should normally be ‘true’ for each domain
  • history_interval: how frequently (in minutes) output is created
  • frames_per_outfile: number of writes in each history file
  • To restart the model, set restart = .true. (and set the model start_* entries to the restart time); see the sketch below
  • restart_interval: frequency (in minutes) for writing restart files
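  A hypothetical restart sketch (the values below are assumptions, not tutorial settings). In the original run, restart_interval = 360 would write a restart file every 6 h; the restarted run then sets restart = .true. and moves start_* to the restart time:
   restart            = .true.,
   restart_interval   = 360,
   start_year  = 2007, 2007,
   start_month = 09,   09,
   start_day   = 02,   02,
   start_hour  = 18,   18,
  [start_hour = 18 matches a time at which a restart file was written]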

  27. namelist.input (cont.)
  &domains
   time_step = 150,
   time_step_fract_num = 0,
   time_step_fract_den = 1,
   max_dom = 1,
   s_we   = 1, 1, 1,
   e_we   = 70, 259, 94,
   s_sn   = 1, 1, 1,
   e_sn   = 40, 199, 91,
   s_vert = 1, 1, 1,
   e_vert = 31, 31, 31,
   num_metgrid_levels = 27,
   dx = 36000, 12000, 333,
   dy = 36000, 12000, 333,
   grid_id   = 1, 2, 3,
   parent_id = 0, 1, 2,
   i_parent_start = 0, 53, 30,
   j_parent_start = 0, 65, 30,
   parent_grid_ratio      = 1, 3, 3,
   parent_time_step_ratio = 1, 3, 3,

  28. namelist.input (cont.)
  &physics
   mp_physics         [microphysics]                   = 1, 1,
   ra_lw_physics      [longwave rad]                   = 1, 1,
   ra_sw_physics      [shortwave rad]                  = 1, 1,
   radt               [radiation time step; min]       = 10, 10,
   sf_sfclay_physics  [surface layer]                  = 1, 1,
   sf_surface_physics [surface]                        = 1, 1,
   bl_pbl_physics     [boundary layer]                 = 1, 1,
   bldt               [boundary layer time step; min]  = 0, 0,
   cu_physics         [cumulus scheme]                 = 1, 0,
   cudt               [cumulus time step; min]         = 5,
   isfflx = 1,
   ifsnow = 0,
   icloud = 1,
   surface_input_source = 1,
   num_soil_layers = 5,
   mp_zero_out = 0,

  29. Notes on physics • Need to use the SAME microphysics (mp) scheme in each domain, but different cumulus (cu) schemes can be used • Some physics combinations work better than others, and some don’t work at all -- this is only lightly documented • bldt = 0 means the boundary layer scheme is called every time step

  30. namelist.input (cont.)
  &dynamics
   w_damping = 0,
   diff_opt        [subgrid turbulence]   = 1,
   km_opt          [ “ ]                  = 4,
   diff_6th_opt    [numerical smoothing]  = 0,
   diff_6th_factor [ “ ]                  = 0.12,
   base_temp = 290.
   damp_opt  = 0,
   zdamp     = 5000., 5000., 5000.,
   dampcoef  = 0.01, 0.01, 0.01,
   khdif = 0, 0, 0,
   kvdif = 0, 0, 0,
  Only some diff_opt/km_opt combinations make sense, and the choices are resolution-dependent. More info: http://www.mmm.ucar.edu/wrf/users/tutorial/tutorial_presentation.htm

  31. http://www.mmm.ucar.edu/wrf/users/tutorial/200707/WRF_Physics_Dudhia.pdf

  32. real.exe • Has changed a lot since version 2.1.2 • The number of vertical model levels is now specified when real.exe is run • num_metgrid_levels comes from the parent model; you set e_vert (the number of WRF levels) here • Can reset the WRF levels by rerunning real.exe • Can also specify which levels you want
   e_vert = 31, 31, 31,
   num_metgrid_levels = 27

  33. Setting levels in namelist.input (optional)
  • WRF uses “sigma” or “eta” coordinates (1.0 is the model bottom, 0.0 is the top)
  • The lines below, added to &domains in namelist.input (presuming e_vert = 51), request a model top pressure of 50 mb (5000 Pa) and concentrate vertical resolution in the lower troposphere
   p_top_requested = 5000
   eta_levels = 1.00,0.9969,0.9935,0.9899,0.9861,0.9821,
                0.9777,0.9731,0.9682,0.9629,0.9573,0.9513,
                0.9450,0.9382,0.9312,0.9240,0.9165,0.9088,
                0.9008,0.8925,0.8840,0.8752,0.8661,0.8567,
                0.8471,0.8371,0.8261,0.8141,0.8008,0.7863,0.7704,
                0.7531,0.7341,0.7135,0.6911,0.6668,0.6406,
                0.6123,0.5806,0.5452,0.5060,0.4630,0.4161,
                0.3656,0.3119,0.2558,0.1982,0.1339,0.0804,0.0362,0.0000,

  34. Run real.exe
  > mpirun -np 2 real.exe
  wrf@iniki.atmos.ucla.edu's password:
  starting wrf task 0 of 2
  starting wrf task 1 of 2
  2.624u 1.248s 0:12.63 30.5% 0+0k 0+0io 0pf+0w
  > tail rsl.out.0000
  --> extrapolating TEMPERATURE near sfc: i,j,psfc, p target
  d01 2007-09-03_12:00:00 forcing artificial silty clay loam
  LAND CHANGE = 0 WATER CHANGE = 0
  d01 2007-09-03_12:00:00 Timing for processing 0 s.
  LBC valid between these times 2007-09-03_09:00:00.0000 2007-09-03_12:00:00
  d01 2007-09-03_12:00:00 Timing for output 0 s.
  d01 2007-09-03_12:00:00 Timing for loop # 13 = 0 s.
  d01 2007-09-03_12:00:00 real_em: SUCCESS COMPLETE REAL_EM INIT

  35. Aside: password-less execution
  • The last slide’s mpirun command asked for 2 CPUs (-np 2)
  • By default, 2 CPUs on the same workstation are accessed
  • To avoid being asked for a password:
  > cd ~/.ssh
  > ssh-keygen -t dsa
  [then hit return 4 times]
  Your public key has been saved in /home/wrf/.ssh/id_dsa.pub.
  The key fingerprint is:
  cc:78:50:1e:77:23:ca:8f:81:3d:f0:d2:a4:8a:2e:a7 wrf@iniki.atmos.ucla.edu
  > cp id_dsa.pub authorized_keys
  [if it does not already exist]
  > cd ../FELIX

  36. Run wrf.exe • The output of real.exe is wrfbdy_d01 and wrfinput_d01 (NetCDF files) • Additional wrfinput files are created for nests if max_dom > 1 • Run the model:
  > mpirun -np 4 wrf.exe &
  • Creates wrfout_d01* files keyed by simulation date, plus rsl.out/rsl.error files for each CPU requested
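  While wrf.exe runs, progress can be followed in the lead processor’s log file; the trsl alias defined in the .cshrc additions (slide 11) does exactly this:
  > trsl
  [equivalent to: tail -f rsl.out.0000]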

  37. FELIX output
  • The namelist is set up to do a 36 h run
  • Look for this at the end of the rsl.out.0000 file:
  d01 2007-09-03_12:00:00 wrf: SUCCESS COMPLETE WRF
  • Output files created (7 of them -- with history_interval = 60 min and frames_per_outfile = 6, a new history file starts every 6 h):
  wrfout_d01_2007-09-02_00:00:00
  wrfout_d01_2007-09-02_06:00:00
  wrfout_d01_2007-09-02_12:00:00
  wrfout_d01_2007-09-02_18:00:00
  wrfout_d01_2007-09-03_00:00:00
  wrfout_d01_2007-09-03_06:00:00
  wrfout_d01_2007-09-03_12:00:00

  38. Postprocessing WRF output: RIP and GrADS (Vis5D and ARWpost also exist)

  39. RIP • RIP operates in batch mode, using input scripts • RIP can overlay fields, do arbitrary cross-sections, calculate trajectories, and create Vis5D output files • RIP tasks include • Unpack model output data (ripdp_wrf) • Create RIP plotting scripts (rip.in files) • Execute scripts (rip) • RIP can create a LOT of output files

  40. RIP procedure
  > ripdp_wrf run1 all wrfout_d01*
  [this creates a new dataset called ‘run1’ and uses all the wrfout_d01 files created]
  > rip run1 rip.T2.in
  [the rip.T2.in file is a script containing RIP plotting commands]
  [the output file, rip.T2.cgm, is a graphics metafile]
  You can view the cgm file using idt or ictrans

  41. 36 h forecast (2 m T: color; SLP: contours; 10 m winds: vectors)

  42. RIP script
  ===========================================================================
  feld=T2; ptyp=hc; vcor=s; levs=1fb; cint=0.5; cmth=fill;>
   arng; cbeg=283; cend=309; cosq=0,violet,12.5,blue,25,green,37.5,>
   light.green,50,white,62.5,yellow,75,orange,87.5,red,100,brown
  feld=U10,V10; ptyp=hv; vcmx=20.0; colr=black; linw=1; intv=2;
  feld=slp; ptyp=hc; vcor=s; levs=1fb; cint=4; nohl; colr=blue; linw=2; nolb
  feld=map; ptyp=hb; colr=dark.blue; linw=2;
  feld=tic; ptyp=hb
  ===========================================================================
  http://www.mmm.ucar.edu/mm5/documents/ripug_V4.html

  43. GrADS and wrf_to_grads • GrADS produces beautiful graphics • Batch scriptable AND interactive • Interactive use is good for overlaying different datasets and computing difference fields [which can also be done in RIP]; see the example below • Doesn’t create huge numbers of intermediate files like RIP • Arbitrary cross-sections are very difficult to construct
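  A hypothetical illustration of the interactive difference-field idea (it assumes a second simulation has been converted to a GrADS dataset named run2g, and that the 2 m temperature T2 was among the 2D variables selected in control_file):
  ga-> open run1g
  ga-> open run2g
  ga-> set t 7
  ga-> d t2.2 - t2.1
  [plots the second dataset minus the first at the 7th history time]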

  44. GrADS procedure • Copy control_file from /home/fovell/WRFtutorial and edit it • Select the variables desired and define the wrfout files to be accessed (next slides) • Run: > w2g control_file run1g • Creates run1g.ctl, run1g.dat http://grads.iges.org/grads/head.html

  45. control_file
  -3   ! times to put in GrADS file, negative ignores this
  0001-01-01_00:00:00
  0001-01-01_00:05:00
  0001-01-01_00:10:00
  end_of_time_list
  ! 3D variable list for GrADS file
  ! indent one space to skip
  U      ! U component of wind
  V      ! V component of wind
  UMET   ! U component of wind - rotated (diagnostic)
  VMET   ! V component of wind - rotated (diagnostic)
  W      ! W component of wind
  THETA  ! Theta
  TK     ! Temperature in K
  TC     ! Temperature in C
  List of available 2D fields follows

  46. control_file (cont.)
  ! All list of files to read here
  ! Indent not to read
  ! Full path OK
  wrfout_d01_2007-09-02_00:00:00
  wrfout_d01_2007-09-02_06:00:00
  wrfout_d01_2007-09-02_12:00:00
  wrfout_d01_2007-09-02_18:00:00
  wrfout_d01_2007-09-03_00:00:00
  wrfout_d01_2007-09-03_06:00:00
  wrfout_d01_2007-09-03_12:00:00
  end_of_file_list
  ! Now we check to see what to do with the data
  real   ! real (input/output) / ideal / static
  1      ! 0=no map background in grads, 1=map background in grads
  -1     ! specify grads vertical grid
         ! 0=cartesian,
         ! -1=interp to z from lowest h
         ! 1 list levels (either height in km, or pressure in mb)
  1000.0 950.0 900.0 850.0 800.0 750.0

  47. Running GrADS
  > gradsnc -l
  [GrADS graphics output window opens]
  ga-> open run1g
  [ga-> is the GrADS environment prompt]
  ga-> /home/fovell/WRFtutorial/T2_movie.gs
  [executes this GrADS script; hit return to advance a frame]
  ga-> quit
  [to exit]
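  For a quick manual look at a field without a script (a sketch, not from the slides; TC and the 850 mb level are taken from the control_file example above):
  ga-> open run1g
  ga-> set t 7
  ga-> set lev 850
  ga-> d tc
  [contours the 850 mb temperature in deg C at the 7th output time]
  ga-> quit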

  48. 36 h forecast (2 m T and 10 m winds)

  49. = end =
