So the default CMAQ CCTM scripts have a section that says:
#> remove existing output files?
set DISP = delete
#set DISP = update
# set DISP = keep
If you choose "delete" there, it clears out your $OUTDIR before running. This can be cool if you are fixing something that you did wrong the first time (which applies 99% of the time for me), but it causes problems if you want a run longer than 1 day (which should be 100% of the time). If you try to use the output from your last run as the initial condition, that output gets deleted before CCTM ever reads it. This is really annoying and embarrassing. It also means that...
you pretty much don't want to use the default CMAQ CCTM scripts basically ever.
Instead you want to use a custom-made one that has some if-statements and for-loops. That lets you have a spin-up period, redirect each day's output as the initial condition for the next day, and so on. A minimal sketch of that loop logic is below.
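Here's a hedged csh sketch of what that loop looks like -- the variable names (TODAY, NDAYS, INIT_CONC_1, the CGRID file name, $ICDIR, $OUTDIR) are illustrative, so match them to whatever your CCTM script actually uses:
#!/bin/csh -f
#> minimal multi-day loop sketch, NOT the actual script on mercury
set TODAY = 2005182                  #> first simulation day (YYYYDDD)
set NDAYS = 5                        #> total days, including spin-up
set N = 1
while ( $N <= $NDAYS )
    if ( $N == 1 ) then
        setenv INIT_CONC_1 $ICDIR/ICON_profile.ncf             #> day 1: real IC
    else
        setenv INIT_CONC_1 $OUTDIR/CCTM_CGRID_${YESTERDAY}.ncf #> after: yesterday's restart
    endif
    #> ... set the rest of the day's environment and run the CCTM here,
    #>     with DISP set to anything except delete ...
    set YESTERDAY = $TODAY
    set TODAY = `datshift $TODAY 1`  #> datshift (IOAPI m3tool), or your own date math
    @ N = $N + 1
end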
If you're like me, you're like "that sounds easy enough but really tedious." In which case your first plan of attack is to:
don't waste your time and just copy off someone else's
I put a copy on mercury in /Users/Luedke/cmaq/cctm/copy.this.cctm.runscript
I did my best to comment what you need to change and such, so it finally makes sense.
Have fun.
Tuesday, April 19, 2011
Monday, April 11, 2011
Remote disk image installation
With most of the software we work with, the only way to install it is by building it from source code. As everyone has at some point experienced, building from source can be difficult and frustrating.
Fortunately, some software comes as "Disk images" (you'll know because the files end in .dmg). These "disk images" are the classic, point and click kind of installers that do all the work for you. The only problem is it takes a few obscure commands to get them to work over command line, and it's not always possible/convenient to go sit down at the machine itself and install it from the GUI. Here's a step-by-step on how to install disk images over the command line.
NOTE: curly braces denote that you should input whatever is appropriate for you; the braces themselves should not actually be typed. I.e. if I read
ls {myHomeDirectory}
I would actually type
ls /Users/oberman
1) Download the file and put it in your home directory (don't worry, we won't be installing it here.)
2) Mount the disk image. The command for this is:
hdiutil attach {filename.dmg}
3) cd to the /Volumes directory. You should see your disk image as one of the volumes (don't worry if it doesn't have the exact same name as the filename)
4) cd into your disk image's volume. Note that if the name contains spaces, using tab completion avoids having to monkey around with escape sequences.
5) You should see at least one file that ends with .pkg, .mpkg, or some variant. This is the file we actually want to install. To do so, we use the following command:
sudo installer -verbose -pkg {packageFile.pkg} -target {/install/location/} >& {/log/filename}
If you're unsure what to put for {/install/location}, a good bet is almost always /usr/local/. Note that most programs install into subdirectories of the target directory (i.e. {/install/location}/bin, {/install/location}/lib, etc.).
6) Check the logfile (whatever you set /log/filename to be) for any error messages.
7) If you don't find any error messages, cd back to /Volumes
8) Dismount the installation volume. The command for this is:
hdiutil detach {/VolumeName}
And you've successfully installed your program! You can now dispose of the original file you downloaded if you wish, but I find I like having them filed away in case I need to reinstall. Hopefully this saves some searching next time you need to install software!
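For reference, here are all the steps strung together as one hedged sketch (curly braces are placeholders, same as above):
hdiutil attach {filename.dmg}
cd /Volumes/{VolumeName}            # volume name may differ from the .dmg name
sudo installer -verbose -pkg {packageFile.pkg} -target /usr/local/ >& {/log/filename}
grep -i error {/log/filename}       # step 6: scan the log for problems
cd /Volumes
hdiutil detach /Volumes/{VolumeName}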
Compiling WRF for CMAQ
So, you want meteorology files for your CMAQ run, but for some reason you need to recompile WRF. Wouldn't it be nice if the standard way of compiling WRF automatically created perfect meteorology files that could be plugged straight into MCIP? Well, it doesn't.
Here's what you are going to have to do to the WRF source code (before compiling!) in order to use WRF output as input to MCIP and, hence, CMAQ, SMOKE, or any other Models-3 product:
step 1)
Download the wrf tarball into a clean directory. If you don't have the source code, you can find it here: http://www.mmm.ucar.edu/wrf/users/download/get_source.html
step 2)
Extract the code from the tarball:
gunzip WRFV3.3.TAR.gz
tar -xf WRFV3.3.TAR
(Note: assuming you will also need to pre-process your meteorological input files, you'll probably want to download the WPS code at the same time. WPS code is available at the same site as the WRF code - make sure to get the WPS version that matches the version of WRF you will be using.)
step 3)
The default WRF compilation does not write out certain variables that are required by MCIP/CMAQ. You have to edit the registry file before you compile to get these variables to write out.
cd WRFV3/Registry    # move to the "Registry" directory in the WRF file structure
vi Registry.EM       # open "Registry.EM" in your text editor of choice
Now find the "ZNT" variable entry, and add an "h" to the eighth column. This will tell WRF to write out the roughness length to your output files. When you are done, your ZNT registry line should look like this:
state real ZNT ij misc 1 - i3rh "ZNT" "TIME-VARYING ROUGHNESS LENGTH"
You should repeat this activity for the following variables (a quick grep to locate each entry follows the list):
fractional land use (LANDUSEF)
aerodynamic resistance (RA)
stomatal resistance (RS)
vegetation fraction in the Pleim-Xiu LSM (VEGF_PX)
roughness length (ZNT)
and inverse Monin-Obukhov length (RMOL)
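Assuming each variable name appears in quotes in its registry line (like "ZNT" above), something like this should find all six entries:
grep -nE '"(LANDUSEF|RA|RS|VEGF_PX|ZNT|RMOL)"' Registry.EM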
step 4)
Once you are done with the registry, it is time to compile. You should use WRF's configure script to automatically set your compile options and flags. When doing so, be sure to choose the ifort and icc options, and, if you want to enable parallel processing, choose the "dm" (distributed memory) option. Note: on our linux systems, do NOT choose an "sm" (shared memory) option.
cd ..          # back up to the WRFV3 directory
./configure    # choose the ifort/icc "dm" option, then basic nesting
step 5) compile:
./compile em_real >& compile.log
That's it! You should now have a WRF compilation that will produce met files ready for use in MCIP.
Friday, April 8, 2011
Running CMAQ with mismatched emissions & met years
It's possible (and common) to run CMAQ with mismatched emissions and meteorology years (e.g. because the NEI is only released every 3 years). If emissions are processed in SMOKE with the same meteorology to be used in CMAQ, then you're all set. However, if you're using emissions already processed in SMOKE for a different year (e.g. Steve's 2003 CONUS files) than the meteorology year (e.g. 2005), then you need to modify the emissions files to match the timesteps in the meteorology.
To do this, DO NOT use M3EDHDR - it only changes the SDATE in the file header, not the embedded timestep array, so CCTM will not be able to read in your files. Instead, use M3TSHIFT (also an m3tool included with IOAPI).
A sample script for changing files for Dec/Jan and Jun/Jul, from 2003 to 2005 can be found here on SOx: /Users/bickford/data/conv_tools/m3tools_scripts/m3tshift_2003conus.csh
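If you'd rather start from scratch, here's a hedged sketch of an M3TSHIFT run. M3TSHIFT answers prompts, so the here-document lines below (input logical name, starting date/time, target date/time, time step, duration, output logical name) must match the prompt order of your IOAPI build -- run it interactively once to confirm. File names are placeholders:
#!/bin/csh -f
setenv INFILE  emis_2003182.ncf     # SMOKE emissions for the 2003 day
setenv OUTFILE emis_2005182.ncf     # same emissions, re-stamped to 2005
m3tshift << EOF
INFILE
2003182
000000
2005182
000000
10000
240000
OUTFILE
EOF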
Saturday, April 2, 2011
"badly formed number" error in csh scripts on linux
I have several .csh scripts that scroll through months and days to process daily files (e.g. for MCIP). When running one such script this week on nitrate, I ran into a "badly formed number" error on days 08 and 09 -- and only on those days. Turns out, this is why:
"If your script uses comparisons of numbers that begin with a 0, CSH will interpret it as an octal number. if the number contains 8 or 9, it will fail because 8 and 9 do not exist in octal. To get around this problem, you should switch to a different shell (like bash) or use a more robust scripting language, such as Perl."
from: http://www.purdue.edu/eas/info_tech/faq/faq_linux.php#csh
"If your script uses comparisons of numbers that begin with a 0, CSH will interpret it as an octal number. if the number contains 8 or 9, it will fail because 8 and 9 do not exist in octal. To get around this problem, you should switch to a different shell (like bash) or use a more robust scripting language, such as Perl."
from:http://www.purdue.edu/eas/info_tech/faq/faq_linux.php#csh
My in-the-moment solution was to hardcode separate scripts for those days, but there are several possible workarounds, including switching shells or operating systems.
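For the record, here's a tiny csh sketch of both the failure and one in-shell workaround (stripping the leading zero before doing arithmetic):
#!/bin/csh -f
set day = 08
# @ next = $day + 1                    # fails: "Badly formed number"
set day = `echo $day | sed 's/^0//'`   # strip the leading zero first
@ next = $day + 1
echo $next                             # prints 9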
Wednesday, March 23, 2011
Editing SMOKE input data text files
Editing SMOKE input data text files can be a long process that sucks and is prone to mistakes. Or it can be a medium-length process that doesn't suck and is way less prone to mistakes.
Here's how to go with that second method:
1. sftp/scp a copy of the file you wanna edit to your desktop. (Ex: get ptinv_ptipm_cap2002v2_02apr2007_v4_orl.txt)
2. import that sucker into Excel:
2a. Say it's a text file and find it on your computer. Then say it's delimited, and start the import at line 16 or wherever the data actually starts after the header. There's a little preview window so you don't mess it up.
2b. Tell it that the delimiter is a comma, and that there's no text qualifier ({none} instead of "). Again there's a preview window; make sure the columns are in the right places and there are still quotes around the text. Then just click Finish.
3. You can do all sorts of Excel stuff with the data then. For example I sorted by the longitude column and then deleted all the point sources east of SAGE (-89.415 deg long.)
4. Then go File>>Save As and choose .csv as your file format and save it someplace.
5. Then find that file on your computer and Open With TextEdit.
6. It'll open, but will have a million (actually 3) quotation marks everyplace there should be just one. So go Edit>>Find, and do a Find/Replace of 3 quotes (""") with 1 quote ("). It takes a second and then looks way better. (A command-line shortcut for this step is sketched after the list.)
7. Go to the right place on NOX or whatever server and create a new file by running vi newfile.txt
8. Press i to go into insert mode. Then do a Select All and Copy on the window showing the .csv file on your computer, and Paste into your vi window. It takes a little while but eventually it all gets there.
9. Then if you like you can copy the header from the original file at the top of this file too, so that it looks identical to your original (other than the data you changed).
So now you can do all sorts of stuff like zero-out certain pollutants, or certain sources. Or multiply emissions by a certain factor. Whatever Excel will let you do, really.
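And if you'd rather skip the TextEdit and copy-paste steps (5 through 8) entirely, here's a hedged command-line equivalent (curly braces are placeholders):
sed 's/"""/"/g' {edited.csv} > {newfile.txt}
scp {newfile.txt} {you}@{server}:{/destination/path}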
Monday, March 7, 2011
Note about running WRF for CMAQ
Looks like for WRF meteorology grids to be compatible with emissions file grids in CMAQ, the WRF grid needs to have an odd number of grid cells in both the x and y directions, so that the projection centerpoint (e.g. 40N, 97W) is the middle of the center grid cell, not the vertex where four center cells meet.
Check the lat-lons (XLAT_M, XLONG_M) from the WPS-produced geo_em.d01.nc file against the projection centerpoint or known emissions-grid lat-lons to verify that the grids match.
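Since e_we and e_sn in WPS count the staggered grid points, an even value gives an odd number of mass cells. A hypothetical namelist.wps fragment (values illustrative only):
&geogrid
 e_we    = 100,     ! 99 mass cells in x (odd)
 e_sn    = 100,     ! 99 mass cells in y (odd)
 ref_lat = 40.0,    ! so the centerpoint lands mid-cell, not on a vertex
 ref_lon = -97.0,
/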