https://wiki.seas.harvard.edu/geos-chem/api.php?action=feedcontributions&user=Salvatore+Farina&feedformat=atomGeos-chem - User contributions [en]2024-03-29T14:07:30ZUser contributionsMediaWiki 1.24.2https://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=38402TOMAS setup guide2018-08-13T23:38:14Z<p>Salvatore Farina: /* Post Processing */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ACE-NET Glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS lacks many of the recent developments in aerosol science and cannot take advantage of parallel computing.<br />
The 'bleeding edge' code, by contrast, includes these recent GEOS-Chem/TOMAS developments, parallel computing among them.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained using git<br />
<br />
git clone https://bitbucket.org/gcst/geos-chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following:<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', verify the setup:<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
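Before moving on, it can save time to sanity-check the whole environment at once. The following is a hypothetical convenience script, not part of GEOS-Chem; it only checks the variable names used in the .bashrc snippets above (GC_BIN, GC_INCLUDE, GC_LIB), so adjust it if your setup differs:<br />

```shell
#!/bin/bash
# Sanity check for the build environment described above (a sketch).
ok=1
for var in GC_BIN GC_INCLUDE GC_LIB; do
    # ${!var} is bash indirect expansion: the value of the variable
    # whose name is stored in $var.
    if [ -z "${!var:-}" ]; then
        echo "ERROR: $var is not set; re-source your .bashrc"
        ok=0
    fi
done
if ! command -v ifort >/dev/null 2>&1; then
    echo "ERROR: ifort is not on your PATH (try: module load intel/11.1.073)"
    ok=0
fi
if [ "$ok" -eq 1 ]; then
    echo "Environment looks OK: $(ifort --version | head -n 1)"
fi
```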
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory: it is many gigabytes and would probably exceed your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
Where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01 and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Bash versions ===<br />
GEOS-Chem now requires bash > v3.2 in order to compile properly. There are two ways to ensure you are using the correct version:<br />
*install your own instance of bash, and point to it in Makefile_header.mk<br />
*on glooscap, type 'module load bash' before compiling<br />
<br />
=== Compile Flags ===<br />
Choice of GEOS-Chem model resolution is now done using compile time flags. Full instructions are available [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#Compile here].<br />
<br />
Example: To build TOMAS for simulations on a global 4x5 degree grid, using geos5 meteorology, I invoke make as follows for each version:<br />
make GRID=4x5 MET=geos5 tomas12<br />
make GRID=4x5 MET=geos5 tomas15<br />
make GRID=4x5 MET=geos5 tomas<br />
make GRID=4x5 MET=geos5 tomas40<br />
<br />
Note: "make tomas" is shorthand for "make TOMAS=yes all"<br />
<br />
Note: "make tomas15" is shorthand for "make TOMAS=yes TOMAS15=yes all"<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:13, 3 March 2014 (EST)<br />
<br />
=== Make ===<br />
qrsh allows you to open multicore interactive shells for heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do, for example:<br />
cd YOUR_CODE_DIR/GeosCore<br />
pshell16<br />
make -j16 GRID=4x5 MET=geos5 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on the system.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes all<br />
make TOMAS40=yes all<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
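Since a forgotten realclean is an easy mistake, a small wrapper function can enforce it on every version switch. This is a hypothetical convenience (build_tomas is not a GEOS-Chem name); adjust the GRID/MET flags to your own setup:<br />

```shell
# Hypothetical .bashrc helper: always "make realclean" before building a
# TOMAS version, so object files from another bin count never leak into
# the new build.
build_tomas() {
    local target="${1:?usage: build_tomas <tomas|tomas12|tomas15|tomas40>}"
    make realclean && make -j16 GRID=4x5 MET=geos5 "$target"
}
```

For example, running build_tomas tomas40 from GeosCore rebuilds the 40-bin version from scratch.<br />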
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40, depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
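The extraction and copy steps can be put together as follows. This is a sketch: setup_run_dir is a hypothetical helper name, YOUR_STANDARD_LOCATION and YOUR_CODE_DIR are the placeholders used above, and the executable is assumed to be named geostomas in GeosCore:<br />

```shell
# Sketch: unpack a TOMAS run directory and drop in the matching executable.
setup_run_dir() {
    local bins="${1:?usage: setup_run_dir <12|15|30|40>}"
    tar zxvf YOUR_STANDARD_LOCATION/"${bins}".tgz &&   # creates ./run.TOMAS${bins}
    cp YOUR_CODE_DIR/GeosCore/geostomas "run.TOMAS${bins}/" &&
    cd "run.TOMAS${bins}"
}
```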
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #run the job from your current working directory<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can extract summary information from your runs using the script at<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of TOMAS for your needs involves weighing size resolution against time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio (speedup) is roughly as follows:<br />
version | speedup<br />
40 bin  |  46<br />
30 bin  |  60<br />
15 bin  | 125<br />
12 bin  | 131<br />
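These ratios translate directly into wall time: wall-clock days per simulated year is roughly 365 / speedup, e.g. 365 / 46 ≈ 7.9 days for the 40-bin version. A one-liner to tabulate the figures above:<br />

```shell
# Convert the model-time : real-time ratios above into wall-clock days
# per simulated year (wall_days = 365 / speedup).
awk 'BEGIN {
    n = split("40:46 30:60 15:125 12:131", rows, " ")
    for (i = 1; i <= n; i++) {
        split(rows[i], r, ":")
        printf "%s bin: %.1f days per simulated year\n", r[1], 365.0 / r[2]
    }
}'
```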
<br />
== Developing ==<br />
Writing for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'' that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
NB: The following section is out of date. Please contact Jeff Pierce or Peter Adams for TOMAS resources.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to standard netCDF.<br />
Edit proc_one.pro to use the correct infiles/outfiles.<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
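The steps above can be chained for several months at once. The following sketch assumes TOMAS15, that your IDL supports the -e batch flag (postprocess_months is a hypothetical helper name; the script names are those used above), and that proc_one.pro has been edited for each month:<br />

```shell
# Sketch: chain the split / netCDF / averaging / plotting steps for a
# list of zero-padded months (TOMAS15 assumed). Note that proc_one.pro
# reads its infile/outfile names from the script itself, so it still
# needs a per-month edit before this will do anything sensible.
postprocess_months() {
    local month
    for month in "$@"; do
        idl -e "Bpch_Sep_Sal,'ctm.bpch','ctm.${month}.bpch',Tau0=nymd2tau(2005${month}01)"
        idl proc_one.pro            # after editing it for ctm.${month}.bpch
        ./averageCNCCN_15.py "${month}"
        ./plotCNCCN.py "${month}"
    done
}
```

For example: postprocess_months 06 07 08<br />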
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups in different physical locations is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=User:Salvatore_Farina&diff=38401User:Salvatore Farina2018-08-13T23:32:50Z<p>Salvatore Farina: </p>
<hr />
<div>I worked with Jeff Pierce to parallelize TOMAS microphysics and to add "Simple SOA" to GEOS-Chem.</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_aerosol_microphysics&diff=16068TOMAS aerosol microphysics2014-03-04T00:23:55Z<p>Salvatore Farina: /* Other features of TOMAS */</p>
<hr />
<div>This page describes the TOMAS aerosol microphysics option in GEOS-Chem. TOMAS is one of two aerosol microphysics packages being incorporated into GEOS-Chem, the other being [[APM aerosol microphysics|APM]].<br />
<br />
== Overview ==<br />
<br />
The TwO-Moment Aerosol Sectional (TOMAS) microphysics package was developed for implementation into GEOS-Chem at Carnegie-Mellon University. Using a moving sectional and moment-based approach, TOMAS tracks two independent moments (number and mass) of the aerosol size distribution for a number of discrete size bins. It also contains code to simulate nucleation, condensation, and coagulation processes. The aerosol species that are considered with high size resolution are sulfate, sea-salt, OC, EC, and dust. An advantage of TOMAS is the full size resolution for all chemical species and the conservation of aerosol number, the latter of which allows one to construct aerosol and CCN number budgets that balance.<br />
<br />
=== Authors and collaborators ===<br />
* [mailto:petera@andrew.cmu.edu Peter Adams] ''(Carnegie-Mellon U.)'' -- Principal Investigator<br />
* [mailto:wtrivita@staffmail.ed.ac.uk Win Trivitayanurak] ''(Department of Highways, Thailand)''<br />
* [mailto:dwesterv@andrew.cmu.edu Dan Westervelt] ''(Carnegie-Mellon U.)''<br />
* [mailto:jeffrey.pierce@dal.ca Jeffrey Pierce] ''(Dalhousie U.)''<br />
* [mailto:sal.farina@gmail.com Salvatore Farina] ''(Colorado State U.)''<br />
<br />
Questions regarding TOMAS can be directed to Dan (e-mail linked above).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:53, 27 January 2010 (EST)<br />
<br />
=== TOMAS User Groups ===<br />
<br />
{| border=1 cellspacing=0 cellpadding=5<br />
|- bgcolor="#cccccc"<br />
!User Group<br />
!Personnel<br />
!Projects<br />
|-valign="top"<br />
|[http://www.ce.cmu.edu/%7Eadams/ Carnegie-Mellon University]<br />
|[http://www.ce.cmu.edu/%7Eadams/people.htm#peter Peter Adams]<br>[http://www.ce.cmu.edu/~dwesterv/Site/Home.html Dan Westervelt]<br />
| [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html New particle formation evaluation in GC-TOMAS] <br> Sensitivity of CCN to nucleation rates <br> Development of number tagging and source apportionment model for GC-TOMAS<br />
|-valign="top"<br />
|[http://fizz.phys.dal.ca/%7Epierce/ Dalhousie University] <br> [http://www.atmos.colostate.edu/faculty/pierce.php Colorado State]<br />
|[http://atm.dal.ca/Faculty/Jeffrey_Pierce.php Jeffrey Pierce]<br>Sal Farina<br>Stephen D'Andrea<br />
|Sensitivity of CCN to condensational growth rates <br> TOMAS parallelization <br> Others...<br />
|-valign="top"<br />
|Add yours here<br />
|<br />
|<br />
|}<br />
<br />
== TOMAS-specific setup ==<br />
TOMAS has its own run directories (run.Tomas) that can be downloaded from the Harvard FTP. The <tt>input.geos</tt> file looks slightly different from standard GEOS-Chem's, and varies between versions.<br />
<br />
Pre- v9.02:<br />
To turn on TOMAS, see the "Microphysics menu" in <tt>input.geos</tt> and make sure TOMAS is set to '''T'''. <br />
<br />
v9.02 and later:<br />
TOMAS is enabled or disabled at compile time - the TOMAS flag in input.geos has been removed.<br />
<br />
<br />
TOMAS is a simulation type 3 and utilizes 171-423 tracers. Each aerosol species requires 30 tracers for the 30 bin size resolution, 12 for the 12 bin, etc. Here is the (abbreviated) default setup in input.geos for TOMAS-30 in v9.02 and later (see run.Tomas directory):<br />
<br />
Tracer # Description <br />
1- 62 Std Geos Chem <br />
63 H2SO4 <br />
64- 93 Number <br />
94-123 Sulfate <br />
124-153 Sea-salt <br />
154-183 Hydrophilic EC <br />
184-213 Hydrophobic EC <br />
214-243 Hydrophilic OC <br />
244-273 Hydrophobic OC <br />
274-303 Mineral dust <br />
304-333 Aerosol water<br />
<br />
TOMAS-40 requires 423 tracers: 360 size-resolved TOMAS tracers (9 quantities x 40 bins), plus H2SO4 and the ~62 standard GEOS-Chem tracers.<br />
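The totals follow from simple arithmetic: 62 standard tracers, 1 for H2SO4, and 9 size-resolved quantities per bin (number, sulfate, sea-salt, two EC, two OC, dust, and aerosol water, as in the table above). A quick check:<br />

```shell
# Tracer count = 62 standard + 1 (H2SO4) + 9 size-resolved species per bin.
for bins in 12 15 30 40; do
    echo "TOMAS-${bins}: $((62 + 1 + 9 * bins)) tracers"
done
```

This reproduces both ends of the 171-423 range quoted above (171 for TOMAS-12, 423 for TOMAS-40), and the TOMAS-30 total of 333 matches the last tracer number in the table.<br />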
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 18:48, 8 July 2013 (EDT)<br />
<br />
== Implementation notes ==<br />
<br />
TOMAS validation in [[GEOS-Chem v8-03-01]] was completed on 24 Feb 2010.<br />
<br />
=== Update April 2013 ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02k|v9-02k]] and approved on 07 Jun 2013.'''''<br />
<br />
Sal Farina has been working with the GEOS-Chem Support Team to inline the TOMAS aerosol microphysics code into the <tt>GeosCore</tt> directory. All TOMAS-specific sections of code are now segregated from the rest of GEOS-Chem with C-preprocessor statements such as:<br />
<br />
#if defined( TOMAS )<br />
<br />
# if defined( TOMAS40 ) <br />
... Code for 40 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS12 )<br />
... Code for 12 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS15 )<br />
... Code for 15 bin TOMAS simulation (optional) goes here ...<br />
# else<br />
... Code for 30 bin TOMAS simulation (default) goes here ...<br />
# endif<br />
<br />
#endif <br />
<br />
TOMAS is now invoked by compiling GEOS-Chem with one of the following options:<br />
<br />
make -j4 TOMAS=yes ... # Compiles GEOS-Chem for the 30 bin (default) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
or<br />
<br />
make -j4 TOMAS40=yes ... # Compiles GEOS-Chem for the 40 bin (optional) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
All files in the old <tt>GeosTomas/</tt> directory have now been deleted, as these have been rendered obsolete.<br />
<br />
These updates are included in [[GEOS-Chem v9-02]]. These modifications will not affect the existing GEOS-Chem simulations, as all TOMAS code is not compiled into the executable unless you specify either <tt>TOMAS=yes</tt> or <tt>TOMAS40=yes</tt> at compile time.<br />
<br />
We are in the process of updating the wiki to reflect these changes as they are implemented. <br />
<br />
--[[User:Bmy|Bob Y.]] 13:59, 23 April 2013 (EDT)<br><br />
--[[User:Salvatore Farina|Salvatore Farina]] 13:49, 4 June 2013 (EDT)<br />
<br />
== Computational Information ==<br />
<br />
GC-TOMAS v9-02 (30 sections) on 8 processors: <br />
One year simulation = 7-8 days wall clock time<br />
<br />
Further speedups are available by using lower aerosol size resolution.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:00, 07 May 2013 (EST)<br />
<br />
GC-TOMAS v9-03 on 16 processors (glooscap)<br />
<br />
12 bin: 2.8 days wall time per sim year<br />
<br />
15 bin: 3.3 days wall time per sim year<br />
<br />
30 bin: 6.1 days wall time per sim year<br />
<br />
40 bin: 7.8 days wall time per sim year<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 15:51, 3 March 2014 (EST)<br />
<br />
== Microphysics Code==<br />
The aerosol microphysics code is largely contained within the file <tt>tomas_mod.f</tt>. Tomas_mod and its subroutines are modular -- they use their own internal variables. For details, see tomas_mod.f and its comments. <br />
<br />
=== Nucleation ===<br />
The choice of nucleation theory is selected in the header section of <tt>tomas_mod.f</tt>. The choices are currently binary homogeneous nucleation as in Vehkamaki et al. (2002) or ternary homogeneous nucleation as in Napari et al. (2002). The ternary nucleation rate is typically scaled by a globally uniform tuning factor of 10^-4 or 10^-5. Ion-mediated nucleation (Yu, 2008) and activation nucleation (Kulmala, 2006) are options as well.<br />
<br />
In TOMAS-12 and TOMAS-30, nucleated particles are grown to the smallest size bin using the Kerminen approximation. This has a tendency to overpredict the number of particles in the smallest bins of those models. See Y. H. Lee, J. R. Pierce, and P. J. Adams (2013) [http://www.geosci-model-dev-discuss.net/6/893/2013/gmdd-6-893-2013.html here] for more details on the consequences of this.<br />
<br />
=== Condensation ===<br />
<br />
=== Coagulation ===<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 14:08, 9 May 2011 (EST)<br />
<br />
== Validation ==<br />
<br />
GC-TOMAS [[GEOS-Chem v8-03-01|v8-03-01]] generally compares very well with observations and other models. Please see our [http://acmg.seas.harvard.edu/geos/wiki_docs/TOMAS/TOMAS_benchmark_ForHarvard.pdf GC-TOMAS v8-02-05 validation document] for more information and figures. <br />
<br />
Below are some results of benchmarking GC-TOMAS with earlier versions of the model as well as observations:<br />
<br />
[[Image:CN10_smaller.jpg]]<br />
<br />
'''Figure 1: CN10 concentrations predicted by GC-TOMAS v8-02-05 against observations. Descriptions of observational data can be found on p 5454 of Pierce et al, Atmos. Chem. Phys., 7, 2007.'''<br />
<br />
----<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:13, 10 February 2010 (EST)<br />
<br />
== Other features of TOMAS ==<br />
Other varieties of TOMAS are suited to specific science questions, for example nucleation studies, where explicit aerosol dynamics are needed for nanometer-sized particles. <br />
<br />
=== Set-up Guide ===<br />
<br />
This [[TOMAS setup guide]] was written for users on ACE-NET's Glooscap cluster, but may be more generally applicable.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 11:55, 26 July 2013 (EDT)<br />
<br />
=== Size Resolution ===<br />
<br />
;TOMAS-30: All 7 chemical species have size resolution ranging from 10 nm to 10 µm, spanned by 30 logarithmically spaced (mass doubling) bins.<br />
;TOMAS-40: Same as TOMAS-30 with 10 additional (mass doubling) sub-10nm bins with a lower limit ~1nm<br />
;TOMAS-12: All 7 chemical species have size resolution ranging from 10 nm to 1 µm, spanned by 10 logarithmically spaced (mass quadrupling) bins plus two supermicron bins. Coarser resolution than TOMAS-30, with improved computation time. <br />
;TOMAS-15: Same as TOMAS-12 with 3 additional (mass quadrupling) sub-10nm bins with a lower limit ~2nm. Analogous to TOMAS40 with improved computation time.<br />
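Mass doubling means the bin diameter grows by a factor of 2^(1/3) per bin, so 30 bins starting at 10 nm span a factor of 2^(30/3) = 1024, i.e. roughly 10 nm to 10 µm as stated above. A quick numerical check of the TOMAS-30 upper edge:<br />

```shell
# TOMAS-30 diameter span: 30 mass-doubling bins from a 10 nm lower edge.
# Mass doubles per bin, so diameter grows by 2^(1/3) per bin.
awk 'BEGIN {
    d = 10.0                        # lower diameter edge, nm
    for (i = 1; i <= 30; i++)
        d *= 2 ^ (1.0 / 3.0)
    printf "upper diameter edge: %.0f nm (~10 um)\n", d
}'
```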
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 12:51, 4 June 2013 (EDT)<br />
<br />
=== Nesting and grid size ===<br />
TOMAS is implemented on a 2x2.5 North American domain. Developed by Jeffrey Pierce (jeffrey.pierce@dal.ca)<br />
<br />
=== AOD, CCN post-processing code ===<br />
Code is available for calculating aerosol optical depth from TOMAS-predicted aerosol composition and size using Mie theory, and CCN concentrations from TOMAS size-resolved composition using Köhler theory. Developed by Yunha Lee and Jeffrey Pierce, adapted for GEOS-Chem output by Jeffrey Pierce.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 2:00, 9 May 2011 (EST)<br />
<br />
== References ==<br />
<br />
In this section we provide references relevant to TOMAS aerosol microphysics simulations.<br />
<br />
=== Studies using TOMAS simulations ===<br />
#'''Nucleation in GEOS-Chem:''' Westervelt, D. M., Pierce, J. R., Riipinen, I., Trivitayanurak, W., Hamed, A., Kulmala, M., Laaksonen, A., Decesari, S., and Adams, P. J.: ''Formation and growth of nucleated particles into cloud condensation nuclei: model-measurement comparison'', <u>Atmos. Chem. Phys. Discuss.</u>, '''13''', 8333-8386, doi:10.5194/acpd-13-8333-2013, 2013. [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html LINK]<br />
#'''TOMAS implementation in GEOS-Chem:''' Trivitayanurak, W., Adams, P. J., Spracklen, D. V. and Carslaw, K. S.: ''Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison'', <u>Atmos. Chem. Phys.</u>, '''8'''(12), 3149-3168, 2008.<br />
#'''TOMAS initial paper, sulfate only:''' Adams, P. J. and Seinfeld, J. H.: ''Predicting global aerosol size distributions in general circulation models'', <u>J. Geophys. Res.-Atmos.</u>, '''107'''(D19), 4370, doi:10.1029/2001JD001010, 2002.<br />
#'''TOMAS with sea-salt:''' Pierce, J.R., and Adams P.J., ''Global evaluation of CCN formation by direct emission of sea salt and growth of ultrafine sea salt'', <u>J. Geophys. Res.-Atmos.</u>, '''111''' (D6), doi:10.1029/2005JD006186, 2006.<br />
#'''TOMAS with carbonaceous aerosol:''' Pierce, J. R., Chen, K. and Adams, P. J.: ''Contribution of primary carbonaceous aerosol to cloud condensation nuclei: processes and uncertainties evaluated with a global aerosol microphysics model'', <u>Atmos. Chem. Phys.</u>, '''7'''(20), 5447-5466, doi:10.5194/acp-7-5447-2007, 2007.<br />
#'''TOMAS with dust:''' Lee, Y.H., K. Chen, and P.J. Adams, 2009: ''Development of a global model of mineral dust aerosol microphysics''. <u>Atmos. Chem. Phys.</u>, '''9''', 2441-2558, doi:10.5194/acp-9-2441-2009.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:04, 24 February 2014 (EST)<br />
<br />
=== Input data used by TOMAS ===<br />
#Usoskin, I. G. and Kovaltsov, G. A., ''Cosmic ray induced ionization in the atmosphere: Full modeling and practical applications'', <u>J. Geophys. Res.</u>, '''111''', doi:10.1029/2006JD007150, 2006.<br />
#Yu, Fangqun, et al, ''Ion-mediated nucleation in the atmosphere: Key controlling parameters, implications, and look-up table'', <u>J. Geophys. Res.</u>, '''115''', D03206, doi:10.1029/2009JD012630, 2010.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:03, 24 February 2014 (EST)<br />
<br />
== Previous issues now resolved ==<br />
<br />
=== Minor bug in TOMAS sulfate emissions ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02o|v9-02o]] and approved on 03 Sep 2013.'''''<br />
<br />
'''''[mailto:sal.farina@gmail.com Sal Farina] wrote:'''''<br />
:Calling mnfix before and after emission ensures the size distribution is well behaved, and eliminates "Negative SF emis" warnings. An edit to mnfix was also introduced: previously, situations where "tiny" mass was added to zero-mass, "epsilon"-number bins produced very high mass-per-particle values, necessitating excessive error detection, correction, and verbosity.<br />
<br />
--[[User:Melissa Payer|Melissa Sulprizio]] 15:08, 7 August 2013 (EDT)<br />
<br />
=== Segmentation Fault ===<br />
You may get an early segfault if your stacksize is not set to either unlimited or a very large number. To avoid this, you either have to change the value of an environmental variable (setenv command in <tt>.cshrc</tt>) or use the <tt>ulimit</tt> command. See [http://wiki.seas.harvard.edu/geos-chem/index.php/Machine_issues_%26_portability#Resetting_stacksize_for_Linux this page] for details.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:20, 10 February 2010 (EST)<br />
<br />
=== Updates for GEOS-Chem v9-02 public release ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: As described below, there appears to be [[#Potential parallelization problems|a potential parallelization problem with the TOMAS ND60 diagnostic]]. We are currently looking into this. This issue, however, does not affect the tracer concentrations computed by TOMAS, but only the output of the ND60 diagnostic itself. For this reason we are moving ahead with the TOMAS benchmarks for v9-02. (Bob Yantosca, 21 Feb 2014)</p>'''</div><br />
<br />
We have found and fixed several minor numerical and coding issues prior to the public release of [[GEOS-Chem v9-02]] (01 Mar 2014). The TOMAS40 simulation has been validated with the [[GEOS-Chem Unit Tester]]. Below is the [[GEOS-Chem Unit Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|output of a unit test]] that was submitted on 2014/02/21 at 12:47:26 PM:<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (680420788 and 680420788)<br />
### Checksums : IDENTICAL (179613338 and 179613338)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the subsections below, we describe in more detail the fixes that we made for [[GEOS-Chem v9-02]]:<br />
<br />
==== Fixes for minor coding errors ====<br />
<br />
#<p>In <tt>GeosCore/main.F</tt>, we have replaced <tt>CALL FLUSH()</tt> with <tt>CALL FLUSH(6)</tt>. The <tt>FLUSH</tt> routine needs to take an argument; unit #6 is the unit number for stdout (i.e. the screen and/or log file).</p><br />
#<p>In routine <tt>CHEM_SO2</tt> (in module <tt>GeosCore/sulfate_mod.F</tt>), we now avoid referencing the DST1, DST2, DST3, and DST4 dust tracers for TOMAS simulations. TOMAS uses size-resolved dust tracers, and therefore does not carry the DST1-4 tracers. This error appears to have been introduced along with the cloud pH fix in Sep 2013.</p><br />
#<p>In routine <tt>COND_NUC</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we added error traps to avoid division-by-zero errors that occurred when the variable <tt>CSCH</tt> is zero. When <tt>CSCH</tt> is zero, we now set variable <tt>ADDT</tt> to zero. When <tt>ADDT</tt> is zero, it will get reassigned to a minimum time step, so this fix should work OK.</p><br />
#<p>In <tt>GeosCore/gamap_mod.F</tt>, we have restored several entries to <tt>tracerinfo.dat</tt> for the ND44 diagnostic that were not getting properly printed out when the TOMAS simulation was being used.</p><br />
#<p>In module <tt>GeosCore/drydep_mod.F</tt>, we now set <tt>MAXDEP=105</tt> for all simulations, including TOMAS. Formerly, TOMAS used <tt>MAXDEP=100</tt>; the larger value is sufficient for all simulation types.</p><br />
#<p>In module <tt>GeosCore/diag3.F</tt>, we now avoid an out-of-bounds error in <tt>DEPNAME(N)</tt> during TOMAS simulations. We save the drydep species name from <tt>DEPNAME(N)</tt> into a new variable <tt>DRYDEP_NAME</tt> for <tt>N = 1..NUMDEP</tt>. We then set <tt>DRYDEP_NAME = ''</tt> for <tt>N > NUMDEP</tt>. This error occurs because we extend the number of drydep tracers during TOMAS simulations to account for the size bins.</p><br />
#<p>We have fixed a couple of logical errors that prevented dust emissions from happening. Minor modifications were made to IF statements in <tt>GeosCore/chemistry_mod.F</tt>, <tt>GeosCore/dust_mod.F</tt>, and <tt>GeosCore/input_mod.F</tt>.</p><br />
#<p>In file <tt>GeosCore/Makefile</tt>, make sure to add <tt>tomas_mod.o</tt> to the list of modules used by <tt>wetscav_mod.F</tt> (aka the "dependency listing"). The corrected code should look like this:</p><br />
<br />
wetscav_mod.o : wetscav_mod.F \<br />
dao_mod.o diag_mod.o \<br />
depo_mercury_mod.o get_ndep_mod.o \<br />
get_popsinfo_mod.o tracerid_mod.o \<br />
tracer_mod.o tomas_mod.o<br />
<br />
--[[User:Bmy|Bob Y.]] 10:20, 19 February 2014 (EST)<br />
<br />
==== Fixes for parallelization errors ====<br />
<br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we need to add the following variables to the <tt>!$OMP+PRIVATE</tt> statement: <tt>TRACNUM</tt>, <tt>NH3_TO_NH4</tt>, and <tt>SURF_AREA</tt>. Adding these causes TOMAS to produce identical single-processor (sp) vs. multi-processor (mp) results when chemistry and microphysics are turned on.</p><br />
#<p>In routine <tt>DEPVEL</tt> (in <tt>GeosCore/drydep_mod.F</tt>): Instead of holding <tt>A_RADI</tt> and <tt>A_DEN</tt> as <tt>!$OMP+PRIVATE</tt> in TOMAS simulations (in the main DO loop in <tt>DEPVEL</tt>), we now save the particle size and density values to private variables <tt>DIAM</tt> and <tt>DEN</tt>. We then pass those as arguments to function <tt>DUST_SFCRSII</tt>.</p> <br />
#<p>We have corrected an issue in routine <tt>NFCLDMX</tt> (in module <tt>GeosCore/convection_mod.F</tt>) that potentially impacts the TOMAS wet scavenging, as described below:</p><br />
#*<p>We think the parallel and serial results differ because of an assumption that holds for normal simulations but fails for TOMAS: that tracers are independent through wet scavenging. Since TOMAS scavenging is size-dependent, removing material from the distribution before calculating the soluble fraction of another component is "wrong." We now compute the fractions explicitly before the removal step. To do this, we now call routine <tt>COMPUTE_F</tt> in its own parallel DO loop located immediately before the main parallel DO loop in <tt>NFCLDMX</tt>.</p><br />
#*<p>This modification also required the ND37 diagnostic IF block to be put into the same loop as <tt>COMPUTE_F</tt>. Furthermore, because <tt>COMPUTE_F</tt> returns the value of diagnostic index <tt>ISOL</tt>, and because <tt>ISOL</tt> is also used for the ND38 diagnostic in the main parallel loop below, we must also save the values of <tt>ISOL</tt> in a 1-D vector. This will allow the values of ISOL to be passed from the first parallel loop to the second. This ensures that the ND37 and ND38 diagnostics will be computed properly for all GEOS-5 simulations that have soluble tracers.</p><br />
#*<p>This modification has been tested in the [[GEOS-Chem Unit Tester]] by Bob Yantosca (04 Feb 2014) and it has yielded identical results for <tt>geos5_4x5_fullchem</tt>, <tt>geos5_4x5_Hg</tt>, <tt>geos5_4x5_RnPbBe</tt>, <tt>geos5_4x5_soa</tt> and <tt>geos5_4x5_soa_svpoa</tt> simulations.</p><br />
#<p>We have made some fixes in <tt>GeosCore/wetscav_mod.F</tt> that caused single-processor TOMAS runs to have different output than multi-processor runs. A few instances of code were computing quantities sequentially and then storing them for later use. These were technically thread-safe, but were susceptible to error because the order of computation would be different when running with parallelization turned on. These sections of code have now been rewritten accordingly.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 14:09, 21 February 2014 (EST)<br />
<br />
==== Removed inefficient subroutine calls ====<br />
<br />
#<p>In <tt>GeosCore/diag3.F</tt>, we now use a 2-D array <tt>(J-L)</tt> for archiving into the ND60 TOMAS diagnostic. This eliminates an array temporary in the call to routine BPCH2.</p><br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we now use an array <tt>ERR_IND</tt> to pass the I,J,L,N indices to error checking routine <tt>CHECK_VALUE</tt>. We previously used an array constructor <tt>(/I,J,L,0/)</tt>, which caused an array temporary to be created.</p><br />
#<p>In routine <tt>EMISSCARBON</tt> (in module <tt>GeosCore/carbon_mod.F</tt>), we removed array temporaries from the calls to subroutine <tt>EMITSGC</tt>. We now sum two arrays into a temporary array, and then pass that to <tt>EMITSGC</tt>.</p><br />
#<p>We rewrote the subroutine calls to <tt>NH4BULKTOBIN</tt> to avoid the creation of array temporaries. In most cases this was done by replacing <tt>MK(1:IBINS,SRTSO4)</tt> with <tt>MK(:,SRTSO4)</tt>, etc. Explicitly stating the sub-slice <tt>MK(1:IBINS,SRTSO4)</tt> causes the compiler to create an array temporary, whereas <tt>MK(:,SRTSO4)</tt> allows a more efficient pointer slice to be passed.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 14:47, 31 January 2014 (EST)<br />
<br />
==== Fixes for convenience ====<br />
<br />
<p>We now read many of the TOMAS data files from the directory <tt>TRIM( DATA_DIR_1x1 ) // 'TOMAS_201402/'</tt>. This saves us from having to keep these big files (some of which approach 100 MB in size) in individual users' run directories.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 16:20, 31 January 2014 (EST)<br />
<br />
<p>Standard GEOS-Chem bulk dust is now unavailable in TOMAS simulations. Supporting the bulk dust option alongside TOMAS led to very confusing logical constructs, which caused neither dust scheme to function properly in a TOMAS simulation.</p><br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:01, 3 March 2014 (EST)<br />
<br />
== Outstanding issues ==<br />
<br />
=== Potential parallelization problems ===<br />
<br />
We have noticed that there may be a parallelization error in the TOMAS [http://acmg.seas.harvard.edu/geos/doc/man/appendix_5.html ND60 diagnostic]. This may be caused by a coding error; in particular, one or more variables may have been omitted from an <tt>!$OMP+PRIVATE</tt> declaration.<br />
<br />
This is illustrated by the following [[GEOS-Chem_Unit_Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|unit test simulation]] of the [[GEOS-Chem v9-01-02]] provisional release code (submitted at 2:11 PM on 21 Feb 2014):<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (707260156 and 707260156)<br />
### Checksums : DIFFERENT (895530022 and 2949483685)<br />
### Diffs : DIFFERENT<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the above test, all TOMAS diagnostics (ND59, ND60, and ND61) were turned on. The restart files (here named <tt>trac_rst.*</tt>) from the single-processor and multi-processor stages of the unit test are identical, but the <tt>ctm.bpch</tt> files (here named <tt>trac_avg.*</tt>) were different. When the restart files are identical, it means that the single-processor and multi-processor stages produced identical tracer concentrations (and soil NOx quantities). <br />
<br />
The only differences in the <tt>trac_avg.*</tt> files between the single-processor and multi-processor stages of the unit test were in TOMAS diagnostic quantities. The affected categories appear to be <tt>TMS-COND</tt>, <tt>TMS-COAG</tt>, <tt>TMS-NUCL</tt>, and <tt>AERO-FIX</tt>, all of which point to the ND60 diagnostic.<br />
<br />
In order to confirm that the ND60 diagnostic exhibits the problem, we ran an additional unit test with ND59 and ND61 turned on, but ND60 turned off. This unit test, which was submitted at 3:33PM on 21 Feb 2014, yielded identical results.<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (690218236 and 690218236)<br />
### Checksums : IDENTICAL (4196844107 and 4196844107)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
We are still looking into this issue. Because this issue only affects the ND60 diagnostic output, but not tracer concentrations, we are moving ahead with the TOMAS benchmarks for [[GEOS-Chem v9-02]] (as of 21 Feb 2014). <br />
<br />
--[[User:Bmy|Bob Y.]] 16:17, 21 February 2014 (EST)<br />
<br />
=== Offline Dust ===<br />
Currently, GEOS-Chem with TOMAS uses prescribed offline dust aerosol data in radiative transfer / photolysis calculations. Due to complications, this is turned off entirely for 2x2.5 resolution.<br />
<br />
=== Vertical Grids ===<br />
Currently, GC-TOMAS is only compatible with the reduced vertical grids:<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.3.1 GEOS3_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.4.1 GEOS4_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.5.1 GEOS5_47L]<br />
<br />
Development for the full vertical grids is ongoing.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:43, 10 February 2010 (EST)<br />
<br />
== Obsolete versions of TOMAS ==<br />
<br />
In this section we preserve information that pertained to older versions of TOMAS (before the [[GEOS-Chem v9-02]] release).<br />
<br />
=== Code structure ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
The main-level <tt>Code</tt> directory has now been divided into several subdirectories:<br />
<br />
GeosCore/ GEOS-Chem "core" routines<br />
GeosTomas/ Parallel copies of GEOS-Chem routines that reference TOMAS<br />
 GeosUtil/ "Utility" modules (e.g. error_mod.f, file_mod.f, time_mod.f, etc.)<br />
Headers/ Header files (define.h, CMN_SIZE, CMN_DIAG, etc.)<br />
KPP/ KPP solver directory structure<br />
bin/ Directory where executables are placed<br />
doc/ Directory where documentation is created<br />
help/ Directory for GEOS-Chem Help Screen<br />
lib/ Directory where library files are placed<br />
mod/ Directory where module files are placed<br />
obsolete/ Directory where obsolete versions of code are archived<br />
<br />
Because there were a lot of TOMAS-related modifications in several GEOS-Chem "core" routines, the routines that need to "talk" to TOMAS were placed into a separate subdirectory named <tt>GeosTomas/</tt>. The files in <tt>GeosTomas</tt> are:<br />
<br />
Files:<br />
------<br />
Makefile -- GEOS-Chem routines that have been<br />
aero_drydep.f modified to reference the TOMAS aerosol<br />
carbon_mod.f microphysics package. These are kept<br />
chemdr.f in a separate GeosTomas directory so that<br />
chemistry_mod.f they do not interfere with the routines<br />
cleanup.f in the GeosCore directory.<br />
diag3.f<br />
diag_mod.f The GeosTomas directory only needs to<br />
diag_pl_mod.f contain the files that have been modified<br />
drydep_mod.f for TOMAS. The Makefile will look for<br />
dust_mod.f all other files from the GeosCore directory<br />
emissions_mod.f using the VPATH option in GNU Make.<br />
gamap_mod.f<br />
initialize.f NOTE to GEOS-Chem developers: When you<br />
input_mod.f make changes to any of these routines<br />
isoropia_mod.f in the GeosCore directory, you must also<br />
logical_mod.f make the same modifications to the<br />
ndxx_setup.f corresponding routines in the GeosTomas<br />
planeflight_mod.f directory.<br />
seasalt_mod.f<br />
sulfate_mod.f Maybe in the near future we can work<br />
tomas_mod.f towards integrating TOMAS into the GeosCore<br />
tomas_tpcore_mod.f90 directory more cleanly. However, due to<br />
tpcore_mod.f the large number of modifications that were<br />
tpcore_window_mod.f necessary for TOMAS, it was quicker to<br />
tracerid_mod.f implement the TOMAS code in a separate<br />
wetscav_mod.f subdirectory. <br />
xtra_read_mod.f -- Bob Y. (1/25/10)<br />
<br />
Each of these files was merged with the corresponding file in the <tt>GeosCore</tt> subdirectory. Therefore, in addition to having the GEOS-Chem modifications from [[GEOS-Chem v8-02-05|v8-02-05]], these files also contain the relevant TOMAS references.<br />
<br />
A few technical considerations dictated the placing of these files into a separate <tt>GeosTomas/</tt> directory:<br />
<br />
* The ND60 diagnostic in the standard GEOS-Chem code (in <tt>GeosCore/</tt>) is now used for the CH4 offline simulation, but in TOMAS it's used for something else. <br />
* Some parameters needed to be declared differently for simulations with TOMAS. <br />
* Because not all GEOS-Chem users will choose to use TOMAS, we did not want to unnecessarily bog down the code in <tt>GeosCore/</tt> with references to TOMAS-specific routines. <br />
<br />
All of these concerns could be best solved by keeping parallel copies of the affected routines in the <tt>GeosTomas</tt> directory.<br />
<br />
--[[User:Bmy|Bob Y.]] 13:35, 25 February 2010 (EST)<br />
<br />
=== Building GEOS-Chem with TOMAS ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 25 Feb 2014)</p>'''</div><br />
<br />
The <tt>VPATH</tt> feature of [http://www.gnu.org/software/make/manual/make.html GNU Make] is used to simplify the compilation. When GEOS-Chem is compiled with the tomas target, the GNU Make utility will search for files in the <tt>GeosTomas/</tt> directory first. If it cannot find a file there, it will then search the <tt>GeosCore/</tt> directory. Thus, if we make a change to a "core" GEOS-Chem routine in the <tt>GeosCore/</tt> subdirectory (say in <tt>dao_mod.f</tt> or <tt>diag49_mod.f</tt>), those changes will automatically be applied when you build GEOS-Chem with TOMAS. Therefore, <tt>GeosTomas/</tt> only needs to contain separate copies of those files that have to "talk" with TOMAS.<br />
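As a minimal illustration of this <tt>VPATH</tt> fallback (a hypothetical toy example, not actual GEOS-Chem code): make runs in a "GeosTomas" directory but finds a prerequisite that only exists in the sibling "GeosCore" directory.<br />

```shell
# Build a toy GeosTomas/GeosCore pair in a temporary directory.
demo=$(mktemp -d)
mkdir -p "$demo/GeosCore" "$demo/GeosTomas"
echo 'core copy' > "$demo/GeosCore/shared.f"
# The Makefile lives in GeosTomas but declares VPATH = ../GeosCore,
# so "shared.f" is resolved from GeosCore ($< expands to the found path):
printf 'VPATH := ../GeosCore\nshow: shared.f\n\tcat $<\n' > "$demo/GeosTomas/Makefile"
make --silent -C "$demo/GeosTomas" show
```

Running this prints <tt>core copy</tt>, showing that the prerequisite was taken from GeosCore even though make ran in GeosTomas.<br />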
<br />
Several new targets were added to the <tt>Makefile</tt> in the top-level <tt>Code/</tt> directory:<br />
<br />
#=============================================================================<br />
# Targets for TOMAS aerosol microphysics code (win, bmy, 1/25/10)<br />
#=============================================================================<br />
<br />
.PHONY: tomas libtomas exetomas cleantomas<br />
<br />
tomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes all<br />
<br />
libtomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes lib<br />
<br />
exetomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes exe<br />
<br />
cleantomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes clean<br />
<br />
You can build GEOS-Chem with the TOMAS option by typing:<br />
<br />
make tomas ...<br />
<br />
This will automatically do the proper things to build the TOMAS code into GEOS-Chem, such as:<br />
<br />
* Adding a <tt>-DTOMAS</tt> C-preprocessor switch to the <tt>FFLAGS</tt> compiler flag settings in <tt>Makefile_header.mk</tt>. This will cause TOMAS-specific areas of code to be turned on.<br />
* Turning off OpenMP parallelization. For now the GEOS-Chem + TOMAS code needs to be run on a single processor. We continue to work on parallelizing the code.<br />
* Calling the Makefile in the <tt>GeosTomas/</tt> subdirectory to build the executable. The executable file is now named <tt>geostomas</tt> in order to denote that the TOMAS code is built in.<br />
<br />
GEOS-Chem + TOMAS has been built with the following compilers:<br />
<br />
* Intel Fortran compiler v10<br />
* Intel Fortran compiler v11.1 (20101201)<br />
* SunStudio 12<br />
<br />
--[[User:Bmy|Bob Y.]] 10:36, 27 January 2010 (EST)<br />
<br />
=== Compile from GeosTomas directory ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:I think there is something going wrong in my compilation, although errors have come up at both compile time and run time. The worst of the problems is this: I'll make a change to any fortran file in the code (even something meaningless like print*, 'foo') and hundreds of compile errors come out with fishy error messages such as (from ifort v10.1):<br />
<br />
***fortcom: Error: chemistry_mod.f, line 478: A kind type parameter must be a compile-time constant. [DP]<br />
REAL(kind=dp) :: RCNTRL(20)<br />
<br />
:Any advice? The errors I'm having are not unique to any version of GC, any type of met fields, any compiler, etc.<br />
<br />
'''''[mailto:yantosca@seas.harvard.edu Bob Yantosca] wrote:'''''<br />
<br />
:Make sure you are always in the GeosTomas subdirectory when you build the code. Sometimes there is a problem if you build the code from a higher level directory. This may have to do with the VPATH in the makefile.<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:Thanks, that seems to do the trick.<br />
<br />
--[[User:Bmy|Bob Y.]] 14:37, 14 April 2010 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=16061TOMAS setup guide2014-03-03T21:27:52Z<p>Salvatore Farina: /* Other Advice / Issues */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained from Bob Yantosca using git<br />
<br />
git clone git://git.as.harvard.edu/bmy/GEOS-Chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
Where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01 and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Bash versions ===<br />
GEOS-Chem now requires bash newer than v3.2 in order to compile properly. There are two ways to ensure you are using the correct version:<br />
*install your own instance of bash, and point to it in Makefile_header.mk<br />
*on glooscap, type 'module load bash' before compiling<br />
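To verify which bash you are actually getting, a quick check (<tt>module load bash</tt> above applies on glooscap only):<br />

```shell
# Print the major.minor version of the bash that will run your builds;
# it needs to be newer than 3.2:
bash -c 'echo "${BASH_VERSINFO[0]}.${BASH_VERSINFO[1]}"'
```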
<br />
=== Compile Flags ===<br />
Choice of GEOS-Chem model resolution is now done using compile time flags. Full instructions are available [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#Compile here].<br />
<br />
Example: To build TOMAS for simulations on a global 4x5 degree grid, using geos5 meteorology, I invoke make as follows for each version:<br />
make GRID=4x5 MET=geos5 tomas12<br />
make GRID=4x5 MET=geos5 tomas15<br />
make GRID=4x5 MET=geos5 tomas<br />
make GRID=4x5 MET=geos5 tomas40<br />
<br />
Note: "make tomas" is shorthand for "make TOMAS=yes all"<br />
<br />
Note: "make tomas15" is shorthand for "make TOMAS=yes TOMAS15=yes all"<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:13, 3 March 2014 (EST)<br />
<br />
=== Make ===<br />
qrsh allows you to use multicore interactive shells to do heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do, for example:<br />
cd YOUR_CODE_DIR/GeosCore<br />
pshell16<br />
make -j16 GRID=4x5 MET=geos5 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on the system.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to select a TOMAS version when compiling:<br />
make TOMAS=yes all<br />
make TOMAS40=yes all<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
 #$ -S /bin/bash<br />
 . /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
 cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of TOMAS for your needs involves weighing size resolution against time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin | 46<br />
30 bin | 60<br />
15 bin | 125<br />
12 bin | 131<br />
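As a back-of-the-envelope check, you can turn these ratios into wall-clock estimates; a minimal sketch, using the 40-bin value from the table above:<br />

```shell
# Rough wall-clock estimate for a one-year simulation, given a
# model-time : real-time ratio from the table above.
days_simulated=365
speedup=46        # e.g. 40-bin TOMAS at 4x5 on 16 cores
awk -v d="$days_simulated" -v s="$speedup" \
    'BEGIN { printf "~%.1f days of wall-clock time\n", d / s }'
# prints "~7.9 days of wall-clock time"
```

Keep the h_rt request in parallel.sh comfortably above this estimate, or the queue will kill the job before it finishes.<br />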
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
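The whole patch-and-merge cycle above can be rehearsed safely in a throwaway repository before touching your real code. A sketch under stated assumptions: it substitutes a direct commit on ''tomasmerge'' for ''git apply update.patch'', and the file name and commit messages are made up for the example, but the branch/merge mechanics are the same:<br />

```shell
# Throwaway rehearsal of the branch/merge workflow; runs in a
# temporary directory, so it cannot touch your real repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email you@example.com
git config user.name "Example User"
echo "line1" > fictional_example_mod.F90
git add fictional_example_mod.F90
git commit -q -m "initial commit"
git checkout -q -b tomasmerge            # stands in for my branch
echo "line2" >> fictional_example_mod.F90
git commit -q -am "update (stands in for applying update.patch)"
git checkout -q -b MY_BRANCH HEAD~1      # your branch, from the old base
git merge -q tomasmerge                  # pull the update into your branch
grep -q line2 fictional_example_mod.F90 && echo "merge ok"
```

If the merge reports conflicts in your real repository, resolve them by hand, ''git add'' the files, and commit.<br />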
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows:<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in routine called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to clear before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your post-processing directory for a given run:<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
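A minimal shell sketch of this copy step; the directory names are placeholders for your own layout, and the TOMAS15 averaging script stands in for whichever version you ran:<br />

```shell
# Stage post-processing inputs for one run. RUN_DIR and POST_DIR
# are placeholders for your own directories; files that are not
# present are silently skipped.
RUN_DIR=run.TOMAS15
POST_DIR=post.TOMAS15
mkdir -p "$POST_DIR"
for f in ctm.bpch diaginfo.dat tracerinfo.dat proc_one.pro \
         averageCNCCN_15.py plotCNCCN.py; do
    if [ -f "$RUN_DIR/$f" ]; then
        cp "$RUN_DIR/$f" "$POST_DIR/"
    fi
done
```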
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
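If you need several months, the interactive session gets tedious. One hedged alternative is to generate an IDL batch file with a shell loop and run it non-interactively, the same way proc_one.pro is run below; ''split_months.pro'' is a hypothetical file name:<br />

```shell
# Write one Bpch_Sep_Sal call per month of 2005 into an IDL batch
# file ("split_months.pro" is a hypothetical name).
: > split_months.pro
for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
    echo "Bpch_Sep_Sal,'ctm.bpch','ctm.$m.bpch',Tau0=nymd2tau(2005${m}01)" \
        >> split_months.pro
done
echo "exit" >> split_months.pro
# then run it from your shell:
#   idl split_months.pro
```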
<br />
==== Create netCDF output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in the standard netCDF format.<br />
Edit proc_one.pro to use the correct infile/outfile names.<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version.<br />
For example, to bin and average the August results from TOMAS15:<br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the netCDF output to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash), and I suggest you do the same. csh is fine, but I have written all of my scripts in bash, so your life will probably be easier if you use it.<br />
* If you are trying to run GEOS-Chem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups in different physical locations is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=16059TOMAS setup guide2014-03-03T21:19:52Z<p>Salvatore Farina: /* Bash versions */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained from Bob Yantosca using git<br />
<br />
git clone git://git.as.harvard.edu/bmy/GEOS-Chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
Where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01 and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Bash versions ===<br />
GEOS-Chem now requires bash > v3.2 in order to compile properly. There are two ways to ensure you are using the correct version:<br />
*install your own instance of bash, and point to it in Makefile_header.mk<br />
*on glooscap, type 'module load bash' before compiling<br />
<br />
=== Compile Flags ===<br />
Choice of GEOS-Chem model resolution is now done using compile time flags. Full instructions are available [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#Compile here].<br />
<br />
Example: To build TOMAS for simulations on a global 4x5 degree grid, using geos5 meteorology, I invoke make as follows for each version:<br />
make GRID=4x5 MET=geos5 tomas12<br />
make GRID=4x5 MET=geos5 tomas15<br />
make GRID=4x5 MET=geos5 tomas<br />
make GRID=4x5 MET=geos5 tomas40<br />
<br />
Note: "make tomas" is shorthand for "make TOMAS=yes all"<br />
<br />
Note: "make tomas15" is shorthand for "make TOMAS=yes TOMAS15=yes all"<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:13, 3 March 2014 (EST)<br />
<br />
=== Make ===<br />
qrsh allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do, for example:<br />
cd YOUR_CODE_DIR/GeosCore<br />
pshell16<br />
make -j16 GRID=4x5 MET=geos5 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on the system.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes all<br />
make TOMAS40=yes all<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for GEOS-Chem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of TOMAS for your needs involves weighing size resolution against time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin  | 64<br />
30 bin  | 82<br />
15 bin  | 144<br />
12 bin  | 170<br />
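As a rough planning aid, the ratios above convert directly into wall-clock estimates; for example, at a speedup of 82, one model year takes about 365/82 ≈ 4.5 days of wall-clock time. A small sketch of the arithmetic (the ratios are the approximate values from the table):<br />

```shell
# Wall-clock days needed per simulated model year, from the
# model time : real time ratios quoted in the table above.
for entry in "40:64" "30:82" "15:144" "12:170"; do
  bins=${entry%%:*}    # TOMAS bin count
  ratio=${entry##*:}   # model time : real time speedup
  awk -v b="$bins" -v r="$ratio" \
      'BEGIN { printf "%s bin: %.1f wall-clock days per model year\n", b, 365.0 / r }'
done
```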
<br />
== Developing ==<br />
Writing for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
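Going the other direction — producing ''update.patch'' from your own branch so that someone else can apply it — is the standard git workflow. A sketch (the function name and branch arguments are illustrative, not part of any existing tooling):<br />

```shell
# Write update.patch containing every change on SRC that is not yet on DST.
# The three-dot syntax diffs SRC against the common ancestor of the two branches.
make_update_patch() {
  src=$1   # e.g. tomasmerge
  dst=$2   # e.g. MY_BRANCH
  git diff "${dst}...${src}" > update.patch &&
  git apply --stat update.patch   # summarize what the patch touches
}

# usage: make_update_patch tomasmerge MY_BRANCH
```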
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to clear before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct input/output files.<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
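The whole month-by-month pipeline above can be chained in one driver. A hedged sketch (file and script names are those from the Copy step; the ''idl -e'' single-statement invocation is an assumption — check that your IDL version supports it):<br />

```shell
# Run the Split / netCDF / CN-CCN / plot steps for one month.
postproc() {
  month=$1     # two-digit month, e.g. 08
  version=$2   # TOMAS version, matching the averageCNCCN_XX.py suffix

  # refuse to run unless the inputs from the Copy step are present
  for f in ctm.bpch diaginfo.dat tracerinfo.dat proc_one.pro; do
    if [ ! -f "$f" ]; then echo "missing $f"; return 1; fi
  done

  # split out the month, convert to netCDF (edit proc_one.pro first),
  # then bin/average and plot
  idl -e "Bpch_Sep_Sal,'ctm.bpch','ctm.${month}.bpch',Tau0=nymd2tau(2005${month}01)" &&
  idl proc_one.pro &&
  ./averageCNCCN_${version}.py "$month" &&
  ./plotCNCCN.py "$month"
}

# usage: postproc 08 15
```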
<br />
==== NCview ====<br />
You can also use ncview on the output netCDF files to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to ongoing development of grid-independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=16057TOMAS setup guide2014-03-03T21:19:01Z<p>Salvatore Farina: /* Bash versions */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Fortran compiler (ifort), v11.1.<br />
There is an instance of the compiler installed on glooscap, which you can load with:<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained from Bob Yantosca using git<br />
<br />
git clone git://git.as.harvard.edu/bmy/GEOS-Chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', reload your shell configuration and check the compiler version:<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
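Before building, it is also worth confirming that the library variables point at real directories. A minimal sanity check (assumes bash; the variable names are the ones from the .bashrc fragment above):<br />

```shell
# Report whether each GEOS-Chem library variable is set and points
# at an existing directory.
for var in GC_BIN GC_INCLUDE GC_LIB; do
  dir=${!var-}                      # indirect expansion: value of $GC_BIN, etc.
  if [ -z "$dir" ]; then
    echo "$var is not set"
  elif [ ! -d "$dir" ]; then
    echo "$var=$dir is not a directory"
  else
    echo "$var=$dir ok"
  fi
done
```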
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
Where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01 and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Bash versions ===<br />
GEOS-Chem now requires bash > v3.2 in order to compile properly. There are two ways to ensure you are using the correct version:<br />
* install your own instance of bash, and point to it in Makefile_header.mk, '''or'''<br />
* on glooscap, type ''module load bash'' before compiling<br />
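Either way, you can confirm that the shell you are compiling under meets the requirement. A minimal check using bash's own version array:<br />

```shell
# Check that the current shell meets the bash > 3.2 requirement.
if [ "${BASH_VERSINFO[0]}" -gt 3 ] ||
   { [ "${BASH_VERSINFO[0]}" -eq 3 ] && [ "${BASH_VERSINFO[1]}" -ge 2 ]; }; then
  echo "bash $BASH_VERSION is new enough"
else
  echo "bash $BASH_VERSION is too old; run 'module load bash' first"
fi
```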
<br />
=== Compile Flags ===<br />
Choice of GEOS-Chem model resolution is now done using compile-time flags. Full instructions are available [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#Compile here].<br />
<br />
Example: To build TOMAS for simulations on a global 4x5 degree grid, using geos5 meteorology, I invoke make as follows for each version:<br />
make GRID=4x5 MET=geos5 tomas12<br />
make GRID=4x5 MET=geos5 tomas15<br />
make GRID=4x5 MET=geos5 tomas<br />
make GRID=4x5 MET=geos5 tomas40<br />
<br />
Note: "make tomas" is shorthand for "make TOMAS=yes all"<br />
<br />
Note: "make tomas15" is shorthand for "make TOMAS=yes TOMAS15=yes all"<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:13, 3 March 2014 (EST)<br />
<br />
=== Make ===<br />
qrsh allows you to use multicore interactive shells to do heavy processing. I invoke a 16-core shell to build geoschem. Put this in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do, for example:<br />
cd YOUR_CODE_DIR/GeosCore<br />
pshell16<br />
make -j16 GRID=4x5 MET=geos5 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on the system.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a TOMAS version when compiling:<br />
make TOMAS=yes all<br />
make TOMAS40=yes all<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
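One way to make the realclean step automatic is a small wrapper in your .bashrc. A sketch (the function name is hypothetical; the target names are those listed above, and the GRID/MET values are the 4x5 geos5 example used earlier):<br />

```shell
# Rebuild a TOMAS variant from scratch; stale object files from a
# different bin count will otherwise poison the build.
rebuild_tomas() {
  case "$1" in
    30)       target="tomas"   ;;   # plain 'tomas' is the 30-bin build
    12|15|40) target="tomas$1" ;;
    *) echo "usage: rebuild_tomas {12|15|30|40}" >&2; return 1 ;;
  esac
  make realclean && make GRID=4x5 MET=geos5 "$target"
}
```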
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40, depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can extract summary information from your runs using the script at<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
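A cautious variant of the same recipe is to dry-run the patch with ''git apply --check'' before touching your branch. The following is a self-contained sketch on a throwaway repository (file and patch names are illustrative):<br />

```shell
# Set up a throwaway repo with one committed file.
cd "$(mktemp -d)"
git init -q demo && cd demo
echo "x = 1" > fictional_example_mod.F90
git add . && git -c user.email=you@example.com -c user.name=you commit -qm init

# Manufacture a patch, then restore the working tree to its committed state.
sed -i 's/x = 1/x = 2/' fictional_example_mod.F90
git diff > ../update.patch
git checkout -q -- fictional_example_mod.F90

# Dry-run first; apply only if the check passes.
git apply --check ../update.patch && echo "patch applies cleanly"
git apply ../update.patch
grep -n "x = 2" fictional_example_mod.F90
```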
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more steps before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc (exporting the variables so IDL can see them):<br />
export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- where XX is the TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
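If you need all twelve months, the interactive call can be scripted from the shell instead. This assumes IDL's ''-e'' one-liner mode is available and that Bpch_Sep_Sal and nymd2tau are on your IDL path; DRYRUN=echo just prints the commands, so clear it to actually run them.<br />

```shell
# Split ctm.bpch into one file per month of 2005 (sketch; leave DRYRUN=echo
# to preview the IDL invocations without launching IDL).
DRYRUN=${DRYRUN:-echo}
for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
  $DRYRUN idl -e "Bpch_Sep_Sal,'ctm.bpch','ctm.${m}.bpch',Tau0=nymd2tau(2005${m}01)"
done
```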
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct input and output files.<br />
Execute proc_one.pro from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your run directory name to match the format YYY_run.TOMASXX, where YYY is a run number and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize the map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
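The averaging and plotting steps can likewise be looped over every month. As above, this is a sketch: DRYRUN=echo previews the commands without running them, and the _15 suffix is just an example — substitute the script matching your TOMAS version.<br />

```shell
# Bin/average and plot all twelve months (preview mode by default).
DRYRUN=${DRYRUN:-echo}
for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
  $DRYRUN ./averageCNCCN_15.py "$m"   # use the script for your TOMAS version
  $DRYRUN ./plotCNCCN.py "$m"
done
```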
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne-Again SHell (bash) and suggest you do the same. csh is fine, but all of my scripts are written in bash, so your life will probably be easier if you use it.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or Skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=16053TOMAS setup guide2014-03-03T21:16:40Z<p>Salvatore Farina: /* Make */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained from Bob Yantosca using git<br />
<br />
git clone git://git.as.harvard.edu/bmy/GEOS-Chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
Where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01 and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compile Flags ===<br />
Choice of GEOS-Chem model resolution is now done using compile time flags. Full instructions are available [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#Compile here].<br />
<br />
Example: To build TOMAS for simulations on a global 4x5 degree grid, using geos5 meteorology, I invoke make as follows for each version:<br />
make GRID=4x5 MET=geos5 tomas12<br />
make GRID=4x5 MET=geos5 tomas15<br />
make GRID=4x5 MET=geos5 tomas<br />
make GRID=4x5 MET=geos5 tomas40<br />
<br />
Note: "make tomas" is shorthand for "make TOMAS=yes all"<br />
<br />
Note: "make tomas15" os shorthand for "make TOMAS=yes TOMAS15=yes all"<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:13, 3 March 2014 (EST)<br />
<br />
=== Make ===<br />
qrsh allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do, for example"<br />
cd YOUR_CODE_DIR/GeosCore<br />
pshell16<br />
make -j16 GRID=4x5 MET=geos5 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on the system.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes all<br />
make TOMAS40=yes all<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and ordinary print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'' that prints the values of the TOMAS size bins in a large table.<br />
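DEBUGPRINT itself is Fortran inside the model, but the shape of its output is easy to sketch. Purely as an illustration, assuming mass-doubling bins (the starting diameter and bin count below are placeholders, not the model's actual bin edges), such a table can be generated like this:<br />

```python
# Illustrative sketch of a DEBUGPRINT-style size-bin table.
# Assumes mass-doubling bins, so dry diameter grows by 2**(1/3) per bin.
# The lower edge of the first bin and the bin count are placeholders.
d_low_nm = 10.0   # assumed lower edge of the first bin (nm)
n_bins = 15       # e.g. TOMAS15

print(f"{'bin':>3} {'d_low (nm)':>12} {'d_high (nm)':>12}")
d = d_low_nm
for k in range(1, n_bins + 1):
    d_next = d * 2.0 ** (1.0 / 3.0)  # doubling mass multiplies diameter by 2**(1/3)
    print(f"{k:>3} {d:12.2f} {d_next:12.2f}")
    d = d_next
```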
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to clear before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc (note the ''export''; without it, IDL will not see these variables)<br />
export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data as binary punch (.bpch) files. These files must be processed using IDL. The workflow is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
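The Tau0 value is GEOS-Chem's time coordinate: hours elapsed since 0 GMT on 1 January 1985. If you ever need the same conversion outside of IDL, a minimal Python equivalent of nymd2tau is:<br />

```python
from datetime import datetime

def nymd2tau(nymd: int) -> float:
    """Hours since 1985-01-01 00:00 GMT (GEOS-Chem 'tau') for a YYYYMMDD date."""
    d = datetime.strptime(str(nymd), "%Y%m%d")
    return (d - datetime(1985, 1, 1)).total_seconds() / 3600.0

print(nymd2tau(20050801))  # 180408.0, the Tau0 used above for August 2005
```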
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it as standard netCDF.<br />
Edit proc_one.pro to use the correct input and output files, then execute it from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
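Conceptually, a CN quantity such as CN10 is just the total number concentration summed over all bins larger than the cutoff diameter (here 10 nm). A minimal sketch of that binning step (the bin diameters and concentrations are invented for illustration; they are not what averageCNCCN_XX.py actually reads):<br />

```python
# Sum number concentrations over all bins at or above a cutoff diameter.
# Bin midpoint diameters (nm) and concentrations (cm^-3) below are
# invented placeholder values, not model output.
bin_diam_nm = [3.8, 6.0, 9.5, 15.1, 24.0, 38.1, 60.4, 95.9]
bin_conc = [120.0, 300.0, 450.0, 500.0, 380.0, 210.0, 90.0, 25.0]

def cn_above(cutoff_nm, diams, concs):
    """Total number concentration in bins with diameter >= cutoff."""
    return sum(n for d, n in zip(diams, concs) if d >= cutoff_nm)

print(cn_above(10.0, bin_diam_nm, bin_conc))  # CN10 -> 1205.0
print(cn_above(40.0, bin_diam_nm, bin_conc))  # CN40 -> 115.0
```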
<br />
==== Plotting the Results====<br />
Rename your run directory to the format YYY_run.TOMASXX, where YYY is a run number and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have zonal and surface-level maps of CN3, CN10, CN40, and CN80 as predicted by the model.<br />
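The naming convention above is what makes automatic version detection possible; here is a sketch of how such a script might parse it (the exact parsing inside plotCNCCN.py may differ):<br />

```python
import re

def tomas_version(dirname: str) -> int:
    """Extract the TOMAS bin count from a 'YYY_run.TOMASXX' directory name."""
    m = re.search(r"run\.TOMAS(\d+)$", dirname)
    if m is None:
        raise ValueError(f"not a run.TOMASXX directory name: {dirname!r}")
    return int(m.group(1))

print(tomas_version("003_run.TOMAS15"))  # 15
```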
<br />
==== NCview ====<br />
You can also use ncview on the output netCDF files to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' line and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to the ongoing development of grid-independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash), and I suggest you do the same. csh is fine, but I have written all of my scripts in bash, so your life will probably be easier if you use bash too.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained from Bob Yantosca using git<br />
<br />
git clone git://git.as.harvard.edu/bmy/GEOS-Chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
Where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01 and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compile Flags ===<br />
Choice of GEOS-Chem model resolution is now done using compile time flags. Full instructions are available [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#Compile here].<br />
<br />
Example: To build TOMAS for simulations on a global 4x5 degree grid, using geos5 meteorology, I invoke make as follows for each version:<br />
make GRID=4x5 MET=geos5 tomas12<br />
make GRID=4x5 MET=geos5 tomas15<br />
make GRID=4x5 MET=geos5 tomas<br />
make GRID=4x5 MET=geos5 tomas40<br />
<br />
Note: "make tomas" is shorthand for "make TOMAS=yes all"<br />
<br />
Note: "make tomas15" os shorthand for "make TOMAS=yes TOMAS15=yes all"<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:13, 3 March 2014 (EST)<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc (note the ''export''s, so that IDL can see the variables):<br />
export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate monthly files.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
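The ''Tau0'' argument is a GEOS-Chem timestamp in hours since the model epoch. As a rough illustration of what ''nymd2tau'' computes (a Python sketch, assuming the standard GEOS-Chem epoch of 1985-01-01 00:00 GMT):<br />

```python
from datetime import datetime

# GEOS-Chem "tau" timestamps count hours since 1985-01-01 00:00 GMT
# (assumption: this is the epoch the GAMAP nymd2tau routine uses).
EPOCH = datetime(1985, 1, 1)

def nymd2tau(nymd: int) -> int:
    """Convert a YYYYMMDD integer date to tau (hours since the epoch)."""
    date = datetime.strptime(str(nymd), "%Y%m%d")
    return int((date - EPOCH).total_seconds() // 3600)

print(nymd2tau(20050801))
```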
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to a standard netCDF file.<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
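The automatic version detection mentioned above amounts to pattern-matching the directory name. A minimal Python sketch (the regex and function name are illustrative assumptions based on the ''YYY_run.TOMASXX'' convention described here):<br />

```python
import re

def tomas_version(dirname: str) -> int:
    """Extract the TOMAS bin count from a 'YYY_run.TOMASXX' directory name."""
    m = re.search(r"run\.TOMAS(\d+)$", dirname)
    if m is None:
        raise ValueError(f"not a run.TOMASXX directory: {dirname}")
    return int(m.group(1))

print(tomas_version("001_run.TOMAS15"))
```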
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F. This is a hard-to-track bug related to the ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. csh is fine, but I have written all of my scripts using bash, so your life will probably be easier if you use it too.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=16050TOMAS setup guide2014-03-03T21:14:37Z<p>Salvatore Farina: /* Compile Flags */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained from Bob Yantosca using git<br />
<br />
git clone git://git.as.harvard.edu/bmy/GEOS-Chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
Where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01 and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
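Since the restart files start "empty" on 2005/06/01, the spin-up for a given analysis period is just the gap between that date and your analysis start. A trivial Python sketch:<br />

```python
from datetime import date

RESTART_START = date(2005, 6, 1)  # date of the "empty" restart file noted above

def spinup_days(analysis_start: date) -> int:
    """Days of spin-up between the empty restart date and the analysis start."""
    return (analysis_start - RESTART_START).days

# e.g. starting analysis on 2006-06-01 gives one year of spin-up
print(spinup_days(date(2006, 6, 1)))
```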
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compile Flags ===<br />
Choice of GEOS-Chem model resolution is now done using compile time flags. Full instructions are available [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#Compile here].<br />
<br />
Example: To build TOMAS for simulations on a global 4x5 degree grid, using geos5 meteorology, I invoke make as follows:<br />
make GRID=4x5 MET=geos5 tomas12<br />
make GRID=4x5 MET=geos5 tomas15<br />
make GRID=4x5 MET=geos5 tomas<br />
make GRID=4x5 MET=geos5 tomas40<br />
<br />
Note: "make tomas" is shorthand for "make TOMAS=yes all"<br />
<br />
Note: "make tomas15" os shorthand for "make TOMAS=yes TOMAS15=yes all"<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:13, 3 March 2014 (EST)<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <-- TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for GEOS-Chem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
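When editing the script, keep the ''-pe openmp'' request and ''OMP_NUM_THREADS'' in sync, or the job will use fewer (or contend for more) cores than grid engine granted. A small Python check (a hypothetical helper, assuming the directive format shown above):<br />

```python
import re

def threads_consistent(script: str) -> bool:
    """Check that '#$ -pe openmp N' matches 'export OMP_NUM_THREADS=N'."""
    pe  = re.search(r"#\$ -pe openmp (\d+)", script)
    omp = re.search(r"OMP_NUM_THREADS=(\d+)", script)
    return bool(pe and omp) and pe.group(1) == omp.group(1)

script = "#$ -pe openmp 16\nexport OMP_NUM_THREADS=16\n"
print(threads_consistent(script))
```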
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | ratio<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
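These ratios convert directly into wall-clock estimates: divide the simulated span by the ratio for the bin count you plan to use. A quick Python sketch using the numbers above:<br />

```python
# model-time : real-time ratios from the table above (16 cores, 4x5 resolution)
RATIOS = {40: 64, 30: 82, 15: 144, 12: 170}

def wallclock_days(model_days: float, bins: int) -> float:
    """Estimate real days needed to simulate `model_days` with a given bin count."""
    return model_days / RATIOS[bins]

# one model year with 40-bin TOMAS takes roughly 5.7 real days
print(round(wallclock_days(365, 40), 2))
```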
<br />
== Developing ==<br />
Writing for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc (note the ''export''s, so that IDL can see the variables):<br />
export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate monthly files.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to a standard netCDF file.<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F. This is a hard-to-track bug related to the ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. csh is fine, but I have written all of my scripts using bash, so your life will probably be easier if you use it too.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=16049TOMAS setup guide2014-03-03T21:13:11Z<p>Salvatore Farina: /* Compile Flags */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained from Bob Yantosca using git<br />
<br />
git clone git://git.as.harvard.edu/bmy/GEOS-Chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
Where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01 and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compile Flags ===<br />
Choice of GEOS-Chem model resolution is now done using compile time flags. Full instructions are available [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#Compile here].<br />
<br />
Example: To build TOMAS15 for simulations on a global 4x5 degree grid, using geos5 meteorology, I invoke make as follows:<br />
make GRID=4x5 MET=geos5 tomas15<br />
<br />
Note that "make tomas15" is shorthand for "make TOMAS=yes TOMAS15=yes all"<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:13, 3 March 2014 (EST)<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <-- TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
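If you prefer to script the extraction step, the same thing can be done with Python's tarfile module (a minimal sketch; the function name is illustrative):<br />

```python
import tarfile

def extract_run_dir(tarball: str, dest: str = ".") -> None:
    """Extract a run-directory tarball (e.g. 40.tgz) into `dest`."""
    with tarfile.open(tarball, "r:gz") as tf:
        tf.extractall(dest)
```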
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for GEOS-Chem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | ratio<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc (note the ''export''s, so that IDL can see the variables):<br />
export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
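A small helper can stage these files in one step. This is just a sketch: the function name and both directory arguments are placeholders for your own layout, and only the three data files are copied (stage the .pro/.py scripts the same way).<br />

```shell
# Stage the post-processing inputs listed above into a per-run directory.
# copy_post_inputs and its arguments are placeholders -- adapt to your layout.
copy_post_inputs() {  # usage: copy_post_inputs RUN_DIR POST_DIR
  local run_dir=$1 post_dir=$2 f
  mkdir -p "$post_dir"
  for f in ctm.bpch diaginfo.dat tracerinfo.dat; do
    cp "$run_dir/$f" "$post_dir/"
  done
}
```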
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
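To split a full year, the same call can be generated for each month and run as an IDL batch file. This is a sketch: it assumes 2005 data and that Bpch_Sep_Sal and nymd2tau are on your IDL path, and the name split_all.pro is arbitrary.<br />

```shell
# Generate one Bpch_Sep_Sal call per month of 2005 into a batch file.
# Assumes Bpch_Sep_Sal and nymd2tau are on your IDL path.
for month in $(seq -w 1 12); do
  printf "Bpch_Sep_Sal,'ctm.bpch','ctm.%s.bpch',Tau0=nymd2tau(2005%s01)\n" \
    "$month" "$month"
done > split_all.pro
```

Then run the batch file from your shell with ''idl split_all.pro''.<br />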
<br />
==== Create netcdf output ====<br />
The IDL script proc_one.pro extracts information from the monthly .bpch files and saves it in standard netCDF format.<br />
Edit proc_one.pro to use the correct input and output file names.<br />
Execute proc_one.pro from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where ''XX'' is the TOMAS version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
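The version detection relies only on the directory naming convention above; in shell terms it amounts to stripping text around ''.TOMAS''. This is a sketch of the convention, not plotCNCCN.py's actual parsing code, and the directory name shown is hypothetical.<br />

```shell
# Recover the run number and TOMAS version from a directory name like
# 001_run.TOMAS15 (illustrates the naming convention only).
dir="001_run.TOMAS15"
run=${dir%%_*}           # text before the first underscore
version=${dir##*.TOMAS}  # text after ".TOMAS"
echo "run $run uses TOMAS$version"
```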
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the output netCDF files to view individual species concentrations or nucleation rates:<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' line and the following ''call flush'' line in main.F. This is a hard-to-track bug related to ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=16048TOMAS setup guide2014-03-03T21:08:01Z<p>Salvatore Farina: /* define.h */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Fortran compiler (ifort), v11.1.<br />
There is an instance of the compiler installed on glooscap, which you can load with<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained from Bob Yantosca using git<br />
<br />
git clone git://git.as.harvard.edu/bmy/GEOS-Chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', run<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
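Before compiling, it can save a failed build to sanity-check the environment first. A sketch (bash; the function name is hypothetical, the variable names come from the section above):<br />

```shell
# Pre-build sanity check: each GEOS-Chem variable should point at an
# existing directory, and ifort should be on the PATH.
check_gc_env() {
  local status=0 var
  for var in GC_BIN GC_INCLUDE GC_LIB; do
    if [ -d "${!var:-}" ]; then
      echo "$var OK: ${!var}"
    else
      echo "$var is unset or not a directory" >&2
      status=1
    fi
  done
  command -v ifort >/dev/null || { echo "ifort not on PATH" >&2; status=1; }
  return $status
}
```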
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01, and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compile Flags ===<br />
Choice of GEOS-Chem model resolution is now done using compile time flags. Follow the instructions [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#3.4 here].<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
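One way to make the habit automatic is a tiny wrapper that pairs the two steps. A sketch: the function name is hypothetical, and ''-j16'' should match your core count.<br />

```shell
# Always realclean before building a (possibly different) TOMAS version,
# so stale object files from the previous bin count cannot leak in.
rebuild_tomas() {  # usage: rebuild_tomas tomas40
  make realclean && make -j16 "$1"
}
```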
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40, depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for GEOS-Chem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
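Note that the core request (''#$ -pe openmp 16'') and the thread count (OMP_NUM_THREADS) are set in two separate places and can drift apart after edits, leaving cores idle or oversubscribed. A small consistency check (a sketch; not part of the standard run directory):<br />

```shell
# Check that the grid-engine core request in a job script matches the
# OpenMP thread count it exports.
check_thread_count() {  # usage: check_thread_count parallel.sh
  local pe omp
  pe=$(awk '$2 == "-pe" && $3 == "openmp" { print $4 }' "$1")
  omp=$(grep -o 'OMP_NUM_THREADS=[0-9]*' "$1" | cut -d= -f2)
  if [ -n "$pe" ] && [ "$pe" = "$omp" ]; then
    echo "OK: $pe cores requested, $omp threads"
  else
    echo "Mismatch: -pe openmp '$pe' vs OMP_NUM_THREADS='$omp'" >&2
    return 1
  fi
}
```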
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can extract summary information from your runs using this script:<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | model time : real time<br />
40 bin  |  64<br />
30 bin  |  82<br />
15 bin  | 144<br />
12 bin  | 170<br />
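These ratios translate directly into wall-clock time: a one-year simulation takes roughly 365 / ratio days of real time. A sketch of the arithmetic:<br />

```shell
# Estimate wall-clock days for a 365-model-day run from the
# model:real time ratios listed above.
for entry in 40:64 30:82 15:144 12:170; do
  bins=${entry%%:*} ratio=${entry##*:}
  awk -v b="$bins" -v r="$ratio" \
    'BEGIN { printf "TOMAS%s: %.1f wall-clock days per model year\n", b, 365 / r }'
done
```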
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a large table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to clear before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
The IDL script proc_one.pro extracts information from the monthly .bpch files and saves it in standard netCDF format.<br />
Edit proc_one.pro to use the correct input and output file names.<br />
Execute proc_one.pro from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where ''XX'' is the TOMAS version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the output netCDF files to view individual species concentrations or nucleation rates:<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' line and the following ''call flush'' line in main.F. This is a hard-to-track bug related to ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=16047TOMAS setup guide2014-03-03T21:06:56Z<p>Salvatore Farina: /* Code */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
The latest stable version of TOMAS will be included with the next public release. Currently, the latest code can be obtained from Bob Yantosca using git<br />
<br />
git clone git://git.as.harvard.edu/bmy/GEOS-Chem<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
export LD_LIBRARY_PATH=$GC_LIB:$LD_LIBRARY_PATH<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
Where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01 and spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== define.h ===<br />
Choice of GEOS-Chem model resolution is done in Headers/define.h . Follow the instructions [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#3.4 here].<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to ctm.bpch into separate months<br />
For example, to extract august, 2005 from ctm.bpch<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to the standard netCDF<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for august:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F. This is a hard to track bug related to ongoing development of grid independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>
Salvatore Farina
https://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_aerosol_microphysics&diff=16046
TOMAS aerosol microphysics 2014-03-03T21:02:06Z
<p>Salvatore Farina: /* Fixes for convenience */</p>
<hr />
<div>This page describes the TOMAS aerosol microphysics option in GEOS-Chem. TOMAS is one of two aerosol microphysics packages being incorporated into GEOS-Chem, the other being [[APM aerosol microphysics|APM]].<br />
<br />
== Overview ==<br />
<br />
The TwO-Moment Aerosol Sectional (TOMAS) microphysics package was developed for implementation into GEOS-Chem at Carnegie-Mellon University. Using a moving sectional, moment-based approach, TOMAS tracks two independent moments (number and mass) of the aerosol size distribution in each of a number of discrete size bins, and it contains code to simulate nucleation, condensation, and coagulation. The aerosol species treated with full size resolution are sulfate, sea-salt, OC, EC, and dust. Advantages of TOMAS are its full size resolution for all chemical species and its conservation of aerosol number, the latter of which allows one to construct aerosol and CCN number budgets that balance.<br />
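<br />
A toy illustration of the two-moment bookkeeping described above (illustrative Python, not TOMAS code): each bin carries independent number and mass moments, so the mean particle mass can vary within a bin, and a process such as coagulation can remove number while conserving mass exactly:<br />

```python
# toy two-moment bins: a number moment (particles/cm3) and a mass moment
# (arbitrary mass units/cm3) tracked independently for each size bin
number = [100.0, 50.0, 10.0]
mass   = [0.5, 1.0, 4.0]

# because both moments are tracked, mean particle mass may vary within a bin
mean_mass = [m / n for m, n in zip(mass, number)]

# a toy coagulation event in bin 0: two particles merge into one larger
# particle that stays in the bin -- number drops by 1, mass is conserved
number[0] -= 1.0

total_number = sum(number)   # 159.0 (one particle fewer)
total_mass   = sum(mass)     # 5.5 (unchanged)
```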
<br />
=== Authors and collaborators ===<br />
* [mailto:petera@andrew.cmu.edu Peter Adams] ''(Carnegie-Mellon U.)'' -- Principal Investigator<br />
* [mailto:wtrivita@staffmail.ed.ac.uk Win Trivitayanurak] ''(Department of Highways, Thailand)''<br />
* [mailto:dwesterv@andrew.cmu.edu Dan Westervelt] ''(Carnegie-Mellon U.)''<br />
* [mailto:jeffrey.pierce@dal.ca Jeffrey Pierce] ''(Dalhousie U.)''<br />
* [mailto:sal.farina@gmail.com Salvatore Farina] ''(Colorado State U.)''<br />
<br />
Questions regarding TOMAS can be directed to Dan (e-mail linked above).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:53, 27 January 2010 (EST)<br />
<br />
=== TOMAS User Groups ===<br />
<br />
{| border=1 cellspacing=0 cellpadding=5<br />
|- bgcolor="#cccccc"<br />
!User Group<br />
!Personnel<br />
!Projects<br />
|-valign="top"<br />
|[http://www.ce.cmu.edu/%7Eadams/ Carnegie-Mellon University]<br />
|[http://www.ce.cmu.edu/%7Eadams/people.htm#peter Peter Adams]<br>[http://www.ce.cmu.edu/~dwesterv/Site/Home.html Dan Westervelt]<br />
| [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html New particle formation evaluation in GC-TOMAS] <br> Sensitivity of CCN to nucleation rates <br> Development of number tagging and source apportionment model for GC-TOMAS<br />
|-valign="top"<br />
|[http://fizz.phys.dal.ca/%7Epierce/ Dalhousie University] <br> [http://www.atmos.colostate.edu/faculty/pierce.php Colorado State]<br />
|[http://atm.dal.ca/Faculty/Jeffrey_Pierce.php Jeffrey Pierce]<br>Sal Farina<br>Stephen D'Andrea<br />
|Sensitivity of CCN to condensational growth rates <br> TOMAS parallelization <br> Others...<br />
|-valign="top"<br />
|Add yours here<br />
|<br />
|<br />
|}<br />
<br />
== TOMAS-specific setup ==<br />
TOMAS has its own run directories (run.Tomas) that can be downloaded from the Harvard FTP. The <tt>input.geos</tt> file will look slightly different from that of standard GEOS-Chem, and also differs between TOMAS versions.<br />
<br />
Pre-v9.02:<br />
To turn on TOMAS, see the "Microphysics menu" in <tt>input.geos</tt> and make sure TOMAS is set to '''T'''. <br />
<br />
v9.02 and later:<br />
TOMAS is enabled or disabled at compile time; the TOMAS flag in <tt>input.geos</tt> has been removed.<br />
<br />
<br />
TOMAS is simulation type 3 and uses 171-423 tracers, depending on the size resolution. Each size-resolved aerosol quantity requires one tracer per size bin (30 tracers at the 30-bin resolution, 12 at the 12-bin resolution, etc.). Here is the (abbreviated) default setup in <tt>input.geos</tt> for TOMAS-30 in v9.02 and later (see the run.Tomas directory):<br />
<br />
Tracer # Description <br />
1- 62 Std Geos Chem <br />
63 H2SO4 <br />
64- 93 Number <br />
94-123 Sulfate <br />
124-153 Sea-salt <br />
154-183 Hydrophilic EC <br />
184-213 Hydrophobic EC <br />
214-243 Hydrophilic OC <br />
244-273 Hydrophobic OC <br />
274-303 Mineral dust <br />
304-333 Aerosol water<br />
<br />
TOMAS-40 requires 423 tracers: 360 size-resolved tracers (40 bins for each of the 9 quantities listed above), plus H2SO4 and the 62 standard GEOS-Chem tracers. <br />
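<br />
The tracer counts above follow from a simple formula: 62 standard tracers, plus H2SO4, plus one tracer per size bin for each of the 9 size-resolved quantities in the table (number, sulfate, sea-salt, two EC and two OC classes, dust, and aerosol water). A quick sanity check in Python:<br />

```python
def tomas_tracer_count(nbins: int, n_std: int = 62, n_species: int = 9) -> int:
    # standard tracers + H2SO4 + (number, sulfate, sea-salt, EC x2, OC x2,
    # dust, aerosol water) for each size bin
    return n_std + 1 + n_species * nbins

for nbins in (12, 15, 30, 40):
    print(nbins, tomas_tracer_count(nbins))
# 12 -> 171, 15 -> 198, 30 -> 333, 40 -> 423
```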
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 18:48, 8 July 2013 (EDT)<br />
<br />
== Implementation notes ==<br />
<br />
TOMAS validation in [[GEOS-Chem v8-03-01]] was completed on 24 Feb 2010.<br />
<br />
=== Update April 2013 ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02k|v9-02k]] and approved on 07 Jun 2013.'''''<br />
<br />
Sal Farina has been working with the GEOS-Chem Support Team to inline the TOMAS aerosol microphysics code into the <tt>GeosCore</tt> directory. All TOMAS-specific sections of code are now segregated from the rest of GEOS-Chem with C-preprocessor statements such as:<br />
<br />
#if defined( TOMAS )<br />
<br />
# if defined( TOMAS40 ) <br />
... Code for 40 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS12 )<br />
... Code for 12 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS15 )<br />
... Code for 15 bin TOMAS simulation (optional) goes here ...<br />
# else<br />
... Code for 30 bin TOMAS simulation (default) goes here ...<br />
# endif<br />
<br />
#endif <br />
<br />
TOMAS is now invoked by compiling GEOS-Chem with one of the following options:<br />
<br />
make -j4 TOMAS=yes ... # Compiles GEOS-Chem for the 30 bin (default) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
or<br />
<br />
make -j4 TOMAS40=yes ... # Compiles GEOS-Chem for the 40 bin (optional) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
All files in the old <tt>GeosTomas/</tt> directory have now been deleted, as these have been rendered obsolete.<br />
<br />
These updates are included in [[GEOS-Chem v9-02]]. These modifications will not affect existing GEOS-Chem simulations, as the TOMAS code is not compiled into the executable unless you specify either <tt>TOMAS=yes</tt> or <tt>TOMAS40=yes</tt> at compile time.<br />
<br />
We are in the process of updating the wiki to reflect these changes as they are implemented. <br />
<br />
--[[User:Bmy|Bob Y.]] 13:59, 23 April 2013 (EDT)<br><br />
--[[User:Salvatore Farina|Salvatore Farina]] 13:49, 4 June 2013 (EDT)<br />
<br />
== Computational Information ==<br />
<br />
GC-TOMAS v9-02 (30 sections) on 8 processors: <br />
One year simulation = 7-8 days wall clock time<br />
<br />
Further speedup is available by using a lower aerosol size resolution.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:00, 07 May 2013 (EST)<br />
<br />
GC-TOMAS v9-03 on 16 processors (glooscap)<br />
<br />
12 bin: 2.8 days wall time per sim year<br />
<br />
15 bin: 3.3 days wall time per sim year<br />
<br />
30 bin: 6.1 days wall time per sim year<br />
<br />
40 bin: 7.8 days wall time per sim year<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 15:51, 3 March 2014 (EST)<br />
<br />
== Microphysics Code==<br />
The aerosol microphysics code is largely contained within the file <tt>tomas_mod.f</tt>. <tt>tomas_mod</tt> and its subroutines are modular -- they use their own internal variables. For details, see <tt>tomas_mod.f</tt> and the comments therein. <br />
<br />
=== Nucleation ===<br />
The choice of nucleation theory is selected in the header section of <tt>tomas_mod.f</tt>. The primary choices are binary homogeneous nucleation as in Vehkamaki et al. (2002) or ternary homogeneous nucleation as in Napari et al. (2002). The ternary nucleation rate is typically scaled by a globally uniform tuning factor of 10^-4 or 10^-5. Ion-mediated nucleation (Yu, 2008) and activation nucleation (Kulmala, 2006) are options as well.<br />
<br />
In TOMAS-12 and TOMAS-30, nucleated particles follow the Kerminen approximation to grow to the smallest size bin. This has a tendency to overpredict the number of particles in the smallest bins of those models. See Y. H. Lee, J. R. Pierce, and P. J. Adams 2013 [http://www.geosci-model-dev-discuss.net/6/893/2013/gmdd-6-893-2013.html here] for more details on the consequences of this.<br />
<br />
=== Condensation ===<br />
<br />
=== Coagulation ===<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 14:08, 9 May 2011 (EST)<br />
<br />
== Validation ==<br />
<br />
GC-TOMAS [[GEOS-Chem v8-03-01|v8-03-01]] generally compares very well with observations and other models. Please see our [http://acmg.seas.harvard.edu/geos/wiki_docs/TOMAS/TOMAS_benchmark_ForHarvard.pdf GC-TOMAS v8-02-05 validation document] for more information and figures. <br />
<br />
Below are some results of benchmarking GC-TOMAS with earlier versions of the model as well as observations:<br />
<br />
[[Image:CN10_smaller.jpg]]<br />
<br />
'''Figure 1: CN10 concentrations predicted by GC-TOMAS v8-02-05 against observations. Descriptions of observational data can be found on p 5454 of Pierce et al, Atmos. Chem. Phys., 7, 2007.'''<br />
<br />
----<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:13, 10 February 2010 (EST)<br />
<br />
== Other features of TOMAS ==<br />
Other varieties of TOMAS are suited for specific science questions, for example with nucleation studies where explicit aerosol dynamics are needed for nanometer-sized particles. <br />
<br />
=== Set-up Guide ===<br />
<br />
This [[TOMAS setup guide]] was written for users on ACE-NET's Glooscap cluster, but may be more generally applicable.<br />
Please contact [mailto:sal.farina@gmail.com Salvatore Farina] for help in obtaining the latest development version of GEOS-Chem with TOMAS.<br />
This will allow you to take advantage of parallel computation in TOMAS.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 11:55, 26 July 2013 (EDT)<br />
<br />
=== Size Resolution ===<br />
<br />
;TOMAS-30: All 7 chemical species have size resolution ranging from 10 nm to 10 µm, spanned by 30 logarithmically spaced (mass doubling) bins.<br />
;TOMAS-40: Same as TOMAS-30 with 10 additional (mass doubling) sub-10nm bins with a lower limit ~1nm<br />
;TOMAS-12: All 7 chemical species have size resolution ranging from 10 nm to 1 µm, spanned by 10 logarithmically spaced (mass quadrupling) bins, plus two supermicron bins. Coarser resolution than TOMAS-30, with improved computation time. <br />
;TOMAS-15: Same as TOMAS-12 with 3 additional (mass quadrupling) sub-10nm bins with a lower limit of ~2nm. Analogous to TOMAS-40, with improved computation time.<br />
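<br />
These size ranges follow from the bin spacing: doubling the mass multiplies the diameter by 2^(1/3), so 30 mass-doubling bins span a factor of 2^10 = 1024 in diameter, and similar arithmetic gives the other limits. A quick check of the quoted limits in Python:<br />

```python
# mass-doubling bins: diameter ratio per bin = 2**(1/3)
# mass-quadrupling bins: diameter ratio per bin = 4**(1/3)
d0 = 10.0  # nm, lower edge of the smallest "standard" bin

top30 = d0 * 2 ** (30 / 3)   # TOMAS-30 upper edge: 10240 nm ~ 10 um
low40 = d0 / 2 ** (10 / 3)   # TOMAS-40 lower limit: ~0.99 nm ~ 1 nm
top12 = d0 * 4 ** (10 / 3)   # top of the 10 quadrupling bins: ~1016 nm ~ 1 um
low15 = d0 / 4 ** (3 / 3)    # TOMAS-15 lower limit: 2.5 nm ~ 2 nm
print(round(top30), round(low40, 2), round(top12), round(low15, 1))
# -> 10240 0.99 1016 2.5
```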
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 12:51, 4 June 2013 (EDT)<br />
<br />
=== Nesting and grid size ===<br />
A nested TOMAS simulation is implemented on a 2x2.5 North American domain. Developed by Jeffrey Pierce (jeffrey.pierce@dal.ca).<br />
<br />
=== AOD, CCN post-processing code ===<br />
Code is available for calculating aerosol optical depth from TOMAS-predicted aerosol composition and size using Mie theory, and for calculating CCN concentrations from TOMAS size-resolved composition using Kohler theory. Developed by Yunha Lee and Jeffrey Pierce; adapted for GEOS-Chem output by Jeffrey Pierce.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 2:00, 9 May 2011 (EST)<br />
<br />
== References ==<br />
<br />
In this section we provide references relevant to TOMAS aerosol microphysics simulations.<br />
<br />
=== Studies using TOMAS simulations ===<br />
#'''Nucleation in GEOS-Chem:''' Westervelt, D. M., Pierce, J. R., Riipinen, I., Trivitayanurak, W., Hamed, A., Kulmala, M., Laaksonen, A., Decesari, S., and Adams, P. J.: ''Formation and growth of nucleated particles into cloud condensation nuclei: model-measurement comparison'', <u>Atmos. Chem. Phys. Discuss.</u>, '''13''', 8333-8386, doi:10.5194/acpd-13-8333-2013, 2013. [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html LINK]<br />
#'''TOMAS implementation in GEOS-Chem:''' Trivitayanurak, W., Adams, P. J., Spracklen, D. V. and Carslaw, K. S.: ''Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison'', <u>Atmos. Chem. Phys.</u>, '''8'''(12), 3149-3168, 2008.<br />
#'''TOMAS initial paper, sulfate only:''' Adams, P. J. and Seinfeld, J. H.: ''Predicting global aerosol size distributions in general circulation models'', <u>J. Geophys. Res.-Atmos.</u>, '''107'''(D19), 4370, doi:10.1029/2001JD001010, 2002.<br />
#'''TOMAS with sea-salt:''' Pierce, J.R., and Adams P.J., ''Global evaluation of CCN formation by direct emission of sea salt and growth of ultrafine sea salt'', <u>J. Geophys. Res.-Atmos.</u>, '''111''' (D6), doi:10.1029/2005JD006186, 2006.<br />
#'''TOMAS with carbonaceous aerosol:''' Pierce, J. R., Chen, K. and Adams, P. J.: ''Contribution of primary carbonaceous aerosol to cloud condensation nuclei: processes and uncertainties evaluated with a global aerosol microphysics model'', <u>Atmos. Chem. Phys.</u>, '''7'''(20), 5447-5466, doi:10.5194/acp-7-5447-2007, 2007.<br />
#'''TOMAS with dust:''' Lee, Y.H., K. Chen, and P.J. Adams, 2009: ''Development of a global model of mineral dust aerosol microphysics''. <u>Atmos. Chem. Phys.</u>, '''9''', 2441-2458, doi:10.5194/acp-9-2441-2009.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:04, 24 February 2014 (EST)<br />
<br />
=== Input data used by TOMAS ===<br />
#Usoskin, I. G. and Kovaltsov, G. A., ''Cosmic ray induced ionization in the atmosphere: Full modeling and practical applications'', <u>J. Geophys. Res.</u>, '''111''', doi:10.1029/2006JD007150, 2006.<br />
#Yu, Fangqun, et al, ''Ion-mediated nucleation in the atmosphere: Key controlling parameters, implications, and look-up table'', <u>J. Geophys. Res.</u>, '''115''', D03206, doi:10.1029/2009JD012630, 2010.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:03, 24 February 2014 (EST)<br />
<br />
== Previous issues now resolved ==<br />
<br />
=== Minor bug in TOMAS sulfate emissions ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02o|v9-02o]] and approved on 03 Sep 2013.'''''<br />
<br />
'''''[mailto:sal.farina@gmail.com Sal Farina] wrote:'''''<br />
:Calling mnfix before and after emission ensures the size distribution is well behaved and eliminates "Negative SF emis" warnings. An edit to mnfix was also introduced: previously, adding a "tiny" mass to a bin with zero mass but "epsilon" number produced very high mass-per-particle values, necessitating excessive error detection, correction, and verbosity.<br />
<br />
--[[User:Melissa Payer|Melissa Sulprizio]] 15:08, 7 August 2013 (EDT)<br />
<br />
=== Segmentation Fault ===<br />
You may get an early segfault if your stacksize is not set to either unlimited or a very large number. To avoid this, either reset the stacksize limit in your shell startup file (e.g. with the <tt>limit</tt> command in <tt>.cshrc</tt>) or use the <tt>ulimit</tt> command. See [http://wiki.seas.harvard.edu/geos-chem/index.php/Machine_issues_%26_portability#Resetting_stacksize_for_Linux this page] for details.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:20, 10 February 2010 (EST)<br />
<br />
=== Updates for GEOS-Chem v9-02 public release ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: As described below, there appears to be [[#Potential parallelization problems|a potential parallelization problem with the TOMAS ND60 diagnostic]]. We are currently looking into this. This issue, however, does not affect the tracer concentrations computed by TOMAS, but only the output of the ND60 diagnostic itself. For this reason we are moving ahead with the TOMAS benchmarks for v9-02. (Bob Yantosca, 21 Feb 2014)</p>'''</div><br />
<br />
We have found and fixed several minor numerical and coding issues prior to the public release of [[GEOS-Chem v9-02]] (01 Mar 2014). The TOMAS40 simulation has been validated with the [[GEOS-Chem Unit Tester]]. Below is the [[GEOS-Chem Unit Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|output of a unit test]] that was submitted on 2014/02/21 at 12:47:26 PM:<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (680420788 and 680420788)<br />
### Checksums : IDENTICAL (179613338 and 179613338)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the subsections below, we describe in more detail the fixes that we made for [[GEOS-Chem v9-02]]:<br />
<br />
==== Fixes for minor coding errors ====<br />
<br />
#<p>In <tt>GeosCore/main.F</tt>, we replaced <tt>CALL FLUSH()</tt> with <tt>CALL FLUSH(6)</tt>. The <tt>FLUSH</tt> routine needs to take an argument; unit #6 is stdout (i.e. the screen and/or log file).</p><br />
#<p>In routine <tt>CHEM_SO2</tt> (in module <tt>GeosCore/sulfate_mod.F</tt>), we now avoid referencing the dust tracers DST1, DST2, DST3, and DST4 in TOMAS simulations. TOMAS uses size-resolved dust tracers and therefore does not carry DST1-DST4. This error appears to have been introduced along with the fix for cloud pH in Sep 2013.</p><br />
#<p>In routine <tt>COND_NUC</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we added error traps to avoid division-by-zero errors that occurred when the variable <tt>CSCH</tt> is zero. When <tt>CSCH</tt> is zero, we now set variable <tt>ADDT</tt> to zero. When <tt>ADDT</tt> is zero, it will get reassigned to a minimum time step, so this fix should work OK.</p><br />
#<p>In <tt>GeosCore/gamap_mod.F</tt>, we have restored several entries to <tt>tracerinfo.dat</tt> for the ND44 diagnostic that were not getting properly printed out when the TOMAS simulation was being used.</p><br />
#<p>In module <tt>GeosCore/drydep_mod.F</tt>, we now set <tt>MAXDEP=105</tt> for all simulations, including TOMAS. Formerly, TOMAS had <tt>MAXDEP=100</tt>; the larger value is sufficient for all cases.</p><br />
#<p>In module <tt>GeosCore/diag3.F</tt>, we now avoid an out-of-bounds error in <tt>DEPNAME(N)</tt> during TOMAS simulations. We save the drydep species name from <tt>DEPNAME(N)</tt> into a new variable <tt>DRYDEP_NAME</tt> for <tt>N = 1..NUMDEP</tt>. We then set <tt>DRYDEP_NAME = ''</tt> for <tt>N > NUMDEP</tt>. This error occurs because we extend the number of drydep tracers during TOMAS simulations to account for the size bins.</p><br />
#<p>We have fixed a couple of logical errors that prevented dust emissions from happening. Minor modifications were made to IF statements in <tt>GeosCore/chemistry_mod.F</tt>, <tt>GeosCore/dust_mod.F</tt>, and <tt>GeosCore/input_mod.F</tt>.</p><br />
#<p>In file <tt>GeosCore/Makefile</tt>, make sure to add <tt>tomas_mod.o</tt> to the list of modules used by <tt>wetscav_mod.F</tt> (aka the "dependency listing"). The corrected code should look like this:</p><br />
<br />
wetscav_mod.o : wetscav_mod.F \<br />
dao_mod.o diag_mod.o \<br />
depo_mercury_mod.o get_ndep_mod.o \<br />
get_popsinfo_mod.o tracerid_mod.o \<br />
tracer_mod.o tomas_mod.o<br />
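<br />
The <tt>COND_NUC</tt> fix in item 3 above is an instance of a simple guard pattern. A hypothetical sketch in Python (variable names echo the description above; the actual Fortran differs):<br />

```python
def safe_addt(mass_to_add, csch, min_step=1e-3):
    """Guard against division by zero: when the condensation sink CSCH is
    zero, set ADDT to zero; a zero ADDT is then reassigned to a minimum
    time step, as in the fix described above. min_step is a placeholder."""
    if csch == 0.0:
        addt = 0.0
    else:
        addt = mass_to_add / csch
    return addt if addt > 0.0 else min_step

print(safe_addt(2.0, 4.0))  # normal case -> 0.5
print(safe_addt(1.0, 0.0))  # zero sink: falls back to min_step -> 0.001
```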
<br />
--[[User:Bmy|Bob Y.]] 10:20, 19 February 2014 (EST)<br />
<br />
==== Fixes for parallelization errors ====<br />
<br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we need to add the following variables to the <tt>!$OMP+PRIVATE</tt> statement: <tt>TRACNUM</tt>, <tt>NH3_TO_NH4</tt>, and <tt>SURF_AREA</tt>. Adding these now causes TOMAS to have identical sp vs. mp results when chemistry and microphysics are turned on.</p><br />
#<p>In routine <tt>DEPVEL</tt> (in <tt>GeosCore/drydep_mod.F</tt>): Instead of holding <tt>A_RADI</tt> and <tt>A_DEN</tt> as <tt>!$OMP+PRIVATE</tt> in TOMAS simulations (in the main DO loop in <tt>DEPVEL</tt>), we now save the particle size and density values to private variables <tt>DIAM</tt> and <tt>DEN</tt>. We then pass those as arguments to function <tt>DUST_SFCRSII</tt>.</p> <br />
#<p>We have corrected an issue in routine <tt>NFCLDMX</tt> (in module <tt>GeosCore/convection_mod.F</tt>) that potentially impacts the TOMAS wet scavenging, as described below:</p><br />
#*<p>We think there are different results for parallel and serial because of an assumption that's true for normal simulations but fails on TOMAS. The assumption is "tracers are independent through wet scavenging." Since TOMAS scavenging is size dependent, removing material from the distribution before calculating the soluble fraction of another component is "wrong." We now compute the fractions explicitly before the removal step. To do this, we now call routine <tt>COMPUTE_F</tt> in its own parallel DO loop located immediately before the main parallel do loop in <tt>NFCLDMX</tt>.</p><br />
#*<p>This modification also required the ND37 diagnostic IF block to be put into the same loop as <tt>COMPUTE_F</tt>. Furthermore, because <tt>COMPUTE_F</tt> returns the value of diagnostic index <tt>ISOL</tt>, and because <tt>ISOL</tt> is also used for the ND38 diagnostic in the main parallel loop below, we must also save the values of <tt>ISOL</tt> in a 1-D vector. This will allow the values of ISOL to be passed from the first parallel loop to the second. This ensures that the ND37 and ND38 diagnostics will be computed properly for all GEOS-5 simulations that have soluble tracers.</p><br />
#*<p>This modification has been tested in the [[GEOS-Chem Unit Tester]] by Bob Yantosca (04 Feb 2014) and it has yielded identical results for <tt>geos5_4x5_fullchem</tt>, <tt>geos5_4x5_Hg</tt>, <tt>geos5_4x5_RnPbBe</tt>, <tt>geos5_4x5_soa</tt> and <tt>geos5_4x5_soa_svpoa</tt> simulations.</p><br />
#<p>We have made some fixes in <tt>GeosCore/wetscav_mod.F</tt> that caused single-processor TOMAS runs to have different output than multi-processor runs. A few instances of code were computing quantities sequentially and then storing them for later use. These were technically thread-safe, but were susceptible to error because the order of computation would be different when running with parallelization turned on. These sections of code have now been rewritten accordingly.</p><br />
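<br />
The ordering problem described above can be illustrated with a toy example (illustrative Python, not GEOS-Chem code; the size dependence is invented): when a soluble fraction depends on the current size distribution, computing it after another species has already been removed gives a different, order-dependent answer than computing all fractions up front from the same initial state:<br />

```python
# toy distribution: mass in two size bins, plus a soluble fraction that
# depends (hypothetically) on the current total loading of the distribution
bins = [4.0, 2.0]

def soluble_fraction(state):
    # invented size dependence for illustration only
    return min(1.0, sum(state) / 10.0)

# order-dependent: remove species A first, then compute B's fraction
state = bins[:]
f_a = soluble_fraction(state)
state = [m * (1 - f_a) for m in state]
f_b_sequential = soluble_fraction(state)

# order-independent: compute every fraction from the same initial state,
# as in the corrected NFCLDMX, then do the removal afterwards
f_b_upfront = soluble_fraction(bins)

print(f_b_sequential != f_b_upfront)  # the two answers differ -> True
```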
<br />
--[[User:Bmy|Bob Y.]] 14:09, 21 February 2014 (EST)<br />
<br />
==== Removed inefficient subroutine calls ====<br />
<br />
#<p>In <tt>GeosCore/diag3.F</tt>, we now use a 2-D array <tt>(J,L)</tt> for archiving into the ND60 TOMAS diagnostic. This eliminates an array temporary in the call to routine BPCH2.</p><br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we now use an array <tt>ERR_IND</tt> to pass the I,J,L,N indices to error checking routine <tt>CHECK_VALUE</tt>. We previously used an array descriptor <tt>(/I,J,L,0/)</tt> which caused an array temporary to be created.</p><br />
#<p>In routine <tt>EMISSCARBON</tt> (in module <tt>GeosCore/carbon_mod.F</tt>), we removed array temporaries from the calls to subroutine <tt>EMITSGC</tt>. We now sum two arrays into a temporary array, and then pass that to <tt>EMITSGC</tt>.</p><br />
#<p>We rewrote the subroutine calls to NH4BULKTOBIN to avoid the creation of array temporaries. In most cases this was done by replacing <tt>MK(1:IBINS,SRTSO4)</tt> with <tt>MK(:,SRTSO4)</tt>, etc. By explicitly stating the sub-slice <tt>MK(1:IBINS,SRTSO4)</tt>, this causes the compiler to create an array temporary. Using <tt>MK(:,SRTSO4)</tt> instead allows for a more efficient pointer slice to be passed.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 14:47, 31 January 2014 (EST)<br />
<br />
==== Fixes for convenience ====<br />
<br />
<p>We now read many of the TOMAS data files from the directory <tt>TRIM( DATA_DIR_1x1 ) // 'TOMAS_201402/'</tt>. This avoids having to keep these big files (some of which approach 100 MB in size) in individual users' run directories.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 16:20, 31 January 2014 (EST)<br />
<br />
<p>Standard GEOS-Chem bulk dust is now unavailable in TOMAS simulations. Including the option for bulk dust in TOMAS simulations led to very confusing logical constructs, which could cause neither dust implementation to function in a TOMAS simulation. </p><br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:01, 3 March 2014 (EST)<br />
<br />
== Outstanding issues ==<br />
<br />
=== Potential parallelization problems ===<br />
<br />
We have noticed that there may be a parallelization error in the TOMAS [http://acmg.seas.harvard.edu/geos/doc/man/appendix_5.html ND60 diagnostic]. This may be caused by a coding error; in particular, one or more variables may have been omitted from an <tt>!$OMP+PRIVATE</tt> declaration.<br />
<br />
This is illustrated by the following [[GEOS-Chem_Unit_Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|unit test simulation]] of the [[GEOS-Chem v9-01-02]] provisional release code (submitted at 2:11 PM on 21 Feb 2014):<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (707260156 and 707260156)<br />
### Checksums : DIFFERENT (895530022 and 2949483685)<br />
### Diffs : DIFFERENT<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the above test, all TOMAS diagnostics (ND59, ND60, and ND61) were turned on. The restart files (here named <tt>trac_rst.*</tt>) from the single-processor and multi-processor stages of the unit test are identical, but the <tt>ctm.bpch</tt> files (here named <tt>trac_avg.*</tt>) were different. When the restart files are identical, that means single-processor and multi-processor stages produced the identical tracer concentrations (and soil NOx quantities). <br />
<br />
The only differences in the <tt>trac_avg.*</tt> files between the single-processor and multi-processor stages of the unit test were in TOMAS diagnostic quantities. The affected categories appear to be <tt>TMS-COND</tt>, <tt>TMS-COAG</tt>, <tt>TMS-NUCL</tt>, and <tt>AERO-FIX</tt>, which points to the ND60 diagnostic.<br />
<br />
In order to confirm that the ND60 diagnostic exhibits the problem, we ran an additional unit test with ND59 and ND61 turned on, but ND60 turned off. This unit test, which was submitted at 3:33PM on 21 Feb 2014, yielded identical results.<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (690218236 and 690218236)<br />
### Checksums : IDENTICAL (4196844107 and 4196844107)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
We are still looking into this issue. Because this issue only affects the ND60 diagnostic output, but not tracer concentrations, we are moving ahead with the TOMAS benchmarks for [[GEOS-Chem v9-02]] (as of 21 Feb 2014). <br />
<br />
--[[User:Bmy|Bob Y.]] 16:17, 21 February 2014 (EST)<br />
<br />
=== Offline Dust ===<br />
Currently, GEOS-Chem with TOMAS uses prescribed offline dust aerosol data in radiative transfer / photolysis calculations. Due to complications, this is turned off entirely at 2x2.5 resolution.<br />
<br />
=== Vertical Grids ===<br />
Currently, GC-TOMAS is only compatible with the reduced vertical grids:<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.3.1 GEOS3_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.4.1 GEOS4_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.5.1 GEOS5_47L]<br />
<br />
Development for the full vertical grids is ongoing.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:43, 10 February 2010 (EST)<br />
<br />
== Obsolete versions of TOMAS ==<br />
<br />
In this section we preserve information that pertained to older versions of TOMAS (before the [[GEOS-Chem v9-02]] release).<br />
<br />
=== Code structure ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
The main-level <tt>Code</tt> directory has now been divided into several subdirectories:<br />
<br />
GeosCore/ GEOS-Chem "core" routines<br />
GeosTomas/ Parallel copies of GEOS-Chem routines that reference TOMAS<br />
GeosUtil/ "Utility" modules (e.g. error_mod.f, file_mod.f, time_mod.f, etc.)<br />
Headers/ Header files (define.h, CMN_SIZE, CMN_DIAG, etc.)<br />
KPP/ KPP solver directory structure<br />
bin/ Directory where executables are placed<br />
doc/ Directory where documentation is created<br />
help/ Directory for GEOS-Chem Help Screen<br />
lib/ Directory where library files are placed<br />
mod/ Directory where module files are placed<br />
obsolete/ Directory where obsolete versions of code are archived<br />
<br />
Because there were a lot of TOMAS-related modifications in several GEOS-Chem "core" routines, the routines that need to "talk" to TOMAS were placed into a separate subdirectory named <tt>GeosTomas/</tt>. The files in <tt>GeosTomas</tt> are:<br />
<br />
Files:<br />
------<br />
Makefile -- GEOS-Chem routines that have been<br />
aero_drydep.f modified to reference the TOMAS aerosol<br />
carbon_mod.f microphysics package. These are kept<br />
chemdr.f in a separate GeosTomas directory so that<br />
chemistry_mod.f they do not interfere with the routines<br />
cleanup.f in the GeosCore directory.<br />
diag3.f<br />
diag_mod.f The GeosTomas directory only needs to<br />
diag_pl_mod.f contain the files that have been modified<br />
drydep_mod.f for TOMAS. The Makefile will look for<br />
dust_mod.f all other files from the GeosCore directory<br />
emissions_mod.f using the VPATH option in GNU Make.<br />
gamap_mod.f<br />
initialize.f NOTE to GEOS-Chem developers: When you<br />
input_mod.f make changes to any of these routines<br />
isoropia_mod.f in the GeosCore directory, you must also<br />
logical_mod.f make the same modifications to the<br />
ndxx_setup.f corresponding routines in the GeosTomas<br />
planeflight_mod.f directory.<br />
seasalt_mod.f<br />
sulfate_mod.f Maybe in the near future we can work<br />
tomas_mod.f towards integrating TOMAS into the GeosCore<br />
tomas_tpcore_mod.f90 directory more cleanly. However, due to<br />
tpcore_mod.f the large number of modifications that were<br />
tpcore_window_mod.f necessary for TOMAS, it was quicker to<br />
tracerid_mod.f implement the TOMAS code in a separate<br />
wetscav_mod.f subdirectory. <br />
xtra_read_mod.f -- Bob Y. (1/25/10)<br />
<br />
Each of these files were merged with the corresponding files in the <tt>GeosCore</tt> subdirectory. Therefore, in addition to having the GEOS-Chem modifications from [[GEOS-Chem v8-02-05|v8-02-05]], these files also have the relevant TOMAS references.<br />
<br />
A few technical considerations dictated the placing of these files into a separate <tt>GeosTomas/</tt> directory:<br />
<br />
* The ND60 diagnostic in the standard GEOS-Chem code (in <tt>GeosCore/</tt>) is now used for the CH4 offline simulation, but in TOMAS it's used for something else. <br />
* Some parameters needed to be declared differently for simulations with TOMAS. <br />
* Because not all GEOS-Chem users will choose to use TOMAS, we did not want to unnecessarily bog down the code in <tt>GeosCore/</tt> with references to TOMAS-specific routines. <br />
<br />
All of these concerns could be best solved by keeping parallel copies of the affected routines in the <tt>GeosTomas</tt> directory.<br />
<br />
--[[User:Bmy|Bob Y.]] 13:35, 25 February 2010 (EST)<br />
<br />
=== Building GEOS-Chem with TOMAS ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 25 Feb 2014)</p>'''</div><br />
<br />
The <tt>VPATH</tt> feature of [http://www.gnu.org/software/make/manual/make.html GNU Make] is used to simplify the compilation. When GEOS-Chem is compiled with the tomas target, the GNU Make utility will search for files in the <tt>GeosTomas/</tt> directory first. If it cannot find files there, it will then search the <tt>GeosCore/</tt> directory. Thus, if we make a change to a "core" GEOS-Chem routine in the <tt>GeosCore/</tt> subdirectory (say in <tt>dao_mod.f</tt> or <tt>diag49_mod.f</tt>), then those changes will automatically be applied when you build GEOS-Chem with TOMAS. Thus, we only need to keep in <tt>GeosTomas/</tt> separate copies of those files that have to "talk" with TOMAS.<br />
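The search order described above can be sketched in Python (a hypothetical helper written for illustration, not part of GEOS-Chem or GNU Make): given a file name, return the first match from <tt>GeosTomas/</tt>, falling back to <tt>GeosCore/</tt>, just as <tt>VPATH</tt> does during a TOMAS build.<br />

```python
import os

def resolve_source(filename, search_dirs=("GeosTomas", "GeosCore")):
    """Mimic GNU Make's VPATH lookup: return the path of the first
    directory in search_dirs that contains filename, or None if the
    file is absent from all of them."""
    for d in search_dirs:
        candidate = os.path.join(d, filename)
        if os.path.isfile(candidate):
            return candidate
    return None
```

A file that exists in both directories resolves to the <tt>GeosTomas/</tt> copy; a file present only in <tt>GeosCore/</tt> resolves there, which is why only the TOMAS-modified routines need to be duplicated.<br />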
<br />
Several new targets were added to the <tt>Makefile</tt> in the top-level <tt>Code/</tt> directory:<br />
<br />
#=============================================================================<br />
# Targets for TOMAS aerosol microphysics code (win, bmy, 1/25/10)<br />
#=============================================================================<br />
<br />
.PHONY: tomas libtomas exetomas cleantomas<br />
<br />
tomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes all<br />
<br />
libtomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes lib<br />
<br />
exetomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes exe<br />
<br />
cleantomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes clean<br />
<br />
You can build GEOS-Chem with the TOMAS option by typing:<br />
<br />
make tomas ...<br />
<br />
This will automatically do the proper things to build the TOMAS code into GEOS-Chem, such as:<br />
<br />
* Adding a <tt>-DTOMAS</tt> C-preprocessor switch to the <tt>FFLAGS</tt> compiler flag settings in <tt>Makefile_header.mk</tt>. This will cause TOMAS-specific areas of code to be turned on.<br />
* Turning off OpenMP parallelization. For now the GEOS-Chem + TOMAS code needs to be run on a single processor. We continue to work on parallelizing the code.<br />
* Calling the Makefile in the <tt>GeosTomas/</tt> subdirectory to build the executable. The executable file is now named <tt>geostomas</tt> in order to denote that the TOMAS code is built in.<br />
<br />
The GEOS-Chem + TOMAS code has been built with the following compilers:<br />
<br />
* Intel Fortran compiler v10<br />
* Intel Fortran compiler v11.1 (20101201)<br />
* SunStudio 12<br />
<br />
--[[User:Bmy|Bob Y.]] 10:36, 27 January 2010 (EST)<br />
<br />
=== Compile from GeosTomas directory ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:I think there is something going wrong in my compilation, although errors have come up at both compile time and run time. The worst of the problems is this: I'll make a change to any fortran file in the code (even something meaningless like print*, 'foo') and hundreds of compile errors come out with fishy error messages such as (from ifort v10.1):<br />
<br />
***fortcom: Error: chemistry_mod.f, line 478: A kind type parameter must be a compile-time constant. [DP]<br />
REAL(kind=dp) :: RCNTRL(20)<br />
<br />
:Any advice? The errors I'm having are not unique to any version of GC, any type of met fields, any compiler, etc.<br />
<br />
'''''[mailto:yantosca@seas.harvard.edu Bob Yantosca] wrote:'''''<br />
<br />
:Make sure you are always in the GeosTomas subdirectory when you build the code. Sometimes there is a problem if you build the code from a higher level directory. This may have to do with the VPATH in the makefile.<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:Thanks, that seems to do the trick.<br />
<br />
--[[User:Bmy|Bob Y.]] 14:37, 14 April 2010 (EDT)</div>
Salvatore Farina
https://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_aerosol_microphysics&diff=16045
TOMAS aerosol microphysics (2014-03-03T21:01:52Z)
<p>Salvatore Farina: /* Fixes for convenience */</p>
<hr />
<div>This page describes the TOMAS aerosol microphysics option in GEOS-Chem. TOMAS is one of two aerosol microphysics packages being incorporated into GEOS-Chem, the other being [[APM aerosol microphysics|APM]].<br />
<br />
== Overview ==<br />
<br />
The TwO-Moment Aerosol Sectional (TOMAS) microphysics package was developed for implementation into GEOS-Chem at Carnegie-Mellon University. Using a moving sectional and moment-based approach, TOMAS tracks two independent moments (number and mass) of the aerosol size distribution for a number of discrete size bins. It also contains codes to simulate nucleation, condensation, and coagulation processes. The aerosol species that are considered with high size resolution are sulfate, sea-salt, OC, EC, and dust. An advantage of TOMAS is the full size resolution for all chemical species and the conservation of aerosol number, the latter of which allows one to construct aerosol and CCN number budgets that will balance.<br />
<br />
=== Authors and collaborators ===<br />
* [mailto:petera@andrew.cmu.edu Peter Adams] ''(Carnegie-Mellon U.)'' -- Principal Investigator<br />
* [mailto:wtrivita@staffmail.ed.ac.uk Win Trivitayanurak] ''(Department of Highways, Thailand)''<br />
* [mailto:dwesterv@andrew.cmu.edu Dan Westervelt] ''(Carnegie-Mellon U.)''<br />
* [mailto:jeffrey.pierce@dal.ca Jeffrey Pierce] ''(Dalhousie U.)''<br />
* [mailto:sal.farina@gmail.com Salvatore Farina] ''(Colorado State U.)''<br />
<br />
Questions regarding TOMAS can be directed at Dan (e-mail linked above).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:53, 27 January 2010 (EST)<br />
<br />
=== TOMAS User Groups ===<br />
<br />
{| border=1 cellspacing=0 cellpadding=5<br />
|- bgcolor="#cccccc"<br />
!User Group<br />
!Personnel<br />
!Projects<br />
|-valign="top"<br />
|[http://www.ce.cmu.edu/%7Eadams/ Carnegie-Mellon University]<br />
|[http://www.ce.cmu.edu/%7Eadams/people.htm#peter Peter Adams]<br>[http://www.ce.cmu.edu/~dwesterv/Site/Home.html Dan Westervelt]<br />
| [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html New particle formation evaluation in GC-TOMAS] <br> Sensitivity of CCN to nucleation rates <br> Development of number tagging and source apportionment model for GC-TOMAS<br />
|-valign="top"<br />
|[http://fizz.phys.dal.ca/%7Epierce/ Dalhousie University] <br> [http://www.atmos.colostate.edu/faculty/pierce.php Colorado State]<br />
|[http://atm.dal.ca/Faculty/Jeffrey_Pierce.php Jeffrey Pierce]<br>Sal Farina<br>Stephen D'Andrea<br />
|Sensitivity of CCN to condensational growth rates <br> TOMAS parallelization <br> Others...<br />
|-valign="top"<br />
|Add yours here<br />
|<br />
|<br />
|}<br />
<br />
== TOMAS-specific setup ==<br />
TOMAS has its own run directories (run.Tomas) that can be downloaded from the Harvard FTP site. The <tt>input.geos</tt> file looks slightly different from standard GEOS-Chem, and it also differs between versions.<br />
<br />
Pre- v9.02:<br />
To turn on TOMAS, see the "Microphysics menu" in <tt>input.geos</tt> and make sure TOMAS is set to '''T'''. <br />
<br />
v9.02 and later:<br />
TOMAS is enabled or disabled at compile time - the TOMAS flag in input.geos has been removed.<br />
<br />
<br />
TOMAS is simulation type 3 and uses 171-423 tracers, depending on size resolution. Each aerosol species requires 30 tracers at the 30-bin size resolution, 12 tracers at the 12-bin resolution, etc. Here is the (abbreviated) default setup in <tt>input.geos</tt> for TOMAS-30 in v9.02 and later (see the run.Tomas directory):<br />
<br />
Tracer # Description <br />
1- 62 Std Geos Chem <br />
63 H2SO4 <br />
64- 93 Number <br />
94-123 Sulfate <br />
124-153 Sea-salt <br />
154-183 Hydrophilic EC <br />
184-213 Hydrophobic EC <br />
214-243 Hydrophilic OC <br />
244-273 Hydrophobic OC <br />
274-303 Mineral dust <br />
304-333 Aerosol water<br />
<br />
TOMAS-40 requires 423 tracers (360 TOMAS tracers spanning the nine 40-bin size-resolved quantities, one bulk H2SO4 tracer, and 62 standard GEOS-Chem tracers). <br />
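The tracer bookkeeping above can be checked with a short calculation. The table shows one bulk H2SO4 tracer plus nine size-resolved distributions (aerosol number, sulfate, sea-salt, two EC classes, two OC classes, mineral dust, and aerosol water) on top of the 62 standard tracers. This is only an illustrative sketch; the helper name is ours, not a GEOS-Chem routine.<br />

```python
def tomas_tracer_count(n_bins, n_distributions=9, n_standard=62, n_bulk=1):
    """Total advected tracers for a TOMAS simulation: the standard
    GEOS-Chem tracers, bulk H2SO4, and n_distributions size-resolved
    quantities carried in n_bins size bins each."""
    return n_standard + n_bulk + n_distributions * n_bins

# TOMAS-12 -> 171 tracers, TOMAS-30 -> 333, TOMAS-40 -> 423
```

The 12-bin and 40-bin results reproduce the 171-423 tracer range quoted above.<br />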
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 18:48, 8 July 2013 (EDT)<br />
<br />
== Implementation notes ==<br />
<br />
TOMAS validation in [[GEOS-Chem v8-03-01]] was completed on 24 Feb 2010.<br />
<br />
=== Update April 2013 ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02k|v9-02k]] and approved on 07 Jun 2013.'''''<br />
<br />
Sal Farina has been working with the GEOS-Chem Support Team to inline the TOMAS aerosol microphysics code into the <tt>GeosCore</tt> directory. All TOMAS-specific sections of code are now segregated from the rest of GEOS-Chem with C-preprocessor statements such as:<br />
<br />
#if defined( TOMAS )<br />
<br />
# if defined( TOMAS40 ) <br />
... Code for 40 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS12 )<br />
... Code for 12 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS15 )<br />
... Code for 15 bin TOMAS simulation (optional) goes here ...<br />
# else<br />
... Code for 30 bin TOMAS simulation (default) goes here ...<br />
# endif<br />
<br />
#endif <br />
<br />
TOMAS is now invoked by compiling GEOS-Chem with one of the following options:<br />
<br />
make -j4 TOMAS=yes ... # Compiles GEOS-Chem for the 30 bin (default) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
or<br />
<br />
make -j4 TOMAS40=yes ... # Compiles GEOS-Chem for the 40 bin (optional) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
All files in the old <tt>GeosTomas/</tt> directory have now been deleted, as these have been rendered obsolete.<br />
<br />
These updates are included in [[GEOS-Chem v9-02]]. These modifications will not affect existing GEOS-Chem simulations, as no TOMAS code is compiled into the executable unless you specify either <tt>TOMAS=yes</tt> or <tt>TOMAS40=yes</tt> at compile time.<br />
<br />
We are in the process of updating the wiki to reflect these changes as they are implemented. <br />
<br />
--[[User:Bmy|Bob Y.]] 13:59, 23 April 2013 (EDT)<br><br />
--[[User:Salvatore Farina|Salvatore Farina]] 13:49, 4 June 2013 (EDT)<br />
<br />
== Computational Information ==<br />
<br />
GC-TOMAS v9-02 (30 sections) on 8 processors: <br />
One year simulation = 7-8 days wall clock time<br />
<br />
More speedups are available using lower aerosol size resolution<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:00, 07 May 2013 (EST)<br />
<br />
GC-TOMAS v9-03 on 16 processors (glooscap)<br />
<br />
12 bin: 2.8 days wall time per sim year<br />
<br />
15 bin: 3.3 days wall time per sim year<br />
<br />
30 bin: 6.1 days wall time per sim year<br />
<br />
40 bin: 7.8 days wall time per sim year<br />
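For rough planning, the wall-clock figures above can be converted into throughput. The numbers are taken directly from the timings listed here; the helper function is illustrative only.<br />

```python
# Wall-clock days per simulated year: GC-TOMAS v9-03, 16 processors (glooscap)
WALLTIME_DAYS_PER_SIM_YEAR = {12: 2.8, 15: 3.3, 30: 6.1, 40: 7.8}

def sim_years_per_wallclock_month(n_bins, days_in_month=30.0):
    """Simulated years completed per 30 wall-clock days for a given
    TOMAS bin count."""
    return days_in_month / WALLTIME_DAYS_PER_SIM_YEAR[n_bins]
```

By this measure the 12-bin configuration is a little over twice as fast as the 30-bin configuration.<br />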
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 15:51, 3 March 2014 (EST)<br />
<br />
== Microphysics Code==<br />
The aerosol microphysics code is largely contained within the file <tt>tomas_mod.f</tt>. <tt>tomas_mod.f</tt> and its subroutines are modular; they use their own internal variables. For details, see <tt>tomas_mod.f</tt> and its comments. <br />
<br />
=== Nucleation ===<br />
The choice of nucleation theory is selected in the header section of <tt>tomas_mod.f</tt>. The current choices are binary homogeneous nucleation (Vehkamaki et al., 2002) and ternary homogeneous nucleation (Napari et al., 2002). The ternary nucleation rate is typically scaled by a globally uniform tuning factor of 10^-4 or 10^-5. Ion-mediated nucleation (Yu, 2008) and activation nucleation (Kulmala, 2006) are options as well.<br />
<br />
In TOMAS-12 and TOMAS-30, nucleated particles follow the Kerminen approximation to grow to the smallest size bin. This has a tendency to overpredict the number of particles in the smallest bins of those models. See Y. H. Lee, J. R. Pierce, and P. J. Adams 2013 [http://www.geosci-model-dev-discuss.net/6/893/2013/gmdd-6-893-2013.html here] for more details on the consequences of this.<br />
<br />
=== Condensation ===<br />
<br />
=== Coagulation ===<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 14:08, 9 May 2011 (EST)<br />
<br />
== Validation ==<br />
<br />
GC-TOMAS [[GEOS-Chem v8-03-01|v8-03-01]] generally compares very well with observations and other models. Please see our [http://acmg.seas.harvard.edu/geos/wiki_docs/TOMAS/TOMAS_benchmark_ForHarvard.pdf GC-TOMAS v8-02-05 validation document] for more information and figures. <br />
<br />
Below are some results of benchmarking GC-TOMAS with earlier versions of the model as well as observations:<br />
<br />
[[Image:CN10_smaller.jpg]]<br />
<br />
'''Figure 1: CN10 concentrations predicted by GC-TOMAS v8-02-05 against observations. Descriptions of observational data can be found on p 5454 of Pierce et al, Atmos. Chem. Phys., 7, 2007.'''<br />
<br />
----<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:13, 10 February 2010 (EST)<br />
<br />
== Other features of TOMAS ==<br />
Other varieties of TOMAS are suited for specific science questions, for example with nucleation studies where explicit aerosol dynamics are needed for nanometer-sized particles. <br />
<br />
=== Set-up Guide ===<br />
<br />
This [[TOMAS setup guide]] was written for users on ACE-NET's Glooscap cluster, but may be more generally applicable.<br />
Please contact [mailto:sal.farina@gmail.com Salvatore Farina] for help in obtaining the latest development version of GEOS-Chem with TOMAS.<br />
This will allow you to take advantage of parallel computation in TOMAS.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 11:55, 26 July 2013 (EDT)<br />
<br />
=== Size Resolution ===<br />
<br />
;TOMAS-30: All 7 chemical species have size resolution ranging from 10 nm to 10 µm, spanned by 30 logarithmically spaced (mass doubling) bins.<br />
;TOMAS-40: Same as TOMAS-30 with 10 additional (mass doubling) sub-10nm bins with a lower limit ~1nm<br />
;TOMAS-12: All 7 chemical species have size resolution ranging from 10 nm to 1 µm, spanned by 10 logarithmically spaced (mass quadrupling) bins plus two supermicron bins. Coarser resolution than TOMAS-30, with improved computation time. <br />
;TOMAS-15: Same as TOMAS-12 with 3 additional (mass quadrupling) sub-10nm bins with a lower limit ~2nm. Analogous to TOMAS40 with improved computation time.<br />
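Since particle diameter scales with the cube root of particle mass (for fixed density), the bin spacings above can be reproduced with a short sketch. This is an illustration under a constant-density assumption; the actual TOMAS bin edges are defined in the model code.<br />

```python
def bin_edges_nm(n_bins, d_min_nm, mass_ratio):
    """Diameter bin edges (nm) for a sectional scheme in which particle
    mass grows by mass_ratio per bin; diameter then grows by
    mass_ratio**(1/3) per bin."""
    growth = mass_ratio ** (1.0 / 3.0)
    return [d_min_nm * growth ** i for i in range(n_bins + 1)]

# TOMAS-30: 30 mass-doubling bins starting at 10 nm end near 10 um
# TOMAS-12: 10 mass-quadrupling bins starting at 10 nm end near 1 um
```

Thirty mass-doubling bins multiply the diameter by 2^10, taking 10 nm to about 10 µm; ten mass-quadrupling bins take 10 nm to about 1 µm, matching the ranges quoted above.<br />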
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 12:51, 4 June 2013 (EDT)<br />
<br />
=== Nesting and grid size ===<br />
TOMAS is implemented on a 2x2.5 North American domain. Developed by Jeffrey Pierce (jeffrey.pierce@dal.ca)<br />
<br />
=== AOD, CCN post-processing code ===<br />
Code is available for calculating aerosol optical depth from TOMAS-predicted aerosol composition and size using Mie theory, and for calculating CCN concentrations from TOMAS size-resolved composition using Kohler theory. Developed by Yunha Lee and Jeffrey Pierce; adapted for GEOS-Chem output by Jeffrey Pierce.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 2:00, 9 May 2011 (EST)<br />
<br />
== References ==<br />
<br />
In this section we provide references relevant to TOMAS aerosol microphysics simulations.<br />
<br />
=== Studies using TOMAS simulations ===<br />
#'''Nucleation in GEOS-Chem:''' Westervelt, D. M., Pierce, J. R., Riipinen, I., Trivitayanurak, W., Hamed, A., Kulmala, M., Laaksonen, A., Decesari, S., and Adams, P. J.: ''Formation and growth of nucleated particles into cloud condensation nuclei: model-measurement comparison'', <u>Atmos. Chem. Phys. Discuss.</u>, '''13''', 8333-8386, doi:10.5194/acpd-13-8333-2013, 2013. [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html LINK]<br />
#'''TOMAS implementation in GEOS-Chem:''' Trivitayanurak, W., Adams, P. J., Spracklen, D. V. and Carslaw, K. S.: ''Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison'', <u>Atmos. Chem. Phys.</u>, '''8'''(12), 3149-3168, 2008.<br />
#'''TOMAS initial paper, sulfate only:''' Adams, P. J. and Seinfeld, J. H.: ''Predicting global aerosol size distributions in general circulation models'', <u>J. Geophys. Res.-Atmos.</u>, '''107'''(D19), 4370, doi:10.1029/2001JD001010, 2002.<br />
#'''TOMAS with sea-salt:''' Pierce, J.R., and Adams P.J., ''Global evaluation of CCN formation by direct emission of sea salt and growth of ultrafine sea salt'', <u>J. Geophys. Res.-Atmos.</u>, '''111''' (D6), doi:10.1029/2005JD006186, 2006.<br />
#'''TOMAS with carbonaceous aerosol:''' Pierce, J. R., Chen, K. and Adams, P. J.: ''Contribution of primary carbonaceous aerosol to cloud condensation nuclei: processes and uncertainties evaluated with a global aerosol microphysics model'', <u>Atmos. Chem. Phys.</u>, '''7'''(20), 5447-5466, doi:10.5194/acp-7-5447-2007, 2007.<br />
#'''TOMAS with dust:''' Lee, Y.H., K. Chen, and P.J. Adams, 2009: ''Development of a global model of mineral dust aerosol microphysics''. <u>Atmos. Chem. Phys.</u>, '''9''', 2441-2458, doi:10.5194/acp-9-2441-2009.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:04, 24 February 2014 (EST)<br />
<br />
=== Input data used by TOMAS ===<br />
#Usoskin, I. G. and Kovaltsov, G. A., ''Cosmic ray induced ionization in the atmosphere: Full modeling and practical applications'', <u>J. Geophys. Res.</u>, '''111''', doi:10.1029/2006JD007150, 2006.<br />
#Yu, Fangqun, et al, ''Ion-mediated nucleation in the atmosphere: Key controlling parameters, implications, and look-up table'', <u>J. Geophys. Res.</u>, '''115''', D03206, doi:10.1029/2009JD012630, 2010.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:03, 24 February 2014 (EST)<br />
<br />
== Previous issues now resolved ==<br />
<br />
=== Minor bug in TOMAS sulfate emissions ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02o|v9-02o]] and approved on 03 Sep 2013.'''''<br />
<br />
'''''[mailto:sal.farina@gmail.com Sal Farina] wrote:'''''<br />
:Calling mnfix before and after emission ensures the size distribution is well behaved, and eliminates "Negative SF emis" warnings. An edit to mnfix was also introduced: previously, adding "tiny" mass to bins with zero mass and "epsilon" number yielded very high mass-per-particle values, which necessitated excessive error detection, correction, and verbosity.<br />
<br />
--[[User:Melissa Payer|Melissa Sulprizio]] 15:08, 7 August 2013 (EDT)<br />
<br />
=== Segmentation Fault ===<br />
You may get an early segfault if your stacksize is not set to either unlimited or a very large number. To avoid this, you either have to change the value of an environmental variable (setenv command in <tt>.cshrc</tt>) or use the <tt>ulimit</tt> command. See [http://wiki.seas.harvard.edu/geos-chem/index.php/Machine_issues_%26_portability#Resetting_stacksize_for_Linux this page] for details.<br />
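Under bash the usual fix is <tt>ulimit -s unlimited</tt> (or <tt>limit stacksize unlimited</tt> in csh). As a diagnostic sketch, the current limits can also be inspected from Python with the standard <tt>resource</tt> module (Unix only; this is our own illustration, unrelated to GEOS-Chem itself):<br />

```python
import resource

def stack_limit_bytes():
    """Return the (soft, hard) stack-size limits in bytes;
    resource.RLIM_INFINITY indicates 'unlimited'."""
    return resource.getrlimit(resource.RLIMIT_STACK)
```

If the soft limit reported here is small, expect the early segfault described above until the limit is raised.<br />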
<br />
--[[User:Dan Westervelt|Dan W.]] 20:20, 10 February 2010 (EST)<br />
<br />
=== Updates for GEOS-Chem v9-02 public release ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: As described below, there appears to be [[#Potential parallelization problems|a potential parallelization problem with the TOMAS ND60 diagnostic]]. We are currently looking into this. This issue, however, does not affect the tracer concentrations computed by TOMAS, but only the output of the ND60 diagnostic itself. For this reason we are moving ahead with the TOMAS benchmarks for v9-02. (Bob Yantosca, 21 Feb 2014)</p>'''</div><br />
<br />
We have found and fixed several minor numerical and coding issues prior to the public release of [[GEOS-Chem v9-02]] (01 Mar 2014). The TOMAS40 simulation has been validated with the [[GEOS-Chem Unit Tester]]. Below is the [[GEOS-Chem Unit Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|output of a unit test]] that was submitted on 2014/02/21 at 12:47:26 PM:<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (680420788 and 680420788)<br />
### Checksums : IDENTICAL (179613338 and 179613338)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the subsections below, we describe in more detail the fixes that we made for [[GEOS-Chem v9-02]]:<br />
<br />
==== Fixes for minor coding errors ====<br />
<br />
#<p>In <tt>GeosCore/main.F</tt>, we replaced <tt>CALL FLUSH()</tt> with <tt>CALL FLUSH(6)</tt>. The <tt>FLUSH</tt> routine needs to take an argument, and unit #6 is the Fortran unit for stdout (i.e. the screen and/or log file).</p><br />
#<p>In routine <tt>CHEM_SO2</tt> (in module <tt>GeosCore/sulfate_mod.F</tt>), we now avoid referencing the dust tracers DST1, DST2, DST3, and DST4 tracers for TOMAS simulations. TOMAS uses size-resolved dust tracers, and therefore does not carry DST1-4 tracers. This error seems to have been introduced when the fix for cloud pH was introduced in Sep 2013.</p><br />
#<p>In routine <tt>COND_NUC</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we added error traps to avoid division-by-zero errors that occurred when the variable <tt>CSCH</tt> is zero. When <tt>CSCH</tt> is zero, we now set variable <tt>ADDT</tt> to zero. When <tt>ADDT</tt> is zero, it will get reassigned to a minimum time step, so this fix should work OK.</p><br />
#<p>In <tt>GeosCore/gamap_mod.F</tt>, we have restored several entries to <tt>tracerinfo.dat</tt> for the ND44 diagnostic that were not getting properly printed out when the TOMAS simulation was being used.</p><br />
#<p>In module <tt>GeosCore/drydep_mod.F</tt>, we now set <tt>MAXDEP=105</tt> for all simulations, including TOMAS. Formerly, TOMAS had <tt>MAXDEP=100</tt>. This is close enough.</p><br />
#<p>In module <tt>GeosCore/diag3.F</tt>, we now avoid an out-of-bounds error in <tt>DEPNAME(N)</tt> during TOMAS simulations. We save the drydep species name from <tt>DEPNAME(N)</tt> into a new variable <tt>DRYDEP_NAME</tt> for <tt>N = 1..NUMDEP</tt>. We then set <tt>DRYDEP_NAME = ''</tt> for <tt>N > NUMDEP</tt>. This error occurs because we extend the number of drydep tracers during TOMAS simulations to account for the size bins.</p><br />
#<p>We have fixed a couple of logical errors that prevented dust emissions from happening. Minor modifications were made to IF statements in <tt>GeosCore/chemistry_mod.F</tt>, <tt>GeosCore/dust_mod.F</tt>, and <tt>GeosCore/input_mod.F</tt>.</p><br />
#<p>In file <tt>GeosCore/Makefile</tt>, make sure to add <tt>tomas_mod.o</tt> to the list of modules used by <tt>wetscav_mod.F</tt> (aka the "dependency listing"). The corrected code should look like this:</p><br />
<br />
wetscav_mod.o : wetscav_mod.F \<br />
dao_mod.o diag_mod.o \<br />
depo_mercury_mod.o get_ndep_mod.o \<br />
get_popsinfo_mod.o tracerid_mod.o \<br />
tracer_mod.o tomas_mod.o<br />
<br />
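The <tt>COND_NUC</tt> error trap (item 3 above) follows a common guard pattern, sketched here in Python with hypothetical names and a made-up minimum step; the real routine operates on the condensation sink inside <tt>tomas_mod.F</tt>, so treat this only as an illustration of the control flow.<br />

```python
def condensation_substep(mass_to_condense, csch, min_dt=1.0e-2):
    """Choose a condensation sub-timestep. When the condensation sink
    csch is zero, set the raw step ADDT to zero instead of dividing by
    zero; a zero (or too-small) ADDT is then reassigned to the minimum
    allowed step, mirroring the COND_NUC error trap."""
    if csch == 0.0:
        addt = 0.0
    else:
        addt = mass_to_condense / csch
    return max(addt, min_dt)
```

A zero sink therefore yields the minimum time step rather than a division-by-zero error.<br />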
--[[User:Bmy|Bob Y.]] 10:20, 19 February 2014 (EST)<br />
<br />
==== Fixes for parallelization errors ====<br />
<br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we need to add the following variables to the <tt>!$OMP+PRIVATE</tt> statement: <tt>TRACNUM</tt>, <tt>NH3_TO_NH4</tt>, and <tt>SURF_AREA</tt>. Adding these now causes TOMAS to have identical sp vs. mp results when chemistry and microphysics are turned on.</p><br />
#<p>In routine <tt>DEPVEL</tt> (in <tt>GeosCore/drydep_mod.F</tt>): Instead of holding <tt>A_RADI</tt> and <tt>A_DEN</tt> as <tt>!$OMP+PRIVATE</tt> in TOMAS simulations (in the main DO loop in <tt>DEPVEL</tt>), we now save the particle size and density values to private variables <tt>DIAM</tt> and <tt>DEN</tt>. We then pass those as arguments to function <tt>DUST_SFCRSII</tt>.</p> <br />
#<p>We have corrected an issue in routine <tt>NFCLDMX</tt> (in module <tt>GeosCore/convection_mod.F</tt>) that potentially impacts the TOMAS wet scavenging, as described below:</p><br />
#*<p>We think there are different results for parallel and serial because of an assumption that's true for normal simulations but fails on TOMAS. The assumption is "tracers are independent through wet scavenging." Since TOMAS scavenging is size dependent, removing material from the distribution before calculating the soluble fraction of another component is "wrong." We now compute the fractions explicitly before the removal step. To do this, we now call routine <tt>COMPUTE_F</tt> in its own parallel DO loop located immediately before the main parallel do loop in <tt>NFCLDMX</tt>.</p><br />
#*<p>This modification also required the ND37 diagnostic IF block to be put into the same loop as <tt>COMPUTE_F</tt>. Furthermore, because <tt>COMPUTE_F</tt> returns the value of diagnostic index <tt>ISOL</tt>, and because <tt>ISOL</tt> is also used for the ND38 diagnostic in the main parallel loop below, we must also save the values of <tt>ISOL</tt> in a 1-D vector. This will allow the values of ISOL to be passed from the first parallel loop to the second. This ensures that the ND37 and ND38 diagnostics will be computed properly for all GEOS-5 simulations that have soluble tracers.</p><br />
#*<p>This modification has been tested in the [[GEOS-Chem Unit Tester]] by Bob Yantosca (04 Feb 2014) and it has yielded identical results for <tt>geos5_4x5_fullchem</tt>, <tt>geos5_4x5_Hg</tt>, <tt>geos5_4x5_RnPbBe</tt>, <tt>geos5_4x5_soa</tt> and <tt>geos5_4x5_soa_svpoa</tt> simulations.</p><br />
#<p>We have made some fixes in <tt>GeosCore/wetscav_mod.F</tt> that caused single-processor TOMAS runs to have different output than multi-processor runs. A few instances of code were computing quantities sequentially and then storing them for later use. These were technically thread-safe, but were susceptible to error because the order of computation would be different when running with parallelization turned on. These sections of code have now been rewritten accordingly.</p><br />
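The wet-scavenging ordering issue (item 3 above) can be reproduced with a toy model. Here <tt>frac_of</tt> is a made-up soluble-fraction function that depends on the whole distribution; the names and numbers are ours, not from GEOS-Chem. Removing tracers one at a time gives order-dependent answers, while precomputing every fraction first (the spirit of the <tt>NFCLDMX</tt> fix) does not.<br />

```python
def scavenge_interleaved(masses, frac_of, order):
    """Old behaviour: recompute the soluble fraction from the
    already-modified distribution before each removal, so the result
    depends on the order in which tracers are processed."""
    m = dict(masses)
    for name in order:
        f = frac_of(m)
        m[name] *= (1.0 - f)
    return m

def scavenge_precomputed(masses, frac_of):
    """Fixed behaviour: compute all fractions from the unmodified
    distribution first, then apply the removals."""
    f = frac_of(masses)
    return {name: mass * (1.0 - f) for name, mass in masses.items()}
```

With a distribution-dependent fraction, the interleaved version gives different answers for different processing orders, which is exactly why serial and parallel runs diverged.<br />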
<br />
--[[User:Bmy|Bob Y.]] 14:09, 21 February 2014 (EST)<br />
<br />
==== Removed inefficient subroutine calls ====<br />
<br />
#<p>In <tt>GeosCore/diag3.F</tt>, we now use a 2-D array <tt>(J-L)</tt> for archiving into the ND60 TOMAS diagnostic. This eliminates an array temporary in the call to routine BPCH2.</p><br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we now use an array <tt>ERR_IND</tt> to pass the I,J,L,N indices to error checking routine <tt>CHECK_VALUE</tt>. We previously used an array descriptor <tt>(/I,J,L,0/)</tt> which caused an array temporary to be created.</p><br />
#<p>In routine <tt>EMISSCARBON</tt> (in module <tt>GeosCore/carbon_mod.F</tt>), we removed array temporaries from the calls to subroutine <tt>EMITSGC</tt>. We now sum two arrays into a temporary array, and then pass that to <tt>EMITSGC</tt>.</p><br />
#<p>We rewrote the subroutine calls to NH4BULKTOBIN to avoid the creation of array temporaries. In most cases this was done by replacing <tt>MK(1:IBINS,SRTSO4)</tt> with <tt>MK(:,SRTSO4)</tt>, etc. By explicitly stating the sub-slice <tt>MK(1:IBINS,SRTSO4)</tt>, this causes the compiler to create an array temporary. Using <tt>MK(:,SRTSO4)</tt> instead allows for a more efficient pointer slice to be passed.</p><br />
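A NumPy analogy of the last point (assuming NumPy is available; GEOS-Chem itself is Fortran): basic slicing returns a view that can be passed without copying, while fancy indexing allocates a temporary, much as the wiki reports for explicit-bounds Fortran array sections versus whole-dimension slices.<br />

```python
import numpy as np

IBINS = 30
mk = np.zeros((IBINS, 9))   # stand-in for MK(IBINS, n_species)

view = mk[:, 3]             # basic slice: a view onto mk, no data copied
view[:] = 1.0               # writes through to the parent array
assert view.base is mk      # confirms no temporary was created
assert mk[0, 3] == 1.0

temp = mk[np.arange(IBINS), 3]   # fancy indexing: allocates a copy
temp[:] = 2.0
assert mk[0, 3] == 1.0           # parent array unchanged
```

The view behaves like the efficient pointer slice <tt>MK(:,SRTSO4)</tt>; the copy behaves like the array temporary created for <tt>MK(1:IBINS,SRTSO4)</tt>.<br />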
<br />
--[[User:Bmy|Bob Y.]] 14:47, 31 January 2014 (EST)<br />
<br />
==== Fixes for convenience ====<br />
<br />
#<p>We now read many of the TOMAS data files from the directory <tt>TRIM( DATA_DIR_1x1 ) // 'TOMAS_201402/'</tt>. This avoids having to keep these large files (some of which approach 100 MB in size) in individual users' run directories.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 16:20, 31 January 2014 (EST)<br />
<br />
#<p>Standard GEOS-Chem bulk dust is now unavailable in TOMAS simulations. Supporting both bulk and size-resolved dust led to very confusing logical constructs, which could cause neither option to function in a TOMAS simulation. </p><br />
--[[User:Salvatore Farina|Salvatore Farina]] 16:01, 3 March 2014 (EST)<br />
<br />
== Outstanding issues ==<br />
<br />
=== Potential parallelization problems ===<br />
<br />
We have noticed that there may be a parallelization error in the TOMAS [http://acmg.seas.harvard.edu/geos/doc/man/appendix_5.html ND60 diagnostic]. This may be caused by a coding error; in particular, one or more variables may have been omitted from an <tt>!$OMP+PRIVATE</tt> declaration.<br />
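The class of bug suspected here can be sketched as follows (a minimal illustration in GEOS-Chem's fixed-form <tt>!$OMP+</tt> continuation style; the loop and variable names are hypothetical, not taken from the ND60 code). A scratch variable written inside a parallel loop must be declared <tt>PRIVATE</tt>, or threads will overwrite each other's values:<br />
<br />
 PROGRAM PRIVATE_DEMO<br />
 ! Illustrative sketch only -- names are hypothetical, not from ND60.<br />
 ! WORK must be PRIVATE: if it is omitted from the PRIVATE clause,<br />
 ! all threads share one copy and the stored results become<br />
 ! nondeterministic, differing between sp and mp runs.<br />
 IMPLICIT NONE<br />
 INTEGER :: I<br />
 REAL    :: WORK, TOTAL(100)<br />
!$OMP PARALLEL DO<br />
!$OMP+DEFAULT( SHARED )<br />
!$OMP+PRIVATE( I, WORK )<br />
 DO I = 1, 100<br />
    WORK     = 2.0 * REAL( I )<br />
    TOTAL(I) = WORK<br />
 ENDDO<br />
!$OMP END PARALLEL DO<br />
 PRINT*, SUM( TOTAL )<br />
 END PROGRAM PRIVATE_DEMO<br />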
<br />
This is illustrated by the following [[GEOS-Chem_Unit_Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|unit test simulation]] of the [[GEOS-Chem v9-01-02]] provisional release code (submitted at 2:11 PM on 21 Feb 2014):<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (707260156 and 707260156)<br />
### Checksums : DIFFERENT (895530022 and 2949483685)<br />
### Diffs : DIFFERENT<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the above test, all TOMAS diagnostics (ND59, ND60, and ND61) were turned on. The restart files (here named <tt>trac_rst.*</tt>) from the single-processor and multi-processor stages of the unit test are identical, but the <tt>ctm.bpch</tt> files (here named <tt>trac_avg.*</tt>) were different. When the restart files are identical, that means the single-processor and multi-processor stages produced identical tracer concentrations (and soil NOx quantities). <br />
<br />
The only differences in the <tt>trac_avg.*</tt> files between the single-processor and multi-processor stages of the unit test were in TOMAS diagnostic quantities. The affected categories appear to be <tt>TMS-COND</tt>, <tt>TMS-COAG</tt>, <tt>TMS-NUCL</tt>, and <tt>AERO-FIX</tt>, which points to the ND60 diagnostic.<br />
<br />
In order to confirm that the ND60 diagnostic exhibits the problem, we ran an additional unit test with ND59 and ND61 turned on, but ND60 turned off. This unit test, which was submitted at 3:33PM on 21 Feb 2014, yielded identical results.<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (690218236 and 690218236)<br />
### Checksums : IDENTICAL (4196844107 and 4196844107)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
We are still looking into this issue. Because this issue only affects the ND60 diagnostic output, but not tracer concentrations, we are moving ahead with the TOMAS benchmarks for [[GEOS-Chem v9-02]] (as of 21 Feb 2014). <br />
<br />
--[[User:Bmy|Bob Y.]] 16:17, 21 February 2014 (EST)<br />
<br />
=== Offline Dust ===<br />
Currently, GEOS-Chem with TOMAS uses prescribed offline dust aerosol data in radiative transfer / photolysis calculations. Due to complications, this is turned off entirely for 2x2.5 resolution.<br />
<br />
=== Vertical Grids ===<br />
Currently, GC-TOMAS is only compatible with the reduced vertical grids:<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.3.1 GEOS3_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.4.1 GEOS4_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.5.1 GEOS5_47L]<br />
<br />
Development for the full vertical grids is ongoing.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:43, 10 February 2010 (EST)<br />
<br />
== Obsolete versions of TOMAS ==<br />
<br />
In this section we preserve information that pertained to older versions of TOMAS (before the [[GEOS-Chem v9-02]] release).<br />
<br />
=== Code structure ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
The main-level <tt>Code</tt> directory has now been divided into several subdirectories:<br />
<br />
GeosCore/ GEOS-Chem "core" routines<br />
GeosTomas/ Parallel copies of GEOS-Chem routines that reference TOMAS<br />
GeosUtil/ "Utility" modules (e.g. error_mod.f, file_mod.f, time_mod.f, etc.<br />
Headers/ Header files (define.h, CMN_SIZE, CMN_DIAG, etc.)<br />
KPP/ KPP solver directory structure<br />
bin/ Directory where executables are placed<br />
doc/ Directory where documentation is created<br />
help/ Directory for GEOS-Chem Help Screen<br />
lib/ Directory where library files are placed<br />
mod/ Directory where module files are placed<br />
obsolete/ Directory where obsolete versions of code are archived<br />
<br />
Because there were a lot of TOMAS-related modifications in several GEOS-Chem "core" routines, the routines that need to "talk" to TOMAS were placed into a separate subdirectory named <tt>GeosTomas/</tt>. The files in <tt>GeosTomas</tt> are:<br />
<br />
Files:<br />
------<br />
Makefile -- GEOS-Chem routines that have been<br />
aero_drydep.f modified to reference the TOMAS aerosol<br />
carbon_mod.f microphysics package. These are kept<br />
chemdr.f in a separate GeosTomas directory so that<br />
chemistry_mod.f they do not interfere with the routines<br />
cleanup.f in the GeosCore directory.<br />
diag3.f<br />
diag_mod.f The GeosTomas directory only needs to<br />
diag_pl_mod.f contain the files that have been modified<br />
drydep_mod.f for TOMAS. The Makefile will look for<br />
dust_mod.f all other files from the GeosCore directory<br />
emissions_mod.f using the VPATH option in GNU Make.<br />
gamap_mod.f<br />
initialize.f NOTE to GEOS-Chem developers: When you<br />
input_mod.f make changes to any of these routines<br />
isoropia_mod.f in the GeosCore directory, you must also<br />
logical_mod.f make the same modifications to the<br />
ndxx_setup.f corresponding routines in the GeosTomas<br />
planeflight_mod.f directory.<br />
seasalt_mod.f<br />
sulfate_mod.f Maybe in the near future we can work<br />
tomas_mod.f towards integrating TOMAS into the GeosCore<br />
tomas_tpcore_mod.f90 directory more cleanly. However, due to<br />
tpcore_mod.f the large number of modifications that were<br />
tpcore_window_mod.f necessary for TOMAS, it was quicker to<br />
tracerid_mod.f implement the TOMAS code in a separate<br />
wetscav_mod.f subdirectory. <br />
xtra_read_mod.f -- Bob Y. (1/25/10)<br />
<br />
Each of these files was merged with the corresponding file in the <tt>GeosCore</tt> subdirectory. Therefore, in addition to having the GEOS-Chem modifications from [[GEOS-Chem v8-02-05|v8-02-05]], these files also have the relevant TOMAS references.<br />
<br />
A few technical considerations dictated the placing of these files into a separate <tt>GeosTomas/</tt> directory:<br />
<br />
* The ND60 diagnostic in the standard GEOS-Chem code (in <tt>GeosCore/</tt>) is now used for the CH4 offline simulation, but in TOMAS it's used for something else. <br />
* Some parameters needed to be declared differently for simulations with TOMAS. <br />
* Because not all GEOS-Chem users will choose to use TOMAS, we did not want to unnecessarily bog down the code in <tt>GeosCore/</tt> with references to TOMAS-specific routines. <br />
<br />
All of these concerns could be best solved by keeping parallel copies of the affected routines in the <tt>GeosTomas</tt> directory.<br />
<br />
--[[User:Bmy|Bob Y.]] 13:35, 25 February 2010 (EST)<br />
<br />
=== Building GEOS-Chem with TOMAS ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 25 Feb 2014)</p>'''</div><br />
<br />
The <tt>VPATH</tt> feature of [http://www.gnu.org/software/make/manual/make.html GNU Make] is used to simplify the compilation. When GEOS-Chem is compiled with the tomas target, the GNU Make utility will search for files in the <tt>GeosTomas/</tt> directory first. If it cannot find a file there, it will then search the <tt>GeosCore/</tt> directory. Thus, if we make a change to a "core" GEOS-Chem routine in the <tt>GeosCore/</tt> subdirectory (say in <tt>dao_mod.f</tt> or <tt>diag49_mod.f</tt>), those changes will automatically be applied when you build GEOS-Chem with TOMAS. Therefore, we only need to keep in <tt>GeosTomas/</tt> separate copies of those files that have to "talk" with TOMAS.<br />
<br />
Several new targets were added to the <tt>Makefile</tt> in the top-level <tt>Code/</tt> directory:<br />
<br />
#=============================================================================<br />
# Targets for TOMAS aerosol microphysics code (win, bmy, 1/25/10)<br />
#=============================================================================<br />
<br />
.PHONY: tomas libtomas exetomas cleantomas<br />
<br />
tomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes all<br />
<br />
libtomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes lib<br />
<br />
exetomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes exe<br />
<br />
cleantomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes clean<br />
<br />
You can build GEOS-Chem with the TOMAS option by typing:<br />
<br />
make tomas ...<br />
<br />
This will automatically do the proper things to build the TOMAS code into GEOS-Chem, such as:<br />
<br />
* Adding a <tt>-DTOMAS</tt> C-preprocessor switch to the <tt>FFLAGS</tt> compiler flag settings in <tt>Makefile_header.mk</tt>. This will cause TOMAS-specific areas of code to be turned on.<br />
* Turning off OpenMP parallelization. For now the GEOS-Chem + TOMAS code needs to be run on a single processor. We continue to work on parallelizing the code.<br />
* Calling the Makefile in the <tt>GeosTomas/</tt> subdirectory to build the executable. The executable file is now named <tt>geostomas</tt> in order to denote that the TOMAS code is built in.<br />
<br />
The GEOS-Chem + TOMAS code has been built with the following compilers:<br />
<br />
* Intel Fortran compiler v10<br />
* Intel Fortran compiler v11.1 (20101201)<br />
* SunStudio 12<br />
<br />
--[[User:Bmy|Bob Y.]] 10:36, 27 January 2010 (EST)<br />
<br />
=== Compile from GeosTomas directory ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:I think there is something going wrong in my compilation, although errors have come up at both compile time and run time. The worst of the problems is this: I'll make a change to any fortran file in the code (even something meaningless like print*, 'foo') and hundreds of compile errors come out with fishy error messages such as (from ifort v10.1):<br />
<br />
***fortcom: Error: chemistry_mod.f, line 478: A kind type parameter must be a compile-time constant. [DP]<br />
REAL(kind=dp) :: RCNTRL(20)<br />
<br />
:Any advice? The errors I'm having are not unique to any version of GC, any type of met fields, any compiler, etc.<br />
<br />
'''''[mailto:yantosca@seas.harvard.edu Bob Yantosca] wrote:'''''<br />
<br />
:Make sure you are always in the GeosTomas subdirectory when you build the code. Sometimes there is a problem if you build the code from a higher level directory. This may have to do with the VPATH in the makefile.<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:Thanks, that seems to do the trick.<br />
<br />
--[[User:Bmy|Bob Y.]] 14:37, 14 April 2010 (EDT)</div>
Salvatore Farina
https://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_aerosol_microphysics&diff=16042
TOMAS aerosol microphysics - 2014-03-03T20:51:59Z
<p>Salvatore Farina: /* Computational Information */</p>
<hr />
<div>This page describes the TOMAS aerosol microphysics option in GEOS-Chem. TOMAS is one of two aerosol microphysics packages being incorporated into GEOS-Chem, the other being [[APM aerosol microphysics|APM]].<br />
<br />
== Overview ==<br />
<br />
The TwO-Moment Aerosol Sectional (TOMAS) microphysics package was developed for implementation into GEOS-Chem at Carnegie-Mellon University. Using a moving sectional and moment-based approach, TOMAS tracks two independent moments (number and mass) of the aerosol size distribution for a number of discrete size bins. It also contains code to simulate nucleation, condensation, and coagulation processes. The aerosol species that are considered with high size resolution are sulfate, sea-salt, OC, EC, and dust. An advantage of TOMAS is the full size resolution for all chemical species and the conservation of aerosol number, the latter of which allows one to construct aerosol and CCN number budgets that will balance.<br />
<br />
=== Authors and collaborators ===<br />
* [mailto:petera@andrew.cmu.edu Peter Adams] ''(Carnegie-Mellon U.)'' -- Principal Investigator<br />
* [mailto:wtrivita@staffmail.ed.ac.uk Win Trivitayanurak] ''(Department of Highways, Thailand)''<br />
* [mailto:dwesterv@andrew.cmu.edu Dan Westervelt] ''(Carnegie-Mellon U.)''<br />
* [mailto:jeffrey.pierce@dal.ca Jeffrey Pierce] ''(Dalhousie U.)''<br />
* [mailto:sal.farina@gmail.com Salvatore Farina] ''(Colorado State U.)''<br />
<br />
Questions regarding TOMAS can be directed to Dan (e-mail linked above).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:53, 27 January 2010 (EST)<br />
<br />
=== TOMAS User Groups ===<br />
<br />
{| border=1 cellspacing=0 cellpadding=5<br />
|- bgcolor="#cccccc"<br />
!User Group<br />
!Personnel<br />
!Projects<br />
|-valign="top"<br />
|[http://www.ce.cmu.edu/%7Eadams/ Carnegie-Mellon University]<br />
|[http://www.ce.cmu.edu/%7Eadams/people.htm#peter Peter Adams]<br>[http://www.ce.cmu.edu/~dwesterv/Site/Home.html Dan Westervelt]<br />
| [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html New particle formation evaluation in GC-TOMAS] <br> Sensitivity of CCN to nucleation rates <br> Development of number tagging and source apportionment model for GC-TOMAS<br />
|-valign="top"<br />
|[http://fizz.phys.dal.ca/%7Epierce/ Dalhousie University] <br> [http://www.atmos.colostate.edu/faculty/pierce.php Colorado State]<br />
|[http://atm.dal.ca/Faculty/Jeffrey_Pierce.php Jeffrey Pierce]<br>Sal Farina<br>Stephen D'Andrea<br />
|Sensitivity of CCN to condensational growth rates <br> TOMAS parallelization <br> Others...<br />
|-valign="top"<br />
|Add yours here<br />
|<br />
|<br />
|}<br />
<br />
== TOMAS-specific setup ==<br />
TOMAS has its own run directories (run.Tomas) that can be downloaded from the Harvard FTP. The <tt>input.geos</tt> file will look slightly different from that of standard GEOS-Chem, and it also differs between versions.<br />
<br />
Pre- v9.02:<br />
To turn on TOMAS, see the "Microphysics menu" in <tt>input.geos</tt> and make sure TOMAS is set to '''T'''. <br />
<br />
v9.02 and later:<br />
TOMAS is enabled or disabled at compile time - the TOMAS flag in input.geos has been removed.<br />
<br />
<br />
TOMAS is a simulation type 3 and utilizes 171-423 tracers. Each aerosol species requires 30 tracers for the 30 bin size resolution, 12 for the 12 bin, etc. Here is the (abbreviated) default setup in input.geos for TOMAS-30 in v9.02 and later (see run.Tomas directory):<br />
<br />
Tracer # Description <br />
1- 62 Std Geos Chem <br />
63 H2SO4 <br />
64- 93 Number <br />
94-123 Sulfate <br />
124-153 Sea-salt <br />
154-183 Hydrophilic EC <br />
184-213 Hydrophobic EC <br />
214-243 Hydrophilic OC <br />
244-273 Hydrophobic OC <br />
274-303 Mineral dust <br />
304-333 Aerosol water<br />
<br />
TOMAS-40 requires 423 tracers (~360 TOMAS tracers for each of the 40-bin species, and ~62 standard GEOS-Chem tracers) <br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 18:48, 8 July 2013 (EDT)<br />
<br />
== Implementation notes ==<br />
<br />
TOMAS validation in [[GEOS-Chem v8-03-01]] was completed on 24 Feb 2010.<br />
<br />
=== Update April 2013 ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02k|v9-02k]] and approved on 07 Jun 2013.'''''<br />
<br />
Sal Farina has been working with the GEOS-Chem Support Team to inline the TOMAS aerosol microphysics code into the <tt>GeosCore</tt> directory. All TOMAS-specific sections of code are now segregated from the rest of GEOS-Chem with C-preprocessor statements such as:<br />
<br />
#if defined( TOMAS )<br />
<br />
# if defined( TOMAS40 ) <br />
... Code for 40 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS12 )<br />
... Code for 12 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS15 )<br />
... Code for 15 bin TOMAS simulation (optional) goes here ...<br />
# else<br />
... Code for 30 bin TOMAS simulation (default) goes here ...<br />
# endif<br />
<br />
#endif <br />
<br />
TOMAS is now invoked by compiling GEOS-Chem with one of the following options:<br />
<br />
make -j4 TOMAS=yes ... # Compiles GEOS-Chem for the 30 bin (default) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
or<br />
<br />
make -j4 TOMAS40=yes ... # Compiles GEOS-Chem for the 40 bin (optional) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
All files in the old <tt>GeosTomas/</tt> directory have now been deleted, as these have been rendered obsolete.<br />
<br />
These updates are included in [[GEOS-Chem v9-02]]. These modifications will not affect existing GEOS-Chem simulations, as no TOMAS code is compiled into the executable unless you specify either <tt>TOMAS=yes</tt> or <tt>TOMAS40=yes</tt> at compile time.<br />
<br />
We are in the process of updating the wiki to reflect these changes as they are implemented. <br />
<br />
--[[User:Bmy|Bob Y.]] 13:59, 23 April 2013 (EDT)<br><br />
--[[User:Salvatore Farina|Salvatore Farina]] 13:49, 4 June 2013 (EDT)<br />
<br />
== Computational Information ==<br />
<br />
GC-TOMAS v9-02 (30 sections) on 8 processors: <br />
One year simulation = 7-8 days wall clock time<br />
<br />
More speedups are available using lower aerosol size resolution<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:00, 07 May 2013 (EST)<br />
<br />
GC-TOMAS v9-03 on 16 processors (glooscap)<br />
<br />
12 bin: 2.8 days wall time per sim year<br />
<br />
15 bin: 3.3 days wall time per sim year<br />
<br />
30 bin: 6.1 days wall time per sim year<br />
<br />
40 bin: 7.8 days wall time per sim year<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 15:51, 3 March 2014 (EST)<br />
<br />
== Microphysics Code==<br />
The aerosol microphysics code is largely contained within the file <tt>tomas_mod.f</tt>. <tt>Tomas_mod</tt> and its subroutines are modular -- they use their own internal variables. For details, see <tt>tomas_mod.f</tt> and its comments. <br />
<br />
=== Nucleation ===<br />
The choice of nucleation theory is selected in the header section of <tt>tomas_mod.f</tt>. The current choices are binary homogeneous nucleation (Vehkamaki et al., 2002), ternary homogeneous nucleation (Napari et al., 2002), ion-mediated nucleation (Yu, 2008), and activation nucleation (Kulmala, 2006). The ternary nucleation rate is typically scaled by a globally uniform tuning factor of 10^-4 or 10^-5.<br />
<br />
In TOMAS-12 and TOMAS-30, nucleated particles follow the Kerminen approximation to grow to the smallest size bin. This has a tendency to overpredict the number of particles in the smallest bins of those models. See Y. H. Lee, J. R. Pierce, and P. J. Adams 2013 [http://www.geosci-model-dev-discuss.net/6/893/2013/gmdd-6-893-2013.html here] for more details on the consequences of this.<br />
<br />
=== Condensation ===<br />
<br />
=== Coagulation ===<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 14:08, 9 May 2011 (EST)<br />
<br />
== Validation ==<br />
<br />
GC-TOMAS [[GEOS-Chem v8-03-01|v8-03-01]] generally compares very well with observations and other models. Please see our [http://acmg.seas.harvard.edu/geos/wiki_docs/TOMAS/TOMAS_benchmark_ForHarvard.pdf GC-TOMAS v8-02-05 validation document] for more information and figures. <br />
<br />
Below are some results of benchmarking GC-TOMAS with earlier versions of the model as well as observations:<br />
<br />
[[Image:CN10_smaller.jpg]]<br />
<br />
'''Figure 1: CN10 concentrations predicted by GC-TOMAS v8-02-05 against observations. Descriptions of observational data can be found on p 5454 of Pierce et al, Atmos. Chem. Phys., 7, 2007.'''<br />
<br />
----<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:13, 10 February 2010 (EST)<br />
<br />
== Other features of TOMAS ==<br />
Other varieties of TOMAS are suited for specific science questions, for example with nucleation studies where explicit aerosol dynamics are needed for nanometer-sized particles. <br />
<br />
=== Set-up Guide ===<br />
<br />
This [[TOMAS setup guide]] was written for users on ACE-NET's Glooscap cluster, but may be more generally applicable.<br />
Please contact [mailto:sal.farina@gmail.com Salvatore Farina] for help in obtaining the latest development version of GEOS-Chem with TOMAS.<br />
This will allow you to take advantage of parallel computation in TOMAS.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 11:55, 26 July 2013 (EDT)<br />
<br />
=== Size Resolution ===<br />
<br />
;TOMAS-30: All 7 chemical species have size resolution ranging from 10 nm to 10 µm, spanned by 30 logarithmically spaced (mass doubling) bins.<br />
;TOMAS-40: Same as TOMAS-30 with 10 additional (mass doubling) sub-10nm bins with a lower limit ~1nm<br />
;TOMAS-12: All 7 chemical species have size resolution ranging from 10 nm to 1 µm, spanned by 10 logarithmically spaced (mass quadrupling) bins and two supermicron bins. Coarser resolution than TOMAS-30, with improved computation time. <br />
;TOMAS-15: Same as TOMAS-12, with 3 additional (mass quadrupling) sub-10nm bins with a lower limit of ~2 nm. Analogous to TOMAS-40, with improved computation time.<br />
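The bin structures above follow directly from the mass ratio between adjacent bins: mass doubling implies a diameter ratio of 2^(1/3) between adjacent bin edges, and mass quadrupling implies 4^(1/3). A minimal sketch of how such bin edges could be generated (illustrative only, not the actual TOMAS grid code):<br />
<br />
 PROGRAM BIN_EDGES<br />
 ! Illustrative sketch only -- not the actual TOMAS grid code.<br />
 ! 30 mass-doubling bins starting at a 10 nm lower edge: each<br />
 ! diameter edge grows by 2**(1/3), so the 30th upper edge is<br />
 ! 10 nm * 2**10 = 10.24 um, i.e. the 10 nm - 10 um range above.<br />
 IMPLICIT NONE<br />
 INTEGER :: K<br />
 REAL*8  :: EDGE<br />
 DO K = 0, 30<br />
    EDGE = 1.0D-8 * 2.0D0**( DBLE( K ) / 3.0D0 )<br />
    PRINT*, K, EDGE          ! edge diameter in meters<br />
 ENDDO<br />
 END PROGRAM BIN_EDGES<br />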
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 12:51, 4 June 2013 (EDT)<br />
<br />
=== Nesting and grid size ===<br />
TOMAS is implemented on a 2x2.5 North American domain. Developed by Jeffrey Pierce (jeffrey.pierce@dal.ca)<br />
<br />
=== AOD, CCN post-processing code ===<br />
Code is available for calculating aerosol optical depth using TOMAS-predicted aerosol composition and size together with Mie theory, and for calculating CCN concentrations from TOMAS size-resolved composition and Kohler theory. Developed by Yunha Lee and Jeffrey Pierce, adapted for GEOS-Chem output by Jeffrey Pierce.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 2:00, 9 May 2011 (EST)<br />
<br />
== References ==<br />
<br />
In this section we provide references relevant to TOMAS aerosol microphysics simulations.<br />
<br />
=== Studies using TOMAS simulations ===<br />
#'''Nucleation in GEOS-Chem:''' Westervelt, D. M., Pierce, J. R., Riipinen, I., Trivitayanurak, W., Hamed, A., Kulmala, M., Laaksonen, A., Decesari, S., and Adams, P. J.: ''Formation and growth of nucleated particles into cloud condensation nuclei: model-measurement comparison'', <u>Atmos. Chem. Phys. Discuss.</u>, '''13''', 8333-8386, doi:10.5194/acpd-13-8333-2013, 2013. [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html LINK]<br />
#'''TOMAS implementation in GEOS-Chem:''' Trivitayanurak, W., Adams, P. J., Spracklen, D. V. and Carslaw, K. S.: ''Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison'', <u>Atmos. Chem. Phys.</u>, '''8'''(12), 3149-3168, 2008.<br />
#'''TOMAS initial paper, sulfate only:''' Adams, P. J. and Seinfeld, J. H.: ''Predicting global aerosol size distributions in general circulation models'', <u>J. Geophys. Res.-Atmos.</u>, '''107'''(D19), 4370, doi:10.1029/2001JD001010, 2002.<br />
#'''TOMAS with sea-salt:''' Pierce, J.R., and Adams P.J., ''Global evaluation of CCN formation by direct emission of sea salt and growth of ultrafine sea salt'', <u>J. Geophys. Res.-Atmos.</u>, '''111''' (D6), doi:10.1029/2005JD006186, 2006.<br />
#'''TOMAS with carbonaceous aerosol:''' Pierce, J. R., Chen, K. and Adams, P. J.: ''Contribution of primary carbonaceous aerosol to cloud condensation nuclei: processes and uncertainties evaluated with a global aerosol microphysics model'', <u>Atmos. Chem. Phys.</u>, '''7'''(20), 5447-5466, doi:10.5194/acp-7-5447-2007, 2007.<br />
#'''TOMAS with dust:''' Lee, Y.H., K. Chen, and P.J. Adams, 2009: ''Development of a global model of mineral dust aerosol microphysics''. <u>Atmos. Chem. Phys.</u>, '''9''', 2441-2458, doi:10.5194/acp-9-2441-2009.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:04, 24 February 2014 (EST)<br />
<br />
=== Input data used by TOMAS ===<br />
#Usoskin, I. G. and Kovaltsov, G. A., ''Cosmic ray induced ionization in the atmosphere: Full modeling and practical applications'', <u>J. Geophys. Res.</u>, '''111''', doi:10.1029/2006JD007150, 2006.<br />
#Yu, Fangqun, et al, ''Ion-mediated nucleation in the atmosphere: Key controlling parameters, implications, and look-up table'', <u>J. Geophys. Res.</u>, '''115''', D03206, doi:10.1029/2009JD012630, 2010.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:03, 24 February 2014 (EST)<br />
<br />
== Previous issues now resolved ==<br />
<br />
=== Minor bug in TOMAS sulfate emissions ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02o|v9-02o]] and approved on 03 Sep 2013.'''''<br />
<br />
'''''[mailto:sal.farina@gmail.com Sal Farina] wrote:'''''<br />
:Calling mnfix before and after emission ensures the size distribution is well behaved, and eliminates "Negative SF emis" warnings. An edit to mnfix was also introduced, whereby "tiny" mass added to zero mass, "epsilon" number situations resulted in very high mass per particle results - necessitating excessive error detection, correction, and verbosity.<br />
<br />
--[[User:Melissa Payer|Melissa Sulprizio]] 15:08, 7 August 2013 (EDT)<br />
<br />
=== Segmentation Fault ===<br />
You may get an early segfault if your stacksize is not set to either unlimited or a very large number. To avoid this, you either have to reset the stacksize limit in your shell startup file (e.g. via <tt>.cshrc</tt> for csh/tcsh) or use the <tt>ulimit</tt> command (for bash). See [http://wiki.seas.harvard.edu/geos-chem/index.php/Machine_issues_%26_portability#Resetting_stacksize_for_Linux this page] for details.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:20, 10 February 2010 (EST)<br />
<br />
=== Updates for GEOS-Chem v9-02 public release ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: As described below, there appears to be [[#Potential parallelization problems|a potential parallelization problem with the TOMAS ND60 diagnostic]]. We are currently looking into this. This issue, however, does not affect the tracer concentrations computed by TOMAS, but only the output of the ND60 diagnostic itself. For this reason we are moving ahead with the TOMAS benchmarks for v9-02. (Bob Yantosca, 21 Feb 2014)</p>'''</div><br />
<br />
We have found and fixed several minor numerical and coding issues prior to the public release of [[GEOS-Chem v9-02]] (01 Mar 2014). The TOMAS40 simulation has been validated with the [[GEOS-Chem Unit Tester]]. Below is the [[GEOS-Chem Unit Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|output of a unit test]] that was submitted on 2014/02/21 at 12:47:26 PM:<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (680420788 and 680420788)<br />
### Checksums : IDENTICAL (179613338 and 179613338)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the subsections below, we describe in more detail the fixes that we made for [[GEOS-Chem v9-02]]:<br />
<br />
==== Fixes for minor coding errors ====<br />
<br />
#<p>In <tt>GeosCore/main.F</tt>, we replaced <tt>CALL FLUSH()</tt> with <tt>CALL FLUSH(6)</tt>. The <tt>FLUSH</tt> routine needs to take an argument; unit 6 is the stdout unit (i.e. the screen and/or log file).</p><br />
#<p>In routine <tt>CHEM_SO2</tt> (in module <tt>GeosCore/sulfate_mod.F</tt>), we now avoid referencing the dust tracers DST1, DST2, DST3, and DST4 for TOMAS simulations. TOMAS uses size-resolved dust tracers, and therefore does not carry the DST1-4 tracers. This error seems to have been introduced along with the fix for cloud pH in Sep 2013.</p><br />
#<p>In routine <tt>COND_NUC</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we added error traps to avoid the division-by-zero errors that occurred when the variable <tt>CSCH</tt> was zero. When <tt>CSCH</tt> is zero, we now set the variable <tt>ADDT</tt> to zero; <tt>ADDT</tt> will then get reassigned to a minimum time step, so this fix should work OK.</p><br />
#<p>In <tt>GeosCore/gamap_mod.F</tt>, we have now restored several entries to <tt>tracerinfo.dat</tt> for the ND44 diagnostic that were not getting properly printed out when the TOMAS simulation was being used.</p><br />
#<p>In module <tt>GeosCore/drydep_mod.F</tt>, we now set <tt>MAXDEP=105</tt> for all simulations, including TOMAS. Formerly, TOMAS used <tt>MAXDEP=100</tt>; the slightly larger value is harmless for the other simulation types.</p><br />
#<p>In module <tt>GeosCore/diag3.F</tt>, we now avoid an out-of-bounds error in <tt>DEPNAME(N)</tt> during TOMAS simulations. We save the drydep species name from <tt>DEPNAME(N)</tt> into a new variable <tt>DRYDEP_NAME</tt> for <tt>N = 1..NUMDEP</tt>. We then set <tt>DRYDEP_NAME = ''</tt> for <tt>N > NUMDEP</tt>. This error occurs because we extend the # of drydep tracers during TOMAS simulations to account for the size bins.</p><br />
#<p>We have fixed a couple of logical errors that prevented dust emissions from happening. Minor modifications were made to IF statements in <tt>GeosCore/chemistry_mod.F</tt>, <tt>GeosCore/dust_mod.F</tt>, and <tt>GeosCore/input_mod.F</tt>.</p><br />
#<p>In file <tt>GeosCore/Makefile</tt>, make sure to add <tt>tomas_mod.o</tt> to the list of modules used by <tt>wetscav_mod.F</tt> (aka the "dependency listing"). The corrected code should look like this:</p><br />
<br />
wetscav_mod.o : wetscav_mod.F \<br />
dao_mod.o diag_mod.o \<br />
depo_mercury_mod.o get_ndep_mod.o \<br />
get_popsinfo_mod.o tracerid_mod.o \<br />
tracer_mod.o tomas_mod.o<br />
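<br />
The error trap in item 3 above can be sketched as follows. This is an illustrative C transliteration, not the actual Fortran from <tt>tomas_mod.F</tt>; the function name and surrounding logic are hypothetical, and only the variable names <tt>CSCH</tt> and <tt>ADDT</tt> are taken from the fix.<br />

```c
#include <assert.h>

/* Hypothetical sketch of the COND_NUC error trap: when csch is zero we set
 * addt to zero instead of dividing, and a zero addt is then reassigned to
 * a minimum time step, as described in the fix above. */
double safe_addt(double tau, double csch, double min_step)
{
    double addt;
    if (csch == 0.0) {
        addt = 0.0;          /* avoid division by zero */
    } else {
        addt = tau / csch;   /* normal case */
    }
    if (addt <= 0.0) {
        addt = min_step;     /* reassign zero to the minimum time step */
    }
    return addt;
}
```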
<br />
--[[User:Bmy|Bob Y.]] 10:20, 19 February 2014 (EST)<br />
<br />
==== Fixes for parallelization errors ====<br />
<br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we need to add the following variables to the <tt>!$OMP+PRIVATE</tt> statement: <tt>TRACNUM</tt>, <tt>NH3_TO_NH4</tt>, and <tt>SURF_AREA</tt>. Adding these now causes TOMAS to have identical sp vs. mp results when chemistry and microphysics are turned on.</p><br />
#<p>In routine <tt>DEPVEL</tt> (in <tt>GeosCore/drydep_mod.F</tt>): Instead of holding <tt>A_RADI</tt> and <tt>A_DEN</tt> as <tt>!$OMP+PRIVATE</tt> in TOMAS simulations (in the main DO loop in <tt>DEPVEL</tt>), we now save the particle size and density values to private variables <tt>DIAM</tt> and <tt>DEN</tt>. We then pass those as arguments to function <tt>DUST_SFCRSII</tt>.</p> <br />
#<p>We have corrected an issue in routine <tt>NFCLDMX</tt> (in module <tt>GeosCore/convection_mod.F</tt>) that potentially impacts the TOMAS wet scavenging, as described below:</p><br />
#*<p>We think there are different results for parallel and serial because of an assumption that's true for normal simulations but fails on TOMAS. The assumption is "tracers are independent through wet scavenging." Since TOMAS scavenging is size dependent, removing material from the distribution before calculating the soluble fraction of another component is "wrong." We now compute the fractions explicitly before the removal step. To do this, we now call routine <tt>COMPUTE_F</tt> in its own parallel DO loop located immediately before the main parallel do loop in <tt>NFCLDMX</tt>.</p><br />
#*<p>This modification also required the ND37 diagnostic IF block to be put into the same loop as <tt>COMPUTE_F</tt>. Furthermore, because <tt>COMPUTE_F</tt> returns the value of diagnostic index <tt>ISOL</tt>, and because <tt>ISOL</tt> is also used for the ND38 diagnostic in the main parallel loop below, we must also save the values of <tt>ISOL</tt> in a 1-D vector. This will allow the values of ISOL to be passed from the first parallel loop to the second. This ensures that the ND37 and ND38 diagnostics will be computed properly for all GEOS-5 simulations that have soluble tracers.</p><br />
#*<p>This modification has been tested in the [[GEOS-Chem Unit Tester]] by Bob Yantosca (04 Feb 2014) and it has yielded identical results for <tt>geos5_4x5_fullchem</tt>, <tt>geos5_4x5_Hg</tt>, <tt>geos5_4x5_RnPbBe</tt>, <tt>geos5_4x5_soa</tt> and <tt>geos5_4x5_soa_svpoa</tt> simulations.</p><br />
#<p>We have made some fixes in <tt>GeosCore/wetscav_mod.F</tt> that caused single-processor TOMAS runs to have different output than multi-processor runs. A few instances of code were computing quantities sequentially and then storing them for later use. These were technically thread-safe, but were susceptible to error because the order of computation would be different when running with parallelization turned on. These sections of code have now been rewritten accordingly.</p><br />
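<br />
The private-variable fix in item 1 above belongs to a general bug class that is easy to demonstrate. Below is a minimal C/OpenMP sketch with hypothetical names (GEOS-Chem itself uses Fortran <tt>!$OMP</tt> directives): each thread needs its own copy of any per-iteration scratch variable, otherwise threads overwrite each other's values and serial and parallel runs produce different results.<br />

```c
#include <assert.h>

#define N 1000

/* If 'scratch' were shared between threads (the bug class fixed in
 * AEROPHYS above), threads would clobber each other's value.  Declaring
 * it private() gives each thread its own copy, which is the C analogue
 * of adding a variable to an !$OMP+PRIVATE clause.  Without OpenMP the
 * pragma is ignored and the loop runs serially with the same result. */
void fill_squares(long out[N])
{
    long scratch;
    #pragma omp parallel for private(scratch)
    for (long i = 0; i < N; i++) {
        scratch = i * i;     /* per-iteration temporary */
        out[i] = scratch;
    }
}
```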
<br />
--[[User:Bmy|Bob Y.]] 14:09, 21 February 2014 (EST)<br />
<br />
==== Removed inefficient subroutine calls ====<br />
<br />
#<p>In <tt>GeosCore/diag3.F</tt>, we now use a 2-D array <tt>(J-L)</tt> for archiving into the ND60 TOMAS diagnostic. This eliminates an array temporary in the call to routine BPCH2.</p><br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we now use an array <tt>ERR_IND</tt> to pass the I,J,L,N indices to error checking routine <tt>CHECK_VALUE</tt>. We previously used an array descriptor <tt>(/I,J,L,0/)</tt> which caused an array temporary to be created.</p><br />
#<p>In routine <tt>EMISSCARBON</tt> (in module <tt>GeosCore/carbon_mod.F</tt>), we removed array temporaries from the calls to subroutine <tt>EMITSGC</tt>. We now sum two arrays into a temporary array, and then pass that to <tt>EMITSGC</tt>.</p><br />
#<p>We rewrote the subroutine calls to <tt>NH4BULKTOBIN</tt> to avoid the creation of array temporaries. In most cases this was done by replacing <tt>MK(1:IBINS,SRTSO4)</tt> with <tt>MK(:,SRTSO4)</tt>, etc. Explicitly stating the sub-slice <tt>MK(1:IBINS,SRTSO4)</tt> causes the compiler to create an array temporary, whereas <tt>MK(:,SRTSO4)</tt> allows a more efficient array descriptor to be passed instead.</p><br />
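<br />
The array-temporary point in item 4 is Fortran-specific: a whole-dimension slice such as <tt>MK(:,SRTSO4)</tt> is contiguous and can be passed without copying. A loose C analogy is sketched below (all names are hypothetical): the efficient version passes a pointer into the existing storage instead of copying a column into a scratch buffer.<br />

```c
#include <assert.h>
#include <stddef.h>

#define IBINS 30

/* Sum one species' bins.  Passing a pointer into the existing array (no
 * copy) is the C analogue of passing the contiguous Fortran slice
 * MK(:,SRTSO4); copying the bins into a scratch buffer first would be
 * the analogue of the compiler-generated temporary for
 * MK(1:IBINS,SRTSO4). */
double column_sum(const double *col)
{
    double s = 0.0;
    for (int k = 0; k < IBINS; k++)
        s += col[k];
    return s;
}

/* mk stores each species' bins contiguously, mirroring Fortran
 * column-major layout. */
double species_sum(const double *mk, int species)
{
    return column_sum(&mk[(size_t)species * IBINS]);
}
```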
<br />
--[[User:Bmy|Bob Y.]] 14:47, 31 January 2014 (EST)<br />
<br />
==== Fixes for convenience ====<br />
<br />
#<p>We now read many of the TOMAS data files from the directory <tt>TRIM( DATA_DIR_1x1 ) // 'TOMAS_201402/'</tt>. This saves us from having to keep these big files (some of which approach 100 MB in size) in individual users' run directories.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 16:20, 31 January 2014 (EST)<br />
<br />
== Outstanding issues ==<br />
<br />
=== Potential parallelization problems ===<br />
<br />
We have noticed that there may be a parallelization error in the TOMAS [http://acmg.seas.harvard.edu/geos/doc/man/appendix_5.html ND60 diagnostic]. This may be caused by a coding error; in particular, one or more variables may have been omitted from an <tt>!$OMP+PRIVATE</tt> declaration.<br />
<br />
This is illustrated by the following [[GEOS-Chem_Unit_Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|unit test simulation]] of the [[GEOS-Chem v9-01-02]] provisional release code (submitted at 2:11 PM on 21 Feb 2014):<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (707260156 and 707260156)<br />
### Checksums : DIFFERENT (895530022 and 2949483685)<br />
### Diffs : DIFFERENT<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the above test, all TOMAS diagnostics (ND59, ND60, and ND61) were turned on. The restart files (here named <tt>trac_rst.*</tt>) from the single-processor and multi-processor stages of the unit test are identical, but the <tt>ctm.bpch</tt> files (here named <tt>trac_avg.*</tt>) were different. When the restart files are identical, the single-processor and multi-processor stages produced identical tracer concentrations (and soil NOx quantities). <br />
<br />
The only differences in the <tt>trac_avg.*</tt> files between the single-processor and multi-processor stages of the unit test were in TOMAS diagnostic quantities. The affected categories appear to be <tt>TMS-COND</tt>, <tt>TMS-COAG</tt>, <tt>TMS-NUCL</tt>, and <tt>AERO-FIX</tt>, all of which point to the ND60 diagnostic.<br />
<br />
In order to confirm that the ND60 diagnostic exhibits the problem, we ran an additional unit test with ND59 and ND61 turned on, but ND60 turned off. This unit test, which was submitted at 3:33PM on 21 Feb 2014, yielded identical results.<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (690218236 and 690218236)<br />
### Checksums : IDENTICAL (4196844107 and 4196844107)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
We are still looking into this issue. Because it affects only the ND60 diagnostic output, and not tracer concentrations, we are moving ahead with the TOMAS benchmarks for [[GEOS-Chem v9-02]] (as of 21 Feb 2014). <br />
<br />
--[[User:Bmy|Bob Y.]] 16:17, 21 February 2014 (EST)<br />
<br />
=== Offline Dust ===<br />
Currently, GEOS-Chem with TOMAS uses prescribed offline dust aerosol data in radiative transfer / photolysis calculations. Due to complications, this is turned off entirely for the 2x2.5 resolution.<br />
<br />
=== Vertical Grids ===<br />
Currently, GC-TOMAS is only compatible with the reduced vertical grids:<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.3.1 GEOS3_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.4.1 GEOS4_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.5.1 GEOS5_47L]<br />
<br />
Development for the full vertical grids is ongoing.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:43, 10 February 2010 (EST)<br />
<br />
== Obsolete versions of TOMAS ==<br />
<br />
In this section we preserve information that pertained to older versions of TOMAS (before the [[GEOS-Chem v9-02]] release).<br />
<br />
=== Code structure ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
The main-level <tt>Code</tt> directory has now been divided into several subdirectories:<br />
<br />
GeosCore/ GEOS-Chem "core" routines<br />
GeosTomas/ Parallel copies of GEOS-Chem routines that reference TOMAS<br />
GeosUtil/ "Utility" modules (e.g. error_mod.f, file_mod.f, time_mod.f, etc.<br />
Headers/ Header files (define.h, CMN_SIZE, CMN_DIAG, etc.)<br />
KPP/ KPP solver directory structure<br />
bin/ Directory where executables are placed<br />
doc/ Directory where documentation is created<br />
help/ Directory for GEOS-Chem Help Screen<br />
lib/ Directory where library files are placed<br />
mod/ Directory where module files are placed<br />
obsolete/ Directory where obsolete versions of code are archived<br />
<br />
Because there were a lot of TOMAS-related modifications in several GEOS-Chem "core" routines, the routines that need to "talk" to TOMAS were placed into a separate subdirectory named <tt>GeosTomas/</tt>. The files in <tt>GeosTomas</tt> are:<br />
<br />
Files:<br />
------<br />
Makefile -- GEOS-Chem routines that have been<br />
aero_drydep.f modified to reference the TOMAS aerosol<br />
carbon_mod.f microphysics package. These are kept<br />
chemdr.f in a separate GeosTomas directory so that<br />
chemistry_mod.f they do not interfere with the routines<br />
cleanup.f in the GeosCore directory.<br />
diag3.f<br />
diag_mod.f The GeosTomas directory only needs to<br />
diag_pl_mod.f contain the files that have been modified<br />
drydep_mod.f for TOMAS. The Makefile will look for<br />
dust_mod.f all other files from the GeosCore directory<br />
emissions_mod.f using the VPATH option in GNU Make.<br />
gamap_mod.f<br />
initialize.f NOTE to GEOS-Chem developers: When you<br />
input_mod.f make changes to any of these routines<br />
isoropia_mod.f in the GeosCore directory, you must also<br />
logical_mod.f make the same modifications to the<br />
ndxx_setup.f corresponding routines in the GeosTomas<br />
planeflight_mod.f directory.<br />
seasalt_mod.f<br />
sulfate_mod.f Maybe in the near future we can work<br />
tomas_mod.f towards integrating TOMAS into the GeosCore<br />
tomas_tpcore_mod.f90 directory more cleanly. However, due to<br />
tpcore_mod.f the large number of modifications that were<br />
tpcore_window_mod.f necessary for TOMAS, it was quicker to<br />
tracerid_mod.f implement the TOMAS code in a separate<br />
wetscav_mod.f subdirectory. <br />
xtra_read_mod.f -- Bob Y. (1/25/10)<br />
<br />
Each of these files was merged with the corresponding file in the <tt>GeosCore</tt> subdirectory. Therefore, in addition to having the GEOS-Chem modifications from [[GEOS-Chem v8-02-05|v8-02-05]], these files also have the relevant TOMAS references.<br />
<br />
A few technical considerations dictated the placing of these files into a separate <tt>GeosTomas/</tt> directory:<br />
<br />
* The ND60 diagnostic in the standard GEOS-Chem code (in <tt>GeosCore/</tt>) is now used for the CH4 offline simulation, but TOMAS uses it for its own microphysics diagnostics. <br />
* Some parameters needed to be declared differently for simulations with TOMAS. <br />
* Because not all GEOS-Chem users will choose to use TOMAS, we did not want to unnecessarily bog down the code in <tt>GeosCore/</tt> with references to TOMAS-specific routines. <br />
<br />
All of these concerns could be best solved by keeping parallel copies of the affected routines in the <tt>GeosTomas</tt> directory.<br />
<br />
--[[User:Bmy|Bob Y.]] 13:35, 25 February 2010 (EST)<br />
<br />
=== Building GEOS-Chem with TOMAS ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 25 Feb 2014)</p>'''</div><br />
<br />
The <tt>VPATH</tt> feature of [http://www.gnu.org/software/make/manual/make.html GNU Make] is used to simplify the compilation. When GEOS-Chem is compiled with the tomas target, the GNU Make utility will search for files in the <tt>GeosTomas/</tt> directory first. If it cannot find files there, it will then search the <tt>GeosCore/</tt> directory. Thus, if we make a change to a "core" GEOS-Chem routine in the <tt>GeosCore/</tt> subdirectory (say in <tt>dao_mod.f</tt> or <tt>diag49_mod.f</tt>), then those changes will automatically be applied when you build GEOS-Chem with TOMAS. Therefore, we only need to keep separate copies in <tt>GeosTomas/</tt> of those files that have to "talk" with TOMAS.<br />
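<br />
The lookup order described above can be sketched with a minimal Makefile fragment. This is illustrative only: the directory layout follows this page, but the variables and pattern rule are simplified, not the actual GEOS-Chem Makefile.<br />

```make
# Search GeosTomas/ first, then GeosCore/, for any prerequisite not
# found in the current directory.
VPATH = ../GeosTomas:../GeosCore

# With this rule, a file present in both directories (e.g. wetscav_mod.f)
# resolves to the GeosTomas/ copy, while a file present only in GeosCore/
# (e.g. dao_mod.f) resolves there; so core fixes are picked up
# automatically by the TOMAS build.
%.o: %.f
	$(FC) $(FFLAGS) -c $< -o $@
```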
<br />
Several new targets were added to the <tt>Makefile</tt> in the top-level <tt>Code/</tt> directory:<br />
<br />
#=============================================================================<br />
# Targets for TOMAS aerosol microphysics code (win, bmy, 1/25/10)<br />
#=============================================================================<br />
<br />
.PHONY: tomas libtomas exetomas cleantomas<br />
<br />
tomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes all<br />
<br />
libtomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes lib<br />
<br />
exetomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes exe<br />
<br />
cleantomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes clean<br />
<br />
You can build GEOS-Chem with the TOMAS option by typing:<br />
<br />
make tomas ...<br />
<br />
This will automatically do the proper things to build the TOMAS code into GEOS-Chem, such as:<br />
<br />
* Adding a <tt>-DTOMAS</tt> C-preprocessor switch to the <tt>FFLAGS</tt> compiler flag settings in <tt>Makefile_header.mk</tt>. This will cause TOMAS-specific areas of code to be turned on.<br />
* Turning off OpenMP parallelization. For now the GEOS-Chem + TOMAS code needs to be run on a single processor. We continue to work on parallelizing the code.<br />
* Calling the Makefile in the <tt>GeosTomas/</tt> subdirectory to build the executable. The executable file is now named <tt>geostomas</tt> in order to denote that the TOMAS code is built in.<br />
<br />
The GEOS-Chem + TOMAS code has been built with the following compilers:<br />
<br />
* Intel Fortran compiler v10<br />
* Intel Fortran compiler v11.1 (20101201)<br />
* SunStudio 12<br />
<br />
--[[User:Bmy|Bob Y.]] 10:36, 27 January 2010 (EST)<br />
<br />
=== Compile from GeosTomas directory ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:I think there is something going wrong in my compilation, although errors have come up at both compile time and run time. The worst of the problems is this: I'll make a change to any fortran file in the code (even something meaningless like print*, 'foo') and hundreds of compile errors come out with fishy error messages such as (from ifort v10.1):<br />
<br />
***fortcom: Error: chemistry_mod.f, line 478: A kind type parameter must be a compile-time constant. [DP]<br />
REAL(kind=dp) :: RCNTRL(20)<br />
<br />
:Any advice? The errors I'm having are not unique to any version of GC, any type of met fields, any compiler, etc.<br />
<br />
'''''[mailto:yantosca@seas.harvard.edu Bob Yantosca] wrote:'''''<br />
<br />
:Make sure you are always in the GeosTomas subdirectory when you build the code. Sometimes there is a problem if you build the code from a higher level directory. This may have to do with the VPATH in the makefile.<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:Thanks, that seems to do the trick.<br />
<br />
--[[User:Bmy|Bob Y.]] 14:37, 14 April 2010 (EDT)</div>
Salvatore Farina
https://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_aerosol_microphysics&diff=16039
TOMAS aerosol microphysics
2014-03-03T20:51:20Z<p>Salvatore Farina: /* Computational Information */</p>
<hr />
<div>This page describes the TOMAS aerosol microphysics option in GEOS-Chem. TOMAS is one of two aerosol microphysics packages being incorporated into GEOS-Chem, the other being [[APM aerosol microphysics|APM]].<br />
<br />
== Overview ==<br />
<br />
The TwO-Moment Aerosol Sectional (TOMAS) microphysics package was developed for implementation into GEOS-Chem at Carnegie-Mellon University. Using a moving sectional and moment-based approach, TOMAS tracks two independent moments (number and mass) of the aerosol size distribution for a number of discrete size bins. It also contains code to simulate nucleation, condensation, and coagulation processes. The aerosol species that are considered with high size resolution are sulfate, sea-salt, OC, EC, and dust. An advantage of TOMAS is the full size resolution for all chemical species and the conservation of aerosol number, the latter of which allows one to construct aerosol and CCN number budgets that will balance.<br />
<br />
=== Authors and collaborators ===<br />
* [mailto:petera@andrew.cmu.edu Peter Adams] ''(Carnegie-Mellon U.)'' -- Principal Investigator<br />
* [mailto:wtrivita@staffmail.ed.ac.uk Win Trivitayanurak] ''(Department of Highways, Thailand)''<br />
* [mailto:dwesterv@andrew.cmu.edu Dan Westervelt] ''(Carnegie-Mellon U.)''<br />
* [mailto:jeffrey.pierce@dal.ca Jeffrey Pierce] ''(Dalhousie U.)''<br />
* [mailto:sal.farina@gmail.com Salvatore Farina] ''(Colorado State U.)''<br />
<br />
Questions regarding TOMAS can be directed at Dan (e-mail linked above).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:53, 27 January 2010 (EST)<br />
<br />
=== TOMAS User Groups ===<br />
<br />
{| border=1 cellspacing=0 cellpadding=5<br />
|- bgcolor="#cccccc"<br />
!User Group<br />
!Personnel<br />
!Projects<br />
|-valign="top"<br />
|[http://www.ce.cmu.edu/%7Eadams/ Carnegie-Mellon University]<br />
|[http://www.ce.cmu.edu/%7Eadams/people.htm#peter Peter Adams]<br>[http://www.ce.cmu.edu/~dwesterv/Site/Home.html Dan Westervelt]<br />
| [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html New particle formation evaluation in GC-TOMAS] <br> Sensitivity of CCN to nucleation rates <br> Development of number tagging and source apportionment model for GC-TOMAS<br />
|-valign="top"<br />
|[http://fizz.phys.dal.ca/%7Epierce/ Dalhousie University] <br> [http://www.atmos.colostate.edu/faculty/pierce.php Colorado State]<br />
|[http://atm.dal.ca/Faculty/Jeffrey_Pierce.php Jeffrey Pierce]<br>Sal Farina<br>Stephen D'Andrea<br />
|Sensitivity of CCN to condensational growth rates <br> TOMAS parallelization <br> Others...<br />
|-valign="top"<br />
|Add yours here<br />
|<br />
|<br />
|}<br />
<br />
== TOMAS-specific setup ==<br />
TOMAS has its own run directories (run.Tomas) that can be downloaded from the Harvard FTP. The <tt>input.geos</tt> file looks slightly different from that of standard GEOS-Chem, and also differs between versions.<br />
<br />
Pre- v9.02:<br />
To turn on TOMAS, see the "Microphysics menu" in <tt>input.geos</tt> and make sure TOMAS is set to '''T'''. <br />
<br />
v9.02 and later:<br />
TOMAS is enabled or disabled at compile time; the TOMAS flag in <tt>input.geos</tt> has been removed.<br />
<br />
<br />
TOMAS is simulation type 3 and utilizes 171-423 tracers. Each aerosol species requires 30 tracers for the 30-bin size resolution, 12 for the 12-bin, etc. Here is the (abbreviated) default setup in <tt>input.geos</tt> for TOMAS-30 in v9.02 and later (see the run.Tomas directory):<br />
<br />
Tracer # Description <br />
1- 62 Std Geos Chem <br />
63 H2SO4 <br />
64- 93 Number <br />
94-123 Sulfate <br />
124-153 Sea-salt <br />
154-183 Hydrophilic EC <br />
184-213 Hydrophobic EC <br />
214-243 Hydrophilic OC <br />
244-273 Hydrophobic OC <br />
274-303 Mineral dust <br />
304-333 Aerosol water<br />
<br />
TOMAS-40 requires 423 tracers (~360 size-resolved TOMAS tracers, i.e. 40 bins for each of the nine size-resolved quantities listed above, plus ~62 standard GEOS-Chem tracers) <br />
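<br />
Given the layout above, the tracer index of a given species and size bin is just the species' first index plus a bin offset. A short C sketch follows; the helper and constant names are hypothetical (not GEOS-Chem routines), and the offsets come from the TOMAS-30 table above.<br />

```c
#include <assert.h>

/* First tracer index of three of the TOMAS-30 size-resolved quantities,
 * taken from the default input.geos layout above (64-93 Number,
 * 94-123 Sulfate, 124-153 Sea-salt). */
enum { TOMAS_NUMBER = 64, TOMAS_SULFATE = 94, TOMAS_SEASALT = 124 };

/* Hypothetical helper: tracer index for 1-based bin 1..30. */
int tomas30_tracer(int first_tracer, int bin)
{
    return first_tracer + (bin - 1);
}
```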
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 18:48, 8 July 2013 (EDT)<br />
<br />
== Implementation notes ==<br />
<br />
TOMAS validation in [[GEOS-Chem v8-03-01]] was completed on 24 Feb 2010.<br />
<br />
=== Update April 2013 ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02k|v9-02k]] and approved on 07 Jun 2013.'''''<br />
<br />
Sal Farina has been working with the GEOS-Chem Support Team to inline the TOMAS aerosol microphysics code into the <tt>GeosCore</tt> directory. All TOMAS-specific sections of code are now segregated from the rest of GEOS-Chem with C-preprocessor statements such as:<br />
<br />
#if defined( TOMAS )<br />
<br />
# if defined( TOMAS40 ) <br />
... Code for 40 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS12 )<br />
... Code for 12 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS15 )<br />
... Code for 15 bin TOMAS simulation (optional) goes here ...<br />
# else<br />
... Code for 30 bin TOMAS simulation (default) goes here ...<br />
# endif<br />
<br />
#endif <br />
<br />
TOMAS is now invoked by compiling GEOS-Chem with one of the following options:<br />
<br />
make -j4 TOMAS=yes ... # Compiles GEOS-Chem for the 30 bin (default) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
or<br />
<br />
make -j4 TOMAS40=yes ... # Compiles GEOS-Chem for the 40 bin (optional) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
All files in the old <tt>GeosTomas/</tt> directory have now been deleted, as these have been rendered obsolete.<br />
<br />
These updates are included in [[GEOS-Chem v9-02]]. These modifications will not affect the existing GEOS-Chem simulations, as all TOMAS code is not compiled into the executable unless you specify either <tt>TOMAS=yes</tt> or <tt>TOMAS40=yes</tt> at compile time.<br />
<br />
We are in the process of updating the wiki to reflect these changes as they are implemented. <br />
<br />
--[[User:Bmy|Bob Y.]] 13:59, 23 April 2013 (EDT)<br><br />
--[[User:Salvatore Farina|Salvatore Farina]] 13:49, 4 June 2013 (EDT)<br />
<br />
== Computational Information ==<br />
<br />
GC-TOMAS v9-02 (30 sections) on 8 processors: <br />
One year simulation = 7-8 days wall clock time<br />
<br />
More speedups are available using lower aerosol size resolution<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:00, 07 May 2013 (EST)<br />
<br />
GC-TOMAS v9-03 on 16 processors (glooscap)<br />
12 bin: 2.8 days wall time per sim year<br />
15 bin: 3.3 days wall time per sim year<br />
30 bin: 6.1 days wall time per sim year<br />
40 bin: 7.8 days wall time per sim year<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 15:51, 3 March 2014 (EST)<br />
<br />
== Microphysics Code==<br />
The aerosol microphysics code is largely contained within the file <tt>tomas_mod.f</tt>. <tt>tomas_mod</tt> and its subroutines are modular: they use their own internal variables. For details, see <tt>tomas_mod.f</tt> and its comments. <br />
<br />
=== Nucleation ===<br />
The choice of nucleation theory is selected in the header section of <tt>tomas_mod.f</tt>. The options currently include binary homogeneous nucleation (Vehkamaki et al., 2002), ternary homogeneous nucleation (Napari et al., 2002), ion-mediated nucleation (Yu, 2008), and activation nucleation (Kulmala, 2006). The ternary nucleation rate is typically scaled by a globally uniform tuning factor of 10^-4 or 10^-5.<br />
<br />
In TOMAS-12 and TOMAS-30, nucleated particles follow the Kerminen approximation to grow to the smallest size bin. This has a tendency to overpredict the number of particles in the smallest bins of those models. See Y. H. Lee, J. R. Pierce, and P. J. Adams 2013 [http://www.geosci-model-dev-discuss.net/6/893/2013/gmdd-6-893-2013.html here] for more details on the consequences of this.<br />
<br />
=== Condensation ===<br />
<br />
=== Coagulation ===<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 14:08, 9 May 2011 (EST)<br />
<br />
== Validation ==<br />
<br />
GC-TOMAS [[GEOS-Chem v8-03-01|v8-03-01]] generally compares very well with observations and other models. Please see our [http://acmg.seas.harvard.edu/geos/wiki_docs/TOMAS/TOMAS_benchmark_ForHarvard.pdf GC-TOMAS v8-02-05 validation document] for more information and figures. <br />
<br />
Below are some results of benchmarking GC-TOMAS with earlier versions of the model as well as observations:<br />
<br />
[[Image:CN10_smaller.jpg]]<br />
<br />
'''Figure 1: CN10 concentrations predicted by GC-TOMAS v8-02-05 against observations. Descriptions of observational data can be found on p 5454 of Pierce et al, Atmos. Chem. Phys., 7, 2007.'''<br />
<br />
----<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:13, 10 February 2010 (EST)<br />
<br />
== Other features of TOMAS ==<br />
Other varieties of TOMAS are suited for specific science questions, for example nucleation studies in which explicit aerosol dynamics are needed for nanometer-sized particles. <br />
<br />
=== Set-up Guide ===<br />
<br />
This [[TOMAS setup guide]] was written for users on ACE-NET's Glooscap cluster, but may be more generally applicable.<br />
Please contact [mailto:sal.farina@gmail.com Salvatore Farina] for help in obtaining the latest development version of GEOS-Chem with TOMAS.<br />
This will allow you to take advantage of parallel computation in TOMAS.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 11:55, 26 July 2013 (EDT)<br />
<br />
=== Size Resolution ===<br />
<br />
;TOMAS-30: All 7 chemical species have size resolution ranging from 10 nm to 10 µm, spanned by 30 logarithmically spaced (mass doubling) bins.<br />
;TOMAS-40: Same as TOMAS-30 with 10 additional (mass doubling) sub-10nm bins with a lower limit ~1nm<br />
;TOMAS-12: All 7 chemical species have size resolution ranging from 10 nm to 1 µm, spanned by 10 logarithmically spaced (mass quadrupling) bins plus two supermicron bins. The coarser resolution than TOMAS-30 improves computation time. <br />
;TOMAS-15: Same as TOMAS-12 with 3 additional (mass quadrupling) sub-10nm bins with a lower limit of ~2nm. Analogous to TOMAS-40 with improved computation time.<br />
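<br />
As a consistency check on the ranges above: mass doubling per bin implies a diameter factor of 2^(1/3) per bin, so 30 mass-doubling bins span a factor of 2^30 in mass and 2^10 = 1024 ≈ 10^3 in diameter, which matches the 10 nm to 10 µm range of TOMAS-30. A short illustrative C sketch (function name hypothetical):<br />

```c
#include <assert.h>

/* Mass of bin edge i (i = 0..nbins) for mass-doubling bins whose
 * smallest edge has mass m0.  Since particle mass scales with diameter
 * cubed, the factor of 2^30 in mass over 30 bins corresponds to a
 * factor of 2^10 = 1024 in diameter, i.e. 10 nm -> ~10 um. */
double bin_edge_mass(double m0, int i)
{
    double m = m0;
    for (int k = 0; k < i; k++)
        m *= 2.0;   /* mass doubles each bin */
    return m;
}
```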
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 12:51, 4 June 2013 (EDT)<br />
<br />
=== Nesting and grid size ===<br />
TOMAS is implemented on a 2x2.5 North American domain. Developed by Jeffrey Pierce (jeffrey.pierce@dal.ca)<br />
<br />
=== AOD, CCN post-processing code ===<br />
Code is available for calculating aerosol optical depth from TOMAS-predicted aerosol composition and size using Mie theory, and for calculating CCN concentrations from TOMAS size-resolved composition using Köhler theory. Developed by Yunha Lee and Jeffrey Pierce; adapted for GEOS-Chem output by Jeffrey Pierce.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 2:00, 9 May 2011 (EST)<br />
<br />
== References ==<br />
<br />
In this section we provide references relevant to TOMAS aerosol microphysics simulations.<br />
<br />
=== Studies using TOMAS simulations ===<br />
#'''Nucleation in GEOS-Chem:''' Westervelt, D. M., Pierce, J. R., Riipinen, I., Trivitayanurak, W., Hamed, A., Kulmala, M., Laaksonen, A., Decesari, S., and Adams, P. J.: ''Formation and growth of nucleated particles into cloud condensation nuclei: model-measurement comparison'', <u>Atmos. Chem. Phys. Discuss.</u>, '''13''', 8333-8386, doi:10.5194/acpd-13-8333-2013, 2013. [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html LINK]<br />
#'''TOMAS implementation in GEOS-Chem:''' Trivitayanurak, W., Adams, P. J., Spracklen, D. V. and Carslaw, K. S.: ''Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison'', <u>Atmos. Chem. Phys.</u>, '''8'''(12), 3149-3168, 2008.<br />
#'''TOMAS initial paper, sulfate only:''' Adams, P. J. and Seinfeld, J. H.: ''Predicting global aerosol size distributions in general circulation models'', <u>J. Geophys. Res.-Atmos.</u>, '''107'''(D19), 4370, doi:10.1029/2001JD001010, 2002.<br />
#'''TOMAS with sea-salt:''' Pierce, J.R., and Adams P.J., ''Global evaluation of CCN formation by direct emission of sea salt and growth of ultrafine sea salt'', <u>J. Geophys. Res.-Atmos.</u>, '''111''' (D6), doi:10.1029/2005JD006186, 2006.<br />
#'''TOMAS with carbonaceous aerosol:''' Pierce, J. R., Chen, K. and Adams, P. J.: ''Contribution of primary carbonaceous aerosol to cloud condensation nuclei: processes and uncertainties evaluated with a global aerosol microphysics model'', <u>Atmos. Chem. Phys.</u>, '''7'''(20), 5447-5466, doi:10.5194/acp-7-5447-2007, 2007.<br />
#'''TOMAS with dust:''' Lee, Y.H., K. Chen, and P.J. Adams, 2009: ''Development of a global model of mineral dust aerosol microphysics''. <u>Atmos. Chem. Phys.</u>, '''9''', 2441-2458, doi:10.5194/acp-9-2441-2009.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:04, 24 February 2014 (EST)<br />
<br />
=== Input data used by TOMAS ===<br />
#Usoskin, I. G. and Kovaltsov, G. A., ''Cosmic ray induced ionization in the atmosphere: Full modeling and practical applications'', <u>J. Geophys. Res.</u>, '''111''', doi:10.1029/2006JD007150, 2006.<br />
#Yu, Fangqun, et al, ''Ion-mediated nucleation in the atmosphere: Key controlling parameters, implications, and look-up table'', <u>J. Geophys. Res.</u>, '''115''', D03206, doi:10.1029/2009JD012630, 2010.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:03, 24 February 2014 (EST)<br />
<br />
== Previous issues now resolved ==<br />
<br />
=== Minor bug in TOMAS sulfate emissions ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02o|v9-02o]] and approved on 03 Sep 2013.'''''<br />
<br />
'''''[mailto:sal.farina@gmail.com Sal Farina] wrote:'''''<br />
:Calling mnfix before and after emission ensures that the size distribution is well behaved, and eliminates "Negative SF emis" warnings. An edit to mnfix was also introduced: previously, adding a "tiny" mass to boxes with zero mass but "epsilon" number yielded very high mass-per-particle values, which necessitated excessive error detection, correction, and verbosity.<br />
<br />
--[[User:Melissa Payer|Melissa Sulprizio]] 15:08, 7 August 2013 (EDT)<br />
<br />
=== Segmentation Fault ===<br />
You may get an early segfault if your stacksize is not set to either unlimited or a very large number. To avoid this, you must either set an environment variable (the <tt>setenv</tt> command in <tt>.cshrc</tt>) or use the <tt>ulimit</tt> command. See [http://wiki.seas.harvard.edu/geos-chem/index.php/Machine_issues_%26_portability#Resetting_stacksize_for_Linux this page] for details.<br />
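<br />
For example, under bash you might run the following before starting GEOS-Chem (the csh equivalent is <tt>limit stacksize unlimited</tt> in <tt>.cshrc</tt>); the exact syntax is shell-dependent, so consult your shell's documentation:<br />
<br />
```shell
# Raise the soft stack limit before running GEOS-Chem.  Fall back to a
# large finite value (in kB) if "unlimited" is not permitted, and do not
# abort the shell either way.
ulimit -s unlimited 2>/dev/null || ulimit -s 500000 2>/dev/null || true
ulimit -s    # print the resulting limit so you can verify it
```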
<br />
--[[User:Dan Westervelt|Dan W.]] 20:20, 10 February 2010 (EST)<br />
<br />
=== Updates for GEOS-Chem v9-02 public release ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: As described below, there appears to be [[#Potential parallelization problems|a potential parallelization problem with the TOMAS ND60 diagnostic]]. We are currently looking into this. This issue, however, does not affect the tracer concentrations computed by TOMAS, but only the output of the ND60 diagnostic itself. For this reason we are moving ahead with the TOMAS benchmarks for v9-02. (Bob Yantosca, 21 Feb 2014)</p>'''</div><br />
<br />
We have found and fixed several minor numerical and coding issues prior to the public release of [[GEOS-Chem v9-02]] (01 Mar 2014). The TOMAS40 simulation has been validated with the [[GEOS-Chem Unit Tester]]. Below is the [[GEOS-Chem Unit Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|output of a unit test]] that was submitted on 2014/02/21 at 12:47:26 PM:<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (680420788 and 680420788)<br />
### Checksums : IDENTICAL (179613338 and 179613338)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the subsections below, we describe in more detail the fixes that we made for [[GEOS-Chem v9-02]]:<br />
<br />
==== Fixes for minor coding errors ====<br />
<br />
#<p>In <tt>GeosCore/main.F</tt>, we replaced <tt>CALL FLUSH()</tt> with <tt>CALL FLUSH(6)</tt>. The <tt>FLUSH</tt> routine needs to take an argument; unit #6 corresponds to stdout (i.e. the screen and/or log file).</p><br />
#<p>In routine <tt>CHEM_SO2</tt> (in module <tt>GeosCore/sulfate_mod.F</tt>), we now avoid referencing the dust tracers DST1, DST2, DST3, and DST4 for TOMAS simulations. TOMAS uses size-resolved dust tracers, and therefore does not carry DST1-4. This error seems to have been introduced along with the fix for cloud pH in Sep 2013.</p><br />
#<p>In routine <tt>COND_NUC</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we added error traps to avoid division-by-zero errors that occurred when the variable <tt>CSCH</tt> is zero. When <tt>CSCH</tt> is zero, we now set variable <tt>ADDT</tt> to zero. When <tt>ADDT</tt> is zero, it will get reassigned to a minimum time step, so this fix should work OK.</p><br />
#<p>In <tt>GeosCore/gamap_mod.F</tt>, we have restored several entries to <tt>tracerinfo.dat</tt> for the ND44 diagnostic that were not being properly printed out when the TOMAS simulation was used.</p><br />
#<p>In module <tt>GeosCore/drydep_mod.F</tt>, we now set <tt>MAXDEP=105</tt> for all simulations, including TOMAS. Formerly, TOMAS had <tt>MAXDEP=100</tt>; the larger value accommodates all simulation types.</p><br />
#<p>In module <tt>GeosCore/diag3.F</tt>, we now avoid an out-of-bounds error in <tt>DEPNAME(N)</tt> during TOMAS simulations. We save the drydep species name from <tt>DEPNAME(N)</tt> into a new variable <tt>DRYDEP_NAME</tt> for <tt>N = 1..NUMDEP</tt>. We then set <tt>DRYDEP_NAME = ''</tt> for <tt>N > NUMDEP</tt>. This error occurs because we extend the # of drydep tracers during TOMAS simulations to account for the size bins.</p><br />
#<p>We have fixed a couple of logical errors that prevented dust emissions from happening. Minor modifications were made to IF statements in <tt>GeosCore/chemistry_mod.F</tt>, <tt>GeosCore/dust_mod.F</tt>, and <tt>GeosCore/input_mod.F</tt>.</p><br />
#<p>In file <tt>GeosCore/Makefile</tt>, make sure to add <tt>tomas_mod.o</tt> to the list of modules used by <tt>wetscav_mod.F</tt> (aka the "dependency listing"). The corrected code should look like this:</p><br />
<br />
wetscav_mod.o : wetscav_mod.F \<br />
dao_mod.o diag_mod.o \<br />
depo_mercury_mod.o get_ndep_mod.o \<br />
get_popsinfo_mod.o tracerid_mod.o \<br />
tracer_mod.o tomas_mod.o<br />
<br />
--[[User:Bmy|Bob Y.]] 10:20, 19 February 2014 (EST)<br />
<br />
==== Fixes for parallelization errors ====<br />
<br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we need to add the following variables to the <tt>!$OMP+PRIVATE</tt> statement: <tt>TRACNUM</tt>, <tt>NH3_TO_NH4</tt>, and <tt>SURF_AREA</tt>. Adding these now causes TOMAS to have identical sp vs. mp results when chemistry and microphysics are turned on.</p><br />
#<p>In routine <tt>DEPVEL</tt> (in <tt>GeosCore/drydep_mod.F</tt>): Instead of holding <tt>A_RADI</tt> and <tt>A_DEN</tt> as <tt>!$OMP+PRIVATE</tt> in TOMAS simulations (in the main DO loop in <tt>DEPVEL</tt>), we now save the particle size and density values to private variables <tt>DIAM</tt> and <tt>DEN</tt>. We then pass those as arguments to function <tt>DUST_SFCRSII</tt>.</p> <br />
#<p>We have corrected an issue in routine <tt>NFCLDMX</tt> (in module <tt>GeosCore/convection_mod.F</tt>) that potentially impacts the TOMAS wet scavenging, as described below:</p><br />
#*<p>We think there are different results for parallel and serial because of an assumption that's true for normal simulations but fails on TOMAS. The assumption is "tracers are independent through wet scavenging." Since TOMAS scavenging is size dependent, removing material from the distribution before calculating the soluble fraction of another component is "wrong." We now compute the fractions explicitly before the removal step. To do this, we now call routine <tt>COMPUTE_F</tt> in its own parallel DO loop located immediately before the main parallel do loop in <tt>NFCLDMX</tt>.</p><br />
#*<p>This modification also required the ND37 diagnostic IF block to be put into the same loop as <tt>COMPUTE_F</tt>. Furthermore, because <tt>COMPUTE_F</tt> returns the value of diagnostic index <tt>ISOL</tt>, and because <tt>ISOL</tt> is also used for the ND38 diagnostic in the main parallel loop below, we must also save the values of <tt>ISOL</tt> in a 1-D vector. This will allow the values of ISOL to be passed from the first parallel loop to the second. This ensures that the ND37 and ND38 diagnostics will be computed properly for all GEOS-5 simulations that have soluble tracers.</p><br />
#*<p>This modification has been tested in the [[GEOS-Chem Unit Tester]] by Bob Yantosca (04 Feb 2014) and it has yielded identical results for <tt>geos5_4x5_fullchem</tt>, <tt>geos5_4x5_Hg</tt>, <tt>geos5_4x5_RnPbBe</tt>, <tt>geos5_4x5_soa</tt> and <tt>geos5_4x5_soa_svpoa</tt> simulations.</p><br />
#<p>We have made some fixes in <tt>GeosCore/wetscav_mod.F</tt> that caused single-processor TOMAS runs to have different output than multi-processor runs. A few instances of code were computing quantities sequentially and then storing them for later use. These were technically thread-safe, but were susceptible to error because the order of computation would be different when running with parallelization turned on. These sections of code have now been rewritten accordingly.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 14:09, 21 February 2014 (EST)<br />
<br />
==== Removed inefficient subroutine calls ====<br />
<br />
#<p>In <tt>GeosCore/diag3.F</tt>, we now use a 2-D array <tt>(J-L)</tt> for archiving into the ND60 TOMAS diagnostic. This eliminates an array temporary in the call to routine BPCH2.</p><br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we now use an array <tt>ERR_IND</tt> to pass the I,J,L,N indices to error checking routine <tt>CHECK_VALUE</tt>. We previously used an array descriptor <tt>(/I,J,L,0/)</tt> which caused an array temporary to be created.</p><br />
#<p>In routine <tt>EMISSCARBON</tt> (in module <tt>GeosCore/carbon_mod.F</tt>), we removed array temporaries from the calls to subroutine <tt>EMITSGC</tt>. We now sum two arrays into a temporary array, and then pass that to <tt>EMITSGC</tt>.</p><br />
#<p>We rewrote the subroutine calls to NH4BULKTOBIN to avoid the creation of array temporaries. In most cases this was done by replacing <tt>MK(1:IBINS,SRTSO4)</tt> with <tt>MK(:,SRTSO4)</tt>, etc. By explicitly stating the sub-slice <tt>MK(1:IBINS,SRTSO4)</tt>, this causes the compiler to create an array temporary. Using <tt>MK(:,SRTSO4)</tt> instead allows for a more efficient pointer slice to be passed.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 14:47, 31 January 2014 (EST)<br />
<br />
==== Fixes for convenience ====<br />
<br />
#<p>We now read many of the TOMAS data files from the directory <tt>TRIM( DATA_DIR_1x1 ) // 'TOMAS_201402/'</tt>. This saves us from having to keep these big files (some of which approach 100 MB in size) in individual users' run directories.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 16:20, 31 January 2014 (EST)<br />
<br />
== Outstanding issues ==<br />
<br />
=== Potential parallelization problems ===<br />
<br />
We have noticed that there may be a parallelization error in the TOMAS [http://acmg.seas.harvard.edu/geos/doc/man/appendix_5.html ND60 diagnostic]. This may be caused by a coding error; in particular, one or more variables may have been omitted from an <tt>!$OMP+PRIVATE</tt> declaration.<br />
<br />
This is illustrated by the following [[GEOS-Chem_Unit_Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|unit test simulation]] of the [[GEOS-Chem v9-01-02]] provisional release code (submitted at 2:11 PM on 21 Feb 2014):<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (707260156 and 707260156)<br />
### Checksums : DIFFERENT (895530022 and 2949483685)<br />
### Diffs : DIFFERENT<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the above test, all TOMAS diagnostics (ND59, ND60, and ND61) were turned on. The restart files (here named <tt>trac_rst.*</tt>) from the single-processor and multi-processor stages of the unit test are identical, but the <tt>ctm.bpch</tt> files (here named <tt>trac_avg.*</tt>) are different. Identical restart files mean that the single-processor and multi-processor stages produced identical tracer concentrations (and soil NOx quantities). <br />
<br />
The only differences in the <tt>trac_avg.*</tt> files between the single-processor and multi-processor stages of the unit test were in TOMAS diagnostic quantities. The affected categories appear to be <tt>TMS-COND</tt>, <tt>TMS-COAG</tt>, <tt>TMS-NUCL</tt>, and <tt>AERO-FIX</tt>, all of which point to the ND60 diagnostic.<br />
<br />
In order to confirm that the ND60 diagnostic exhibits the problem, we ran an additional unit test with ND59 and ND61 turned on, but ND60 turned off. This unit test, which was submitted at 3:33PM on 21 Feb 2014, yielded identical results.<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (690218236 and 690218236)<br />
### Checksums : IDENTICAL (4196844107 and 4196844107)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
We are still looking into this issue. Because this issue only affects the ND60 diagnostic output, but not tracer concentrations, we are moving ahead with the TOMAS benchmarks for [[GEOS-Chem v9-02]] (as of 21 Feb 2014). <br />
<br />
--[[User:Bmy|Bob Y.]] 16:17, 21 February 2014 (EST)<br />
<br />
=== Offline Dust ===<br />
Currently, GEOS-Chem with TOMAS uses prescribed offline dust aerosol data in radiative transfer / photolysis calculations. Due to complications, this is turned off entirely at 2x2.5 resolution.<br />
<br />
=== Vertical Grids ===<br />
Currently, GC-TOMAS is only compatible with the reduced vertical grids:<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.3.1 GEOS3_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.4.1 GEOS4_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.5.1 GEOS5_47L]<br />
<br />
Development for the full vertical grids is ongoing.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:43, 10 February 2010 (EST)<br />
<br />
== Obsolete versions of TOMAS ==<br />
<br />
In this section we preserve information that pertained to older versions of TOMAS (before the [[GEOS-Chem v9-02]] release).<br />
<br />
=== Code structure ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
The main-level <tt>Code</tt> directory has now been divided into several subdirectories:<br />
<br />
GeosCore/ GEOS-Chem "core" routines<br />
GeosTomas/ Parallel copies of GEOS-Chem routines that reference TOMAS<br />
 GeosUtil/ "Utility" modules (e.g. error_mod.f, file_mod.f, time_mod.f, etc.)<br />
Headers/ Header files (define.h, CMN_SIZE, CMN_DIAG, etc.)<br />
KPP/ KPP solver directory structure<br />
bin/ Directory where executables are placed<br />
doc/ Directory where documentation is created<br />
help/ Directory for GEOS-Chem Help Screen<br />
lib/ Directory where library files are placed<br />
mod/ Directory where module files are placed<br />
obsolete/ Directory where obsolete versions of code are archived<br />
<br />
Because there were a lot of TOMAS-related modifications in several GEOS-Chem "core" routines, the routines that need to "talk" to TOMAS were placed into a separate subdirectory named <tt>GeosTomas/</tt>. The files in <tt>GeosTomas</tt> are:<br />
<br />
Files:<br />
------<br />
Makefile -- GEOS-Chem routines that have been<br />
aero_drydep.f modified to reference the TOMAS aerosol<br />
carbon_mod.f microphysics package. These are kept<br />
chemdr.f in a separate GeosTomas directory so that<br />
chemistry_mod.f they do not interfere with the routines<br />
cleanup.f in the GeosCore directory.<br />
diag3.f<br />
diag_mod.f The GeosTomas directory only needs to<br />
diag_pl_mod.f contain the files that have been modified<br />
drydep_mod.f for TOMAS. The Makefile will look for<br />
dust_mod.f all other files from the GeosCore directory<br />
emissions_mod.f using the VPATH option in GNU Make.<br />
gamap_mod.f<br />
initialize.f NOTE to GEOS-Chem developers: When you<br />
input_mod.f make changes to any of these routines<br />
isoropia_mod.f in the GeosCore directory, you must also<br />
logical_mod.f make the same modifications to the<br />
ndxx_setup.f corresponding routines in the GeosTomas<br />
planeflight_mod.f directory.<br />
seasalt_mod.f<br />
sulfate_mod.f Maybe in the near future we can work<br />
tomas_mod.f towards integrating TOMAS into the GeosCore<br />
tomas_tpcore_mod.f90 directory more cleanly. However, due to<br />
tpcore_mod.f the large number of modifications that were<br />
tpcore_window_mod.f necessary for TOMAS, it was quicker to<br />
tracerid_mod.f implement the TOMAS code in a separate<br />
wetscav_mod.f subdirectory. <br />
xtra_read_mod.f -- Bob Y. (1/25/10)<br />
<br />
Each of these files was merged with the corresponding file in the <tt>GeosCore</tt> subdirectory. Therefore, in addition to having the GEOS-Chem modifications from [[GEOS-Chem v8-02-05|v8-02-05]], these files also have the relevant TOMAS references.<br />
<br />
A few technical considerations dictated the placing of these files into a separate <tt>GeosTomas/</tt> directory:<br />
<br />
* The ND60 diagnostic in the standard GEOS-Chem code (in <tt>GeosCore/</tt>) is now used for the CH4 offline simulation, but in TOMAS it is used for microphysics rate diagnostics. <br />
* Some parameters needed to be declared differently for simulations with TOMAS. <br />
* Because not all GEOS-Chem users will choose to use TOMAS, we did not want to unnecessarily bog down the code in <tt>GeosCore/</tt> with references to TOMAS-specific routines. <br />
<br />
All of these concerns could be best solved by keeping parallel copies of the affected routines in the <tt>GeosTomas</tt> directory.<br />
<br />
--[[User:Bmy|Bob Y.]] 13:35, 25 February 2010 (EST)<br />
<br />
=== Building GEOS-Chem with TOMAS ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 25 Feb 2014)</p>'''</div><br />
<br />
The <tt>VPATH</tt> feature of [http://www.gnu.org/software/make/manual/make.html GNU Make] is used to simplify the compilation. When GEOS-Chem is compiled with the tomas target, the GNU Make utility will search for files in the <tt>GeosTomas/</tt> directory first. If it cannot find files there, it will then search the <tt>GeosCore/</tt> directory. Thus, if we make a change to a "core" GEOS-Chem routine in the <tt>GeosCore/</tt> subdirectory (say in <tt>dao_mod.f</tt> or <tt>diag49_mod.f</tt>), those changes will automatically be applied when you build GEOS-Chem with TOMAS. Therefore, <tt>GeosTomas/</tt> only needs to contain separate copies of those files that have to "talk" with TOMAS.<br />
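<br />
The mechanism can be sketched with a minimal Makefile (illustrative only; the real GEOS-Chem Makefiles are considerably more involved):<br />
<br />
```make
# Minimal sketch of the VPATH mechanism (not the actual GEOS-Chem Makefile).
# Directories are searched left to right, so a file present in GeosTomas/
# shadows the copy of the same file in GeosCore/.
VPATH = GeosTomas:GeosCore

%.o: %.f
	$(FC) $(FFLAGS) -c $< -o $@
```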
<br />
Several new targets were added to the <tt>Makefile</tt> in the top-level <tt>Code/</tt> directory:<br />
<br />
#=============================================================================<br />
# Targets for TOMAS aerosol microphysics code (win, bmy, 1/25/10)<br />
#=============================================================================<br />
<br />
.PHONY: tomas libtomas exetomas cleantomas<br />
<br />
tomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes all<br />
<br />
libtomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes lib<br />
<br />
exetomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes exe<br />
<br />
cleantomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes clean<br />
<br />
You can build GEOS-Chem with the TOMAS option by typing:<br />
<br />
make tomas ...<br />
<br />
This automatically performs the steps needed to build the TOMAS code into GEOS-Chem, such as:<br />
<br />
* Adding a <tt>-DTOMAS</tt> C-preprocessor switch to the <tt>FFLAGS</tt> compiler flag settings in <tt>Makefile_header.mk</tt>. This will cause TOMAS-specific areas of code to be turned on.<br />
* Turning off OpenMP parallelization. For now the GEOS-Chem + TOMAS code needs to be run on a single processor. We continue to work on parallelizing the code.<br />
* Calling the Makefile in the <tt>GeosTomas/</tt> subdirectory to build the executable. The executable file is now named <tt>geostomas</tt> in order to denote that the TOMAS code is built in.<br />
<br />
GEOS-Chem + TOMAS has been built with the following compilers:<br />
<br />
* Intel Fortran compiler v10<br />
* Intel Fortran compiler v11.1 (20101201)<br />
* SunStudio 12<br />
<br />
--[[User:Bmy|Bob Y.]] 10:36, 27 January 2010 (EST)<br />
<br />
=== Compile from GeosTomas directory ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:I think there is something going wrong in my compilation, although errors have come up at both compile time and run time. The worst of the problems is this: I'll make a change to any fortran file in the code (even something meaningless like print*, 'foo') and hundreds of compile errors come out with fishy error messages such as (from ifort v10.1):<br />
<br />
***fortcom: Error: chemistry_mod.f, line 478: A kind type parameter must be a compile-time constant. [DP]<br />
REAL(kind=dp) :: RCNTRL(20)<br />
<br />
:Any advice? The errors I'm having are not unique to any version of GC, any type of met fields, any compiler, etc.<br />
<br />
'''''[mailto:yantosca@seas.harvard.edu Bob Yantosca] wrote:'''''<br />
<br />
:Make sure you are always in the GeosTomas subdirectory when you build the code. Sometimes there is a problem if you build the code from a higher level directory. This may have to do with the VPATH in the makefile.<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:Thanks, that seems to do the trick.<br />
<br />
--[[User:Bmy|Bob Y.]] 14:37, 14 April 2010 (EDT)</div>Salvatore Farina
https://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_aerosol_microphysics&diff=16034
TOMAS aerosol microphysics
2014-03-03T20:36:53Z
<p>Salvatore Farina: /* TOMAS-specific setup */</p>
<hr />
<div>This page describes the TOMAS aerosol microphysics option in GEOS-Chem. TOMAS is one of two aerosol microphysics packages being incorporated into GEOS-Chem, the other being [[APM aerosol microphysics|APM]].<br />
<br />
== Overview ==<br />
<br />
The TwO-Moment Aerosol Sectional (TOMAS) microphysics package was developed for implementation into GEOS-Chem at Carnegie-Mellon University. Using a moving sectional and moment-based approach, TOMAS tracks two independent moments (number and mass) of the aerosol size distribution for a number of discrete size bins. It also contains code to simulate nucleation, condensation, and coagulation processes. The aerosol species that are considered with high size resolution are sulfate, sea-salt, OC, EC, and dust. An advantage of TOMAS is the full size resolution for all chemical species and the conservation of aerosol number, the latter of which allows one to construct aerosol and CCN number budgets that will balance.<br />
<br />
=== Authors and collaborators ===<br />
* [mailto:petera@andrew.cmu.edu Peter Adams] ''(Carnegie-Mellon U.)'' -- Principal Investigator<br />
* [mailto:wtrivita@staffmail.ed.ac.uk Win Trivitayanurak] ''(Department of Highways, Thailand)''<br />
* [mailto:dwesterv@andrew.cmu.edu Dan Westervelt] ''(Carnegie-Mellon U.)''<br />
* [mailto:jeffrey.pierce@dal.ca Jeffrey Pierce] ''(Dalhousie U.)''<br />
* [mailto:sal.farina@gmail.com Salvatore Farina] ''(Colorado State U.)''<br />
<br />
Questions regarding TOMAS can be directed to Dan (e-mail linked above).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:53, 27 January 2010 (EST)<br />
<br />
=== TOMAS User Groups ===<br />
<br />
{| border=1 cellspacing=0 cellpadding=5<br />
|- bgcolor="#cccccc"<br />
!User Group<br />
!Personnel<br />
!Projects<br />
|-valign="top"<br />
|[http://www.ce.cmu.edu/%7Eadams/ Carnegie-Mellon University]<br />
|[http://www.ce.cmu.edu/%7Eadams/people.htm#peter Peter Adams]<br>[http://www.ce.cmu.edu/~dwesterv/Site/Home.html Dan Westervelt]<br />
| [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html New particle formation evaluation in GC-TOMAS] <br> Sensitivity of CCN to nucleation rates <br> Development of number tagging and source apportionment model for GC-TOMAS<br />
|-valign="top"<br />
|[http://fizz.phys.dal.ca/%7Epierce/ Dalhousie University] <br> [http://www.atmos.colostate.edu/faculty/pierce.php Colorado State]<br />
|[http://atm.dal.ca/Faculty/Jeffrey_Pierce.php Jeffrey Pierce]<br>Sal Farina<br>Stephen D'Andrea<br />
|Sensitivity of CCN to condensational growth rates <br> TOMAS parallelization <br> Others...<br />
|-valign="top"<br />
|Add yours here<br />
|<br />
|<br />
|}<br />
<br />
== TOMAS-specific setup ==<br />
TOMAS has its own run directories (run.Tomas) that can be downloaded from the Harvard FTP. The <tt>input.geos</tt> file looks slightly different from that of standard GEOS-Chem, and also differs between versions.<br />
<br />
Pre-v9-02:<br />
To turn on TOMAS, see the "Microphysics menu" in <tt>input.geos</tt> and make sure TOMAS is set to '''T'''. <br />
<br />
v9-02 and later:<br />
TOMAS is enabled or disabled at compile time; the TOMAS flag in <tt>input.geos</tt> has been removed.<br />
<br />
<br />
TOMAS is simulation type 3 and uses 171-423 tracers. Each aerosol species requires 30 tracers at the 30-bin size resolution, 12 at the 12-bin resolution, etc. Here is the (abbreviated) default setup in <tt>input.geos</tt> for TOMAS-30 in v9-02 and later (see the run.Tomas directory):<br />
<br />
Tracer # Description <br />
1- 62 Std Geos Chem <br />
63 H2SO4 <br />
64- 93 Number <br />
94-123 Sulfate <br />
124-153 Sea-salt <br />
154-183 Hydrophilic EC <br />
184-213 Hydrophobic EC <br />
214-243 Hydrophilic OC <br />
244-273 Hydrophobic OC <br />
274-303 Mineral dust <br />
304-333 Aerosol water<br />
<br />
TOMAS-40 requires 423 tracers (~360 TOMAS tracers, i.e. 40 bins for each of the 9 size-resolved quantities, plus H2SO4 and the ~62 standard GEOS-Chem tracers). <br />
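<br />
The tracer totals can be sanity-checked from the layout above: each of the 9 size-resolved quantities (number, sulfate, sea salt, hydrophilic/hydrophobic EC and OC, dust, and aerosol water) carries one tracer per size bin, on top of bulk H2SO4 and the ~62 standard tracers. A sketch of the arithmetic (illustrative only):<br />
<br />
```python
# Illustrative tracer-count arithmetic for GC-TOMAS (v9-02-era layout):
# n_std standard GEOS-Chem tracers + 1 bulk H2SO4 tracer + 9 size-resolved
# quantities with one tracer per size bin.
def tomas_tracer_count(nbins, n_std=62):
    return n_std + 1 + 9 * nbins

print(tomas_tracer_count(12))   # lower end of the 171-423 range
print(tomas_tracer_count(30))   # matches the TOMAS-30 table above
print(tomas_tracer_count(40))   # as quoted for TOMAS-40
```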
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 18:48, 8 July 2013 (EDT)<br />
<br />
== Implementation notes ==<br />
<br />
TOMAS validation in [[GEOS-Chem v8-03-01]] was completed on 24 Feb 2010.<br />
<br />
=== Update April 2013 ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02k|v9-02k]] and approved on 07 Jun 2013.'''''<br />
<br />
Sal Farina has been working with the GEOS-Chem Support Team to inline the TOMAS aerosol microphysics code into the <tt>GeosCore</tt> directory. All TOMAS-specific sections of code are now segregated from the rest of GEOS-Chem with C-preprocessor statements such as:<br />
<br />
#if defined( TOMAS )<br />
<br />
# if defined( TOMAS40 ) <br />
... Code for 40 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS12 )<br />
... Code for 12 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS15 )<br />
... Code for 15 bin TOMAS simulation (optional) goes here ...<br />
# else<br />
... Code for 30 bin TOMAS simulation (default) goes here ...<br />
# endif<br />
<br />
#endif <br />
<br />
TOMAS is now invoked by compiling GEOS-Chem with one of the following options:<br />
<br />
make -j4 TOMAS=yes ... # Compiles GEOS-Chem for the 30 bin (default) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
or<br />
<br />
make -j4 TOMAS40=yes ... # Compiles GEOS-Chem for the 40 bin (optional) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
All files in the old <tt>GeosTomas/</tt> directory have now been deleted, as these have been rendered obsolete.<br />
<br />
These updates are included in [[GEOS-Chem v9-02]]. These modifications will not affect existing GEOS-Chem simulations, as no TOMAS code is compiled into the executable unless you specify either <tt>TOMAS=yes</tt> or <tt>TOMAS40=yes</tt> at compile time.<br />
<br />
We are in the process of updating the wiki to reflect these changes as they are implemented. <br />
<br />
--[[User:Bmy|Bob Y.]] 13:59, 23 April 2013 (EDT)<br><br />
--[[User:Salvatore Farina|Salvatore Farina]] 13:49, 4 June 2013 (EDT)<br />
<br />
== Computational Information ==<br />
<br />
GC-TOMAS v9-02 (30 sections) on 8 processors: <br />
One year simulation = 7-8 days wall clock time<br />
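For reference, the timing above implies the following rough throughput (simple arithmetic on the quoted numbers, not a measured benchmark):<br />

```python
# Throughput implied by "one year in 7-8 wall-clock days" on 8 processors.
sim_days_per_year = 365.0
wall_days = 7.5                              # midpoint of the quoted 7-8 days
throughput = sim_days_per_year / wall_days   # simulated days per wall-clock day
print(round(throughput))                     # roughly 49
```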
<br />
Further speedup is available by using a lower aerosol size resolution (TOMAS-12 or TOMAS-15).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:00, 07 May 2013 (EST)<br />
<br />
== Microphysics Code==<br />
The aerosol microphysics code is largely contained within the file <tt>tomas_mod.f</tt>. <tt>tomas_mod</tt> and its subroutines are modular; they use their own internal variables. For details, see <tt>tomas_mod.f</tt> and its comments. <br />
<br />
=== Nucleation ===<br />
The choice of nucleation theory is selected in the header section of <tt>tomas_mod.f</tt>. The available options are binary homogeneous nucleation (Vehkamaki et al., 2002), ternary homogeneous nucleation (Napari et al., 2002), ion-mediated nucleation (Yu, 2008), and activation nucleation (Kulmala, 2006). The ternary nucleation rate is typically scaled by a globally uniform tuning factor of 10^-4 or 10^-5.<br />
<br />
In TOMAS-12 and TOMAS-30, nucleated particles are grown to the smallest size bin using the Kerminen approximation, which tends to overpredict the number of particles in the smallest bins of those models. See Lee, Pierce, and Adams (2013) [http://www.geosci-model-dev-discuss.net/6/893/2013/gmdd-6-893-2013.html here] for more details on the consequences of this.<br />
<br />
=== Condensation ===<br />
<br />
=== Coagulation ===<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 14:08, 9 May 2011 (EST)<br />
<br />
== Validation ==<br />
<br />
GC-TOMAS [[GEOS-Chem v8-03-01|v8-03-01]] generally compares very well with observations and other models. Please see our [http://acmg.seas.harvard.edu/geos/wiki_docs/TOMAS/TOMAS_benchmark_ForHarvard.pdf GC-TOMAS v8-02-05 validation document] for more information and figures. <br />
<br />
Below are some results of benchmarking GC-TOMAS with earlier versions of the model as well as observations:<br />
<br />
[[Image:CN10_smaller.jpg]]<br />
<br />
'''Figure 1: CN10 concentrations predicted by GC-TOMAS v8-02-05 against observations. Descriptions of observational data can be found on p 5454 of Pierce et al, Atmos. Chem. Phys., 7, 2007.'''<br />
<br />
----<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:13, 10 February 2010 (EST)<br />
<br />
== Other features of TOMAS ==<br />
Other varieties of TOMAS are suited to specific science questions, for example nucleation studies, where explicit aerosol dynamics are needed for nanometer-sized particles. <br />
<br />
=== Set-up Guide ===<br />
<br />
This [[TOMAS setup guide]] was written for users on ACE-NET's Glooscap cluster, but may be more generally applicable.<br />
Please contact [mailto:sal.farina@gmail.com Salvatore Farina] for help in obtaining the latest development version of GEOS-Chem with TOMAS.<br />
This will allow you to take advantage of parallel computation in TOMAS.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 11:55, 26 July 2013 (EDT)<br />
<br />
=== Size Resolution ===<br />
<br />
;TOMAS-30: All 7 chemical species have size resolution ranging from 10 nm to 10 µm, spanned by 30 logarithmically spaced (mass-doubling) bins.<br />
;TOMAS-40: Same as TOMAS-30, with 10 additional (mass-doubling) sub-10 nm bins extending the lower limit to ~1 nm.<br />
;TOMAS-12: All 7 chemical species have size resolution ranging from 10 nm to 1 µm, spanned by 10 logarithmically spaced (mass-quadrupling) bins plus two supermicron bins. Coarser resolution than TOMAS-30, with improved computation time. <br />
;TOMAS-15: Same as TOMAS-12, with 3 additional (mass-quadrupling) sub-10 nm bins extending the lower limit to ~2 nm. Analogous to TOMAS-40, with improved computation time.<br />
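The bin spacing above follows from the mass-doubling (or mass-quadrupling) rule: each successive bin's dry diameter grows by the cube root of the per-bin mass ratio. A quick check of the stated size ranges (a sketch; the actual bin edges are defined in <tt>tomas_mod.f</tt>):<br />

```python
# Diameter growth per bin is the cube root of the per-bin mass ratio.
mass_doubling = 2.0 ** (1.0 / 3.0)      # TOMAS-30/40 bins
mass_quadrupling = 4.0 ** (1.0 / 3.0)   # TOMAS-12/15 bins

# TOMAS-30: 30 mass-doubling bins starting at 10 nm reach ~10 um.
print(10.0 * mass_doubling ** 30)       # 10 nm * 2**10 = 10240 nm
# TOMAS-12: 10 mass-quadrupling bins starting at 10 nm reach ~1 um.
print(10.0 * mass_quadrupling ** 10)    # ~1016 nm
```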
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 12:51, 4 June 2013 (EDT)<br />
<br />
=== Nesting and grid size ===<br />
A nested TOMAS simulation is implemented on a 2x2.5 North American domain, developed by Jeffrey Pierce (jeffrey.pierce@dal.ca).<br />
<br />
=== AOD, CCN post-processing code ===<br />
Code is available for calculating aerosol optical depth from TOMAS-predicted aerosol composition and size using Mie theory, and for calculating CCN concentrations from TOMAS size-resolved composition using Kohler theory. Developed by Yunha Lee and Jeffrey Pierce; adapted for GEOS-Chem output by Jeffrey Pierce.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 2:00, 9 May 2011 (EST)<br />
<br />
== References ==<br />
<br />
In this section we provide references relevant to TOMAS aerosol microphysics simulations.<br />
<br />
=== Studies using TOMAS simulations ===<br />
#'''Nucleation in GEOS-Chem:''' Westervelt, D. M., Pierce, J. R., Riipinen, I., Trivitayanurak, W., Hamed, A., Kulmala, M., Laaksonen, A., Decesari, S., and Adams, P. J.: ''Formation and growth of nucleated particles into cloud condensation nuclei: model-measurement comparison'', <u>Atmos. Chem. Phys. Discuss.</u>, '''13''', 8333-8386, doi:10.5194/acpd-13-8333-2013, 2013. [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html LINK]<br />
#'''TOMAS implementation in GEOS-Chem:''' Trivitayanurak, W., Adams, P. J., Spracklen, D. V. and Carslaw, K. S.: ''Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison'', <u>Atmos. Chem. Phys.</u>, '''8'''(12), 3149-3168, 2008.<br />
#'''TOMAS initial paper, sulfate only:''' Adams, P. J. and Seinfeld, J. H.: ''Predicting global aerosol size distributions in general circulation models'', <u>J. Geophys. Res.-Atmos.</u>, '''107'''(D19), 4370, doi:10.1029/2001JD001010, 2002.<br />
#'''TOMAS with sea-salt:''' Pierce, J.R., and Adams P.J., ''Global evaluation of CCN formation by direct emission of sea salt and growth of ultrafine sea salt'', <u>J. Geophys. Res.-Atmos.</u>, '''111''' (D6), doi:10.1029/2005JD006186, 2006.<br />
#'''TOMAS with carbonaceous aerosol:''' Pierce, J. R., Chen, K. and Adams, P. J.: ''Contribution of primary carbonaceous aerosol to cloud condensation nuclei: processes and uncertainties evaluated with a global aerosol microphysics model'', <u>Atmos. Chem. Phys.</u>, '''7'''(20), 5447-5466, doi:10.5194/acp-7-5447-2007, 2007.<br />
#'''TOMAS with dust:''' Lee, Y.H., K. Chen, and P.J. Adams, 2009: ''Development of a global model of mineral dust aerosol microphysics''. <u>Atmos. Chem. Phys.</u>, '''9''', 2441-2458, doi:10.5194/acp-9-2441-2009.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:04, 24 February 2014 (EST)<br />
<br />
=== Input data used by TOMAS ===<br />
#Usoskin, I. G. and Kovaltsov, G. A., ''Cosmic ray induced ionization in the atmosphere: Full modeling and practical applications'', <u>J. Geophys. Res.</u>, '''111''', doi:10.1029/2006JD007150, 2006.<br />
#Yu, Fangqun, et al, ''Ion-mediated nucleation in the atmosphere: Key controlling parameters, implications, and look-up table'', <u>J. Geophys. Res.</u>, '''115''', D03206, doi:10.1029/2009JD012630, 2010.<br />
<br />
--[[User:Bmy|Bob Y.]] 17:03, 24 February 2014 (EST)<br />
<br />
== Previous issues now resolved ==<br />
<br />
=== Minor bug in TOMAS sulfate emissions ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02o|v9-02o]] and approved on 03 Sep 2013.'''''<br />
<br />
'''''[mailto:sal.farina@gmail.com Sal Farina] wrote:'''''<br />
:Calling mnfix before and after emission ensures the size distribution is well behaved, and eliminates "Negative SF emis" warnings. An edit to mnfix was also introduced: previously, cases where "tiny" mass was added to a box with zero mass but "epsilon" number produced very high mass-per-particle values, necessitating excessive error detection, correction, and verbosity.<br />
<br />
--[[User:Melissa Payer|Melissa Sulprizio]] 15:08, 7 August 2013 (EDT)<br />
<br />
=== Segmentation Fault ===<br />
You may get an early segfault if your stacksize is not set to either unlimited or a very large number. To avoid this, you must either change the value of an environment variable (the setenv command in <tt>.cshrc</tt>) or use the <tt>ulimit</tt> command. See [http://wiki.seas.harvard.edu/geos-chem/index.php/Machine_issues_%26_portability#Resetting_stacksize_for_Linux this page] for details.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:20, 10 February 2010 (EST)<br />
<br />
=== Updates for GEOS-Chem v9-02 public release ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: As described below, there appears to be [[#Potential parallelization problems|a potential parallelization problem with the TOMAS ND60 diagnostic]]. We are currently looking into this. This issue, however, does not affect the tracer concentrations computed by TOMAS, but only the output of the ND60 diagnostic itself. For this reason we are moving ahead with the TOMAS benchmarks for v9-02. (Bob Yantosca, 21 Feb 2014)</p>'''</div><br />
<br />
We have found and fixed several minor numerical and coding issues prior to the public release of [[GEOS-Chem v9-02]] (01 Mar 2014). The TOMAS40 simulation has been validated with the [[GEOS-Chem Unit Tester]]. Below is the [[GEOS-Chem Unit Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|output of a unit test]] that was submitted on 2014/02/21 at 12:47:26 PM:<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (680420788 and 680420788)<br />
### Checksums : IDENTICAL (179613338 and 179613338)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the subsections below, we describe in more detail the fixes that we made for [[GEOS-Chem v9-02]]:<br />
<br />
==== Fixes for minor coding errors ====<br />
<br />
#<p>In <tt>GeosCore/main.F</tt>, we replaced <tt>CALL FLUSH()</tt> with <tt>CALL FLUSH(6)</tt>. The <tt>FLUSH</tt> routine needs to take an argument; unit 6 is stdout (i.e. the screen and/or log file).</p><br />
#<p>In routine <tt>CHEM_SO2</tt> (in module <tt>GeosCore/sulfate_mod.F</tt>), we now avoid referencing the dust tracers DST1, DST2, DST3, and DST4 for TOMAS simulations. TOMAS uses size-resolved dust tracers, and therefore does not carry DST1-DST4. This error appears to have been introduced with the cloud pH fix in Sep 2013.</p><br />
#<p>In routine <tt>COND_NUC</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we added error traps to avoid division-by-zero errors that occurred when the variable <tt>CSCH</tt> was zero. When <tt>CSCH</tt> is zero, we now set the variable <tt>ADDT</tt> to zero; <tt>ADDT</tt> is then reassigned to a minimum time step, so this fix should work correctly.</p><br />
#<p>In <tt>GeosCore/gamap_mod.F</tt>, we have restored several entries to <tt>tracerinfo.dat</tt> for the ND44 diagnostic that were not getting properly printed out when the TOMAS simulation was being used.</p><br />
#<p>In module <tt>GeosCore/drydep_mod.F</tt>, we now set <tt>MAXDEP=105</tt> for all simulations, including TOMAS. Formerly, TOMAS used <tt>MAXDEP=100</tt>; the slightly larger array bound is harmless for the other simulations.</p><br />
#<p>In module <tt>GeosCore/diag3.F</tt>, we now avoid an out-of-bounds error in <tt>DEPNAME(N)</tt> during TOMAS simulations. We save the drydep species name from <tt>DEPNAME(N)</tt> into a new variable <tt>DRYDEP_NAME</tt> for <tt>N = 1..NUMDEP</tt>. We then set <tt>DRYDEP_NAME = ''</tt> for <tt>N > NUMDEP</tt>. This error occurs because we extend the number of drydep tracers during TOMAS simulations to account for the size bins.</p><br />
#<p>We have fixed a couple of logical errors that prevented dust emissions from happening. Minor modifications were made to IF statements in <tt>GeosCore/chemistry_mod.F</tt>, <tt>GeosCore/dust_mod.F</tt>, and <tt>GeosCore/input_mod.F</tt>.</p><br />
#<p>In file <tt>GeosCore/Makefile</tt>, we added <tt>tomas_mod.o</tt> to the list of modules on which <tt>wetscav_mod.F</tt> depends (aka the "dependency listing"). The corrected code should look like this:</p><br />
<br />
wetscav_mod.o : wetscav_mod.F \<br />
dao_mod.o diag_mod.o \<br />
depo_mercury_mod.o get_ndep_mod.o \<br />
get_popsinfo_mod.o tracerid_mod.o \<br />
tracer_mod.o tomas_mod.o<br />
<br />
--[[User:Bmy|Bob Y.]] 10:20, 19 February 2014 (EST)<br />
<br />
==== Fixes for parallelization errors ====<br />
<br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we added the following variables to the <tt>!$OMP+PRIVATE</tt> statement: <tt>TRACNUM</tt>, <tt>NH3_TO_NH4</tt>, and <tt>SURF_AREA</tt>. With these added, TOMAS now produces identical single-processor (sp) and multi-processor (mp) results when chemistry and microphysics are turned on.</p><br />
#<p>In routine <tt>DEPVEL</tt> (in <tt>GeosCore/drydep_mod.F</tt>): Instead of holding <tt>A_RADI</tt> and <tt>A_DEN</tt> as <tt>!$OMP+PRIVATE</tt> in TOMAS simulations (in the main DO loop in <tt>DEPVEL</tt>), we now save the particle size and density values to private variables <tt>DIAM</tt> and <tt>DEN</tt>. We then pass those as arguments to function <tt>DUST_SFCRSII</tt>.</p> <br />
#<p>We have corrected an issue in routine <tt>NFCLDMX</tt> (in module <tt>GeosCore/convection_mod.F</tt>) that potentially impacts the TOMAS wet scavenging, as described below:</p><br />
#*<p>We think the serial and parallel results differed because of an assumption that holds for standard simulations but fails for TOMAS: that tracers are independent during wet scavenging. Since TOMAS scavenging is size-dependent, removing material from the distribution before calculating the soluble fraction of another component is incorrect. We now compute the fractions explicitly before the removal step. To do this, we call routine <tt>COMPUTE_F</tt> in its own parallel DO loop located immediately before the main parallel DO loop in <tt>NFCLDMX</tt>.</p><br />
#*<p>This modification also required moving the ND37 diagnostic IF block into the same loop as <tt>COMPUTE_F</tt>. Furthermore, because <tt>COMPUTE_F</tt> returns the value of diagnostic index <tt>ISOL</tt>, and because <tt>ISOL</tt> is also used for the ND38 diagnostic in the main parallel loop, we must save the values of <tt>ISOL</tt> in a 1-D vector so they can be passed from the first parallel loop to the second. This ensures that the ND37 and ND38 diagnostics are computed properly for all GEOS-5 simulations that have soluble tracers.</p><br />
#*<p>This modification has been tested in the [[GEOS-Chem Unit Tester]] by Bob Yantosca (04 Feb 2014) and it has yielded identical results for <tt>geos5_4x5_fullchem</tt>, <tt>geos5_4x5_Hg</tt>, <tt>geos5_4x5_RnPbBe</tt>, <tt>geos5_4x5_soa</tt> and <tt>geos5_4x5_soa_svpoa</tt> simulations.</p><br />
#<p>We have made some fixes in <tt>GeosCore/wetscav_mod.F</tt> that caused single-processor TOMAS runs to have different output than multi-processor runs. A few instances of code were computing quantities sequentially and then storing them for later use. These were technically thread-safe, but were susceptible to error because the order of computation would be different when running with parallelization turned on. These sections of code have now been rewritten accordingly.</p><br />
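The <tt>NFCLDMX</tt> fix described above can be illustrated with a toy calculation (hypothetical numbers and scavenging formula, not the real <tt>COMPUTE_F</tt>): when the scavenged fraction depends on the current distribution, removing tracers one at a time makes the answer depend on tracer order, while precomputing all fractions first does not.<br />

```python
# Toy illustration of the wet-scavenging fix (hypothetical numbers and
# formula; the real code computes soluble fractions in COMPUTE_F).

def frac(total_mass):
    """Scavenged fraction that depends on the current distribution."""
    return min(0.5, 10.0 / total_mass)

def scavenge_in_place(masses, order):
    """Old approach: the fraction for each tracer sees prior removals."""
    m = dict(masses)
    for name in order:
        m[name] *= 1.0 - frac(sum(m.values()))
    return m

def scavenge_precomputed(masses, order):
    """Fixed approach: compute the fraction once, before any removal."""
    f = frac(sum(masses.values()))
    return {name: mass * (1.0 - f) for name, mass in masses.items()}

m0 = {"SO4": 30.0, "OC": 20.0}
a = scavenge_in_place(m0, ["SO4", "OC"])
b = scavenge_in_place(m0, ["OC", "SO4"])
print(a["SO4"] == b["SO4"])   # False: result depends on tracer order
c1 = scavenge_precomputed(m0, ["SO4", "OC"])
c2 = scavenge_precomputed(m0, ["OC", "SO4"])
print(c1 == c2)               # True: order no longer matters
```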
<br />
--[[User:Bmy|Bob Y.]] 14:09, 21 February 2014 (EST)<br />
<br />
==== Removed inefficient subroutine calls ====<br />
<br />
#<p>In <tt>GeosCore/diag3.F</tt>, we now use a 2-D <tt>(J,L)</tt> array for archiving into the ND60 TOMAS diagnostic. This eliminates an array temporary in the call to routine <tt>BPCH2</tt>.</p><br />
#<p>In routine <tt>AEROPHYS</tt> (in module <tt>GeosCore/tomas_mod.F</tt>), we now use an array <tt>ERR_IND</tt> to pass the I,J,L,N indices to error checking routine <tt>CHECK_VALUE</tt>. We previously used an array descriptor <tt>(/I,J,L,0/)</tt> which caused an array temporary to be created.</p><br />
#<p>In routine <tt>EMISSCARBON</tt> (in module <tt>GeosCore/carbon_mod.F</tt>), we removed array temporaries from the calls to subroutine <tt>EMITSGC</tt>. We now sum two arrays into a temporary array, and then pass that to <tt>EMITSGC</tt>.</p><br />
#<p>We rewrote the subroutine calls to <tt>NH4BULKTOBIN</tt> to avoid the creation of array temporaries, in most cases by replacing <tt>MK(1:IBINS,SRTSO4)</tt> with <tt>MK(:,SRTSO4)</tt>, etc. Explicitly stating the sub-slice <tt>MK(1:IBINS,SRTSO4)</tt> causes the compiler to create an array temporary, whereas <tt>MK(:,SRTSO4)</tt> allows a more efficient pointer to the slice to be passed.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 14:47, 31 January 2014 (EST)<br />
<br />
==== Fixes for convenience ====<br />
<br />
#<p>We now read many of the TOMAS data files from the directory <tt>TRIM( DATA_DIR_1x1 ) // 'TOMAS_201402/'</tt>. This saves us from having to keep these big files (some of which approach 100 MB in size) in individual users' run directories.</p><br />
<br />
--[[User:Bmy|Bob Y.]] 16:20, 31 January 2014 (EST)<br />
<br />
== Outstanding issues ==<br />
<br />
=== Potential parallelization problems ===<br />
<br />
We have noticed that there may be a parallelization error in the TOMAS [http://acmg.seas.harvard.edu/geos/doc/man/appendix_5.html ND60 diagnostic]. This may be caused by a coding error; in particular, one or more variables may have been omitted from an <tt>!$OMP+PRIVATE</tt> declaration.<br />
<br />
This is illustrated by the following [[GEOS-Chem_Unit_Tester#Interpreting_results_generated_by_the_GEOS-Chem_Unit_Tester|unit test simulation]] of the [[GEOS-Chem v9-01-02]] provisional release code (submitted at 2:11 PM on 21 Feb 2014):<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (707260156 and 707260156)<br />
### Checksums : DIFFERENT (895530022 and 2949483685)<br />
### Diffs : DIFFERENT<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
<br />
In the above test, all TOMAS diagnostics (ND59, ND60, and ND61) were turned on. The restart files (here named <tt>trac_rst.*</tt>) from the single-processor and multi-processor stages of the unit test are identical, but the <tt>ctm.bpch</tt> files (here named <tt>trac_avg.*</tt>) are different. When the restart files are identical, the single-processor and multi-processor stages produced identical tracer concentrations (and soil NOx quantities). <br />
<br />
The only differences in the <tt>trac_avg.*</tt> files between the single-processor and multi-processor stages of the unit test were in TOMAS diagnostic quantities. The affected categories appear to be <tt>TMS-COND</tt>, <tt>TMS-COAG</tt>, <tt>TMS-NUCL</tt>, and <tt>AERO-FIX</tt>, which points to the ND60 diagnostic.<br />
<br />
In order to confirm that the ND60 diagnostic exhibits the problem, we ran an additional unit test with ND59 and ND61 turned on, but ND60 turned off. This unit test, submitted at 3:33 PM on 21 Feb 2014, yielded identical results.<br />
<br />
###############################################################################<br />
### VALIDATION OF GEOS-CHEM OUTPUT FILES<br />
### In directory: geos5_4x5_TOMAS40<br />
###<br />
### File 1 : trac_avg.geos5_4x5_TOMAS40.2005070100.sp<br />
### File 2 : trac_avg.geos5_4x5_TOMAS40.2005070100.mp<br />
### Sizes : IDENTICAL (690218236 and 690218236)<br />
### Checksums : IDENTICAL (4196844107 and 4196844107)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : trac_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : trac_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (263480068 and 263480068)<br />
### Checksums : IDENTICAL (1925551193 and 1925551193)<br />
### Diffs : IDENTICAL<br />
###<br />
### File 1 : soil_rst.geos5_4x5_TOMAS40.2005070101.sp<br />
### File 2 : soil_rst.geos5_4x5_TOMAS40.2005070101.mp<br />
### Sizes : IDENTICAL (54040 and 54040)<br />
### Checksums : IDENTICAL (3229970876 and 3229970876)<br />
### Diffs : IDENTICAL<br />
###############################################################################<br />
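The size/checksum comparison shown in these listings can be sketched as follows (a sketch; the actual [[GEOS-Chem Unit Tester]] uses its own validation scripts, and the file names are those from the listings):<br />

```python
# Compare two model output files the way the unit-test summaries above do:
# file sizes first, then checksums (a sketch, not the real validation script).
import hashlib
import os

def files_identical(path1, path2):
    """Return True if the two files match in size and checksum."""
    if os.path.getsize(path1) != os.path.getsize(path2):
        return False
    def checksum(path):
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()
    return checksum(path1) == checksum(path2)
```

For example, calling <tt>files_identical</tt> on the sp and mp <tt>trac_avg.*</tt> files would reproduce the IDENTICAL/DIFFERENT verdicts shown above.<br />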
<br />
We are still looking into this issue. Because this issue only affects the ND60 diagnostic output, but not tracer concentrations, we are moving ahead with the TOMAS benchmarks for [[GEOS-Chem v9-02]] (as of 21 Feb 2014). <br />
<br />
--[[User:Bmy|Bob Y.]] 16:17, 21 February 2014 (EST)<br />
<br />
=== Offline Dust ===<br />
Currently, GEOS-Chem with TOMAS uses prescribed offline dust aerosol data in radiative transfer / photolysis calculations. Due to complications, this is turned off entirely at 2x2.5 resolution.<br />
<br />
=== Vertical Grids ===<br />
Currently, GC-TOMAS is only compatible with the reduced vertical grids:<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.3.1 GEOS3_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.4.1 GEOS4_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.5.1 GEOS5_47L]<br />
<br />
Development for the full vertical grids is ongoing.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:43, 10 February 2010 (EST)<br />
<br />
== Obsolete versions of TOMAS ==<br />
<br />
In this section we preserve information that pertained to older versions of TOMAS (before the [[GEOS-Chem v9-02]] release).<br />
<br />
=== Code structure ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
The main-level <tt>Code</tt> directory has now been divided into several subdirectories:<br />
<br />
GeosCore/ GEOS-Chem "core" routines<br />
GeosTomas/ Parallel copies of GEOS-Chem routines that reference TOMAS<br />
 GeosUtil/     "Utility" modules (e.g. error_mod.f, file_mod.f, time_mod.f, etc.)<br />
Headers/ Header files (define.h, CMN_SIZE, CMN_DIAG, etc.)<br />
KPP/ KPP solver directory structure<br />
bin/ Directory where executables are placed<br />
doc/ Directory where documentation is created<br />
help/ Directory for GEOS-Chem Help Screen<br />
lib/ Directory where library files are placed<br />
mod/ Directory where module files are placed<br />
obsolete/ Directory where obsolete versions of code are archived<br />
<br />
Because there were many TOMAS-related modifications in several GEOS-Chem "core" routines, the routines that need to "talk" to TOMAS were placed into a separate subdirectory named <tt>GeosTomas/</tt>. The files in <tt>GeosTomas</tt> are:<br />
<br />
Files:<br />
------<br />
Makefile -- GEOS-Chem routines that have been<br />
aero_drydep.f modified to reference the TOMAS aerosol<br />
carbon_mod.f microphysics package. These are kept<br />
chemdr.f in a separate GeosTomas directory so that<br />
chemistry_mod.f they do not interfere with the routines<br />
cleanup.f in the GeosCore directory.<br />
diag3.f<br />
diag_mod.f The GeosTomas directory only needs to<br />
diag_pl_mod.f contain the files that have been modified<br />
drydep_mod.f for TOMAS. The Makefile will look for<br />
dust_mod.f all other files from the GeosCore directory<br />
emissions_mod.f using the VPATH option in GNU Make.<br />
gamap_mod.f<br />
initialize.f NOTE to GEOS-Chem developers: When you<br />
input_mod.f make changes to any of these routines<br />
isoropia_mod.f in the GeosCore directory, you must also<br />
logical_mod.f make the same modifications to the<br />
ndxx_setup.f corresponding routines in the GeosTomas<br />
planeflight_mod.f directory.<br />
seasalt_mod.f<br />
sulfate_mod.f Maybe in the near future we can work<br />
tomas_mod.f towards integrating TOMAS into the GeosCore<br />
tomas_tpcore_mod.f90 directory more cleanly. However, due to<br />
tpcore_mod.f the large number of modifications that were<br />
tpcore_window_mod.f necessary for TOMAS, it was quicker to<br />
tracerid_mod.f implement the TOMAS code in a separate<br />
wetscav_mod.f subdirectory. <br />
xtra_read_mod.f -- Bob Y. (1/25/10)<br />
<br />
Each of these files was merged with the corresponding file in the <tt>GeosCore</tt> subdirectory. Therefore, in addition to having the GEOS-Chem modifications from [[GEOS-Chem v8-02-05|v8-02-05]], these files also contain the relevant TOMAS references.<br />
<br />
A few technical considerations dictated the placing of these files into a separate <tt>GeosTomas/</tt> directory:<br />
<br />
* The ND60 diagnostic in the standard GEOS-Chem code (in <tt>GeosCore/</tt>) is now used for the CH4 offline simulation, but in TOMAS it's used for something else. <br />
* Some parameters needed to be declared differently for simulations with TOMAS. <br />
* Because not all GEOS-Chem users will choose to use TOMAS, we did not want to unnecessarily bog down the code in <tt>GeosCore/</tt> with references to TOMAS-specific routines. <br />
<br />
All of these concerns could be best solved by keeping parallel copies of the affected routines in the <tt>GeosTomas</tt> directory.<br />
<br />
--[[User:Bmy|Bob Y.]] 13:35, 25 February 2010 (EST)<br />
<br />
=== Building GEOS-Chem with TOMAS ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. All of the TOMAS routines have now been ported into the <tt>GeosCore</tt> directory. We shall leave this post here for reference. (Bob Yantosca, 25 Feb 2014)</p>'''</div><br />
<br />
The <tt>VPATH</tt> feature of [http://www.gnu.org/software/make/manual/make.html GNU Make] is used to simplify the compilation. When GEOS-Chem is compiled with the tomas target, the GNU Make utility will search for files in the <tt>GeosTomas/</tt> directory first; if it cannot find a file there, it will then search the <tt>GeosCore/</tt> directory. Thus, if we make a change to a "core" GEOS-Chem routine in the <tt>GeosCore/</tt> subdirectory (say in <tt>dao_mod.f</tt> or <tt>diag49_mod.f</tt>), those changes will automatically be applied when you build GEOS-Chem with TOMAS. We therefore only need to keep in <tt>GeosTomas/</tt> separate copies of those files that have to "talk" with TOMAS.<br />
<br />
Several new targets were added to the <tt>Makefile</tt> in the top-level <tt>Code/</tt> directory:<br />
<br />
#=============================================================================<br />
# Targets for TOMAS aerosol microphysics code (win, bmy, 1/25/10)<br />
#=============================================================================<br />
<br />
.PHONY: tomas libtomas exetomas cleantomas<br />
<br />
tomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes all<br />
<br />
libtomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes lib<br />
<br />
exetomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes exe<br />
<br />
cleantomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes clean<br />
<br />
You can build GEOS-Chem with the TOMAS option by typing:<br />
<br />
make tomas ...<br />
<br />
This will automatically do the proper things to build the TOMAS code into GEOS-Chem, such as:<br />
<br />
* Adding a <tt>-DTOMAS</tt> C-preprocessor switch to the <tt>FFLAGS</tt> compiler flag settings in <tt>Makefile_header.mk</tt>. This will cause TOMAS-specific areas of code to be turned on.<br />
* Turning off OpenMP parallelization. For now the GEOS-Chem + TOMAS code needs to be run on a single processor. We continue to work on parallelizing the code.<br />
* Calling the Makefile in the <tt>GeosTomas/</tt> subdirectory to build the executable. The executable file is now named <tt>geostomas</tt> in order to denote that the TOMAS code is built in.<br />
<br />
GEOS-Chem + TOMAS has been built with the following compilers:<br />
<br />
* Intel Fortran compiler v10<br />
* Intel Fortran compiler v11.1 (20101201)<br />
* SunStudio 12<br />
<br />
--[[User:Bmy|Bob Y.]] 10:36, 27 January 2010 (EST)<br />
<br />
=== Compile from GeosTomas directory ===<br />
<br />
<div style="color: #aa0000; background: #eeeeee;border: 3px solid red; padding: 1em; margin: auto; width: 90%; ">'''<p>NOTE: This has been rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which was included in [[GEOS-Chem v9-02]]. We shall leave this post here for reference. (Bob Yantosca, 30 Jan 2014)</p>'''</div><br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:I think there is something going wrong in my compilation, although errors have come up at both compile time and run time. The worst of the problems is this: I'll make a change to any fortran file in the code (even something meaningless like print*, 'foo') and hundreds of compile errors come out with fishy error messages such as (from ifort v10.1):<br />
<br />
***fortcom: Error: chemistry_mod.f, line 478: A kind type parameter must be a compile-time constant. [DP]<br />
REAL(kind=dp) :: RCNTRL(20)<br />
<br />
:Any advice? The errors I'm having are not unique to any version of GC, any type of met fields, any compiler, etc.<br />
<br />
'''''[mailto:yantosca@seas.harvard.edu Bob Yantosca] wrote:'''''<br />
<br />
:Make sure you are always in the GeosTomas subdirectory when you build the code. Sometimes there is a problem if you build the code from a higher level directory. This may have to do with the VPATH in the makefile.<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:Thanks, that seems to do the trick.<br />
<br />
--[[User:Bmy|Bob Y.]] 14:37, 14 April 2010 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=14301TOMAS setup guide2013-09-10T00:01:25Z<p>Salvatore Farina: /* Building GEOS-Chem/TOMAS */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Fortran compiler (ifort), v11.1.<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following:<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', run<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
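If ifort reports the wrong version (or nothing at all), a quick check of the GC_* variables can save time. This is only a sketch, using bash's indirect expansion; the variable names follow the section above:<br />

```shell
# Report whether each GC_* variable from .bashrc is actually set
# (uses bash indirect expansion ${!v}).
for v in GC_BIN GC_INCLUDE GC_LIB; do
    if [ -n "${!v:-}" ]; then
        echo "$v=${!v}"
    else
        echo "$v is NOT set"
    fi
done
```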
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory: it is many, many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
where ''XX'' is the number of size bins. These restart files start from an "empty" restart file dated 2005/06/01, so spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
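Since the "empty" restart file is dated 2005/06/01, the spin-up length for a given run is just the day count between that date and your intended analysis start. A sketch using GNU date; the start date here is only an example:<br />

```shell
# Days between the empty-restart date and an example analysis start.
restart_date=2005-06-01    # date of the "empty" restart file
start_date=2006-01-01      # hypothetical start of your analysis period
secs=$(( $(date -ud "$start_date" +%s) - $(date -ud "$restart_date" +%s) ))
echo "$(( secs / 86400 )) days of spin-up"    # 214 days for this example
```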
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== define.h ===<br />
The GEOS-Chem model resolution is selected in Headers/define.h. Follow the instructions [http://acmg.seas.harvard.edu/geos/doc/man/chapter_3.html#3.4 here].<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells for heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
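One way to make this automatic is a small wrapper that remembers the last target built and runs ''make realclean'' whenever it changes. This is only a sketch: the stamp file name and the wrapper itself are made up, not part of the GEOS-Chem makefiles.<br />

```shell
# Force "make realclean" whenever the requested TOMAS target differs
# from the one recorded in a stamp file (.last_tomas_target is made up).
build_tomas() {
    local target=$1 stamp=.last_tomas_target
    if [ -f "$stamp" ] && [ "$(cat "$stamp")" != "$target" ]; then
        make realclean
    fi
    echo "$target" > "$stamp"
    make -j16 "$target"
}
# e.g.: build_tomas tomas40
```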
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for GEOS-Chem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate TOMAS version for your needs involves weighing time against resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | model:real time ratio<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
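Read as model days simulated per wall-clock day (my interpretation of the ratio above), these numbers give a quick wall-time estimate. For example, a one-year 40-bin run at 4x5:<br />

```shell
# Wall-clock estimate for a 1-year run, assuming the ratio above
# means model days per wall-clock day (64 for the 40-bin version).
ratio=64
model_days=365
hours=$(( model_days * 24 / ratio ))
echo "~$hours hours (~$(( hours / 24 )) days) of wall time"   # ~136 hours (~5 days)
```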
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct input and output files, then execute it from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version.<br />
For example, to bin and average the August results from TOMAS15:<br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Rename your run directory to the format YYY_run.TOMASXX, where YYY is a run number and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have zonal and surface-level maps of the CN3, CN10, CN40, and CN80 concentrations predicted by the model.<br />
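The naming convention matters because the scripts read the version back out of the directory name; shell parameter expansion shows the idea (the run number here is made up):<br />

```shell
# Recover the run number and TOMAS version from a directory name
# of the form YYY_run.TOMASXX (001 is a hypothetical run number).
dir=001_run.TOMAS15
run=${dir%%_*}       # strip from the first "_" onward  -> 001
ver=${dir##*TOMAS}   # strip through the last "TOMAS"   -> 15
echo "run $run, TOMAS$ver"
```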
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash), and I suggest you do the same. csh is fine, but I have written all of my scripts for bash, so your life will probably be easier if you use it.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=14300TOMAS setup guide2013-09-09T23:52:09Z<p>Salvatore Farina: /* Restart Files */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Fortran compiler (ifort), v11.1.<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following:<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', run<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
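If ifort reports the wrong version (or nothing at all), a quick check of the GC_* variables can save time. This is only a sketch, using bash's indirect expansion; the variable names follow the section above:<br />

```shell
# Report whether each GC_* variable from .bashrc is actually set
# (uses bash indirect expansion ${!v}).
for v in GC_BIN GC_INCLUDE GC_LIB; do
    if [ -n "${!v:-}" ]; then
        echo "$v=${!v}"
    else
        echo "$v is NOT set"
    fi
done
```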
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory: it is many, many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
where ''XX'' is the number of size bins. These restart files start from an "empty" restart file dated 2005/06/01, so spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.ires.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
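Since the "empty" restart file is dated 2005/06/01, the spin-up length for a given run is just the day count between that date and your intended analysis start. A sketch using GNU date; the start date here is only an example:<br />

```shell
# Days between the empty-restart date and an example analysis start.
restart_date=2005-06-01    # date of the "empty" restart file
start_date=2006-01-01      # hypothetical start of your analysis period
secs=$(( $(date -ud "$start_date" +%s) - $(date -ud "$restart_date" +%s) ))
echo "$(( secs / 86400 )) days of spin-up"    # 214 days for this example
```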
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells for heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
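One way to make this automatic is a small wrapper that remembers the last target built and runs ''make realclean'' whenever it changes. This is only a sketch: the stamp file name and the wrapper itself are made up, not part of the GEOS-Chem makefiles.<br />

```shell
# Force "make realclean" whenever the requested TOMAS target differs
# from the one recorded in a stamp file (.last_tomas_target is made up).
build_tomas() {
    local target=$1 stamp=.last_tomas_target
    if [ -f "$stamp" ] && [ "$(cat "$stamp")" != "$target" ]; then
        make realclean
    fi
    echo "$target" > "$stamp"
    make -j16 "$target"
}
# e.g.: build_tomas tomas40
```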
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for GEOS-Chem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate TOMAS version for your needs involves weighing time against resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | model:real time ratio<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
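Read as model days simulated per wall-clock day (my interpretation of the ratio above), these numbers give a quick wall-time estimate. For example, a one-year 40-bin run at 4x5:<br />

```shell
# Wall-clock estimate for a 1-year run, assuming the ratio above
# means model days per wall-clock day (64 for the 40-bin version).
ratio=64
model_days=365
hours=$(( model_days * 24 / ratio ))
echo "~$hours hours (~$(( hours / 24 )) days) of wall time"   # ~136 hours (~5 days)
```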
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct input and output files, then execute it from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version.<br />
For example, to bin and average the August results from TOMAS15:<br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Rename your run directory to the format YYY_run.TOMASXX, where YYY is a run number and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have zonal and surface-level maps of the CN3, CN10, CN40, and CN80 concentrations predicted by the model.<br />
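The naming convention matters because the scripts read the version back out of the directory name; shell parameter expansion shows the idea (the run number here is made up):<br />

```shell
# Recover the run number and TOMAS version from a directory name
# of the form YYY_run.TOMASXX (001 is a hypothetical run number).
dir=001_run.TOMAS15
run=${dir%%_*}       # strip from the first "_" onward  -> 001
ver=${dir##*TOMAS}   # strip through the last "TOMAS"   -> 15
echo "run $run, TOMAS$ver"
```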
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash), and I suggest you do the same. csh is fine, but I have written all of my scripts for bash, so your life will probably be easier if you use it.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=14299TOMAS setup guide2013-09-09T23:51:44Z<p>Salvatore Farina: /* Data */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Fortran compiler (ifort), v11.1.<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. This also gives you access to the ''iidb'' debugger. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following:<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
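Before compiling, it is worth checking that these variables actually point at real directories. The sketch below is my own helper (the name ''check_gc_env'' is not part of GEOS-Chem); drop it in your .bashrc if you find it useful:<br />

```shell
# Verify that the GC_* environment variables are set and point to real directories.
check_gc_env() {
    ok="environment OK"
    [ -d "$GC_BIN" ]     || { echo "GC_BIN is missing or not a directory";     ok=""; }
    [ -d "$GC_INCLUDE" ] || { echo "GC_INCLUDE is missing or not a directory"; ok=""; }
    [ -d "$GC_LIB" ]     || { echo "GC_LIB is missing or not a directory";     ok=""; }
    [ -n "$ok" ] && echo "$ok"
}
```

If anything is missing it names the offending variable; otherwise it prints ''environment OK''.<br />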
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', verify your setup:<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling.<br />
<br />
=== Data ===<br />
To set up the necessary data (meteorology, emissions, land use, etc.) for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes and additions due to recent GC development and TOMAS specifics.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
=== Restart Files ===<br />
There are restart files for TOMAS at 4x5 resolution at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/restart.TOMASXX<br />
where ''XX'' is the number of bins. These restart files use an "empty" restart file for 2005/06/01, so spin-up times can be calculated accordingly. I will be adding to this directory in the coming week or two. Restart files for 2x2.5 are located at<br />
/net/samqfs/pierce/sfarina/standard_run_directories/2x2.5/restart.TOMAS15<br />
<br />
So far, I have only used TOMAS15 at this model resolution.<br />
<br />
The North American nested grid is under active development for TOMAS.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells for heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <-- TOMAS 30 (30 bins, the default)<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
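Since forgetting the clean step is easy, you can wrap the sequence in a small shell function. This is my own convenience sketch (''rebuild_tomas'' is not a GEOS-Chem makefile target); it rejects unknown targets and always cleans first:<br />

```shell
# Always 'make realclean' before switching TOMAS versions, then build the target.
rebuild_tomas() {
    target="$1"
    case "$target" in
        tomas|tomas12|tomas15|tomas40) ;;
        *) echo "usage: rebuild_tomas {tomas|tomas12|tomas15|tomas40}" >&2; return 1 ;;
    esac
    make realclean && make -j16 "$target"
}
```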
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40, depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
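The extract-and-copy steps can be combined into one helper. A sketch under the assumptions above (the tarball unpacks to a single ''run.TOMASXX'' folder; the function name ''setup_run_dir'' is mine):<br />

```shell
# Extract a run directory tarball and drop the matching executable into it.
setup_run_dir() {
    tarball="$1"   # e.g. YOUR_STANDARD_LOCATION/40.tgz
    exe="$2"       # e.g. YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore/geostomas
    tar zxf "$tarball" || return 1
    # The tarball is assumed to unpack to a single run.TOMASXX directory.
    rundir=$(tar ztf "$tarball" | head -n1 | cut -d/ -f1)
    cp "$exe" "$rundir/" && echo "run directory ready: $rundir"
}
```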
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for GEOS-Chem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of TOMAS for your needs includes weighing time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | model:real time ratio<br />
40 bin  | 64<br />
30 bin  | 82<br />
15 bin  | 144<br />
12 bin  | 170<br />
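These ratios convert directly into wall-clock estimates: divide the number of simulated days by the ratio to get real days. A small helper (my own) does the arithmetic:<br />

```shell
# Estimate wall-clock time for a run: simulated days / (model:real ratio).
estimate_walltime() {
    sim_days="$1"   # length of the simulation in model days
    ratio="$2"      # model:real time ratio from the table above
    awk -v d="$sim_days" -v r="$ratio" 'BEGIN { printf "%.1f days\n", d / r }'
}
```

For example, ''estimate_walltime 365 64'' reports roughly 5.7 real days for a one-year TOMAS40 run, so budget your h_rt request (and restart files) accordingly.<br />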
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
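To go the other direction and produce an ''update.patch'' from your own branch to share, use ''git diff''. A sketch (the function name is mine; the branch names follow the examples above):<br />

```shell
# Write the changes that MY_BRANCH adds on top of the shared tomasmerge
# branch into a patch file that others can 'git apply'.
make_update_patch() {
    base="$1"    # e.g. tomasmerge
    branch="$2"  # e.g. MY_BRANCH
    git diff "$base" "$branch" > update.patch && echo "wrote update.patch"
}
```

Check that the patch applies cleanly (''git apply --check update.patch'') before sending it on.<br />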
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more steps before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
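Typing this for every month gets tedious. Assuming your IDL setup can run a batch file of commands (the doc already runs ''idl proc_one.pro'' this way below), you can generate one from the shell; the filename ''split_all.pro'' and the year are my own examples:<br />

```shell
# Generate an IDL batch file that splits ctm.bpch into one file per month.
write_split_script() {
    year="$1"
    : > split_all.pro
    for month in 01 02 03 04 05 06 07 08 09 10 11 12; do
        echo "Bpch_Sep_Sal,'ctm.bpch','ctm.$month.bpch',Tau0=nymd2tau(${year}${month}01)" >> split_all.pro
    done
    echo "exit" >> split_all.pro
}
```

Then run ''idl split_all.pro'' from your shell.<br />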
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct infile/outfile names.<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to the ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
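For the backup advice above, a date-stamped tarball is the simplest approach. A sketch (the function name and paths are my own examples):<br />

```shell
# Make a date-stamped tarball of a source tree, e.g.: backup_code ~/GC_Bleeding_Edge
backup_code() {
    src="$1"
    stamp=$(date +%Y%m%d)
    # Archive the directory itself, not its absolute path, via -C.
    tar zcf "$(basename "$src").$stamp.tgz" -C "$(dirname "$src")" "$(basename "$src")"
    echo "wrote $(basename "$src").$stamp.tgz"
}
```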
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farina
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best ''(only)'' with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
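The extract-and-stage steps above can be scripted. This sketch assumes the tarball layout described here (e.g. 40.tgz unpacks to ''run.TOMAS40''); the function name and argument order are just for illustration.

```shell
# Sketch of the extract-and-stage steps: unpack the run directory tarball
# into the current directory and copy the compiled executable into it.
setup_rundir() {
    local bins="$1" tarball_dir="$2" exe="$3"
    tar zxf "$tarball_dir/$bins.tgz"     # creates ./run.TOMAS$bins
    cp "$exe" "run.TOMAS$bins/"          # stage the compiled executable
    echo "run.TOMAS$bins ready"
}
```

For example: `setup_rundir 40 YOUR_STANDARD_LOCATION YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore/geostomas`.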
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #run the job from your current working directory<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of TOMAS for your needs means weighing resolution against time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the ratio of model time to real time (speedup) is roughly as follows:<br />
version | speedup<br />
40 bin  | 64<br />
30 bin  | 82<br />
15 bin  | 144<br />
12 bin  | 170<br />
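The speedup table translates directly into wall-clock estimates: a simulation of D model-days at speedup S takes roughly D×24/S hours. A small helper (hypothetical, using awk for the arithmetic) makes this concrete:

```shell
# Back-of-envelope wall-clock estimate from the speedup table above:
# D model-days at speedup S takes roughly D * 24 / S hours of real time.
est_hours() {
    local days="$1" speedup="$2"
    awk -v d="$days" -v s="$speedup" 'BEGIN { printf "%.1f", d * 24 / s }'
}
```

So `est_hours 365 64` gives 136.9 — about 5.7 days of real time for a one-year 40-bin run, versus roughly 2 days (`est_hours 365 170` = 51.5 hours) with 12 bins.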
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to clear before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
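If you need all twelve months, typing the Bpch_Sep_Sal call by hand gets tedious. This sketch builds the command strings from the shell, reusing the exact syntax of the example above (Bpch_Sep_Sal and nymd2tau come from the IDL/gamap scripts; the year 2005 is assumed). The commands are only printed, so you can review them before feeding them to IDL.

```shell
# Sketch: generate the Bpch_Sep_Sal command for each month of 2005 so the
# splitting step can be driven from the shell. Commands are printed, not run.
split_cmd() {
    local month="$1"   # two digits, e.g. 08
    echo "Bpch_Sep_Sal,'ctm.bpch','ctm.$month.bpch',Tau0=nymd2tau(2005${month}01)"
}
all_split_cmds() {
    for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
        split_cmd "$m"
    done
}
```

Once you are happy with the output of `all_split_cmds`, you can paste it into an interactive IDL session.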
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to a standard netCDF file.<br />
Edit proc_one.pro to use the correct input and output files.<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
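The per-month steps above (proc_one.pro, then averageCNCCN, then plotCNCCN) can be chained in one hypothetical driver script. The tool names are the ones used in this guide; the split step is assumed to have been done interactively beforehand. Each step is printed rather than executed, since IDL and these scripts are site-specific.

```shell
# Hypothetical driver for the monthly post-processing chain (after splitting).
postprocess_month() {
    local month="$1" version="$2"
    run() { echo "+ $*"; }   # dry run: print each step; replace body with "$@" to execute
    run idl proc_one.pro
    run "./averageCNCCN_${version}.py" "$month"
    run ./plotCNCCN.py "$month"
}
```

For example, `postprocess_month 08 15` prints the three commands for August with TOMAS15.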
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F. This is a hard-to-track bug related to ongoing development of grid-independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farina
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best (''only'') with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to ctm.bpch into separate months<br />
For example, to extract august, 2005 from ctm.bpch<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to the standard netCDF<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for august:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F. This is a hard to track bug related to ongoing development of grid independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* if you have any questions or you are running into trouble, ''please ask'' either myself, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=14295TOMAS setup guide2013-09-09T23:32:45Z<p>Salvatore Farina: /* Compiler */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
'''Alternatively''', I have installed ifort version 11.1.080. To use this version, add the following to your .bashrc<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two main ways to debug: inserting large numbers of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas with debugging enabled:<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more steps before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc (the variables must be exported so that IDL can see them)<br />
export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run:<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
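The ''Tau0'' value here is GEOS-Chem's standard timestamp: hours elapsed since 0 GMT on 1985-01-01, which is what nymd2tau computes from a YYYYMMDD date. If you ever need the number outside IDL, the same arithmetic can be done in the shell (this sketch assumes GNU ''date''):<br />

```shell
# Hours since the GEOS-Chem tau epoch (1985-01-01 00:00 GMT) for 2005-08-01,
# i.e. the value nymd2tau(20050801) returns.
epoch=$(date -ud "1985-01-01" +%s)
target=$(date -ud "2005-08-01" +%s)
echo $(( (target - epoch) / 3600 ))   # prints 180408
```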
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct input/output file names.<br />
Execute proc_one.pro from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize the map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have zonal and surface-level maps of CN3, CN10, CN40, and CN80 as predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview to view individual species concentrations (ctm.nc) or nucleation rates (ctm_nuc.nc):<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to the ongoing development of grid-independent geoschem.<br />
* I use the GNU Bourne Again Shell (bash), and I suggest you do the same. csh is fine, but I have written all of my scripts in bash, so your life will probably be easier if you use it.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
Alternatively, I have installed ifort version 11.1.080. To use this version, add the following to your .bashrc<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to ctm.bpch into separate months<br />
For example, to extract august, 2005 from ctm.bpch<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to the standard netCDF<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for august:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F. This is a hard to track bug related to ongoing development of grid independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* if you have any questions or you are running into trouble, ''please ask'' either myself, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=14293TOMAS setup guide2013-09-09T23:31:44Z<p>Salvatore Farina: /* Environment */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
Alternatively, I have installed ifort version 11.1.080. To use this version, add the following to your .bashrc<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
For reasons I can't recall, I use a specific version of the ifort11 compiler that is not installed by default on glooscap. If you cannot successfully compile and run geoschem with tomas using the intel ifort 11 installed on glooscap (using $ module load intel/11.1.073), adding the following to your .bashrc should allow you to use the version that I have installed with Sajeev's help:<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to ctm.bpch into separate months<br />
For example, to extract august, 2005 from ctm.bpch<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to the standard netCDF<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for august:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without producing any output, try commenting (or uncommenting) the ''"welcome to geoschem"'' print statement and the ''call flush'' line that follows it in main.F. This is a hard-to-track bug related to the ongoing development of grid-independent GEOS-Chem.<br />
* I use the GNU Bourne Again SHell (bash), and I suggest you do the same. csh is fine, but all of my scripts are written for bash, so your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to email within a day, and I am willing to use gchat or Skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=14292TOMAS setup guide2013-09-09T23:31:03Z<p>Salvatore Farina: /* Environment */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
Alternatively, I have installed ifort version 11.1.080. To use this version, add the following to your .bashrc<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
For reasons I can't recall, I use a specific version of the ifort11 compiler that is not installed by default on glooscap. Part of the reasoning is for debugging purposes. If you cannot successfully compile and run geoschem with tomas using the intel ifort 11 installed on glooscap (using $ module load intel/11.1.073), adding the following to your .bashrc should allow you to use the version that I have installed with Sajeev's help:<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to ctm.bpch into separate months<br />
For example, to extract august, 2005 from ctm.bpch<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to the standard netCDF<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for august:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F. This is a hard to track bug related to ongoing development of grid independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* if you have any questions or you are running into trouble, ''please ask'' either myself, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=14291TOMAS setup guide2013-09-09T23:24:34Z<p>Salvatore Farina: /* Make */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
Alternatively, I have installed ifort version 11.1.080. To use this version, add the following to your .bashrc<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions for [[Installing libraries for GEOS-Chem]] wiki before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your bashrc should include a similar section to the following<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
Alternatively, you can use the following to define a tomas version when compiling:<br />
make TOMAS=yes geos<br />
make TOMAS40=yes geos<br />
etc.<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of tomas for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
version | speedup<br />
40 bin - 64<br />
30 bin - 82<br />
15 bin - 144<br />
12 bin - 170<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to ctm.bpch into separate months<br />
For example, to extract august, 2005 from ctm.bpch<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to the standard netCDF<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for august:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F. This is a hard to track bug related to ongoing development of grid independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* if you have any questions or you are running into trouble, ''please ask'' either myself, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=14290TOMAS setup guide2013-09-09T23:22:39Z<p>Salvatore Farina: /* Running GEOS-Chem with TOMAS */ -> a note about speed</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best with the Intel Ifort Fortran compiler - v11.1<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
Alternatively, I have installed ifort version 11.1.080. To use this version, add the following to your .bashrc<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
Or (safer), you can grab my latest "snapshot":<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following:<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
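As a quick sanity check before building, you can verify that the required variables are actually set. This is my own sketch (the function name <tt>check_gc_env</tt> is not part of GEOS-Chem); the variable names are taken from the block above:<br />

```shell
# Hypothetical helper: warn about any unset GEOS-Chem build variables.
# The variable names (GC_BIN, GC_INCLUDE, GC_LIB, FC) come from the
# .bashrc section shown above.
check_gc_env() {
    local v val status=0
    for v in GC_BIN GC_INCLUDE GC_LIB FC; do
        eval "val=\$$v"                 # indirect lookup of $GC_BIN etc.
        if [ -z "$val" ]; then
            echo "missing: $v"
            status=1
        fi
    done
    return $status
}
```

Run <tt>check_gc_env</tt> after sourcing your .bashrc; any "missing:" line means the corresponding export did not take effect.<br />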
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11''<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory: it is many, many gigabytes and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16-core shell to build geoschem. Put these in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
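The mapping from bin count to make target can be wrapped in a small rebuild helper. This is a sketch of my own (the function name <tt>tomas_target</tt> is hypothetical); the target names are exactly those listed above:<br />

```shell
# Hypothetical helper mapping a TOMAS bin count to its make target.
# Target names (tomas, tomas12, tomas15, tomas40) are from the list above.
tomas_target() {
    case "$1" in
        30) echo "tomas"   ;;
        12) echo "tomas12" ;;
        15) echo "tomas15" ;;
        40) echo "tomas40" ;;
        *)  echo "unknown bin count: $1" >&2; return 1 ;;
    esac
}

# Example clean rebuild when switching versions (per the note above):
# make realclean && make -j16 "$(tomas_target 40)"
```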
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the TOMAS versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
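These steps can be scripted. The sketch below is my own (the function name <tt>setup_run_dir</tt> is hypothetical) and assumes the naming conventions from above: ''XX''.tgz unpacks to ''run.TOMASXX'', and the compiled executable is called geostomas:<br />

```shell
# Hypothetical helper: unpack a run-directory tarball into the current
# directory and copy the geostomas executable into it.
# Naming conventions (XX.tgz -> run.TOMASXX) are from the text above.
setup_run_dir() {
    local ver="$1" tardir="$2" exe="$3"
    tar zxf "$tardir/$ver.tgz" || return 1   # creates run.TOMAS$ver
    cp "$exe" "run.TOMAS$ver/" || return 1   # drop in the executable
    echo "run.TOMAS$ver"
}
```

Usage would be e.g. <tt>setup_run_dir 40 YOUR_STANDARD_LOCATION ~/source/GC_Bleeding_Edge/GeosCore/geostomas</tt>.<br />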
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
 #$ -S /bin/bash<br />
 . /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
 #$ -cwd #run the job from the current working directory<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
 cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can extract summary information from your runs using the script at<br />
 /net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
=== A Note about Speed ===<br />
Choosing the appropriate version of TOMAS for your needs includes consideration of time and resources.<br />
Using 16 processors on glooscap at 4x5 resolution, the model time : real time ratio is roughly as follows:<br />
 version | model days per real day<br />
 40 bin  |  64<br />
 30 bin  |  82<br />
 15 bin  | 144<br />
 12 bin  | 170<br />
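For planning, these model time : real time ratios translate directly into wall-clock time. A one-line estimate (my own sketch, assuming a 365-day simulated year; the function name <tt>est_days</tt> is hypothetical):<br />

```shell
# Hypothetical helper: wall-clock days needed for a one-year simulation,
# given a model-days-per-real-day ratio like those tabulated above.
est_days() {
    awk -v r="$1" 'BEGIN { printf "%.1f", 365 / r }'
}
```

For example, <tt>est_days 64</tt> prints 5.7 and <tt>est_days 170</tt> prints 2.1, so a one-year TOMAS-40 run takes roughly three times as long as a TOMAS-12 run.<br />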
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify trading and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'' that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more steps before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct input and output files, then execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
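The post-processing scripts take a zero-padded month argument, so a small guard like this (my own sketch, not part of the scripts) catches a malformed argument before kicking off a long processing run:<br />

```shell
# Hypothetical guard: accept only zero-padded months 01-12, the
# argument format used by the post-processing scripts described above.
valid_month() {
    case "$1" in
        0[1-9]|1[0-2]) return 0 ;;
        *) echo "bad month: $1 (expected 01-12)" >&2; return 1 ;;
    esac
}
```

For example, <tt>valid_month 08</tt> succeeds while <tt>valid_month 8</tt> and <tt>valid_month 13</tt> fail.<br />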
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to the ongoing development of grid-independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_aerosol_microphysics&diff=13774TOMAS aerosol microphysics2013-08-05T20:45:05Z<p>Salvatore Farina: /* Outstanding issues */</p>
<hr />
<div>This page describes the TOMAS aerosol microphysics option in GEOS-Chem. TOMAS is one of two aerosol microphysics packages being incorporated into GEOS-Chem, the other being [[APM aerosol microphysics|APM]].<br />
<br />
== Overview ==<br />
<br />
The TwO-Moment Aerosol Sectional (TOMAS) microphysics package was developed for implementation into GEOS-Chem at Carnegie-Mellon University. Using a moving sectional and moment-based approach, TOMAS tracks two independent moments (number and mass) of the aerosol size distribution for a number of discrete size bins. It also contains codes to simulate nucleation, condensation, and coagulation processes. The aerosol species that are considered with high size resolution are sulfate, sea-salt, OC, EC, and dust. An advantage of TOMAS is the full size resolution for all chemical species and the conservation of aerosol number, the latter of which allows one to construct aerosol and CCN number budgets that will balance.<br />
<br />
=== Authors and collaborators ===<br />
* [mailto:petera@andrew.cmu.edu Peter Adams] ''(Carnegie-Mellon U.)'' -- Principal Investigator<br />
* [mailto:wtrivita@staffmail.ed.ac.uk Win Trivitayanurak] ''(Department of Highways, Thailand)''<br />
* [mailto:dwesterv@andrew.cmu.edu Dan Westervelt] ''(Carnegie-Mellon U.)''<br />
* [mailto:jeffrey.pierce@dal.ca Jeffrey Pierce] ''(Dalhousie U.)''<br />
* [mailto:sal.farina@gmail.com Salvatore Farina] ''(Colorado State U.)''<br />
<br />
Questions regarding TOMAS can be directed to Dan (e-mail linked above).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:53, 27 January 2010 (EST)<br />
<br />
=== TOMAS User Groups ===<br />
<br />
{| border=1 cellspacing=0 cellpadding=5<br />
|- bgcolor="#cccccc"<br />
!User Group<br />
!Personnel<br />
!Projects<br />
|-valign="top"<br />
|[http://www.ce.cmu.edu/%7Eadams/ Carnegie-Mellon University]<br />
|[http://www.ce.cmu.edu/%7Eadams/people.htm#peter Peter Adams]<br>[http://www.ce.cmu.edu/~dwesterv/Site/Home.html Dan Westervelt]<br />
| [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html New particle formation evaluation in GC-TOMAS] <br> Sensitivity of CCN to nucleation rates <br> Development of number tagging and source apportionment model for GC-TOMAS<br />
|-valign="top"<br />
|[http://fizz.phys.dal.ca/%7Epierce/ Dalhousie University] <br> [http://www.atmos.colostate.edu/faculty/pierce.php Colorado State]<br />
|[http://atm.dal.ca/Faculty/Jeffrey_Pierce.php Jeffrey Pierce]<br>Sal Farina<br>Stephen D'Andrea<br />
|Sensitivity of CCN to condensational growth rates <br> TOMAS parallelization <br> Others...<br />
|-valign="top"<br />
|Add yours here<br />
|<br />
|<br />
|}<br />
<br />
== TOMAS-specific setup ==<br />
TOMAS has its own run directories (run.Tomas) that can be downloaded from the Harvard FTP. The <tt>input.geos</tt> file will look slightly different from standard GEOS-Chem, and between versions.<br />
<br />
Pre-v9.02:<br />
To turn on TOMAS, see the "Microphysics menu" in <tt>input.geos</tt> and make sure TOMAS is set to '''T'''. <br />
<br />
v9.02 and later:<br />
TOMAS is enabled or disabled at compile time - the TOMAS flag in input.geos has been removed.<br />
<br />
<br />
TOMAS is a simulation type 3 and utilizes 171-423 tracers. Each aerosol species requires 30 tracers for the 30-bin size resolution, 12 for the 12-bin, etc. Here is the (abbreviated) default setup in input.geos for TOMAS-30 in v9.02 and later (see the run.Tomas directory):<br />
<br />
Tracer # Description <br />
1- 62 Std Geos Chem <br />
63 H2SO4 <br />
64- 93 Number <br />
94-123 Sulfate <br />
124-153 Sea-salt <br />
154-183 Hydrophilic EC <br />
184-213 Hydrophobic EC <br />
214-243 Hydrophilic OC <br />
244-273 Hydrophobic OC <br />
274-303 Mineral dust <br />
304-333 Aerosol water<br />
<br />
TOMAS-40 requires 423 tracers (~360 TOMAS tracers across the nine 40-bin species, plus ~62 standard GEOS-Chem tracers) <br />
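The bin count fixes the tracer total: per the table above there are 62 standard tracers, one H2SO4 tracer, and nine size-resolved species (number, sulfate, sea-salt, two EC, two OC, dust, aerosol water) at one tracer per bin. As a sketch (the helper name is my own):<br />

```shell
# Tracer count as a function of TOMAS bin count, following the table
# above: 62 standard + 1 (H2SO4) + 9 size-resolved species x bins.
tracer_count() {
    echo $(( 62 + 1 + 9 * $1 ))
}
```

<tt>tracer_count 30</tt> gives 333 (matching the table, which ends at tracer 333), <tt>tracer_count 40</tt> gives 423, and <tt>tracer_count 12</tt> gives 171, reproducing the 171-423 range quoted above.<br />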
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 18:48, 8 July 2013 (EDT)<br />
<br />
== Implementation notes ==<br />
<br />
TOMAS validation in [[GEOS-Chem v8-03-01]] was completed on 24 Feb 2010.<br />
<br />
=== Update April 2013 ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02k|v9-02k]] and approved on 07 Jun 2013.'''''<br />
<br />
Sal Farina has been working with the GEOS-Chem Support Team to inline the TOMAS aerosol microphysics code into the <tt>GeosCore</tt> directory. All TOMAS-specific sections of code are now segregated from the rest of GEOS-Chem with C-preprocessor statements such as:<br />
<br />
#if defined( TOMAS )<br />
<br />
# if defined( TOMAS40 ) <br />
... Code for 40 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS12 )<br />
... Code for 12 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS15 )<br />
... Code for 15 bin TOMAS simulation (optional) goes here ...<br />
# else<br />
... Code for 30 bin TOMAS simulation (default) goes here ...<br />
# endif<br />
<br />
#endif <br />
<br />
TOMAS is now invoked by compiling GEOS-Chem with one of the following options:<br />
<br />
make -j4 TOMAS=yes ... # Compiles GEOS-Chem for the 30 bin (default) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
or<br />
<br />
make -j4 TOMAS40=yes ... # Compiles GEOS-Chem for the 40 bin (optional) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
All files in the old <tt>GeosTomas/</tt> directory have now been deleted, as these have been rendered obsolete.<br />
<br />
These updates will be included in [[GEOS-Chem v9-02]]. These modifications will not affect the existing GEOS-Chem simulations, as all TOMAS code is not compiled into the executable unless you specify either <tt>TOMAS=yes</tt> or <tt>TOMAS40=yes</tt> at compile time.<br />
<br />
We are in the process of updating the wiki to reflect these changes as they are implemented. <br />
<br />
--[[User:Bmy|Bob Y.]] 13:59, 23 April 2013 (EDT)<br><br />
--[[User:Salvatore Farina|Salvatore Farina]] 13:49, 4 June 2013 (EDT)<br />
<br />
=== Code structure ===<br />
<br />
'''''NOTE: This will be rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which is slated for [[GEOS-Chem v9-02]].'''''<br />
<br />
The main-level <tt>Code</tt> directory has now been divided into several subdirectories:<br />
<br />
GeosCore/ GEOS-Chem "core" routines<br />
GeosTomas/ Parallel copies of GEOS-Chem routines that reference TOMAS<br />
 GeosUtil/    "Utility" modules (e.g. error_mod.f, file_mod.f, time_mod.f, etc.)<br />
Headers/ Header files (define.h, CMN_SIZE, CMN_DIAG, etc.)<br />
KPP/ KPP solver directory structure<br />
bin/ Directory where executables are placed<br />
doc/ Directory where documentation is created<br />
help/ Directory for GEOS-Chem Help Screen<br />
lib/ Directory where library files are placed<br />
mod/ Directory where module files are placed<br />
obsolete/ Directory where obsolete versions of code are archived<br />
<br />
Because there were a lot of TOMAS-related modifications in several GEOS-Chem "core" routines, the routines that need to "talk" to TOMAS were placed into a separate subdirectory named <tt>GeosTomas/</tt>. The files in <tt>GeosTomas</tt> are:<br />
<br />
 Files:<br />
 ------<br />
 Makefile, aero_drydep.f, carbon_mod.f, chemdr.f, chemistry_mod.f,<br />
 cleanup.f, diag3.f, diag_mod.f, diag_pl_mod.f, drydep_mod.f,<br />
 dust_mod.f, emissions_mod.f, gamap_mod.f, initialize.f, input_mod.f,<br />
 isoropia_mod.f, logical_mod.f, ndxx_setup.f, planeflight_mod.f,<br />
 seasalt_mod.f, sulfate_mod.f, tomas_mod.f, tomas_tpcore_mod.f90,<br />
 tpcore_mod.f, tpcore_window_mod.f, tracerid_mod.f, wetscav_mod.f,<br />
 xtra_read_mod.f<br />
 <br />
 These are GEOS-Chem routines that have been modified to reference<br />
 the TOMAS aerosol microphysics package. They are kept in a separate<br />
 GeosTomas directory so that they do not interfere with the routines<br />
 in the GeosCore directory.<br />
 <br />
 The GeosTomas directory only needs to contain the files that have<br />
 been modified for TOMAS. The Makefile will look for all other files<br />
 in the GeosCore directory using the VPATH option in GNU Make.<br />
 <br />
 NOTE to GEOS-Chem developers: When you make changes to any of these<br />
 routines in the GeosCore directory, you must also make the same<br />
 modifications to the corresponding routines in the GeosTomas<br />
 directory.<br />
 <br />
 Maybe in the near future we can work towards integrating TOMAS into<br />
 the GeosCore directory more cleanly. However, due to the large<br />
 number of modifications that were necessary for TOMAS, it was<br />
 quicker to implement the TOMAS code in a separate subdirectory.<br />
 -- Bob Y. (1/25/10)<br />
<br />
Each of these files were merged with the corresponding files in the <tt>GeosCore</tt> subdirectory. Therefore, in addition to having the GEOS-Chem modifications from [[GEOS-Chem v8-02-05|v8-02-05]], these files also have the relevant TOMAS references.<br />
<br />
A few technical considerations dictated the placing of these files into a separate <tt>GeosTomas/</tt> directory:<br />
<br />
* The ND60 diagnostic in the standard GEOS-Chem code (in <tt>GeosCore/</tt>) is now used for the CH4 offline simulation, but in TOMAS it's used for something else. <br />
* Some parameters needed to be declared differently for simulations with TOMAS. <br />
* Because not all GEOS-Chem users will choose to use TOMAS, we did not want to unnecessarily bog down the code in <tt>GeosCore/</tt> with references to TOMAS-specific routines. <br />
<br />
All of these concerns could be best solved by keeping parallel copies of the affected routines in the <tt>GeosTomas</tt> directory.<br />
<br />
--[[User:Bmy|Bob Y.]] 13:35, 25 February 2010 (EST)<br />
<br />
=== Building GEOS-Chem with TOMAS ===<br />
<br />
'''''NOTE: This will be rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which is slated for [[GEOS-Chem v9-02]].'''''<br />
<br />
The <tt>VPATH</tt> feature of [http://www.gnu.org/software/make/manual/make.html GNU Make] is used to simplify the compilation. When GEOS-Chem is compiled with the tomas target, the GNU Make utility will search for files in the <tt>GeosTomas/</tt> directory first. If it cannot find files there, it will then search the <tt>GeosCore/</tt> directory. Thus, if we make a change to a "core" GEOS-Chem routine in the <tt>GeosCore/</tt> subdirectory (say in <tt>dao_mod.f</tt> or <tt>diag49_mod.f</tt>), then those changes will automatically be applied when you build GEOS-Chem with TOMAS. Thus, we only need to keep in <tt>GeosTomas/</tt> separate copies of those files that have to "talk" with TOMAS.<br />
<br />
Several new targets were added to the <tt>Makefile</tt> in the top-level <tt>Code/</tt> directory:<br />
<br />
#=============================================================================<br />
# Targets for TOMAS aerosol microphysics code (win, bmy, 1/25/10)<br />
#=============================================================================<br />
<br />
.PHONY: tomas libtomas exetomas cleantomas<br />
<br />
tomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes all<br />
<br />
libtomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes lib<br />
<br />
exetomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes exe<br />
<br />
cleantomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes clean<br />
<br />
You can build GEOS-Chem with the TOMAS option by typing:<br />
<br />
make tomas ...<br />
<br />
This will automatically do the proper things to build the TOMAS code into GEOS-Chem, such as:<br />
<br />
* Adding a <tt>-DTOMAS</tt> C-preprocessor switch to the <tt>FFLAGS</tt> compiler flag settings in <tt>Makefile_header.mk</tt>. This will cause TOMAS-specific areas of code to be turned on.<br />
* Turning off OpenMP parallelization. For now the GEOS-Chem + TOMAS code needs to be run on a single processor. We continue to work on parallelizing the code.<br />
* Calling the Makefile in the <tt>GeosTomas/</tt> subdirectory to build the executable. The executable file is now named <tt>geostomas</tt> in order to denote that the TOMAS code is built in.<br />
<br />
The GEOS-Chem + TOMAS has been built on the following compilers<br />
<br />
* Intel Fortran compiler v10<br />
* Intel Fortran compiler v11.1 (20101201)<br />
* SunStudio 12<br />
<br />
--[[User:Bmy|Bob Y.]] 10:36, 27 January 2010 (EST)<br />
<br />
== Computational Information ==<br />
<br />
GC-TOMAS v9-02 (30 sections) on 8 processors: <br />
One year simulation = 7-8 days wall clock time<br />
<br />
More speedups are available using lower aerosol size resolution<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:00, 07 May 2013 (EST)<br />
<br />
== Microphysics Code==<br />
The aerosol microphysics code is largely contained within the file <tt>tomas_mod.f</tt>. tomas_mod and its subroutines are modular -- they use their own internal variables. For details, see tomas_mod.f and its comments. <br />
<br />
=== Nucleation ===<br />
The choice of nucleation theory is selected in the header section of <tt>tomas_mod.f</tt>. The choices are currently binary homogeneous nucleation as in Vehkamaki et al., 2002 or ternary homogeneous nucleation as in Napari et al., 2002. The ternary nucleation rate is typically scaled by a globally uniform tuning factor of 10^-4 or 10^-5. Ion-mediated nucleation (Yu, 2008) and activation nucleation (Kulmala, 2006) are options as well.<br />
<br />
In TOMAS-12 and TOMAS-30, nucleated particles follow the Kerminen approximation to grow to the smallest size bin. This has a tendency to overpredict the number of particles in the smallest bins of those models. See Y. H. Lee, J. R. Pierce, and P. J. Adams 2013 [http://www.geosci-model-dev-discuss.net/6/893/2013/gmdd-6-893-2013.html here] for more details on the consequences of this.<br />
<br />
=== Condensation ===<br />
<br />
=== Coagulation ===<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 14:08, 9 May 2011 (EST)<br />
<br />
== Validation ==<br />
<br />
GC-TOMAS [[GEOS-Chem v8-03-01|v8-03-01]] generally compares very well with observations and other models. Please see our [http://acmg.seas.harvard.edu/geos/wiki_docs/TOMAS/TOMAS_benchmark_ForHarvard.pdf GC-TOMAS v8-02-05 validation document] for more information and figures. <br />
<br />
Below are some results of benchmarking GC-TOMAS with earlier versions of the model as well as observations:<br />
<br />
[[Image:CN10_smaller.jpg]]<br />
<br />
'''Figure 1: CN10 concentrations predicted by GC-TOMAS v8-02-05 against observations. Descriptions of observational data can be found on p 5454 of Pierce et al, Atmos. Chem. Phys., 7, 2007.'''<br />
<br />
----<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:13, 10 February 2010 (EST)<br />
<br />
== Previous issues now resolved ==<br />
<br />
=== Segmentation Fault ===<br />
You may get an early segfault if your stacksize is not set to either unlimited or a very large number. To avoid this, you either have to change the value of an environmental variable (setenv command in <tt>.cshrc</tt>) or use the <tt>ulimit</tt> command. See [http://wiki.seas.harvard.edu/geos-chem/index.php/Machine_issues_%26_portability#Resetting_stacksize_for_Linux this page] for details.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:20, 10 February 2010 (EST)<br />
<br />
== Outstanding issues ==<br />
<br />
=== Offline Dust ===<br />
Currently, GEOS-Chem with TOMAS uses prescribed offline dust aerosol data in radiative transfer / photolysis calculations. Due to complications, this is turned off entirely for 2x2.5 resolution.<br />
<br />
=== Vertical Grids ===<br />
Currently, GC-TOMAS is only compatible with the reduced vertical grids:<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.3.1 GEOS3_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.4.1 GEOS4_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.5.1 GEOS5_47L]<br />
<br />
Development for the full vertical grids is ongoing.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:43, 10 February 2010 (EST)<br />
<br />
=== Compile from GeosTomas directory ===<br />
<br />
'''''NOTE: This will be rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which is slated for [[GEOS-Chem v9-02]].'''''<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:I think there is something going wrong in my compilation, although errors have come up at both compile time and run time. The worst of the problems is this: I'll make a change to any fortran file in the code (even something meaningless like print*, 'foo') and hundreds of compile errors come out with fishy error messages such as (from ifort v10.1):<br />
<br />
***fortcom: Error: chemistry_mod.f, line 478: A kind type parameter must be a compile-time constant. [DP]<br />
REAL(kind=dp) :: RCNTRL(20)<br />
<br />
:Any advice? The errors I'm having are not unique to any version of GC, any type of met fields, any compiler, etc.<br />
<br />
'''''[mailto:yantosca@seas.harvard.edu Bob Yantosca] wrote:'''''<br />
<br />
:Make sure you are always in the GeosTomas subdirectory when you build the code. Sometimes there is a problem if you build the code from a higher level directory. This may have to do with the VPATH in the makefile.<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:Thanks, that seems to do the trick.<br />
<br />
--[[User:Bmy|Bob Y.]] 14:37, 14 April 2010 (EDT)<br />
<br />
== Other features of TOMAS ==<br />
Other varieties of TOMAS are suited for specific science questions, for example nucleation studies, where explicit aerosol dynamics are needed for nanometer-sized particles. <br />
<br />
=== Set-up Guide ===<br />
<br />
This [[TOMAS setup guide]] was written for users on ACE-NET's Glooscap cluster, but may be more generally applicable.<br />
Please contact [mailto:sal.farina@gmail.com Salvatore Farina] for help in obtaining the latest development version of GEOS-Chem with TOMAS.<br />
This will allow you to take advantage of parallel computation in TOMAS.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 11:55, 26 July 2013 (EDT)<br />
<br />
=== Size Resolution ===<br />
<br />
'''TOMAS-30''': All 7 chemical species are resolved from 10 nm to 10 µm across 30 logarithmically spaced (mass-doubling) bins.<br />
<br />
'''TOMAS-40''': Same as TOMAS-30, with 10 additional (mass-doubling) sub-10 nm bins extending the lower limit to ~1 nm.<br />
<br />
'''TOMAS-12''': All 7 chemical species are resolved from 10 nm to 1 µm across 10 logarithmically spaced (mass-quadrupling) bins, plus two supermicron bins. Coarser resolution than TOMAS-30, with improved computation time.<br />
<br />
'''TOMAS-15''': Same as TOMAS-12, with 3 additional (mass-quadrupling) sub-10 nm bins extending the lower limit to ~2 nm. Analogous to TOMAS-40, with improved computation time.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 12:51, 4 June 2013 (EDT)<br />
<br />
=== Nesting and grid size ===<br />
TOMAS is implemented on a nested 2x2.5 North American domain, developed by Jeffrey Pierce (jeffrey.pierce@dal.ca).<br />
<br />
=== AOD, CCN post-processing code ===<br />
Code is available for calculating aerosol optical depth from TOMAS-predicted aerosol composition and size using Mie theory, and for calculating CCN concentrations from TOMAS size-resolved composition using Köhler theory. Developed by Yunha Lee and Jeffrey Pierce; adapted for GEOS-Chem output by Jeffrey Pierce.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 2:00, 9 May 2011 (EST)<br />
<br />
== Reference ==<br />
'''Nucleation in GEOS-Chem'''<br />
Westervelt, D. M., Pierce, J. R., Riipinen, I., Trivitayanurak, W., Hamed, A., Kulmala, M., Laaksonen, A., Decesari, S., and Adams, P. J.: Formation and growth of nucleated particles into cloud condensation nuclei: model-measurement comparison, Atmos. Chem. Phys. Discuss., 13, 8333-8386, doi:10.5194/acpd-13-8333-2013, 2013. [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html LINK]<br />
<br />
'''TOMAS implementation in GEOS-Chem:''' <br />
Trivitayanurak, W., Adams, P. J., Spracklen, D. V. and Carslaw, K. S.: Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison, Atmospheric Chemistry and Physics, 8(12), 3149-3168, 2008.<br />
<br />
'''TOMAS initial paper, sulfate only:''' <br />
Adams, P. J. and Seinfeld, J. H.: Predicting global aerosol size distributions in general circulation models, J. Geophys. Res.-Atmos., 107(D19), 4370, doi:10.1029/2001JD001010, 2002.<br />
<br />
'''TOMAS with sea-salt:'''<br />
Pierce, J.R., and Adams P.J., Global evaluation of CCN formation by direct emission of sea salt and growth of ultrafine sea salt, Journal of Geophysical Research-Atmospheres, 111 (D6), doi:10.1029/2005JD006186, 2006.<br />
<br />
'''TOMAS with carbonaceous aerosol:''' <br />
Pierce, J. R., Chen, K. and Adams, P. J.: Contribution of primary carbonaceous aerosol to cloud condensation nuclei: processes and uncertainties evaluated with a global aerosol microphysics model, Atmos. Chem. Phys., 7(20), 5447-5466, doi:10.5194/acp-7-5447-2007, 2007.<br />
<br />
'''TOMAS with dust:''' <br />
Lee, Y.H., K. Chen, and P.J. Adams, 2009: Development of a global model of mineral dust aerosol microphysics. Atmos. Chem. Phys., 9, 2441-2458, doi:10.5194/acp-9-2441-2009.</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=13773TOMAS setup guide2013-08-05T20:43:05Z<p>Salvatore Farina: /* Getting Set Up */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science, and it cannot take advantage of parallel computing technologies.<br />
The 'bleeding edge' code, however, includes these recent GEOS-Chem/TOMAS developments, along with support for parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Compiler ===<br />
GEOS-Chem works best with the Intel Fortran compiler (ifort), v11.1.<br />
There is an instance of the compiler installed on glooscap, which you can load by doing<br />
module load intel/11.1.073<br />
<br />
Alternatively, I have installed ifort version 11.1.080. To use this version, add the following to your .bashrc<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
export FC="ifort"<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or (safer), you can grab my latest snapshot:<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following:<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', reload your environment and check the compiler version:<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
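Before compiling, it can save time to check all the pieces at once. The sketch below is my own helper (not part of GEOS-Chem); it assumes the variable names from the .bashrc section above and fails loudly if the compiler or a library variable is missing:<br />

```shell
# Quick pre-build sanity check. Assumes GC_BIN/GC_INCLUDE/GC_LIB are the
# variable names exported in your .bashrc, as in the section above.
check_env() {
    command -v ifort >/dev/null 2>&1 || { echo "missing: ifort" >&2; return 1; }
    for v in GC_BIN GC_INCLUDE GC_LIB; do
        if [ -z "${!v:-}" ]; then
            echo "unset: $v" >&2
            return 1
        fi
    done
    echo "environment looks ok"
}
```

Run ''check_env'' in a fresh shell; anything it reports missing points back at the .bashrc section above.<br />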
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
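If you want to confirm the link is usable without touching the data itself, a small check like this will do (''check_data_link'' is a hypothetical helper of my own, not part of any distribution):<br />

```shell
# Confirm a data symlink resolves to a readable directory, without copying
# anything. Usage: check_data_link ~/data
check_data_link() {
    local link="$1"
    if [ -L "$link" ] && [ -d "$link/" ]; then
        echo "ok: $link -> $(readlink "$link")"
    else
        echo "broken or missing: $link" >&2
        return 1
    fi
}
```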
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
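One way to make this automatic is to wrap the clean and the build in a small helper; ''build_tomas'' below is purely illustrative and not part of the GEOS-Chem makefiles:<br />

```shell
# Always rebuild from scratch when switching TOMAS variants.
# Valid targets are the four listed above.
build_tomas() {
    case "$1" in
        tomas|tomas12|tomas15|tomas40)
            make realclean && make -j16 "$1"
            ;;
        *)
            echo "usage: build_tomas {tomas|tomas12|tomas15|tomas40}" >&2
            return 1
            ;;
    esac
}
```

For example, ''build_tomas tomas40'' cleans and then builds the 40-bin version in one step.<br />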
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40, depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
 #$ -S /bin/bash<br />
 . /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
 cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
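Those two edits can also be scripted. The sketch below is my own illustration (''customize_job'' is not part of the run directory); it assumes the ''#$ -N'' and ''cd'' lines look as shown in the script above:<br />

```shell
# Print a copy of parallel.sh with the run name and run directory filled in.
# Redirect the output to a new file before submitting it with qsub.
customize_job() {
    local template="$1" name="$2" rundir="$3"
    sed -e "s|^#\$ -N .*|#\$ -N $name|" \
        -e "s|^cd .*|cd $rundir|" "$template"
}
```

For example: ''customize_job parallel.sh myrun /path/to/run.TOMAS40 > parallel_myrun.sh''<br />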
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can get summary information from your runs using the script at:<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more steps before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to a standard netCDF file.<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
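To process a whole year rather than a single month, you can loop over the two-digit month strings; ''run_all_months'' below is an illustrative wrapper of my own, assuming the scripts take the month argument as shown:<br />

```shell
# Run a per-month postprocessing command for all twelve months,
# e.g.: run_all_months ./averageCNCCN_15.py
run_all_months() {
    local cmd="$1" month
    for month in $(seq -w 1 12); do
        "$cmd" "$month"
    done
}
```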
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F. This is a hard-to-track bug related to ongoing development of grid-independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' either myself, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* Intel Ifort Fortran compiler - v11.1 - required to build geoschem<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
Please follow the directions on the [[Installing libraries for GEOS-Chem]] wiki page before proceeding. You will need to install the netCDF-4.2 libraries.<br />
<br />
=== Environment ===<br />
After installing the libraries, your .bashrc should include a section similar to the following:<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', reload your environment and check the compiler version:<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling.<br />
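Before compiling, you can also sanity-check that the variables from the section above actually made it into your environment. This is an optional sketch; the ''check_gc_env'' name is my own invention, not part of GEOS-Chem:<br />

```shell
# Report any required GEOS-Chem build variables that are still unset.
# (Hypothetical helper; checks only the variables defined above.)
check_gc_env() {
    missing=0
    for v in GC_BIN GC_INCLUDE GC_LIB FC; do
        eval "val=\${$v:-}"
        if [ -z "$val" ]; then
            echo "WARNING: $v is not set"
            missing=1
        fi
    done
    return $missing
}

check_gc_env || echo "Fix your .bashrc before compiling."
```

If everything is set, the function prints nothing and returns success.<br />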
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compiler ===<br />
Please note that the '''ONLY VERSION''' of the Intel compiler that reliably compiles a working GEOS-Chem/TOMAS executable is version 11.1.<br />
Installation is described above in the libraries section.<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells for heavy processing. I invoke a 16-core shell to build geoschem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
 tomas <-- TOMAS 30 (default)<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
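To make the clean rebuild habitual, you can wrap both steps in a small shell function for your .bashrc. This is just a convenience sketch; the ''tomas_rebuild'' name is my own:<br />

```shell
# Clean and rebuild GEOS-Chem/TOMAS for one of the documented targets.
# (Hypothetical wrapper around the Makefile targets listed above.)
tomas_rebuild() {
    case "$1" in
        tomas|tomas12|tomas15|tomas40) ;;
        *)
            echo "usage: tomas_rebuild {tomas|tomas12|tomas15|tomas40}" >&2
            return 1
            ;;
    esac
    make realclean && make -j16 "$1"
}
```

Run it from GeosCore, e.g. ''tomas_rebuild tomas40''.<br />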
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40, depending on the version you would like to run.<br />
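If you extract run directories often, a tiny helper can save typing. A sketch, assuming you set YOUR_STANDARD_LOCATION to wherever you copied the tarballs (both names here are placeholders of my own):<br />

```shell
# Unpack the run directory tarball for a given TOMAS version (12/15/30/40).
# YOUR_STANDARD_LOCATION must point at the directory holding the .tgz files.
get_rundir() {
    tar zxvf "${YOUR_STANDARD_LOCATION:?set this to your tarball directory}/$1.tgz"
}
```

For example, ''get_rundir 40'' unpacks run.TOMAS40 into the current directory.<br />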
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
 #$ -S /bin/bash<br />
 . /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
 cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can get summary information from your runs using this script:<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. To use git on glooscap:<br />
module load git<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have your source code directory, make a separate branch for yourself before making any changes. This will simplify sharing and tracking updates/advances/bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in routine called ''DEBUGPRINT'' that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
 export IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
 export IDL_DIR="/usr/local/itt/idl/idl80/"<br />
 export IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
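To split a full year rather than a single month, you can generate the Bpch_Sep_Sal command for each month and pipe it into IDL from the shell. A sketch, assuming a 2005 ctm.bpch and the IDL setup above (''split_month'' is my own name):<br />

```shell
# Print the IDL command that splits month MM of 2005 out of ctm.bpch.
split_month() {
    printf "Bpch_Sep_Sal,'ctm.bpch','ctm.%s.bpch',Tau0=nymd2tau(2005%s01)\n" \
        "$1" "$1"
}

# Feed each month's command to IDL (skipped if idl is not on your PATH).
for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
    if command -v idl >/dev/null 2>&1; then
        split_month "$m" | idl
    fi
done
```
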
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct infile/outfile names, then execute it from your shell:<br />
 idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have zonal and surface-level maps of CN3, CN10, CN40, and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F. This is a hard-to-track bug related to ongoing development of grid-independent geoschem.<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* If you are trying to run geoschem ''outside'' of a qrsh (grid engine) environment (i.e. on the head node), you will need to add '''ulimit -S -s unlimited''' to your .bashrc<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I can usually respond to emails within a day, and am willing to use gchat or Skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_aerosol_microphysics&diff=13740TOMAS aerosol microphysics2013-07-26T15:59:29Z<p>Salvatore Farina: </p>
<hr />
<div>This page describes the TOMAS aerosol microphysics option in GEOS-Chem. TOMAS is one of two aerosol microphysics packages being incorporated into GEOS-Chem, the other being [[APM aerosol microphysics|APM]].<br />
<br />
== Overview ==<br />
<br />
The TwO-Moment Aerosol Sectional (TOMAS) microphysics package was developed for implementation into GEOS-Chem at Carnegie-Mellon University. Using a moving sectional and moment-based approach, TOMAS tracks two independent moments (number and mass) of the aerosol size distribution for a number of discrete size bins. It also contains code to simulate nucleation, condensation, and coagulation processes. The aerosol species that are considered with high size resolution are sulfate, sea-salt, OC, EC, and dust. An advantage of TOMAS is the full size resolution for all chemical species and the conservation of aerosol number, the latter of which allows one to construct aerosol and CCN number budgets that will balance.<br />
<br />
=== Authors and collaborators ===<br />
* [mailto:petera@andrew.cmu.edu Peter Adams] ''(Carnegie-Mellon U.)'' -- Principal Investigator<br />
* [mailto:wtrivita@staffmail.ed.ac.uk Win Trivitayanurak] ''(Department of Highways, Thailand)''<br />
* [mailto:dwesterv@andrew.cmu.edu Dan Westervelt] ''(Carnegie-Mellon U.)''<br />
* [mailto:jeffrey.pierce@dal.ca Jeffrey Pierce] ''(Dalhousie U.)''<br />
* [mailto:sal.farina@gmail.com Salvatore Farina] ''(Colorado State U.)''<br />
<br />
Questions regarding TOMAS can be directed to Dan (e-mail linked above).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:53, 27 January 2010 (EST)<br />
<br />
=== TOMAS User Groups ===<br />
<br />
{| border=1 cellspacing=0 cellpadding=5<br />
|- bgcolor="#cccccc"<br />
!User Group<br />
!Personnel<br />
!Projects<br />
|-valign="top"<br />
|[http://www.ce.cmu.edu/%7Eadams/ Carnegie-Mellon University]<br />
|[http://www.ce.cmu.edu/%7Eadams/people.htm#peter Peter Adams]<br>[http://www.ce.cmu.edu/~dwesterv/Site/Home.html Dan Westervelt]<br />
| [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html New particle formation evaluation in GC-TOMAS] <br> Sensitivity of CCN to nucleation rates <br> Development of number tagging and source apportionment model for GC-TOMAS<br />
|-valign="top"<br />
|[http://fizz.phys.dal.ca/%7Epierce/ Dalhousie University] <br> [http://www.atmos.colostate.edu/faculty/pierce.php Colorado State]<br />
|[http://atm.dal.ca/Faculty/Jeffrey_Pierce.php Jeffrey Pierce]<br>Sal Farina<br>Stephen D'Andrea<br />
|Sensitivity of CCN to condensational growth rates <br> TOMAS parallelization <br> Others...<br />
|-valign="top"<br />
|Add yours here<br />
|<br />
|<br />
|}<br />
<br />
== TOMAS-specific setup ==<br />
TOMAS has its own run directories (run.Tomas) that can be downloaded from the Harvard FTP. The <tt>input.geos</tt> file differs slightly from that of standard GEOS-Chem, and also between versions.<br />
<br />
Pre- v9.02:<br />
To turn on TOMAS, see the "Microphysics menu" in <tt>input.geos</tt> and make sure TOMAS is set to '''T'''. <br />
<br />
v9.02 and later:<br />
TOMAS is enabled or disabled at compile time - the TOMAS flag in input.geos has been removed.<br />
<br />
<br />
TOMAS is a simulation type 3 and uses 171-423 tracers. Each aerosol species requires 30 tracers at the 30-bin size resolution, 12 at the 12-bin resolution, etc. Here is the (abbreviated) default setup in input.geos for TOMAS-30 in v9.02 and later (see the run.Tomas directory):<br />
<br />
Tracer # Description <br />
1- 62 Std Geos Chem <br />
63 H2SO4 <br />
64- 93 Number <br />
94-123 Sulfate <br />
124-153 Sea-salt <br />
154-183 Hydrophilic EC <br />
184-213 Hydrophobic EC <br />
214-243 Hydrophilic OC <br />
244-273 Hydrophobic OC <br />
274-303 Mineral dust <br />
304-333 Aerosol water<br />
<br />
TOMAS-40 requires 423 tracers (40 bins &times; 9 size-resolved species = 360 TOMAS tracers, plus H2SO4 and the ~62 standard GEOS-Chem tracers). <br />
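The tracer numbering in the table above is regular: after tracer 63 (H2SO4), each size-resolved species occupies a contiguous block of 30 tracers. A quick shell sketch of the arithmetic (the function name and the 0-based species index are my own convention, not GEOS-Chem's):<br />

```shell
# Tracer number for TOMAS-30, per the table above.
# Species index: 0=Number, 1=Sulfate, 2=Sea-salt, 3=EC(phil), 4=EC(phob),
#                5=OC(phil), 6=OC(phob), 7=Dust, 8=Aerosol water; bin: 1-30.
tomas30_tracer() {
    echo $(( 63 + $1 * 30 + $2 ))
}
```

For example, Sulfate bin 30 is tracer 123 and Aerosol water bin 30 is tracer 333, matching the table.<br />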
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 18:48, 8 July 2013 (EDT)<br />
<br />
== Implementation notes ==<br />
<br />
TOMAS validation in [[GEOS-Chem v8-03-01]] was completed on 24 Feb 2010.<br />
<br />
=== Update April 2013 ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02k|v9-02k]] and approved on 07 Jun 2013.'''''<br />
<br />
Sal Farina has been working with the GEOS-Chem Support Team to inline the TOMAS aerosol microphysics code into the <tt>GeosCore</tt> directory. All TOMAS-specific sections of code are now segregated from the rest of GEOS-Chem with C-preprocessor statements such as:<br />
<br />
#if defined( TOMAS )<br />
<br />
# if defined( TOMAS40 ) <br />
... Code for 40 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS12 )<br />
... Code for 12 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS15 )<br />
... Code for 15 bin TOMAS simulation (optional) goes here ...<br />
# else<br />
... Code for 30 bin TOMAS simulation (default) goes here ...<br />
# endif<br />
<br />
#endif <br />
<br />
TOMAS is now invoked by compiling GEOS-Chem with one of the following options:<br />
<br />
make -j4 TOMAS=yes ... # Compiles GEOS-Chem for the 30 bin (default) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
or<br />
<br />
make -j4 TOMAS40=yes ... # Compiles GEOS-Chem for the 40 bin (optional) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
All files in the old <tt>GeosTomas/</tt> directory have now been deleted, as these have been rendered obsolete.<br />
<br />
These updates will be included in [[GEOS-Chem v9-02]]. These modifications will not affect existing GEOS-Chem simulations, because TOMAS code is not compiled into the executable unless you specify either <tt>TOMAS=yes</tt> or <tt>TOMAS40=yes</tt> at compile time.<br />
<br />
We are in the process of updating the wiki to reflect these changes as they are implemented. <br />
<br />
--[[User:Bmy|Bob Y.]] 13:59, 23 April 2013 (EDT)<br><br />
--[[User:Salvatore Farina|Salvatore Farina]] 13:49, 4 June 2013 (EDT)<br />
<br />
=== Code structure ===<br />
<br />
'''''NOTE: This will be rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which is slated for [[GEOS-Chem v9-02]].'''''<br />
<br />
The main-level <tt>Code</tt> directory has now been divided into several subdirectories:<br />
<br />
GeosCore/ GEOS-Chem "core" routines<br />
GeosTomas/ Parallel copies of GEOS-Chem routines that reference TOMAS<br />
GeosUtil/ "Utility" modules (e.g. error_mod.f, file_mod.f, time_mod.f, etc.)<br />
Headers/ Header files (define.h, CMN_SIZE, CMN_DIAG, etc.)<br />
KPP/ KPP solver directory structure<br />
bin/ Directory where executables are placed<br />
doc/ Directory where documentation is created<br />
help/ Directory for GEOS-Chem Help Screen<br />
lib/ Directory where library files are placed<br />
mod/ Directory where module files are placed<br />
obsolete/ Directory where obsolete versions of code are archived<br />
<br />
Because there were a lot of TOMAS-related modifications in several GEOS-Chem "core" routines, the routines that need to "talk" to TOMAS were placed into a separate subdirectory named <tt>GeosTomas/</tt>. The files in <tt>GeosTomas</tt> are:<br />
<br />
Files:<br />
------<br />
Makefile -- GEOS-Chem routines that have been<br />
aero_drydep.f modified to reference the TOMAS aerosol<br />
carbon_mod.f microphysics package. These are kept<br />
chemdr.f in a separate GeosTomas directory so that<br />
chemistry_mod.f they do not interfere with the routines<br />
cleanup.f in the GeosCore directory.<br />
diag3.f<br />
diag_mod.f The GeosTomas directory only needs to<br />
diag_pl_mod.f contain the files that have been modified<br />
drydep_mod.f for TOMAS. The Makefile will look for<br />
dust_mod.f all other files from the GeosCore directory<br />
emissions_mod.f using the VPATH option in GNU Make.<br />
gamap_mod.f<br />
initialize.f NOTE to GEOS-Chem developers: When you<br />
input_mod.f make changes to any of these routines<br />
isoropia_mod.f in the GeosCore directory, you must also<br />
logical_mod.f make the same modifications to the<br />
ndxx_setup.f corresponding routines in the GeosTomas<br />
planeflight_mod.f directory.<br />
seasalt_mod.f<br />
sulfate_mod.f Maybe in the near future we can work<br />
tomas_mod.f towards integrating TOMAS into the GeosCore<br />
tomas_tpcore_mod.f90 directory more cleanly. However, due to<br />
tpcore_mod.f the large number of modifications that were<br />
tpcore_window_mod.f necessary for TOMAS, it was quicker to<br />
tracerid_mod.f implement the TOMAS code in a separate<br />
wetscav_mod.f subdirectory. <br />
xtra_read_mod.f -- Bob Y. (1/25/10)<br />
<br />
Each of these files was merged with the corresponding file in the <tt>GeosCore</tt> subdirectory. Therefore, in addition to having the GEOS-Chem modifications from [[GEOS-Chem v8-02-05|v8-02-05]], these files also contain the relevant TOMAS references.<br />
<br />
A few technical considerations dictated the placing of these files into a separate <tt>GeosTomas/</tt> directory:<br />
<br />
* The ND60 diagnostic in the standard GEOS-Chem code (in <tt>GeosCore/</tt>) is now used for the CH4 offline simulation, but in TOMAS it's used for something else. <br />
* Some parameters needed to be declared differently for simulations with TOMAS. <br />
* Because not all GEOS-Chem users will choose to use TOMAS, we did not want to unnecessarily bog down the code in <tt>GeosCore/</tt> with references to TOMAS-specific routines. <br />
<br />
All of these concerns were best addressed by keeping parallel copies of the affected routines in the <tt>GeosTomas</tt> directory.<br />
<br />
--[[User:Bmy|Bob Y.]] 13:35, 25 February 2010 (EST)<br />
<br />
=== Building GEOS-Chem with TOMAS ===<br />
<br />
'''''NOTE: This will be rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which is slated for [[GEOS-Chem v9-02]].'''''<br />
<br />
The <tt>VPATH</tt> feature of [http://www.gnu.org/software/make/manual/make.html GNU Make] is used to simplify compilation. When GEOS-Chem is compiled with the tomas target, GNU Make searches for files in the <tt>GeosTomas/</tt> directory first; if it cannot find a file there, it then searches the <tt>GeosCore/</tt> directory. Thus, if we make a change to a "core" GEOS-Chem routine in the <tt>GeosCore/</tt> subdirectory (say in <tt>dao_mod.f</tt> or <tt>diag49_mod.f</tt>), that change is automatically picked up when you build GEOS-Chem with TOMAS, and only the files that have to "talk" with TOMAS need separate copies in <tt>GeosTomas/</tt>.<br />
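The lookup order can be demonstrated in miniature (a hedged sketch; the files below are empty stand-ins, not real GEOS-Chem sources):<br />

```shell
# Minimal demonstration of GNU Make's VPATH search order
mkdir -p demo/GeosCore demo/GeosTomas
touch demo/GeosCore/dao_mod.f        # "core" routine with no TOMAS copy
touch demo/GeosTomas/wetscav_mod.f   # TOMAS-modified copy lives here
printf 'VPATH := ../GeosCore\nall: wetscav_mod.f dao_mod.f\n\t@echo "resolved: $^"\n' \
    > demo/GeosTomas/Makefile
make -C demo/GeosTomas
# "resolved:" shows wetscav_mod.f found locally and dao_mod.f via ../GeosCore
```

Files present in the build directory shadow their counterparts; everything else falls through to the <tt>VPATH</tt> directory, just as <tt>GeosTomas/</tt> shadows <tt>GeosCore/</tt>.<br />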
<br />
Several new targets were added to the <tt>Makefile</tt> in the top-level <tt>Code/</tt> directory:<br />
<br />
#=============================================================================<br />
# Targets for TOMAS aerosol microphysics code (win, bmy, 1/25/10)<br />
#=============================================================================<br />
<br />
.PHONY: tomas libtomas exetomas cleantomas<br />
<br />
tomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes all<br />
<br />
libtomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes lib<br />
<br />
exetomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes exe<br />
<br />
cleantomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes clean<br />
<br />
You can build GEOS-Chem with the TOMAS option by typing:<br />
<br />
make tomas ...<br />
<br />
This will automatically do the proper things to build the TOMAS code into GEOS-Chem, such as:<br />
<br />
* Adding a <tt>-DTOMAS</tt> C-preprocessor switch to the <tt>FFLAGS</tt> compiler flag settings in <tt>Makefile_header.mk</tt>. This will cause TOMAS-specific areas of code to be turned on.<br />
* Turning off OpenMP parallelization. For now the GEOS-Chem + TOMAS code needs to be run on a single processor. We continue to work on parallelizing the code.<br />
* Calling the Makefile in the <tt>GeosTomas/</tt> subdirectory to build the executable. The executable file is now named <tt>geostomas</tt> in order to denote that the TOMAS code is built in.<br />
<br />
GEOS-Chem + TOMAS has been built with the following compilers:<br />
<br />
* Intel Fortran compiler v10<br />
* Intel Fortran compiler v11.1 (20101201)<br />
* SunStudio 12<br />
<br />
--[[User:Bmy|Bob Y.]] 10:36, 27 January 2010 (EST)<br />
<br />
== Computational Information ==<br />
<br />
GC-TOMAS v9-02 (30 sections) on 8 processors: <br />
One year simulation = 7-8 days wall clock time<br />
<br />
Further speedup is available by running at lower aerosol size resolution.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:00, 07 May 2013 (EST)<br />
<br />
== Microphysics Code==<br />
The aerosol microphysics code is largely contained within the file <tt>tomas_mod.f</tt>. <tt>tomas_mod.f</tt> and its subroutines are modular -- they maintain all of their own internal variables. For details, see <tt>tomas_mod.f</tt> and its comments. <br />
<br />
=== Nucleation ===<br />
The choice of nucleation theory is selected in the header section of <tt>tomas_mod.f</tt>. The available options are binary homogeneous nucleation (Vehkamaki et al., 2002), ternary homogeneous nucleation (Napari et al., 2002), ion-mediated nucleation (Yu, 2008), and activation nucleation (Kulmala, 2006). The ternary nucleation rate is typically scaled by a globally uniform tuning factor of 10^-4 or 10^-5.<br />
<br />
In TOMAS-12 and TOMAS-30, nucleated particles follow the Kerminen approximation to grow to the smallest size bin. This has a tendency to overpredict the number of particles in the smallest bins of those models. See Y. H. Lee, J. R. Pierce, and P. J. Adams 2013 [http://www.geosci-model-dev-discuss.net/6/893/2013/gmdd-6-893-2013.html here] for more details on the consequences of this.<br />
<br />
=== Condensation ===<br />
<br />
=== Coagulation ===<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 14:08, 9 May 2011 (EST)<br />
<br />
== Validation ==<br />
<br />
GC-TOMAS [[GEOS-Chem v8-03-01|v8-03-01]] generally compares very well with observations and other models. Please see our [http://acmg.seas.harvard.edu/geos/wiki_docs/TOMAS/TOMAS_benchmark_ForHarvard.pdf GC-TOMAS v8-02-05 validation document] for more information and figures. <br />
<br />
Below are some results of benchmarking GC-TOMAS with earlier versions of the model as well as observations:<br />
<br />
[[Image:CN10_smaller.jpg]]<br />
<br />
'''Figure 1: CN10 concentrations predicted by GC-TOMAS v8-02-05 against observations. Descriptions of observational data can be found on p 5454 of Pierce et al, Atmos. Chem. Phys., 7, 2007.'''<br />
<br />
----<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:13, 10 February 2010 (EST)<br />
<br />
== Previous issues now resolved ==<br />
<br />
=== Segmentation Fault ===<br />
You may get an early segfault if your stacksize is not set to either unlimited or a very large number. To avoid this, you either have to change the value of an environmental variable (setenv command in <tt>.cshrc</tt>) or use the <tt>ulimit</tt> command. See [http://wiki.seas.harvard.edu/geos-chem/index.php/Machine_issues_%26_portability#Resetting_stacksize_for_Linux this page] for details.<br />
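A bash sketch of the fix (the fallback is an assumption for systems that cap the hard limit; csh users would put ''limit stacksize unlimited'' in ''.cshrc'' instead):<br />

```shell
# Raise the soft stack limit before running GEOS-Chem (bash syntax);
# if "unlimited" is refused, fall back to the hard limit
ulimit -S -s unlimited 2>/dev/null || ulimit -S -s "$(ulimit -H -s)"
ulimit -s    # show the soft stack limit now in effect
```

Putting these lines in your ''.bashrc'' (or the equivalent in ''.cshrc'') applies the fix to every new shell, including batch jobs.<br />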
<br />
--[[User:Dan Westervelt|Dan W.]] 20:20, 10 February 2010 (EST)<br />
<br />
== Outstanding issues ==<br />
<br />
=== Vertical Grids ===<br />
Currently, GC-TOMAS is only compatible with the reduced vertical grids:<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.3.1 GEOS3_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.4.1 GEOS4_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.5.1 GEOS5_47L]<br />
<br />
Development for the full vertical grids is ongoing.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:43, 10 February 2010 (EST)<br />
<br />
=== Compile from GeosTomas directory ===<br />
<br />
'''''NOTE: This will be rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which is slated for [[GEOS-Chem v9-02]].'''''<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:I think there is something going wrong in my compilation, although errors have come up at both compile time and run time. The worst of the problems is this: I'll make a change to any fortran file in the code (even something meaningless like print*, 'foo') and hundreds of compile errors come out with fishy error messages such as (from ifort v10.1):<br />
<br />
***fortcom: Error: chemistry_mod.f, line 478: A kind type parameter must be a compile-time constant. [DP]<br />
REAL(kind=dp) :: RCNTRL(20)<br />
<br />
:Any advice? The errors I'm having are not unique to any version of GC, any type of met fields, any compiler, etc.<br />
<br />
'''''[mailto:yantosca@seas.harvard.edu Bob Yantosca] wrote:'''''<br />
<br />
:Make sure you are always in the GeosTomas subdirectory when you build the code. Sometimes there is a problem if you build the code from a higher level directory. This may have to do with the VPATH in the makefile.<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:Thanks, that seems to do the trick.<br />
<br />
--[[User:Bmy|Bob Y.]] 14:37, 14 April 2010 (EDT)<br />
<br />
== Other features of TOMAS ==<br />
Other varieties of TOMAS are suited to specific science questions, for example nucleation studies, where explicit aerosol dynamics are needed for nanometer-sized particles. <br />
<br />
=== Set-up Guide ===<br />
<br />
This [[TOMAS setup guide]] was written for users on ACE-NET's Glooscap cluster, but may be more generally applicable.<br />
Please contact [mailto:sal.farina@gmail.com Salvatore Farina] for help in obtaining the latest development version of GEOS-Chem with TOMAS.<br />
This will allow you to take advantage of parallel computation in TOMAS.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 11:55, 26 July 2013 (EDT)<br />
<br />
=== Size Resolution ===<br />
<br />
'''TOMAS-30''': All 7 chemical species have size resolution ranging from 10 nm to 10 µm, spanned by 30 logarithmically spaced (mass doubling) bins.<br />
<br />
'''TOMAS-40''': Same as TOMAS-30, with 10 additional (mass doubling) sub-10 nm bins extending the lower limit to ~1 nm.<br />
<br />
'''TOMAS-12''': All 7 chemical species have size resolution ranging from 10 nm to 1 µm, spanned by 10 logarithmically spaced (mass quadrupling) bins plus two supermicron bins. Coarser resolution than TOMAS-30, with improved computation time. <br />
<br />
'''TOMAS-15''': Same as TOMAS-12, with 3 additional (mass quadrupling) sub-10 nm bins extending the lower limit to ~2 nm. Analogous to TOMAS-40, with improved computation time.<br />
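The bin spacing follows from the mass-doubling rule: doubling particle mass multiplies diameter by 2^(1/3), so 30 doublings starting at 10 nm reach 10 x 2^10 nm = 10240 nm, roughly 10 µm. A quick arithmetic check:<br />

```shell
# Mass-doubling bins: diameter grows by a factor of 2^(1/3) per bin,
# so 30 bins span 10 nm to 10*2^10 nm = 10240 nm (about 10 um)
awk 'BEGIN {
    d = 10.0
    for (i = 0; i < 30; i++) d *= 2^(1/3)
    printf "upper edge after 30 mass doublings: %.0f nm\n", d
}'
```

The same arithmetic confirms TOMAS-12: ten mass-quadrupling bins from 10 nm reach 10 x 4^(10/3) ≈ 1016 nm, i.e. about 1 µm.<br />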
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 12:51, 4 June 2013 (EDT)<br />
<br />
=== Nesting and grid size ===<br />
A nested version of TOMAS is implemented on a 2 x 2.5 North American domain, developed by Jeffrey Pierce (jeffrey.pierce@dal.ca).<br />
<br />
=== AOD, CCN post-processing code ===<br />
Code is available for calculating aerosol optical depth from TOMAS-predicted aerosol composition and size using Mie theory, and CCN concentrations from TOMAS size-resolved composition using Kohler theory. Developed by Yunha Lee and Jeffrey Pierce; adapted for GEOS-Chem output by Jeffrey Pierce.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 2:00, 9 May 2011 (EST)<br />
<br />
== Reference ==<br />
'''Nucleation in GEOS-Chem'''<br />
Westervelt, D. M., Pierce, J. R., Riipinen, I., Trivitayanurak, W., Hamed, A., Kulmala, M., Laaksonen, A., Decesari, S., and Adams, P. J.: Formation and growth of nucleated particles into cloud condensation nuclei: model-measurement comparison, Atmos. Chem. Phys. Discuss., 13, 8333-8386, doi:10.5194/acpd-13-8333-2013, 2013. [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html LINK]<br />
<br />
'''TOMAS implementation in GEOS-Chem:''' <br />
Trivitayanurak, W., Adams, P. J., Spracklen, D. V. and Carslaw, K. S.: Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison, Atmospheric Chemistry and Physics, 8(12), 3149-3168, 2008.<br />
<br />
'''TOMAS initial paper, sulfate only:''' <br />
Adams, P. J. and Seinfeld, J. H.: Predicting global aerosol size distributions in general circulation models, J. Geophys. Res.-Atmos., 107(D19), 4370, doi:10.1029/2001JD001010, 2002.<br />
<br />
'''TOMAS with sea-salt:'''<br />
Pierce, J.R., and Adams P.J., Global evaluation of CCN formation by direct emission of sea salt and growth of ultrafine sea salt, Journal of Geophysical Research-Atmospheres, 111 (D6), doi:10.1029/2005JD006186, 2006.<br />
<br />
'''TOMAS with carbonaceous aerosol:''' <br />
Pierce, J. R., Chen, K. and Adams, P. J.: Contribution of primary carbonaceous aerosol to cloud condensation nuclei: processes and uncertainties evaluated with a global aerosol microphysics model, Atmos. Chem. Phys., 7(20), 5447-5466, doi:10.5194/acp-7-5447-2007, 2007.<br />
<br />
'''TOMAS with dust:''' <br />
Lee, Y.H., K. Chen, and P.J. Adams, 2009: Development of a global model of mineral dust aerosol microphysics. Atmos. Chem. Phys., 9, 2441-2458, doi:10.5194/acp-9-2441-2009.</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=13739TOMAS setup guide2013-07-26T15:57:55Z<p>Salvatore Farina: </p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with [[TOMAS aerosol microphysics]] on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many recent developments in aerosol science, and it cannot take advantage of parallel computing technologies.<br />
The 'bleeding edge' code includes these GEOS-Chem/TOMAS developments, along with support for parallel computation.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or (safer) you can grab my latest "snapshot":<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* Intel Ifort Fortran compiler - v11.1 - required to build geoschem<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
You can copy this folder as a tarball from /home/sfarina/gclibs.tgz or simply extract it directly to your home directory:<br />
cd ~<br />
tar zxvf /home/sfarina/gclibs.tgz<br />
<br />
This will extract the libraries folder to your home directory.<br />
<br />
=== Environment ===<br />
In order to get the compiler to run and recognize the libraries described above, some environment variables must be set. Below is an excerpt from my ''.bashrc''.<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
ulimit -S -s unlimited<br />
<br />
If you are using bash, you can copy/paste this into your ''.bashrc''. Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', change all instances of ''sfarina'' to your username.<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling.<br />
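Before compiling, it can help to sanity-check the whole environment in one go (a sketch; the variable names follow the ''.bashrc'' excerpt above):<br />

```shell
# Quick sanity check of the build environment configured in .bashrc
check_gc_env () {
    for v in GC_BIN GC_INCLUDE GC_LIB FC; do
        eval "val=\${$v:-}"          # indirect lookup of each variable
        if [ -n "$val" ]; then
            echo "$v = $val"
        else
            echo "WARNING: $v is not set"
        fi
    done
    if command -v ifort >/dev/null 2>&1; then
        ifort --version | head -n 1
    else
        echo "WARNING: ifort not found on PATH"
    fi
}
check_gc_env
```

Any WARNING line means the corresponding ''.bashrc'' entry was not sourced; re-check the Environment section above.<br />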
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compiler ===<br />
Please note that the '''ONLY VERSION''' of the Intel compiler that reliably compiles a working executable of GEOS-Chem with TOMAS is version 11.1.<br />
Installation is described above in the libraries section.<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16-core shell to build geoschem. Put this in your ''.bashrc'':<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing TOMAS versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the TOMAS versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40, depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can extract summary information from your runs using the script here:<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes.<br />
==== Setup ====<br />
I have a copy of git installed at<br />
/home/sfarina/opt/bin<br />
You can either use this executable or build it yourself from source. To use this executable, add the following to your .bashrc<br />
export PATH="/home/sfarina/opt/bin:$PATH"<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have ''git'' installed, make a separate branch for yourself as soon as you make a copy of the code; this way we can easily trade and track updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows:<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'' that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your ''.bashrc'':<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
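Splitting all twelve months one by one is tedious; the calls can be generated as an IDL batch file instead (a hedged sketch assuming a 2005 run and the same Bpch_Sep_Sal routine on your IDL path; run the result once with <tt>idl split_months.pro</tt>, as with proc_one.pro below):<br />

```shell
# Generate one Bpch_Sep_Sal call per month of 2005 as an IDL batch file
for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
    echo "Bpch_Sep_Sal,'ctm.bpch','ctm.$m.bpch',Tau0=nymd2tau(2005${m}01)"
done > split_months.pro
echo "exit" >> split_months.pro
```

Adjust the year and filenames to match your own simulation before running it.<br />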
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to standard netCDF.<br />
Edit proc_one.pro to use the correct infiles/outfiles, then execute it from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version.<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=User_talk:Salvatore_Farina&diff=13738User talk:Salvatore Farina2013-07-26T15:57:21Z<p>Salvatore Farina: moved the page to a new home.</p>
<hr />
<div>If you're here, you're probably looking for the [[TOMAS setup guide]]!</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_setup_guide&diff=13737TOMAS setup guide2013-07-26T15:56:15Z<p>Salvatore Farina: created TOMAS setup guide page as copy/paste from my talk page.</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with TOMAS microphysics on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many recent developments in aerosol science, and it cannot take advantage of parallel computing technologies.<br />
The 'bleeding edge' code includes these GEOS-Chem/TOMAS developments, along with support for parallel computation.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or (safer) you can grab my latest "snapshot":<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* Intel Ifort Fortran compiler - v11.1 - required to build geoschem<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
You can copy this folder as a tarball from /home/sfarina/gclibs.tgz or simply extract it directly to your home directory:<br />
cd ~<br />
tar zxvf /home/sfarina/gclibs.tgz<br />
<br />
This will extract the libraries folder to your home directory.<br />
<br />
=== Environment ===<br />
In order to get the compiler to run and recognize the libraries described above, some environment variables must be set. Below is an excerpt from my ''.bashrc''.<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
ulimit -S -s unlimited<br />
<br />
If you are using bash, you can copy/paste this into your ''.bashrc''. If you installed the compiler and libraries under your own home directory (''~/geos-chem-libraries-intel11''), change all instances of ''sfarina'' to your username. Then reload your shell configuration and check the compiler:<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling.<br />
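Before compiling, it can save time to confirm that your shell can actually see the toolchain. The check below is only a sketch: the tool list and message wording are mine, not part of this guide.

```shell
#!/bin/sh
# check_tools: report whether each named tool is visible on the PATH.
check_tools() {
  for tool in "$@"; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "$tool: found at $(command -v "$tool")"
    else
      echo "$tool: NOT FOUND - check the PATH settings in your .bashrc"
    fi
  done
  # Also show the library search path the compiler will use.
  echo "LD_LIBRARY_PATH=${LD_LIBRARY_PATH:-<unset>}"
}

check_tools ifort idl git
```

If ''ifort'' comes up NOT FOUND, recheck the export lines above before going any further.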
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory: it is many, many gigabytes and probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compiler ===<br />
Please note that the '''ONLY VERSION''' of the intel compiler which reliably compiles a working executable of geos-chem with TOMAS is version 11.1.<br />
Installation is described above in the libraries section.<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells for heavy processing. I invoke a 16-core shell to build geoschem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing TOMAS versions, '''always''' do<br />
make realclean<br />
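A small wrapper can make the clean rebuild automatic when switching bin counts. This function is my own convenience sketch (not part of the GEOS-Chem build system); the target names are the ones listed above.

```shell
# build_tomas: force a clean rebuild for the requested TOMAS bin count.
build_tomas() {
  case "$1" in
    30)       target=tomas ;;     # 30-bin default uses the plain target
    12|15|40) target=tomas$1 ;;
    *) echo "usage: build_tomas {12|15|30|40}" >&2; return 1 ;;
  esac
  # realclean first, so object files from another bin count never leak in.
  make realclean && make -j16 "$target"
}
```

For example, run `build_tomas 40` from the GeosCore directory inside your parallel shell.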
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #run the job from the directory it was submitted from<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
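If you run many simulations, the edit-then-submit step can be wrapped in a helper. This function is hypothetical (not one of the files in the run directory); it assumes the ''RUN_NAM'' placeholder shown in the script above and writes a per-run copy so your submit scripts stay distinct.

```shell
# submit_run: stamp a job name into a copy of parallel.sh and submit it.
submit_run() {
  name=$1
  [ -n "$name" ] || { echo "usage: submit_run JOB_NAME" >&2; return 1; }
  # Replace the RUN_NAM placeholder in a new copy named after the run.
  sed "s/RUN_NAM/$name/" parallel.sh > "parallel.$name.sh"
  qsub "parallel.$name.sh"
}
```

For example, `submit_run aug2005_t15` submits ''parallel.aug2005_t15.sh'' with the job name set. You still need to set the run directory inside the script yourself.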
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. I have a copy built & installed at ''/home/sfarina/opt/bin/git'' that you can probably either copy or just use.<br />
==== Setup ====<br />
I have a copy of git installed at<br />
/home/sfarina/opt/bin<br />
You can either use this executable or build it yourself from source. To use this executable, add the following to your .bashrc<br />
export PATH="/home/sfarina/opt/bin:$PATH"<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have ''git'' installed, make a separate branch for yourself as soon as you copy the code; this way we can easily trade and track updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git add -A<br />
git commit -m "apply update.patch"<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
Note that ''git apply'' only modifies your working tree, so commit the result on '''tomasmerge''' before merging it into your branch.<br />
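You can rehearse this flow safely in a throwaway repository before touching your real code. In the sketch below, the branch name ''tomasmerge'' comes from the text above, but the file name, patch, and commit messages are invented for illustration.

```shell
#!/bin/sh
# Rehearse the patch-and-merge flow in a disposable repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com
git config user.name "You"
echo "original" > fictional_example_mod.F90
git add fictional_example_mod.F90
git commit -qm "initial"
git branch tomasmerge
git checkout -q -b MY_BRANCH

# Fabricate an update.patch like the one that would be provided:
echo "updated" > fictional_example_mod.F90
git diff > update.patch
git checkout -q -- fictional_example_mod.F90

# The flow from the text; 'git apply' only changes the working tree,
# so the result must be committed before merging:
git checkout -q tomasmerge
git apply update.patch
git commit -qam "apply update.patch"
git checkout -q MY_BRANCH
git merge -q tomasmerge
cat fictional_example_mod.F90
```

After the merge, MY_BRANCH contains the patched file.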
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'' that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct input and output files, then execute it from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
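For a full year of output, the averaging and plotting steps above can be driven by a small loop. The function below is a hypothetical driver, not one of the provided scripts; it only assumes the per-month calling convention shown above.

```shell
# process_year: run the averaging and plotting steps for all 12 months.
# The argument is the TOMAS version (12, 15, 30, or 40).
process_year() {
  for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
    ./averageCNCCN_$1.py "$m" || return 1
    ./plotCNCCN.py "$m" || return 1
  done
}
```

For example, `process_year 15` processes every month of a TOMAS-15 run, stopping at the first failure.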
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F<br />
* I use the GNU Bourne Again SHell (bash) and suggest you do the same. csh is fine, but I have written all of my scripts in bash, so your life will probably be easier if you use it.<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=TOMAS_aerosol_microphysics&diff=13736TOMAS aerosol microphysics2013-07-26T15:55:28Z<p>Salvatore Farina: </p>
<hr />
<div>This page describes the TOMAS aerosol microphysics option in GEOS-Chem. TOMAS is one of two aerosol microphysics packages being incorporated into GEOS-Chem, the other being [[APM aerosol microphysics|APM]].<br />
<br />
== Overview ==<br />
<br />
The TwO-Moment Aerosol Sectional (TOMAS) microphysics package was developed for implementation into GEOS-Chem at Carnegie-Mellon University. Using a moving sectional and moment-based approach, TOMAS tracks two independent moments (number and mass) of the aerosol size distribution for a number of discrete size bins. It also contains codes to simulate nucleation, condensation, and coagulation processes. The aerosol species that are considered with high size resolution are sulfate, sea-salt, OC, EC, and dust. An advantage of TOMAS is the full size resolution for all chemical species and the conservation of aerosol number, the latter of which allows one to construct aerosol and CCN number budgets that will balance.<br />
<br />
=== Authors and collaborators ===<br />
* [mailto:petera@andrew.cmu.edu Peter Adams] ''(Carnegie-Mellon U.)'' -- Principal Investigator<br />
* [mailto:wtrivita@staffmail.ed.ac.uk Win Trivitayanurak] ''(Department of Highways, Thailand)''<br />
* [mailto:dwesterv@andrew.cmu.edu Dan Westervelt] ''(Carnegie-Mellon U.)''<br />
* [mailto:jeffrey.pierce@dal.ca Jeffrey Pierce] ''(Dalhousie U.)''<br />
* [mailto:sal.farina@gmail.com Salvatore Farina] ''(Colorado State U.)''<br />
<br />
Questions regarding TOMAS can be directed at Dan (e-mail linked above).<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:53, 27 January 2010 (EST)<br />
<br />
=== TOMAS User Groups ===<br />
<br />
{| border=1 cellspacing=0 cellpadding=5<br />
|- bgcolor="#cccccc"<br />
!User Group<br />
!Personnel<br />
!Projects<br />
|-valign="top"<br />
|[http://www.ce.cmu.edu/%7Eadams/ Carnegie-Mellon University]<br />
|[http://www.ce.cmu.edu/%7Eadams/people.htm#peter Peter Adams]<br>[http://www.ce.cmu.edu/~dwesterv/Site/Home.html Dan Westervelt]<br />
| [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html New particle formation evaluation in GC-TOMAS] <br> Sensitivity of CCN to nucleation rates <br> Development of number tagging and source apportionment model for GC-TOMAS<br />
|-valign="top"<br />
|[http://fizz.phys.dal.ca/%7Epierce/ Dalhousie University] <br> [http://www.atmos.colostate.edu/faculty/pierce.php Colorado State]<br />
|[http://atm.dal.ca/Faculty/Jeffrey_Pierce.php Jeffrey Pierce]<br>Sal Farina<br>Stephen D'Andrea<br />
|Sensitivity of CCN to condensational growth rates <br> TOMAS parallelization <br> Others...<br />
|-valign="top"<br />
|Add yours here<br />
|<br />
|<br />
|}<br />
<br />
== TOMAS-specific setup ==<br />
TOMAS has its own run directories (run.Tomas) that can be downloaded from the Harvard FTP. The <tt>input.geos</tt> file looks slightly different from the standard GEOS-Chem file, and also differs between versions.<br />
<br />
Pre- v9.02:<br />
To turn on TOMAS, see the "Microphysics menu" in <tt>input.geos</tt> and make sure TOMAS is set to '''T'''. <br />
<br />
v9.02 and later:<br />
TOMAS is enabled or disabled at compile time - the TOMAS flag in input.geos has been removed.<br />
<br />
<br />
TOMAS is a simulation type 3 and utilizes 171-423 tracers. Each aerosol species requires 30 tracers for the 30 bin size resolution, 12 for the 12 bin, etc. Here is the (abbreviated) default setup in input.geos for TOMAS-30 in v9.02 and later (see run.Tomas directory):<br />
<br />
Tracer # Description <br />
1- 62 Std Geos Chem <br />
63 H2SO4 <br />
64- 93 Number <br />
94-123 Sulfate <br />
124-153 Sea-salt <br />
154-183 Hydrophilic EC <br />
184-213 Hydrophobic EC <br />
214-243 Hydrophilic OC <br />
244-273 Hydrophobic OC <br />
274-303 Mineral dust <br />
304-333 Aerosol water<br />
<br />
TOMAS-40 requires 423 tracers: 360 TOMAS tracers (40 for each of the 9 size-resolved quantities) plus H2SO4 and the ~62 standard GEOS-Chem tracers. <br />
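The tracer counts can be checked against the table above with a little arithmetic (this sketch is my own accounting, derived from the table: ~62 standard tracers, plus H2SO4, plus one block of NBINS tracers for each of the 9 size-resolved quantities).

```shell
#!/bin/sh
# Reproduce the TOMAS tracer accounting for each bin count.
for nbins in 12 15 30 40; do
  echo "TOMAS-$nbins: $((62 + 1 + 9 * nbins)) tracers"
done
```

TOMAS-12 gives 171 and TOMAS-40 gives 423, matching the 171-423 range quoted above; TOMAS-30 gives 333, matching the last tracer index in the table.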
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 18:48, 8 July 2013 (EDT)<br />
<br />
== Implementation notes ==<br />
<br />
TOMAS validation in [[GEOS-Chem v8-03-01]] was completed on 24 Feb 2010.<br />
<br />
=== Update April 2013 ===<br />
<br />
'''''This update was tested in the 1-month benchmark simulation [[GEOS-Chem_v9-02_benchmark_history#v9-02k|v9-02k]] and approved on 07 Jun 2013.'''''<br />
<br />
Sal Farina has been working with the GEOS-Chem Support Team to inline the TOMAS aerosol microphysics code into the <tt>GeosCore</tt> directory. All TOMAS-specific sections of code are now segregated from the rest of GEOS-Chem with C-preprocessor statements such as:<br />
<br />
#if defined( TOMAS )<br />
<br />
# if defined( TOMAS40 ) <br />
... Code for 40 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS12 )<br />
... Code for 12 bin TOMAS simulation (optional) goes here ...<br />
# elif defined( TOMAS15 )<br />
... Code for 15 bin TOMAS simulation (optional) goes here ...<br />
# else<br />
... Code for 30 bin TOMAS simulation (default) goes here ...<br />
# endif<br />
<br />
#endif <br />
<br />
TOMAS is now invoked by compiling GEOS-Chem with one of the following options:<br />
<br />
make -j4 TOMAS=yes ... # Compiles GEOS-Chem for the 30 bin (default) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
or<br />
<br />
make -j4 TOMAS40=yes ... # Compiles GEOS-Chem for the 40 bin (optional) TOMAS simulation<br />
# -j4 compiles 4 files at a time; this reduces overall compilation time<br />
<br />
All files in the old <tt>GeosTomas/</tt> directory have now been deleted, as these have been rendered obsolete.<br />
<br />
These updates will be included in [[GEOS-Chem v9-02]]. These modifications will not affect the existing GEOS-Chem simulations, as all TOMAS code is not compiled into the executable unless you specify either <tt>TOMAS=yes</tt> or <tt>TOMAS40=yes</tt> at compile time.<br />
<br />
We are in the process of updating the wiki to reflect these changes as they are implemented. <br />
<br />
--[[User:Bmy|Bob Y.]] 13:59, 23 April 2013 (EDT)<br><br />
--[[User:Salvatore Farina|Salvatore Farina]] 13:49, 4 June 2013 (EDT)<br />
<br />
=== Code structure ===<br />
<br />
'''''NOTE: This will be rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which is slated for [[GEOS-Chem v9-02]].'''''<br />
<br />
The main-level <tt>Code</tt> directory has now been divided into several subdirectories:<br />
<br />
GeosCore/ GEOS-Chem "core" routines<br />
GeosTomas/ Parallel copies of GEOS-Chem routines that reference TOMAS<br />
GeosUtil/ "Utility" modules (e.g. error_mod.f, file_mod.f, time_mod.f, etc.<br />
Headers/ Header files (define.h, CMN_SIZE, CMN_DIAG, etc.)<br />
KPP/ KPP solver directory structure<br />
bin/ Directory where executables are placed<br />
doc/ Directory where documentation is created<br />
help/ Directory for GEOS-Chem Help Screen<br />
lib/ Directory where library files are placed<br />
mod/ Directory where module files are placed<br />
obsolete/ Directory where obsolete versions of code are archived<br />
<br />
Because there were a lot of TOMAS-related modifications in several GEOS-Chem "core" routines, the routines that need to "talk" to TOMAS were placed into a separate subdirectory named <tt>GeosTomas/</tt>. The files in <tt>GeosTomas</tt> are:<br />
<br />
Files:<br />
------<br />
Makefile -- GEOS-Chem routines that have been<br />
aero_drydep.f modified to reference the TOMAS aerosol<br />
carbon_mod.f microphysics package. These are kept<br />
chemdr.f in a separate GeosTomas directory so that<br />
chemistry_mod.f they do not interfere with the routines<br />
cleanup.f in the GeosCore directory.<br />
diag3.f<br />
diag_mod.f The GeosTomas directory only needs to<br />
diag_pl_mod.f contain the files that have been modified<br />
drydep_mod.f for TOMAS. The Makefile will look for<br />
dust_mod.f all other files from the GeosCore directory<br />
emissions_mod.f using the VPATH option in GNU Make.<br />
gamap_mod.f<br />
initialize.f NOTE to GEOS-Chem developers: When you<br />
input_mod.f make changes to any of these routines<br />
isoropia_mod.f in the GeosCore directory, you must also<br />
logical_mod.f make the same modifications to the<br />
ndxx_setup.f corresponding routines in the GeosTomas<br />
planeflight_mod.f directory.<br />
seasalt_mod.f<br />
sulfate_mod.f Maybe in the near future we can work<br />
tomas_mod.f towards integrating TOMAS into the GeosCore<br />
tomas_tpcore_mod.f90 directory more cleanly. However, due to<br />
tpcore_mod.f the large number of modifications that were<br />
tpcore_window_mod.f necessary for TOMAS, it was quicker to<br />
tracerid_mod.f implement the TOMAS code in a separate<br />
wetscav_mod.f subdirectory. <br />
xtra_read_mod.f -- Bob Y. (1/25/10)<br />
<br />
Each of these files were merged with the corresponding files in the <tt>GeosCore</tt> subdirectory. Therefore, in addition to having the GEOS-Chem modifications from [[GEOS-Chem v8-02-05|v8-02-05]], these files also have the relevant TOMAS references.<br />
<br />
A few technical considerations dictated the placing of these files into a separate <tt>GeosTomas/</tt> directory:<br />
<br />
* The ND60 diagnostic in the standard GEOS-Chem code (in <tt>GeosCore/</tt>) is now used for the CH4 offline simulation, but in TOMAS it's used for something else. <br />
* Some parameters needed to be declared differently for simulations with TOMAS. <br />
* Because not all GEOS-Chem users will choose to use TOMAS, we did not want to unnecessarily bog down the code in <tt>GeosCore/</tt> with references to TOMAS-specific routines. <br />
<br />
All of these concerns could be best solved by keeping parallel copies of the affected routines in the <tt>GeosTomas</tt> directory.<br />
<br />
--[[User:Bmy|Bob Y.]] 13:35, 25 February 2010 (EST)<br />
<br />
=== Building GEOS-Chem with TOMAS ===<br />
<br />
'''''NOTE: This will be rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which is slated for [[GEOS-Chem v9-02]].'''''<br />
<br />
The <tt>VPATH</tt> feature of [http://www.gnu.org/software/make/manual/make.html GNU Make] is used to simplify the compilation. When GEOS-Chem is compiled with the tomas target, the GNU Make utility will search for files in the <tt>GeosTomas/</tt> directory first. If it cannot find files there, it will then search the <tt>GeosCore/</tt> directory. Thus, if we make a change to a "core" GEOS-Chem routine in the <tt>GeosCore/</tt> subdirectory (say in <tt>dao_mod.f</tt> or <tt>diag49_mod.f</tt>), then those changes will automatically be applied when you build GEOS-Chem with TOMAS. Thus, we only need to keep in <tt>GeosTomas/</tt> separate copies of those files that have to "talk" with TOMAS.<br />
<br />
Several new targets were added to the <tt>Makefile</tt> in the top-level <tt>Code/</tt> directory:<br />
<br />
#=============================================================================<br />
# Targets for TOMAS aerosol microphysics code (win, bmy, 1/25/10)<br />
#=============================================================================<br />
<br />
.PHONY: tomas libtomas exetomas cleantomas<br />
<br />
tomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes all<br />
<br />
libtomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes lib<br />
<br />
exetomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes exe<br />
<br />
cleantomas:<br />
@$(MAKE) -C $(GEOSTOM) TOMAS=yes clean<br />
<br />
You can build GEOS-Chem with the TOMAS option by typing:<br />
<br />
make tomas ...<br />
<br />
This will automatically do the proper things to build the TOMAS code into GEOS-Chem, such as:<br />
<br />
* Adding a <tt>-DTOMAS</tt> C-preprocessor switch to the <tt>FFLAGS</tt> compiler flag settings in <tt>Makefile_header.mk</tt>. This will cause TOMAS-specific areas of code to be turned on.<br />
* Turning off OpenMP parallelization. For now the GEOS-Chem + TOMAS code needs to be run on a single processor. We continue to work on parallelizing the code.<br />
* Calling the Makefile in the <tt>GeosTomas/</tt> subdirectory to build the executable. The executable file is now named <tt>geostomas</tt> in order to denote that the TOMAS code is built in.<br />
<br />
The GEOS-Chem + TOMAS code has been built with the following compilers:<br />
<br />
* Intel Fortran compiler v10<br />
* Intel Fortran compiler v11.1 (20101201)<br />
* SunStudio 12<br />
<br />
--[[User:Bmy|Bob Y.]] 10:36, 27 January 2010 (EST)<br />
<br />
== Computational Information ==<br />
<br />
GC-TOMAS v9-02 (30 sections) on 8 processors: <br />
One year simulation = 7-8 days wall clock time<br />
<br />
More speedups are available using lower aerosol size resolution<br />
<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 11:00, 07 May 2013 (EST)<br />
<br />
== Set-up Guide ==<br />
<br />
This [[TOMAS setup guide]] was written for users on ACE-NET's Glooscap cluster, but may be more generally applicable.<br />
Please contact [mailto:sal.farina@gmail.com Salvatore Farina] for help in obtaining the latest development version of GEOS-Chem with TOMAS<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 11:55, 26 July 2013 (EDT)<br />
<br />
== Microphysics Code==<br />
The aerosol microphysics code is largely contained within the file <tt>tomas_mod.f</tt>. Tomas_mod and its subroutines are modular -- they use all their own internal variables. For details, see tomas_mod.f and comments. <br />
<br />
=== Nucleation ===<br />
The choice of nucleation theory is selected in the header section of <tt>tomas_mod.f</tt>. The main choices are currently binary homogeneous nucleation as in Vehkamaki et al., 2002 or ternary homogeneous nucleation as in Napari et al., 2002. The ternary nucleation rate is typically scaled by a globally uniform tuning factor of 10^-4 or 10^-5. Ion-mediated nucleation (Yu, 2008) and activation nucleation (Kulmala, 2006) are options as well.<br />
<br />
In TOMAS-12 and TOMAS-30, nucleated particles follow the Kerminen approximation to grow to the smallest size bin. This has a tendency to overpredict the number of particles in the smallest bins of those models. See Y. H. Lee, J. R. Pierce, and P. J. Adams 2013 [http://www.geosci-model-dev-discuss.net/6/893/2013/gmdd-6-893-2013.html here] for more details on the consequences of this.<br />
<br />
=== Condensation ===<br />
<br />
=== Coagulation ===<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 14:08, 9 May 2011 (EST)<br />
<br />
== Validation ==<br />
<br />
GC-TOMAS [[GEOS-Chem v8-03-01|v8-03-01]] generally compares very well with observations and other models. Please see our [http://acmg.seas.harvard.edu/geos/wiki_docs/TOMAS/TOMAS_benchmark_ForHarvard.pdf GC-TOMAS v8-02-05 validation document] for more information and figures. <br />
<br />
Below are some results of benchmarking GC-TOMAS with earlier versions of the model as well as observations:<br />
<br />
[[Image:CN10_smaller.jpg]]<br />
<br />
'''Figure 1: CN10 concentrations predicted by GC-TOMAS v8-02-05 against observations. Descriptions of observational data can be found on p 5454 of Pierce et al, Atmos. Chem. Phys., 7, 2007.'''<br />
<br />
----<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:13, 10 February 2010 (EST)<br />
<br />
== Previous issues now resolved ==<br />
<br />
=== Segmentation Fault ===<br />
You may get an early segfault if your stacksize is not set to either unlimited or a very large number. To avoid this, you either have to change the value of an environmental variable (setenv command in <tt>.cshrc</tt>) or use the <tt>ulimit</tt> command. See [http://wiki.seas.harvard.edu/geos-chem/index.php/Machine_issues_%26_portability#Resetting_stacksize_for_Linux this page] for details.<br />
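One way to catch this early is a guard in your shell startup file. The sketch below is my own suggestion (the threshold and wording are invented); it warns at login if the soft stack limit looks too small.

```shell
# Warn if the soft stack limit is small enough to risk an early segfault.
stack=$(ulimit -S -s)
if [ "$stack" != "unlimited" ] && [ "$stack" -lt 1000000 ]; then
  echo "warning: stacksize is ${stack} kB; GEOS-Chem+TOMAS may segfault early"
fi
```

Raising the limit with `ulimit -S -s unlimited` (as in the .bashrc excerpt earlier on this wiki) silences the warning.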
<br />
--[[User:Dan Westervelt|Dan W.]] 20:20, 10 February 2010 (EST)<br />
<br />
== Outstanding issues ==<br />
<br />
=== Vertical Grids ===<br />
Currently, GC-TOMAS is only compatible with the reduced vertical grids:<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.3.1 GEOS3_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.4.1 GEOS4_30L]<br />
* [http://acmg.seas.harvard.edu/geos/doc/man/appendix_3.html#A3.5.1 GEOS5_47L]<br />
<br />
Development for the full vertical grids is ongoing.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 20:43, 10 February 2010 (EST)<br />
<br />
=== Compile from GeosTomas directory ===<br />
<br />
'''''NOTE: This will be rendered obsolete by the [[#Update April 2013|re-integration of TOMAS into GEOS-Chem]], which is slated for [[GEOS-Chem v9-02]].'''''<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:I think there is something going wrong in my compilation, although errors have come up at both compile time and run time. The worst of the problems is this: I'll make a change to any fortran file in the code (even something meaningless like print*, 'foo') and hundreds of compile errors come out with fishy error messages such as (from ifort v10.1):<br />
<br />
***fortcom: Error: chemistry_mod.f, line 478: A kind type parameter must be a compile-time constant. [DP]<br />
REAL(kind=dp) :: RCNTRL(20)<br />
<br />
:Any advice? The errors I'm having are not unique to any version of GC, any type of met fields, any compiler, etc.<br />
<br />
'''''[mailto:yantosca@seas.harvard.edu Bob Yantosca] wrote:'''''<br />
<br />
:Make sure you are always in the GeosTomas subdirectory when you build the code. Sometimes there is a problem if you build the code from a higher level directory. This may have to do with the VPATH in the makefile.<br />
<br />
'''''[mailto:dwesterv@andrew.cmu.edu Dan Westervelt] wrote:'''''<br />
<br />
:Thanks, that seems to do the trick.<br />
<br />
--[[User:Bmy|Bob Y.]] 14:37, 14 April 2010 (EDT)<br />
<br />
== Other features of TOMAS ==<br />
Other varieties of TOMAS are suited for specific science questions, for example with nucleation studies where explicit aerosol dynamics are needed for nanometer-sized particles. <br />
<br />
=== Size Resolution ===<br />
<br />
'''TOMAS-30''': All 7 chemical species have size resolution ranging from 10 nm to 10 µm, spanned by 30 logarithmically spaced (mass doubling) bins.<br />
<br />
'''TOMAS-40''': Same as TOMAS-30 with 10 additional (mass doubling) sub-10nm bins with a lower limit ~1nm<br />
<br />
'''TOMAS-12''': All 7 chemical species have size resolution ranging from 10 nm to 1 µm spanned by 10 logarithmically spaced (mass quadrupling) bins and two supermicron bins. It has coarser resolution than TOMAS-30, with improved computation time. <br />
<br />
'''TOMAS-15''': Same as TOMAS-12 with 3 additional (mass quadrupling) sub-10nm bins with a lower limit ~2nm. Analogous to TOMAS40 with improved computation time.<br />
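The bin definitions above can be sanity-checked with some quick arithmetic (the calculation is mine, derived from the descriptions above): mass-doubling bins grow in diameter by 2^(1/3) per bin, and mass-quadrupling bins by 4^(1/3) per bin.

```shell
#!/bin/sh
# 30 mass-doubling bins span 2^(30/3) = 1024x in diameter, consistent
# with the quoted 10 nm - 10 um range; 10 mass-quadrupling bins span
# 4^(10/3) (about 100x), consistent with 10 nm - 1 um.
awk 'BEGIN {
  print "30 mass-doubling bins span a diameter factor of", 2^(30/3)
  print "10 mass-quadrupling bins span a diameter factor of", 4^(10/3)
}'
```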
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 12:51, 4 June 2013 (EDT)<br />
<br />
=== Nesting and grid size ===<br />
TOMAS is implemented on a 2x2.5 North American domain. Developed by Jeffrey Pierce (jeffrey.pierce@dal.ca)<br />
<br />
=== AOD, CCN post-processing code ===<br />
Code is available for calculating aerosol optical depth from TOMAS-predicted aerosol composition and size using Mie theory, and for calculating CCN concentrations from TOMAS size-resolved composition using Kohler theory. Developed by Yunha Lee and Jeffrey Pierce, adapted for GEOS-Chem output by Jeffrey Pierce.<br />
<br />
--[[User:Dan Westervelt|Dan W.]] 2:00, 9 May 2011 (EST)<br />
<br />
== Reference ==<br />
'''Nucleation in GEOS-Chem'''<br />
Westervelt, D. M., Pierce, J. R., Riipinen, I., Trivitayanurak, W., Hamed, A., Kulmala, M., Laaksonen, A., Decesari, S., and Adams, P. J.: Formation and growth of nucleated particles into cloud condensation nuclei: model-measurement comparison, Atmos. Chem. Phys. Discuss., 13, 8333-8386, doi:10.5194/acpd-13-8333-2013, 2013. [http://www.atmos-chem-phys-discuss.net/13/8333/2013/acpd-13-8333-2013.html LINK]<br />
<br />
'''TOMAS implementation in GEOS-Chem:''' <br />
Trivitayanurak, W., Adams, P. J., Spracklen, D. V. and Carslaw, K. S.: Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison, Atmospheric Chemistry and Physics, 8(12), 3149-3168, 2008.<br />
<br />
'''TOMAS initial paper, sulfate only:''' <br />
Adams, P. J. and Seinfeld, J. H.: Predicting global aerosol size distributions in general circulation models, J. Geophys. Res.-Atmos., 107(D19), 4370, doi:10.1029/2001JD001010, 2002.<br />
<br />
'''TOMAS with sea-salt:'''<br />
Pierce, J.R., and Adams P.J., Global evaluation of CCN formation by direct emission of sea salt and growth of ultrafine sea salt, Journal of Geophysical Research-Atmospheres, 111 (D6), doi:10.1029/2005JD006186, 2006.<br />
<br />
'''TOMAS with carbonaceous aerosol:''' <br />
Pierce, J. R., Chen, K. and Adams, P. J.: Contribution of primary carbonaceous aerosol to cloud condensation nuclei: processes and uncertainties evaluated with a global aerosol microphysics model, Atmos. Chem. Phys., 7(20), 5447-5466, doi:10.5194/acp-7-5447-2007, 2007.<br />
<br />
'''TOMAS with dust:''' <br />
Lee, Y. H., Chen, K., and Adams, P. J.: Development of a global model of mineral dust aerosol microphysics, Atmos. Chem. Phys., 9, 2441-2458, doi:10.5194/acp-9-2441-2009, 2009.</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=User_talk:Salvatore_Farina&diff=13735User talk:Salvatore Farina2013-07-26T15:44:03Z<p>Salvatore Farina: /* Overview */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with TOMAS microphysics on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
However, the 'bleeding edge' code has many recent developments in GEOS-Chem/TOMAS that are not included in the public release, including parallel computing.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or (safer), you can grab my latest "snapshot":<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* Intel Ifort Fortran compiler - v11.1 - required to build geoschem<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
You can copy this folder as a tarball from /home/sfarina/gclibs.tgz or simply extract it directly to your home directory:<br />
cd ~<br />
tar zxvf /home/sfarina/gclibs.tgz<br />
<br />
This will extract the libraries folder to your home directory.<br />
<br />
=== Environment ===<br />
In order to get the compiler to run and recognize the libraries described above, some environment variables must be set. Below is an excerpt from my ''.bashrc''.<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
ulimit -S -s unlimited<br />
<br />
If you are using bash, you can copy/paste this into your ''.bashrc''. Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', change all instances of ''sfarina'' to your username, then check your setup:<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling.<br />
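Before compiling, it can also save time to confirm the GC_* variables actually point somewhere. The following is a small sketch based on the .bashrc excerpt above, not an official check:<br />

```shell
# Sketch: check that the GC_* build variables from the excerpt above are
# set and point at real directories before attempting to compile.
check_gc_env() {
  status=0
  for var in GC_BIN GC_INCLUDE GC_LIB; do
    eval "dir=\$$var"                 # portable indirect expansion
    if [ -z "$dir" ]; then
      echo "$var is unset" >&2; status=1
    elif [ ! -d "$dir" ]; then
      echo "$var=$dir is not a directory" >&2; status=1
    fi
  done
  return $status
}
```

Run it as ''check_gc_env && echo "environment looks sane"'' after sourcing your .bashrc.<br />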
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compiler ===<br />
Please note that the '''ONLY VERSION''' of the Intel compiler that reliably compiles a working GEOS-Chem/TOMAS executable is version 11.1.<br />
Installation is described above in the libraries section.<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing TOMAS versions, always do<br />
make realclean<br />
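Since a stale build from one bin count can poison the next, one way to make the clean step impossible to forget is a tiny wrapper around the targets listed above. This is only a sketch; everything but the target names is illustrative:<br />

```shell
# Sketch: always realclean before building a given TOMAS bin count.
# Usage: build_tomas 40   (or 12, 15, 30)
build_tomas() {
  case "$1" in
    30) target=tomas ;;              # plain "tomas" is the 30-bin build
    12|15|40) target="tomas$1" ;;
    *) echo "usage: build_tomas {12|15|30|40}" >&2; return 1 ;;
  esac
  make realclean && make -j16 "$target"
}
```

Call it from GC_Bleeding_Edge/GeosCore inside your parallel shell, e.g. ''build_tomas 40''.<br />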
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40, depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
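The extract-then-copy steps above can be scripted. This is just a sketch; TARBALL_DIR and CODE_DIR are placeholders for your own locations, not the real glooscap paths:<br />

```shell
# Sketch: set up a TOMAS run directory for a given bin count.
# TARBALL_DIR holds 12.tgz/15.tgz/30.tgz/40.tgz; CODE_DIR is your source tree.
setup_run() {
  bins=$1
  tar zxvf "$TARBALL_DIR/$bins.tgz" || return 1    # creates run.TOMASXX
  cp "$CODE_DIR/GeosCore/geostomas" "run.TOMAS$bins/" || return 1
  echo "run.TOMAS$bins ready"
}
```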
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using this script:<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes.<br />
==== Setup ====<br />
I have a copy of git installed at<br />
/home/sfarina/opt/bin<br />
You can either use this executable or build it yourself from source. To use this executable, add the following to your .bashrc<br />
export PATH="/home/sfarina/opt/bin:$PATH"<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have ''git'' installed, make a separate branch for yourself as soon as you make a copy of the code; this way, we can easily trade and track updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
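To see the patch-and-merge mechanics end-to-end without touching the real source, here is a throwaway-repository sketch; the file name and upstream change are illustrative, only the branch names come from above:<br />

```shell
# Sketch: the apply-then-merge workflow above, demonstrated in a scratch repo.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q demo
cd demo
git config user.email "you@example.com"
git config user.name "You"
echo "original code" > fictional_example_mod.F90
git add fictional_example_mod.F90
git commit -qm "initial import"
git checkout -qb tomasmerge          # the shared upstream branch
git checkout -qb MY_BRANCH           # your working branch
# pretend an update.patch arrives for tomasmerge:
echo "upstream bugfix" >> fictional_example_mod.F90
git diff > update.patch
git checkout -q -- fictional_example_mod.F90
# apply the patch on tomasmerge, then merge it into your branch:
git checkout -q tomasmerge
git apply update.patch
git commit -qam "apply update.patch"
git checkout -q MY_BRANCH
git merge -q tomasmerge
grep -q "upstream bugfix" fictional_example_mod.F90 && echo "merge ok"
```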
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows:<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to clear before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run:<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct infiles/outfiles.<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version.<br />
For example, to bin and average the August results from TOMAS15:<br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have zonal and surface-level maps of CN3, CN10, CN40, and CN80 as predicted by the model.<br />
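Putting the four steps together, here is a hedged sketch of the monthly chain. The script names come from above, but the ''idl -e'' one-liner form and the DRYRUN switch are my own additions so the sequence can be previewed without IDL installed (the interactive idl session shown earlier also works):<br />

```shell
# Sketch: run the whole monthly post-processing chain for one month.
# Set DRYRUN=1 to print the commands instead of executing them.
postprocess_month() {
  month=$1; tau0=$2                  # e.g. 08 20050801
  run() { [ -n "$DRYRUN" ] && echo "$@" || "$@"; }
  run idl -e "Bpch_Sep_Sal,'ctm.bpch','ctm.$month.bpch',Tau0=nymd2tau($tau0)"
  run idl proc_one.pro
  run ./averageCNCCN_15.py "$month"
  run ./plotCNCCN.py "$month"
}
```

For example, ''DRYRUN=1 postprocess_month 08 20050801'' prints the four commands for August without running them.<br />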
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F<br />
* I use the GNU Bourne Again SHell (bash), and I suggest you do the same. The csh is fine, but I have written all of my scripts for bash, so your life will probably be easier if you use bash.<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=User_talk:Salvatore_Farina&diff=13729User talk:Salvatore Farina2013-07-25T21:47:14Z<p>Salvatore Farina: /* Branching and Commits */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with TOMAS microphysics on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
Lately I've been working on the "Bleeding Edge" code to address these issues. Here's a guide that should help you get started if you're using the glooscap cluster.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* Intel Ifort Fortran compiler - v11.1 - required to build geoschem<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
You can copy this folder as a tarball from /home/sfarina/gclibs.tgz or simply extract it directly to your home directory:<br />
cd ~<br />
tar zxvf /home/sfarina/gclibs.tgz<br />
<br />
This will extract the libraries folder to your home directory.<br />
<br />
=== Environment ===<br />
In order to get the compiler to run and recognize the libraries described above, some environment variables must be set. Below is an excerpt from my ''.bashrc''.<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
ulimit -S -s unlimited<br />
<br />
If you are using bash, you can copy/paste this into your ''.bashrc''. Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', change all instances of ''sfarina'' to your username, then check your setup:<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling.<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compiler ===<br />
Please note that the '''ONLY VERSION''' of the Intel compiler which reliably compiles a working executable of GEOS-Chem with TOMAS is version 11.1.<br />
Installation is described above in the libraries section.<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells for heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40-bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <-- TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing TOMAS versions, always always always do<br />
make realclean<br />
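To make the realclean impossible to forget, you could wrap both steps in a small function in your .bashrc (''rebuild_tomas'' is a hypothetical name, not a makefile target):<br />

```shell
# Hypothetical wrapper: always realclean before building, so stale
# object files from a previous TOMAS bin count never leak into the
# new executable.
rebuild_tomas() {
    target=${1:?usage: rebuild_tomas tomas|tomas12|tomas15|tomas40}
    make realclean && make -j16 "$target"
}
```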
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
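The extract-and-copy steps can be combined; here is a sketch (''setup_run'' is a hypothetical helper, and the executable path depends on where your build put ''geostomas''):<br />

```shell
# Hypothetical helper: extract a run-directory tarball into the current
# directory and copy the compiled executable into the extracted folder.
setup_run() {
    tarball=$1   # e.g. YOUR_STANDARD_LOCATION/40.tgz
    exe=$2       # path to your compiled geostomas
    rundir=$(tar ztf "$tarball" | head -n1)   # e.g. run.TOMAS40/
    tar zxf "$tarball" && cp "$exe" "$rundir"
}
```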
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for GEOS-Chem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #run the job from the current working directory<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
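One optional tweak, offered as a sketch: Grid Engine exports the granted slot count in ''$NSLOTS'', so the thread count can follow the ''-pe openmp'' request automatically instead of being hard-coded in two places:<br />

```shell
# Use the slot count granted by grid engine (-pe openmp N) if available,
# falling back to 16 when running outside the queue.
export OMP_NUM_THREADS=${NSLOTS:-16}
```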
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can extract summary information from your runs using this script:<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
== Developing ==<br />
Writing for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes.<br />
==== Setup ====<br />
I have a copy of git installed at<br />
/home/sfarina/opt/bin<br />
You can either use this executable or build it yourself from source. To use this executable, add the following to your .bashrc<br />
export PATH="/home/sfarina/opt/bin:$PATH"<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have ''git'' installed, make a separate branch for yourself as soon as you make a copy of the code; this way we can easily trade and track updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
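For reference, here is the same workflow run end to end in a throwaway repository (a sketch; substitute your real code directory and branch name):<br />

```shell
# Demonstrate the branch-edit-commit cycle in a scratch repository.
tmp=$(mktemp -d) && cd "$tmp"
git init -q .
git config user.email you@example.com && git config user.name "You"
echo "! placeholder module" > fictional_example_mod.F90
git add fictional_example_mod.F90 && git commit -qm "initial import"
git checkout -q -b MY_NEW_BRANCH        # branch off before editing
echo "! my change" >> fictional_example_mod.F90
git add fictional_example_mod.F90 && git commit -qm "my first change"
git log --oneline                       # shows both commits
```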
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', which prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more steps before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
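Splitting a whole year can be scripted by generating an IDL batch file from the shell, one Bpch_Sep_Sal call per month (a sketch; assumes the year 2005 and the file names above):<br />

```shell
# Write one Bpch_Sep_Sal call per month of 2005 into an IDL batch file,
# then run it afterwards with:  idl split_months.pro
for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
    echo "Bpch_Sep_Sal,'ctm.bpch','ctm.$m.bpch',Tau0=nymd2tau(2005${m}01)"
done > split_months.pro
echo "exit" >> split_months.pro
```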
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in the standard netCDF format.<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version.<br />
For example, to bin and average the August results from TOMAS15:<br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or you are running into trouble, ''please ask'' me, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with TOMAS microphysics on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
Lately I've been working on the "Bleeding Edge" code to address these issues. Here's a guide that should help you get started if you're using the glooscap cluster.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* Intel Ifort Fortran compiler - v11.1 - required to build geoschem<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
You can copy this folder as a tarball from /home/sfarina/gclibs.tgz or simply extract it directly to your home directory:<br />
cd ~<br />
tar zxvf /home/sfarina/gclibs.tgz<br />
<br />
This will extract the libraries folder to your home directory.<br />
<br />
=== Environment ===<br />
In order to get the compiler to run and recognize the libraries described above, some environment variables must be set. Below is an excerpt from my ''.bashrc''.<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
ulimit -S -s unlimited<br />
<br />
If you are using bash, you can copy/paste this to your ''.bashrc''. Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'' change instances of ''sfarina'' to your username.<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compiler ===<br />
Please note that the '''ONLY VERSION''' of the intel compiler which reliably compiles a working executable of geos-chem with TOMAS is version 11.1.<br />
Installation is described above in the libraries section.<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. I have a copy built/installed at /home/sfarina/opt/bin/git that you can probably either copy or just use.<br />
==== Setup ====<br />
I have a copy of git installed at<br />
/home/sfarina/opt/bin<br />
You can either use this executable or build it yourself from source. To use this executable, add the following to your .bashrc<br />
export PATH="/home/sfarina/opt/bin:$PATH"<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have git installed Make a separate branch for yourself as soon as you make a copy of the code, this way we can easily trade/track updates / advances / bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful builtin called ''DEBUGPRINT'', that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles to inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to ctm.bpch into separate months<br />
For example, to extract august, 2005 from ctm.bpch<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it to the standard netCDF<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for august:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and geoschem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines from main.F<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* if you have any questions or you are running into trouble, ''please ask'' either myself, Sajeev, or Jeff for help. I am usually able to respond to emails within a day, and am willing to use gchat or skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=User_talk:Salvatore_Farina&diff=13727User talk:Salvatore Farina2013-07-25T21:45:55Z<p>Salvatore Farina: /* Developing */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with TOMAS microphysics on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
Lately I've been working on the "Bleeding Edge" code to address these issues. Here's a guide that should help you get started if you're using the glooscap cluster.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* Intel Ifort Fortran compiler - v11.1 - required to build geoschem<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
You can copy this folder as a tarball from /home/sfarina/gclibs.tgz or simply extract it directly to your home directory:<br />
cd ~<br />
tar zxvf /home/sfarina/gclibs.tgz<br />
<br />
This will extract the libraries folder to your home directory.<br />
<br />
=== Environment ===<br />
In order to get the compiler to run and recognize the libraries described above, some environment variables must be set. Below is an excerpt from my ''.bashrc''.<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
ulimit -S -s unlimited<br />
<br />
If you are using bash, you can copy/paste this to your ''.bashrc''. Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'' change instances of ''sfarina'' to your username.<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling<br />
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compiler ===<br />
Please note that the '''ONLY VERSION''' of the intel compiler which reliably compiles a working executable of geos-chem with TOMAS is version 11.1.<br />
Installation is described above in the libraries section.<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16 core shell to build geoschem. put this in your .bashrc<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40 bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15,30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for geoschem are configured.<br />
There are currently no TOMAS specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
# $ -S /bin/bash<br />
./etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
== Developing ==<br />
Writing for GEOS_Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, '''ask'''.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. I have a copy built/installed at /home/sfarina/opt/bin/git that you can probably either copy or just use.<br />
==== Setup ====<br />
I have a copy of git installed at<br />
/home/sfarina/opt/bin<br />
You can either use this executable or build it yourself from source. To use this executable, add the following to your .bashrc<br />
export PATH="/home/sfarina/opt/bin:$PATH"<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have git installed, make a separate branch for yourself as soon as you make a copy of the code; this way we can easily trade and track updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
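If the patch refuses to apply, ''git apply --check'' does a dry run without touching any files. Below is a self-contained rehearsal of the whole flow in a throwaway repository (the file name is made up; on glooscap you would use the real update.patch):<br />

```shell
# Self-contained illustration of the patch workflow, using a temporary repo.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
echo "line one" > example_mod.F90
git add example_mod.F90
git -c user.name=demo -c user.email=demo@example.com commit -qm "initial"
echo "line two" >> example_mod.F90
git diff > update.patch             # create a patch from the local change
git checkout -q -- example_mod.F90  # discard the change; only the patch remains
git apply --check update.patch      # dry run: confirm it applies cleanly
git apply update.patch              # apply for real
grep -q "line two" example_mod.F90 && echo "patch applied"
```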
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'' that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
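To split a full year, the call above can be wrapped in a small loop. This sketch only prints the IDL one-liners as a dry run; drop the echo to execute them, assuming your IDL build accepts the -e flag for batch statements (an assumption worth verifying):<br />

```shell
# Print one Bpch_Sep_Sal invocation per month of 2005 (dry run via echo).
for m in 01 02 03 04 05 06 07 08 09 10 11 12; do
  echo "idl -e \"Bpch_Sep_Sal,'ctm.bpch','ctm.$m.bpch',Tau0=nymd2tau(2005${m}01)\""
done
```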
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
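For reference, the YYY_run.TOMASXX convention makes both fields trivial to recover with shell parameter expansion. The parsing below is my illustration of the naming scheme, not necessarily how plotCNCCN.py implements it:<br />

```shell
dir="003_run.TOMAS15"      # hypothetical directory following YYY_run.TOMASXX
run="${dir%%_*}"           # strip from the first underscore  -> run number
ver="${dir##*TOMAS}"       # strip through "TOMAS"            -> TOMAS version
echo "run=$run version=$ver"
```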
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, please ask. I am usually able to respond to emails within a day, and am willing to use Gchat or Skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farinahttps://wiki.seas.harvard.edu/geos-chem/index.php?title=User_talk:Salvatore_Farina&diff=13726User talk:Salvatore Farina2013-07-25T21:40:15Z<p>Salvatore Farina: /* Processing */</p>
<hr />
<div>This page describes how to acquire the latest source code, data, and libraries required to build and run GEOS-Chem with TOMAS microphysics on the ace-net glooscap cluster.<br />
<br />
== Overview ==<br />
<br />
The latest public release of GEOS-Chem with TOMAS does not include many of the recent developments in aerosol science. It also cannot take advantage of parallel computing technologies.<br />
Lately I've been working on the "Bleeding Edge" code to address these issues. Here's a guide that should help you get started if you're using the glooscap cluster.<br />
<br />
== Getting Set Up ==<br />
<br />
=== Code ===<br />
You can grab the absolute latest code from my source directory on glooscap:<br />
cp -r /home/sfarina/source/GC_Bleeding_Edge/ ~<br />
<br />
or, (safer) you can grab my latest "snapshot"<br />
cp /home/sfarina/source/GC_BE_snapshot-latest.tgz .<br />
<br />
=== Libraries ===<br />
'''geos-chem-libraries-intel11''' is a bundle of software required to build and run the latest version of GEOS-Chem.<br />
Included in this package:<br />
* Intel Ifort Fortran compiler - v11.1 - required to build geoschem<br />
* NetCDF - Network Common Data Format libraries - required to read and write certain datasets<br />
* HDF5 - Hierarchical Data Format - required to read and write certain datasets<br />
* other dependencies - required for netcdf and hdf5<br />
<br />
You can copy this folder as a tarball from /home/sfarina/gclibs.tgz or simply extract it directly to your home directory:<br />
cd ~<br />
tar zxvf /home/sfarina/gclibs.tgz<br />
<br />
This will extract the libraries folder to your home directory.<br />
<br />
=== Environment ===<br />
In order to get the compiler to run and recognize the libraries described above, some environment variables must be set. Below is an excerpt from my ''.bashrc''.<br />
<br />
ROOT_LIBRARY_DIR="/home/sfarina/geos-chem-libraries-intel11"<br />
GC_BIN=$ROOT_LIBRARY_DIR/bin<br />
GC_INCLUDE=$ROOT_LIBRARY_DIR/include<br />
GC_LIB=$ROOT_LIBRARY_DIR/lib<br />
export GC_BIN<br />
export GC_INCLUDE<br />
export GC_LIB<br />
<br />
export FC="ifort"<br />
<br />
export LD_LIBRARY_PATH="/home/sfarina/geos-chem-libraries-intel11/lib"<br />
export PATH="/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64:/home/sfarina/opt/bin:$PATH"<br />
export LD_LIBRARY_PATH="/usr/local/gnu/lib64:/usr/local/gnu/lib:/home/sfarina/geos-chem-libraries-intel11/lib:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/lib/intel64/:/home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/idb/lib/intel64"<br />
export INTEL_LICENSE_FILE="/home/sfarina/geos-chem-libraries-intel11/software/intel/Compiler/11.1/080/Licenses"<br />
source /home/sfarina/geos-chem-libraries-intel11/Compiler/11.1/080/bin/ifortvars.sh intel64<br />
<br />
ulimit -S -s unlimited<br />
<br />
If you are using bash, you can copy/paste this to your ''.bashrc''. Once the compiler and libraries are installed in ''~/geos-chem-libraries-intel11'', change instances of ''sfarina'' to your username.<br />
source ~/.bashrc<br />
ifort --version<br />
<br />
If ifort returns<br />
ifort (IFORT) 11.1 20101201<br />
you should be all set to start compiling.<br />
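A small helper can turn that eyeball check into something scriptable. The expected version string comes from the text above; the helper name and its messages are illustrative, not official:<br />

```shell
# Classify the first line of `ifort --version` output.
classify() {
  case "$1" in
    *"(IFORT) 11.1"*) echo "ok" ;;
    "")               echo "missing" ;;
    *)                echo "unexpected" ;;
  esac
}

classify "ifort (IFORT) 11.1 20101201"                # the good case
classify "$(ifort --version 2>/dev/null | head -n1)"  # check your own shell
```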
<br />
=== Data ===<br />
To set up the necessary data for GEOS-Chem, simply<br />
cd ~<br />
ln -s /home/sfarina/data .<br />
<br />
This will allow you to link to my data directory, which is mostly a collection of links to the data at ''/home/rmartin/group/ctm/'' with some changes due to recent GC development.<br />
'''DO NOT''' copy this directory, as it is many many many gigabytes, and is probably beyond your disk quota on glooscap.<br />
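After creating the link, a quick existence check can save confusion later. The path matches the ln -s above; the messages are mine:<br />

```shell
# Confirm the data symlink exists and resolves to a real directory.
link="$HOME/data"                   # created by the ln -s above
if [ -d "$link" ]; then
  echo "data link OK"
else
  echo "data link missing or dangling: $link"
fi
```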
<br />
== Building GEOS-Chem/TOMAS ==<br />
<br />
=== Compiler ===<br />
Please note that the '''ONLY VERSION''' of the intel compiler which reliably compiles a working executable of geos-chem with TOMAS is version 11.1.<br />
Installation is described above in the libraries section.<br />
<br />
=== Make ===<br />
Glooscap allows you to use multicore interactive shells to do heavy processing. I invoke a 16-core shell to build GEOS-Chem. Put these aliases in your .bashrc:<br />
alias pshell16="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_16 -pe openmp 16 bash"<br />
alias pshell8="qrsh -V -cwd -l h_rt=08:00:00 -l h_vmem=2.0G -l h_stack=12.5G -N IA_8 -pe openmp 8 bash"<br />
<br />
Then you can do<br />
cd YOUR_CODE_DIR/GC_Bleeding_Edge/GeosCore<br />
pshell16<br />
make -j16 tomas40<br />
<br />
This will build GEOS-Chem with 40-bin TOMAS using 16 processors at a time. As an added bonus, this will not choke up the rest of the users on glooscap.<br />
<br />
The available target names are:<br />
tomas <--TOMAS 30<br />
tomas12<br />
tomas15<br />
tomas40<br />
<br />
==== Important! ====<br />
When changing tomas versions, always always always do<br />
make realclean<br />
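One way to make the realclean habit automatic is a tiny wrapper function (a hypothetical helper, not part of the build system; setting MAKE=echo gives a dry run that just prints the commands):<br />

```shell
# Always realclean before building, so stale objects from another TOMAS
# version can't leak into the new executable.
build_tomas() {
  target="${1:-tomas}"                 # tomas | tomas12 | tomas15 | tomas40
  ${MAKE:-make} realclean && ${MAKE:-make} -j16 "$target"
}

MAKE=echo build_tomas tomas40          # dry run; prints the two commands
```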
<br />
== Running GEOS-Chem with TOMAS ==<br />
<br />
=== Run Directories ===<br />
There are run directories for each of the tomas versions at:<br />
/net/samqfs/pierce/sfarina/standard_run_directories/<br />
<br />
Copy the tarballs (named 40.tgz, 30.tgz, etc.) to a standard location. You can then do<br />
tar zxvf YOUR_STANDARD_LOCATION/40.tgz<br />
to extract the appropriate run directory to your current working directory. The folder will be named ''run.TOMASXX'', where ''XX'' is 12, 15, 30, or 40 depending on the version you would like to run.<br />
<br />
Once you have the appropriate version of geostomas compiled and your run directory extracted, copy the executable to your run directory.<br />
<br />
=== input.geos ===<br />
The input.geos file is where most of the runtime options for GEOS-Chem are configured.<br />
There are currently no TOMAS-specific entries in the input.geos file, save for diagnostic output quantities.<br />
Please see the [http://acmg.seas.harvard.edu/geos/doc/man/chapter_5.html#5.2.1 Users' Guide] for more information.<br />
<br />
=== Submitting Jobs to the Parallel Queue ===<br />
In each folder is a file called ''parallel.sh''. Below is a description of some of the parameters:<br />
#!/bin/bash<br />
#$ -S /bin/bash<br />
. /etc/profile<br />
#$ -o job_output<br />
#$ -l h_rt=100:00:00 #wall clock time requested from grid engine. Lower request times will have higher priority in the queue<br />
#$ -l h_vmem=2.0G #vmem requested from grid engine. 2.0 is sufficient for all versions at 4x5 and TOMAS15 at 2x2.5 on 16 cores<br />
#$ -l h_stack=12.5G #stack memory requested from grid engine<br />
#$ -N RUN_NAM #a name for your run<br />
#$ -pe openmp 16 #number of cores you are requesting from grid engine<br />
#$ -cwd #inherit properties from your current shell<br />
export OMP_NUM_THREADS=16 #number of openMP threads<br />
export KMP_STACKSIZE=500000000 #stacksize memory limit for each thread<br />
<br />
ulimit -t unlimited # cputime<br />
ulimit -f unlimited # filesize<br />
ulimit -c unlimited # coredumpsize<br />
ulimit -m unlimited # memoryuse<br />
ulimit -l unlimited # memorylocked<br />
<br />
cd YOUR_RUN_DIRECTORY<br />
./geostomas > log<br />
<br />
You'll need to edit it slightly (run name and working directory), then run:<br />
qsub parallel.sh<br />
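If you want the job ID for later scripting (for example, a qdel), the Grid Engine submission message can be parsed. The reply format shown is typical SGE output and the parsing is my sketch; the stand-in line replaces a live qsub call:<br />

```shell
# qsub normally replies: Your job 12345 ("RUN_NAM") has been submitted
# Stand-in below; on glooscap use: reply=$(qsub parallel.sh)
reply='Your job 12345 ("RUN_NAM") has been submitted'
jobid=$(echo "$reply" | awk '{print $3}')
echo "submitted job $jobid"        # e.g. later: qdel "$jobid"
```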
<br />
You can check on the status in the queue with<br />
qstat<br />
<br />
You can watch the logfile output of your simulation with<br />
tail -f log<br />
<br />
With some minimal editing, you can find some summary information from your runs using the script here<br />
/net/samqfs/pierce/sfarina/testruns/informed/hourstat.sh<br />
<br />
== Developing ==<br />
Writing code for GEOS-Chem is pretty straightforward. Please try to follow the [http://acmg.seas.harvard.edu/geos/doc/man/appendix_7.html style guide] as much as possible. Most of TOMAS is contained within tomas_mod.F90, and you should be able to find what you need with a little work and a few invocations of ''grep''. If you can't find what you need, ask.<br />
<br />
=== Version Control ===<br />
Git! You should definitely use [http://git-scm.com/ git] to track your changes. I have a copy built/installed at /home/sfarina/opt/bin/git that you can probably either copy or just use.<br />
==== Setup ====<br />
I have a copy of git installed at<br />
/home/sfarina/opt/bin<br />
You can either use this executable or build it yourself from source. To use this executable, add the following to your .bashrc<br />
export PATH="/home/sfarina/opt/bin:$PATH"<br />
<br />
==== Branching and Commits ====<br />
<br />
Once you have git installed, make a separate branch for yourself as soon as you make a copy of the code; this way we can easily trade and track updates, advances, and bugfixes.<br />
git checkout -b MY_NEW_BRANCH<br />
vi fictional_example_mod.F90<br />
git status<br />
git add fictional_example_mod.F90<br />
git commit<br />
<br />
==== Patching ====<br />
If I make some new changes to my branch of code, you will need to do a patch and merge. My current branch in git is called '''tomasmerge'''. If I provide you with '''update.patch''', this should do the trick:<br />
git checkout tomasmerge<br />
git apply update.patch<br />
git checkout MY_BRANCH<br />
git merge tomasmerge<br />
<br />
==== Reference ====<br />
There are many useful resources for git on the web. Here are some I found useful:<br />
* [http://git-scm.com/book/en/Git-Branching-Basic-Branching-and-Merging Branching and Merging]<br />
* [http://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/ Creating and Applying Patches]<br />
* [http://lostechies.com/joshuaflanagan/2010/09/03/use-gitk-to-understand-git/ Understanding git through gitk]<br />
<br />
=== Debugging ===<br />
There are two major ways of debugging: inserting massive amounts of print statements, or using a debugger. Both are useful.<br />
<br />
ifort comes with a debugger similar to gdb: iidb.<br />
geos-chem-libraries-intel11/Compiler/11.1/080/bin/intel64/iidb<br />
In order to use it, you must compile geostomas as follows<br />
make realclean<br />
make DEBUG=yes tomas<br />
<br />
Apart from the debugger and normal print statements, TOMAS has a very useful built-in called ''DEBUGPRINT'' that prints the values of the TOMAS size bins in a big table.<br />
<br />
== Post Processing ==<br />
Now that you've successfully run the model, there are a few more hurdles before you can inspect your data.<br />
<br />
=== Installing IDL ===<br />
Copy the IDL / gamap scripts from my home directory.<br />
cp -r ~sfarina/IDL ~<br />
<br />
Edit the following as needed, and add it to your .bashrc<br />
IDL_STARTUP="/home/sfarina/IDL/idl_startup/idl_startup.pro"<br />
IDL_DIR="/usr/local/itt/idl/idl80/"<br />
IDL_PATH="$IDL_DIR:/home/sfarina/IDL"<br />
module load idl/8.0<br />
<br />
=== Processing ===<br />
GEOS-Chem currently outputs all data in the form of a binary punch file (.bpch). These files must be handled using IDL. The process is outlined below:<br />
<br />
==== Copy ====<br />
Copy the relevant files to your postprocessing directory for a given run<br />
ctm.bpch<br />
diaginfo.dat<br />
tracerinfo.dat<br />
proc_one.pro<br />
averageCNCCN_XX.py <-- XX is TOMAS version<br />
plotCNCCN.py<br />
<br />
==== Split ====<br />
Use the script Bpch_Sep_Sal interactively from within the IDL environment to split ctm.bpch into separate months.<br />
For example, to extract August 2005 from ctm.bpch:<br />
idl<br />
> Bpch_Sep_Sal,'ctm.bpch','ctm.08.bpch',Tau0=nymd2tau(20050801) <br />
> exit<br />
<br />
==== Create netcdf output ====<br />
Using the IDL script proc_one.pro, we extract information from the monthly .bpch files and save it in standard netCDF format.<br />
Edit proc_one.pro to use the correct infile/outfiles<br />
Execute proc_one from your shell:<br />
idl proc_one.pro<br />
<br />
==== Counting CN and CCN ====<br />
Run averageCNCCN_XX.py, where XX is the model version<br />
For example, to bin and average the August results from TOMAS15: <br />
./averageCNCCN_15.py 08<br />
<br />
==== Plotting the Results====<br />
Edit your directory name to be of the format YYY_run.TOMASXX, where YYY is a run number, and XX is the TOMAS version.<br />
plotCNCCN.py will automatically detect the model version and customize map names.<br />
To plot the surface and zonal average concentrations of CN3, CN10, CN40, and CN80 for August:<br />
./plotCNCCN.py 08<br />
<br />
Once you have completed this process, you will have a zonal and surface level map of CN3, CN10, CN40 and CN80 predicted by the model.<br />
<br />
==== NCview ====<br />
You can also use ncview on the file ctm.nc to view individual species concentrations or nucleation rates.<br />
ncview ctm.nc<br />
ncview ctm_nuc.nc<br />
<br />
== Other Advice / Issues==<br />
* If you have followed these instructions and GEOS-Chem crashes without any output, try (un)commenting the ''"welcome to geoschem"'' and the following ''call flush'' lines in main.F<br />
* I use the GNU Bourne Again SHell (bash). I suggest you do the same. The csh is fine, but I have written all of my scripts using bash. Your life will probably be easier if you use bash.<br />
* It is a good idea to TAKE NOTES on the details of your simulations.<br />
* Making a backup of your code and any important files is a good idea. Making two backups is a better idea.<br />
* If you have any questions or are running into trouble, please ask. I am usually able to respond to emails within a day, and am willing to use Gchat or Skype if need be.<br />
<br />
--[[User:Salvatore Farina|Salvatore Farina]] 17:28, 25 July 2013 (EDT)</div>Salvatore Farina