Setting up GEOS-Chem nested grid simulations

On this page we provide information about setting up the GEOS-Chem nested grid simulations.

GEOS-Chem 12.4.0 and later

In GEOS-Chem 12.4.0, the capability to define custom grids via FlexGrid was introduced. The new feature greatly facilitates the creation of custom nested grids by taking advantage of HEMCO's regridding and cropping capabilities. The steps for setting up a nested grid simulation using FlexGrid are described below.

Run a global simulation to create boundary conditions

Create a run directory for your global 4° x 5° or 2° x 2.5° simulation.

NOTE: There is no "preferred" resolution for generating boundary conditions. More often than not, the choice of global simulation resolution will depend on the type of nested-grid simulation that you wish to run.

For example, saving boundary condition output from a 4° x 5° standard full chemistry simulation will take much less time than from a 2° x 2.5° simulation. Also the netCDF file sizes from a 4° x 5° simulation are smaller than at 2° x 2.5°, which may matter to you if you have limited disk space available.

On the other hand, if you are setting up a nested-grid simulation for one of the specialty simulations (such as CH4 or Hg), then you may want to generate 2° x 2.5° boundary condition output. These specialty simulations run much faster than the full-chemistry simulations and carry fewer species, resulting in more manageable file sizes.

In the HISTORY.rc file, turn on (remove the comment character # from) the BoundaryConditions collection.

COLLECTIONS: 'Restart',
             'SpeciesConc',
             'Budget',
             'AerosolMass',
             'Aerosols',
             'CloudConvFlux',
             'ConcAfterChem',
             'DryDep',
             'JValues',
             'JValuesLocalNoon',
             'LevelEdgeDiags',      
             'ProdLoss',
             'StateChm',     
             'StateMet',      
             'WetLossConv',
             'WetLossLS',
             'Transport',
             'BoundaryConditions',
::

The BoundaryConditions collection by default will save out instantaneous concentrations of advected species every three hours to daily files. You may change those settings by modifying the BoundaryConditions collection section in the HISTORY.rc file. At the time of the FlexGrid implementation in GEOS-Chem 12.4.0, only global boundary condition files could be saved out. However, in GEOS-Chem 12.5.0 and later, you can save out regional BC files using the subset option (specified as lonMin lonMax latMin latMax) shown in the example below. Using the subset option will greatly reduce the file size of your boundary condition files.

#==============================================================================
# %%%%% THE BoundaryConditions COLLECTION %%%%%
#
# GEOS-Chem boundary conditions for use in nested grid simulations
#
# Available for all simulations
#==============================================================================
  BoundaryConditions.template:   '%y4%m2%d2_%h2%n2z.nc4',
  BoundaryConditions.format:     'CFIO',
  BoundaryConditions.frequency:  00000000 030000
  BoundaryConditions.duration:   00000001 000000
  BoundaryConditions.mode:       'instantaneous'
  BoundaryConditions.LON_RANGE:  -130.0 -60.0,
  BoundaryConditions.LAT_RANGE:  10.0 60.0,
  BoundaryConditions.fields:     'SpeciesBC_?ADV?             ', 'GIGCchem',
::

NOTE: The .LON_RANGE and .LAT_RANGE options were added in GEOS-Chem 12.6.0. For subsetting with versions 12.4.0 and 12.5.0, use BoundaryConditions.subset: -130.0 -60.0 10.0 60.0 instead, as illustrated below.
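
In those older versions, a minimal sketch is to replace the two .LON_RANGE/.LAT_RANGE lines in the collection above with a single subset line covering the same domain (exact spacing and punctuation may vary slightly by version):

  BoundaryConditions.subset:     -130.0 -60.0 10.0 60.0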

Change any other run configurations as needed. If you do not need the output from your global simulation, you may choose to turn off most of the diagnostic output in HISTORY.rc and HEMCO_Diagn.rc -- we recommend keeping the SpeciesConc collection on in HISTORY.rc as a sanity check. Compile GEOS-Chem and run your global simulation as usual. You should see files named GEOSChem.BoundaryConditions.YYYYMMDD_0000z.nc4 (where YYYYMMDD are replaced by the simulation date) begin to appear in your run directory as your simulation runs.
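
As a quick sanity check while (or after) the global simulation runs, you can list the boundary condition files and inspect one with ncdump (a standard netCDF utility). The run directory path and the example date below are placeholders:

  cd /path/to/global_run_directory                   # placeholder: your global run directory
  ls -lh GEOSChem.BoundaryConditions.*.nc4           # one file per simulation day
  # The header should list SpeciesBC_* variables for all advected species,
  # with one time slice every 3 hours (8 per daily file by default).
  ncdump -h GEOSChem.BoundaryConditions.20160701_0000z.nc4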

--Melissa Sulprizio (talk) 12:46, 22 October 2019 (UTC)
--Bob Yantosca (talk) 18:53, 13 July 2020 (UTC)

Set up your nested grid run directory

Create a nested grid run directory from the Unit Tester as usual. There are a few sample nested grid run directories provided in the unit tester. If your nested domain and/or simulation type is not included, we recommend creating a global run directory and modifying the input files as described below.

In the input.geos file, modify the settings in the Grid Menu for your region. Please see the FlexGrid wiki page for example settings for the pre-existing nested grid domains. The example for the 0.25° x 0.3125° North America domain is provided below.

%%% GRID MENU %%%       :
Grid resolution         : 0.25x0.3125 
Longitude min/max       : -130.0 -60.0
Latitude  min/max       :   9.75  60.0
 Half-sized polar boxes?: F
Number of levels        : 47
Nested grid simulation? : T
 Buffer zone (N S E W ) :  3  3  3  3
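
To illustrate a custom (non-standard) domain, the hypothetical Grid Menu below sketches a 0.5° x 0.625° grid over South Asia; the longitude/latitude bounds are illustrative values chosen for this example, not an officially supported domain:

%%% GRID MENU %%%       :
Grid resolution         : 0.5x0.625
Longitude min/max       :   60.0 100.0
Latitude  min/max       :    0.0  40.0
 Half-sized polar boxes?: F
Number of levels        : 47
Nested grid simulation? : T
 Buffer zone (N S E W ) :  3  3  3  3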

In HEMCO_Config.rc, make sure the GC_BCs option is set to true and update the BC entry to point to your boundary condition files.

# ExtNr ExtName                on/off  Species 
0       Base                   : on    *
# ----- RESTART FIELDS ----------------------
    --> GC_RESTART             :       true     
    --> GC_BCs                 :       true
    --> HEMCO_RESTART          :       true

...

#==============================================================================
# --- GEOS-Chem boundary condition file ---
#==============================================================================
(((GC_BCs
* BC_  $ROOT/SAMPLE_BCs/v2019-05/tropchem/GEOSChem.BoundaryConditions.$YYYY$MM$DD_$HH$MNz.nc4 SpeciesBC_?ADV?  1980-2019/1-12/1-31/0-23 RFY xyz 1 * - 1 1
)))GC_BCs
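
For example, after running the global simulation described above, the BC_ entry might be edited along these lines; the directory path is a placeholder for wherever your GEOSChem.BoundaryConditions files were written, and the remaining fields are unchanged from the default entry:

(((GC_BCs
* BC_  /path/to/global_run_directory/GEOSChem.BoundaryConditions.$YYYY$MM$DD_$HH$MNz.nc4 SpeciesBC_?ADV?  1980-2019/1-12/1-31/0-23 RFY xyz 1 * - 1 1
)))GC_BCs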

Change any other run configurations as needed. Diagnostics that aren't needed may be turned off in HISTORY.rc and HEMCO_Diagn.rc to save disk space. As of GEOS-Chem 12.4.0, you are not required to recompile GEOS-Chem when changing grids. You can therefore copy the geos or geos.mp executable from your global simulation and run your nested grid simulation as usual (see the sketch below). If you encounter any errors, please see our Guide to GEOS-Chem error messages.
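
A minimal sketch of that step, assuming an OpenMP (geos.mp) build and placeholder directory names:

  # Reuse the executable built for the global simulation
  cp /path/to/global_run_directory/geos.mp /path/to/nested_run_directory/
  cd /path/to/nested_run_directory
  export OMP_NUM_THREADS=8          # set the OpenMP thread count for your machine
  ./geos.mp | tee GC.log            # run the nested simulation and keep a log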

--Melissa Sulprizio (talk) 16:36, 12 July 2019 (UTC)

GEOS-Chem 12.3.2 and earlier

0.5x0.625 nested-grid simulations with GEOS-FP or MERRA-2

Create boundary conditions at coarse resolution

Run a 2x2.5 or 4x5 global simulation to save the boundary condition files. A single global simulation can output boundary conditions for multiple nested-grid regions. These regions include North America, Europe, China, and/or a custom region. In the boundary condition run directory (compiled for 2x2.5 or 4x5), change the NESTED GRID MENU in input.geos accordingly. Take the 2x2.5 global simulation as an example:

%%% NESTED GRID MENU %%%:
Save TPCORE BC's        : T                  | Output boundary conditions?
Input BCs at 2x2.5?     : T                  | Are input BCs at 2 x 2.5? (see note below)
Over North America?     : T                  | Output NA BCs
TPCORE NA BC directory  : BC_2x25_NA/        | NA BC output directory
Over Europe?            : T                  | Output EU BCs
TPCORE EU BC directory  : BC_2x25_EU/        | EU BC output directory
Over China?             : F                  | Output CH BCs (for GEOS-5, GEOS-FP)
TPCORE CH BC directory  : BC_2x25_CH/        | CH BC output directory
Over Asia region?       : T                  | Output AS BCs (for MERRA-2)
TPCORE AS BC directory  : BC_2x25_AS/        | AS BC output directory
Over Custom Region?     : F                  | Output custom region BC's
TPCORE BC directory     : BC_2x25/           | Custom BC output directory
BC timestep [min]       : 180                | Frequency of BC output/input
LL box of BC region     :   9  26            | Custom region definition
UR box of BC region     :  29  41            | Custom region definition
1x1 offsets I0_W, J0_W  :   3   3   3   3    | Nested simulation edge width forced to BC's

NOTE: The option Input BCs at 2x2.5? should be set to T for 2x2.5 global simulations used to save out boundary conditions. If you do not set this option to T for 2x2.5 simulations, the BC files will not have the correct dimensions. This option should be set to F for 4x5 global simulations.

After the global simulation, the BC.YYYYMMDD files are then saved within the corresponding boundary condition directory.

Supported MERRA-2 nested-grid domains:

  • AS (Asia)
  • EU (Europe)
  • NA (North America)

For the three nested-grid domains listed above, LL and UR boxes are defined inside the code and there is no need to define them in input.geos.

You need to set the LL and UR box of the BC region in input.geos ONLY IF you are running the nested-grid simulation over a custom region. Otherwise, the LL and UR box indices will not affect your simulation.

The first number in the LL and UR box refers to the longitude index of the global grid and the second number is the latitude index. As such, these indices differ between the 2x2.5 and 4x5 global grids.

Take the NA domain (10N-70N, 140W-40W) as an example (even though its LL and UR boxes do not actually need to be defined in input.geos): on the 4x5 global grid, the LL box of the BC region has index [9, 26] (corresponding to the grid box at [140W, 10N]), and the UR box has index [29, 41] (corresponding to the grid box at [40W, 70N]).

LL box of BC region     :   9  26            | Custom region definition, grid box of 140W, 10N in 4x5 resolution global scale
UR box of BC region     :  29  41            | Custom region definition, grid box of 40W, 70N in 4x5 resolution global scale
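
As a rough illustration of where these indices come from, the sketch below recomputes them with shell arithmetic, assuming the 4x5 global grid has longitude centers starting at 180W in 5-degree steps (I = 1) and latitude centers at 86S in 4-degree steps after the half-sized polar box (J = 2):

  # Lower-left corner of the NA domain: 140W, 10N
  lon=-140; lat=10
  echo "I = $(( (lon + 180) / 5 + 1 ))"    # -> I = 9
  echo "J = $(( (lat + 86)  / 4 + 2 ))"    # -> J = 26
  # The upper-right corner (40W, 70N) gives I = 29, J = 41 by the same formulas.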

Set up and run the nested 0.5 x 0.625 simulation

v10-01 and later versions

Create a nested-grid run directory from the GEOS-Chem Unit Tester.

We recommend compiling GEOS-Chem from within the run directory. All run directories downloaded from the GEOS-Chem Unit Tester come with a router Makefile that enables you to easily compile and run GEOS-Chem from the run directory. Using this feature enables you to eliminate certain compile options from the make command that are always the same for a given run directory (e.g. MET, GRID, and NEST) and thereby avoid common mistakes. It also creates and stores compile logs locally for easy reference.

When creating a run directory from the GEOS-Chem Unit Tester, the input.geos and HEMCO_Config.rc file will automatically be set up for the nested grid domain that you specified.

Make sure that the auxiliary data for the nested simulation are already downloaded from the Harvard or Dalhousie data archives. See the HEMCO data directories page for more information.

You are now ready to start your nested-grid simulation. GEOS-Chem output will be saved to the trac_avg BPCH file.

Settings in input.geos

In the 0.5x0.625 run directory, the following changes in input.geos are mandatory.

For China Region:

Root data directory     : GEOS_0.5x0.625_AS/
Global offsets I0, J0   : 384 158
Transport Timestep [min]: 10
Convect Timestep [min]  : 10
Emission Timestep [min] : 20
Chemistry Timestep [min]: 20

For North America Region:

Root data directory     : GEOS_0.5x0.625_NA/
Global offsets I0, J0   : 64 200
Transport Timestep [min]: 10
Convect Timestep [min]  : 10
Emission Timestep [min] : 20
Chemistry Timestep [min]: 20

For Europe Region:

Root data directory     : GEOS_0.5x0.625_EU/
Global offsets I0, J0   : 240 240
Transport Timestep [min]: 10
Convect Timestep [min]  : 10
Emission Timestep [min] : 20
Chemistry Timestep [min]: 20

Make sure that the auxiliary data for the nested simulation are already downloaded from the Harvard or Dalhousie data archives.

  • For CH: N/A
  • For NA: N/A
  • For EU: N/A

NOTE: The BPCH data files for the MERRA-2 0.5x0.625 nested grids are not available because these nested grids were implemented in GEOS-Chem v11-01 and utilize HEMCO. If you would like to run an older model version with the MERRA-2 nested grids, then you will need to regrid the BPCH input files.

You are now ready to start your nested-grid simulation.

--Melissa Sulprizio (talk) 22:28, 5 September 2017 (UTC)

0.25x0.3125 nested-grid simulations with GEOS-FP

Create boundary conditions at coarse resolution

Run a 2x2.5 or 4x5 global simulation to save the boundary condition files. A single global simulation can output boundary conditions for multiple nested-grid regions. These regions include North America, Europe, China, and/or a custom region. In the boundary condition run directory (compiled for 2x2.5 or 4x5), change the NESTED GRID MENU in input.geos accordingly. Take the 2x2.5 global simulation as an example:

%%% NESTED GRID MENU %%%:
Save TPCORE BC's        : T                  | Output boundary conditions?
Input BCs at 2x2.5?     : T                  | Are input BCs at 2 x 2.5? (see note below)
Over North America?     : T                  | Output NA BCs
TPCORE NA BC directory  : BC_2x25_NA/        | NA BC output directory
Over Europe?            : T                  | Output EU BCs
TPCORE EU BC directory  : BC_2x25_EU/        | EU BC output directory
Over China?             : T                  | Output CH BCs (for GEOS-5, GEOS-FP)
TPCORE CH BC directory  : BC_2x25_CH/        | CH BC output directory
Over Asia region?       : F                  | Output AS BCs (for MERRA-2)
TPCORE AS BC directory  : BC_2x25_AS/        | AS BC output directory
Over Custom Region?     : F                  | Output custom region BC's
TPCORE BC directory     : BC_2x25/           | Custom BC output directory
BC timestep [min]       : 180                | Frequency of BC output/input
LL box of BC region     :   9  26            | Custom region definition
UR box of BC region     :  29  41            | Custom region definition
1x1 offsets I0_W, J0_W  :   3   3   3   3    | Nested simulation edge width forced to BC's

NOTE: The option Input BCs at 2x2.5? should be set to T for 2x2.5 global simulations used to save out boundary conditions. If you do not set this option to T for 2x2.5 simulations, the BC files will not have the correct dimensions. This option should be set to F for 4x5 global simulations.

After the global simulation, the BC.YYYYMMDD files are then saved within the corresponding boundary condition directory.

Supported GEOS-FP nested-grid domains:

  • CH (China)
  • EU (Europe)
  • NA (North America)

For the three nested-grid domains listed above, LL and UR boxes are defined inside the code and there is no need to define them in input.geos.

You need to set the LL and UR box of the BC region in input.geos ONLY IF you are running the nested-grid simulation over a custom region. Otherwise, the LL and UR box indices will not affect your simulation.

The first number in the LL and UR box refers to the longitude index of the global grid and the second number is the latitude index. As such, these indices differ between the 2x2.5 and 4x5 global grids.

Take the NA domain (10N-70N, 140W-40W) as an example (even though its LL and UR boxes do not actually need to be defined in input.geos): on the 4x5 global grid, the LL box of the BC region has index [9, 26] (corresponding to the grid box at [140W, 10N]), and the UR box has index [29, 41] (corresponding to the grid box at [40W, 70N]).

LL box of BC region     :   9  26            | Custom region definition, grid box of 140W, 10N in 4x5 resolution global scale
UR box of BC region     :  29  41            | Custom region definition, grid box of 40W, 70N in 4x5 resolution global scale

Set up and run the nested 0.25 x 0.3125 simulation

v10-01 and later versions

Create a nested-grid run directory from the GEOS-Chem Unit Tester.

We recommend compiling GEOS-Chem from within the run directory. All run directories downloaded from the GEOS-Chem Unit Tester come with a router Makefile that enables you to easily compile and run GEOS-Chem from the run directory. Using this feature enables you to eliminate certain compile options from the make command that are always the same for a given run directory (e.g. MET, GRID, and NEST) and thereby avoid common mistakes. It also creates and stores compile logs locally for easy reference.

When creating a run directory from the GEOS-Chem Unit Tester, the input.geos and HEMCO_Config.rc file will automatically be set up for the nested grid domain that you specified.

Make sure that the auxiliary data for the nested simulation are already downloaded from the Harvard or Dalhousie data archives. See the HEMCO data directories page for more information.

You are now ready to start your nested-grid simulation. GEOS-Chem output will be saved to the trac_avg BPCH file.

v9-01-03 and earlier versions

Make a 0.25x0.3125 run directory by copying the 2x2.5 or 4x5 run directory. Make sure the correct 0.25x0.3125 restart file is available. If a restart file is not available, you can regrid a 2x2.5 or 4x5 restart file using the GAMAP routine REGRIDH_RESTART and crop to the nested domain using CREATE_NESTED. You can also try emailing the Nested Model Working Group (geos-chem-regional [at] seas.harvard.edu) to see if anyone can provide you with a restart file.

In the 0.25x0.3125 run directory, the following changes in input.geos are mandatory.

For China Region:

Root data directory     : GEOS_0.25x0.3125_CH/
Global offsets I0, J0   : 800 420
Transport Timestep [min]: 5
Convect Timestep [min]  : 5
Emission Timestep [min] : 10
Chemistry Timestep [min]: 10

For North America Region:

Root data directory     : GEOS_0.25x0.3125_NA/
Global offsets I0, J0   : 160 399
Transport Timestep [min]: 5
Convect Timestep [min]  : 5
Emission Timestep [min] : 10
Chemistry Timestep [min]: 10

For Europe Region:

Root data directory     : GEOS_0.25x0.3125_EU/
Global offsets I0, J0   : TBD
Transport Timestep [min]: 5
Convect Timestep [min]  : 5
Emission Timestep [min] : 10
Chemistry Timestep [min]: 10

Turn off emissions for any regions not covered in the nested simulation. The following EMISSIONS MENU options in input.geos are recommended.

For China Region:

%%% EMISSIONS MENU %%%  :
Turn on emissions?      : T
Emiss timestep (min)    : 10
Include anthro emiss?   : T
 => Scale to (1985-2005): -1
 => Use EMEP emissions? : F
 => Use BRAVO emissions?: F
 => Use EDGAR emissions?: T
 => Use STREETS emiss?  : T
 => Use CAC emissions?  : F
 => USE NEI2005 emiss?  : F
 => Use RETRO emiss?    : T
 => Use AEIC emissions? : T
Use RCP emiss (anth+bf)?: F
    => RCP scenario?    : RCP60
    => RCP year?        : 2000
Use EPA/NEI99 (anth+bf)?: F
    w/ ICARTT modif.?   : F
    w/ VISTAS NOx emis? : F
Include biofuel emiss?  : T
Include biogenic emiss? : T
 => Use MEGAN inventory?: T
 => Use PCEEA model?    : F
 => Use MEGAN for MONO? : T
 => Isoprene scaling    : 1
Include biomass emiss?  : T
 => Seasonal biomass?   : F
 => Scaled to TOMSAI?   : F
 => Use GFED2 biomass?  :---
    => monthly GFED2?   : F
    => 8-day GFED2?     : F
    => 3-hr GFED2?      : F
    => synoptic GFED2?  : F
 => Use GFED3 biomass?  :---
    => monthly GFED3?   : T 
    => daily GFED3?     : F 
    => 3-hr GFED3?      : F
Individual NOx sources  :---
 => Use RCP aircraft NOx: F
 => Use lightning NOx   : T
    => Spat-seas constr?: T
 => Use soil NOx        : T
    => soilNOx rst file?: restart.soilnox.YYYYMMDDhh
 => Use fertilizer NOx  : T
NOx scaling             : 1
Use ship emissions      :---
 => global EDGAR ?      : T
 => global ICOADS ?     : F
 => global RCP ?        : F
 => EMEP over EUROPE ?  : F
 => ship SO2 Corbett ?  : F
 => ship SO2 Arctas ?   : F
Use COOKE BC/OC (N. Am.): F
Use historical emiss?   : F
 => What decade?        : 2000
Bromine switches        :---
 => Use Warwick VSLS?   : T
 => Use seasalt Br2?    : T
 => 1ppt MBL BRO Sim.?  : F
 => Bromine scaling     : 1

For North America Region

%%% EMISSIONS MENU %%%  :
Turn on emissions?      : T
Emiss timestep (min)    : 10
Include anthro emiss?   : T
 => Scale to (1985-2005): -1
 => Use EMEP emissions? : F
 => Use BRAVO emissions?: T
 => Use EDGAR emissions?: T
 => Use STREETS emiss?  : F
 => Use CAC emissions?  : T
 => USE NEI2005 emiss?  : T
 => Use RETRO emiss?    : T
 => Use AEIC emissions? : T
Use RCP emiss (anth+bf)?: F
    => RCP scenario?    : RCP60
    => RCP year?        : 2000
Use EPA/NEI99 (anth+bf)?: F
    w/ ICARTT modif.?   : F
    w/ VISTAS NOx emis? : F
Include biofuel emiss?  : T
Include biogenic emiss? : T
 => Use MEGAN inventory?: T
 => Use PCEEA model?    : F
 => Use MEGAN for MONO? : T
 => Isoprene scaling    : 1
Include biomass emiss?  : T
 => Seasonal biomass?   : F
 => Scaled to TOMSAI?   : F
 => Use GFED2 biomass?  :---
    => monthly GFED2?   : F
    => 8-day GFED2?     : F
    => 3-hr GFED2?      : F
    => synoptic GFED2?  : F
 => Use GFED3 biomass?  :---
    => monthly GFED3?   : T 
    => daily GFED3?     : F 
    => 3-hr GFED3?      : F
Individual NOx sources  :---
 => Use RCP aircraft NOx: F
 => Use lightning NOx   : T
    => Spat-seas constr?: T
 => Use soil NOx        : T
    => soilNOx rst file?: restart.soilnox.YYYYMMDDhh
 => Use fertilizer NOx  : T
NOx scaling             : 1
Use ship emissions      :---
 => global EDGAR ?      : T
 => global ICOADS ?     : F
 => global RCP ?        : F
 => EMEP over EUROPE ?  : F
 => ship SO2 Corbett ?  : F
 => ship SO2 Arctas ?   : F
Use COOKE BC/OC (N. Am.): F
Use historical emiss?   : F
 => What decade?        : 2000
Bromine switches        :---
 => Use Warwick VSLS?   : T
 => Use seasalt Br2?    : T
 => 1ppt MBL BRO Sim.?  : F
 => Bromine scaling     : 1

For Europe Region:

%%% EMISSIONS MENU %%%  :
Turn on emissions?      : T
Emiss timestep (min)    : 10
Include anthro emiss?   : T
 => Scale to (1985-2005): -1
 => Use EMEP emissions? : T
 => Use BRAVO emissions?: F
 => Use EDGAR emissions?: T
 => Use STREETS emiss?  : F
 => Use CAC emissions?  : F
 => USE NEI2005 emiss?  : F
 => Use RETRO emiss?    : T
 => Use AEIC emissions? : T
Use RCP emiss (anth+bf)?: F
    => RCP scenario?    : RCP60
    => RCP year?        : 2000
Use EPA/NEI99 (anth+bf)?: F
    w/ ICARTT modif.?   : F
    w/ VISTAS NOx emis? : F
Include biofuel emiss?  : T
Include biogenic emiss? : T
 => Use MEGAN inventory?: T
 => Use PCEEA model?    : F
 => Use MEGAN for MONO? : T
 => Isoprene scaling    : 1
Include biomass emiss?  : T
 => Seasonal biomass?   : F
 => Scaled to TOMSAI?   : F
 => Use GFED2 biomass?  :---
    => monthly GFED2?   : F
    => 8-day GFED2?     : F
    => 3-hr GFED2?      : F
    => synoptic GFED2?  : F
 => Use GFED3 biomass?  :---
    => monthly GFED3?   : T 
    => daily GFED3?     : F 
    => 3-hr GFED3?      : F
Individual NOx sources  :---
 => Use RCP aircraft NOx: F
 => Use lightning NOx   : T
    => Spat-seas constr?: T
 => Use soil NOx        : T
    => soilNOx rst file?: restart.soilnox.YYYYMMDDhh
 => Use fertilizer NOx  : T
NOx scaling             : 1
Use ship emissions      :---
 => global EDGAR ?      : T
 => global ICOADS ?     : F
 => global RCP ?        : F
 => EMEP over EUROPE ?  : F
 => ship SO2 Corbett ?  : F
 => ship SO2 Arctas ?   : F
Use COOKE BC/OC (N. Am.): F
Use historical emiss?   : F
 => What decade?        : 2000
Bromine switches        :---
 => Use Warwick VSLS?   : T
 => Use seasalt Br2?    : T
 => 1ppt MBL BRO Sim.?  : F
 => Bromine scaling     : 1

Turn off the boundary condition output switch in the NESTED GRID MENU of input.geos.

Save TPCORE BC's        : F

Otherwise, the BC files saved from the global simulation will be overwritten.

To compile GEOS-Chem for the nested-grid simulation, navigate to your code directory and change the grid size in Headers/define.h as follows. Note that only one of NESTED_CH, NESTED_NA or NESTED_EU can be set during a single compilation.

!----- Grid sizes -----
!#define NESTED_CH     'NESTED_CH'
#define NESTED_NA     'NESTED_NA'
!#define NESTED_EU     'NESTED_EU'
!#define GRID05x0666   'GRID05x0666'
!#define GRID05x0625   'GRID05x0625'
#define GRID025x03125 'GRID025x03125'
!#define GRID1x1       'GRID1x1'
!#define GRID1x125     'GRID1x125'
!#define GRID2x25      'GRID2x25'
!#define GRID4x5       'GRID4x5'
#define GRIDREDUCED    'GRIDREDUCED'

After compiling, copy the geos executable file from the code directory to the 0.25x0.3125 run directory, as sketched below.
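
A minimal sketch of the compile-and-copy step, assuming placeholder paths for the code and run directories:

  cd /path/to/Code.v9-01-03                   # placeholder: your GEOS-Chem code directory
  make -j4                                    # build with the define.h switches set above
  cp geos /path/to/run.0.25x0.3125_NA/        # copy the executable into the nested run directory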

Make sure that the auxiliary data for the nested simulation are already downloaded from the Harvard or Dalhousie data archives.

You are now ready to start your nested-grid simulation. GEOS-Chem output will be saved to the ctm.bpch file.

--Melissa Sulprizio (talk) 22:28, 5 September 2017 (UTC)

Previous issues that are now resolved

Segmentation fault in v11-01

If you are setting up a nested-grid simulation with GEOS-Chem v11-01 (public release 01 Feb 2017), then you should be aware that there is a bug that will cause GEOS-5 simulations to die with a segmentation fault. This error was caused by a typo in module GeosCore/tpcore_window_mod.F90.

For a description of the fix, please see this wiki post.

--Bob Yantosca (talk) 20:45, 1 June 2017 (UTC)

Transport fix

Lin Zhang reported a problem with the transport for nested simulations with GEOS-5 met fields. The problem was traced back to the fact that neither the latest pressure fixer nor the new advection had been set up for use with nested grids. The problems in both were:

  • the use of polar caps for the first and last latitude bands of the nested domain, although there is no pole in the domain.
  • the use of periodicity both in longitude and latitude.

To solve these problems, we modified both modules so that variables are updated only for an inner window of the domain, which removes the need for periodicity. The buffer zone for the pressure fixer is equal to 3 grid cells in all directions (defined in INIT_PJC_PFIX in GeosCore/pjc_pfix_geos5_window_mod.f). The buffer zone for the transport is the zone where the boundary conditions are fixed (as defined at the end of the NESTED MENU in input.geos). Note that the buffer zone for the advection must be larger than the buffer zone for the pressure fixer.

The fix will be available in GEOS-Chem v8-03-02.

--Ccarouge 16:35, 4 August 2010 (EDT)

Jeff Pierce replied:

I applied this fix, and the RAM requirements were increased. It went from 11 GB to 14 GB. I am using 148 tracers, which is why the RAM requirement is already high. However, I'm guessing this will affect everyone to some degree.

--Bob Y. 13:10, 15 November 2010 (EST)

SMVGEAR and photolysis errors

Steve Yim wrote:

I have questions about the nested GEOS-Chem simulation. The simulation stopped with multiple lines of messages showing " Too many levels in photolysis code" and "TOO LOW DEC YFAC.", please see the attachment.
  Too many levels in photolysis code: need 33386 but NL dimensioned as  1500
  Too many levels in photolysis code: need 19542 but NL dimensioned as  1500
  ...
     ### CHEMDR: after FAST-J
    - PHYSPROC: Trop chemistry at 2006/12/17 18:00
  SMVGEAR: DELT= 4.15E-16 TOO LOW DEC YFAC. KBLK, KTLOOP, NCS, TIME, TIMREMAIN, YFAC, EPS =
   ****   24    1  3.696E+07 3.501E+03 1.000E+00 1.000E-01
  SMVGEAR: DELT= 4.56E-16 TOO LOW DEC YFAC. KBLK, KTLOOP, NCS, TIME, TIMREMAIN, YFAC, EPS =
   ****   24    1  3.696E+07 3.499E+03 1.000E-02 1.000E-01
  ...
  SMVGEAR: TOO MANY DECREASES OF YFAC  
I checked the GEOS-Chem wiki, which also shows this error. Then, according to this discussion on the GEOS-Chem wiki, I checked OPMIE.f; it is indeed set to 1500 now. In my log file of the simulation, the level needed in the photolysis code can be up to 46000, and even shows as ***** (Too many levels in photolysis code: need ***** but NL dimensioned as 1500). I think this value is just for global GEOS-Chem. Do you have any suggested value for nested GEOS-Chem?
For the "TOO LOW DEC YFAC." error, the output messages show that YFAC decreases from 1 to 1e-18, please see the attachment. This indicates that the time step is too small, even smaller than the minimum allowable value of DELT (time step < HMIN (1.0d-15 sec)) as shown in smvgear.f. Do you have any idea how to solve this problem?

Lin Zhang wrote:

I met the error before; it was due to wrong or improper boundary conditions. Increasing the NL level would not solve the error. Are you using the 4x5 boundary condition? I haven't met the error now with the 2x2.5 boundary condition.

--Bob Y. 12:56, 15 November 2010 (EST)

Monoterpene SOA zero bug fix

Jeffrey Pierce wrote:

I found a bug in 8.03.01 that makes monoterpene SOA zero in the nested model. This appears to affect both when SOA is done online and offline when MEGAN is being used.
The issue is that at other resolutions there are separate annual emission factors (AEF) for various monoterpenes ('APINE', 'BPINE', 'LIMON', 'SABIN', 'MYRCN', 'CAREN', 'OCIMN'); however, for 0.5x0.666 they are all lumped into a single AEF, 'MONOT'.
GET_EMMONOG_MEGAN, the MEGAN routine that scales monoterpene emissions for LAI, temperature, etc., does not recognise 'MONOT'; however, it does not give an error, it just doesn't emit any monoterpenes. I have fixed this by adding a case for MONOT in GET_EMMONOG_MEGAN and giving it the same LDF as alpha-pinene. I also needed to add a grid IF statement to GET_EMMONOT_MEGAN to feed MONOT to GET_EMMONOG_MEGAN.
I couldn't find any discussion of this on the wiki, so I don't think it has been fixed (though I haven't checked 9.1.1).
I figured I'd ask you guys before I get with the git.

Yuxuan Wang replied:

The bug in the nested-grid SOA simulation reported by Jeff is caused by the code not allowing the nested-grid simulation to regrid emissions on the fly.
This is due to historical reasons: the nested-grid code was introduced in the standard model before the regridding-on-the-fly capability became standard, so the nested-grid code directly read from input data regridded offline. We've corrected a few instances, but apparently the SOA code escaped our attention.
The bug can be solved simply by commenting out the nested-grid option as follows:
In megan_mod.f (the line numbers below refer to Code.v9-01-01):
Original Code:
  line 2716    #if   defined( GRID05x0666 ) 
  line 2717           CALL GET_AEF_05x0666     ! GEOS-5 nested grids only
  line 2718    #else
  line 2719           CALL GET_AEF             ! Global simulations
  line 2720    #endif  
Changed to:
  line 2716     !#if defined( GRID05x0666 ) 
  line 2717     !    CALL GET_AEF_05x0666      ! GEOS-5 nested grids only
  line 2718     !#else
  line 2719         CALL GET_AEF               ! Global and nested-grid simulations
  line 2720     !#endif
We tested the changes and it worked fine. We will put this bug fix on the wiki.

--Libao Chai 13:05, 24 May 2011 (EST)

Parallelization error in nested grid simulations

Description of the issue

Jintai Lin wrote:

I would like to report a parallelization bug in the nested Asia model (0.5x0.666) detected by the Unit Tester (see below). The bug relates to the outermost grid cells (see attached figure), likely associated with the tpcore transport. So far I have not identified the exact cause.
###############################################################################
### VALIDATION OF GEOS-CHEM OUTPUT FILES 
### In directory: geos5_05x0666_fullchem_ch 
### 
### File 1    : trac_avg.geos5_05x0666_fullchem_ch.2005070100.sp
### File 2    : trac_avg.geos5_05x0666_fullchem_ch.2005070100.mp
### Sizes     : IDENTICAL (607119524 and 607119524) 
### Checksums : DIFFERENT (4077605235 and 3378936865) 
### Diffs     : DIFFERENT 
### 
### File 1    : trac_rst.geos5_05x0666_fullchem_ch.2005070101.sp
### File 2    : trac_rst.geos5_05x0666_fullchem_ch.2005070101.mp
### Sizes     : IDENTICAL (199697128 and 199697128) 
### Checksums : DIFFERENT (2087982217 and 1808442322) 
### Diffs : DIFFERENT 
### 
### File 1    : soil_rst.geos5_05x0666_fullchem_ch.2005070101.sp
### File 2    : soil_rst.geos5_05x0666_fullchem_ch.2005070101.mp
### Sizes     : IDENTICAL (258536 and 258536) 
### Checksums : DIFFERENT (2433800300 and 79124321) 
### Diffs     : DIFFERENT 
###############################################################################

--Melissa Sulprizio 17:34, 10 March 2014 (EDT)

Solution

These updates were validated in the 1-month benchmark simulation v10-01d and approved on 03 Jun 2014.

Jintai Lin wrote:

I might have figured out a fix to the parallelization bug.
In tpcore_geos5_window_mod.F90, results at the northern and southern edges are averaged through the subroutine xpavg. It is not clear why this averaging would lead to the parallelization bug. However, there is actually no need to do the averaging at all. The averaging is meant to mitigate the transport problem at the two poles, but the nested model domains never reach the poles. Therefore we should turn off the averaging by commenting out all the calls to xpavg.
In addition, after the tracer transport, the tracer field q is updated (around line 819). However, the 3 grid cells at each of the southern and northern edges are also updated. This should be avoided, because transport at these edge grid cells is not meaningful.
Also, the tracer field is not initialized at the beginning of the parallelized do loop that does tracer transport. Therefore I added an initialization step.
Moreover, the variables MFLEW and MFLNS should be declared private in the OpenMP directive and also be initialized at the beginning of the do loop. This fix affects the ND24 and ND25 diagnostics. (Melissa also suggested this bug fix previously.)
Finally, it appears that fixing this bug also fixes another parallelization bug previously related to the ISORROPIA scheme. Specifically, when I fix this tpcore bug, the Unit Tester produces identical simulations with and without OpenMP parallelization, no matter whether ISORROPIA is turned on or off. I tested 4 days, from 2005/07/01 to 2005/07/05.
The same bug and fix applies to both geos5 (tpcore_geos5_window_mod.F90) and geosfp (tpcore_geosfp_window_mod.F90) transport.

--Melissa Sulprizio 11:28, 21 May 2014 (EDT)