- Hardware and Software Requirements
- Downloading Source Code
- Obtaining a Run Directory
- Setting Up the GCHP Environment
- Basic Example Run
- Run Configuration Files
- Advanced Run Examples
- Output Data
- Developing GCHP
GCHP works as a layer around GEOS-Chem, simulating the more complex environment of a full atmospheric general circulation model (AGCM). Most model updates involve editing GEOS-Chem source code just as you would for GEOS-Chem "classic" (GCC). However, certain updates, such as specifying output variables or adding new input fields, require development within the GCHP-specific source code. In addition, debugging will sometimes lead you into the MAPL source code. This page provides an overview of the code structure to help you navigate and debug GCHP.
High-level Execution of GEOS-Chem Classic
GCC consists primarily of a single, monolithic code base. When you run the GEOS-Chem executable geos, the main routine in GeosCore/main.F performs the following functions:
- Read in simulation settings from input.geos
- Set up arrays to hold data such as species concentrations and meteorological data
- Loop through the model operations (e.g. reading met fields, transport, emissions, chemistry, deposition, and diagnostics) at each timestep until the simulation is complete
Although the code for each of these functions is found in different files (e.g. chemistry_mod.F, transport_mod.F), all of the routines are called from main.F.
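As a rough illustration of this monolithic flow, here is a minimal, stand-alone Fortran sketch. The subroutine names are placeholders invented for this example (the real routines live in files such as transport_mod.F and chemistry_mod.F), and the real main.F performs many more operations.

```fortran
! Highly simplified sketch of the GEOS-Chem Classic driver flow.
! All subroutine names are illustrative placeholders, not the actual
! routines called from GeosCore/main.F.
program gcc_flow_sketch
  implicit none
  integer :: n, n_steps

  ! (1) Read simulation settings (from input.geos in GEOS-Chem Classic)
  call read_input_settings( n_steps )

  ! (2) Set up arrays for species concentrations and meteorological data
  call init_state_arrays()

  ! (3) Time loop: every operation is invoked from this single driver
  do n = 1, n_steps
     call get_met_fields()     ! read/interpolate meteorology
     call do_transport()       ! advection (transport_mod.F in GCC)
     call do_emissions()       ! emissions, via HEMCO
     call do_chemistry()       ! chemistry (chemistry_mod.F in GCC)
     call do_deposition()      ! dry and wet deposition
     call write_diagnostics()  ! archive output
  enddo

  ! (4) Clean up and exit
  call cleanup_state_arrays()

contains

  ! Empty stubs so this sketch compiles and runs as a stand-alone file
  subroutine read_input_settings( nsteps )
    integer, intent(out) :: nsteps
    nsteps = 3               ! placeholder; the real value comes from input.geos
  end subroutine read_input_settings

  subroutine init_state_arrays()
  end subroutine init_state_arrays

  subroutine get_met_fields()
  end subroutine get_met_fields

  subroutine do_transport()
  end subroutine do_transport

  subroutine do_emissions()
  end subroutine do_emissions

  subroutine do_chemistry()
  end subroutine do_chemistry

  subroutine do_deposition()
  end subroutine do_deposition

  subroutine write_diagnostics()
  end subroutine write_diagnostics

  subroutine cleanup_state_arrays()
  end subroutine cleanup_state_arrays

end program gcc_flow_sketch
```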
High-level Execution of GCHP
The primary difference in GCHP is that main.F is replaced by the GMAO MAPL framework. MAPL provides an interface to ESMF that allows the different components to be entirely unaware of one another's existence; all communication between them is standardized. The functional flow now looks more like this (a generic component-lifecycle sketch follows the list):
- Initialize the MAPL_Cap process. MAPL will:
- Establish a generic input component, called ExtData
- Establish a generic output component, called History
- Establish a generic CTM component, called GCHP
- Determine which modules will perform which function. In GCHP, GEOS-Chem calculates chemistry, FVdycore calculates transport, and emissions are handled by HEMCO
- For each component, send an “Initialize” command
- Send the “Run” command to the CTM component. The CTM component will loop through the following steps:
- Request input data from ExtData
- Send a “Run” command to each component in the CTM
- Send output data to History
- Once the CTM component is done, send a “Finalize” command to all components and exit
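The Initialize/Run/Finalize pattern above is the standard ESMF gridded-component lifecycle. The following is a generic sketch of how a component registers those entry points in its SetServices routine; it is not GCHP's actual component code, and the routine bodies are reduced to comments.

```fortran
! Generic ESMF gridded-component skeleton (illustrative only; not the
! actual GCHP component code). MAPL builds on this same pattern.
module Sketch_GridCompMod
  use ESMF
  implicit none
  private
  public :: SetServices
contains

  ! MAPL_Cap calls SetServices on each component so that ESMF knows which
  ! routines implement Initialize, Run, and Finalize for that component.
  subroutine SetServices( gc, rc )
    type(ESMF_GridComp)  :: gc
    integer, intent(out) :: rc
    call ESMF_GridCompSetEntryPoint( gc, ESMF_METHOD_INITIALIZE, Initialize_, rc=rc )
    call ESMF_GridCompSetEntryPoint( gc, ESMF_METHOD_RUN,        Run_,        rc=rc )
    call ESMF_GridCompSetEntryPoint( gc, ESMF_METHOD_FINALIZE,   Finalize_,   rc=rc )
  end subroutine SetServices

  ! All entry points share the same interface: the component, its import
  ! and export states, and the simulation clock.
  subroutine Initialize_( gc, import, export, clock, rc )
    type(ESMF_GridComp)  :: gc
    type(ESMF_State)     :: import, export
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS     ! allocate internal state, set up the grid, etc.
  end subroutine Initialize_

  subroutine Run_( gc, import, export, clock, rc )
    type(ESMF_GridComp)  :: gc
    type(ESMF_State)     :: import, export
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS     ! read imports, do the science, fill exports
  end subroutine Run_

  subroutine Finalize_( gc, import, export, clock, rc )
    type(ESMF_GridComp)  :: gc
    type(ESMF_State)     :: import, export
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS     ! deallocate and clean up
  end subroutine Finalize_

end module Sketch_GridCompMod
```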
All ancillary operations, such as data regridding and parallelization, are handled by MAPL. Each core is unaware of the other cores. This means that in a 6-core run there are six distinct instances of the GEOS-Chem component running; each one sees one sixth of the available domain and is fed data as if that domain were all that existed. Components can request data from other domains (e.g. the transport component will request data from adjacent domains), but this communication is all handled through MAPL.
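To make the "each core sees only its own slice" idea concrete, here is a plain MPI Fortran sketch (not MAPL code): each rank allocates only its own strip of a notional global domain, and anything it needs from a neighboring strip would have to come through the communication library, which is the role MAPL/ESMF plays for the component instances in GCHP.

```fortran
! Conceptual illustration of distributed domains (plain MPI, not MAPL).
! Each rank allocates and works on only its own slice of the domain.
program domain_decomp_sketch
  use mpi
  implicit none
  integer, parameter :: global_n = 24   ! size of a notional global 1-D domain
  integer :: ierr, rank, nranks, local_n
  real, allocatable :: local(:)

  call MPI_Init( ierr )
  call MPI_Comm_rank( MPI_COMM_WORLD, rank,   ierr )
  call MPI_Comm_size( MPI_COMM_WORLD, nranks, ierr )

  ! Each rank sees only its own portion, as if that were the whole domain
  ! (assumes nranks divides global_n evenly, for simplicity)
  local_n = global_n / nranks
  allocate( local(local_n) )
  local = real( rank )                  ! fill with placeholder data

  print '(a,i0,a,i0,a)', 'rank ', rank, ' owns ', local_n, ' cells'

  ! A halo exchange with neighboring ranks would go here (e.g. MPI_Sendrecv);
  ! in GCHP the equivalent communication is mediated by MAPL/ESMF.

  deallocate( local )
  call MPI_Finalize( ierr )
end program domain_decomp_sketch
```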
Source Code Structure
GCHP source code can be sub-divided into five parts:
- GEOS-Chem core science code
- GCHP wrapper interface routines (GCHP)
- Cubed-Sphere Finite Volume Dynamical Core (FVdycore)
- Earth System Modeling Framework (ESMF)
- NASA-GMAO Modeling, Analysis and Prediction Layer (MAPL)
The GEOS-Chem source code is the same as you would download for running GCC. It contains C-preprocessor directives specifying which parts of the code should be compiled in a high performance computing (HPC) environment. You enable HPC in GEOS-Chem by additionally downloading the GCHP wrapper and storing it in the top-level GEOS-Chem source code directory as a sub-directory called GCHP. The GCHP directory is designed to use GEOS-Chem source code within a set of wrapper functions that interface GEOS-Chem's routines with ESMF using, in part, MAPL and the HPC-capable cubed-sphere dynamical core FVdycore.
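The sketch below illustrates how such preprocessor guards separate HPC-specific code paths from Classic code paths. The macro name ESMF_ is used here purely for illustration; check the directives in your own version of the source for the exact guards, and note that the file must use a .F90 suffix so the preprocessor runs.

```fortran
! Illustration of a C-preprocessor guard separating GCHP-specific code
! from GEOS-Chem Classic code. The macro name ESMF_ is an assumed
! example; consult the actual source for the real directive names.
program cpp_guard_sketch
  implicit none
#if defined( ESMF_ )
  ! Compiled only when building GEOS-Chem inside the GCHP/ESMF wrapper
  print *, 'HPC build: met fields arrive through the ESMF import state'
#else
  ! Compiled for GEOS-Chem Classic
  print *, 'Classic build: met fields are read directly from disk'
#endif
end program cpp_guard_sketch
```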
The GCHP directory contains four subdirectories and several Fortran-90 and header files. Descriptions of each are as follows:
- FVdycoreCubed_GridComp: The HPC-capable cubed-sphere dynamical core. GEOS-Chem's serial dynamical core cannot operate in a distributed environment, so an HPC-capable version must be included within the GCHP system. In 2014, NASA GMAO made available a stand-alone version of the Finite-Volume Cubed-Sphere Dynamical Core used in GEOS; it was adapted for GCHP and resides within the FVdycoreCubed_GridComp subdirectory. The FV dycore can read meteorological fields in either lat-lon or cubed-sphere format.
- Shared: Contains NASA GMAO's MAPL and Shared library packages used to facilitate coupling between components and provide the primary interface with ESMF. Problems with MAPL will lead you into this directory. However, be aware that the error traceback for many run-directory problems ends in MAPL code even though the problem is usually not MAPL itself. Carefully check that your configuration files (*.rc) are properly set before attempting to change MAPL code to fix an issue.
- ESMF: Contains ESMF infrastructure source code for version v5.2.0rp2. See ESMF/README in the source code for more information.
- Registry: Contains information used by MAPL at compile time to generate the Fortran interface between the various quantities needed by GEOS-Chem and ESMF, MAPL, and FVdycore.
- *.F90 and *.H files in the GCHP directory: These Fortran routines and header files replace the GEOS-Chem Classic main.F functionality. They also contain ESMF and MAPL interface code that couples GEOS-Chem routines into an ESMF environment. gigc_chunk_mod.F90 calls the various methods within GEOS-Chem necessary to input, initialize, calculate, and output GEOS-Chem scientific quantities. gigc_history_exports_mod.F90 handles the GCHP diagnostics. Files that contain GridComp in the name are ESMF gridded components, which can be thought of as the building blocks of an ESMF application, each with imports, exports, and an internal state (see the sketch below).
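To give a flavor of what a GridComp file contains, the fragment below sketches the common MAPL pattern of declaring an import field in SetServices and retrieving a pointer to it in Run. The field name EXAMPLE_T is hypothetical, the surrounding component code is omitted, and this is an illustration of the MAPL pattern rather than a copy of GCHP source.

```fortran
! Sketch of the MAPL import pattern used by GridComp files
! (illustrative fragment; the field name EXAMPLE_T is hypothetical).
module Sketch_ImportsMod
  use ESMF
  use MAPL_Mod        ! MAPL_Mod in GCHP-era MAPL; newer MAPL uses "use MAPL"
  implicit none
  private
  public :: Declare_Import, Use_Import
contains

  ! Called from SetServices: tells MAPL/ExtData that this component
  ! needs a 3-D field on its import state.
  subroutine Declare_Import( GC, RC )
    type(ESMF_GridComp), intent(inout) :: GC
    integer,             intent(out)   :: RC
    call MAPL_AddImportSpec( GC,                     &
         SHORT_NAME = 'EXAMPLE_T',                   &
         LONG_NAME  = 'example_air_temperature',     &
         UNITS      = 'K',                           &
         DIMS       = MAPL_DimsHorzVert,             &
         VLOCATION  = MAPL_VLocationCenter,          &
         RC         = RC )
  end subroutine Declare_Import

  ! Called from Run: gets a pointer to the data that ExtData has already
  ! read and regridded onto this core's portion of the domain.
  subroutine Use_Import( IMPORT, RC )
    type(ESMF_State), intent(inout) :: IMPORT
    integer,          intent(out)   :: RC
    real, pointer :: T(:,:,:)
    call MAPL_GetPointer( IMPORT, T, 'EXAMPLE_T', RC=RC )
    ! ... hand T off to GEOS-Chem's state objects here ...
  end subroutine Use_Import

end module Sketch_ImportsMod
```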