- Hardware and Software Requirements
- Downloading Source Code
- Obtaining a Run Directory
- Setting Up the GCHP Environment
- Basic Example Run
- Configuring a Run
- Output Data
- Developing GCHP
- Run Configuration Files
GCHP works as a layer around GEOS-Chem, simulating the more complex environment of a full atmospheric general circulation model (AGCM). Most model updates will involve editing GEOS-Chem source code as you would with GEOS-Chem "classic" (GCC). However, certain updates, such as specifying output variables or adding new input fields, will require development within the GCHP-specific source code. In addition, debugging will sometimes lead you into the MAPL source code. This page provides an overview of the code structure to help you navigate and debug GCHP.
High-level Execution of GEOS-Chem Classic
GCC primarily consists of a single, monolithic code. When running the GEOS-Chem executable geos, the main routine in GeosCore/main.F performs the following functions:
- Read in simulation settings from input.geos
- Set up arrays to hold data such as species concentrations and meteorological data
- Loop through the model operations (e.g. transport, chemistry) until the simulation is complete
Although the code for each of these functions is found in different files (e.g. chemistry_mod.F, transport_mod.F), all of the routines are called from main.F.
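The flow above can be sketched as follows. This is a schematic only, not actual GEOS-Chem code; the routine names are illustrative and abridged:

```fortran
! Schematic sketch of the flow in GeosCore/main.F (names illustrative)
PROGRAM GEOS_CHEM
   CALL READ_INPUT_FILE          ! simulation settings from input.geos
   CALL INITIALIZE_ARRAYS        ! species concentrations, met fields, etc.
   DO WHILE ( .not. FINISHED )   ! main timestepping loop
      CALL DO_TRANSPORT          ! e.g. transport_mod.F
      CALL DO_CHEMISTRY          ! e.g. chemistry_mod.F
      ! ... other science operations, diagnostics, and output ...
   ENDDO
   CALL CLEANUP
END PROGRAM GEOS_CHEM
```

The key point is that main.F itself orchestrates everything: all science routines are called directly from this single top-level loop.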
High-level Execution of GCHP
The primary difference with GCHP is that main.F is replaced by the GMAO MAPL framework. MAPL provides an interface with ESMF that allows the different components to remain entirely unaware of one another; all communication between them is standardized. The functional flow now looks more like this:
- Initialize the MAPL_Cap process. MAPL will:
- Establish a generic input component, called ExtData
- Establish a generic output component, called History
- Establish a generic CTM component, called GCHP
- Determine which modules will perform which function. In GCHP, GEOS-Chem calculates chemistry, FV3Dycore calculates transport, and emissions are handled by HEMCO
- For each component, send an “Initialize” command
- Send the “Run” command to the CTM component. The CTM component will loop through the following steps:
- Request input data from ExtData
- Send a “Run” command to each component in the CTM
- Send output data to History
- Once the CTM component is done, send a “Finalize” command to all components and exit
All ancillary operations, such as data regridding and parallelization, are handled by MAPL. Each core is unaware of the other cores. This means that, in a 6-CPU run, there are six distinct instances of the GEOS-Chem component running; each one sees ⅙ of the available domain and is fed data as if that domain were all that existed. Components can request data from other domains (e.g. the transport core will request data from adjacent domains), but this communication is all handled through MAPL.
ESMF/MAPL Gridded Component Hierarchy
The presence and structure of the GCHP configuration files reflect the ESMF and MAPL framework on which GCHP is built. The basic element of an ESMF program is the "component", of which there are two types: gridded and coupler. Components are organized and interact with one another hierarchically, as parent and child. GCHP is built exclusively of gridded components, often denoted "GC" or "GridComp", with the top-level component simply denoted "Cap" (Figure 1). The Cap is a simple MAPL program that initializes ESMF, MAPL, and associated resources. It has three children: Root, History, and ExtData. Below are brief descriptions of each:
- Root: The Root component controls the operation and interaction of all of the components comprising the model system. Hierarchically, it is the parent or ancestor of all scientific operations. The only operations that occur outside of Root are the initialization by Cap, ExtData, and History. The Root component can be given a name, which is specified in the GCHP configuration file Cap.rc. The Root name for GCHP is simply "GCHP".
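A Cap.rc file might look like the fragment below. This is an illustrative sketch; the exact entries and values vary by GCHP and MAPL version, so consult the file in your run directory for the authoritative contents:

```
# Illustrative Cap.rc fragment (entries vary by version)
ROOT_NAME: GCHP                  # name given to the Root component
ROOT_CF: GCHP.rc                 # Root component's own configuration file
HIST_CF: HISTORY.rc              # History component's configuration file
BEG_DATE: 20160701 000000        # simulation start (YYYYMMDD HHMMSS)
END_DATE: 20160701 010000        # simulation end
HEARTBEAT_DT: 600                # model heartbeat timestep in seconds
```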
- ExtData: ExtData stands for "External Data" and is an internal MAPL gridded component used to read data from NetCDF files on disk. More specifically, ExtData populates fields in the "Import" states within the MAPL hierarchy. Only fields designated as part of a component's import state can be filled by ExtData. The GCHP configuration file ExtData.rc provides the ExtData component with information about the input data, such as file paths and read frequency.
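Each line of ExtData.rc maps one import field to a file on disk. The fragment below is an illustrative sketch only; the export name, file path, and exact column layout are hypothetical and depend on the MAPL version, so refer to the ExtData.rc shipped with your run directory:

```
# Illustrative ExtData.rc entry (columns and paths are hypothetical)
# Export  Units    Clim Conservative Refresh   Offset Scale Var-on-File File-Template
  SPHU1   kg_kg-1  N    Y            0         none   none  QV          ./MetDir/%y4/%m2/GEOSFP.%y4%m2%d2.A3dyn.nc
```

Tokens such as %y4 and %m2 are date templates that MAPL expands at run time to locate the correct file for each timestep.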
- History: The History component is an internal MAPL gridded component used to manage output streams from a MAPL hierarchy. For GCHP, it is used to write output to NetCDF files. History can write to file any variable that exists in the "Export" state of any component in the GCHP hierarchy. It also has some limited capability to interpolate fields horizontally before writing them. The variables to write are specified in the GCHP configuration file HISTORY.rc.
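Output in HISTORY.rc is organized into named "collections", each of which becomes a series of NetCDF files. The fragment below is an illustrative sketch; collection and field names are examples and may differ from those in your run directory's HISTORY.rc:

```
# Illustrative HISTORY.rc fragment (collection and field names are examples)
COLLECTIONS: 'SpeciesConc',
::

  SpeciesConc.template:   '%y4%m2%d2_%h2%n2z.nc4',   # output filename pattern
  SpeciesConc.frequency:  010000,                    # write every 1 hour (HHMMSS)
  SpeciesConc.duration:   240000,                    # one file per 24 hours
  SpeciesConc.mode:       'time-averaged',
  SpeciesConc.fields:     'SpeciesConc_O3', 'GCHPchem',
::
```

Each field entry names an export variable and the gridded component (here GCHPchem) whose Export state provides it.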
Source Code Structure
GCHP source code can be sub-divided into five parts:
- GEOS-Chem science codebase (e.g. GeosCore)
- GCHP wrapper interface routines (GCHP)
- Cubed-Sphere Finite Volume Dynamical Core (FVDycore)
- Earth System Modeling Framework (ESMF)
- NASA-GMAO Mapping, Analysis and Prediction Layer (MAPL)
The GEOS-Chem source code is the same as you would download for running GCC. It contains C preprocessor directives specifying which parts of the code should be compiled in a high performance computing (HPC) environment. You can enable HPC in GEOS-Chem by additionally downloading the GCHP wrapper and storing it in the top-level GEOS-Chem source code directory as a sub-directory called GCHP. The GCHP directory wraps the GEOS-Chem source code in a set of functions that interface GEOS-Chem's routines to ESMF using, in part, MAPL and the HPC-capable cubed-sphere dynamical core FVDycore.
The GCHP directory contains four subdirectories and several Fortran-90 and header files. Descriptions of each are as follows:
- FVdycoreCubed_GridComp: The HPC-capable cubed-sphere dynamical core. GEOS-Chem's serial dynamical core cannot operate in a distributed environment, so an HPC-capable version must be included within the GCHP system. In 2014, NASA GMAO made available a stand-alone version of the Finite-Volume Cubed-Sphere Dynamical Core used in GEOS, which was adapted for GCHP and resides within the FVdycoreCubed_GridComp subdirectory. The FV dycore can read meteorological fields in either lat-lon or cubed-sphere format.
- Shared: Contains NASA GMAO's MAPL and Shared library packages, which facilitate coupling between components and provide the primary interface with ESMF. Problems with MAPL will lead you into this directory. However, be aware that the error traceback for many run directory problems will point to MAPL code even though the problem is usually not in the code itself. Carefully check that your configuration files (*.rc) are properly set before attempting to change MAPL code to fix the issue.
- ESMF: Contains ESMF infrastructure source code for version v5.2.0rp2. See ESMF/README in the source code for more information.
- Registry: Contains information used by MAPL at compile time to generate the Fortran interface between the various quantities needed by GEOS-Chem and ESMF, MAPL, and FVdycore.
- *.F90 and *.H files in the GCHP directory: These Fortran routines and header files replace the GEOS-Chem classic main.F functionality. They also contain the ESMF and MAPL interface code that couples GEOS-Chem routines into an ESMF environment. gigc_chunk_mod.F90 calls the various methods within GEOS-Chem necessary to input, initialize, calculate, and output GEOS-Chem scientific quantities. gigc_history_exports_mod.F90 handles the GCHP diagnostics. Files that contain GridComp in the name are ESMF gridded components, which can be thought of as the building blocks of an ESMF application, each with imports, exports, and an internal state.
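Every gridded component follows the same pattern: a SetServices routine registers Initialize, Run, and Finalize entry points, which MAPL then drives through the hierarchy. The abridged sketch below shows that shape; the module and routine names are illustrative, not actual GCHP code:

```fortran
! Schematic sketch of an ESMF gridded component (names illustrative)
module Example_GridCompMod
  use ESMF
  implicit none
  private
  public :: SetServices
contains
  subroutine SetServices(GC, RC)
    type(ESMF_GridComp), intent(inout) :: GC
    integer,             intent(out)   :: RC
    ! MAPL drives every component through these registered entry points
    call ESMF_GridCompSetEntryPoint(GC, ESMF_METHOD_INITIALIZE, Init_, rc=RC)
    call ESMF_GridCompSetEntryPoint(GC, ESMF_METHOD_RUN,        Run_,  rc=RC)
    call ESMF_GridCompSetEntryPoint(GC, ESMF_METHOD_FINALIZE,   Final_, rc=RC)
  end subroutine SetServices

  subroutine Run_(GC, Import, Export, Clock, RC)
    type(ESMF_GridComp), intent(inout) :: GC
    type(ESMF_State),    intent(inout) :: Import  ! filled by ExtData
    type(ESMF_State),    intent(inout) :: Export  ! read by History
    type(ESMF_Clock),    intent(inout) :: Clock
    integer,             intent(out)   :: RC
    ! ... advance this component one heartbeat ...
  end subroutine Run_

  ! Init_ and Final_ have the same interface as Run_ (omitted for brevity)
end module Example_GridCompMod
```

Because the component only ever touches its Import and Export states, it never needs to know which other components produced or will consume that data; MAPL mediates all of the connections.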
GCHP Updates Required with GEOS-Chem Classic Updates
This section is in progress.
Source Code Dependencies on Compiler and MPI
This section is a work in progress.