GCHP Hardware and Software Requirements
The following is a short list of hardware requirements for GCHP and related information:
- GCHP requires a minimum of 6 cores to run, and beyond this the number of cores must be a multiple of 6.
- Most GCHP and GEOS-Chem Classic users use the Intel Fortran Compiler (ifort). Starting with version 12.5.0, GCHP is also compatible with the open-source GNU compilers. Regardless of which compiler you use, we highly recommend Intel CPUs, preferably Xeon. It is a known issue that ifort deliberately does not optimize well on AMD chips. CPU clock speed is a secondary concern if you can use Xeon CPUs.
- Most clusters are built upon nodes with 16 to 32 CPUs each. One node of 32 CPUs will provide sufficient resources for standard GCHP c48 runs. It will have the added benefit of not needing to use the network interconnect for MPI. In general, GCHP will have better performance with more cores per node due to how input handling is parallelized.
- GCHP can run with ~7 GB per CPU if using C24, the cubed sphere resolution equivalent to 4° x 5°. Higher resolutions will require more memory per core depending on how many cores you use for your run. We have found that the biggest bottleneck for running high resolution simulations is the amount of memory available per node since it limits the memory available per core. The best solution when running into memory per core limitations is to request more nodes, reserve all memory per node by requesting all cores, and use fewer cores per node than you have requested for your actual GCHP run.
- InfiniBand is recommended if you can afford it. If not, 10 Gigabit Ethernet is a good alternative when using high-end networking hardware from vendors such as Cisco or Hewlett-Packard (HP). 1000 Mbps Gigabit Ethernet can also work with an optimized switch or router from such vendors, but it will be slower and have higher packet latency; high-end network interconnects minimize both.
- If using an interconnect, it is very helpful if the system has two: one for file transfer, log-in, and similar traffic, and the other dedicated to MPI communication. Separating them this way prevents problems such as file I/O interfering with MPI packet transfer. This is probably now standard on turn-key systems; ask your local cluster administrator what is available for you to use.
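As a sketch of the memory strategy described above, a hypothetical SLURM batch request might reserve two whole 32-core nodes (and therefore all of their memory) while launching GCHP on only 12 ranks per node. The scheduler directives, node size, and executable name (geos) are assumptions to adapt to your own cluster:

```shell
#!/bin/bash
# Hypothetical SLURM request: reserve all cores (and thus all memory) on
# two 32-core nodes, but run fewer MPI ranks per node to increase the
# memory available to each rank.
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=32
#SBATCH --exclusive

# 12 ranks per node x 2 nodes = 24 total ranks, a multiple of 6 as required
mpirun -np 24 ./geos
```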
GCHP requires C and Fortran compilers, netCDF C and Fortran libraries, an implementation of a Message Passing Interface (MPI), git version control software, and C-preprocessor software. This software is standard on many primary HPC systems in use by academic and scientific institutions. If any of the required software is not available then you must acquire it to compile and run GCHP.
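As a quick sanity check before attempting a build, you can confirm that the required tools are on your PATH. The tool names below are the common defaults and may differ on your system (for example, vendor MPI distributions may ship differently named wrappers):

```shell
# Check for the toolchain GCHP needs; names are typical defaults and
# may differ on your system (e.g. vendor MPI compiler wrappers).
for tool in git make cpp gfortran mpicc mpif90 nc-config nf-config; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "MISSING: $tool"
  fi
done
```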
For systems with pre-installed packages, such as an institutional compute cluster, you must determine how each package was compiled to ensure that you will be able to build and run GCHP successfully. It is absolutely necessary to use MPI and netCDF libraries built with the same compiler you will use to build GEOS-Chem and ESMF. If your pre-installed packages were not built with the same compilers then you will need to build them yourself or have them built for you by your organization's technical support staff.
We are currently only supporting GCHP on Linux distributions.
You should acquire source code by cloning or forking the GEOS-Chem and GCHP repositories on GitHub using git. The GCHP Makefile assumes git version >1.8; with older versions you will get an error that the -C option is not available. Check the default git version on your cluster by typing 'git --version' at the command line. If you have version 1.8 or earlier, check with your system administrator for modules that provide a more recent version of git, or download a newer version for free from the web.
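You can combine the version check with a quick functional test of the -C option itself (the /tmp path is just a directory that exists on most Linux systems):

```shell
# Print the installed git version; GCHP's Makefile needs git newer than 1.8
git --version

# Functional check of the -C option the Makefile relies on: if this prints
# the version, your git accepts -C; an old git exits with a usage error.
git -C /tmp --version && echo "git -C supported"
```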
If you are new to git then you should take time to get familiar with it. The UCAR CESM project has an excellent git guide, CIME Git Workflow, that is a great resource regardless of your skill level. The same basic workflow can be used for GEOS-Chem.
GNU Make is required to build all past versions of GCHP.
CMake will be required for building an upcoming version of GCHP but is not currently required. Stay tuned for more information on this transition from GNU Make to CMake for building GCHP. GNU Make will still be required to build ESMF.
The Message Passing Interface (MPI) is the fundamental requirement for distributed memory computing. We currently support using GCHP with OpenMPI version 3.1. OpenMPI 3.0 will also work with GCHP, but versions prior to it will not.
Running very old versions of GCHP has also been successful with SGI MPT, MPICH, and MVAPICH2, a derivative of MPICH for use with the InfiniBand interconnect. All tests were configured with Intel Fortran + Intel C, Intel Fortran + GNU C, or GNU Fortran + GNU C.
More recent versions of GCHP have only been tested with OpenMPI. We have had reports of diagnostic write problems when using MPICH and MVAPICH2 on some systems. Switching to OpenMPI version 3.0 or 3.1 appears to resolve the problems. We therefore recommend trying OpenMPI 3 before other implementations of MPI if possible. See the GCHP issues page on GitHub for more information.
You may test which compiler your version of MPI is using with:
$ mpif90 --version (for Fortran compiler)
$ mpicc --version (for C compiler)
Starting in GCHP 12.5.0 we recommend using the Intel Fortran Compiler (ifort) 17.0.4 or above. Prior versions of GCHP are compatible with earlier versions of ifort but an update to the GMAO MAPL library in 12.5.0 requires 17.0.4 or later.
GCHP 12.5.0 also compiles and runs successfully with the GNU Compiler Collection Fortran compiler (gfortran). GCHP versions prior to 12.5.0 will compile successfully with gfortran but will not run due to a bug. See the wiki post on this issue for more information.
*** Important *** The NASA finite-volume cubed-sphere (FV3) dynamical core used for advection is at least 50% slower when compiled with gfortran. Only use gfortran if absolutely necessary or if you do not mind the performance hit.
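A small script like the following reports which supported Fortran compiler is first available; it assumes the compilers are invoked as ifort and gfortran, which are the standard command names:

```shell
# Prefer ifort (17.0.4 or later recommended); fall back to gfortran,
# accepting the FV3 advection slowdown noted above.
if command -v ifort >/dev/null 2>&1; then
  ifort --version | head -n 1
elif command -v gfortran >/dev/null 2>&1; then
  gfortran --version | head -n 1
else
  echo "no supported Fortran compiler found"
fi
```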
A summary of past reported version compatibility is displayed in the table below. Our testing is not exhaustive, so compilers not listed may be compatible with GCHP. If you are using a GCHP version and compiler combination not listed below, please contact the GEOS-Chem Support Team to have your information added.
| Compiler | GCHP Versions Tested | Result |
|----------|----------------------|--------|
| ifort 14 | DevKit | Failed to compile prerequisite libraries |
| ifort 13 | v11-02b | Relatively slow compared to more recent versions of ifort |
| ifort 17 (earlier than 17.0.4) | 12.2.0 | Fail; use 17.0.4 or later |
| gfortran 8.2 | 12.2.x, 12.3.x, 12.4.x | Compilation successful, but run fails. See the wiki post on this issue for more information. |
| ifort 17.0.4 | 12.2.0 through current version | Success; earlier versions of ifort 17 are buggy and should not be used |
The Earth System Modeling Framework (ESMF) library provides a software infrastructure that allows different components of Earth System Models to communicate with each other, using MPI parallelization. For more information about ESMF, please see: https://www.earthsystemcog.org/projects/esmf/.
Current versions of GCHP come with ESMF source code that you build as part of the GCHP build sequence. The following ESMF versions are used:
- Prior to GCHP 12.2.0: v5.2.0rp2
- GCHP 12.2.0 - 12.4.0: v7.1.0r
- GCHP 12.5.0 to present: v8.0.0 beta snapshot 28.
We are in the process of restructuring GCHP so that ESMF is built and accessed as an external library. Once we make this switch you will need to download and build ESMF outside of GCHP for long-term use. Please see instructions on the ESMF webpage for "How to Git Clone ESMF". Follow instructions to "clone the entire repository" so that you will be able to change ESMF versions using git without downloading again. We anticipate using the ESMF v8.0.0 public release once it is available.
Once downloaded, read ESMF/README for information about build requirements. Environment variables related to C and Fortran compilers that are required for GEOS-Chem are also required for ESMF. These variables are listed in the README and are also detailed on the GEOS-Chem wiki page for environment variables that specify compiler name. See also the environment example files that come with GCHP run directories. At minimum, you also need to define the environment variable ESMF_DIR as the path to your ESMF clone.
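For example, a minimal set of ESMF build variables for a GNU + OpenMPI stack might look like the following. ESMF_COMPILER and ESMF_COMM are standard ESMF build variables, but the values shown are assumptions about your particular stack, and the ESMF_DIR path is a placeholder:

```shell
export ESMF_DIR=$HOME/ESMF      # placeholder path to your ESMF clone
export ESMF_COMPILER=gfortran   # or 'intel' when building with ifort
export ESMF_COMM=openmpi        # MPI implementation ESMF should target
echo "ESMF build: $ESMF_COMPILER + $ESMF_COMM"
```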
The ESMF README contains instructions for required libraries and how to build ESMF. You should load libraries you plan on running with GCHP (fortran/C compilers, NetCDF, and MPI) prior to building ESMF. The most straightforward way to build is as follows, but please read the README so that you are fully aware of options:
$ export ESMF_DIR=/path/to/ESMF
$ cd $ESMF_DIR
$ make
$ make install
Calling make builds ESMF, and calling make install places the build into the $ESMF_DIR/DEFAULTINSTALLDIR directory. The build files are placed within subdirectories whose names include identifying information such as the compiler, and they are not wiped out when you clean with make distclean. These two features enable you to clean and rebuild ESMF with different compilers in advance of needing them to build and run GCHP.
Please contact ESMF Support at https://www.earthsystemcog.org/projects/esmf/contactus/ if you run into problems downloading or building ESMF. Please open a GCHP GitHub issue if you encounter ESMF-related problems while compiling or running GCHP.
To compile certain ESMF components in GCHP, a compatible C pre-processor (cpp) needs to be available. If you are having compilation errors in the GCHP/Shared directory, you may check your C pre-processor version with the following command:
$ cpp --version
cpp versions in the 4.x series (e.g. 4.8.5) have been used to successfully build GCHP, while at least one version in the 5.x series (5.4.0) has failed to correctly process the Fortran sources in the Shared directories for compilation.
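The check can be made robust for scripting, falling back to a message when cpp is absent from the PATH:

```shell
# Report the C pre-processor version line; 4.x (e.g. 4.8.5) is known-good,
# while at least one 5.x release (5.4.0) mishandled the Shared sources.
if command -v cpp >/dev/null 2>&1; then
  cpp --version | head -n 1
else
  echo "cpp not found on PATH"
fi
```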
Goddard Fortran Template Library (gFTL)
Starting with GCHP 12.5.0, MAPL requires a Fortran template library developed by NASA. The library can be cloned from https://github.com/nasa/gFTL. Complete instructions for download are provided during creation of a GCHP run directory. However, you can prepare in advance by doing the following to acquire the library locally on your system. Please note that building gFTL requires CMake build software.
- Navigate to directory where you want to download gFTL
- Type the following at the command prompt:
$ git clone https://github.com/Goddard-Fortran-Ecosystem/gFTL
$ cd gFTL
$ git checkout v1.0.0
$ mkdir build
$ cd build
$ cmake .. -DCMAKE_INSTALL_PREFIX=../install
$ make install
You should then verify success by checking that the installation directories were created under gFTL/install, the install prefix specified above.
A netCDF library installation is required to run GCHP. If you are using GEOS-Chem on a shared computer system, chances are that your IT staff will have already installed one or more netCDF library versions that you can use. Please note that parallel reading and writing requires netCDF-4 and requires that it be compiled with parallel-enabled HDF5 libraries. NetCDF-3 does not have parallel capabilities.
Starting with netCDF-4.2, the Fortran netCDF library split off as an independent distribution to be built after building the C library. Prior to that version the Fortran netCDF library was bundled with the C library in a single distribution of netCDF. We have successfully built GCHP using versions of netCDF both before and after the split. The only difference to be aware of is that using netCDF-4.2 or later requires setting additional environment variables.
If you are using netCDF-4.2 and later versions then you will need to include the following in your bashrc (this example assumes you are using bash):
export GC_BIN="$NETCDF_HOME/bin"
export GC_INCLUDE="$NETCDF_HOME/include"
export GC_LIB="$NETCDF_HOME/lib"
export GC_F_BIN="$NETCDF_FORTRAN_HOME/bin"
export GC_F_INCLUDE="$NETCDF_FORTRAN_INCLUDE"
export GC_F_LIB="$NETCDF_FORTRAN_LIB"
If using earlier versions of netCDF (prior to 4.2) then you should only include the following:
export GC_BIN="$NETCDF_HOME/bin"
export GC_INCLUDE="$NETCDF_HOME/include"
export GC_LIB="$NETCDF_HOME/lib"
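When a netCDF installation provides the nc-config helper (it ships with netCDF-C), you can derive NETCDF_HOME from it rather than hard-coding a path; the fallback path below is a placeholder to set for your own system:

```shell
# Derive the netCDF installation prefix from nc-config when available
if command -v nc-config >/dev/null 2>&1; then
  export NETCDF_HOME="$(nc-config --prefix)"
else
  export NETCDF_HOME=/path/to/netcdf   # placeholder; set for your system
fi
echo "NETCDF_HOME=$NETCDF_HOME"
```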
For more information about netCDF libraries, please see the wiki Guide to netCDF in GEOS-Chem.
Downloading Software using Spack Package Manager
The Spack Package Manager may be used to download and build CMake, MPI, and NetCDF libraries needed for GCHP. You will need to have a C/C++/Fortran compiler such as GNU Compiler Collection available locally before you start. More information is coming soon.