GCHP Hardware and Software Requirements


Next | User Manual Home | GCHP Home

  1. Hardware and Software Requirements
  2. Downloading Source Code
  3. Obtaining a Run Directory
  4. Setting Up the GCHP Environment
  5. Compiling
  6. Basic Example Run
  7. Configuring a Run
  8. Output Data
  9. Developing GCHP
  10. Run Configuration Files


Hardware Requirements

The following is a short list of hardware requirements for GCHP:

  1. GCHP requires a minimum of 6 cores to run.
  2. The number of cores that you run GCHP on must be a multiple of 6.
  3. Most GCHP and GEOS-Chem "classic" users use the Intel Fortran Compiler (ifort), although you may now also use the open-source GNU compilers. Regardless of which compiler you use, we highly recommend Intel CPUs, preferably Xeon. It is a known issue that ifort does not optimize well on AMD chips (this was actually intentional). Do not worry too much about CPU speed; if you can use Xeon CPUs then you will be fine.
  4. Most clusters are built from nodes with 16 to 32 CPUs each. One node of 32 CPUs provides sufficient resources for standard GCHP runs, with the added benefit that MPI communication does not need to cross the network interconnect.
  5. GCHP can run with ~7 GB per CPU if using C24, the cubed sphere resolution equivalent to 4° x 5°. Higher resolutions will require more memory per core depending on how many cores you use for your run. We have found that the biggest bottleneck for running high resolution simulations is the amount of memory available per node since it limits the memory available per core. The best solution when running into memory per core limitations is to request more nodes, reserve all memory per node by requesting all cores, and use fewer cores per node than you have requested for your actual GCHP run.
  6. InfiniBand is recommended if you can afford it. If not, 10 Gigabit Ethernet with high-end switching hardware, such as Cisco or Hewlett-Packard (HP) equipment, is a good alternative. A 1000 Mbps (Gigabit) Ethernet can also work if you use an optimized switch such as a Cisco or HP model, but it will be slower and exhibit more packet latency; high-end network interconnects minimize these problems.
  7. If you are using an interconnect, it is very helpful for the system to have two: one interconnect for file transfer, log-in, and other general traffic, and the other dedicated to MPI communication. Using two interconnects in this way prevents problems such as file I/O interfering with MPI packet transfer. This is probably now standard on turn-key systems; ask your local cluster administrator what is available for you to use.
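As a minimal sketch of the core-count rules in items 1 and 2 above, the following shell fragment validates a requested core count before a job is submitted. NUM_CORES is a hypothetical variable; adapt the check to your scheduler's syntax.

```shell
# Sketch: GCHP requires at least 6 cores, and the total must be a
# multiple of 6 (one face of the cubed-sphere grid per group of cores).
NUM_CORES=36   # example value; normally taken from your job script

if [ "$NUM_CORES" -lt 6 ]; then
  echo "Error: GCHP requires a minimum of 6 cores" >&2
elif [ $(( NUM_CORES % 6 )) -ne 0 ]; then
  echo "Error: core count must be a multiple of 6" >&2
else
  echo "Core count $NUM_CORES is valid for GCHP"
fi
```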

Software Requirements

The recommended software detailed in this section is standard on many of the HPC systems used by academic and scientific institutions. Furthermore, it is now possible to compile and run GCHP using 100% open-source software. GCHP requires C and Fortran compilers, an implementation of the Message Passing Interface (MPI), C pre-processor software, and netCDF C and Fortran libraries. If any of the required software is not available then you must acquire it to compile and run GCHP.

Please be aware that it is absolutely necessary to use the same compilers, both Fortran and C, when compiling GEOS-Chem, GCHP, MPI, netCDF, and their prerequisites. For systems with pre-installed packages, such as an institutional compute cluster, you must determine how each package was compiled to ensure that you will be able to build and run GCHP successfully. If your pre-installed packages were not built with the same compilers then you will need to build them yourself or have them built for you by your organization's technical support staff.
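One way to check how pre-installed packages were compiled is to ask each tool to report its compiler. The sketch below assumes nc-config (netCDF-C), nf-config (netCDF-Fortran), and mpif90 are on your PATH; the --cc and --fc query flags are standard in the netCDF config scripts.

```shell
# Sketch: report the compilers used to build the netCDF libraries and
# your MPI Fortran wrapper, so you can confirm they all match.
for tool in "nc-config --cc" "nf-config --fc" "mpif90 --version"; do
  cmd=${tool%% *}   # first word is the executable name
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$tool => $($tool 2>/dev/null | head -n1)"
  else
    echo "$cmd not found on this system" >&2
  fi
done
```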

Operating Systems

We are currently only supporting GCHP on Linux distributions.

MPI Implementations

The Message Passing Interface (MPI) is the fundamental requirement for distributed memory computing. Successful GCHP tests have used OpenMPI, MVAPICH2, MPICH, and SGI MPT, all configured with Intel Fortran + Intel C, Intel Fortran + GNU C, or GNU Fortran + GNU C.

You may test which compiler your version of MPI is using with:

$ mpif90 --version (for Fortran compiler)
$ mpicc  --version (for C compiler)

Please note that GCHP versions prior to v11-02d do not work with MPICH; they instead work with MVAPICH2, a derivative of MPICH for use with the InfiniBand interconnect. We recommend that users who do not have an InfiniBand interconnect use MPICH instead of MVAPICH2. For more information see the MPICH MPI implementation not readily usable with GCHP wiki post.

Fortran Compiler

GCHP has been successfully compiled with the GNU Compiler Collection (GCC) and the Intel Fortran Compiler (ifort). A summary of version compatibility is displayed in the table below.

Compiler                        | GCHP Versions Tested     | Result
--------------------------------|--------------------------|-------------------------------------------------------
Intel Fortran 13                | v11-02b                  | Relatively slow compared to more recent versions of ifort
Intel Fortran 14                | DevKit                   | Failed to compile prerequisite libraries
Intel Fortran 15                | v11-02d                  | Success
Intel Fortran 16                | v11-02d                  | Success
Intel Fortran 17.0.4            | v11-02c                  | Success
GNU Compiler Collection 5.2.0   | v11-02c, v11-02d         | Success
GNU Compiler Collection 7.1.0   | v11-02-release-candidate | Success

We have found that ifort 15 works best with both the MVAPICH2 and OpenMPI implementations on the Harvard Odyssey cluster using Intel chips. Compiling with ifort on AMD chips results in a significant slowdown due to the design of the compiler. Timing tests to determine the relative performance of the Intel versus the open-source GNU compilers are in progress.

If you have tried to compile GCHP with a compiler or version not listed above please contact the GEOS-Chem Support Team to report your findings.

C/Fortran Pre-processor

To compile certain ESMF components in GCHP, a compatible C pre-processor (cpp) must be available. If you encounter compilation errors in the GCHP/Shared directory, check your C pre-processor version with the following command:

$ cpp --version

cpp versions in the 4.x series (e.g. 4.8.5) have been used to successfully build GCHP, while at least one version in the 5.x series (5.4.0) has failed to correctly process the Fortran sources in the Shared directory.
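A small guard in your build environment can flag a potentially problematic cpp before compilation starts. This is a sketch, not part of the GCHP build system; it simply parses the version string printed by cpp --version.

```shell
# Sketch: warn if the system cpp is in the 5.x series, which has been
# observed to mis-process the Fortran sources under GCHP/Shared.
if command -v cpp >/dev/null 2>&1; then
  CPP_VERSION=$(cpp --version | head -n1 | grep -oE '[0-9]+(\.[0-9]+)+' | head -n1)
  CPP_MAJOR=${CPP_VERSION%%.*}
  if [ "$CPP_MAJOR" = "5" ]; then
    echo "Warning: cpp $CPP_VERSION may fail to preprocess GCHP/Shared sources" >&2
  else
    echo "cpp $CPP_VERSION detected"
  fi
else
  echo "cpp not found on this system" >&2
fi
```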

NetCDF Libraries

A netCDF library installation is required to run GCHP. If you are using GEOS-Chem on a shared computer system, chances are that your IT staff will have already installed one or more netCDF library versions that you can use. Please note that parallel reading and writing requires netCDF-4 compiled with parallel-enabled HDF5 libraries; netCDF-3 does not have parallel capabilities.
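You can quickly confirm whether the available netCDF-C library was built with netCDF-4 (HDF5) support using the nc-config utility, which ships with the netCDF-C library. A minimal sketch:

```shell
# Sketch: query the netCDF-C installation for its version and for
# netCDF-4/HDF5 support, a prerequisite for parallel I/O in GCHP.
if command -v nc-config >/dev/null 2>&1; then
  echo "netCDF-C version:        $(nc-config --version)"
  echo "netCDF-4/HDF5 support:   $(nc-config --has-nc4)"
else
  echo "nc-config not found; is the netCDF-C library on your PATH?" >&2
fi
```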

Starting with netCDF-4.2, the Fortran netCDF library split off as an independent distribution to be built after building the C library. Prior to that version the Fortran netCDF library was bundled with the C library in a single distribution of netCDF. We have successfully built GCHP using versions of netCDF both before and after the split. The only difference to be aware of is that netCDF-4.2 and later require setting additional environment variables.

If you are using netCDF-4.2 or a later version then you will need to include the following in your .bashrc (this example assumes you are using bash):

export GC_BIN="$NETCDF_HOME/bin"
export GC_INCLUDE="$NETCDF_HOME/include"
export GC_LIB="$NETCDF_HOME/lib"
export GC_F_BIN="$NETCDF_FORTRAN_HOME/bin"
export GC_F_INCLUDE="$NETCDF_FORTRAN_HOME/include"
export GC_F_LIB="$NETCDF_FORTRAN_HOME/lib"

If using earlier versions of netCDF (prior to 4.2) then you should only include the following:

export GC_BIN="$NETCDF_HOME/bin"
export GC_INCLUDE="$NETCDF_HOME/include"
export GC_LIB="$NETCDF_HOME/lib"

These environment variables are included in the sample GCHP bashrc files provided in all GCHP run directories. Please see the Getting Started with GCHP section of the wiki for more information on setting up your environment. If you adapt any of the sample bash environment files to your system, remember to check your netCDF version number to determine which environment variables you need.
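The two cases above can be combined into one bashrc fragment. This is a sketch only: it assumes NETCDF_HOME and NETCDF_FORTRAN_HOME are provided by your module system or set by hand, and the default paths shown are placeholders, not real installation locations.

```shell
# Sketch: set the GC_* variables based on whether netCDF-Fortran lives
# in its own tree (netCDF-4.2 and later) or alongside the C library
# (netCDF prior to 4.2).
NETCDF_HOME=${NETCDF_HOME:-/usr/local/netcdf}             # placeholder path
NETCDF_FORTRAN_HOME=${NETCDF_FORTRAN_HOME:-$NETCDF_HOME}  # placeholder path

export GC_BIN="$NETCDF_HOME/bin"
export GC_INCLUDE="$NETCDF_HOME/include"
export GC_LIB="$NETCDF_HOME/lib"

# Only needed for netCDF-4.2 and later, where netCDF-Fortran is a
# separate distribution with its own installation prefix:
if [ "$NETCDF_FORTRAN_HOME" != "$NETCDF_HOME" ]; then
  export GC_F_BIN="$NETCDF_FORTRAN_HOME/bin"
  export GC_F_INCLUDE="$NETCDF_FORTRAN_HOME/include"
  export GC_F_LIB="$NETCDF_FORTRAN_HOME/lib"
fi
```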



--Lizzie Lundgren (talk) 22:55, 5 March 2018 (UTC)