GEOS-Chem High Performance Working Group

From Geos-chem
Revision as of 17:16, 23 August 2018 by Lizzie Lundgren (Talk | contribs) (Welcome)


GCHP Home

Welcome

Welcome to the GEOS-Chem High Performance (GCHP) Working Group wiki page! GCHP represents the next generation of GEOS-Chem, with a distributed-memory capability that enables efficient scaling across many cores and finer-resolution simulations. GCHP also uses a cubed-sphere geometry, which enables more accurate transport and eliminates the polar singularity inherent to lat-lon grids.
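For a rough sense of scale (this is a common rule of thumb, not a statement from this page): a cubed-sphere CN grid has 6 × N × N grid columns, and its resolution is roughly comparable to 90/N degrees on a lat-lon grid (so C24 is comparable to a ~4° grid, and C360 to ~0.25°). A minimal sketch:

```python
# Illustrative helpers only; the 90/N rule of thumb is an approximation,
# not an official GCHP conversion.

def cubed_sphere_cells(n: int) -> int:
    """Total number of grid columns on a CN cubed-sphere grid (6 faces of N x N)."""
    return 6 * n * n

def approx_latlon_degrees(n: int) -> float:
    """Approximate lat-lon resolution (degrees) comparable to a CN grid."""
    return 90.0 / n

for n in (24, 90, 360):
    print(f"C{n}: {cubed_sphere_cells(n)} columns, ~{approx_latlon_degrees(n):.2f} degrees")
```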

If you are working on a project using GCHP, please add your name and project description to the GCHP projects list located on this page. Also help us pool performance information across systems by contributing your GCHP run information, including your system specifications, on our GCHP Timing Tests page.

If you would like to stay informed of GCHP developments, please join the GEOS-Chem High Performance Working Group mailing list. We also encourage you to join our GCHP Slack workspace. Please contact the GEOS-Chem Support Team with GCHP questions and feedback.

Contact information

Email

GEOS-Chem HP Working Group Chairs
GEOS-Chem HP email list: geos-chem-hp [at] g.harvard.edu
To subscribe to the email list:
  • Send an email to geos-chem-hp+subscribe [at] g.harvard.edu
To unsubscribe from the email list:
  • Send an email to geos-chem-hp+unsubscribe [at] g.harvard.edu

Slack

The GCHP workspace on Slack facilitates easy communication among GCHP users. Contact Lizzie Lundgren (elundgren [at] seas.harvard.edu) to join.

HP Computing Platforms

If you are a GCHP user, please add your system information below.

User Group | Platform and OS | Contact Person
Dalhousie | Platform: Linux 2.6.18; OS: CentOS 5.7 | Junwei Xu
Dalhousie | Platform: Linux 2.6.32; OS: Red Hat 6.4 | Junwei Xu
Harvard | Platform: Linux 3.10.0; OS: CentOS 7.4.1707; Intel(R) Xeon(R) CPU E5-2683 v4 @ 2.10GHz | GEOS-Chem Support Team
MIT (Selin group) | Platform: Linux 4.9.13; OS: Fedora 24 | Daniel Rothenberg
MIT (LAE) | Platform: Linux 3.13.0; OS: Ubuntu 14.04 LTS | Sebastian D. Eastham
Add yours here! | |

Current GCHP Projects

If you are working on a project using GCHP, please add your group name, contact information, and a short description of your project to the table below.

User Group | Description | Contact Person
Harvard | Core development, validation, and documentation | Lizzie Lundgren
Jintai Lin (ACM @ PKU) | Impacts of small-scale processes on global and regional chemistry (OH, O3, and CO in particular) | Jintai Lin
Christoph Keller | Coupling of GEOS-Chem to the GEOS Earth System Model and Data Assimilation System | Christoph Keller
Daniel Rothenberg (Selin Group @ MIT) | Offline simulations with CESM/CAM meteorology | Daniel Rothenberg
Add yours here! | |

GCHP User Reports

If you are using GCHP, whether for a scientific project or simply to test out on your local system, please share your successes here. We also encourage you to join the GCHP Slack channel to connect with other users. Please report any bugs you find to the GEOS-Chem Support Team so that we can add them to the GCHP outstanding issues page.

To fill out this section, copy the template and enter your information. You can add as many runs as you wish. Feel free to adjust the tables to best present the information you want to show.

Template: your name and group here

Cluster name | Compilers | MPI | NetCDF library | # of Nodes | # of CPUs | GCHP version | Grid resolution | Duration [days:hrs] | Met source | Met resolution | Dynamic timestep [min] | Chemistry timestep [min] | Total run time [hrs]

Add notes here.

Bob Yantosca (GCST @ Harvard)

Cluster name | Compilers | MPI | NetCDF library | # of Nodes | # of CPUs | GCHP version | Grid resolution | Duration [days:hrs] | Met source | Met resolution | Dynamic timestep [min] | Chemistry timestep [min] | Total run time [hr:min]
Odyssey | gfortran 8.2.0 | OpenMPI 3.1.1 | netCDF 4.1.3 | 1 | 6 | ? | C24 | 00:06 | GEOS-FP | 0.25° x 0.3125° | 10 | 20 | 00:17
Odyssey | gfortran 7.1.0 | OpenMPI 3.1.1 | netCDF 4.1.3 | 1 | 6 | ? | C24 | 00:06 | GEOS-FP | 0.25° x 0.3125° | 10 | 20 | 00:19

NOTES:

  • Timing tests reflect a 6-hour GCHP "out-of-the-box" run (with run directory "gchp_standard"). Such tests allow us to verify that updates made in the GEOS-Chem "Classic" code base won't break GCHP.
  • This version of GCHP corresponds to GEOS-Chem "Classic" version 12.0.1.

Colin Lee, Dalhousie University

Cluster name | Compilers | MPI | NetCDF library | # of Nodes | # of CPUs | GCHP version | Grid resolution | Duration [days-hrs] | Met source | Met resolution | Dynamic timestep [min] | Chemistry timestep [min] | Total run time [hrs]
Graham | ifort 2016 | openmpi 2.1.1 | netCDF 4 | 1 | 24 | ??? | c24 | 31-0 | GEOS-FP | 2x2.5 | 10 | 20 | 14:22
Graham | ifort 2016 | openmpi 2.1.1 | netCDF 4 | 4 | 96 | ??? | c96 | 0-1 | GEOS-FP | 2x2.5 | 5 | 20 | 0:18
Graham | ifort 2016 | openmpi 2.1.1 | netCDF 4 | 4 | 96 | ??? | c96 | 0-1 | GEOS-FP | 0.25x0.3125 | 5 | 20 | 0:30
Graham | ifort 2016 | openmpi 2.1.1 | netCDF 4 | 72 | 864 | ??? | c360 | 0-1 | GEOS-FP | 2x2.5 | 5 | 20 | 1:30
Graham | ifort 2016 | openmpi 2.1.1 | netCDF 4 | 72 | 864 | ??? | c360 | 1-0 | GEOS-FP | 2x2.5 | 5 | 20 | 6:57

Notes

  • Running on more than approximately 1000 cores at c360 appears to cause a segmentation fault in a call to the SIN function around CubeToLaLon.F90:432.

Sebastian Eastham, MIT (LAE)

Cluster name | Compilers | MPI | NetCDF library | # of Nodes | # of CPUs | GCHP version | Grid resolution | Duration [days-hrs] | Met source | Met resolution | Dynamic timestep [min] | Chemistry timestep [min] | Total run time [hrs]
Nehalem | ifort 2015 | MPICH2 v3.3 | NetCDF C 4.4.1, NetCDF Fortran 4.4.4 | 2 | 36 | 11-02c | C90 | 31-0 | GEOS-FP | 0.25x0.3125 | 5 | 10 | 68

Notes: This run used 36 of the 48 cores available across 2 nodes, connected by a direct 10 Gb Ethernet link.
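One way to compare the timing entries reported above across grids and core counts is throughput: simulated days completed per hour of wall-clock time. A minimal sketch using two of the Graham runs from this page (the helper itself is illustrative, not part of GCHP):

```python
# Sketch: compare GCHP timing-table entries by throughput.
# The run data below are taken from the tables on this page;
# the throughput() helper is a hypothetical convenience, not a GCHP tool.

def throughput(sim_days: float, wall_hours: float) -> float:
    """Simulated days completed per hour of wall-clock time."""
    return sim_days / wall_hours

# (grid, cores, simulated days, wall-clock hours) from Colin Lee's Graham runs
runs = [
    ("c24",   24, 31.0, 14 + 22 / 60),  # 31 days in 14:22
    ("c360", 864,  1.0,  6 + 57 / 60),  # 1 day in 6:57
]

for grid, cores, days, hours in runs:
    print(f"{grid} on {cores} cores: {throughput(days, hours):.2f} sim-days/hr")
```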
