GEOS-Chem benchmarking

Objectives

Benchmarking supports the maintenance of GEOS-Chem as a robust state-of-the-science facility with a nimble grass-roots approach and strong version control. Benchmarking has four main objectives:

  1. Document a consistent GEOS-Chem model configuration, and the expected characteristics of that configuration.
  2. Support version control through traceability, and by confirming the expected behavior of model developments submitted by the community.
  3. Track the evolution of the model over the years.
  4. Promote scientific transparency of GEOS-Chem.

Procedure

The GEOS-Chem benchmarking procedure is described below. GEOS-Chem uses semantic versioning (i.e. X.Y.Z version labels).
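For example, a version label such as 14.3.1 breaks down into major (X = 14), minor (Y = 3), and patch (Z = 1) components. A minimal Python sketch of that breakdown (the label is illustrative only):

  # Minimal sketch: split a GEOS-Chem version label (X.Y.Z) into major (X),
  # minor (Y), and patch (Z) components.  The label below is illustrative.
  def parse_version(label):
      """Return (major, minor, patch) for a label such as '14.3.1'."""
      major, minor, patch = (int(part) for part in label.split("."))
      return major, minor, patch

  print(parse_version("14.3.1"))  # -> (14, 3, 1)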

1-hour benchmarks

1-hour benchmarks serve as "sanity checks" and are useful for determining whether two successive updates produce identical model output. These benchmarks are triggered when:

  1. A commit is pushed to any development branch [1] in the geoschem/GCClassic "superproject" repository, or
  2. A commit is pushed to any development branch [1] in the geoschem/GCHP "superproject" repository.

Evaluation tables are posted to gc-dashboard.org upon successful completion of each 1-hour benchmark simulation. The evaluation tables include OH metrics, emissions totals, global mass totals, and a summary table.
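The "sanity check" itself amounts to verifying that two sets of model output are identical. A minimal sketch of such a zero-diff check, assuming the diagnostic output is stored as NetCDF files and using xarray (the file names below are hypothetical):

  # Minimal sketch of a "zero-diff" check between two 1-hour benchmark
  # outputs.  The file names are hypothetical.
  import xarray as xr

  ref = xr.open_dataset("GEOSChem.SpeciesConc.Ref.nc4")  # prior version
  dev = xr.open_dataset("GEOSChem.SpeciesConc.Dev.nc4")  # updated version

  # Dataset.equals() returns True only if every variable and coordinate
  # matches exactly (no tolerance), i.e. the update is "zero diff".
  if ref.equals(dev):
      print("Zero diff: the two versions produce identical output.")
  else:
      print("Outputs differ; inspect the benchmark evaluation tables.")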

1-month benchmarks

1-month benchmarks (aka "alpha benchmarks") are used to determine how adding a new science feature to GEOS-Chem changes the model output. These are triggered when:

  1. An alpha tag [2] is pushed to any development branch [1] in the geoschem/GCClassic "superproject" repository, or
  2. An alpha tag [2] is pushed to any development branch [1] in the geoschem/GCHP "superproject" repository.

Evaluation plots and tables are posted to gc-dashboard.org upon successful completion of each 1-month benchmark simulation. These include comparison plots of species concentrations, emissions, aerosol optical depth, and J-values, as well as the same tables produced for the 1-hour benchmarks.
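For illustration only, a comparison of this kind can be sketched with xarray and matplotlib; the file names, variable name, and plot layout below are assumptions, and the official evaluation plots are produced with GCPy (see the Benchmark plotting routines section below):

  # Illustrative sketch of a Ref-vs-Dev zonal-mean comparison for one
  # species.  File names, the variable name, and the plot layout are
  # assumptions; the official evaluation plots are generated with GCPy.
  import xarray as xr
  import matplotlib.pyplot as plt

  ref = xr.open_dataset("GEOSChem.SpeciesConc.Ref.nc4")  # prior version
  dev = xr.open_dataset("GEOSChem.SpeciesConc.Dev.nc4")  # updated version

  var = "SpeciesConcVV_O3"                      # assumed variable name
  ref_zm = ref[var].isel(time=0).mean(dim="lon")
  dev_zm = dev[var].isel(time=0).mean(dim="lon")

  fig, axes = plt.subplots(1, 3, figsize=(15, 4))
  ref_zm.plot(ax=axes[0])
  axes[0].set_title("Ref")
  dev_zm.plot(ax=axes[1])
  axes[1].set_title("Dev")
  (dev_zm - ref_zm).plot(ax=axes[2])
  axes[2].set_title("Dev - Ref")
  fig.tight_layout()
  fig.savefig("O3_zonal_mean_comparison.png")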

Versioning and benchmarking policy

  1. Any update to the GEOS-Chem source code or run directories will change the GEOS-Chem version number (X.Y.Z).
  2. Z versions will be released at intervals determined by the GEOS-Chem Support Team (GCST) and may include bug fixes or updates that do not impact the full-chemistry simulation.
  3. Any change impacting the standard full-chemistry simulation will require a Y version change and a dedicated 1-month benchmark. The benchmark results will be posted on the wiki and an email will be sent to the developer(s) and the GEOS-Chem Steering Committee (GCSC).
  4. The developer(s) and GCSC will assess the benchmark results and review a benchmark assessment form on the wiki. If there are any concerns about the benchmark results, the GCST will be notified and further investigation and/or benchmarking may be required.
  5. If the update is for a specialty simulation (e.g. CO2, CH4, Hg), then a further benchmark may be conducted by the appropriate Working Group.
  6. Once the developer is satisfied with the changes in the 1-month benchmark, GEOS-Chem Model Scientist Daniel Jacob will review the results and approve the new internal version.
  7. 1-year full-chemistry and/or transport tracer benchmarks for Y versions will be conducted only if justifiably requested by the developer or by GEOS-Chem Steering Committee members.
  8. Each new major version release (i.e. X version) will be subject to a 1-year benchmark to be inspected by the GEOS-Chem Steering Committee before approval.
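As an illustrative summary only (not part of any GEOS-Chem tooling), the mapping between version bumps and benchmarks described in the list above can be written as a simple lookup:

  # Illustrative summary of the policy above: which benchmark is normally
  # associated with each kind of version bump.  This is a paraphrase for
  # clarity only, not part of any official GEOS-Chem tooling.
  REQUIRED_BENCHMARK = {
      "Z": "routine 1-hour checks only (bug fixes / no full-chemistry impact)",
      "Y": "dedicated 1-month benchmark (affects the full-chemistry simulation)",
      "X": "1-year benchmark reviewed by the GEOS-Chem Steering Committee",
  }

  def required_benchmark(bump):
      """Return the benchmark normally required for an X, Y, or Z bump."""
      return REQUIRED_BENCHMARK[bump.upper()]

  print(required_benchmark("y"))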

Notes

  1. The development branches are dev/X.Y.Z or dev/no-diff-to-benchmark.
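A minimal sketch of checking a branch name against these conventions (the example branch names are illustrative):

  # Minimal sketch: check whether a branch name follows the development
  # branch conventions in note 1 (dev/X.Y.Z or dev/no-diff-to-benchmark).
  # The example branch names are illustrative.
  import re

  DEV_BRANCH = re.compile(r"^dev/(\d+\.\d+\.\d+|no-diff-to-benchmark)$")

  for name in ("dev/14.4.0", "dev/no-diff-to-benchmark", "main"):
      status = "development branch" if DEV_BRANCH.match(name) else "other branch"
      print(f"{name}: {status}")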

List of GEOS-Chem benchmarks

Links to past 1-month and 1-year benchmark simulations can be found on the GEOS-Chem versions wiki page.

Benchmark output archive

Output files and evaluation plots for benchmark simulations are archived at the locations summarized below. GEOS-Chem users may use these outputs for comparison against their own simulations.

Directory: https://gc-dashboard.org/search?searchString=&1Hr=1Hr&GCHP=GCHP&GCC=GCC
Description: Data from the 1-hour benchmarks used to evaluate GEOS-Chem:
  • Evaluation plots & tables
  • Run log
  • Run directory (tarball)
  • Diagnostic files (tarball)
  • Restart files (tarball)

Directory: https://gc-dashboard.org/search?searchString=&1Mon=1Mon&GCHP=GCHP&GCC=GCC
Description: Data from the 1-month benchmarks used to evaluate GEOS-Chem:
  • Evaluation plots & tables
  • Run log
  • Run directory (tarball)
  • Diagnostic files (tarball)
  • Restart files (tarball)

Directory: http://ftp.as.harvard.edu/gcgrid/geos-chem/1yr_benchmarks/
Description: Data from the 1-year benchmarks used to evaluate GEOS-Chem:
  • Evaluation plots
  • Restart files (tarball)
  • Model output (tarball)
  • Log files (tarball)
  • Input files (tarball)

Directory: http://ftp.as.harvard.edu/gcgrid/geos-chem/10yr_benchmarks/
Description: Data from the 10-year benchmarks used to evaluate GEOS-Chem:
  • Evaluation plots & tables
  • Restart files (tarball)
  • Model output (tarball)
  • Log files (tarball)
  • Input files (tarball)

NOTE: "tarball" refers to a *.tar.gz file. This is an archive of files & folders created with tar cvzf and can be extracted with tar xzvf.

Benchmark plotting routines

The benchmark plotting routines are included with GCPy, a Python toolkit developed for GEOS-Chem.
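As a heavily hedged sketch, a Ref-vs-Dev comparison plot might be driven from GCPy roughly as follows; the import path, the compare_single_level() arguments, and the file and variable names are assumptions that may differ between GCPy versions, so consult the GCPy documentation for the actual interface:

  # Heavily hedged sketch of driving a Ref-vs-Dev comparison with GCPy.
  # The import path and compare_single_level() arguments are assumptions
  # based on GCPy's documented plotting utilities and may differ between
  # GCPy versions; consult the GCPy documentation for the real interface.
  import xarray as xr
  from gcpy.plot import compare_single_level  # assumed import path

  ref = xr.open_dataset("GEOSChem.SpeciesConc.Ref.nc4")  # hypothetical file
  dev = xr.open_dataset("GEOSChem.SpeciesConc.Dev.nc4")  # hypothetical file

  # Six-panel surface comparison (Ref, Dev, and their differences) for O3;
  # the variable name is an assumption.
  compare_single_level(
      ref, "Ref version",
      dev, "Dev version",
      varlist=["SpeciesConcVV_O3"],
      pdfname="O3_surface_comparison.pdf",
  )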