CESM

A. Description

Developed by NCAR, CESM (Community Earth System Model) is a community code that simulates the Earth's climate over time.  With the ability to couple together numerous Earth system components (land, atmosphere, oceans, etc.) at a wide array of resolutions, CESM simulations can be used to predict possible future climate states for different scenarios (e.g., for increasing concentrations of atmospheric carbon dioxide).

For more information, see the CESM homepage.

CESM 1.2.2

B1. Obtaining the Code

The easy way:

On the Blue Waters system, copy and extract the tarball at /sw/contrib/cesm/cesm1_2_2-20200603.tgz
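
The easy way boils down to a copy followed by an extract.  The sketch below makes those two commands concrete; the temporary-directory scaffolding and stand-in archive exist only so the snippet can be exercised off Blue Waters, and on the system itself you need only the cp and tar lines shown in the comments.

```shell
# Stand-in setup so the extract step can be run anywhere: build a dummy
# tarball with the same name and layout in a scratch directory.
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p cesm1_2_2/scripts
tar -czf cesm1_2_2-20200603.tgz cesm1_2_2
rm -r cesm1_2_2

# The actual steps on Blue Waters:
#   cp /sw/contrib/cesm/cesm1_2_2-20200603.tgz .
tar -xzf cesm1_2_2-20200603.tgz
```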

This version of CESM 1.2.2, last updated on 6/3/20, has been modified to compile and run on the Blue Waters system.  In particular, the Blue Waters machine file (cesm1_2_2/scripts/ccsm_utils/Machines/env_mach_specific.bluewaters) and compiler configuration file (cesm1_2_2/scripts/ccsm_utils/Machines/config_compilers.xml) have been updated to reflect the most recent set of modules and flags that should be used.


The understand-everything-that-was-modified way:

Before downloading the code, the Subversion module needs to be loaded to use a newer version of svn (the default version on Blue Waters is too old):

module load Subversion

Check out CESM 1.2.2 from the svn repository (further info on downloading CESM can be found here, and you may need to register here to get the svn password):

svn co https://svn-ccsm-models.cgd.ucar.edu/cesm1/release_tags/cesm1_2_2 cesm1_2_2

Replace the Blue Waters machine file (env_mach_specific.bluewaters) with the most recent version with updated modules:

cp /sw/contrib/cesm/env_mach_specific.bluewaters cesm1_2_2/scripts/ccsm_utils/Machines/env_mach_specific.bluewaters

Update the location of the pio repository in cesm1_2_2/SVN_EXTERNAL_DIRECTORIES.  Change the models/utils/pio entry to:

models/utils/pio                   https://github.com/NCAR/ParallelIO.git/tags/pio1_8_12/pio

Update the location of the genf90 repository in cesm1_2_2/tools/cprnc/SVN_EXTERNAL_DIRECTORIES.  Change the genf90 entry to:

genf90 https://github.com/PARALLELIO/genf90/tags/genf90_140121

Several instances of the variable lsize in the files cesm1_2_2/models/lnd/clm/src/cpl_mct/lnd_comp_mct.F90 and cesm1_2_2/models/lnd/clm/src/util_share/decompInitMod.F90 conflict and need to be renamed (made unique) for compilation to succeed.  Corrected examples can be found in /sw/contrib/cesm.  To copy the updated files directly into your copy of CESM:

cp /sw/contrib/cesm/decompInitMod.F90 cesm1_2_2/models/lnd/clm/src/util_share/decompInitMod.F90
cp /sw/contrib/cesm/lnd_comp_mct.F90 cesm1_2_2/models/lnd/clm/src/cpl_mct/lnd_comp_mct.F90

A line in cesm1_2_2/models/atm/cam/src/control/cam_history.F90 was modified to eliminate some errors.  The updated version can be copied like so:

cp /sw/contrib/cesm/cam_history.F90 cesm1_2_2/models/atm/cam/src/control/cam_history.F90

Version 18.10.0 of the PGI compiler does not require changes to the flags in scripts/ccsm_utils/Machines/config_compilers.xml as long as the netcdf module versions remain the same as with the previous tarball (cesm1_2_2-20190221.tgz).  Otherwise, -mcmodel=medium must be added to the C and Fortran flags (CFLAGS and FFLAGS), and -dynamic must be added to the linker flags (LDFLAGS), in scripts/ccsm_utils/Machines/config_compilers.xml.  If you just want to copy the updated file into your copy of CESM:

cp /sw/contrib/cesm/config_compilers.xml cesm1_2_2/scripts/ccsm_utils/Machines/config_compilers.xml
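
If you prefer to edit config_compilers.xml by hand instead of copying the prepared file, the additions look roughly like the fragment below.  The ADD_* tag names follow the CESM 1.2 config_compilers.xml convention; treat this as a sketch and compare against the file in /sw/contrib/cesm before relying on it.

```xml
<!-- Sketch: extra flags for the bluewaters PGI entry in config_compilers.xml.
     Verify against /sw/contrib/cesm/config_compilers.xml. -->
<ADD_CFLAGS> -mcmodel=medium </ADD_CFLAGS>
<ADD_FFLAGS> -mcmodel=medium </ADD_FFLAGS>
<ADD_LDFLAGS> -dynamic </ADD_LDFLAGS>
```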

B2. Creating and Building a Case

Supported grids/resolutions
Supported component sets

The Subversion module needs to be loaded to use a newer version of svn (the default version on Blue Waters is too old). If you haven't already loaded it (as you would have done if you checked out the code yourself in the previous step), do so before you build the code:

module load Subversion

Because configuring CESM can be complicated, we recommend writing a script for each new case you create.  Here's an example of a script that you could use.  You would, of course, need to modify the paths, case parameters, etc. for your case.  If you use the script below, be sure that you change $USER to your own username.

#!/bin/csh -f

### set env variables
setenv CASENAME my_case
setenv CESMROOT /u/sciteam/$USER/cesm1_2_2

$CESMROOT/scripts/create_newcase -case $CASENAME -res ne120_t12 -mach bluewaters -compset B_2000_CAM5 -compiler pgi
cd $CASENAME

### modify case parameters
./xmlchange -file env_run.xml -id RUN_STARTDATE -val '1979-01-01'
./xmlchange -file env_run.xml -id STOP_OPTION -val 'ndays'
./xmlchange -file env_run.xml -id REST_N -val '5'
./xmlchange -file env_run.xml -id RESUBMIT -val '0'

# change run type to startup (default was hybrid)
./xmlchange -file env_run.xml -id RUN_TYPE -val 'startup'

# change the input directory
./xmlchange -file env_run.xml -id DIN_LOC_ROOT -val '/scratch/sciteam/$USER/cesm_data'

### call cesm_setup and create user namelists
./cesm_setup

### build the case
./*.build

### submit the case for running on the system
./*.submit
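
One subtlety behind the "change $USER to your own username" advice: csh, like sh, does not expand variables inside single quotes, so the quoted xmlchange paths in the script are passed through literally.  A quick demonstration of the difference (USER_DEMO is a throwaway variable for illustration):

```shell
# Single quotes pass $VAR through literally; double quotes expand it.
USER_DEMO=alice
echo '/scratch/sciteam/$USER_DEMO/cesm_data'   # prints the $ literally
echo "/scratch/sciteam/$USER_DEMO/cesm_data"   # prints /scratch/sciteam/alice/cesm_data
```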

An input data directory needs to be selected where CESM can store the input files that it downloads.  The environment variable CESMDATAROOT contains this path and is set in cesm1_2_2/scripts/ccsm_utils/Machines/env_mach_specific.bluewaters.  At the very least, "${USER}" in the CESMDATAROOT path needs to be changed to your username.  If you want the input data files to stick around for future CESM runs, you may want to consider changing the path to somewhere in your project directory to avoid the scratch purge.

If the directory pointed to by CESMDATAROOT does not exist, create it with the mkdir command.
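
For example (the path below is illustrative; use the value you set for CESMDATAROOT in env_mach_specific.bluewaters, such as a location under your project directory):

```shell
# Create the input-data directory if it does not already exist.
# The default here is only a placeholder for demonstration purposes.
CESMDATAROOT=${CESMDATAROOT:-$HOME/cesm_data}
mkdir -p "$CESMDATAROOT"
```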

Next, run cesm_setup to set up the case.  Once that completes, the path in DIN_LOC_ROOT in env_run.xml in the newly created case directory needs to be changed to the path to which you set CESMDATAROOT (this is done via the xmlchange tool in the example script above).

After that, call the build script ($CASENAME.build) in the case directory to compile the case.  During busier times, this could take an hour or more on Blue Waters.  We recommend using the pgi compiler.

Once the build has completed, submit the case via the submit script ($CASENAME.submit) in the case directory.  This runs some checks and submits the $CASENAME.run PBS script from the case directory.
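
To watch the submitted job, the usual PBS status command applies.  The guard below only exists to keep the snippet harmless on machines without PBS; on Blue Waters you would simply run qstat -u $USER.

```shell
# List your queued and running jobs.  qstat is the standard PBS/Torque
# job-status command; the fallback branch only matters off Blue Waters.
if command -v qstat >/dev/null 2>&1; then
    qstat -u "$USER"
    queue_status=checked
else
    queue_status=no-pbs-here
fi
```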

Output data will be written to scratch at /scratch/sciteam/$USER/$CASENAME.  The combined stdout/stderr file ($CASENAME.o$JOBID) will be written to the case directory.

 

CESM 2

CESM 2 is python-based.

B1. Obtaining the Code

The code should be downloaded from the CESM 2 git repository, which offers several releases.  To check out release 2.1.0 into a directory named cesm2_1_0, run:

git clone -b release-cesm2.1.0 https://github.com/ESCOMP/cesm.git cesm2_1_0

B2. Configuring CESM 2

With the code in cesm2_1_0, change to that directory, load an older version of bwpy to get access to python 2.7 (along with the Subversion module, because the system default svn is too old), and run checkout_externals to download the external components the build requires.  See http://www.cesm.ucar.edu/models/cesm2/release_download.html for dealing with svn errors when running checkout_externals.

cd cesm2_1_0
module load bwpy/0.3.2 Subversion
./manage_externals/checkout_externals

B3. Creating and Building a Case

Supported grids/resolutions
Supported component sets

As when configuring CESM 2, you need to have python 2.7 loaded to create and build a case.  You also need to set the CESMDATAROOT environment variable, which specifies a directory where CESM 2 can store the input files that it downloads.  If the directory pointed to by CESMDATAROOT does not exist, create it with the mkdir command.

module load bwpy/0.3.2
export CESMDATAROOT=/scratch/sciteam/$USER/cesm_data

As with CESM 1.x, we recommend writing a script, such as the one below, to configure and build a case.  If you use this example, be sure to modify the paths, case parameters, etc. for your case.  You also need to change $USER to your own username, set PSN to the project under which you want to run your case, and set DIN_LOC_ROOT to the same path as CESMDATAROOT.  CESM2 automatically recognizes that it's being built on Blue Waters, so it's not necessary to specify the machine type when calling create_newcase.  The default compiler for CESM 2 is set to Intel.

#!/bin/csh -f

### set env variables
setenv CASENAME my_case
setenv CESMROOT /u/sciteam/$USER/cesm2_1_0
setenv PSN baga

$CESMROOT/cime/scripts/create_newcase --case $CASENAME --res f19_g16_rx1 --compset A --project $PSN
cd $CASENAME

### modify case parameters
./xmlchange -file env_run.xml -id RUN_STARTDATE -val '1979-01-01'
./xmlchange -file env_run.xml -id STOP_OPTION -val 'ndays'
./xmlchange -file env_run.xml -id REST_N -val '5'
./xmlchange -file env_run.xml -id RESUBMIT -val '0'

# change run type to startup
./xmlchange -file env_run.xml -id RUN_TYPE -val 'startup'

# change the input directory
./xmlchange -file env_run.xml -id DIN_LOC_ROOT -val '/scratch/sciteam/$USER/cesm_data'

### set up the case
./case.setup

### build the case
./case.build

### submit the case for running on the system
./case.submit

In the above script, after the case is created with create_newcase and the run parameters are set with xmlchange, case.setup is run to set up the case.

Once that completes, the build script, case.build, is called to compile the case.

Once the build has completed, the case is submitted via the case.submit script.  Two jobs are created.  The first runs the simulation, and the second archives the results upon successful completion of the simulation.

Output data will be written to scratch at /scratch/sciteam/$USER/$CASENAME, the results are archived in /scratch/sciteam/$USER/archive/$CASENAME, and the combined stdout/stderr files ($CASENAME.run.o$JOBID and $CASENAME.st_archive.o$JOBID) will be written to the case directory.

 

Additional Information / References

CESM 1.2

CESM 2.x