Run ACCESS-ESM1.5
About
ACCESS-ESM1.5 is a fully-coupled global climate model, combining atmosphere, land, ocean, sea ice, ocean biogeochemistry and land biogeochemistry components. A description of the model and its components is available in the ACCESS-ESM1.5 overview.
The instructions below outline how to run ACCESS-ESM1.5 using ACCESS-NRI's software deployment pipeline, specifically designed to run on the National Computational Infrastructure (NCI) supercomputer Gadi.
If you are unsure whether ACCESS-ESM1.5 is the right choice for your experiment, take a look at the overview of ACCESS Models.
All ACCESS-ESM1.5 configurations are open source, licensed under CC BY 4.0 and available on ACCESS-NRI GitHub.
ACCESS-ESM1.5 release notes are available on the ACCESS-Hive Forum and are updated when new releases are made available.
Prerequisites
General prerequisites
Before running ACCESS-ESM1.5, you need to Set Up your NCI Account.
Model-specific prerequisites
- MOSRS account
  The Met Office Science Repository Service (MOSRS) is a server run by the UK Met Office (UKMO) to support collaborative development with partner organisations. MOSRS contains the source code and configurations for some model components in ACCESS-ESM1.5 (e.g., the UM).
  To apply for a MOSRS account, please contact your local institutional sponsor.
- Join the vk83, ki32 and ki32_mosrs projects at NCI
  To join these projects, request membership on the respective vk83, ki32 and ki32_mosrs NCI project pages.

  Tip
  To request membership for the ki32_mosrs subproject, you first need to be a member of the ki32 project.

  For more information on joining specific NCI projects, refer to How to connect to a project.
- Payu
  Payu is a workflow management tool for running numerical models in supercomputing environments, for which there is extensive documentation.
  Payu on Gadi is available through a dedicated conda environment in the vk83 project.
  After joining the vk83 project, load the payu module:

  module use /g/data/vk83/modules
  module load payu
To check that payu is available, run:

payu --version

The version number will be printed, e.g. 1.1.5.

Warning
payu version >= 1.1.5 is required.
Get ACCESS-ESM1.5 configuration
All released ACCESS-ESM1.5 configurations are available from the ACCESS-ESM1.5 configs GitHub repository.
Released configurations are tested and supported by ACCESS-NRI, as an adaptation of those originally developed by CSIRO and CLEX CMS.
For more information on ACCESS-ESM1.5 configurations, check the ACCESS-ESM1.5 page.
More information about the available experiments and the naming scheme of the branches can also be found in the ACCESS-ESM1.5 configs GitHub repository.
The first step is to choose a configuration from those available.
For example, if the required configuration is the CO2-concentration-driven pre-industrial configuration, then the branch to select is release-preindustrial+concentrations.
To clone this branch to a location on Gadi, run:
mkdir -p ~/access-esm1.5
cd ~/access-esm1.5
payu clone -b expt -B release-preindustrial+concentrations https://github.com/ACCESS-NRI/access-esm1.5-configs preindustrial+concentrations
In the example above, the payu clone command clones the concentration-driven pre-industrial configuration (-B release-preindustrial+concentrations) to a new experiment branch (-b expt) in a directory named preindustrial+concentrations.
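To confirm that the clone succeeded, you can check the active branch in the new directory using standard git commands (--show-current requires git >= 2.22):

cd ~/access-esm1.5/preindustrial+concentrations
git branch --show-current    # should print: expt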
Admonition
Anyone using a configuration is advised to clone only a single branch (as shown in the example above) and not the entire repository.
Tip
payu uses branches to differentiate between different experiments in the same local git repository.
For this reason, it is recommended to always set the cloned branch name (expt in the example above) to something meaningful for the planned experiment.
For more information refer to this payu tutorial.
Run ACCESS-ESM1.5 configuration
If you want to modify your configuration, refer to Edit ACCESS-ESM1.5 configuration.
ACCESS-ESM1.5 configurations run on Gadi through a PBS job submission managed by payu.
The general layout of a payu-supported model run consists of two main directories:
- The control directory contains the model configuration and serves as the execution directory for running the model (in this example, the cloned directory ~/access-esm1.5/preindustrial+concentrations).
- The laboratory directory, where all the model components reside. For ACCESS-ESM1.5, it is typically /scratch/$PROJECT/$USER/access-esm.
This separates the small text configuration files from the larger binary inputs and outputs. In this way, the control directory can be kept in the $HOME directory (as it is the only filesystem actively backed up on Gadi). The quotas for $HOME are low and strict, which limits what can be stored there, so it is not suitable for larger files.
The laboratory directory is a shared space for all payu experiments using the same model.
Inside the laboratory directory there are two subdirectories:
- work → a directory where payu automatically creates a temporary subdirectory while the model is running. The temporary subdirectory is created as part of a run and removed after the run succeeds.
- archive → the directory where the output is stored following each successful run.
Within each of the above directories payu automatically creates subdirectories uniquely named according to the experiment being run.
Payu also creates symbolic links in the control directory pointing to the archive and work directories.
This design allows multiple self-resubmitting experiments that share common executables and input data to be run simultaneously.
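Schematically, the layout for the example experiment looks like the following (illustrative sketch; only the directories discussed above are shown):

/scratch/$PROJECT/$USER/access-esm     # laboratory, shared across experiments
├── work/
│   └── preindustrial+concentrations/  # temporary, exists only while a run is in progress
└── archive/
    └── preindustrial+concentrations/  # outputs and restarts accumulate here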
Admonition
Files on the /scratch drive, such as the laboratory directory, might get deleted if not accessed for several days, and the /scratch drive is limited in space. For these reasons, all model runs which are to be kept should be moved to /g/data/ by enabling the sync step in payu. To learn more, refer to Syncing output data.
Run configuration
To run the ACCESS-ESM1.5 configuration, execute the following command from within the control directory:
payu run
This will submit a single job to the queue, with a run length given by runtime in the config.yaml file.
Tip
You can add the -f option to payu run to let the model run even if there is an existing non-empty work directory, created from a previous failed run or from running payu setup.
Monitor ACCESS-ESM1.5 runs
The payu run command prints out the PBS job-ID (formatted as <9-digit-number>.gadi-pbs) as the last line in the terminal.
To print out information on the status of a specific job, you can execute the following command:
qstat <job-ID>
To show the status of all your submitted PBS jobs, you can execute the following command:
qstat -u $USER
The default name of your job is the name of the payu control directory (preindustrial+concentrations in the example above).
This can be overridden by altering the jobname in the PBS resources section of the config.yaml file.
The S column in the qstat output indicates the status of your run, where:
- Q → Job waiting in the queue to start
- R → Job running
- E → Job ending
- H → Job on hold
If there are no jobs listed with your jobname (or if no job is listed at all), your run either completed successfully or was terminated due to an error.
For more information, check NCI documentation.
Stop a run
If you want to manually terminate a run, you can do so by executing:
qdel <job-ID>
Tip
If you started an ACCESS-ESM1.5 run using the -n option (e.g., to run the model for several years), but subsequently decide not to keep running after the current process completes, you can create a file called stop_run in the control directory.
This will prevent payu from submitting another job.
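Creating the stop_run file is a one-line command, run from the control directory of the example above:

cd ~/access-esm1.5/preindustrial+concentrations
touch stop_run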
Error and output log files
PBS output files
When the model completes a run, PBS writes the standard output and error streams to two files inside the control directory: <jobname>.o<job-ID> and <jobname>.e<job-ID>, respectively.
These files usually contain logs about payu tasks, and give an overview of the resources used by the job.
To move these files to the archive directory, use the following command:
payu sweep
Model log files
While the model is running, payu saves the model's standard output and error streams in the access.out and access.err files inside the control directory, respectively.
You can examine the contents of these files to check on the status of a run as it progresses (or after a failed run has completed).
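Standard command-line tools are convenient for this; for example, to follow the model's progress live from the control directory:

tail -f access.out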
Warning
At the end of a successful run, these log files are archived to the archive directory and will no longer be found in the control directory. If they remain in the control directory after the PBS job for a run has completed, it means the run has failed.
Model Live Diagnostics
ACCESS-NRI developed the Model Live Diagnostics framework to check, monitor, visualise, and evaluate model behaviour and progress of ACCESS models currently running on Gadi.
For complete documentation on how to use this framework, check the Model Diagnostics documentation.
Troubleshooting
If payu doesn't run correctly for some reason, a good first step is to run the following command from within the control directory:
payu setup
This command will:
- create the laboratory and work directories based on the experiment configuration
- generate manifests
- report useful information to the user, such as the location of the laboratory where the work and archive directories are located
This can help to isolate issues such as permissions problems accessing files and directories, missing files or malformed/incorrect paths.
ACCESS-ESM1.5 outputs
At the end of a successful model run, output files, restart files and log files are moved from the work directory to the archive directory.
Symbolic links to these directories are also provided in the control directory for convenience.
If a model run is unsuccessful, the work directory is left untouched to facilitate the identification of the cause of the model failure.
Outputs and restarts are stored in subfolders within the archive directory, subdivided for each run of the model.
Output and restart folders are called outputXXX and restartXXX, respectively, where XXX is the run number starting from 000.
Model components are separated into subdirectories within the output and restart directories.
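For example, after two completed runs of the example experiment, the archive might be laid out as follows (illustrative sketch):

archive/preindustrial+concentrations/
├── output000/    # atmosphere/, ocean/, ice/, ... component subdirectories
├── output001/
├── restart000/
└── restart001/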
Edit ACCESS-ESM1.5 configuration
This section describes how to modify the ACCESS-ESM1.5 configuration.
The modifications discussed in this section can change the way ACCESS-ESM1.5 is run by payu, or how its specific model components are configured and coupled together.
The config.yaml file located in the control directory is the Master Configuration file, which controls the general model configuration. It contains several parts, some of which are more likely to need modification, and others which are rarely changed without a deep understanding of how the model is configured.
To find out more about configuration settings for the config.yaml file, refer to how to configure your experiment with payu.
Run lengths
One of the most common changes is to adjust the duration of the model run.
ACCESS-ESM1.5 simulations are split into shorter run lengths, each with the duration specified by the runtime settings in the config.yaml file:
runtime:
years: 1
months: 0
days: 0
Warning
The run length (controlled by runtime) should be left at 1 year for ACCESS-ESM1.5 experiments in order to avoid errors. Shorter simulations can be useful when setting up and debugging new experiments; however, they require additional configuration changes. See the section Running for less than one year for details.
To run an ACCESS-ESM1.5 configuration for multiple subsequent run lengths (each with duration runtime in the config.yaml file), use the -n option with the payu run command:
payu run -f -n <number-of-runs>
This will run the configuration number-of-runs times, resulting in a total experiment length of runtime * number-of-runs. The runs will be split across a number of consecutive PBS jobs submitted to the queue, as controlled by the runspersub value specified in the config.yaml file.
Understand the runtime, runspersub, and -n parameters
It is possible to have more than one model run per queue submission. With the correct use of the runtime, runspersub and -n parameters, you can have full control of your experiment.
- runtime defines the run length.
- runspersub defines the maximum number of runs for every PBS job submission.
- walltime defines the maximum duration of every PBS job submission.
- -n sets the number of runs to be performed.
Now some practical examples:
- Run 20 years of simulation with resubmission every 5 years
  To have a total experiment length of 20 years with a 5-year resubmission cycle, leave runtime at the default value of 1 year, set runspersub to 5 and walltime to 10:00:00. Then, run the configuration with -n set to 20:

  payu run -f -n 20

  This will submit subsequent jobs for the following years: 1 to 5, 6 to 10, 11 to 15, and 16 to 20, for a total of 4 PBS jobs.
- Run 7 years of simulation with resubmission every 3 years
  To have a total experiment length of 7 years with a 3-year resubmission cycle, leave runtime at the default value of 1 year, set runspersub to 3 and walltime to 6:00:00. Then, run the configuration with -n set to 7:

  payu run -f -n 7

  This will submit subsequent jobs for the following years: 1 to 3, 4 to 6, and 7, for a total of 3 PBS jobs.
Tip
The walltime must be set long enough for the PBS job to complete. The model usually runs a single year in 90 minutes or less, but the walltime for a single model run is set to 2:30:00 out of an abundance of caution, to make sure the model has time to complete even on the occasional slower runs that happen for unpredictable reasons. When setting runspersub > 1, the walltime doesn't need to be a simple multiple of 2:30:00, because it is highly unlikely that there will be multiple anomalously slow runs per submission.
Running for less than one year
When debugging changes to a model, it is common to reduce the run length to minimise resource consumption and get faster feedback on changes. To run the model for a single month, the runtime can be changed to:
runtime:
years: 0
months: 1
days: 0
With the default configuration settings, the sea ice component of ACCESS-ESM1.5 will produce restart files only at the end of each year. If valid restart files are required when running shorter simulations, the sea ice model configuration should be modified so that restart files are produced at a monthly frequency. To do this, change the dumpfreq = 'y' setting to dumpfreq = 'm' in the cice_in.nml configuration file located in the ice subdirectory of the control directory.
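One way to make this change from the command line, assuming the control directory from the clone example above (if the namelist formatting in your file differs, edit it directly in a text editor instead):

cd ~/access-esm1.5/preindustrial+concentrations
sed -i "s/dumpfreq = 'y'/dumpfreq = 'm'/" ice/cice_in.nml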
Modify PBS resources
If the model has been altered and needs more time to complete, needs more memory, or needs to be submitted under a different NCI project, you will need to modify the following section in the config.yaml:
# PBS configuration
# If submitting to a different project to your default, uncomment line below
# and replace PROJECT_CODE with appropriate code. This may require setting shortpath
# project: PROJECT_CODE
# Force payu to always find, and save, files in this scratch project directory
# shortpath: /scratch/PROJECT_CODE
# Note: if laboratory is relative path, it is relative to shortpath/$USER
laboratory: access-esm
jobname: pre-industrial
queue: normal
walltime: 2:30:00
These lines can be edited to change the PBS directives for the PBS job.
By default, the model will be submitted to the PBS queue using your default project. To run ACCESS-ESM1.5 using the resources of a specific project, for example the lg87 project (ESM Working Group), uncomment the line beginning with # project by deleting the # symbol, and replace PROJECT_CODE with lg87:
project: lg87
Warning
If more than one project is used to run an ACCESS-ESM1.5 configuration, the shortpath option also needs to be uncommented, and the path to the desired /scratch/PROJECT_CODE directory added.
This ensures the same /scratch location is used for the laboratory, regardless of which project is used to run the experiment.
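For example, continuing the lg87 example above, the relevant config.yaml lines would be (a sketch; substitute your own project code):

project: lg87
shortpath: /scratch/lg87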
To run ACCESS-ESM1.5, you need to be a member of a project with allocated Service Units (SU). For more information, check how to join relevant NCI projects.
Syncing output data
The laboratory directory typically resides under the /scratch storage on Gadi, where files are regularly deleted once they have not been accessed for a period of time. For this reason, climate model outputs need to be moved to a location with longer-term storage.
On Gadi, this is typically a folder under a project code on /g/data.
Payu has built-in support to sync outputs, restarts and a copy of the control directory git history to another location.
This feature is controlled by the following section in the config.yaml file:
# Sync options for automatically copying data from ephemeral scratch space to
# longer term storage
sync:
enable: False # set path below and change to true
path: null # Set to location on /g/data or a remote server and path (rsync syntax)
To enable syncing, set enable to True, and set path to a location on /g/data, where payu will copy output and restart folders.
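A filled-in sync section might look like the following sketch (the destination path is hypothetical; choose a /g/data location you have write access to):

sync:
    enable: True
    path: /g/data/PROJECT_CODE/experiments/preindustrial+concentrations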
Warning
The ACCESS-ESM1.5 configurations include a postprocessing script which converts atmospheric outputs to NetCDF format. This script runs in a separate PBS job and prevents the output and restart files of the most recent run from being automatically synced.
After a series of runs, once the final post-processing has completed, manually execute payu sync in the control directory to sync the final output and restart files.
Saving model restarts
ACCESS-ESM1.5 outputs restart files after every run to allow for subsequent runs to start from a previously saved model state.
Restart files can occupy a significant amount of disk space, and keeping a lot of them is often not necessary.
The restart_freq field in the config.yaml file specifies a strategy for retaining restart files.
This can either be a number (in which case every nth restart file is retained), or one of the following pandas-style datetime frequencies:
- YS → start of the year
- MS → start of the month
- D → day
- H → hour
- T → minute
- S → second
For example, to preserve the ability to restart ACCESS-ESM1.5 every 50 model-years, set:
restart_freq: '50YS'
The most recent sequential restarts are retained, and only deleted after a permanently archived restart file has been produced.
For more information, check payu Configuration Settings documentation.
Other configuration options
Warning
The following sections in the config.yaml file control configuration options that are rarely modified, and often require a deeper understanding of how ACCESS-ESM1.5 is structured to be changed safely.
Model configuration
This section tells payu which driver to use for the main model (access refers to ACCESS-ESM1.5).
model: access
Submodels
ACCESS-ESM1.5 is a coupled model comprising multiple submodels (i.e., model components).
This section specifies the submodels and configuration options required to execute ACCESS-ESM1.5 correctly.
Each submodel contains additional configuration options that are read in when the submodel is running. These options are specified in the subfolder of the control directory whose name matches the submodel's name (e.g., configuration options for the atmosphere submodel are in the ~/access-esm1.5/preindustrial+concentrations/atmosphere directory).
The full submodels section is shown below:
submodels:
- name: atmosphere
model: um
ncpus: 192
exe: um_hg3.exe
input:
# Aerosols
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/pre-industrial/atmosphere/aerosol/global.N96/2020.05.19/OCFF_1850_ESM1.anc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/pre-industrial/atmosphere/aerosol/global.N96/2020.05.19/BC_hi_1850_ESM1.anc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/pre-industrial/atmosphere/aerosol/global.N96/2020.05.19/scycl_1850_ESM1_v4.anc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/pre-industrial/atmosphere/aerosol/global.N96/2020.05.19/Bio_1850_ESM1.anc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/atmosphere/aerosol/global.N96/2020.05.19/biogenic_351sm.N96L38
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/atmosphere/aerosol/global.N96/2020.05.19/sulpc_oxidants_N96_L38
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/atmosphere/aerosol/global.N96/2020.05.19/DMS_conc.N96
# Forcing
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/pre-industrial/atmosphere/forcing/global.N96/2020.05.19/ozone_1850_ESM1.anc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/pre-industrial/atmosphere/forcing/resolution_independent/2020.05.19/volcts_18502000ave.dat
# Land
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/pre-industrial/atmosphere/land/biogeochemistry/global.N96/2020.05.19/Ndep_1850_ESM1.anc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/atmosphere/land/soiltype/global.N96/2020.05.19/qrparm.soil_igbp_vg
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/atmosphere/land/vegetation/global.N96/2020.05.19/cable_vegfunc_N96.anc
- /g/data/vk83/configurations/inputs/access-esm1p5/share/atmosphere/land/biogeochemistry/resolution_independent/2020.05.19/modis_phenology_csiro.txt
- /g/data/vk83/configurations/inputs/access-esm1p5/share/atmosphere/land/biogeochemistry/resolution_independent/2020.05.19/pftlookup_csiro_v16_17tiles_wtlnds.csv
- /g/data/vk83/configurations/inputs/access-esm1p5/share/atmosphere/land/biogeophysics/resolution_independent/2020.05.19/def_soil_params.txt
- /g/data/vk83/configurations/inputs/access-esm1p5/share/atmosphere/land/biogeophysics/resolution_independent/2020.05.19/def_veg_params.txt
# Spectral
- /g/data/vk83/configurations/inputs/access-esm1p5/share/atmosphere/spectral/resolution_independent/2020.05.19/spec3a_sw_hadgem1_6on
- /g/data/vk83/configurations/inputs/access-esm1p5/share/atmosphere/spectral/resolution_independent/2020.05.19/spec3a_lw_hadgem1_6on
# Grids
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/atmosphere/grids/global.N96/2020.05.19/qrparm.mask
- /g/data/vk83/configurations/inputs/access-esm1p5/share/atmosphere/grids/resolution_independent/2020.05.19/vertlevs_G3
# STASH
- /g/data/vk83/configurations/inputs/access-esm1p5/share/atmosphere/stash/2020.05.19/
- name: ocean
model: mom
ncpus: 180
exe: fms_ACCESS-CM.x
input:
# Biogeochemistry
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/ocean/biogeochemistry/global.1deg/2020.05.19/dust.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/ocean/biogeochemistry/global.1deg/2020.05.19/ocmip2_press_monthly_om1p5_bc.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/share/ocean/biogeochemistry/global.1deg/2024.07.12/bgc_param.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/unused/ocean/biogeochemistry/global.1deg/2020.05.19/ocmip2_fice_monthly_om1p5_bc.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/unused/ocean/biogeochemistry/global.1deg/2020.05.19/ocmip2_xkw_monthly_om1p5_bc.nc
# Tides
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/ocean/tides/global.1deg/2020.05.19/roughness_amp.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/ocean/tides/global.1deg/2020.05.19/tideamp.nc
# Shortwave
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/ocean/shortwave_penetration/global.1deg/2020.05.19/ssw_atten_depth.nc
# Grids
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/ocean/grids/mosaic/global.1deg/2020.05.19/grid_spec.nc
- name: ice
model: cice
ncpus: 12
exe: cice_access_360x300_12x1_12p.exe
input:
# Grids
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/ice/grids/global.1deg/2020.05.19/kmt.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/ice/grids/global.1deg/2020.05.19/grid.nc
# Climatology
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/ice/climatology/global.1deg/2020.05.19/monthly_sstsss.nc
- name: coupler
model: oasis
ncpus: 0
input:
# Grids
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/grids/global.oi_1deg.a_N96/2020.05.19/grids.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/grids/global.oi_1deg.a_N96/2020.05.19/areas.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/grids/global.oi_1deg.a_N96/2020.05.19/masks.nc
# Remapping weights
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/remapping_weights/global.oi_1deg.a_N96/2020.05.19/rmp_cice_to_um1t_CONSERV_FRACNNEI.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/remapping_weights/global.oi_1deg.a_N96/2020.05.19/rmp_um1u_to_cice_CONSERV_FRACNNEI.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/remapping_weights/global.oi_1deg.a_N96/2020.05.19/rmp_um1t_to_cice_CONSERV_DESTAREA.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/remapping_weights/global.oi_1deg.a_N96/2020.05.19/rmp_cice_to_um1u_CONSERV_FRACNNEI.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/remapping_weights/global.oi_1deg.a_N96/2020.05.19/rmp_um1v_to_cice_CONSERV_FRACNNEI.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/remapping_weights/global.oi_1deg.a_N96/2020.05.19/rmp_um1t_to_cice_CONSERV_FRACNNEI.nc
- /g/data/vk83/configurations/inputs/access-esm1p5/modern/share/coupler/remapping_weights/global.oi_1deg.a_N96/2020.05.19/rmp_cice_to_um1v_CONSERV_FRACNNEI.nc
Collate
Rather than outputting a single diagnostic file over the whole model horizontal grid, the ocean component MOM typically generates diagnostic outputs as tiles, each of which spans a portion of model grid.
The collate
section in the config.yaml
file controls the process that combines these smaller files into a single output file.
# Collation
collate:
exe: mppnccombine.spack
restart: true
mem: 4GB
walltime: 1:00:00
mpi: false
Restart files are also collated when the restart field is set to true.
Runlog
runlog: true
Payu automatically commits changes in the control directory to git if runlog is set to true.
Warning
This should not be changed as it is an essential part of the provenance of an experiment.
payu updates the manifest files for every run and relies on runlog to save this information in the git history, so there is a record of all inputs, restarts, and executables used in an experiment.
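Because runlog records each run in the control directory's git history, you can review that history with ordinary git commands, for example:

git log --oneline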
Userscripts
userscripts:
# Apply land use changes after each run
run: ./scripts/update_landuse_driver.sh
Run scripts or subcommands at various stages of a payu submission. The above example comes from the release-historical+concentrations configuration, where the update_landuse_driver.sh script is used to apply historical land use changes at the end of each run.
For more information about specific userscripts fields, check the relevant section of the payu Configuration Settings documentation.
Postscripts
Postprocessing scripts that run after payu has completed all steps. Scripts that might alter the output directory, for example, can be run as postscripts. These run in PBS jobs separate from the main model simulation.
postscript: -v PAYU_CURRENT_OUTPUT_DIR,PROJECT -lstorage=${PBS_NCI_STORAGE} ./scripts/NetCDF-conversion/UM_conversion_job.sh
All ACCESS-ESM1.5 configurations include the NetCDF conversion postscript mentioned above. This script converts the UM's fields-file output to NetCDF in order to facilitate analysis and reduce storage requirements. By default, the conversion script will delete the fields files upon successful completion, leaving only the NetCDF output. This automatic deletion can be disabled by commenting out the --delete-ff command line flag in the conversion job submission script, located in the control directory under scripts/NetCDF-conversion/UM_conversion_job.sh.
That means changing
esm1p5_convert_nc $PAYU_CURRENT_OUTPUT_DIR --delete-ff
to
esm1p5_convert_nc $PAYU_CURRENT_OUTPUT_DIR # --delete-ff
Miscellaneous
The following configuration settings should never require changing:
stacksize: unlimited
qsub_flags: -W umask=027
Edit a single ACCESS-ESM1.5 component configuration
Each ACCESS-ESM1.5 component contains additional configuration options that are read in when the model component is running.
These options are typically useful to modify the physics used in the model, the input data, or the model variables saved in the output files.
They are specified in files located inside a subfolder of the control directory, named according to the submodel's name specified in the config.yaml submodels section (e.g., configuration options for the ocean component are in the ~/access-esm1.5/preindustrial+concentrations/ocean directory).
To modify these options please refer to the User Guide of the respective model component.
Create a custom ACCESS-ESM1.5 build
All the executables needed to run ACCESS-ESM1.5 are pre-built into independent configurations using Spack.
To customise ACCESS-ESM1.5's build (for example, to run ACCESS-ESM1.5 with changes in the source code of one of its components), refer to Modify an ACCESS model's source code.
Controlling model output
Selecting the variables to save from a simulation can be a balance between enabling future analysis and minimising storage requirements. The choice and frequency of variables saved by each model can be configured from within each submodel's control directory.
Each submodel's control directory contains detailed and standard presets for controlling the output, located in the diagnostic_profiles subdirectories (e.g. ~/access-esm1.5/preindustrial+concentrations/ice/diagnostic_profiles for the sea ice submodel). The detailed profiles request a large number of variables at higher frequencies, while the standard profiles restrict the output to variables more regularly used across the community. Details on the variables saved by each preset are available in this Hive Forum topic.
Selecting a preset output profile to use in a simulation can be done by pointing the following symbolic links to the desired profile:
- STASHC in the atmosphere control directory.
- diag_table in the ocean control directory.
- ice_history.nml in the ice control directory.
For example, to select the detailed output profile for the atmosphere:
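A minimal sketch for the atmosphere component, assuming the detailed profile file inside diagnostic_profiles is named STASHC_detailed (check the directory listing for the exact profile names in your configuration):

cd ~/access-esm1.5/preindustrial+concentrations/atmosphere
ls diagnostic_profiles/                      # check the available profile names
ln -sf diagnostic_profiles/STASHC_detailed STASHC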
Get Help
If you have questions or need help regarding ACCESS-ESM1.5, consider creating a topic in the Earth System Model category of the ACCESS-Hive Forum.
For assistance on how to request help from ACCESS-NRI, follow the guidelines on how to get help.