
PRISM
Project for Integrated Earth System Modelling

An Infrastructure Project for Climate Research in Europe
funded by the European Commission
under Contract EVR1-CT2001-40012

The MPI-Met

PRISM Earth System Model

Adaptation Guide

Edited by: Stephanie Legutke and Veronika Gayler

PRISM-Report Series-08

1. Edition, release prism_2-4

January 24, 2005


Copyright Notice

© Copyright 2003 by PRISM

All rights reserved. No part of this document may be reproduced or used commercially without prior agreement by PRISM representatives.

How to get assistance?
The individual work packages of the PRISM project can be contacted as listed below. PRISM publications can be downloaded from the WWW server of the PRISM project under the URL: <http://prism.enes.org/Results/Documents/>

Phone Numbers and Electronic Mail Addresses
Electronic mail addresses of the individual work packages are composed as follows:
prism_<work package>@prism.enes.org

Name                 Phone              PRISM Work Package
Stephanie Legutke    +49-40-41173104    wp3i
Veronika Gayler      +49-40-41173138    wp3i


Contents

1 Introduction

2 The MPI-Met Component Models
  2.1 ECHAM5
    2.1.1 Input and Output Data
    2.1.2 Conditional Compilation
    2.1.3 Namelist Parameters
  2.2 MPI-OM
    2.2.1 Input and Output Data
    2.2.2 Conditional Compilation
    2.2.3 Namelist Parameters
  2.3 HAMOCC
    2.3.1 Input and Output Data
    2.3.2 Conditional Compilation
    2.3.3 Namelist Parameters

3 PRISM Coupling Strategies
  3.1 Main- and Submodel Relationship
  3.2 Coupling via the PRISM Coupler
    3.2.1 The namcouple

4 The MPI-Met Coupled Model Constellations
  4.1 MPI-OM stand-alone
    4.1.1 Retrieving the Source Code
    4.1.2 Compilation
    4.1.3 Model Execution
  4.2 MPI-OM + HAMOCC
    4.2.1 Exchange Fields
    4.2.2 Retrieving the Source Code
    4.2.3 Compilation
    4.2.4 Model Execution
  4.3 MPI-OM + ECHAM5
    4.3.1 Exchange Fields
    4.3.2 The Coupling Routines
    4.3.3 Retrieving the Source Code
    4.3.4 Compilation
    4.3.5 Model Execution
  4.4 ECHAM5 + MPI-OM + HAMOCC
    4.4.1 Exchange Fields
    4.4.2 Retrieving the Source Code
    4.4.3 Compilation
    4.4.4 Model Execution
  4.5 Exchange from HAMOCC to (EC)HAM
    4.5.1 Exchange Fields
    4.5.2 Retrieving the Source Code
    4.5.3 Compilation
    4.5.4 Model Execution

5 Compilation
  5.1 Conditional Compilation
  5.2 Library Compilation
  5.3 Generation of Compile Scripts
  5.4 The Compilation Process

6 Modification of MPI-M model coupling algorithms
  6.1 Changing the number of processors the component models run on
  6.2 Changing the exchange frequencies
    6.2.1 Fields going from the ocean to the atmosphere
    6.2.2 Fields going from the atmosphere to the ocean
  6.3 Changing the interpolation methods
  6.4 Changing the standard output extent for exchange control
  6.5 Changing the model resolution
  6.6 Adding exchange fields

7 The PRISM CVS Repository
  7.1 Download via Direct Connection (pserver)
  7.2 Download via the Web
  7.3 The CVS modules

8 Interfacing with the GUI

List of Figures

2.1 Surface flux calculation in ECHAM5
2.2 ECHAM5 model grid (T21)
2.3 MPI-OM grid (grob)
3.1 The lag concept of OASIS3 for concurrent and sequential model runs
4.1 Field exchange between MPI-OM and HAMOCC
4.2 Exchange algorithm between ECHAM5 and MPI-OM
4.3 Use of ECHAM5 heat and freshwater fluxes in MPI-OM
4.4 Implementation of the coupling routines in MPI-OM
4.5 Implementation of the coupling routines in ECHAM5
4.6 Exchange algorithm between ECHAM5, MPI-OM and HAMOCC
4.7 Field exchange between ECHAM5, MPI-OM and HAMOCC including DMS

List of Tables

2.1 Input files for ECHAM5
2.2 Output files of ECHAM5
2.3 List of namelist variables of ECHAM5
2.4 Input files for MPI-OM
2.5 Output files of MPI-OM
2.6 List of cpp flags for MPI-OM physical parameterisations
2.7 Namelist specifications of MPI-OM
2.8 Input files for HAMOCC
2.9 Output files of HAMOCC
2.10 Namelist specifications of HAMOCC
3.1 Configurable entry names of the namcouple base file for MPI-AO (ECHO)
4.1 Model specific parameters of the MPI-OM setup file
4.2 HAMOCC routines called from the main model MPI-OM
4.3 HAMOCC modules used by the main model MPI-OM
4.4 MPI-OM modules used by the sub-model HAMOCC
4.5 Model specific parameters of the MPI-OB setup file
4.6 Model specific parameters of the MPI-AO setup file
4.7 Extra parameters of the MPI-AOB-HAM setup file
5.1 List of cpp flags related to coupling
5.2 The libraries needed by the components of the MPI-Met coupled model
7.1 List of the CVS modules

Chapter 1

Introduction

The Max Planck Institute for Meteorology (MPI-Met) in Hamburg has developed global and regional climate models of the atmosphere, the ocean and the cryosphere. Models describing biophysical and biogeochemical processes are also being developed and are becoming important components in coupled models of the earth system.

The PRISM project has been established to enhance the efficiency of earth system modelling through development of a common coupling software infrastructure under standardised coding conventions. The software comprises the coupler OASIS3 and its associated libraries for grid and time transformation of exchange data, for model I/O in netCDF format (CF metadata convention) and for time control of the experiments. In addition, standardised and portable compilation and running environments (SCE and SRE) and graphical user interfaces (GUIs) have been developed to facilitate compilation and execution of coupled models.

This report describes how the MPI-Met models are adapted to the software and how the system of coupled models can be used and extended. The PRISM software itself is documented in Mangili et al. (2003), Valcke et al. (2004), Legutke and Gayler (2004) and Gayler and Legutke (2004).

The PRISM models and the PRISM infrastructure are evolving. Foreseen changes and developments that are underway are discussed. When realised, they will be included in a new edition of the same volume of the PRISM report series. The models and the infrastructure described herein correspond to the final state of the PRISM project. They can be retrieved from the PRISM CVS repository using the release tag prism_2-4.

PRISM aims at an infrastructure that allows for easy exchange of model source codes and model results between European research centres. The COSMOS project, an initiative of three Max Planck Institutes and the Potsdam Institute for Climate Impact Research, will use the PRISM infrastructure and software for the development of a German community coupled earth system model. However, the way the models are coupled in COSMOS might differ slightly from the PRISM approach. More information on the COSMOS models is provided at http://cosmos.enes.org.

Chapter 2 gives an overview of the three component models developed at the MPI-Met that have been used within the PRISM project: ECHAM5, MPI-OM and HAMOCC. The coupling strategies supported within the PRISM system are illustrated in chapter 3. The coupling of MPI-OM and HAMOCC is realised as a main-/submodel relationship, whereas ECHAM5 and MPI-OM are coupled using the coupler OASIS3.

The various coupled constellations of the MPI-Met component models are described in chapter 4. The chapter contains a detailed description of the coupling interfaces between the component models. Also included are instructions on how to compile and how to run the different coupled models.


Chapter 2

The MPI-Met Component Models

The PRISM project has defined six subsystems of the earth's climate system which define six categories of component models. These are models of the general circulation of the atmosphere (AGCMs) and of atmospheric chemistry (ACMs), models of the ocean general circulation (OGCMs) and of the marine bio-geo-chemistry (BGCMs), sea ice models (SIMs), and models for land surface processes (LSMs).

The most recent AGCM of the MPI-Met is ECHAM5. It runs in a coupled configuration with the OGCM MPI-OM, also recently developed at the MPI-Met. These two component models define the base of the MPI-Met earth system model and also of the COSMOS earth system model, which is the German community physical climate model. Optionally, the climate model comprises the BGCM HAMOCC, also developed at the MPI-Met. A sea ice model is embedded in the ocean model code and will therefore not be addressed as a separate component model in the present report. For a precise definition of component models read the PRISM handbook on the Standard Compilation Environment (Legutke and Gayler (2004)). The inclusion of comprehensive land surface and chemistry components is a current research activity at the institute. The adaptation of these models to the PRISM system is in progress and will be documented in a later edition of this report.

In the following sections each of the MPI-Met component models used within the PRISM system is briefly described. Special emphasis is put on those aspects that are directly related to the coupling physics.

2.1 ECHAM5

ECHAM5 is the 5th generation and most recent model of the ECHAM AGCM series developed at the MPI-Met. Depending on the configuration, the model resolves the atmosphere up to 10 hPa for tropospheric studies or up to 0.01 hPa for middle atmosphere studies (often referred to as MAECHAM5). The middle-atmosphere version is not yet used in a PRISM coupled model.

Prognostic variables are vorticity, divergence, temperature, surface pressure, specific moisture, cloud water, cloud ice, and width and skewness of the cloudiness distribution. The prognostic equations for vorticity, divergence, temperature, and surface pressure are solved by the spectral transform method. The prognostic equations for specific moisture and cloud variables are solved in grid point space using wind fields derived from vorticity and divergence.

ECHAM5 includes a land surface scheme which is implicitly coupled to the atmosphere (Schulz et al. (2001)). As mentioned above, a sea ice model is part of the ocean model MPI-OM.

Grid cells of ECHAM5 can be partially covered by land, ocean, and sea ice. Surface fluxes of reflected solar and outgoing long wave radiation, turbulent fluxes of sensible and latent heat, moisture and momentum are calculated separately for the land, the sea ice, and the open ocean fraction of each grid cell using the different conditions and characteristics (e.g. skin temperature, roughness parameters) defined for each of the surfaces. At a specific height above the ground, the blending height (Groetzner et al. (1996)), all conditions are assumed to be horizontally homogeneous. The blending height is related to the advective space scale. The concept is based on the underlying assumption that the scales of the surface conditions are smaller than the grid scale. This is also an assumption underlying the sea ice rheology (Hibler (1979)). The spatial scales of the sea ice surface (ice floes) vary between some centimetres and about 10 km. In its default version, the model is formulated on 19 or 31 layers defined in hybrid sigma-pressure vertical coordinates with the top level at 10 hPa. For both discretisations the lowest computation level (blending height) is about 30 m.

Figure 2.1: Illustration of the surface flux calculation in ECHAM5 with the blending height concept. ECHAM5 does not use the fluxes calculated on the smaller part of a cell (check pattern arrows). They are used for interpolation only. The blue-grey stripes represent the small scale water/ice distribution on the cells.

Depending on the area fraction of land in a grid cell, the atmosphere uses either the land flux (if the land fraction is at least 50 %), or the ocean/sea ice flux if the ocean fraction is the larger. However, all fluxes calculated over sea water or sea ice are used for the transformation of the fluxes from the atmosphere to the ocean model grid in order to better represent the coastal gradients and also to improve the flux conservation. The latter has also motivated the use of the partial land/sea cell partitioning for the atmosphere grid. The atmosphere coastal cells are locally adapted so that the coast lines of both grids define the same global ocean and continental surface areas. The concept is illustrated in Figure 2.1, sketching the surface layers of the atmosphere and the ocean with the heat and freshwater flux components and their interpolation to the ocean grid. The check pattern arrows are ignored for the budget calculations in the atmosphere; however, they are used for the interpolations between the grids. The ocean does not have partial cells. The transformation of the surface conditions from the ocean to the atmosphere grid is based on a surface averaging method (compare section 3.2.1).

The surface albedo on land depends on the specified background albedo, the forest fraction of the cell, the snow depth on the ground and canopy, and on the surface slope and temperature (Roske (2001)). On sea ice, it is a function of skin temperature, while on sea water it is constant. The function over sea ice depends on the existence of snow.

The model contains a scheme to simulate the lateral water flows on the land and discharge into the oceans (Dumenil and Todini (1992) and Hagemann and Dumenil (1996)). Precipitation onto grid cells being part of the continental ice sheets (Greenland and Antarctica) normally accumulates over time since sublimation is usually much smaller than precipitation.

An ice sheet model which simulates meltwater surface runoff and discharge or mass discharge by ice streams or glacier calving is not yet coupled to the MPI-Met PRISM model. In order to close the freshwater budget in the coupled system, the precipitation over the ice sheets is accumulated during each coupled time step and is then distributed to the coastal ocean cells and passed to the ocean together with the river and continental boundary runoff. The latent heat of fusion corresponding to the discharged water volume is passed with the net heat flux, assuming that all discharge from the ice sheets happens as frozen fresh water at freezing point.

The model and the climate it simulates are described in Roeckner et al. (2003) and Roeckner et al. (2004).

2.1.1 Input and Output Data

Three dimensional spectral initial data for the prognostic variables vorticity, divergence, temperature, and specific humidity are read from file AresAlev_jan_spec.nc (italic letters mark the variable part of file names). The data represent January values. The data, and therefore the file name, depend on the horizontal grid (Ares) and the number of vertical levels (Alev).

January initial data for the soil scheme are read from file AresOres_jan_surf.nc. In addition, the file contains the grid definition, masks and orography as well as surface data such as background albedo, vegetation type, soil characteristics, and surface roughness. Since the model formulation allows for partial land coverage in a cell, which is determined in order to fit the coast lines of the ocean and the atmosphere grids, this file depends on both the atmosphere's (Ares) and the ocean's (Ores) horizontal resolutions.

Figure 2.2: ECHAM5 grid (T21) with SST overlay. The figure was generated with the PRISM low-end graphic package.

Climatological monthly land surface data of vegetation ratio (AresOres_VGRATCLIM.nc) and leaf area index (AresOres_VLTCLIM.nc) have been interpolated from a 1 km data set (Hagemann (2002)) to the model grid and (partial) land/sea masks. The files therefore depend on the atmosphere and the ocean model grid. The climatological monthly land surface temperature data (Ares_TSLCLIM.nc) depend on the atmosphere grid only.

Actual Filename            Content                           Filename     Dates
AresAlev_jan_spec.nc       Initial spectral data             unit.23      1 Jan 89 12:00
AresOres_jan_surf.nc       Initial soil data, grid, masks    unit.24      1 Jan 89 12:00
AresOres_VGRATCLIM.nc      Vegetation                        unit.91      Clim. monthly
AresOres_VLTCLIM.nc        Leaf area indices                 unit.90      Clim. monthly
Ares_TSLCLIM.nc            Surface temperature               unit.92      Clim. monthly
Ares_O3clim2.nc            Ozone                             unit.21      Clim. monthly
surrta_data                Param. for radiation scheme       rrtadata     None
hdpara.nc                  Param. for runoff scheme          hdpara.nc    None
hdstart.nc                 Initial data for runoff scheme    hdstart.nc   1 Jan 1989
Ares_amip2sst_clim.nc      Sea surface temperature           unit.20      Jan 1979

Table 2.1: Input files for ECHAM5. Italics are used to indicate variable parts of the file names: Ares and Ores represent the resolution acronym of the atmosphere and the ocean grid respectively. Alev indicates the number of vertical levels of the atmosphere grid.

Most tracer gases are well mixed in the standard model version. They are prescribed with constant mixing ratios of 438 ppm CO2, 1.65 ppmv CH4, 305 ppbv N2O, 0.280 ppbv CFC11 and 0.484 ppbv CFC12. When the middle atmosphere version is used, CH4 and N2O can be prescribed as horizontally constant vertical profiles with decreasing values in the middle atmosphere.

In the standard version with no comprehensive ACM included, ozone is specified from a climatology. The monthly and zonal mean ozone concentration profiles are conservatively interpolated to the model time and levels (Fortuin and Kelder (1998)). The data do not depend on the vertical resolution or on the land/sea mask. The file name is Ares_O3clim2.nc.

The hydrological discharge model runs on a 0.5 degree geographical grid independent of the atmosphere grid. It reads time independent parameters (e.g. river direction) and initial (January) data from hdpara.nc and hdstart.nc. Both files are therefore independent of the model grid.

The file surrta_data contains input data used by the radiation scheme.

The model initially reads sea surface temperatures (SST) from a file called Ares_amip2sst_clim.nc. In the coupled configuration the data are overwritten by values provided from the ocean model at the start of the simulation.

ECHAM5 generates one output file per output event (triggered by the namelist variable trigfiles). It contains model raw diagnostic data. The data are either actual values or time averages of the period specified by the namelist event variable putdata. The file name is echamid_expid_yyyymm.dd[.nc]. The echamid is a number that can be attributed to the model user running the experiment (see chapter 4). The string expid represents the experiment ID, yyyy the year of the first time level in the file, mm the first month and dd the first day. The appendix .nc is used when the output format is netCDF, which is the PRISM standard format. Optionally, GRIB can be used as output file format.

Actual and runtime Filename       Content                  Dates
echamid_expid_yyyymm.dd[.nc]      raw diagnostic output    one month

Table 2.2: Output files of ECHAM5. Italics are used for the variable part of the file names. A description of the variables is given in the text.

2.1.2 Conditional Compilation

In contrast to the ocean model MPI-OM, ECHAM5 does not use cpp flags to (de)activate optional compilation of physical parameterisations. Cpp flags are only used to configure the model for different platforms or for different coupled constellations (e.g. whether or not HAMOCC is part of the coupled model). These cpp flags are listed in table 5.1.

2.1.3 Namelist Parameters

ECHAM5 reads the namelist files columnctl, dynctl, physctl, postctl, runctl and sdsctl. In the PRISM model constellation, only the namelists runctl and sdsctl are used to read parameters that differ from the default values set in the source code. These parameters are used for model control, not for the specification of optional model physics. They are listed in table 2.3.

Namelist   Variable name   Description                                                 Value
SDSCTL     out_expname
SDSCTL     out_filetype
SDSCTL     lresume
SDSCTL     ldebugev
SDSCTL     LSDS1
RUNCTL     dt_start        Time step before the beginning of the run
RUNCTL     dt_stop         First time step of the next run
RUNCTL     lcouple         Logical, whether or not the model runs in coupled mode
RUNCTL     lhd             Logical, whether or not the hydrological discharge model
                           is activated
RUNCTL     getocean
RUNCTL     putocean
RUNCTL     trigfiles
RUNCTL     lhd_que                                                                     .F.
RUNCTL     delta_time      Time step in seconds
RUNCTL     labort                                                                      .F.
RUNCTL     nhd_diag                                                                    1
RUNCTL     nproca          Number of processors used
RUNCTL     nprocb          Number of processors used                                   1

Table 2.3: List of namelist variables of ECHAM5. (Note in the original: some entries are still missing.)
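A minimal sketch of what the PRISM-relevant RUNCTL group of such a namelist input might look like is given below. The group and variable names are taken from table 2.3; all values as well as the date format of dt_start/dt_stop are illustrative assumptions, not the distributed defaults:

  &RUNCTL
    dt_start   = 1978, 01, 01, 0, 0, 0   ! time step before the beginning of the run
    dt_stop    = 1978, 02, 01, 0, 0, 0   ! first time step of the next run
    delta_time = 2400                    ! model time step in seconds
    lcouple    = .TRUE.                  ! run in coupled mode
    lhd        = .TRUE.                  ! activate the hydrological discharge model
    nproca     = 8                       ! processor decomposition (illustrative)
    nprocb     = 1
  /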

2.2 MPI-OM

MPI-OM is the global ocean general circulation model (OGCM) of the MPI in Hamburg. It is based on the primitive equations and utilises the usual assumptions for large scale ocean models (hydrostatic, Boussinesq). The model is formulated on an Arakawa C staggered (Arakawa and Lamb (1977)) horizontal curvilinear grid. The grid is generated by a conformal mapping of a global geographical grid which allows to place the poles at any point of the globe and can accommodate varying resolution in space (see figure 2.3). Other features are the use of geo-potential vertical coordinates, a free surface which allows to specify freshwater fluxes at the surface, partial (height) bottom grid cells which make the resolution of the bathymetry independent of the number of layers or their vertical spacing, and an embedded dynamic-thermodynamic sea ice model with viscous-plastic rheology (Hibler (1979)) and snow cover.

The model formulation includes options for isopycnal diffusion, the Gent-McWilliams eddy parameterisation, a parameterisation of slope convection, and a choice of higher-order advection schemes, as well as a parameterisation of mixed layer deepening. The model and its climatology are described in Marsland et al. (2003).


Figure 2.3: MPI-OM grid (resolution=grob) with SST overlay. The figure was generated with the PRISM low-end graphic package.

2.2.1 Input and Output Data

Geographical positions of the grid cells are read from the file anta_Ores.ext8 with Ores being the ocean horizontal grid acronym. Other information on the model grids, such as grid point separation, masks, and bathymetry, is stored in arcgri_Ores.ext8 and topo_Ores. BEK_Ores contains specifications of ocean basins used for diagnostics.

MPI-OM needs for its initialisation three dimensional data of potential temperature and salinity on the model grid. They are read from INITEM_Ores.Olev.ext8 and INISAL_Ores.Olev.ext8 (Levitus et al. (1998)). Olev reflects the number of vertical levels.

If the surface salinity is restored to climatological values, monthly mean data are read from SURSAL_Ores.ext8 (Levitus et al. (1998)).

PRISM provides an infrastructure for coupled models as well as for stand-alone component models. If MPI-OM runs in stand-alone mode, it does not receive forcing fields from the atmosphere model via OASIS. Instead, daily climatological near-surface conditions derived from 15 years of ECMWF re-analysis data in the OMIP project (Roske (2001)) are used to force the ocean. The input data comprise total cloud coverage, precipitation, solar radiation, surface temperature and dew point, 10 m wind speed and wind stress (files GICLOUD, GIPREC, GISWRAD, GITDEW, GITEM, GIU10, GIWIX, GIWIY). Surface fluxes of heat and evaporation are calculated with the help of bulk formulas. In the forced mode a 30 day calendar is used.

Climatological monthly runoff data are provided as well in files runoff_obs and runoff_pos. The latter contains latitude and longitude of the discharge positions. The corresponding grid cell indices are calculated in the model for the actual grid. The file therefore does not depend on the grid.

Actual Filename            Content                      Filename      Dates
anta_Ores.ext8             Grid cell position           anta          None
arcgri_Ores.ext8           Masks, ...                   arcgri        None
topo_Ores                  Bathymetry                   topo          None
BEK_Ores                   Basin geometry               BEK           None
INITEM_Ores.Olev.ext8      Pot. temp.                   INITEM        Clim., ann.
INISAL_Ores.Olev.ext8      Salinity                     INISAL        Clim., ann.
SURSAL_Ores.ext8           Surface salinity             SURSAL        Clim., monthly
GICLOUD_res_oce            Total cloud cover            GICLOUD       Clim., daily
GIPREC_res_oce             Precipitation                GIPREC        Clim., daily
GISWRAD_res_oce            Solar radiation              GISWRAD       Clim., daily
GITDEW_res_oce             Dew point temperature        GITDEW        Clim., daily
GITEM_res_oce              Surface temperature          GITEM         Clim., daily
GIU10_res_oce              10 m wind speed              GIU10         Clim., daily
GIWIX_res_oce              Zonal wind stress            GIWIX         Clim., daily
GIWIY_res_oce              Meridional wind stress       GIWIY         Clim., daily
runoff_obs                 River discharge              runoff_obs    Clim., monthly
runoff_pos                 River discharge position     runoff_pos    None

Table 2.4: Input files for MPI-OM. Variable parts of the file names are in italic. The forcing data files starting with the letters "GI" and the runoff files are used in the stand-alone mode only. The files are not needed and therefore not provided if the model is coupled to an atmosphere.

MPI-OM generates a large number of diagnostic output files. The time averaging is controlled by the namelist variable IMEAN. Each code is written into a separate file named fort.unit according to the unit the file is written to. The file format is EXTRA. At the end of each run a tar file is created from the averaged data files which is named fort.date_enddate.tar, where date represents the date of the first and enddate of the last day of the run. In addition a file containing time series is generated from the data of each model time step. The content is described in detail in routine DIAGNOSIS.F90 of the source code. The file is saved as TIMESERdate_enddate.

Actual Filename            Content                          Filename     Time resolution
fort.date_enddate.tar      time averaged model variables    none         averaging period
TIMESERdate_enddate        time series                      TIMESER      model time step

Table 2.5: Output files of MPI-OM.

2.2.2 Conditional Compilation

The MPI-OM source code contains a number of cpp flags for conditional compilation of different parameterisations of the model physics. A list of those flags that are activated for the compilation of the PRISM model can be found in table 2.6. Cpp flags related to the configuration for different platforms or to the control of the coupled model run are given in table 5.1.

Key name         Action
ISOPYK           Isopycnal diffusion
GMBOLUS          Gent et al. (1995) style eddy-induced mixing
REDWMICE         Reduced eddy mixing energy transfer in presence of sea ice
ADPO             Deactivate predictor-corrector; instead:
SLOPECON_ADPO    Bottom boundary layer transport scheme of Beckmann and Doscher (1997)
NURDIF           Remove hydrostatic instability by increased vertical diffusion
BOLK05           Reduce Bolus coefficient by 0.5 relative to default
OPEND55          Increase exp. scale for downward solar penetration by 2 relative to default

Table 2.6: List of cpp flags for conditional compilation to choose MPI-OM physical parameterisations. Only flags with non-default values are listed.


2.2.3 Namelist Parameters

In addition to cpp flag (de)activation, it is possible to tune the model by modifying a number of namelist variables in OCECTL. These variables, together with the values set for PRISM coarse resolution experiments (MPI-OM resolution grob: about 3 degrees horizontal spacing), are listed in table 2.7. Note that optimised values may change with resolution and forcing data as well as with the coupled configuration. Also included are namelist values which control the flow of the experiments.

Variable name   Description                                               Actual    Default
exptid          Experiment ID
DT              Model time step in seconds                                8640
CAULAPTS        Set to 0 if isopycnal diffusion is used                   0.0000    0.0002
CAULAPUV                                                                  0.0060    0.0045
AUS                                                                       0.        3.E-6
CAH00           Set to 0 if isopycnal diffusion is used                   1000.     1000.0
DV0                                                                       0.2E-2    0.5E-2
AV0                                                                       0.2E-2    0.5E-2
CWT                                                                       0.5E-3    0.5E-3
CSTABEPS                                                                  0.03      0.05
DBACK                                                                     1.05E-5   5.E-5
ABACK                                                                     5.E-5     5.E-5
CRELSAL         Salinity relaxation time scale; set to 0 if coupled       0.0       3.E-7
                to ECHAM5
CDVOCON                                                                   0.1       20
NYEARS          Number of simulated years in a run
NMONTS          Number of simulated months in a run
IMEAN           Time averaging of output data (2: monthly means)          2
IAUFR           Start from initial conditions [0/1]

Table 2.7: Namelist specifications of MPI-OM for a PRISM coarse resolution experiment. The default values are those specified in the source code.
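For orientation, a hedged sketch of a corresponding OCECTL input follows. The namelist group name is assumed to match the file name; the values are copied from the 'Actual' column of table 2.7 where given and are otherwise illustrative:

  &OCECTL
    DT       = 8640.      ! model time step in seconds
    CAULAPTS = 0.0        ! 0 because isopycnal diffusion is used
    CAULAPUV = 0.0060
    AUS      = 0.
    CAH00    = 1000.
    DV0      = 0.2E-2
    AV0      = 0.2E-2
    CWT      = 0.5E-3
    CSTABEPS = 0.03
    DBACK    = 1.05E-5
    ABACK    = 5.E-5
    CRELSAL  = 0.0        ! no salinity relaxation when coupled to ECHAM5
    CDVOCON  = 0.1
    NYEARS   = 0          ! illustrative run length: one month
    NMONTS   = 1
    IMEAN    = 2          ! monthly mean output
    IAUFR    = 0          ! start from initial conditions
  /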

2.3 HAMOCC

HAMOCC5 is a model of the ocean carbon cycle of the NPZD class including a semi-labile pool of dissolved organic carbon. The basic version contains seven oceanic aqueous tracers which are transported by the ocean's advection and mixing routines and diffusively exchanged with the sediment scheme. Particulate matter (organic carbon, calcium carbonate, silicate) accumulates in the sediment layers. Dissolution is also possible. The biological model is based on a Redfield stoichiometry for organic material. Phytoplankton growth is described by Michaelis-Menten kinetics with growth rates limited by temperature, wind speed, vertical mixing, and light intensity. The model is described in detail in Kriest et al. (2004) and in Maier-Reimer et al. (2005).

2.3.1 Input and Output Data

HAMOCC runs on the MPI-OM grid. It receives all information about the model geometry in a parameter list from the ocean model. It needs only one input file, INPDUST_Ores, which contains monthly dust fluxes obtained by nudging the ECMWF ERA15 data set into the ECHAM4/T42/L19 model (Timmreck and Schulz (2004)).


Actual Filename   Content              Filename   Dates
INPDUST_Ores      Mineral dust input   INPDUST    Clim., monthly

Table 2.8: Input files for HAMOCC.

The output files of HAMOCC are written in netCDF file format.

Monthly mean output data are saved in file bgcmean_date_enddate.nc. The naming rule concerning the time period the file spans is the same as that for the MPI-OM output files (section 2.2.1). The data are stored as n+1 dimensional arrays where n is the dimension of the variable; the additional dimension is for the months. The file also contains data averaged over the full run period.

Actual Filename                 Content                            Filename      Dates
bgcmean_date_enddate.nc         Chemical tracer concentrations,    bgcmean       (Monthly) means of run period
                                fluxes; biological variables
timeser_bgc_date_enddate.nc     Station time series of chemical    timeser_bgc   (High frequency) means of run period
                                and biological variables

Table 2.9: Output files of HAMOCC.

The averaging period of the data in timeser_bgc_date_enddate.nc is set in the namelist by specifying parameter nfreqts1. The file contains station time series. The positions of the stations (grid cells) are also specified via namelist input.

2.3.2 Conditional Compilation

HAMOCC is a submodel called from the MPI-OM ocean model (section 2.2). According to the PRISM coding rules for such models (which in a more general context are called packages), the model must get all the information it needs from the calling model through subroutine parameter lists. This rule is not completely kept for MPI-OM and HAMOCC. The module MO_COMMO1 from MPI-OM is used in two HAMOCC routines. Thus the specification of the grid defining cpp flag is necessary. Besides, the flag GMBOLUS has to be activated in order to define the same data blocks used in both models. This will be changed soon. All other cpp flags are used to control the model flow. They are described in section 5.4.

2.3.3 Namelist Parameters

The namelist of HAMOCC contains variables controlling the model flow. The variables and a short explanation are listed in table 2.10. It is possible to define stations (model grid cells) for which time series are printed.

Variable name   Description                             Actual   Default
io_stdo_bgc                                             7
kchck                                                   0
nfrqts1         Frequency of time series sampling       10
isac                                                    1
rlonts1         Positions of samples in time series 1
rlatts1         Positions of samples in time series 1
rdept1ts1       Depth of samples in time series 1

Table 2.10: Namelist specifications of HAMOCC for a PRISM experiment with main model MPI-OM.


Chapter 3

PRISM Coupling Strategies

The coupling of the PRISM MPI-M Earth System model is realised within the infrastructure and with the software defined and developed during the PRISM project (http://prism.enes.org). The current chapter gives a description of the different coupling strategies supported by the PRISM system.

Generally speaking there are three ways of coupling component models within the PRISM infrastructure.Either

1. by implementing a model in the code of a hosting model as one or more subroutine calls, or

2. by direct exchange of data between model executables with the OASIS model interface library PSMILe (MPI1 or MPI2 based), or

3. by indirect exchange of data between the models via the OASIS coupler, also making use of the PSMILe library (MPI1 or MPI2 based).

3.1 Main- and Submodel Relationship

The coupling of MPI-OM and HAMOCC is done in the first way. MPI-OM calls HAMOCC by a series of subroutine calls. They are described in detail in section 4.2. Both models are formulated on the same grid and the data exchange is controlled by the frequency of the respective subroutine calls. The data transfer is performed through the parameter lists. Note that MPI-OM also includes the code for the sea ice calculation. However, the sea ice is not treated as a separate component model here, though the PRISM project has defined the sea ice as a subsystem of the earth system. Similarly, the ECHAM5 model makes calls to the land surface scheme, which is not treated as a separate component model. The ocean model (including sea ice) and the BGC model build a single executable; the same is true for the atmosphere and the land surface.
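The calling convention can be pictured with the following minimal, self-contained Fortran sketch. All routine and variable names here are invented for illustration (the actual HAMOCC entry points are listed in table 4.2); the point is only that the submodel is driven by plain subroutine calls from the main model's time loop and receives everything it needs through the argument list:

  module submodel_mod                        ! stands in for HAMOCC
    implicit none
  contains
    subroutine sub_step(n, temp, tracer, dt)
      integer, intent(in)    :: n            ! grid size, provided by the caller
      real,    intent(in)    :: temp(n)      ! ocean state provided by the main model
      real,    intent(in)    :: dt           ! time step in seconds
      real,    intent(inout) :: tracer(n)    ! biogeochemical state updated in place
      tracer = tracer + 1.0e-9 * dt * temp   ! dummy 'biology'; data only via arguments
    end subroutine sub_step
  end module submodel_mod

  program main_model                         ! stands in for MPI-OM
    use submodel_mod
    implicit none
    integer, parameter :: n = 4
    real    :: temp(n), tracer(n), dt
    integer :: step
    temp = 280.0; tracer = 0.0; dt = 8640.0
    do step = 1, 3                           ! ocean time loop
      ! ... ocean dynamics and sea ice would be computed here ...
      call sub_step(n, temp, tracer, dt)     ! submodel call; exchange frequency =
    end do                                   ! frequency of the call
    print *, 'tracer after 3 steps:', tracer
  end program main_model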

3.2 Coupling via the PRISM Coupler

The second way of coupling is commonly used for the exchange of data between component models running on the same grid but in different executables. If no interpolation is needed between source and target grid, there is no need to pass the data through OASIS3. However, the synchronisation is managed by the PRISM software. This method is not used for the coupling of MPI-M models.

The MPI-AO model (also known as ECHO) is realised as a two-executable model (not counting the OASIS3 executable). One executable is built from the atmosphere model ECHAM5, the other from the ocean model MPI-OM. The two models are formulated on different grids. Interpolation and synchronisation are managed by the PRISM software. This two-executable constellation is kept when other components are added to the model (e.g. HAMOCC). More details on the coupling procedure within MPI-AO are given in section 4.3.


In all cases the component models must be independent models in the sense that they do not use code of any other component model. This means that the models should not use routines or include header files from any other model. Routines of a more general purpose should be moved to a model independent library which can be used by all models. For the first case this implies that all information that is exchanged between the models must be passed through the parameter lists of subroutine calls (see the package rule in the PRISM handbook on 'Software Engineering, Coding Rule, Quality Standards' (Mangili et al. (2003))).

3.2.1 The namcouple

The specific coupling configuration is defined in a file called namcouple. It is a namelist-like input file for the coupler OASIS3.

With each coupled model retrieved from the CVS repository a base namcouple file is delivered. The actual namcouple file is created at runtime by replacing the namcouple configurable variables with the values of script environment variables. The namcouple base file used with the MPI-AO coupled model is shown below. Its configurable variables and the names of the environment variables by which they are replaced are listed in Table 3.1.
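As a simple illustration of this mechanism (entry and variable names from Table 3.1; the value is illustrative), the $SEQMODE block of the base file

   $SEQMODE
     #Nmseq
   $END

would be turned by the run script, after substitution of ${nmseq}=2, into

   $SEQMODE
     2
   $END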

Configurable namcouple entries

In the following, the configurable namcouple entries of Table 3.1 are commented on. For other entries please refer to the OASIS3 "User's Guide" (http://prism.enes.org/Results/Documents/PRISM_Reports/Report2_oasis3_UserGuide.pdf).

Figure 3.1: The lag concept of OASIS3 for concurrent and sequential model runs

1. #Nmseq: Sequential Mode

The sequential mode of the run is calculated from the indices of sequential position for each of the exchange fields (see next item). It is defined as the maximum value of the sequential position indices. The MPI coupled models run in two configurations: Either #Nmseq and all sequential indices are set to 1 and the component models run concurrently, or the fields going from the ocean to the atmosphere have index 1 and fields going from the atmosphere to the ocean have index 2. In this configuration #Nmseq is 2 and the component models run sequentially.

During the first run of an experiment (RERUN = false in the run script) MPI-OM and ECHAM5 run sequentially. At the beginning, the ocean model writes its exchange fields to the OASIS restart file before the initialisation of the communication with OASIS is done. OASIS reads these fields (surface conditions) and transforms them to the atmosphere grid. The atmosphere model receives the ocean data from OASIS and can then start the integration. At the end of the first coupled time step ECHAM5 sends its exchange fields via OASIS to MPI-OM. Now the ocean model performs the calculation of the first coupled time step while the atmosphere model is waiting. As soon as the ocean model sends data (at the end of the coupled time step) the atmosphere model resumes its calculations, while the ocean model is waiting. At the end of the run both models write OASIS restart files. During restarted runs (RERUN = true in the run script) the models run concurrently, as exchange field restart files are available for both models. Thus, for coupled models involving ECHAM5 and MPI-OM, two different values for the 'sequential mode' are used. This concept allows the coupled model to start an experiment without OASIS restart files containing the exchange data. The restart file with exchange data generated by the ocean executable is constructed from the ocean initialisation data. For performance reasons, the first run should not be long (usually one month).

2. #Iseq: Index of Sequential Position

Each exchange field specified in the namcouple is provided with an index specifying the sequential position of the field in the exchange algorithm. Within the first run of an experiment, the atmosphere and ocean models run sequentially, with the atmosphere model starting. The sea surface conditions provided by the ocean are needed for the integration of the first coupled time step of ECHAM5. Therefore, fields going from the ocean to the atmosphere have the sequential index 1 (as they are needed first) and fields going from the atmosphere to the ocean have the sequential index 2. The run script value for this index is $iseq.

3. #Laga2o and #Lago2a: The Lag concept of OASIS3

At each coupled time step, data transfer from one model to the other model (or in both directions) takes place. The sending model calls prism_put (PSMILe library) in its last time step of the coupled time step. The receiving model gets the data by a prism_get call at the beginning of the next coupled time step. The date passed for synchronisation with the calls is that of the beginning of the model's time step period (e.g. the first model time step has the date 0). This means that sending and receiving of the exchange fields is not done with the same time stamp. The OASIS3 'lag' concept deals with this fact. #Laga2o defines the time lag between the sending of the exchange fields by the atmosphere model and their reception by the ocean model. For concurrent runs this time lag corresponds to one atmosphere time step. Analogously, #Lago2a corresponds to one ocean time step for concurrent runs.

When the models run sequentially, the leading model sends the exchange fields at the last time step of a coupled time step. The second model reads these data when it resumes the calculation, to simulate the same coupled time step. In this case the time lag corresponds to the sending model's time step minus the coupled time step (a negative value). Figure 3.1 depicts the lag concept for concurrent and sequential coupling. The arrows represent field exchange from ECHAM5 to MPI-OM (red) and from MPI-OM to ECHAM5 (blue).
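With the example time steps of Table 3.1 (atmosphere time step 2400 s, ocean time step 8640 s, coupling time step 86400 s), the atmosphere-to-ocean lag is therefore 2400 s in the concurrent case and 2400 s - 86400 s = -84000 s in the sequential case, while the ocean-to-atmosphere lag is one ocean time step, 8640 s.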

4. #Chan: Message passing method.

The variable defines the message passing method used for the data exchange from one executable to another. ECHAM5 and MPI-OM can use either MPI-2 or MPI-1.

5. #Mod1procs and #Mod2procs: Number of processors

These parameters represent lists of three variables specified in the run script. The first variable defines the total number of processors used by ECHAM5 (${nprocatm}) and by MPI-OM (${nprococe}), respectively. The second variable defines the number of processors involved in the communication with OASIS3 (${ncplprocatm}/${ncplprococe}). The third variable, a spawning argument with MPI-2 communication, remains empty.

6. #Cplexptid: Experiment identifier

OASIS uses an experiment ID which is composed of three characters.

7. #Atmmodnam and #Ocemodnam: Component model names

The names of the ECHAM5 and MPI-OM model executables are defined with these six-character strings. The names correspond to the executable names at runtime, i.e. in the working directory. They are different from the executable names generated by the compile scripts. The executables are transferred to the working directory of the computing host by the run script. At the same time they are renamed to meet the OASIS3 convention of six characters.

8. #Runtime

The parameter defines the duration of a coupled model run in seconds.

9. #Yyyymmdd

The starting date of the run is defined as an integer with eight digits. The first four digits define the year, followed by two digits for the month and two digits for the day. The variable is updated for each run of an experiment.

10. #Dta2o and #Dto2a: Coupling time steps

These parameters define the time span in seconds between two exchanges of the fields going from ECHAM5 to MPI-OM and from MPI-OM to ECHAM5, respectively. The coupling time step has to be a common multiple of the atmosphere and the ocean model time steps. All ECHAM5 exchange fields have to be exchanged at the same frequency since the averaging is done with the same frequency for all fields by ECHAM5 itself. In contrast, the MPI-OM fields might be sent with different frequencies by editing the namcouple entries. However, the run scripts do not support different exchange frequencies for fields of the same origin. No check on the synchronisation will be done with such modifications and blocking situations can easily occur. Besides, it is questionable whether the specification of different exchange frequencies is physically reasonable.
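For example, with the values of Table 3.1 the coupling time step of 86400 s (one day) contains exactly 36 atmosphere time steps of 2400 s and 10 ocean time steps of 8640 s, so it is a common multiple of both model time steps.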

11. #Nlogprt: Amount of OASIS3 standard output

The parameter controls the amount of information written to the OASIS output file. With cplout set to 0 a minimum and with cplout set to 2 a maximum of output is produced.

12. #TimTransa2o and #TimTranso2a: Time transformations

It is possible to perform local transformations on the exchange fields before they are passed to OASIS3, using the PSMILe library.

In ECHAM5, prism_put is called only at exchange dates. The transformation on the fields generated by ECHAM5 is 'INSTANT', i.e. the fields are passed as they appear in the parameter list of the prism_put subroutine calls. Time averaging is done by ECHAM5 in subroutine collect.f.

The default value for transformations on the fields generated by MPI-OM is 'AVERAGE'. Calls to prism_put are done every model time step and the fields are accumulated by the PSMILe library. Before they are passed to OASIS3 they are divided by the length of the coupling step. For consistency, the cpp flag accu_by_psmile must be activated. If it is not set in the MPI-OM compile script, prism_put is called at coupling dates only, and the averaging is done in the model. #TimTranso2a must then be set to 'INSTANT'.

The current run scripts as well as the model adaptations do not support time transformations varying from field to field. If you want to use different transformations you have to edit the namcouple base file and adapt the models.
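A hedged sketch of the MPI-OM ('AVERAGE') side is given below. The PSMILe module name and the exact prism_put argument list (written here as prism_put_proto(field_id, date_in_seconds, field, error_code)) are assumptions for illustration and are not quoted from this report:

  subroutine send_sst_every_step(id_sst, nsteps, dt_oce, sst, ni, nj)
    use mod_prism_put_proto              ! PSMILe interface (module name assumed)
    implicit none
    integer, intent(in)    :: id_sst, nsteps, ni, nj
    real,    intent(in)    :: dt_oce     ! ocean time step in seconds
    real,    intent(inout) :: sst(ni, nj)
    integer :: jstep, isec, ierr
    do jstep = 1, nsteps                 ! ocean time loop
      ! ... ocean time step: sst is updated here ...
      isec = (jstep - 1) * nint(dt_oce)  ! model date at the start of the step
      call prism_put_proto(id_sst, isec, sst, ierr)
      ! with accu_by_psmile / 'AVERAGE' the PSMILe accumulates the field and
      ! forwards only the time mean to OASIS3 at coupling dates
    end do
  end subroutine send_sst_every_step

On the ECHAM5 ('INSTANT') side the equivalent call would be issued only at exchange dates, with the field already averaged by the model itself.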

13. #Lona, #Lata, #Lono, and #Lato: Field dimensions

The parameters define the dimensions of the atmosphere and ocean exchange fields, respectively. The first dimension (#Lono) of the ocean model exchange field is smaller than the model grid dimension by 2 as the cyclic boundary values are not transferred.

14. #Extrapwr: Extrapolation flag

The flag provokes the calculation of weights and addresses for nearest neighbour extrapolation (EXTRAP/NINENN) within OASIS (#Extrapwr=1) or the reading from file (#Extrapwr=0).

15. #Cnfileaw and #Cnfileow: Restart file names

The names of the OASIS restart files written by the atmosphere (atmosphere exchange fields on the atmosphere grid) and by the ocean (ocean exchange data on the non-cyclic ocean grid) are defined using these eight-character strings.

16. #Exp: Field status

The field status defines whether or not an exchange field is written to an output file making use of the mpp_io library. Within the MPI-M coupled model setup possible options are EXPORTED for data exchange only and EXPOUT for data exchange and output. The EXPOUT option provokes the generation of two files per exchange field, one containing the exchange data as they are sent (on the source grid), the other containing the data as they are received by the partner model (on the target grid). The filenames are created automatically. They are made up of the field's key string from the namcouple and the initial date of the run.

17. #Norma: Normalisation method of SCRIP conservative remapping

The SCRIP conservative remapping supports three normalisation methods for grid cells that are partly covered by masked and unmasked source grid cells. The default option is FRACAREA, which provokes a normalisation with the area fraction of the destination grid cell covered by unmasked source grid cells. Option DESTAREA leads to normalisation with the total destination grid cell area, whereas option NONE suppresses any normalisation.
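Expressed as a formula (a sketch of the usual SCRIP convention, not quoted from this report): the remapped value in a destination cell d is F_d = (sum over source cells s of w_ds * F_s) / N_d, where the weights w_ds contain the overlap areas. N_d is the area of the destination cell covered by unmasked source cells for FRACAREA, the full destination cell area for DESTAREA, and 1 for NONE. FRACAREA therefore gives reasonable local values even in partly masked coastal cells, while DESTAREA lowers the values in such cells but keeps the area-integrated flux.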

18. #Order: Order of SCRIP conservative remapping

It is possible to perform a first (#Order=FIRST) or a second (#Order=SECOND) order SCRIP conservative remapping. For second order remapping the gradients are included. For remapping from a fine to a coarse grid, first order remapping is sufficient. On the other hand, it is recommended to use second order remapping for the remapping from a coarse to a fine grid.


Variable in           Run script variable                   Example values for MPI-AO grob T21
namcouple base file                                         sequential            concurrent
---------------------------------------------------------------------------------------------
#Nmseq                ${nmseq}                              2                     1
#Iseq                 ${iseq}                               2                     1
#Laga2o               ${laga2o}                             -84000                2400
#Lago2a               ${lago2a}                             8640
#Chan                 ${message_passing} ${nobsend}         MPI2 NOBSEND
#Mod1procs            ${nprocatm} ${ncplprocatm} ${arg1}    1 1
#Mod2procs            ${nprococe} ${ncplprococe} ${arg2}    1 1
#Cplexptid            ${jobname}                            D10
#Atmmodnam            ${atmmod}                             echam5
#Ocemodnam            ${ocemod}                             mpi-om
#Runtime              ${runtime}                            2592000 (1 month)
#Yyyymmdd             ${date}                               19780101
#Nlogprt              ${nlogprt}                            2
#Dta2o                ${dta2o}                              86400 (1 day)
#Dto2a                ${dto2a}                              86400 (1 day)
#Cnfileaw             ${cnfileaw}                           flxatmos
#Cnfileow             ${cnfileow}                           sstocean
#Exp                  ${exported}                           EXPORTED
#Lona                 ${lona}                               64
#Lata                 ${lata}                               32
#Lono                 ${lono}                               120
#Lato                 ${lato}                               101
#LocTrans             ${loctrans}                           LOCTRANS
#TimTransa2o          ${timtransa2o}                        INSTANT
#TimTranso2a          ${timtranso2a}                        INSTANT
#Extrapwr             ${extrapwr}                           0
#Norma                ${norma}                              FRACAREA
#Order                ${order}                              FIRST

Table 3.1: Configurable entry names of the namcouple base file for MPI-AO (ECHO)


The namcouple base file

The namcouple base file used in ECHO, namcouple_echo, is displayed entirely in this section. Entries beginning with '#' followed by a capital letter and small letters are configurable variables. At run time these variables are replaced with the values for the specific run of the specific coupled model configuration. The actual namcouple of a run is printed in the log file of the coupled model run. Check this file to see whether the base file was edited correctly.

###############################################################################
#
# Input file for OASIS3
#
# This version is for use with ECHAM5 fluxes and
# surface conditions computed in mpi-om.
#
# The file will be edited in the run-script to update it for the
# actual integration period and grid dimensions.
#
# Modified : S. Legutke   DKRZ   29.12.02
#            - updated from 2.4.t version
###############################################################################
#
# Input delimiters have to occupy position 1 to 9 !
# No blank lines allowed !
# Length of input lines <= 80 !
###############################################################################
#
# SEQMODE : 1 if all models run simultaneously
#           n if n models run sequentially
#
$SEQMODE
#Nmseq
$END
###############################################################################
#
# CHANNEL (CHAR*4)
#         PIPE   if named pipes + binary files are used
#                for synchro and data respectively (not tested);
#         MPI1/2 if MPI message passing is used for data exchange;
#
$CHANNEL
#Chan
#Mod1procs
#Mod2procs
$END
###############################################################################
#
# CALTYPE : calendar type
#           0      = 365 day calendar (no leap years)
#           1      = 365 day, or 366 days for leap years, calendar
#           n (>1) = n day month calendar
#
$CALTYPE
1
$END
###############################################################################
#
# NFIELDS : total number of fields being exchanged.
#
$NFIELDS
19
$END
###############################################################################
#
# JOBNAME : acronym for the given simulation (CHAR*3)
#           the value will be set before the actual integration
#
$JOBNAME
#Cplexptid
$END
###############################################################################
#
# NBMODEL : number of models and their names (CHAR*6).
#
$NBMODEL
2 #Atmmodnam #Ocemodnam


$END
###############################################################################
#
# RUNTIME (<I8)
#         total simulated time for the actual run in seconds
#         the value will be set before the actual integration
#
$RUNTIME
#Runtime
$END
###############################################################################
#
# INIDATE (I8)
#         initial date of the run (YYYYMMDD)
#
$INIDATE
#Yyyymmdd
$END
###############################################################################
#
# MODINFO (YES or NOT)
#         Indicates whether a header is encapsulated within the field
#
$MODINFO
NOT
$END
###############################################################################
#
$NLOGPRT
#Nlogprt
$END
###############################################################################
###############################################################################
#
$STRINGS
###############################################################################
# Field 1 Sea_surface_temperature [K;K]
#
SSTOCEAN SSTATMOS 1 #Dto2a 6 #Cnfileow #Exp
#Lono #Lato #Lona #Lata oces atmo SEQ=1 LAG=#Lago2a
P 0 P 0
#LocTrans INVERT CHECKIN SCRIPR CHECKOUT REVERSE
#TimTranso2a
NORSUD WSTEST
INT=1
CONSERV LR SCALAR LATITUDE 40 #Norma #Order
INT=1
NORSUD WSTEST
###############################################################################
# Field 2 Sea_ice_thickness [m;m]
#
SITOCEAN SITATMOS 45 #Dto2a 6 #Cnfileow #Exp
#Lono #Lato #Lona #Lata oces atmo SEQ=1 LAG=#Lago2a
P 0 P 0
#LocTrans INVERT CHECKIN SCRIPR CHECKOUT REVERSE
#TimTranso2a
NORSUD WSTEST
INT=1
CONSERV LR SCALAR LATITUDE 40 #Norma #Order
INT=1
NORSUD WSTEST
###############################################################################
# Field 3 Sea_ice_area_fraction [;]
#
SICOCEAN SICATMOS 31 #Dto2a 6 #Cnfileow #Exp
#Lono #Lato #Lona #Lata oces atmo SEQ=1 LAG=#Lago2a
P 0 P 0
#LocTrans INVERT CHECKIN SCRIPR CHECKOUT REVERSE
#TimTranso2a
NORSUD WSTEST
INT=1
CONSERV LR SCALAR LATITUDE 40 #Norma #Order
INT=1
NORSUD WSTEST
###############################################################################


# Field 4 lwe_surface_snow_thickness_where_sea_ice [m;m]
#
SNTOCEAN SNTATMOS 46 #Dto2a 6 #Cnfileow #Exp
#Lono #Lato #Lona #Lata oces atmo SEQ=1 LAG=#Lago2a
P 0 P 0
#LocTrans INVERT CHECKIN SCRIPR CHECKOUT REVERSE
#TimTranso2a
NORSUD WSTEST
INT=1
CONSERV LR SCALAR LATITUDE 40 #Norma #Order
INT=1
NORSUD WSTEST
###############################################################################
# Field 5 surface_downward_eastward_stress_where_open_sea [pa;pa]
#
TXWATMOU TXWOCEAU 50 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oceu SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BICUBIC LR VECTOR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 6 surface_downward_eastward_stress_where_open_sea [pa;pa]
#
TXWATMOV TXWOCEAV 50 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo ocev SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BICUBIC LR VECTOR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 7 surface_downward_northward_stress_where_open_sea [pa;pa]
#
TYWATMOU TYWOCEAU 51 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oceu SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BICUBIC LR VECTOR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 8 surface_downward_northward_stress_where_open_sea [pa;pa]
#
TYWATMOV TYWOCEAV 51 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo ocev SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BICUBIC LR VECTOR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 9 surface_downward_eastward_stress_where_sea_ice [Pa;Pa]
#
TXIATMOU TXIOCEAU 52 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oceu SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o


NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BICUBIC LR VECTOR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 10 surface_downward_eastward_stress_where_sea_ice [Pa;Pa]
#
TXIATMOV TXIOCEAV 52 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo ocev SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BICUBIC LR VECTOR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 11 surface_downward_northward_stress_where_sea_ice [pa;pa]
#
TYIATMOU TYIOCEAU 53 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oceu SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BICUBIC LR VECTOR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 12 surface_downward_northward_stress_where_sea_ice [pa;pa]
#
TYIATMOV TYIOCEAV 53 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo ocev SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BICUBIC LR VECTOR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 13 lwe_surface_downward_snow_flux_where_sea_ice [m/s;m/s]
#
FRIATMOS FRIOCEAN 55 #Dta2o 9 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oces SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CONSERV CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BILINEAR LR SCALAR LATITUDE 40
GLOBAL
INT=1
NORSUD WSTEST
###############################################################################
# Field 14 water_flux_into_ocean [m/s;m/s]
#
FRWATMOS FRWOCEAN 29 #Dta2o 9 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oces SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CONSERV CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1


BILINEAR LR SCALAR LATITUDE 40
GLOBAL
INT=1
NORSUD WSTEST
###############################################################################
# Field 15 Residual heat flux (sea-ice topmelt heat flux) [W/m^2;W/m^2]
#
RHIATMOS RHIOCEAN 43 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oces SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BILINEAR LR SCALAR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 16 downward_heat_flux_in_sea_ice [W/m^2;W/m^2]
#
CHIATMOS CHIOCEAN 42 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oces SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BILINEAR LR SCALAR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 17 surface_downward_heat_flux_in_air [W/m**2;W/m**2]
#
NHWATMOS NHWOCEAN 5 #Dta2o 9 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oces SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CONSERV CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BILINEAR LR SCALAR LATITUDE 40
GLOBAL
INT=1
NORSUD WSTEST
###############################################################################
# Field 18 surface_net_downward_shortwave_flux [W/m**2;W/m**2]
#
SHWATMOS SHWOCEAN 7 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oces SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BILINEAR LR SCALAR LATITUDE 40
INT=1
NORSUD WSTEST
###############################################################################
# Field 19 wind_mixing_energy_flux_into_ocean [m**3/s**3]
#          or wind_speed_at_10m [m/s]
#
WSVATMOS WSVOCEAN 37 #Dta2o 8 #Cnfileaw #Exp
#Lona #Lata #Lono #Lato atmo oces SEQ=#Iseq LAG=#Laga2o
P 0 P 0
#LocTrans INVERT CHECKIN MASK EXTRAP SCRIPR CHECKOUT REVERSE
#TimTransa2o
NORSUD WSTEST
INT=1
9999.999999e+06
NINENN 2 #Extrapwr 1
BILINEAR LR SCALAR LATITUDE 40
INT=1
NORSUD WSTEST


###############################################################################
$END


Chapter 4

The MPI-Met Coupled Model Constellations

The MPI-M PRISM models can be run in several different coupled combinations. The component models are configured for a specific coupled constellation at compile time. Within the PRISM system this is done in a safe and automatic way. The only thing that needs to be done is to give the list of models participating in the experiment as positional parameters when creating the compile scripts. The exact calling sequence is given below in the coupled models' sections (4.1-4.5). Depending on the parameter list, the cpp flags required for the configuration of the models are set. The full set of configuring cpp flags together with a short explanation is listed in table 5.1. Note that the HAM model (Hamburg Aerosol Model) is not yet part of the MPI-M PRISM model family. There is, however, field exchange from HAMOCC to ECHAM5 possible. This is configured with the cpl_ham cpp flag in the other models. The fields are received in ECHAM5, though their destination should be HAM (work in progress).

4.1 MPI-OM stand-alone

PRISM provides an environment to run MPI-OM not only in the coupled configuration but also as a stand-alone model. This section gives instructions on how to set up and run an experiment using MPI-OM as stand-alone ocean model within the PRISM system. For information on the model itself please read section 2.2.

4.1.1 Retrieving the Source Code

The MPI-OM source code, all related scripts and the initial and forcing data can be retrieved from the PRISM central CVS repository in one go using the CVS module MPI-OM (see section 7).

cvs checkout [-r prism_2-4] MPI-OM

The checkout command provokes the build-up of the prism standard directory tree with the root directory prism. It is possible to rename the root directory. Apart from that the directory structure should not be modified, otherwise some of the PRISM software might not work correctly.

4.1.2 Compilation

The compile script is created by calling the script Create_COMP_cpl_model.ksh (directory prism/util/compile/frames) with the parameter mpi-om. Optionally, the version ID can be given as second positional parameter. If no second parameter is given, the compile script is generated using the default version ID which is B01 for the stand-alone configuration of MPI-OM. Per default, the compile scripts are generated for the machine Create_COMP_cpl_models.frm is called on. If the compilation shall take place on another machine, the node name of the compiling host needs to be given as a third input parameter.

Create_COMP_cpl_model.ksh mpi-om [id] [node]

Calling Create_COMP_cpl_model.ksh for the stand-alone configuration of MPI-OM is equivalent to calling Create_COMP_models.frm for MPI-OM with an empty list of partner models. The list of partner models (the last input parameter of the script) is needed for the automatic definition of cpp flags specific for the constellation. More details on the compile script generation can be found in section 5.3 or in the handbook of the PRISM Standard Compilation Environment (Legutke and Gayler (2004)).

Create_COMP_models.frm mpi-om NONE - "" node ID ""

The cpp flags needed for the configuration of the stand-alone model or of a specific coupled constellation are set automatically. However, other configuration parameters for the model have to be set manually in the generated compile script. These are e.g. cpp flags for activation of non-default parameterisations (compare 2.2) or the horizontal and vertical resolution. The default resolution is the coarsest resolution available from the PRISM repository; in the case of MPI-OM it is grob (see section 7).

MPI-OM needs two executables: one for the initial start from the climatology and another one for restarted runs starting from a model generated restart file. Thus the compile script has to be run twice, once with the parameter newstart=yes (default) and once with newstart=no.

The compile script generated can be found in the model directory /prism/src/mod/mpi-om. To allow the storage of different compile scripts for different MPI-OM configurations, the scripts are labelled with the version ID (COMP_mpi-om_B01.node).

4.1.3 Model Execution

The scripts for the model execution (tasks) are generated by the script Create_TASKS.frm. The script is located in directory prism/util/running/frames. It needs to be called with the coupled model name as first and the experiment ID as second positional parameter. To create the tasks for a machine other than the default machine, the node name of the computing host has to be given as third input parameter.

Create_TASKS.frm mpi-om B01 [node]

The first call of Create_TASKS.frm for a specific coupled model and experiment ID leads to the generation of a setup file (setup_cplmod_expid) containing all configurable parameters for the experiment. In the example case the (coupled) model configuration (cplmod) is mpi-om and the experiment ID (expid) is B01. The setup file needs to be edited according to the experimental design. Reasonable defaults are provided for all variables. Table 4.1 lists the model specific variables of the setup. For detailed information on the setup please read the handbook of the PRISM Standard Running Environment (Gayler and Legutke (2004)).

Once the setup is adapted, Create_TASKS.frm has to be called again with the same input parameters. This leads to the tasks' generation. They are transferred automatically to directory home/expid/scripts on the computing host. The path variable home corresponds to the user's definition in the setup. The input data needed for the experiment as well as the executables are transferred automatically at runtime.

To start the simulation the runscript RUN_cplmod_expid needs to be submitted. Depending on the computing host this can be done interactively by just typing the name of the script or within a queueing system.

Variable       Possible Choices             Description
res_oce        grob, grob1, grob15          horizontal grid resolution acronym
vres_oce       20, 40                       number of vertical levels
ocevers        B01, any character string    ocean model version
nthreadoce     1, machine dependent         number of openMP threads

Table 4.1: Model specific parameters in the setup file generated for MPI-OM as stand-alone model. Defaults (grob, 20, B01, 1) are listed first.

4.2 MPI-OM + HAMOCC

MPI-OM can be set up to run together with the marine bio-geo-chemistry model HAMOCC as a single executable (i.e. without the coupler OASIS3). The coupling is realised by a main/sub-model relationship of the two component models. The HAMOCC routines are called from the MPI-OM main program. All parameters specifying the grid constellation are passed from MPI-OM to HAMOCC in the parameter lists of these subroutine calls. However, HAMOCC is considered to be an independent component model: the source code resides in a separate source code directory and the model has a separate compile script.

Table 4.2 lists the HAMOCC routines called from the main model MPI-OM. All routines are called from the main program MPIOM.F90. The subroutine calls are activated only if the cpp key cpl_hamocc is set; a schematic sketch of the calling sequence is given below table 4.2.

Routine          Description                                              Calling Frequency
ini_bgc          Initialisation of the bgc sub-model                      once a run
run_bgc          drives the bgc                                           each model time step
dilute_bgc       Dilution of bgc tracers due to fresh water fluxes        each model time step
avrg_bgcmean     Calculation of monthly means of the bgc tracer fields    once a month
aufw_bgc         Writing of the bgc restart file                          once a year
write_bgcmean    Calculation and writing of bgc mean data                 once a run
end_bgc          Termination of the bgc sub-model                         once a run

Table 4.2: HAMOCC routines called from the main model MPI-OM in the coupled configuration MPI-OB. The routines are listed in the calling order.
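The outline below sketches how calls like those listed in table 4.2 are typically embedded in the main program. It is an illustrative sketch only: the loop variables (lstep, nsteps) and the end-of-month/end-of-year flags are placeholder names and the argument lists are omitted; it is not the actual MPIOM.F90 code.

#ifdef cpl_hamocc
      CALL ini_bgc                        ! once a run, after the ocean initialisation
#endif
      DO lstep = 1, nsteps                ! ocean time stepping loop
         ! ... ocean dynamics and thermodynamics ...
#ifdef cpl_hamocc
         CALL run_bgc                     ! drives the bgc, every model time step
         CALL dilute_bgc                  ! dilution of bgc tracers by fresh water fluxes
         IF (l_end_of_month) CALL avrg_bgcmean   ! monthly means of the bgc tracer fields
         IF (l_end_of_year ) CALL aufw_bgc       ! bgc restart file, once a year
#endif
      END DO
#ifdef cpl_hamocc
      CALL write_bgcmean                  ! bgc mean data, once a run
      CALL end_bgc                        ! termination of the bgc sub-model
#endif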

The HAMOCC modules used by MPI-OM are listed in table 4.3. The modules are not needed to build MPI-OM in a stand-alone ocean setup. For that reason the use statements of the modules are enclosed by the cpp flag cpl_hamocc. In the current MPI-OB setup the sub-model HAMOCC is using modules of MPI-OM, listed in table 4.4. This does not meet the PRISM standards. Following the package rule, a sub-model shall not use any source code of the calling model. Further information on that rule is given in the SCE handbook (Legutke and Gayler (2004)).

4.2.1 Exchange Fields

The field exchange between MPI-OM and HAMOCC is sketched in figure 4.1. The coupled model run begins with the initialisation of MPI-OM as in the stand-alone configuration without HAMOCC (see section 2.2). When the initial files (grid description, climatology, restart file etc.) and the forcing data are read, the HAMOCC initialisation routine (ini_bgc) is called. Three dimensional arrays of temperature (T) and salinity (S) are passed from the ocean to the bgc model for the initial calculation of the chemical constants.


Module           Description                              Used in
mo_biomod        variables for marine biology             MPIOM, OCICE, OCJITR
mo_carbch        variables for inorganic carbon cycle     MPIOM, OCJITR, octdiff_bgc
mo_control_bgc   control variables for bgc modules        MPIOM, OCJITR, octdiff_bgc
mo_param1_bgc    bgc tracer parameters                    MPIOM, OCJITR, octdiff_bgc
mo_sedmnt        variables for sediment modules           MPIOM, OCJITR

Table 4.3: HAMOCC modules used by the main model MPI-OM in the coupled configuration MPI-OB.

Module           Description                              Used in
mo_commo1        ocean/sediment tracer arrays             beleg_bgc, inventory_bgc
mo_commoau1                                               inventory_bgc

Table 4.4: MPI-OM modules used by the sub-model HAMOCC in the coupled configuration MPI-OB. The usage of main model routines or modules does not meet the PRISM standards, as it breaks the package rule.

In the integration phase there are two exchange frequencies: the (time consuming) calculation of chemical constants takes place once a month and requires monthly mean temperature and salinity. Actual temperature and salinity fields are passed every time step together with the sea ice concentration (SI), the effective upper layer thickness including the effect of snow/ice draft and sea level elevation (DZ), and the vertical velocity. Note that the atmospheric variables solar radiation (fsw), 10 m wind speed (u10), and surface pressure (Pa)¹ are also passed from the ocean to HAMOCC. These fields are read from file in the stand-alone (forced) mode. In a coupled constellation including an AGCM they are provided by the atmosphere model.

Temperature and salinity are needed in the calculation of the chemical reactions (chemcon.F90), the piston velocity (carchm.F90), the carbonate ion concentration in the sediment (powach.F90), and the biological processes (ocprod.F90).

No chemical tracer and no solar radiation is allowed to penetrate the snow/ice cover. Therefore the piston velocity and the solar radiation are multiplied by the open water fraction (1-SI).

The surface pressure enters the surface tracer flux calculation. The vertical velocity is passed to the routine calculating the vertical diffusion and interface flux of pore water tracers (dipowa.F90), but is not used there.

The effective upper layer thickness (DZ) is used in the tracer budget calculation related to the sinking process (ocprod.F90). Note that the advective and diffusive transport is calculated in the ocean model.

The bgc model affects the ocean model, as the downward penetration of solar radiation in the ocean is modified by the phytoplankton concentration. This effect is calculated if the cpp flag FB_BGC_OCE is set. Otherwise the bgc does not affect the ocean calculations and is a pure diagnostic model.

4.2.2 Retrieving the Source Code

The source code of the component models MPI-OM and HAMOCC, all related scripts as well as initial and forcing data needed for this coupled model constellation can be retrieved from the PRISM central CVS repository specifying the CVS module MPI-OB (see section 7).

cvs checkout [-r prism_2-4] MPI-OB

¹ Presently the surface pressure is set to a constant value in the MPI-OM source code.


Figure 4.1: Field exchange between MPI-OM and HAMOCC. The 3D fields of temperature (T) and salinity (S) are passed from MPI-OM to HAMOCC in the initialisation phase. In the integration phase monthly mean temperature and salinity are passed every month. Besides, actual values of temperature, salinity, sea ice concentration (SI), upper layer thickness (DZ), vertical velocity, solar radiation (fsw), 10 m wind speed (u10), and surface pressure (Pa) are passed to the bgc model at every time step. The red letters are used for atmospheric variables. The bgc tracer fields are advected using the MPI-OM advection routine. Optionally, the absorption is calculated at each time step in HAMOCC and is sent to MPI-OM.


The checkout command provokes the build-up of the prism standard directory tree with the root directory prism. It is possible to rename the root directory. Apart from that the directory structure should not be modified, otherwise some of the PRISM software might not work correctly.

4.2.3 Compilation

The compile scripts are created by calling the script Create_COMP_cpl_model.ksh (directory prism/util/compile/frames) with the parameter mpi-ob. Optionally, the version ID can be given as second positional parameter. If no second parameter is specified, the compile script is generated using the default version ID which is B03 for the MPI-OM/HAMOCC configuration. Per default, the compile scripts are generated for the machine Create_COMP_cpl_models.frm is called on. If the compilation shall take place on another machine, the node name of the compiling host needs to be given as a third input parameter.

Create_COMP_cpl_model.ksh mpi-ob [id] [node]

Calling Create_COMP_cpl_model.ksh for the coupled configuration MPI-OB is equivalent to calling Create_COMP_models.frm twice, once for each of the component models. The list of component models participating in the coupled constellation is specified as last input parameter. It is needed for the automatic definition of cpp flags specific for the coupled constellation. More details on the script can be found in section 5.3 or in the handbook of the PRISM Standard Compilation Environment (Legutke and Gayler (2004)).

Create_COMP_models.frm hamocc NONE - "" node ID "mpi-om hamocc"

Create_COMP_models.frm mpi-om NONE - "" node ID "mpi-om hamocc"

Although the cpp flags needed for the configuration of a specific coupled constellation are set automatically, other configuration parameters for the models have to be set manually in the generated compile scripts. These are e.g. cpp flags for activation of non-default parameterisations (compare 2.2) or the horizontal and vertical resolution. The default resolution is the coarsest resolution available from the PRISM repository; in the case of MPI-OM and HAMOCC it is grob (see section 7). Note that MPI-OM and HAMOCC are running on the same grid: if a grid resolution other than the default is wanted, the compile scripts of both models need to be adapted.²

The compile scripts generated can be found in the model directories /prism/src/mod/mpi-om and /prism/src/mod/hamocc, respectively. To allow the storage of different compile scripts for different model configurations the scripts are labelled with the version ID (COMP_mpi-om_B03.node and COMP_hamocc_B03.node). When the models are in a main/sub-model relationship, the compile scripts must use identical version or experiment acronyms.

To compile the models, submission of the MPI-OM compile script is sufficient. The script recognises that HAMOCC will be run as a sub-model and checks whether it is up to date by launching the compile script for HAMOCC with the corresponding version acronym. The process is aborted when such a script is not available in the HAMOCC source code directory. Otherwise the library libhamocc_B03.a is updated. Then the MPI-OM library is updated (libmpi-om_B03.a) and both libraries are linked to build the executable mpi-om_hamocc_B03[_newstart].x. MPI-OM needs two executables: one for the initial start from climatology and another for restarted runs starting from a model generated restart file. Thus the compile script has to be run twice, once with the parameter newstart=yes (default) and once with newstart=no.

² In a PRISM compliant configuration a sub-model should be independent of the resolution. For that reason the grid resolution of the main model is not assigned to the sub-model automatically during the script generation and compilation process. In HAMOCC the specification of the resolution is required at compile time, because the model uses MPI-OM modules (MO_COMMO1, MO_COMMOAU1), which is not PRISM SCE compliant.


4.2.4 Model Execution

The scripts for the model execution (tasks) are generated by the script Create_TASKS.frm. The script is located in directory prism/util/running/frames. It needs to be called with the coupled model name as first and the experiment ID as second positional parameter. To create the tasks for a machine other than the default machine, the node name of the computing host has to be given as third input parameter.

Create_TASKS.frm mpi-ob B03 [node]

The first call of Create_TASKS.frm for a specific coupled model and experiment ID leads to the generation of a setup file (setup_cplmod_expid) containing all configurable parameters for the experiment. In the example case the coupled model configuration (cplmod) is mpi-ob and the experiment ID (expid) is B03. The setup file needs to be edited according to the experimental design. Reasonable defaults are provided for all variables. Table 4.5 lists the model specific variables of the setup. For detailed information on the setup please read the handbook of the PRISM Standard Running Environment (Gayler and Legutke (2004)).

Variable       Possible Choices             Description
res_oce        grob, grob1, grob15          horizontal grid resolution acronym
vres_oce       20, 40                       number of vertical levels
ocevers        B03, any character string    ocean model version
nthreadoce     1, machine dependent         number of openMP threads

Table 4.5: Model specific parameters in the setup file generated for MPI-OB. Defaults (grob, 20, B03, 1) are listed first.

Note that there are no configurable variables for HAMOCC. The model version and the grid resolution of the sub-model are, per definition, identical to the main model version and resolution.

Once the setup is adapted, Create_TASKS.frm has to be called again with the same input parameters. This leads to the tasks' generation. They are transferred automatically to directory home/B03/scripts on the computing host. The path variable home corresponds to the user's definition in the setup. The input data needed for the experiment as well as the executables are transferred automatically at runtime.

To start the simulation the runscript RUN_mpi-ob_B03 needs to be submitted. Depending on the computing host this can be done interactively by just typing the name of the script or within a queueing system.

4.3 MPI-OM + ECHAM5

The MPI-Met coupled atmosphere ocean model is called MPI-AO. It is also known as ECHO. The coupling of MPI-OM and ECHAM5 is realised in a three executable configuration: each component model as well as OASIS3 has its own executable. The exchange fields are passed from one model to the other model via OASIS3. The transformation between the grids is done by the coupler as well. Transformation method and exchange frequencies can be set by keyword specification in the OASIS3 namelist file namcouple (see section 3.2.1).

The fields passed from ECHAM5 to MPI-OM are average values over the coupling time period. In ECHAM5 this is required by the way the control of the exchange events is implemented. However, the fields provided by MPI-OM can be sent at different frequencies. It is questionable whether sending fields at different frequencies makes sense from a scientific point of view. For that reason the run scripts are set up in a way that only one exchange frequency is supported for all exchange fields going in the same direction. It is checked that the coupling time steps in both directions synchronise at the end of each day (see figure 4.2). For a setup with exchange frequencies varying from field to field the namcouple file needs to be edited directly. For more details on the technical aspects see section 3.2.1.
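The synchronisation condition can be illustrated by the following check. In the PRISM environment it is performed at script level; the Fortran fragment below, with dta2o and dto2a corresponding to the namcouple entries #Dta2o and #Dto2a, is only meant to make the condition explicit and is not taken from the run scripts.

      ! dta2o, dto2a : coupling periods [s] as set via #Dta2o and #Dto2a
      IF (MOD(86400, dta2o) /= 0 .OR. MOD(86400, dto2a) /= 0) THEN
         WRITE (*,*) 'coupling periods do not synchronise at the end of the day'
         STOP
      END IF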


Figure 4.2: Exchange algorithm between ECHAM5 and MPI-OM. The coupling time steps of the atmosphere model and of the ocean model must synchronise at the end of a day (n · dto2a = m · dta2o = 86400 s, with m and n being the number of exchange time steps in the atmosphere and ocean model, respectively).

4.3.1 Exchange Fields

The exchange algorithm of ECHAM5 and MPI-OM is depicted in figure 4.2. The ocean model calculates the ocean surface conditions and provides the corresponding fields to the atmosphere model. The freshwater, radiation and momentum fluxes are calculated in ECHAM5 and sent to MPI-OM. The exact fields are listed in the figure. Detailed information on the exchange fields and their physical relevance in the receiving model is given in the following paragraphs.

Fields going from the Atmosphere to the Ocean

ECHAM5 provides fluxes of freshwater, heat, and momentum to MPI-OM. Figure 4.3 graphs the fluxes and their treatment in the ocean.

Net heat flux: The net atmospheric heat flux field contains the sum of all heat fluxes at the ocean water surface (turbulent fluxes of sensible and latent heat, downward and outgoing long-wave radiation, and solar radiation) as well as the latent heat of fusion of ice sheet discharge from Greenland or Antarctica converted into a surface flux in the coastal cells (figure 4.3). The discharge is therefore assumed to be in the form of frozen water at the ocean temperature of the discharge cell. The field heats or cools the upper ocean layer. Heat flux exchange fields, like the freshwater fluxes, are not weighted by the sea ice concentration before they are passed.

Solar radiation: In addition to the net heat flux at the water surface, the solar radiation term is provided separately for the calculation of downward penetration of solar radiation below the upper ocean layer. Note that it is not used as a surface flux, but as an interfacial flux (decreasing downward) at the base of the ocean layers (figure 4.3). In the ocean it is weighted by the fraction of open water in each cell.


Figure 4.3: Use of ECHAM5 heat and freshwater fluxes in MPI-OM. The blue arrows represent the 'liquid' freshwater flux. It comprises all freshwater that enters the ocean directly. A positive freshwater flux increases the surface elevation (dashed line). The solid freshwater flux (grey) is dumped on the ice floes as snow. It enters the water only when it melts. A decrease of snow/ice mass is possible though not likely (dashed line). The yellow arrows represent the heat fluxes. The residual heat flux can only decrease the snow/ice volume (dashed line). The conductive heat flux can increase or decrease the ice volume. The net atmospheric heat flux includes all water surface fluxes and the latent heat of fusion of ice sheet discharge, which is assumed to enter the ocean as frozen water. The green arrows depict the vertical diffusion increase by turbulent energy flux at the surface.


Heat flux over sea ice: The heat flux over sea ice is passed in two parts (figure 4.3). The residual heat flux is what is 'left over' when the sea ice or snow skin temperature is not allowed to rise above 0°C in the surface heat budget calculation. The residual heat flux is used to melt the snow/ice slab from above and enters the ocean only when all snow and ice is melted. This term is always positive (downward) and can only decrease the snow/ice volume. The second field, the conductive or diffusive heat flux, is used in the ocean for either bottom ablation or accretion.

Liquid freshwater flux: The freshwater fluxes are also split into two fields, the liquid and the solid freshwater flux. The liquid freshwater flux comprises all freshwater fluxes that enter the ocean directly. These are all rainfall fluxes (over water and over ice, convective or large scale), evaporation (of water only), snow fall (over water only), river and continental runoff, as well as ice sheet mass discharge. A positive liquid freshwater flux increases the sea surface elevation (dashed lines) and decreases the upper layer salinity.

Solid freshwater flux: Snowfall (large scale and convective) over ice and sublimation of snow or ice are passed from ECHAM5 to MPI-OM as the solid freshwater flux. The solid freshwater flux is dumped on the ice floes. It increases the surface (barotropic) pressure but does not change the salinity. This happens only when the snow is melted. A decrease of snow mass by this flux is possible though not likely.

10m wind speed: The 10 m wind speed (u10) is used for ocean mixed layer deepening due to wind stirring caused by turbulent exchanges across the surface. Technically it increases the vertical diffusion coefficients in the upper ocean. It is weighted with the open water cell area fraction in the ocean before it is used.

Wind stress: Eight wind stress fields are passed to the coupler: the wind stress over ice and the wind stress over water are sent separately. Each is weighted by the respective surface fraction (water/ice). The zonal and the meridional component of the wind stress fields are sent twice each to be interpolated to the zonal and to the meridional grid points of the MPI-OM C-grid. The model is formulated on a distorted grid, i.e. grid directions do not coincide with geographical directions. Exchanged vector fields are rotated in the ocean model after the exchange.

In ECHAM5 all fluxes are calculated separately over open water and sea ice to account for different surface conditions such as roughness, albedo, temperature, and stability conditions. Despite the fact that ECHAM5 uses only the terms calculated over the larger of the water or land fractions of each cell, all values calculated on water or ice are used in the interpolation to the ocean grid to account for the gradients near the coast line (see section 2.1).

Fields going from the Ocean to the Atmosphere

The ocean model MPI-OM provides time mean surface conditions to the atmosphere, i.e. sea surface temperature, ice thickness, ice concentration, and snow depth on sea ice.

[A sentence on the sea surface temperature is still to be added here.] The snow depth enters the calculation of the surface albedo (by modifying the skin temperature dependent formula) and determines, together with the ice thickness, the effective thickness of the snow/ice cover which enters the calculation of the conductive heat flux through the slab. The ice and snow thicknesses passed to ECHAM5 are the effective thicknesses, i.e. they are the prognostic variables divided by the sea ice concentration. This division can create very large thicknesses for very small ice concentrations. The fields are therefore truncated before they are passed. Note that neither the use of the surface conditions, nor that of the solar radiation or the 10 m wind speed, is critical for the conservation criterion.
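A minimal sketch of the truncation described above is given below. The variable names and the limits (sic_min, hice_max, hsno_max) are illustrative assumptions only, not the actual MPI-OM/ECHAM5 code.

      ! hice, hsno : prognostic mean ice / snow thickness; sic : ice concentration
      IF (sic > sic_min) THEN
         hice_eff = MIN(hice / sic, hice_max)      ! effective thickness, truncated
         hsno_eff = MIN(hsno / sic, hsno_max)
      ELSE
         hice_eff = 0.0
         hsno_eff = 0.0
      END IF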

The MPI-AO version used for the IPCC simulations includes the ocean surface velocities as additional exchange fields. They are used to calculate the turbulent fluxes. This feature can be deactivated with the cpp flag no_cpl_oce_vel. It is not included in the prism_2-4 release.


Figure 4.4: Implementation of the coupling routines in MPI-OM. Only routines directly related to the coupling are included. Yellow boxes represent the routines belonging to the PSMILe library. All MPI-OM coupling routines are collocated in the module mo_couple. The routines called from outside the module are represented by blue boxes. Circular arrows correspond to Fortran DO loops. Depicted are the time stepping loop, loops over all fields going from the ocean to the atmosphere and loops over the fields going the other way round.

4.3.2 The Coupling Routines

As mentioned above the coupling of ECHAM5 and MPI-OM is achieved using the coupler OASIS3 and the interface library PSMILe. The library contains several routines managing the field exchange with MPI. No direct MPI calls need to be implemented in the component model source code. For detailed information on the coupling software please read the OASIS3 user's guide (Valcke et al. (2004)).

Figures 4.4 and 4.5 show the implementation of the coupling routines in the models' source code. To have a clear structure, all PSMILe calls are made from a single module. It is called mo_couple in both models. All mo_couple routines that are called from outside the module have a name starting with couple_. These routines are displayed as blue boxes in the figures. The yellow boxes represent PSMILe routines. The routine names begin with prism_. The figures only include the routines directly related to the coupling. They are based on version prism_2-4 of the PRISM system.


Figure 4.5: Implementation of the coupling routines in ECHAM5. Only routines directly related to the coupling are included. Yellow boxes represent the routines belonging to the PSMILe library. All ECHAM5 coupling routines are collocated in the module mo_couple. The routines called from outside the module are represented by blue boxes. Circular arrows correspond to Fortran DO loops. Depicted are the loop over the ECHAM5 rerun cycles (i.e. one month), the time stepping loop, loops over all fields going from the ocean to the atmosphere and loops over the fields going the other way round.


The coupling routines of MPI-OM

In MPI-OM all coupling routines are called from the main program MPIOM. At the very beginning of a run, routine couple_prep is called to open the standard output file oceout. After the model's initialisation the coupling is initialised by routine couple_init. A call of prism_init_comp_proto initialises the communication with the coupler.

The grids writing routine is called to generate the grid description files needed by the coupler. The grids writing process is started with a call to prism_start_grids_writing. The routine returns a flag stating whether or not the grid description files already exist. If the files are available, no grids writing is needed and the process is stopped. If the files are missing, the longitudes and latitudes of the grid cell centers are provided to PSMILe by routine prism_write_grid, and the grid cell corners are provided by prism_write_corner. As MPI-OM is running on a C-grid, the exchange fields are defined on three different grids. Grid acronym oces is used for scalar arrays, the acronyms oceu and ocev for zonal and meridional vector components. Thus, the routines above need to be called three times, once for each of the grids. The coordinates of the grid specified are added to the file grids.nc. Analogously the land sea masks defined on the three grids are provided by three calls of prism_write_mask and the masks are added to the file masks.nc. Note that in contrast to the land sea mask in MPI-OM the masks needed for OASIS3 use 1 for land points and 0 for ocean cells. The file areas.nc containing the grid cell areas is completed by three calls of prism_write_area. Finally the grids writing process is terminated by calling prism_terminate_grids_writing.
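The sequence can be sketched as follows. The array names (lon_s, lat_s, ...) and dimensions are placeholders, and the argument lists are indicative only; the exact PSMILe interfaces are documented in the OASIS3 user's guide (Valcke et al. (2004)).

      CALL prism_start_grids_writing (iflag)      ! iflag = 1 : files still to be written
      IF (iflag == 1) THEN
         ! one set of calls per grid acronym: 'oces', 'oceu', 'ocev'
         CALL prism_write_grid   ('oces', nx, ny, lon_s, lat_s)        ! centers -> grids.nc
         CALL prism_write_corner ('oces', nx, ny, 4, clon_s, clat_s)   ! corners -> grids.nc
         CALL prism_write_mask   ('oces', nx, ny, mask_s)              ! 1 = land -> masks.nc
         CALL prism_write_area   ('oces', nx, ny, area_s)              ! cell areas -> areas.nc
         ! ... the same four calls for 'oceu' and 'ocev' ...
         CALL prism_terminate_grids_writing
      END IF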

The local MPI communicator is received from prism_get_localcomm_proto. As MPI-OM is openMP parallelised, only one process is sending and receiving the coupling fields. The box strategy was chosen as decomposition strategy: the whole grid is defined as one box. Compared with the serial strategy, defining the grid as one box has the advantage that the exchange fields are stored as two dimensional arrays in the netCDF output files produced by PSMILe. The decomposition strategy is defined by calling prism_def_partition_proto.

Within the first run of a coupled simulation ECHAM5 and MPI-OM are running sequentially. This has the advantage that no restart files for the coupler are needed. In routine ini_write MPI-OM writes the exchange fields (as read from the MPI-OM initial files) to a file called sstocean that will be read by the coupler as restart file containing the first exchange fields for ECHAM5. The next step of the initialisation phase is the definition of the exchange fields. This happens through several calls of prism_def_var_proto; each outgoing and each incoming exchange field is defined by a separate call. To terminate the definition phase routine prism_enddef_proto is called. A call to check_par performs consistency checks of the parameters defined for MPI-OM and parameters defined for the coupler.
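A schematic of this definition phase might look as follows. The field counters, name arrays and the partition handle il_part_id are placeholders, and the exact prism_def_var_proto interface (including the PRISM_In/PRISM_Out/PRISM_Real constants) should be taken from the OASIS3 user's guide rather than from this sketch.

      var_nodims(1) = 2                        ! rank of the exchange fields
      var_nodims(2) = 1                        ! number of bundles
      var_shape(1:4) = (/ 1, nlon, 1, nlat /)
      DO jfld = 1, nflds_o2a                   ! outgoing fields (SSTOCEAN, SITOCEAN, ...)
         CALL prism_def_var_proto (var_id_out(jfld), name_out(jfld), il_part_id, &
                                   var_nodims, PRISM_Out, var_shape, PRISM_Real, ierror)
      END DO
      DO jfld = 1, nflds_a2o                   ! incoming fields (TXWATMOU, FRIATMOS, ...)
         CALL prism_def_var_proto (var_id_in(jfld), name_in(jfld), il_part_id, &
                                   var_nodims, PRISM_In, var_shape, PRISM_Real, ierror)
      END DO
      CALL prism_enddef_proto (ierror)         ! terminates the definition phase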

At the beginning of each model time step routine couple_get_a2o is called to receive the exchange fields containing the atmospheric fluxes from the coupler. Note that the routine is called every time step whereas the atmospheric fields are received at coupling time steps only (i.e. once a day). For each of the exchange fields going from the atmosphere to the ocean prism_get_proto is called. When the field is received (at a coupling time step) it is moved to the model's array for the specific variable. The ocean grid has two overlapping rows for cyclic boundaries. These two extra lines need to be completed. When all fields are received from the coupler the postprocessing takes place (post_a2o). As the MPI-OM grid is rotated and stretched, the vector arrays received need to be rotated to match the grid lines. Besides, the wind stress is divided by the reference density. The residual heat flux is corrected to avoid negative values. Negative values can occur from the OASIS3 CONSERV option (not to be mistaken for the SCRIP conservative remapping).

At the end of each model time step routine couple_put_o2a is called to send the ocean surface conditions. The routine prep_o2a prepares the coupling fields: the exchange fields are masked and a unit transformation takes place. Then, for each exchange field going from the ocean to the atmosphere routine get_o2a is called to move the array to the exchange buffer, followed by a call of prism_put_proto. Note that prism_put_proto is called every time step. The current fields are accumulated each time step by PSMILe. An actual MPI send is performed only at the coupling time step, when the accumulated fields are normalised to send the time averages (keyword AVERAGE in the namcouple).
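Put together, the exchange inside the MPI-OM time loop can be sketched like this. isec (the model time in seconds since the start of the run), the buffers and the field counters are placeholder names and not the actual mo_couple code.

      DO lstep = 1, nsteps                               ! ocean time stepping loop
         isec = (lstep - 1) * dt_oce                     ! model time in seconds
         DO jfld = 1, nflds_a2o                          ! couple_get_a2o
            ! an actual MPI receive happens at coupling dates only,
            ! otherwise PSMILe returns without action
            CALL prism_get_proto (var_id_in(jfld), isec, fld_a2o(:,:,jfld), info)
         END DO
         ! ... ocean time step ...
         DO jfld = 1, nflds_o2a                          ! couple_put_o2a
            ! with the AVERAGE keyword PSMILe accumulates the field and
            ! sends the time mean at the coupling date only
            CALL prism_put_proto (var_id_out(jfld), isec, fld_o2a(:,:,jfld), info)
         END DO
      END DO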

The calendar is updated at the end of each time step by a call of couple_calendar. It counts the seconds since the beginning of the run.

At the end of the run routine couple_end is called to finalise the coupling. The routine mainly performs a call to prism_terminate_proto which in turn deallocates the coupling fields.

The coupling routines of ECHAM5

The coupling is initialised at the beginning of a run by the routine couple_init. The routine is called from control in the first rerun cycle only. The PSMILe communication is initialised by a call of the PSMILe routine prism_init_comp_proto.

To generate the grid description files used by OASIS3 the routine grids_writing is called. The process of grids writing is started by a call of the PSMILe routine prism_start_grids_writing. The routine checks whether or not the grid description files are available. If no grids writing is needed, the process is stopped. If the grid description files are missing, the longitudes and latitudes of the grid cell centers and corners are provided to PSMILe by a call of prism_write_grid and prism_write_corner, respectively. As ECHAM5 is the first model calling the grids writing routines, the prism_write_grid call provokes the generation of the file grids.nc and adds the coordinate arrays. Similarly the file masks.nc is generated by a call of prism_write_mask. As described in section 2.1 ECHAM5 uses partial grid cells. Thus the land sea mask contains values between 0 (water) and 1 (land). OASIS uses integer mask arrays. To include all wet grid cells in the interpolation/remapping, only grid cells with a water fraction of less than one percent are treated as land cells (1). The actual wet grid cell fraction is accounted for in the areas array, where the wet area of each grid cell is defined. The file areas.nc is created by a call of prism_write_area.

The MPI local communicator is received by a call to the PSMILe routine prism_get_localcomm_proto. Although the ECHAM5 model is MPI parallelised, only one process is sending and receiving the coupling fields. The box strategy was chosen as decomposition strategy: the whole grid is defined as one box. Compared with the serial strategy, defining the grid as one box has the advantage that the exchange fields are stored as two dimensional arrays in the netCDF output files produced by PSMILe (option EXPOUT in the namcouple). The decomposition strategy is defined in prism_def_partition_proto.

All exchange fields received from the coupler and all fields sent to the coupler need to be defined by a call of prism_def_var_proto. The definition process is terminated by a call of prism_enddef_proto. In contrast to the setup of MPI-OM, the outgoing fields of ECHAM5 are accumulated in the model. In routine ini_a2o these arrays are initialised with zero. The initialisation phase of the coupling is terminated by some consistency checks performed in routine chck_par.

At the beginning of each coupling time step routine couple_get_o2a is called to receive the sea surface conditions provided by the ocean model. For each incoming exchange field the PSMILe routine prism_get_proto is called to receive the corresponding field from the coupler. Afterwards the field is moved to the model arrays and distributed over the different processors.

In contrast to MPI-OM, the accumulation of the outgoing fields is done in the model (in subroutine collect, compare section 3.2.1). At the end of the coupled time step couple_put_a2o is called to send the outgoing exchange fields to the coupler. The fluxes have been accumulated throughout the coupling period. Now, the normalised arrays are moved to the exchange buffer in routine get_a2o and sent to the coupler by prism_put_proto. When all fields are sent, the fields are re-initialised with zero to prepare the next coupling interval. Finally the calendar is updated. It counts the seconds since the beginning of the run.
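A minimal sketch of this model-side accumulation (keyword INSTANT in the namcouple), with placeholder names for the accumulation buffer and the coupling-step flag, is given below.

      ! every model time step: accumulate the flux to be exchanged
      acc_flux(:,:) = acc_flux(:,:) + flux(:,:)
      nacc          = nacc + 1
      IF (l_coupling_step) THEN
         acc_flux(:,:) = acc_flux(:,:) / REAL(nacc)      ! time mean over the interval
         CALL prism_put_proto (var_id, isec, acc_flux, info)
         acc_flux(:,:) = 0.0                             ! prepare the next interval
         nacc          = 0
      END IF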

At the end of the last time step the coupling is terminated by calling routine couple_end. In case of a sequential run (i.e. the first run of the experiment) the writing of an OASIS3 restart file is triggered (routine couple_restart_a2o). The restart file contains the atmospheric fluxes on the atmosphere grid, accumulated and normalised since the last coupling time step. The writing is performed by the PSMILe routine prism_write_restart_proto. Finally, routine prism_terminate_proto terminates the coupling and deallocates the coupling fields.

4.3.3 Retrieving the Source Code

The source code of ECHAM5, MPI-OM and OASIS3, all related scripts as well as initial and forcing data needed for this coupled model constellation can be retrieved from the PRISM central CVS repository specifying the CVS module MPI-AO or ECHO (see section 7).

cvs checkout [-r prism_2-4] MPI-AO

The checkout command provokes the build-up of the prism standard directory tree with the root directory prism. It is possible to rename the root directory. Apart from that the directory structure should not be modified, otherwise some of the PRISM software might not work correctly.

4.3.4 Compilation

The compile scripts are created by calling the script Create_COMP_cpl_model.ksh (directory prism/util/compile/frames) with the parameter mpi-ao or echo. Optionally, the version ID can be given as second positional parameter. If no second parameter is specified, the compile script is generated using the default version ID which is D10 for the MPI-AO configuration. Per default, the compile scripts are generated for the machine Create_COMP_cpl_models.frm is called on. If the compilation shall take place on another machine, the node name of the compiling host needs to be given as a third input parameter.

Create_COMP_cpl_model.ksh mpi-ao [ID] [node]

Calling Create_COMP_cpl_model.ksh for the coupled configuration MPI-AO is equivalent to calling Create_COMP_models.frm four times: for each of the two component models, for the coupler and for the libraries. The list of component models participating in the coupled constellation (ECHAM5 and MPI-OM) is specified as last input parameter when generating the compile scripts for the two models. The list is needed for the automatic definition of cpp flags specific for the coupled constellation. More details on the script can be found in section 5.3 or in the handbook of the PRISM Standard Compilation Environment (Legutke and Gayler (2004)).

Create_COMP_libs.frm - "" node

Create_COMP_models.frm oasis3 "" - "" node " " ""

Create_COMP_models.frm echam5 "" - "" node ID "echam5 mpi-om"

Create_COMP_models.frm mpi-om "" - "" node ID "echam5 mpi-om"

The scripts for the libraries and for OASIS3 (first two lines in the example) need to be created only once for the PRISM system and therefore do not have any version ID associated. The message passing method used for the data exchange between the executables (MPI1 or MPI2) needs to be configured at compile time. This determines the way the executables are launched. The executables have the string MPI1 or MPI2 appended to their names to allow the storage of executables of both kinds. ECHAM5 and MPI-OM of PRISM release prism_2-4 can only be run with MPI2. This is the default of the second parameter "". Note that the second parameter cannot be specified as MPI2 for technical reasons. More details on the compile script generation are given in section 5.3.

Although the cpp flags needed for the configuration of a specific coupled constellation are set automatically, other configuration parameters for the models have to be set manually in the generated compile scripts. These are e.g. cpp flags for activation of non-default parameterisations (compare sections 2.1 and 2.2) or the horizontal and vertical resolution of the MPI-OM grid. The default resolution is the coarsest resolution available from the PRISM repository; in the case of MPI-OM it is grob (see chapter 7). The ECHAM5 executable does not depend on the resolution.

The compile scripts generated can be found in the directories prism/src/mod/echam5, prism/src/mod/mpi-om and prism/src/mod/oasis3, respectively. To allow the storage of different compile scripts for different model configurations the scripts for the models are labelled with the version ID (COMP_echam5_D10.node and COMP_mpi-om_D10.node). All three compile scripts have then to be launched explicitly by the user. However, the scripts trigger the update of the libraries they need. Therefore, the library compile script must have been created beforehand.

MPI-OM needs two executables: one for the initial start from climatology and another for restarted runs starting from a model generated restart file. Thus the compile script has to be run twice, once with the parameter newstart=yes (default) and once with newstart=no.

4.3.5 Model Execution

The scripts for the model execution (tasks) are generated by the script Create_TASKS.frm. The script is located in directory prism/util/running/frames. It needs to be called with the coupled model name as first and the experiment ID as second positional parameter. To create the tasks for a machine other than the default machine, the node name of the computing host has to be given as third input parameter.

Create_TASKS.frm mpi-ao D10 [node]

The first call of Create_TASKS.frm for a specific coupled model and experiment ID leads to the generation of a setup file (setup_cplmod_expid) containing all configurable parameters for the experiment. In the example case the coupled model configuration (cplmod) is mpi-ao and the experiment ID (expid) is D10. The setup file needs to be edited according to the experimental design. Reasonable defaults are provided for all variables. Table 4.6 lists the model specific variables of the setup. For detailed information on the setup please read the handbook of the PRISM Standard Running Environment (Gayler and Legutke (2004)).

Once the setup is adapted, Create_TASKS.frm has to be called again with the same input parameters. This leads to the tasks' generation. They are transferred automatically to directory home/D10/scripts on the computing host. The path variable home corresponds to the user's definition in the setup. The input data needed for the experiment as well as the executables are transferred automatically at runtime.

To start the simulation the runscript RUN_echo_D10 needs to be submitted. Depending on the computing host and on the computing resources needed this can be done interactively by just typing the name of the script or within a queueing system.

4.4 ECHAM5 + MPI-OM + HAMOCC

The coupling of the three models ECHAM5, MPI-OM, and HAMOCC is realised in a three-executable configuration, like the MPI-AO configuration. The bio-geo-chemistry model is included as a sub-model of the ocean model MPI-OM.

4.4.1 Exchange Fields

The fields exchanged between MPI-OM and ECHAM5 are the same as in the MPI-AO configuration; the field exchange between MPI-OM and HAMOCC corresponds to the MPI-OB configuration. Figure 4.6 displays the field exchange between the three models. The exchange between ECHAM5 and MPI-OM is accomplished by the coupler OASIS3. In contrast, the exchange between MPI-OM and HAMOCC is realised through subroutine parameter lists.


Variable       Default    Possible Choices                   Description

ECHAM5
res_atm        t21        t63 t106                           horizontal grid resolution acronym
vres_atm       19         31 31                              number of vertical levels
atmvers        D10        three character string             atmosphere model version
out_filetype   2          1: GRIB, 2: netCDF                 output file format
lhd            yes        yes, no                            activation of the hydrological discharge model
echamid        65010      five character string              echam user/experiment id

MPI-OM
res_oce        grob       grob1 grob15                       horizontal grid resolution acronym
vres_oce       20         20 40                              number of vertical levels
ocevers        D10        any character string               ocean model version

OASIS3
jobname        D10        three character string             OASIS experiment ID
nlogprt        2          0: little, 1: much, 2: very much   OASIS3 standard output extent
ncplvers       ""         any character string               namcouple version
scripwr        0          0: no, 1: yes                      trigger calculation of new SCRIP remapping matrices
gridswr        0          0: no, 1: yes                      trigger writing of new grid description files
extrapwr       0          0: no, 1: yes                      trigger writing of a new extrapolation matrix (NINENN)
order          FIRST      FIRST SECOND                       order of SCRIP conservative remapping
dto2a          86400      86400 43200                        coupling time step from ocean to atmosphere [s]
timtranso2a    AVERAGE    INSTANT AVERAGE                    time transformation of coupling fields
export         EXPORTED   EXPORTED EXPOUT                    write exchange fields to output file

Partitioning
nthreadoce     1          machine dependent                  number of openMP threads for MPI-OM
nprocatm       1          machine dependent                  number of MPI processors for ECHAM5

Table 4.6: Model specific parameters in the setup file generated for MPI-AO. Defaults are given in the second column.


The export of coefficients for solar attenuation by phytoplankton is optional and can be switched on with the cpp flag FB_OCE_BGC. If it is not activated, HAMOCC has no influence on the ocean and can be considered as a diagnostic package running online.
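As a schematic illustration only, such an optional feedback through argument lists can be wrapped with the cpp flag named above. The subroutine and variable names below are hypothetical and not taken from MPI-OM or HAMOCC:

      SUBROUTINE ocean_swrad_sketch(swrad, abs_bgc)
        ! Hypothetical sketch: apply the attenuation computed by the bgc model
        ! to the short-wave radiation in the ocean only if the feedback is active.
        IMPLICIT NONE
        REAL, INTENT(inout) :: swrad(:,:)   ! short-wave radiation in the ocean
        REAL, INTENT(in)    :: abs_bgc(:,:) ! attenuation provided by HAMOCC
#ifdef FB_OCE_BGC
        swrad(:,:) = swrad(:,:) * abs_bgc(:,:)
#endif
        ! Without FB_OCE_BGC the bgc model runs purely diagnostically.
      END SUBROUTINE ocean_swrad_sketch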

4.4.2 Retrieving the Source Code

The source codes and related scripts for the MPI-AOB constellation can be retrieved from the PRISM source code repository using the CVS module MPI-AOB. The module comprises the source code of ECHAM5, MPI-OM, HAMOCC and OASIS3, all related scripts as well as initial and forcing data needed for this coupled model constellation (see chapter 7).

cvs checkout [-r prism_2-4] MPI-AOB

The checkout command provokes the build-up of the PRISM standard directory tree with the root directory prism. It is possible to rename the root directory. Apart from that the directory structure should not be modified, otherwise some of the PRISM software might not work correctly.

4.4.3 Compilation

The compile scripts are created by calling the script Create_COMP_cpl_model.ksh (directory prism/util/compile/frames) with the parameter mpi-aob. Optionally, the version ID can be given as second positional parameter. If no second parameter is specified, the compile script is generated using the default version ID, which is D15 for the MPI-AOB configuration. Per default, the compile scripts are generated for the machine Create_COMP_cpl_model.ksh is called on. If the compilation shall take place on another machine, the node name of the compiling host needs to be given as a third input parameter.

Figure 4.6: The field exchange algorithm between ECHAM5, MPI-OM and HAMOCC. It is the same as that for the combinations ECHAM5+MPI-OM (section 4.3) and MPI-OM+HAMOCC (section 4.2). The inter-executable exchange between ECHAM5 and MPI-OM is managed by the coupler OASIS3. The direct coupling between MPI-OM and HAMOCC is realised through subroutine parameter lists. The export of solar attenuation coefficients by the phytoplankton (dashed arrow) is optional.

Create_COMP_cpl_model.ksh mpi-aob [ID] [node]

Calling Create_COMP_cpl_model.ksh for the coupled configuration MPI-AOB is equivalent to calling Create_COMP_models.frm five times: for each of the three component models, for the coupler and for the libraries. The list of component models participating in the coupled constellation (ECHAM5, MPI-OM and HAMOCC) is specified as last input parameter when generating the compile scripts for the three models. The list is needed for the automatic definition of cpp flags specific for the coupled constellation. More details on the script can be found in section 5.3 or in the handbook of the PRISM Standard Compilation Environment (Legutke and Gayler (2004)).

Create_COMP_libs.frm - "" node

Create_COMP_models.frm oasis3 "" - "" node " " ""

Create_COMP_models.frm echam5 "" - "" node ID "echam5 mpi-om hamocc"

Create_COMP_models.frm hamocc "" - "" node ID "echam5 mpi-om hamocc"

Create_COMP_models.frm mpi-om "" - "" node ID "echam5 mpi-om hamocc"

The scripts for the libraries and for OASIS3 (first two lines in the example) need to be created only once for the PRISM system and therefore do not have any version ID associated. The message passing method used for the data exchange between the executables (MPI1 or MPI2) needs to be configured at compile time. This determines the way the executables are launched. The executables have the string MPI1 or MPI2 appended to their names to allow the storage of executables of both kinds. ECHAM5 and MPI-OM of PRISM release prism_2-4 can only be run with MPI2. This is the default of the second parameter "". Note that the second parameter cannot be specified as MPI2 for technical reasons. More details on the compile script generation are given in section 5.3.

Although the cpp flags needed for the configuration of a specific coupled constellation are set automatically, other configuration parameters for the models have to be set manually in the generated compile scripts. These are e.g. cpp flags for activation of non-default parameterisations (compare sections 2.1, 2.2 and 2.3) or the horizontal and vertical resolution of the MPI-OM grid. The default resolution is the coarsest resolution available from the PRISM repository; in the case of MPI-OM and HAMOCC it is grob (see chapter 7). Note that MPI-OM and HAMOCC are running on the same grid: if a grid resolution other than the default is wanted, the compile scripts of both models need to be adapted.³ The ECHAM5 executable does not depend on the resolution.

The compile scripts generated can be found in the directories prism/src/mod/echam5, prism/src/mod/mpi-om, prism/src/mod/hamocc and prism/src/mod/oasis3, respectively. To allow the storage of different compile scripts for different model configurations the scripts for the models are labelled with the version ID (COMP_echam5_D15.node, COMP_mpi-om_D15.node and COMP_hamocc_D15.node). When the models are in a main/sub-model relationship (as MPI-OM and HAMOCC), the compile scripts must use identical version acronyms.

The compile scripts of ECHAM5, MPI-OM and OASIS3 have to be launched explicitly by the user. The scripts trigger the update of the libraries they need. Therefore, the library compile script must have been created beforehand. Similarly, the MPI-OM compile script recognises that HAMOCC will be run as a sub-model and checks whether it is up to date by launching the compile script for HAMOCC with the corresponding version acronym. The process is aborted when such a script is not available in the HAMOCC source code directory.

³ In a PRISM compliant configuration a sub-model should be independent of the resolution. For that reason the grid resolution of the main model is not assigned to the sub-model automatically during the script generation and compilation process. In HAMOCC the specification of the resolution is required at compile time, because the model uses MPI-OM modules (MO_COMMO1, MO_COMMOAU1), which is not PRISM SCE compliant.


Otherwise the library libhamocc_D15.a is updated. Then the MPI-OM library is updated (libmpi-om_D15.a) and both libraries are linked to build the executable mpi-om_hamocc_D15[_newstart].x. MPI-OM needs two executables: one for the initial start from climatology and another for restarted runs starting from a model generated restart file. Thus the compile script has to be run twice, once with the parameter newstart=yes (default) and once with newstart=no.

4.4.4 Model Execution

The scripts for the model execution (tasks) are generated by the script Create_TASKS.frm. The script is located in directory prism/util/running/frames. It needs to be called with the coupled model name as first and the experiment ID as second positional parameter. To create the tasks for a machine other than the default machine, the node name of the computing host has to be given as third input parameter.

Create_TASKS.frm mpi-aob D15 [node]

The first call of Create_TASKS.frm for a specific coupled model and experiment ID leads to the generation of a setup file (setup_cplmod_expid) containing all configurable parameters for the experiment. In the example case the coupled model configuration (cplmod) is mpi-aob and the experiment ID (expid) is D15. The setup file needs to be edited according to the experimental design. Reasonable defaults are provided for all variables. The setup file for MPI-AOB is very similar to that created for MPI-AO. A list of the model specific variables of the MPI-AO setup is given in table 4.6. The only differences in the MPI-AOB setup are different defaults for the version IDs of the executables and for the OASIS3 jobname (D15 instead of D10). For detailed information on the setup please read the handbook of the PRISM Standard Running Environment (Gayler and Legutke (2004)).

Once the setup is adapted, Create_TASKS.frm has to be called again with the same input parameters. This leads to the tasks' generation. They are transferred automatically to directory home/D15/scripts on the computing host. The path variable home corresponds to the user's definition in the setup. The input data needed for the experiment as well as the executables are transferred automatically at runtime.

To start the simulation the runscript RUN_mpi-aob_D15 needs to be submitted. Depending on the computing host and on the computing resources needed this can be done interactively by just typing the name of the script or within a queueing system.

4.5 Exchange from HAMOCC to (EC)HAM

The inclusion of the MPI-Met aerosol model HAM into the PRISM MPI-Met model system is in progress. Similar to the coupling of HAMOCC with MPI-OM, HAM will be linked as a library to ECHAM5. In the current version the aerosol model HAM is not included. The fields that are planned to be exchanged from HAMOCC to HAM are currently received by the atmosphere model ECHAM5. As in the MPI-AOB configuration the field exchange between ECHAM5 and MPI-OM happens via OASIS3. The data exchange between MPI-OM and HAMOCC is achieved through the parameter lists of subroutine calls. The exchange between ECHAM5 and HAM will also be realised directly via subroutine parameter lists. However, the exchange between HAMOCC and HAM will be managed by OASIS3. The two models will be running as sub-models of different main models in different executables and on different grids. The exchange data are two-dimensional surface fields, and the interpolation will be performed by OASIS3 (see figure 4.7).

4.5.1 Exchange Fields

Figure 4.7 displays the field exchange between the models. The field exchange between ECHAM5 and MPI-OM and between MPI-OM and HAMOCC is realised in the same way as in the MPI-AOB configuration described in section 4.4, i.e. via OASIS3/PSMILe and through subroutine parameter lists, respectively.


Figure 4.7: The field exchange algorithm between ECHAM5, MPI-OM and HAMOCC including DMS exchange. The exchange algorithm between ECHAM5 and MPI-OM and between MPI-OM and HAMOCC is the same as in the MPI-AOB configuration (compare figure 4.6). As the aerosol model HAM is not yet included in the system, the DMS flux that is supposed to go from HAMOCC to HAM is received by the atmosphere model ECHAM5. The exchange of DMS happens via OASIS3.

This section focuses on the data exchange from HAMOCC to (EC)HAM. The notation (EC)HAM was chosen to make clear that the fields are intended for the HAM model, but as it is not yet included they are received by ECHAM. The export of coefficients for solar attenuation by phytoplankton is optional and can be switched on with the cpp flag FB_OCE_BGC. If it is not activated, HAMOCC has no influence on the ocean and can be considered as a diagnostic package running online.

As an example, the transfer of the DMS (dimethyl sulphide) flux is included in the HAMOCC source code. Other exchange algorithms are planned and therefore this particular scheme needs a special cpp flag (cpl_dms), which is presently set automatically. In order to keep model variables in the component they belong to, the declaration of the DMS exchange variable for the interface library PSMILe is done in HAMOCC, separated from the declarations in MPI-OM.
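A minimal sketch of what such a HAMOCC-side declaration and transfer might look like is given below; the flag name cpl_dms is taken from the text, while the field name 'DMSFLUX', the variable names and the surrounding context are purely illustrative:

      ! Declaration phase, executed inside HAMOCC (not inside MPI-OM):
#ifdef cpl_dms
      CALL prism_def_var_proto(id_dmsflux, 'DMSFLUX', part_id, var_nodims, &
                               PRISM_Out, var_shape, PRISM_Real, ierror)
#endif

      ! Later, at the coupling dates, the flux computed by HAMOCC is handed
      ! to OASIS3, which interpolates it to the atmosphere grid:
#ifdef cpl_dms
      CALL prism_put_proto(id_dmsflux, kdate, dmsflux, info)
#endif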

The DMS flux is calculated in HAMOCC. The 10m wind speed provided by ECHAM5 is used in the calculation of the turbulent exchange coefficient. Therefore the time scale of the calculated flux depends heavily on the frequency of data exchange from ECHAM5 to MPI-OM. Although the frequency of data transfer from HAMOCC to (EC)HAM can be set independently from the other exchange frequencies, it is questionable whether it makes sense to specify a frequency other than that for the transfer from ECHAM5 to MPI-OM. Note that the 10m wind speed is received from ECHAM5 by MPI-OM via OASIS3 and is from there passed to HAMOCC through parameter lists.

4.5.2 Retrieving the Source Code

The source code needed to set up this model configuration is the same as that for the MPI-AOB configuration described in section 4.4. Instructions on how to download the code and associated scripts are given in that section.


In a later release the source code of HAM will be included. The name of the appropriate CVS module will be MPI-AOB-HAM.

4.5.3 Compilation

The compile scripts are created by calling the script Create_COMP_cpl_model.ksh (directory prism/util/compile/frames) with the parameter mpi-aob-ham. Optionally, the version ID can be given as second positional parameter. If no second parameter is specified, the compile script is generated using the default version ID, which is D16 for the MPI-AOB-HAM configuration. Per default, the compile scripts are generated for the machine Create_COMP_cpl_model.ksh is called on. If the compilation shall take place on another machine, the node name of the compiling host needs to be given as a third input parameter.

Create_COMP_cpl_model.ksh mpi-aob-ham [ID] [node]

Calling Create_COMP_cpl_model.ksh for the coupled configuration MPI-AOB-HAM is equivalent to calling Create_COMP_models.frm for each of the component models, for the coupler and for the libraries. The list of component models participating in the coupled constellation (ECHAM5, HAM, MPI-OM and HAMOCC) is specified as last input parameter when generating the compile scripts for the models. The list is needed for the automatic definition of cpp flags specific for the coupled constellation. More details on the script can be found in section 5.3 or in the handbook of the PRISM Standard Compilation Environment (Legutke and Gayler (2004)).

Create_COMP_libs.frm - "" node

Create_COMP_models.frm oasis3 "" - "" node " " ""

Create_COMP_models.frm echam5 "" - "" node ID "echam5 ham mpi-om hamocc"

Create_COMP_models.frm hamocc "" - "" node ID "echam5 ham mpi-om hamocc"

Create_COMP_models.frm mpi-om "" - "" node ID "echam5 ham mpi-om hamocc"

The initialisation of the field exchange is done in MPI-OM. The initialisation phase is terminated by the last model declaring exchange fields using the PSMILe library; in this case it is HAMOCC. This situation is controlled by the cpp flag prism_last_subm, which is also set automatically when MPI-OM and HAMOCC are both exchanging data with OASIS3. For more details on the main/sub-model configuration see the PRISM SCE handbook (Legutke and Gayler (2004)).

The compilation process is very similar to that of the MPI-AOB setup described in section 4.4. Please read that section for further information. In the default configuration of MPI-AOB-HAM the version and experiment ID D16 is used. The three compile scripts for ECHAM5, MPI-OM and OASIS3 have to be launched explicitly by the user. The script for HAMOCC is launched by the MPI-OM compile script automatically. It is mandatory that both compile scripts use the same version ID. When HAM is included as a sub-model of ECHAM5, the corresponding compile script will be launched by the ECHAM5 compile script. Remember that MPI-OM needs two executables, the first to start from initial conditions, the second for restarted runs.

4.5.4 Model Execution

The scripts for the model execution (tasks) are generated by the script Create_TASKS.frm. The script is located in directory prism/util/running/frames. It needs to be called with the coupled model name as first and the experiment ID as second positional parameter. To create the tasks for a machine other than the default machine, the node name of the computing host has to be given as third input parameter.

Create_TASKS.frm mpi-aob-ham D16 [node]


The first call of Create_TASKS.frm for a specific coupled model and experiment ID leads to the generation of a setup file (setup_cplmod_expid) containing all configurable parameters for the experiment. In the example case the coupled model configuration (cplmod) is mpi-aob-ham and the experiment ID (expid) is D16. The setup file needs to be edited according to the experimental design. Reasonable defaults are provided for all variables. The setup file for MPI-AOB-HAM is very similar to the file created for MPI-AO. A list of the model specific variables of the MPI-AO setup is given in table 4.6. In the MPI-AOB-HAM setup the defaults for the version IDs of the executables and for the OASIS3 jobname differ (D16 instead of D10). Besides, two extra parameters dealing with the data exchange between HAMOCC and (EC)HAM are listed. These parameters are given in table 4.7. For detailed information on the setup please read the handbook of the PRISM Standard Running Environment (Gayler and Legutke (2004)).

Variable      Default    Possible Choices          Description
atmvers       D16        three character string    atmosphere model version
echamid       65016      five character string     echam user/experiment id
ocevers       D16        any character string      ocean model version
jobname       D16        three character string    OASIS experiment ID
dtb2a         86400      86400 43200               coupling time step from bgc to atmosphere (aerosols) [s]
timtransb2a   AVERAGE    INSTANT AVERAGE           time transformation of coupling fields going from bgc to atmosphere (aerosols)

Table 4.7: Model specific parameters in the setup file generated for MPI-AOB-HAM that are not included in or differ from the setup file for the MPI-AO configuration (table 4.6). Defaults are given in the second column.

Once the setup is adapted, Create_TASKS.frm has to be called again with the same input parameters. This leads to the tasks' generation. They are transferred automatically to directory home/D16/scripts on the computing host. The path variable home corresponds to the user's definition in the setup. The input data needed for the experiment as well as the executables are transferred automatically at runtime.

To start the simulation the runscript RUN_mpi-aob-ham_D16 needs to be submitted. Depending on the computing host and on the computing resources needed this can be done interactively by just typing the name of the script or within a queueing system.


Chapter 5

Compilation

The compilation of the MPI-Met component models is realised following the PRISM standards. For detailed information please read the handbook on the standard compilation environment (Legutke and Gayler (2004)). This chapter highlights special aspects of the SCE using the MPI-Met models as example. The explicit steps that need to be taken to compile the components of a coupled model are listed for each of the coupled configurations in chapter 4.

5.1 Conditional Compilation

Each of the component models can run in a number of constellations as described in chapter 2. The constellation is specified at compile time and the executables are configured accordingly.

Table 5.1 lists the coupling specific cpp flags which are used for configuring the source code. To activate conditional code that needs to be compiled only if the component is coupled with a specific other component (model), the cpp flags cpl_model (e.g. cpl_mpiom, cpl_echam5) were defined. Such a flag is defined automatically in the compile script when the corresponding component model appears in the list of partner models. Note that the list of partner models is checked against a fixed list of possible partner models. For more details please check the SCE handbook (Legutke and Gayler (2004)).
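A schematic example of how such a flag is typically used in the source code; the flag spelling follows table 5.1, and the variable name is a placeholder rather than an actual MPI-OM variable:

#ifdef cpl_echam5
      ! Compiled only when echam5 appears in the list of partner models.
      lcouple_to_atm = .TRUE.
#else
      lcouple_to_atm = .FALSE.
#endif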

The cpp key prism is used to activate or deactivate source code specific to the general coupling with the PRISM software. It is e.g. used in ECHAM5 to switch off the compilation of the calls to the PRISM software (PSMILe and mpp_io). For a stand-alone run the ECHAM5 source code can be compiled without having these libraries.

The cpp key prism_last_subm is needed for main/submodel constellations to activate the call to the PSMILe routine prism_enddef which terminates the declaration and definition phase of the PRISM software. All models define their exchange fields and MPI ports themselves. In order not to violate the 'package rule', prism_enddef must be called by the last model declaring coupling fields. For MPI-AO this is the ocean model MPI-OM, whereas in the MPI-AOB-HAM configuration it is the bgc model HAMOCC.
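In code this amounts to a guard around the terminating call, sketched below with the flag spelling of table 5.1 (the exact spelling in the sources may differ):

#ifdef prism_last_subm
      ! Only the last (sub)model declaring coupling fields may terminate the
      ! declaration and definition phase ('package rule').
      CALL prism_enddef_proto(ierror)
#endif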

The cpp key synout activates the output of synchronisation messages. This is useful during the development phase but should be deactivated for normal runs.

The cpp key CLIM_Box switches from a serial domain decomposition to a decomposition into boxes even if the field exchange is done by only one processor. There is just one box which comprises the full grid domain. The advantage of this specification is that mpp_io output files contain two-dimensional data arrays and can be visualised with ncview.

The cpp key cpl_pisces activates the use of the marine bgc model PISCES in MPI-OM. PISCES does not belong to the MPI-M component model family.

The cpp key accu_by_psmile appears in the model for historical reasons. It was used to demonstrate the possibility of using PSMILe routines to accumulate the exchange fields instead of performing the accumulation by the model's own routines. The cpp key should be activated and will be removed soon.

Key name          Action                                                        Default
cpl_mpiom         wraps all source code needed for coupling with mpi-om        -
cpl_echam5        wraps all source code needed for coupling with echam5        -
cpl_hamocc        wraps all source code needed for coupling with hamocc        -
cpl_ham           wraps all source code needed for coupling with ham           -
cpl_pisces        coupling with pisces                                          -
prism_last_subm   activates the call to prism_enddef                            -
prism_have_subm   deactivates the call to prism_enddef                          -
prism             activation of calls to PSMILe routines                        -
CLIM_Box          decomposition into one box instead of serial decomposition   -
synout            detailed information on coupling activities                   -

Table 5.1: List of cpp flags for conditional compilation of source code related to coupling.

Apart from the general purpose cpp flags listed in table 5.1 there are model specific cpp flags for the configuration of the individual components. Specific cpp flags for MPI-OM are listed in table 2.6. Another method to control the coupled models is the specification of namelist parameters. Those which are used for the control of the coupled models described here are listed in tables 2.3, 2.7 and 2.10.

5.2 Library Compilation

The libraries needed by a component model are compiled automatically when the model's compile script is executed. Prerequisite is the existence of a library compile script. The script is called with the list of the libraries needed in the loading process and with the appropriate message passing (MPI1/MPI2). The libraries needed by the MPI-Met coupled model components are listed in table 5.2. All models communicating with the coupler OASIS3 need to be compiled with the PSMILe and the CLIM library. PSMILe (PRISM System Model Interface Library) routines are called by the component models to set up, maintain and terminate the communication with OASIS3. The mpp_io library is used for diagnostic model output in netCDF/CF format and is called from PSMILe routines. The counterpart of PSMILe is the CLIM library which is linked to OASIS3. The other OASIS3 libraries are used for interpolation. More details on the coupling software can be found in Valcke et al. (2004).

Component   Libraries
ECHAM5      support, netcdf90, PSMILe, mpp_io
MPI-OM      PSMILe, mpp_io
HAMOCC      PSMILe, mpp_io
OASIS3      anaisg, anaism, scrip, fscint, CLIM

Table 5.2: The libraries needed by the components of the MPI-Met coupled models.

5.3 Generation of Compile Scripts

Each model or submodel requires its own compilation script, whether it builds an executable or is linked as a submodel to a main model. Compile scripts for models are generated with the tool Create_COMP_models.frm located in directory util/compile/frames. The tool provides a help function (Create_COMP_models.frm --help). It accepts 7 command line parameters, which must be given in the right order:

$1 model name: Name of the directory containing the model source code (lower case, e.g. mpi-om, echam5). The parameter is mandatory.


$2 message passing library: The message passing library used for communication with the coupler. Possible options are: "" to choose MPI-2 which is the default, MPI1 for the usage of MPI-1, or NONE if no OASIS communication is wanted.

$3 standard output direction: The (g)make standard output is directed to a file ("") or to the screen (-).

$4 error output direction: The (g)make error output is directed to a file (""), to the screen (-) or to the standard output file (+).

$5 node name: The node name of the machine for which the compile script is generated. If no node name is given ("") the compile scripts are generated for the default machine, which is the machine Create_COMP_models.frm is called on. To generate compile scripts for another machine the node name must be given (e.g. ds for the NEC-SX cross-compile server at DKRZ).

$6 version acronym: Acronym for the model version. This can be an experiment ID, a release tag, or any other string. The version acronym is appended to the executable name to discriminate between different versions of the same model. Sub-models must use the same version acronym as the main model they are linked to.

$7 list of participating models: The list of participating models defines the coupled model configuration, i.e. the component models that are part of the coupled model.

The compile script for the libraries is generated with the tool util/compile/frames/Create_COMP_libs.frm. The procedure is similar to the above. Type Create_COMP_libs.frm --help for help.

The script util/compile/frames/Create_COMP_cpl_model.ksh provides an easy way to generate all compile scripts needed for a coupled model. The only mandatory input parameter for that script is the coupled model name. Besides, it is possible to specify the version acronym and the node name. Default values are used for all other parameters listed above. Examples for the usage of Create_COMP_cpl_model.ksh are given in chapter 4.

5.4 The Compilation Process

When the compile script of a model is launched, it is first checked whether all pieces are available and up to date. This means the model compile script first launches the library compile script and checks all libraries which are needed for the model. Then it checks whether a submodel is to be linked to the executable. If so, the compile script for the submodel (e.g. HAMOCC) is launched. Note that this compile script has to be generated in advance and must have the same version acronym as the main model.

After successful compilation the executable is moved to directory architecture/bin, which is part of the PRISM standard directory structure. The standard name of the executables, model_modvers.node, depends on the model, the model version, the node name of the machine it was created for and on the message passing method used for the coupling. At runtime the executable is copied to the working directory of the experiment and is renamed to meet the OASIS3 standard. If the executable name is modified by the user, the run script will not find it unless it was adapted accordingly.


Chapter 6

Modification of MPI-M model coupling algorithms

This chapter gives a brief overview of the steps that need to be taken to change the coupling algorithm of a coupled PRISM model. Ideally, it should be possible to define all aspects of the coupling interface within the setup file of the experiment (compare tables 4.1, 4.5, 4.6 and 4.7). However, this requires very flexible models and very comprehensive setup files and scripts. For that reason changing the coupling algorithm might need modifications at different places. Some aspects of the coupling algorithm are defined within the component models, e.g. the number and kind of exchange fields. Other aspects, such as the interpolation method, are defined within the namcouple. The examples of the following sections refer to the coupled model ECHO (section 4.3).

6.1 Changing the number of processors the component models run on

The setup variable nprocatm defines the number of MPI processors used by ECHAM5. Per default, the model is running on one processor. To use more than one processor just adapt the variable.

The PRISM version of MPI-OM uses openMP parallelisation. Use the setup variable nthreadoce to adapt the number of openMP threads to the desired value (the default value is 1).

At runtime, the modified values are delivered to the models as namelist parameters. Thus no recompilation is needed. Besides, within the runscript the variables are used to launch the coupled model. The resource request for the queueing system is adapted to the new values as well.

6.2 Changing the exchange frequencies

The implementation of PSMILe calls in ECHAM5 and in MPI-OM is not realised in the same way. For that reason modifications of the exchange frequencies are done differently for MPI-OM and ECHAM5.

6.2.1 Fields going from the ocean to the atmosphere

The exchange interval of fields going from the ocean to the atmosphere is defined by the setup variable dto2a. The default value is 86400 seconds (1 day). All fields going from the ocean to the atmosphere are sent with the same frequency. To modify this value just adapt the parameter. Make sure that the exchange time step is a common multiple of the atmosphere and ocean model time steps. Besides, the exchange time step needs to be a factor of the length of a run. Thus for ECHO the only alternative value for dto2a is 43200 seconds (1/2 day).


In MPI-OM the PSMILe routines to send and to receive the coupling fields are called at each model time step (compare section 4.3.2). However, the actual MPI sending process is triggered at the coupling time steps only. The accumulation and normalisation of the coupling fields is performed by PSMILe. Thus MPI-OM profits from the full PSMILe flexibility concerning exchange frequencies and it is possible to define different exchange frequencies for each of the exchange fields. However, this needs adaptations of the namcouple file (compare section 3.2.1). The namcouple base file is located in directory util/running/adjunct_files/oasis3/cplmod. The namcouple variable #Dto2a is replaced by the exchange interval at runtime. To define different exchange frequencies for each of the exchange fields, the variable must be replaced by the exchange time steps (in seconds) of the respective exchange field.
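The MPI-OM style of coupling can be sketched as follows; this is a schematic under the assumption that the standard PSMILe put interface (Valcke et al. (2004)) is used, and all names are illustrative:

      SUBROUTINE ocean_put_sketch(id_sst, nsteps, model_dt, sst)
        ! Sketch: prism_put_proto is called every model time step; PSMILe
        ! decides from the namcouple (#Dto2a, time transformation) whether the
        ! call accumulates, averages and/or actually sends the field.
        USE mod_prism_put_proto
        IMPLICIT NONE
        INTEGER, INTENT(in) :: id_sst, nsteps, model_dt
        REAL,    INTENT(in) :: sst(:,:)
        INTEGER :: jstep, kdate, info
        DO jstep = 1, nsteps
           kdate = (jstep - 1) * model_dt   ! seconds since the start of the run
           ! ... one ocean model time step would be computed here ...
           CALL prism_put_proto(id_sst, kdate, sst, info)
        END DO
      END SUBROUTINE ocean_put_sketch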

Detailed information on the PSMILe library and on the namcouple is provided in Valcke et al. (2004).

6.2.2 Fields going from the atmosphere to the ocean

In ECHAM5 the PSMILe routines to send and to receive the exchange fields are called at the coupling time steps only. The accumulation and normalisation of the exchange fields happens in the model's own routines (compare section 4.3.2). Changing the coupling interval needs more adaptation than just changing the variable dta2o in the runscript.

The specification of different exchange frequencies for the different ECHAM5 exchange fields is not supported at all.

6.3 Changing the interpolation methods

The interpolation method is defined in the namcouple base file (compare section 3.2.1). To change the interpolation method, it needs to be adapted accordingly. For detailed information on the interpolation methods please read Valcke et al. (2004).

As the MPI-OM model grid is a stretched and rotated logically rectangular grid, not all of the OASIS3 interpolation methods can be used. The SCRIP library provides several interpolation methods for all logically rectangular, reduced or unstructured grids. The MOZAIC library is suitable for all these grids too, but in contrast to SCRIP, the interpolation matrices need to be provided by the user.

6.4 Changing the standard output extent for exchange control

The extent of standard output written by OASIS and PSMILe is defined by the setup parameter nlogprt. The default value is 2 (very much standard output). This choice is suitable in the development phase. For production runs the standard output extent should be reduced (nlogprt=0).

6.5 Changing the model resolution

The model resolution is defined in the setup file with the variables res_atm and res_oce for the atmosphere and ocean model grid resolution, respectively. Possible choices are listed in the setup.

In contrast to ECHAM5, MPI-OM needs to be recompiled when the model resolution is changed. In the “User Specifications” part of the compile script the variable model_horizontal_grid needs to be adapted. Do not forget to produce the two MPI-OM executables needed, one to start from initial conditions, the other for the following runs.

Some of the input files used for ECHAM5 depend on the ocean model grid. Please contact us for instructions on how to generate those files, in case they are not available.


6.6 Adding exchange fields

Adding exchange fields needs programming effort. Affected are the sending and the receiving model as well as the namcouple base file. The list of to-dos is given below; a minimal sketch of the PSMILe calls of items 2 and 4 follows the list.

1. Programming of model code to provide the fields to send in the sending model. It might be useful to wrap the new code with cpp flags.

2. Implementation of the PSMILe calls to send the extra fields in the sending model

3. Compilation of the sending model. Adaptation of the compile script if necessary (e.g. new cpp flags).

4. Implementation of the PSMILe calls to receive the new fields in the receiving model

5. Adaptation of the model physics to the new fields in the receiving model. It might be useful to wrap the new code with cpp flags.

6. Compilation of the receiving model. Adaptation of the compile script if necessary.

7. Adding sections for the new fields in the namcouple base file.

8. Adaptation of variable NFIELDS in the namcouple base file. It defines the total number of fields being exchanged.

9. Adaptation of the runscript, if necessary (e.g. new input or output files, increase of computing time, specification of the new namcouple version).

10. Performance of test runs with the coupled model until the results are satisfactory.
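The PSMILe calls of items 2 and 4 are sketched below for one additional field; the field name 'NEWFLD01' and all variable names are illustrative, and the new field must also get its own section in the namcouple (items 7 and 8):

      ! Sending model: declare the field during the definition phase ...
      CALL prism_def_var_proto(id_new, 'NEWFLD01', part_id, var_nodims, &
                               PRISM_Out, var_shape, PRISM_Real, ierror)
      ! ... and hand it to the coupler at the coupling dates:
      CALL prism_put_proto(id_new, kdate, new_field, info)

      ! Receiving model: declare the same field as incoming ...
      CALL prism_def_var_proto(id_new, 'NEWFLD01', part_id, var_nodims, &
                               PRISM_In, var_shape, PRISM_Real, ierror)
      ! ... and receive it from the coupler:
      CALL prism_get_proto(id_new, kdate, new_field, info)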


Chapter 7

The PRISM CVS Repository

The PRISM repository is installed on a CVS server. Presently the server runs on the host bedano of the Swiss National Supercomputing Centre (CSCS) in Zurich. A mirror repository was installed by Model and Data (M&D) in Hamburg. After the PRISM project, the mirror is planned to become the actual repository, as support of the Zurich repository cannot be assured.

The CVS repository on bedano is accessible via a web interface and by direct connection from selected PRISM platforms (pserver method). Both methods are briefly described in the following paragraphs. More information is available from the WP-3i web page http://prism.dkrz.de/Workpackages/WP3i/.

The repository is password protected. Some of the PRISM software is freely available for scientific use. This includes the coupler OASIS3, the three TOYCLIM models and the standard environments for compilation and running. Within the project phase all models were available for the PRISM members. Now the distribution policies of the individual institutes apply. Please consult the PRISM web page for further information on that topic.

7.1 Download via Direct Connection (pserver)

The pserver method establishes a direct connection between client and server. This method is available from selected PRISM platforms only. Please contact us for information on these machines.

1. Log in on a machine where direct access to bedano is possible (see above).

2. Define the environment variable CVSROOT.

CVSROOT=:pserver:[email protected]:/users/cvs

or as an alternative

setenv CVSROOT :pserver:[email protected]:/users/cvs

3. Log in on bedano. Please contact us for the password.

cvs login

4. Download the source code. The CVS modules that have been defined for the MPI-Met coupled models are listed in table 7.1. To receive the latest version of a module type

cvs checkout MODULE

To download a specific version of a module a tag can be specified with the checkout command. Tag “prism_2-4” defines the state of the PRISM system at the end of the project.

cvs checkout -r tag MODULE

The checkout command leads to a download of all files integrated in the CVS module MODULE.


Source Code, Utilities and Data    Source Code and Utilities    Data
MPI-OM                             MPI-OMSRC                    MPI-OMDATA
MPI-OB                             MPI-OBSRC                    MPI-OBDATA
MPI-AO                             MPI-AOSRC                    MPI-AODATA
ECHO                               ECHOSRC                      ECHODATA
MPI-AOB                            MPI-AOBSRC                   MPI-AOBDATA
MPI-AOB-HAM                        MPI-AOB-HAMSRC               MPI-AOB-HAMDATA

Table 7.1: List of the CVS modules defined for the coupled models described in this report. The modules MPI-AO and ECHO are identical.

Besides, CVS administrative directories are created to allow for further interaction with the repository. If these administrative directories are not wanted, the module can be downloaded using the export command.

cvs export -r tag MODULE

5. Finally, terminate the CVS session by typing

cvs logout

7.2 Download via the Web

1. The CVS server is accessible through the web:

http://prism-cvs.cscs.ch/cgi-bin/cvsweb.cgi

The site is password protected. Log in as “guest”. Please contact us to get a password.

2. Select the sources you would like to look at or to download by mouse clicks.

The usage of CVS modules is not supported by the web interface and the sources have to be downloaded file by file. Besides, the web connection to bedano is quite slow. For that reason it is strongly recommended to use the pserver method when downloading more than one single file.

7.3 The CVS modules

To facilitate the download, three CVS modules have been defined for each PRISM coupled model. The first module, which is labelled with the coupled model name in capital letters, contains the source code of the model components including the coupler, the source code of all libraries needed, a complete set of tools for model compilation and execution as well as a tar file containing the initial data. To retrieve source code and tools only, a module with the appendix SRC was defined. Analogously, a module with the appendix DATA contains only the initial data tar file. Table 7.1 lists the CVS modules for the coupled models described in this report.


Chapter 8

Interfacing with the GUI

The MPI-Met coupled models described in this handbook have been interfaced to a graphical user interface (GUI). The PRISM GUI for specifying a model setup and SMS were running on a DKRZ server within the project phase. It was possible to compile and run the models on the DKRZ HLRE computing facilities through the GUI and to monitor running experiments. More information on this topic is available in Constanza et al. (2004).


Bibliography

Arakawa, A. and V. R. Lamb, 1977: Computational design of the basic dynamical processes of the UCLA general circulation model. Meth. Comput. Phys., 17, 173-265.

Beckmann, A. and R. Döscher, 1997: A method for improved representation of dense water spreading over topography in geopotential-coordinate models. J. Phys. Oceanogr., 27.

Constanza, P., C. Larsson, C. Linstead, X. Le Pasteur, and N. Wedi, 2004: The PRISM Graphical User Interface (GUI) and Web Services Guide. PRISM Report Series, No. 6, nn pp. (http://prism.enes.org/Results/Documents/PRISMReport/Report06.pdf).

Dümenil, L. and E. Todini, 1992: A rainfall-runoff scheme for use in the Hamburg climate model. In: Advances in Theoretical Hydrology, A Tribute to James Dooge, edited by Kane, J. O., European Geophysical Society Series on Hydrological Science, pp. 129-157, Elsevier, Amsterdam.

Fortuin, J. P. F. and H. Kelder, 1998: An ozone climatology based on ozonesonde and satellite measurements. J. Geophys. Res., 103, 31709-31734.

Gayler, V. and S. Legutke, 2004: The PRISM Standard Running Environment Guide. PRISM Report Series, No. 4, 40 pp. (http://prism.enes.org/Results/Documents/PRISMReports/Report04.pdf).

Gent, P. R., J. Willebrand, T. J. McDougall, and J. C. McWilliams, 1995: Parameterizing eddy-induced tracer transports in ocean circulation models. J. Phys. Oceanogr., 25.

Groetzner, A., R. Sausen, and M. Clausen, 1996: The impact of sub-grid scale sea-ice inhomogeneities on the performance of the atmospheric general circulation model ECHAM3. Climate Dynamics, 12, 477-496.

Hagemann, S. and L. Dümenil, 1996: Development of a parameterization of lateral discharge for the global scale. Report 219, Max-Planck-Institut für Meteorologie, Hamburg.

Hagemann, S., 2002: An improved land surface parameter data set for global and regional climate models. Report 336, Max-Planck-Institut für Meteorologie, Hamburg.

Hibler III, W. D., 1979: A dynamic thermodynamic sea ice model. J. Phys. Oceanogr., 9, 815-846.

Kriest, I., ..., 2004: HAMOCC - The Global Model of Marine Bio-Geo-Chemistry of the Max Planck Institut in Hamburg. Modelle und Daten, Technical Report, No. 3.

Legutke, S. and V. Gayler, 2004: The PRISM Standard Compilation Environment Guide. PRISM Report Series, No. 3, 66 pp. (http://prism.enes.org/Results/Documents/PRISMReports/Report05.pdf).

Levitus, S., Boyer, T. P., Conkright, M. E., O'Brien, T., Antonov, J., Stephens, C., Stathoplos, L., Johnson, D., and Gelfeld, R., 1998: World Ocean Data Base 1998. In: Introduction???. NOAA Atlas NESDIS 18, Ocean Climate Laboratory, National Oceanographic Data Center, vol. 1. US Government Printing Office, Washington, DC.

Maier-Reimer et al., 2005: The marine biochemistry in the earth system. Journal of Climate, to be submitted.

Mangili, A., M. Ballabio, M. Djordje, L. Kornblueh, R. Vogelsang, and P. Bourcier, 2003: PRISM Software Engineering, Coding Rule, Quality Standards, 31 pp. (http://prism.dkrz.de/Workpackages/WP3i/WP2b_coding_rules.ps).


Marsland, S. J., H. Haak, J. H. Jungclaus, M. Latif and F. Röske, 2003: The Max-Planck-Institute global ocean/sea ice model with orthogonal curvilinear coordinates. Ocean Modelling, 5, 91-127.

Roesch, A., M. Wild, H. Gilgen, and A. Ohmura, 2001: A new snow cover fraction parameterization for the ECHAM4 GCM. Clim. Dyn., 17, 933-946.

Roeckner, E., G. Baeuml, L. Bonaventura, R. Brokopf, M. Esch, M. Giorgetta, S. Hagemann, I. Kirchner, L. Kornblueh, E. Manzini, A. Rhodin, U. Schlese, U. Schulzweida, and A. Tompkins, 2003: The atmospheric general circulation model ECHAM5, Part I. Max Planck Institute for Meteorology, Report No. 349.

Roeckner, E., et al., 2004: The atmospheric general circulation model ECHAM5, Part II: Simulated climatology and comparison with observations. Max Planck Institute for Meteorology, Report No. 3??.

Röske, F., 2001: An atlas of surface fluxes based on the ECMWF Re-Analysis - a climatological dataset to force global ocean general circulation models. Report 323, Max-Planck-Institut für Meteorologie, Hamburg, Germany.

Schulz, J.-P., L. Dümenil, and J. Polcher, 2001: On the land surface - atmosphere coupling and its impact in a single-column atmospheric model. J. Appl. Meteorol., 40, 642-663.

Timmreck, C. and M. Schulz, 2004: Significant dust simulation differences in nudged and climatological operation mode of the AGCM ECHAM. Journal of Geophysical Research - Atmospheres, in press.

Valcke, S., A. Caubel, R. Vogelsang, and D. Declat, 2004: OASIS3 User's Guide (oasis3_prism_2-3). PRISM Project Report ?, 70 pp., CERFACS TR/CMGC/04/68. (http://prism.enes.org/Results/Documents/PRISMReports/Report03.pdf).