A Community Terrain-following Ocean Modeling System


A Community Terrain-following Ocean Modeling System

2003 Terrain-Following Ocean Models Users Workshop
PMEL, Seattle, WA, August 5, 2003

Developers and Collaborators

Hernan G. Arango (Rutgers University)
Alexander F. Shchepetkin (UCLA)
W. Paul Budgell (IMR, Norway)
Bruce D. Cornuelle (SIO)
Emanuele Di Lorenzo (SIO)
Tal Ezer (Princeton University)
Mark Hadfield (NIWA, New Zealand)
Kate Hedstrom (University of Alaska, ARSC)
Robert Hetland (TAMU)
John Klinck (Old Dominion)
Arthur J. Miller (SIO)
Andrew M. Moore (University of Colorado)
Christopher Sherwood (USGS/WHOI)
Rich Signell (SACLANT)
John C. Warner (USGS/WHOI)
John Wilkin (Rutgers University)

Executive Committee

Dale B. Haidvogel (Rutgers University)
James C. McWilliams (UCLA)
Robert Street (Stanford University)

ONR Support

Manuel Fiadeiro

Terri Paluszkiewicz

Charles Linwood Vincent

Objectives

• To design, develop, and test an expert ocean modeling system for scientific and operational applications over a wide range of scales, from coastal to global

• To provide a platform for coupling with operational atmospheric models, sediment models, and ecosystem models

• To support multiple levels of nesting and composed grids

• To provide tangent linear and adjoint models for variational data assimilation, ensemble forecasting, and stability analysis

• To provide a framework for massively parallel computations

Approach

• Use state-of-the-art advances in numerical techniques, subgrid-scale parameterizations, data assimilation, nesting, computational performance, and parallelization

• Modular design, with ROMS as a prototype

• Test and evaluate the computational kernel and various algorithms and parameterizations

• Build a suite of test cases and application databases

• Provide web-based support to the user community and a link to the primary developers

Accomplishments

• ROMS/TOMS 2.0 released to beta testers on January 16, 2003, and to the full user community on June 30, 2003.

• Built tangent linear and adjoint models and tested them on realistic applications off the US West and East Coasts: eigenmodes and adjoint eigenmodes, singular vectors, pseudospectra, forcing singular vectors, stochastic optimals, and ensemble forecasting.

(Relief image from NOAA; animation by Rutgers)

The model is used in oceanographic studies in over 30 countries by:

• Universities

• Government Agencies

• Research Organizations

Ocean Modeling Web Site

http://www.ocean-modeling.org/

Kernel Attributes

• Free-surface, hydrostatic, primitive-equation model

• Generalized, terrain-following vertical coordinates (an example form follows this list)

• Boundary-fitted, orthogonal, curvilinear horizontal coordinates on an Arakawa C-grid

• Non-homogeneous predictor/corrector time-stepping algorithm

• Accurate discretization of the baroclinic pressure-gradient term

• High-order advection schemes

• Continuous, monotonic reconstruction of vertical gradients to maintain high-order accuracy
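For illustration, one widely used instance of the generalized vertical coordinate named above is the s-coordinate of Song and Haidvogel (1994); the form below is quoted from the literature rather than from the ROMS 2.0 source, and any given application may use a different stretching:

\[
z = \zeta\,(1+s) + h_c\,s + (h - h_c)\,C(s), \qquad -1 \le s \le 0,
\]
\[
C(s) = (1-b)\,\frac{\sinh(\theta s)}{\sinh\theta}
     + b\,\frac{\tanh\!\left(\theta\,(s+\tfrac{1}{2})\right)-\tanh\!\left(\tfrac{\theta}{2}\right)}{2\,\tanh\!\left(\tfrac{\theta}{2}\right)},
\]

where ζ is the free surface, h the local depth, h_c a critical depth, and θ and b control the stretching near the surface and bottom.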

(Figure: vertical terrain-following coordinates along a section from Dubrovnik, Croatia, to Vieste, Italy; longitude vs. depth in m)

(Figure: curvilinear transformation, with Cartesian, spherical, and polar grids as examples)
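In these orthogonal curvilinear horizontal coordinates (ξ, η), the grid is described by metric factors m(ξ, η) and n(ξ, η) that relate computational to physical arc length; the standard relation (a textbook statement, given here as background rather than taken from the code) is

\[
(ds)_\xi = \frac{d\xi}{m}, \qquad (ds)_\eta = \frac{d\eta}{n},
\]

so Cartesian, spherical, and polar grids are all special cases of the same transformed equations.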

Code Design

• Modular, efficient, and portable Fortran 90/95 code, with dynamic memory allocation via dereferenced pointer structures

• C-preprocessor management of model options

• Multiple levels of nesting and composed grids

• Lateral boundary condition options: closed, periodic, and radiation

• Arbitrary number of tracers (active and passive)

• NetCDF input and output data structures

• Support for parallel execution on both shared- and distributed-memory architectures

(Figure: model grid configurations, nested and composed)

Parallel Framework

• Coarse-grained parallelization (see the tile-bound sketch after this list)

• Shared-memory, compiler-dependent directives (OpenMP 2.0 standard)

• Distributed-memory (MPI)

• Optimized for cache-bound computers

• ZIG-ZAG cycling sequence of tile partitions

• Few synchronization points

• Serial and parallel I/O (via NetCDF)

• Efficient from 4 to 64 threads

(Figure: parallel tile partitions, 8 x 8; the horizontal domain is split into Nx by Ny tiles)
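As a minimal sketch of the coarse-grained tiling idea, the subroutine below computes the rectangular block of the Lm x Mm interior grid points owned by one tile; the names (tile_bounds, NtileI, NtileJ, ChunkI, ChunkJ) and the exact chunking rule are illustrative assumptions, not the ROMS internals:

! Illustrative sketch, not ROMS code: each thread or MPI process
! works on one tile at a time using only its own bounds, so
! synchronization points are rare.
      SUBROUTINE tile_bounds (tile, NtileI, NtileJ, Lm, Mm,             &
     &                        Istr, Iend, Jstr, Jend)
        implicit none
        integer, intent(in)  :: tile, NtileI, NtileJ, Lm, Mm
        integer, intent(out) :: Istr, Iend, Jstr, Jend
        integer :: itile, jtile, ChunkI, ChunkJ
        itile=MOD(tile,NtileI)              ! tile column, 0..NtileI-1
        jtile=tile/NtileI                   ! tile row,    0..NtileJ-1
        ChunkI=(Lm+NtileI-1)/NtileI         ! I-points per tile
        ChunkJ=(Mm+NtileJ-1)/NtileJ         ! J-points per tile
        Istr=1+itile*ChunkI
        Iend=MIN(Lm,Istr+ChunkI-1)
        Jstr=1+jtile*ChunkJ
        Jend=MIN(Mm,Jstr+ChunkJ-1)
        RETURN
      END SUBROUTINE tile_bounds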

(Ezer: timing results on a shared-memory SGI machine; the cost of saving output and of global averaging is much higher for the MPI code)

Subgrid-Scale Parameterizations

• Horizontal mixing of tracers along level, geopotential, or isopycnic surfaces

• Transverse, isotropic stress tensor for momentum

• Local, Mellor-Yamada level-2.5 closure scheme

• Non-local, K-profile surface and bottom closure scheme

• General Length-Scale turbulence closure (GOTM; see the note after this list)
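For background on the last bullet, the generic length-scale family (Umlauf and Burchard, 2003; implemented for ROMS by Warner et al., see the publication list below) transports a single generic turbulence variable; as commonly written in the literature (stated here as a reminder, not quoted from the model),

\[
\psi = \left(c_\mu^{0}\right)^{p} k^{m} \ell^{n},
\]

where k is the turbulent kinetic energy, ℓ a turbulent length scale, and the exponents (p, m, n) select a particular closure such as k-ε, k-ω, or Mellor-Yamada 2.5.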

Boundary Layers

• Air-sea interaction boundary layer from COARE (Fairall et al., 1996)

• Oceanic surface boundary layer (KPP; Large et al., 1994)

• Oceanic bottom boundary layer (inverted KPP; Durski et al., 2001)

• Wave/current/sediment bed boundary layer (Styles and Glenn, 2000; Blaas; Sherwood)

Modules

• Lagrangian drifters (Klinck, Hadfield, Capet)

• Tidal forcing (Hetland, Signell)

• River runoff (Hetland, Signell, Geyer)

• Sea ice (Budgell, Hedstrom)

• Fasham-type biology model (Moisan, Di Lorenzo, Shchepetkin, Frenzel, Fennel, Wilkin)

• EcoSim bio-optical model (Bissett, Wilkin)

• Sediment erosion, transport, and deposition (Warner, Sherwood, Blaas)

Ongoing and Future Work

• One- and two-way nesting
• Wetting and drying capabilities
• Sediment model
• Bottom boundary layer models
• Ice model
• Parallelization of the adjoint model
• Variational data assimilation
• Parallel I/O
• Framework (ESMF)
• Web-based dynamic documentation
• Test cases
• WRF coupling

One-Way Nesting

North Atlantic Basin

• 1/10-degree resolution (1002 x 1026 x 30)
• Levitus climatology
• NCEP daily winds: 1994-2000
• COADS monthly heat fluxes
• Requirements:
  • Memory: 11 GB
  • Input data disk space: 16 GB
  • Output data disk space: 280 GB
  • 32 processors, Origin 3800, 4 x 16
  • CPU: 46 hours per day of simulation
  • Wall clock: 153 days for the 7-year simulation (46 CPU-hours per simulated day x roughly 2,557 days, spread over 32 processors, is about 3,676 wall-clock hours, i.e., 153 days)

(Figure: North Atlantic bathymetry (m) at 1/10-degree resolution, derived from ETOPO5; r-factor = 3.2)
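The quoted r-factor is the standard terrain-slope parameter, r = |h2 - h1| / (h2 + h1) between adjacent cells. A minimal sketch of computing its maximum over a grid follows; the function name and interface are illustrative assumptions, not a ROMS routine:

! Illustrative sketch (not ROMS code): maximum slope parameter
! r = |h2-h1|/(h2+h1) over both grid directions of a bathymetry array.
      FUNCTION max_rfactor (h, Lm, Mm) RESULT (rmax)
        implicit none
        integer, intent(in) :: Lm, Mm
        real, intent(in) :: h(Lm,Mm)         ! water depth (m), h > 0
        real :: rmax
        integer :: i, j
        rmax=0.0
        DO j=1,Mm                            ! slopes in the I-direction
          DO i=1,Lm-1
            rmax=MAX(rmax, ABS(h(i+1,j)-h(i,j))/(h(i+1,j)+h(i,j)))
          END DO
        END DO
        DO j=1,Mm-1                          ! slopes in the J-direction
          DO i=1,Lm
            rmax=MAX(rmax, ABS(h(i,j+1)-h(i,j))/(h(i,j+1)+h(i,j)))
          END DO
        END DO
      END FUNCTION max_rfactor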

(Figures: free surface (m) and temperature at 100 m from the North Atlantic simulation)

US East Coast

• 30-km resolution (192 x 64 x 30)
• Initialized from the North Atlantic Basin simulation
• NCEP daily winds
• COADS monthly heat fluxes with an imposed daily shortwave-radiation cycle
• One-way nesting:
  • Boundary conditions from 3-day averages
  • Flather/Chapman OBC for 2D momentum (see the note after this list)
  • Clamped OBC for 3D momentum and tracers
• Rivers
• Fasham-type biology model
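For reference, the Flather open boundary condition radiates barotropic signals by tying the boundary-normal depth-averaged velocity to the external data and the local surface-elevation mismatch, while Chapman's companion condition lets ζ leave at the shallow-water speed. Schematically (a standard literature statement, with the sign depending on the boundary orientation, not a quote from the ROMS source):

\[
\bar{u} = \bar{u}^{\,ext} \pm \sqrt{\frac{g}{D}}\,\left(\zeta - \zeta^{\,ext}\right),
\]

where D is the local water depth and the external values here would come from the 3-day-averaged basin solution.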

(Figure: one-way coupling, temperature at 100 m; Wilkin)

(Figure: potential temperature at 50 m, in Celsius, at 30-km resolution; Wilkin)

(Figures: surface temperature and surface chlorophyll)

Publications

Ezer, T., H.G. Arango, and A.F. Shchepetkin, 2002: Developments in Terrain-Following Ocean Models: Intercomparisons of Numerical Aspects, Ocean Modelling, 4, 249-267.

Haidvogel, D.B., H.G. Arango, K. Hedstrom, A. Beckmann, P. Malanotte-Rizzoli, and A.F. Shchepetkin, 2000: Model Evaluation Experiments in the North Atlantic: Simulations in Nonlinear Terrain-Following Coordinates, Dyn. Atmos. Oceans, 32, 239-281.

MacCready, P. and W.R. Geyer, 2001: Estuarine Salt Flux through an Isohaline Surface, J. Geophys. Res., 106, 11629-11639.

Malanotte-Rizzoli, P., K. Hedstrom, H.G. Arango, and D.B. Haidvogel, 2000: Water Mass Pathways Between the Subtropical and Tropical Ocean in a Climatological Simulation of the North Atlantic Ocean Circulation, Dyn. Atmos. Oceans, 32, 331-371.

Marchesiello, P., J.C. McWilliams, and A.F. Shchepetkin, 2003: Equilibrium Structure and Dynamics of the California Current System, J. Phys. Oceanogr., 34, 1-37.

Marchesiello, P., J.C. McWilliams, and A.F. Shchepetkin, 2001: Open Boundary Conditions for Long-Term Integration of Regional Ocean Models, Ocean Modelling, 3, 1-20.

Moore, A.M., H.G. Arango, A.J. Miller, B.D. Cornuelle, E. Di Lorenzo, and D.J. Neilson, 2003: A Comprehensive Ocean Prediction and Analysis System Based on the Tangent Linear and Adjoint Components of a Regional Ocean Model, Ocean Modelling, submitted.

Penven, P., C. Roy, A. Colin de Verdiere, and J. Largier, 2000: Simulation and Quantification of a Coastal Jet Retention Process Using a Barotropic Model, Oceanol. Acta, 23, 615-634.

Penven, P., J.R.E. Lutjeharms, P. Marchesiello, C. Roy, and S.J. Weeks, 2001: Generation of Cyclonic Eddies by the Agulhas Current in the Lee of the Agulhas Bank, Geophys. Res. Lett., 27, 1055-1058.

Shchepetkin, A.F. and J.C. McWilliams, 2003: The Regional Ocean Modeling System: A Split-Explicit, Free-Surface, Topography-Following-Coordinates Ocean Model, J. Comp. Phys., submitted.

Shchepetkin, A.F. and J.C. McWilliams, 2003: A Method for Computing Horizontal Pressure-Gradient Force in an Oceanic Model with a Non-Aligned Vertical Coordinate, J. Geophys. Res., 108, 1-34.

She, J. and J.M. Klinck, 2000: Flow Near Submarine Canyons Driven by Constant Winds, J. Geophys. Res., 105, 28671-28694.

Warner, J.C., H.G. Arango, C. Sherwood, B. Butman, and R.P. Signell, 2003: Performance of Four Turbulence Closure Methods Implemented Using a Generic Length Scale Method, Ocean Modelling, revised and resubmitted.

Code Design: Modular Design

The excerpts below, from the mod_ocean and omega modules, illustrate the pointer-based dynamic memory allocation and tiled computation described above.

#include "cppdefs.h“

MODULE mod_ocean USE mod_kinds

implicit none

TYPE T_OCEAN real(r8), pointer :: rubar(:,:,:) real(r8), pointer :: rvbar(:,:,:) real(r8), pointer :: rzeta(:,:,:) real(r8), pointer :: ubar(:,:,:) real(r8), pointer :: vbar(:,:,:) real(r8), pointer :: zeta(:,:,:)#ifdef SOLVE3D real(r8), pointer :: pden(:,:,:) real(r8), pointer :: rho(:,:,:) real(r8), pointer :: ru(:,:,:,:) real(r8), pointer :: rv(:,:,:,:) real(r8), pointer :: t(:,:,:,:,:) real(r8), pointer :: u(:,:,:,:) real(r8), pointer :: v(:,:,:,:) real(r8), pointer :: W(:,:,:) real(r8), pointer :: wvel(:,:,:)# ifdef SEDIMENT real(r8), pointer :: bed(:,:,:,:) real(r8), pointer :: bed_frac(:,:,:,:) real(r8), pointer :: bottom(:,:,:)# endif#endif

END TYPE T_OCEAN

TYPE (T_OCEAN), allocatable :: ALL_OCEAN(:)

END MODULE mod_ocean

CONTAINS

SUBROUTINE allocate_ocean (ng, LBi, UBi, LBj, UBj)

USE mod_param#ifdef SEDIMENT USE mod_sediment#endif

integer, intent(in) :: ng, LBi, UBi, LBj, UBj

IF (ng.eq.1) allocate ( OCEAN(Ngrids) )

allocate ( OCEAN(ng) % rubar(LBi:UBi,LBj:UBj,2) ) allocate ( OCEAN(ng) % rvbar(LBi:UBi,LBj:UBj,2) ) allocate ( OCEAN(ng) % rzeta(LBi:UBi,LBj:UBj,2) ) allocate ( OCEAN(ng) % ubar(LBi:UBi,LBj:UBj,3) ) allocate ( OCEAN(ng) % vbar(LBi:UBi,LBj:UBj,3) ) allocate ( OCEAN(ng) % zeta(LBi:UBi,LBj:UBj,3) )

#ifdef SOLVE3D allocate ( OCEAN(ng) % pden(LBi:UBi,LBj:UBj,N(ng)) ) allocate ( OCEAN(ng) % rho(LBi:UBi,LBj:UBj,N(ng)) ) allocate ( OCEAN(ng) % ru(LBi:UBi,LBj:UBj,0:N(ng),2) ) allocate ( OCEAN(ng) % rv(LBi:UBi,LBj:UBj,0:N(ng),2) ) allocate ( OCEAN(ng) % t(LBi:UBi,LBj:UBj,N(ng),3,NT(ng)) ) allocate ( OCEAN(ng) % u(LBi:UBi,LBj:UBj,N(ng),2) ) allocate ( OCEAN(ng) % v(LBi:UBi,LBj:UBj,N(ng),2) ) allocate ( OCEAN(ng) % W(LBi:UBi,LBj:UBj,0:N(ng)) )

# ifdef SEDIMENT allocate ( OCEAN(ng) % bed(LBi:UBi,LBj:UBj,Nbed,MBEDP) ) allocate ( OCEAN(ng) % bed_frac(LBi:UBi,LBj:UBj,Nbed,NST) ) allocate ( OCEAN(ng) % bottom(LBi:UBi,LBj:UBj,MBOTP) )# endif#endif

RETURN END SUBROUTINE allocate_ocean

SUBROUTINE initialize_ocean (ng, tile)

USE mod_param#ifdef SEDIMENT USE mod_sediment#endif

integer, intent(in) :: ng, tile integer :: IstrR, IendR, JstrR, JendR, IstrU, JstrV

real(r8), parameter :: IniVal = 0.0_r8

#include "tile.h"#ifdef DISTRIBUTE IstrR=LBi IendR=UBi JstrR=LBj JendR=UBj#else# include "set_bounds.h"#endif

OCEAN(ng) % rubar(IstrR:IendR,JstrR:JendR,1:2) = IniVal OCEAN(ng) % rvbar(IstrR:IendR,JstrR:JendR,1:2) = IniVal OCEAN(ng) % rzeta(IstrR:IendR,JstrR:JendR,1:2) = IniVal

OCEAN(ng) % ubar(IstrR:IendR,JstrR:JendR,1:3) = IniVal OCEAN(ng) % vbar(IstrR:IendR,JstrR:JendR,1:3) = IniVal OCEAN(ng) % zeta(IstrR:IendR,JstrR:JendR,1:3) = IniVal

...

RETURN END SUBROUTINE initialize_ocean

END MODULE mod_ocean

#include "cppdefs.h" MODULE omega_mod

implicit none

PRIVATE PUBLIC omega

CONTAINS

SUBROUTINE omega (ng, tile)

USE mod_param USE mod_grid USE mod_ocean

integer, intent(in) :: ng, tile

# include "tile.h“# ifdef PROFILE CALL wclock_on (ng, 13)# endif CALL omega_tile (ng, Istr, Iend, Jstr, Jend, & & LBi, UBi, LBj, UBj, & & GRID(ng) % Huon, & & GRID(ng) % Hvom, & & GRID(ng) % z_w, & & OCEAN(ng) % W)# ifdef PROFILE CALL wclock_off (ng, 13)# endif

RETURN END SUBROUTINE omega

SUBROUTINE omega_tile (ng, Istr, Iend, Jstr, Jend, & & LBi, UBi, LBj, UBj, & & Huon, Hvom, z_w, W)

USE mod_param USE mod_scalars USE bc_3d_mod, ONLY : bc_w3d_tile

integer, intent(in) :: ng, Iend, Istr, Jend, Jstr integer, intent(in) :: LBi, UBi, LBj, UBj

real(r8), intent(in) :: Huon(LBi:,LBj:,:) real(r8), intent(in) :: Hvom(LBi:,LBj:,:) real(r8), intent(in) :: z_w(LBi:,LBj:,0:) real(r8), intent(out) :: W(LBi:,LBj:,0:)

integer :: IstrR, IendR, JstrR, JendR, IstrU, JstrV integer :: i, j, k real(r8), dimension(PRIVATE_1D_SCRATCH_ARRAY) :: wrk

# include "set_bounds.h" DO j=Jstr,Jend DO i=Istr,Iend W(i,j,0)=0.0_r8 END DO DO k=1,N(ng) DO i=Istr,Iend W(i,j,k)=W(i,j,k-1)- & & (Huon(i+1,j,k)-Huon(i,j,k)+ & & Hvom(i,j+1,k)-Hvom(i,j,k)) END DO END DO

...

END DO

RETURN END SUBROUTINE omega_tile
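In effect, the inner loops above are a discrete vertical integration of the continuity equation: starting from W = 0 at the sea floor,

\[
W_{i,j,k} = W_{i,j,k-1} - \left[\left(\mathrm{Huon}_{i+1,j,k}-\mathrm{Huon}_{i,j,k}\right) + \left(\mathrm{Hvom}_{i,j+1,k}-\mathrm{Hvom}_{i,j,k}\right)\right],
\]

which is a restatement of the code itself; the elided remainder of the routine and the imported bc_w3d_tile presumably complete the diagnostic and apply the lateral boundary updates.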
