Advanced computing for fusion simulation and modelling - Pär Strand


Page 1:

EUFORIA FP7-INFRASTRUCTURES-2007-1, Grant 211804

Advanced computing for fusion simulation and modelling

Pär Strand

Chalmers University of Technology

EU Fusion for ITER Applications - EUFORIA

5th EGEE USER FORUM

2010-04-14 Uppsala

Page 2:

Overview

- Brief background: fusion
- Modelling challenges and use cases
- EUFORIA approach to supporting modellers
- Summary

Page 3:

Fusion: the energy source for the sun and other stars

- Provides a potential source of base-load energy production
- Researchers have been working on it for more than 50 years
- It has turned out to be a very difficult problem

"Every time you look up at the sky, every one of those points of light is a reminder that fusion power is extractable from hydrogen and other light elements, and it is an everyday reality throughout the Milky Way Galaxy."

--- Carl Sagan, Spitzer Lecture, October 1991

Page 4:

Fusion: two main lines of research

Inertial confinement:
- Implosion of small pellets
- NIF at LLNL

Magnetic confinement, with two main types of configurations studied:
- Stellarator: W7-X, currently under construction in Greifswald, Germany
- Tokamak: ITER, under construction in Cadarache, France

Page 5:

ITER

Involves 7 partners representing more than 50% of the world's population

Costs > 10 G$

Under construction in Cadarache, France

Key element on the path to fusion energy production

Page 6:

ITER

Costs > 10 G$

Budget is for Construction only!

Developments for the work programme and securing the exploitation of the device are in the hands of each of the partners.

Page 7:

ITER parameters

Plasma major radius:    6.2 m
Plasma minor radius:    2.0 m
Plasma volume:          840 m³
Plasma current:         15.0 MA
Toroidal field on axis: 5.3 T
Fusion power:           500 MW
Burn flat top:          >400 s
Power amplification:    >10
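A quick consistency check using the standard definition of fusion gain (this arithmetic is not on the slide):

    Q = \frac{P_{\text{fusion}}}{P_{\text{heating}}} > 10
    \quad\Rightarrow\quad
    P_{\text{heating}} < \frac{500~\text{MW}}{10} = 50~\text{MW}

That is, ITER is designed to produce at least ten times more fusion power than the externally applied heating power.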

Page 8: (repeats the ITER parameter table from Page 7)

Page 9:

ITER has developed from a long sequence of experiments BUT is the first tokamak where modelling is set to play an important role in both design and exploitation.

This is even more true for the next stage - a first DEMO reactor.

Page 10:

The "fast track" to fusion energy hinges on several parallel developments:

"Numerical Tokamak": a comprehensive simulation package with predictive capabilities.

Page 11:

Relation to Experiments

[Diagram: experiment and modelling-platform sides connected through a user gateway. The experiment side holds "in house" operational core tools, "semi-open" physics analysis and exploration tools, and a local data layer. The device-independent modelling platform holds modules, applications and workflows running on grid, HPC, local clusters and local data. Data exchanges go through the Universal Access Layer (exp2ITM) and require machine descriptions and data mappings; modules and applications are exchanged between the two sides.]

Page 12:

Modelling Challenges

Fusion codes come with a variety of different computational requirements:

- Local calculations (serial or small clusters)
- Capacity computing: MC, ray tracing, parameter scans
- Intermediate-level grid-parallel applications (~128 cores)
- Capability computing (pushing the limits of existing HPC)

There is no single "best" computing paradigm: a single application may require different architectures or infrastructures depending on the particular use!

Plasmas are complex, non-linear and dynamic, and their responses may force higher-fidelity modelling: on-the-fly changes of application and of computing infrastructure requirements.

The physics is coupled: it is rare that a single physics code suffices for deep understanding, so coupled applications are a necessity!

Page 13:

Simulations

1D and 2D models are common reductions; the real problem is 3D in space and 2/3D in velocity.

EUFORIA scope is "edge and core turbulence and transport".

Page 14:

Models describing the plasma vary in complexity

[Figure: fusion physics phenomena (sheath, atomic physics, ICRH and ECRH, ion and electron turbulence, fast-ion slowing down, erosion, edge and core transport, NTMs, Alfvén eigenmodes) mapped against spatial scale (10⁻⁹ to 10⁺³ m), time scale (10⁻¹² to 10⁺³ s) and model dimensionality (1D to 4-6D).]

- Extreme ranges in both time and space scales
- Varying dimensionality

Page 15:

Models describing the plasma vary in complexity - EUFORIA PHYSICS

[Figure: the same time/space/dimensionality map as on Page 14, here with the physics in EUFORIA's scope highlighted.]

Page 16:

Models describing the plasma vary in complexity

[Figures: a turbulence simulation in 3D space (can also be kinetic), and an edge transport simulation in 2D space.]

Page 17:

Developing a new paradigm for fusion computing

[Diagram: Scientific Workflow coupling GRID, HPC and Visualization.]

- Building on e-infrastructure tools, middleware and installations

- Integrating tools and physics models together with a ”fusion simulation ontology”

- At least initially building on fusion de facto standards for data access and communication

Page 18:

EUFORIA: 14 member institutes

522 person-months covering:

- Management
- Training
- Dissemination
- Grid and HPC infrastructure & support
- Code adaptation & optimization
- Workflows
- Visualization

Page 19:

Work plan outline

[Gantt chart, Jan 2008 to Dec 2010: application porting; standardization and integration; development and deployment; migrating; grid testbed deployment; grid appliance; proof-of-concept runs; mixed workflows.]

Page 20:

Grid Infrastructure

Centers with infrastructure in EUFORIA:

IFCA in Santander, Spain

KIT in Karlsruhe, Germany

Chalmers University, Sweden

Ciemat in Trujillo and Madrid, Spain

Current status: ~3100 CPUs, ~2 TB online

#CPU  Free  Total Jobs  Running  Waiting  ComputingElement
 640   137        2027      197     1830  svea-gl2.c3se.chalmers.se:2119/jobmanager-pbs-svea
 568   268           0        0        0  iwrce2.fzk.de:2119/jobmanager-lcgpbs-i2gpar
 228   139           0        0        0  ce01-tic.ciemat.es:2119/jobmanager-lcgpbs-euforia
1616  1616           0        0        0  gridce01.ifca.es:2119/jobmanager-sge-euforia
  48    48           0        0        0  ce-euforia.ceta-ciemat.es:2119/jobmanager-lcgpbs-euforia

Avail Space(Kb)  Used Space(Kb)  Type  SEs
1000             n.a             n.a   storm.ifca.es
1000             n.a             n.a   storm.ifca.es
114395396        100297660       n.a   griddpm01.ifca.es
114395091        100297964       n.a   griddpm01.ifca.es
1820000000       21058536        n.a   iwrse2.fzk.de
13488960         5528882         n.a   svea-gl3.c3se.chalmers.se

Also shared resources with the Fusion VO.

The EUFORIA grid supports parallel applications and interactive grid (i2g) extensions.

Page 21:

HECToR (EPCC, UK): Cray XT4, integrated with an X2 vector system in a single machine. XT4: 11,328 cores, 2.8 GHz Opteron processors, 33.2 TB RAM, 59 Tflops theoretical peak. X2: 112 vector processors, 2.87 Tflops theoretical peak. No. 29 in the Top 500 (54.65 Tflops LINPACK).

MareNostrum (BSC, Spain): IBM cluster. 10,240 cores, 2.3 GHz PPC 970 processors, 20 TB RAM, 94.21 Tflops peak. No. 26 in the Top 500 (63.83 Tflops LINPACK).

Louhi (CSC, Finland): Cray XT4. 4,048 cores, 4.5 TB RAM, 37.68 Tflops peak. No. 70 in the Top 500 (26.80 Tflops LINPACK).

With DEISA: ~3 Mcpu-hours in 2009.

Page 22:

Example Fusion Workflow

- One example of a workflow; many variants are possible
- Different modules have very different computational requirements: from sub-second per iteration on a single CPU to hour(s) per iteration on a large number of CPUs (see the sketch after the diagram)
- Need to visualize the results

[Workflow diagram: an ETS transport loop coupling EQUIL, ECRH, ICRH, TRANSP, NEUTRALS, SAWTEETH, ELM(c), ELM(t), CORE2EQ, SOURCE_COMBINER, TRANSPORT_COMBINER and NEO modules.]
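To make the loop structure concrete, here is a minimal Python sketch of such a coupled workflow. The module names follow the diagram above; the function bodies, signatures and the state dictionary are purely illustrative assumptions:

    # Minimal sketch of a coupled transport-workflow loop.
    # Module names follow the slide's diagram; everything else is hypothetical.

    def equil(state):
        # MHD equilibrium update: typically sub-second on a single CPU
        state["equilibrium"] = "updated"
        return state

    def combine_sources(state):
        # ECRH/ICRH/neutrals combined by a SOURCE_COMBINER-like step
        state["sources"] = "combined heating and particle sources"
        return state

    def transport_step(state):
        # ETS transport step: may take hours on a large number of CPUs,
        # hence a candidate for grid/HPC offload
        state["profiles"] = "advanced one time step"
        return state

    def run_workflow(n_steps):
        state = {"profiles": "initial"}
        for _ in range(n_steps):
            state = equil(state)
            state = combine_sources(state)
            state = transport_step(state)
        return state

    if __name__ == "__main__":
        print(run_workflow(3))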

Page 23:

Coupling codes and applications

N modules integrated in N different applications

N modules coupled into a dynamic application framework

Adapted from David De Roure

The number of coupled codes/physics modules in typical fusion applications can be large:

- The Chain1 model at JET: ~85 coupled applications in a DAG-like structure (see the sketch below)
- Detailed predictive transport modelling: 25 different modules with multiple versions and iterative dependencies

Still need to support single standalone applications!
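The DAG-like structure mentioned above maps naturally onto a topological sort. A minimal sketch with hypothetical module names (Python's standard graphlib, available from Python 3.9, stands in for whatever the real framework uses):

    # Run coupled modules in dependency (DAG) order, as in the JET Chain1
    # example. Module names and dependencies are hypothetical.
    from graphlib import TopologicalSorter

    deps = {
        "equilibrium": set(),
        "heating":     {"equilibrium"},
        "transport":   {"equilibrium", "heating"},
        "diagnostics": {"transport"},
    }

    def run(module):
        print(f"running {module}")

    for module in TopologicalSorter(deps).static_order():
        run(module)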

Page 24:

(Subset of) codes adapted for Grid use

Different code domains and different parallel strategies:

- GEM: linear & non-linear turbulence gyrofluid code (core transport). MPI.
- BIT1: divertor code (SOL transport). Parameter scan.
- EIRENE: neutral transport for tokamaks & stellarators (neutral transport). MC code.
- DAB (Distributed Asynchronous Bees): tool for the optimization of any concept in fusion devices. Asynchronous algorithm.
- Plus the previously gridified codes (taken from the EGEE code platform), suitable for workflows: ISDEP (MC transport), Mishka & Helena (equilibrium and MHD), VMEC (3D equilibrium suitable for tokamaks and stellarators).

Page 25:

Enhancing the grid usage

Application-specific support tools for enhancing grid usage:

- TAPAS (Tool for Automatic Parameter Studies) and TAPAS4GRID: allow automatic parameter scans on HPCs and grids. Used for EMC3-EIRENE and BIT1.
- S.t.a.r.t. (Simple Tool for Application Runs and Transfer of data): provides a command-line interface that hides all the complexity of job management on the grid, and also simplifies the inclusion of required libraries for submitted jobs.

These tools are also building blocks for improved inclusion of applications in workflows (see the sketch below).
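To illustrate the kind of automation these tools provide (a generic sketch; this is not TAPAS's or S.t.a.r.t.'s actual interface, and the parameter names are invented), a parameter scan reduces to generating one job per point of a parameter grid:

    # Generic parameter-scan sketch: one job per point of the grid.
    import itertools

    densities = [1e19, 5e19, 1e20]      # hypothetical scan parameters
    temperatures = [100, 200]           # eV

    for n, t in itertools.product(densities, temperatures):
        cmd = ["submit_job", f"--density={n:g}", f"--temperature={t}"]
        # 'submit_job' is a placeholder for the site-specific submission step
        print("would run:", " ".join(cmd))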

Page 26:

Generic Application Structure

Large-scale fusion applications tend to be long-lived: how do you build software that lasts 30+ years?

Page 27:

European Fusion Implementation

Modelling tools are either standalone applications or modules, i.e., subroutines. Automagic tools wrap modules into workflow components (Kepler actors).

Page 28:

KEPLER Transport Workflow

P. Huynh, V. Basiuk


Page 29:

KEPLER Transport Workflow

P. Huynh, V. Basiuk

End users only care about available resources, suitability to the problem, and performance. The particular details of the underpinning hardware/software configurations are in general not of interest: hide, or at least simplify, complexity and promote usability from the end-user point of view.

Lesson learned: users are much more forgiving towards HPC access complexity than they appear to be towards the usability thresholds of grid access protocols.

Page 30:

The EUFORIA Services in the broader view

GRID and HPC infrastructures

[Diagram: EUFORIA services spanning the EGEE grid and DEISA HPC infrastructures.]

Page 31:

Launching simple Grid jobs

Composite actor

Page 32:

Example of composite actor - details

- Certificates
- Job submit
- Job status
- Get output

Allows arbitrarily complex workflows to be built!
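As a sketch of those same four steps outside Kepler, here they are expressed against the gLite command-line tools of the period. Only the step sequence comes from the slide; the JDL contents, file names and flags are illustrative assumptions:

    # Sketch of the composite actor's four steps via the gLite CLI.
    # To execute for real, replace the print with subprocess.run(cmd, check=True)
    # on a machine with a gLite user interface installed.

    JDL = '''Executable    = "run_model.sh";
    Arguments     = "input.xml";
    StdOutput     = "std.out";
    StdError      = "std.err";
    InputSandbox  = {"run_model.sh", "input.xml"};
    OutputSandbox = {"std.out", "std.err"};
    '''

    with open("job.jdl", "w") as f:
        f.write(JDL)

    def run(cmd):
        print("$", " ".join(cmd))

    run(["voms-proxy-init", "--voms", "euforia"])                   # 1. certificates
    run(["glite-wms-job-submit", "-a", "-o", "jobids", "job.jdl"])  # 2. job submit
    run(["glite-wms-job-status", "-i", "jobids"])                   # 3. job status
    run(["glite-wms-job-output", "-i", "jobids"])                   # 4. get output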

Page 33:

GRID workflow

Kepler on the local cluster launches a simple Kepler workflow on the GRID:

[Diagram: Gateway -> RAS on euforia.efda-itm.eu -> GRID]

Page 34:

Workflow + fusion codes on GRID and HPC

MHD equilibrium codes:

- Helena on GRID
- ILSA on HPC

DEMO D10 TODAY

EUFORIA-EGEE-DEISA collaboration

Page 35:

Data Management

- The data model (CPOs) is not directly supported by the Kepler workflow engine; data must be managed outside the workflow
- The Universal Access Layer (UAL) is built on the fusion de facto standard MDSplus for transport and storage
- It supports in-memory, file (HDF5) and database (MDSplus) access

Challenges:

- Larger datasets (order GBs and beyond)
- Efficiency in distributed environments?
- The access model inherits the concept of centralized data server(s) and cataloguing

UAL:

- An extensible user API built on the CPOs ("standardized data structures")
- A transport layer supporting local and remote access
- A set of "backend" storage options: MDSplus, HDF5, (memory cache), ...
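As a rough illustration of the access pattern just described (open a pulse, get and put CPOs against a pluggable backend), here is a hypothetical Python sketch; none of these names are the real UAL API:

    # Hypothetical sketch of UAL-style access: a user API over standardized
    # data structures (CPOs) with pluggable storage backends. All names are
    # invented for illustration.

    class Pulse:
        def __init__(self, shot, run, backend="mdsplus"):  # or "hdf5", "memory"
            self.shot, self.run, self.backend = shot, run, backend
            self._store = {}

        def get_cpo(self, name):
            # The real UAL routes this through a transport layer that
            # supports both local and remote access.
            return self._store.get(name)

        def put_cpo(self, name, data):
            self._store[name] = data

    pulse = Pulse(shot=12345, run=1)
    pulse.put_cpo("equilibrium", {"psi": [0.0, 0.5, 1.0]})
    print(pulse.get_cpo("equilibrium"))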

Page 36:

ITER - an International Collaboration

Experimental facility:
- 10 Gbit/s during discharges of 500-1000 s
- 20 PB/year (lower-bound estimate)
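For scale, the per-discharge arithmetic implied by these numbers (not spelled out on the slide):

    10~\text{Gbit/s} \times 1000~\text{s} = 10^{4}~\text{Gbit} = 1250~\text{GB} \approx 1.25~\text{TB per discharge}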

International partners:
- Data replication: at least two offsite replicas
- (Near) real-time data streaming; inline modelling data to remote centers
- "Semi" remote operation
- Middleware interoperability: agreement on a single technology (most interfaces will be centrally managed/decided!)
- Resource sharing/policies
- IPR a challenging issue

Nuclear installation:
- Security
- Licensing

"The ITER aim is to demonstrate that it is possible to produce commercial energy from fusion."

First plasma 2019, full operation 2025 (!)

~3000-4000 remote participants

Page 37:

Towards fully 3D CFD: the EMC3-EIRENE code (IPP Greifswald - FZ Juelich)

EMC3-EIRENE (initially developed for stellarator applications: W7-AS, W7-X, LHD) was advanced to a more flexible grid structure to allow divertor tokamak + RMP applications, enabling the first self-consistent 3D plasma and neutral gas transport simulations for poloidal divertor tokamak configurations with RMPs. Simulation results for ITER-similar-shape plasmas at DIII-D show a strong 3D spatial modulation of plasma parameters, e.g. in Te.

[Figure: electron temperature in DIII-D with RMPs; contour levels Te,1 = 60 eV, Te,2 = 120 eV, Te,3 = 200 eV.]

EMC3-EIRENE code verification (by benchmarks with 2D tokamak edge codes) and validation (against TEXTOR, DIII-D, JET and LHD experiments) are ongoing. EMC3-EIRENE is currently foreseen for contractual ITER RMP design studies (jointly by FZJ and IPP, 2010...).

Frerichs, H., Reiter, D. et al., Comput. Phys. Commun. 181 (2010) 61-70, and Nuclear Fusion 50 (2010), in print.

Page 38:

ASCOT calculations of ITER

Tuomas Koskela

The ASCOT group used the EUFORIA-DEISA resources to simulate fusion alpha particles in ITER with two computing-time-intensive upgrades:

1. an experiment-based anomalous diffusion profile
2. full orbit integration near the wall for precise wall hits

The goal was to provide insight into the optimal location of Fast Ion Loss Detectors in ITER, which is still under discussion.

[Figure: 3D power distribution on the wall under anomalous diffusion.]

Page 39:

Fusion has turned out to be somewhat more complex to master than initially believed.

Contrary to popular belief, we ARE making rapid progress!

Page 40:

Summary

The fusion community is benefitting from e-infrastructures, using input from EUFORIA and collaborating projects to do better science!

- Improvements and porting of single codes (GRID, HPC)
- Using workflows
- Using visualization tools
- Running on the infrastructure provided by GRID (EUFORIA VO, sustainable into EGI), DEISA and project partners
- Training and outreach

Page 41:

Thanks

Chalmers University of Technology (Coordinator) from Sweden

Max Planck Institute for Plasma Physics (IPP) from Germany

Consejo Superior de Investigaciones Científicas (CSIC) from Spain

Centro de Investigaciones Energéticas, Medio Ambientales y Tecnológicas (CIEMAT) from Spain

Forschungszentrum Karlsruhe (FZK) from Germany

Finnish IT Center for Science (CSC) from Finland

Åbo Akademi University (ABO) from Finland

University of Edinburgh (UEDIN) from United Kingdom

Barcelona Supercomputing Center (BSC) from Spain

French Atomic Energy Commission (CEA) from France

University of Strasbourg from France

University of Ljubljana (UOL) from Slovenia

Poznan Supercomputing and Networking Center PSNC from Poland

Italian National Agency for New Technologies, Energy and the Environment (ENEA) from Italy

EGEE

DEISA