Multiobjective Calibration with PADDS: Testing Alternative Selection Metrics
Masoud Asadzadeh, Bryan Tolson
Outline
• Objectives
• PA-DDS algorithm
• Alternative selection metrics
• Experiment to choose proper selection metric
• MO Performance Evaluation with CNHV
• Validation of Selected Metric, MO Model Calibration
• Conclusions and Future Work
Objectives
• Evaluating PA-DDS performance:
  – Solving MOPs with more than 2 objectives
  – Using alternative selection metrics:
    • Random (RND)
    • Crowding Distance (CD)
    • Hypervolume (HV)
• Choosing a proper selection metric
• Validating the selected metric by comparing the modified PA-DDS against high-quality MO algorithms: AMALGAM vs. ε-NSGAII vs. PA-DDS
Pareto Archive DDS

[Flowchart: Initialize starting solutions → Create ND-solution set → Pick a ND solution → Perturb current ND solution → New solution is ND? (Y: pick the new solution) → Update ND solutions → Continue? (Y: repeat; N: STOP)]
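The loop in the flowchart can be sketched in Python. This is a minimal sketch, not the published implementation: `perturb` stands in for the DDS-style neighbourhood move, `select` for the selection metric (RND/CD/HV), and the archive update is a naive scan.

```python
import random

def dominates(fa, fb):
    """True if objective vector fa Pareto-dominates fb (minimization)."""
    return all(a <= b for a, b in zip(fa, fb)) and any(a < b for a, b in zip(fa, fb))

def update_archive(archive, candidate):
    """Naive non-dominated (ND) archive update: reject a dominated candidate,
    otherwise insert it and drop any members it dominates."""
    x_new, f_new = candidate
    if any(dominates(f, f_new) for _, f in archive):
        return archive, False
    archive = [(x, f) for x, f in archive if not dominates(f_new, f)]
    archive.append(candidate)
    return archive, True

def pa_dds(objective, init_solutions, perturb, select, budget):
    """Sketch of the PA-DDS loop under the assumptions named above."""
    archive = []
    for x in init_solutions:                      # initialize + create ND set
        archive, _ = update_archive(archive, (x, objective(x)))
    x_cur = select(archive)[0]                    # pick a ND solution
    for _ in range(budget):
        x_new = perturb(x_cur)                    # perturb current ND solution
        archive, is_nd = update_archive(archive, (x_new, objective(x_new)))
        # if the new solution is ND it becomes current; otherwise reselect
        x_cur = x_new if is_nd else select(archive)[0]
    return archive
```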
Alternative Selection Metrics
• Random Selection (RND)
• Crowding Distance (CD) – Deb et al. (2002)
• Contribution to HyperVolume (HV) – Zitzler and Thiele (1999); used as a selection metric in Emmerich et al. (2005)

[Figure: the selection metrics illustrated in the f1–f2 objective space]
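The two non-random metrics can be sketched for a 2-objective minimization front (function names here are mine, not from the original codes):

```python
def crowding_distance(front):
    """Crowding distance (Deb et al., 2002) for a list of objective vectors;
    boundary points get infinite distance so they are always preferred."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = front[order[-1]][k] - front[order[0]][k]
        if span == 0:
            continue
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j + 1]][k] - front[order[j - 1]][k]) / span
    return dist

def hv_contributions(front, ref):
    """Exclusive hypervolume contribution of each point of a 2-objective
    non-dominated front (minimization), w.r.t. reference point `ref`."""
    pts = sorted(front)  # ascending f1 implies descending f2 on a ND front
    out = []
    for i, (f1, f2) in enumerate(pts):
        width = (pts[i + 1][0] if i + 1 < len(pts) else ref[0]) - f1
        height = (pts[i - 1][1] if i > 0 else ref[1]) - f2
        out.append(width * height)
    return out
```

Selecting by HV contribution favours points whose removal would shrink the dominated region the most; CD favours points in sparsely populated stretches of the front.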
Experiment to Choose Selection Metric
PA-DDS with each selection metric (RND, CD, HV), applied to the mathematical test suites
• Number of Trials: 50
• Budget: 1,000 and 10,000
• Performance Evaluation: CNHV
Mathematical Test Problem: ZDT4 (Zitzler et al., 2000)
• 10 decision variables
• 2 objectives
• 21⁹ local fronts
• Convex Pareto front
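A direct transcription of the standard ZDT4 definition shows where the 21⁹ local fronts come from:

```python
import math

def zdt4(x):
    """ZDT4 (Zitzler et al., 2000): x[0] in [0, 1], x[1:] in [-5, 5].
    The Rastrigin-like term in g creates the many local fronts; the true
    Pareto front has x[1:] = 0, i.e. g = 1 and f2 = 1 - sqrt(f1)."""
    f1 = x[0]
    g = 1.0 + 10.0 * (len(x) - 1) + sum(
        xi * xi - 10.0 * math.cos(4.0 * math.pi * xi) for xi in x[1:])
    f2 = g * (1.0 - math.sqrt(f1 / g))
    return f1, f2
```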
Mathematical Test Problem: WFG4 (Huband et al., 2006)
• Scalable
• 24 decision variables
• 2 and 3 objectives
• Highly multi-modal
• Concave front
Mathematical Test Problem: WFG4 (Huband et al., 2006)

[Figure: WFG4 test problem illustration]
MO Model Comparison
• Comparative Normalized Hyper-Volume (CNHV)

[Figure: CNHV illustrated between the worst attained front (scored 0) and the best attained front (scored 1)]
CNHV vs. HV
• Shares the key properties of HV or NHV:
  – CNHV always prefers the dominating solution
  – CNHV_A > CNHV_B implies that B does not weakly dominate A
  – CNHV_max = 1 and CNHV_min = 0
• Compares multiple trials of multiple algorithms
• Measures performance across the compared algorithms
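One simple reading of these properties (a sketch of my own, not the authors' exact formula) is a linear rescaling of each trial's hypervolume between the worst- and best-attained values across all compared trials:

```python
def hypervolume_2d(front, ref):
    """Hypervolume of a 2-objective minimization front w.r.t. reference
    point `ref`, via a sweep over points sorted by f1."""
    pts = sorted(p for p in front if p[0] < ref[0] and p[1] < ref[1])
    hv, f2_ceiling = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < f2_ceiling:                       # skip dominated points
            hv += (ref[0] - f1) * (f2_ceiling - f2)
            f2_ceiling = f2
    return hv

def cnhv_scores(trial_fronts, ref):
    """Assumed CNHV-style scores: worst-attained trial maps to 0,
    best-attained trial maps to 1."""
    hvs = [hypervolume_2d(fr, ref) for fr in trial_fronts]
    lo, hi = min(hvs), max(hvs)
    return [(h - lo) / (hi - lo) if hi > lo else 0.0 for h in hvs]
```

Under this reading, a trial whose front dominates another's always scores at least as high, matching the first bullet above.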
Results: ZDT4
[Figure: CNHV vs. probability of finding a better CNHV; series: RND1,000, CD1,000, HV1,000]
Results: ZDT4
[Figure: CNHV vs. probability of finding a better CNHV; series: RND10,000, CD10,000, HV10,000]
Results: WFG4 Two Objectives
[Figure: CNHV vs. probability of finding a better CNHV; series: RND1,000, CD1,000, HV1,000]
Results: WFG4 Two Objectives
[Figure: CNHV vs. probability of finding a better CNHV; series: RND10,000, CD10,000, HV10,000]
Results: WFG4 Three Objectives
[Figure: CNHV vs. probability of finding a better CNHV; series: RND1,000, CD1,000, HV1,000]
Results: WFG4 Three Objectives
[Figure: CNHV vs. probability of finding a better CNHV; series: RND10,000, CD10,000, HV10,000]
Validating the Selected Metric

PA-DDS with RND, CD, and HV on the mathematical test suites; then PA-DDS vs. ε-NSGAII vs. AMALGAM on model calibration
• Number of Trials: 10
• Budget: 10,000
• Performance Evaluation: CNHV
Model Calibration, Town Brook (Tolson and Shoemaker, 2007)
• Sub-watershed in Cannonsville (37 km²)
• SWAT2000
• 26 parameters
• Nash–Sutcliffe objectives: flow, phosphorus delivery
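Both calibration objectives are Nash–Sutcliffe efficiencies, which can be computed as:

```python
def nash_sutcliffe(obs, sim):
    """Nash–Sutcliffe efficiency: 1 minus the ratio of the model's squared
    error to the variance of the observations about their mean
    (1 = perfect fit, 0 = no better than predicting the observed mean)."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_obs = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / ss_obs
```

In this study one NS value is computed on simulated flow and one on phosphorus delivery, and both are maximized.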
Model Calibration Results
[Figure: CNHV vs. probability of finding a better CNHV; series: AMALGAM10,000, eNSGAII10,000, PA-DDS10,000]
Model Calibration Results
[Figure: "Actual Approximate Fronts, Combined 3 Worst CNHV, Budget 10,000"; series: PA-DDS, AMALGAM, eNSGAII, Best Attained Front, Worst Attained Front; x-axis: NS Flow, y-axis: NS Phosphorus Transport]
Model Calibration Results
[Figure: "Actual Approximate Fronts, Combined 4 Average CNHV, Budget 10,000"; series: PA-DDS, AMALGAM, eNSGAII, Best Attained Front, Worst Attained Front; x-axis: NS Flow, y-axis: NS Phosphorus Transport]
Model Calibration Results
[Figure: "Actual Approximate Fronts, Combined 3 Best CNHV, Budget 10,000"; series: PA-DDS, AMALGAM, eNSGAII, Best Attained Front, Worst Attained Front; x-axis: NS Flow, y-axis: NS Phosphorus Transport]
Model Calibration Results
[Figure: "Actual Approximate Fronts, 10 Trials Combined, Budget 10,000"; series: PA-DDS, AMALGAM, eNSGAII, Best Attained Front; x-axis: NS Flow, y-axis: NS Phosphorus Transport]
Model Calibration Results
[Figure: "Actual Approximate Fronts, 10 Trials Combined"; series: PA-DDS10,000, PA-DDS1,000, Best Attained Front; x-axis: NS Flow, y-axis: NS Phosphorus Transport]
Conclusions & Future Work
• PA-DDS inherits the simplicity and parsimonious characteristics of DDS:
  – Generates a good Pareto approximate front within the modeller's time frame
  – Reduces the need to fine-tune algorithm parameters
  – Solves both continuous and discrete problems
• PA-DDS can solve MOPs with more than 2 objectives
• HV-based selection clearly improves PA-DDS performance
• PA-DDS with HV selection is promising compared to two high-quality benchmark algorithms, AMALGAM and ε-NSGAII

Future work:
• Evaluate PA-DDS performance in solving multi-objective model calibration with more than 2 objective functions
• Implement a more efficient archiving strategy and dominance check (e.g. Fieldsend et al. 2003)
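The archiving bullet can be illustrated in 2 objectives: keeping the archive sorted by f1 turns each dominance check into a binary search plus one contiguous removal, instead of a full scan. This is an illustrative 2-D shortcut of my own, not the Fieldsend et al. (2003) dominated-tree structure, which handles any number of objectives.

```python
from bisect import bisect_left

class Archive2D:
    """Non-dominated archive for 2 minimization objectives, kept sorted by
    f1 ascending (hence f2 descending). Dominance checks cost O(log N)
    versus O(N) for an unsorted archive; removals stay contiguous."""
    def __init__(self):
        self.pts = []  # invariant: f1 ascending, f2 descending

    def add(self, p):
        """Insert point p if non-dominated; return True on success."""
        i = bisect_left(self.pts, p)
        # everything left of i has f1 <= p's f1, and the left neighbour
        # has the smallest f2 among them, so one comparison decides
        if i > 0 and self.pts[i - 1][1] <= p[1]:
            return False
        # points p dominates sit to the right with f2 >= p's f2,
        # and by the invariant they form a contiguous run
        j = i
        while j < len(self.pts) and self.pts[j][1] >= p[1]:
            j += 1
        self.pts[i:j] = [p]
        return True
```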
Budget vs. Dimension

| Alg. | Study | Type of MOP | # DV | Budget |
|---|---|---|---|---|
| AMALGAM | Vrugt and Robinson, 2006 | Test problems (ZDT) | 10 | 2,500; 5,000; 7,500; 15,000 |
| AMALGAM | Wohling et al., 2008 | Soil hydraulic parameter estimation | 15 | 20,000 |
| AMALGAM | Huisman et al., 2009 | Coupled HYDRUS-2D, CRMOD | 12 (?) | 10,000 |
| AMALGAM | Zhang et al., 2010 | SWAT | 16 | 10,000 |
| ε-NSGAII | Kollat and Reed, 2005 | Test problems | 10, 30 | 12,000 to 15,000 |
| ε-NSGAII | Kollat and Reed, 2006 | Groundwater monitoring (discrete) | 25 | 200,000 |
Results: ZDT4

[Figure: CNHV vs. probability of finding a better CNHV; series: RND1,000, RND10,000, CD1,000, CD10,000, HV1,000, HV10,000]
Results: WFG4 Two Objectives
[Figure: CNHV vs. probability of finding a better CNHV; series: RND1,000, RND10,000, CD1,000, CD10,000, HV1,000, HV10,000]
Results: WFG4 Three Objectives
[Figure: CNHV vs. probability of finding a better CNHV; series: RND1,000, RND10,000, CD1,000, CD10,000, HV1,000, HV10,000]
Model Calibration Results
[Figure: CNHV vs. probability of finding a better CNHV; series: AMALGAM1,000, AMALGAM10,000, eNSGAII1,000, eNSGAII10,000, PA-DDS1,000, PA-DDS10,000]
Model Calibration Results
[Figure: CNHV vs. probability of finding a better CNHV; series: AMALGAM1,000, eNSGAII1,000, PA-DDS1,000]