Large Synoptic Survey Telescope (LSST)
Data Management UML Use Case and Activity Model

LDM-134
J Kantor, R Allsman, K-T Lim, T Axelrod, R Plante
7/12/2011


Change Record

Version  Date       Description                                                      Owner name
1        1/28/2011  Update document to reflect model based on Data Challenge 3      J. Kantor
2        7/12/2011  Update document to reflect model based on Data Challenge 3B PT1 R. Allsman


Table of Contents

Change Record
1 Introduction
1.1 Purpose of Document
1.2 Glossary
2 Application Overview
2.1 Define the Scope
2.2 Context
Use Cases
3 DMS Use Case and Activity Model
3.1 Actors
3.1.1 Alert Category Author <Actor>
3.1.2 Camera <Actor>
3.1.3 Catalog Creator <Actor>
3.1.4 Data Management System Administrator <Actor>
3.1.5 LSST Operations <Actor>
3.1.6 LSST User <Actor>
3.1.7 Observatory Control System <Actor>
3.1.8 Observatory Operations <Actor>
3.1.9 Pipeline Creator <Actor>
3.1.10 Pipeline Operator <Actor>
3.1.11 Public Interface User <Actor>
3.1.12 Science User <Actor>
3.1.13 Simulator <Actor>
3.1.14 VO Registry <Actor>
3.1.15 VOEvent Author <Actor>
3.1.16 VOEvent Subscriber <Actor>
3.2 Application Use Cases and Activities
3.2.1 Alert Production Subsystem
3.2.1.1 Process an Observing Night <UseCase>
3.2.1.1.1 Process Raw Images to Alerts <UseCase>
3.2.1.1.1.1 Moving Object overlays star <Issue>
3.2.1.1.1.2 When does Template Image get rotated? <Issue>
3.2.1.1.1.3 Prepare for Observing <UseCase>
3.2.1.1.1.4 Visit Image Processing <UseCase>
3.2.1.2 Alert Production <Activity>
3.2.1.2.1 Process Exposures <ExpansionRegion>
3.2.2 Data Release Production Subsystem
3.2.2.1 Produce a Data Release <UseCase>
3.2.2.1.1 Create Data Release Policy <UseCase>
3.2.2.1.2 Monitor Data Release Progress <UseCase>
3.2.2.2 Data Release Production <Activity>
3.2.2.2.1 Per-CCD Processing <ExpansionRegion>
3.2.2.2.2 Per-Detection Processing <ExpansionRegion>


3.2.2.2.3 Per-Sky Tile Processing <ExpansionRegion>
3.2.3 Calibration Products Production Subsystem
3.2.3.1 Produce Calibration Data Products <UseCase>
3.2.3.1.1 Acquire Calibration Data <UseCase>
3.2.3.1.1.1 Prepare Standards and References <UseCase>
3.2.3.2 Calibration Products Production <Activity>
3.2.4 Association Pipeline
3.2.4.1 Associate Visit Sources with Objects <UseCase>
3.2.4.2 Associate Sources with Objects <UseCase>
3.2.4.2.1 Number of Object Catalogs <Issue>
3.2.4.3 Night Source Association <Activity>
3.2.4.3.1 Calculate Object Zone Indices for Visit <Activity>
3.2.4.3.2 Match DiaSources to AstroObjects <Activity>
3.2.4.3.3 Preprocess AstroObject Catalog <Activity>
3.2.4.3.4 Update AstroObject and DiaSource Catalogs <Activity>
3.2.4.3.5 Find Tiles that Overlap Field of View <Class>
3.2.4.3.6 Match DIA Sources to AstroObjects <Class>
3.2.4.3.7 Prepopulate Sky Tiles with Astro Objects <Class>
3.2.4.3.8 Update AstroObject and DIA Source Catalogs <Class>
3.2.4.3.9 Build Destination Stripe Tessellation <Object>
3.2.4.4 DiaSource Association Pipeline <Activity>
3.2.4.5 Object Merge/Association Pipeline <Activity>
3.2.4.6 Source Association Pipeline <Activity>
3.2.4.6.1 Acquire Metadata <Activity>
3.2.4.6.2 Source Cluster Attributes <Activity>
3.2.4.6.3 Source Clustering <Activity>
3.2.5 Astrometric Calibration
3.2.5.1 Generate Astrometric Models <UseCase>
3.2.5.2 Astrometric Model Generation Pipeline <Activity>
3.2.5.2.1 Acquire Sky Tile Metadata <Activity>
3.2.5.2.2 Associate Detections and Sources <Activity>
3.2.5.2.3 Calibrate Astrometry <Activity>
3.2.6 Deep Detection Pipeline
3.2.6.1 Detect Deep Sources <UseCase>
3.2.6.2 Deep Detection Pipeline <Activity>
3.2.6.2.1 Acquire Sky Tile Metadata <Activity>
3.2.6.2.2 Detect Sources <Activity>
3.2.6.2.3 Measure Sources <Activity>
3.2.7 Difference Imaging Pipeline
3.2.7.1 Detect Sources in a Visit <UseCase>
3.2.7.2 Detect Sources <UseCase>
3.2.7.3 AP Difference Imaging Pipeline <Activity>
3.2.7.4 Difference Imaging Pipeline <Activity>
3.2.7.4.1 Identify Sky Tiles <Activity>
3.2.7.4.2 Extract Template <Activity>


3.2.7.4.3 Image Differencing <Activity>
3.2.7.4.4 Difference Detection <Activity>
3.2.7.4.5 Difference Measurement <Activity>
3.2.7.5 Commissioning vs. Main Processing <Issue>
3.2.7.6 Complex Shape Determination <Issue>
3.2.7.7 Inter-CCD/Raft/Focal Plane Communication <Issue>
3.2.8 Image Coaddition Pipelines
3.2.8.1 Create Deep Coadds <UseCase>
3.2.8.2 Deep Coadd Generation Pipeline <Activity>
3.2.8.2.1 Acquire Sky Tile Metadata <Activity>
3.2.8.2.2 Convert Exposure to SkyMapImage <Activity>
3.2.8.2.3 Generate ChiSquared Coadd <Activity>
3.2.8.3 PSFMatch Pipeline <Activity>
3.2.8.3.1 Acquire Sky Tile Metadata <Activity>
3.2.8.3.2 Build Exposure Stack <Activity>
3.2.8.3.3 Match PSF <Activity>
3.2.8.3.4 Reject Outliers <Activity>
3.2.8.3.5 Warp Exposure to SkyMapImage <Activity>
3.2.8.4 TemplateGen Pipeline <Activity>
3.2.8.4.1 Coadd without Outlier Rejection <Activity>
3.2.8.4.2 Generate Sky Tile <Activity>
3.2.9 Image Processing Pipeline
3.2.9.1 Process Raw Images to Calibrated Images <UseCase>
3.2.9.1.1 Required TF <Requirement>
3.2.9.1.2 Build PSF Model <UseCase>
3.2.9.1.3 Add Fake Objects <UseCase>
3.2.9.1.4 Determine WCS for a Science Image <UseCase>
3.2.9.2 ISR Pipeline <Activity>
3.2.9.2.1 Remove Cross-Talk <Activity>
3.2.9.2.2 SDQA for ISR <Activity>
3.2.9.2.3 Acquire Visit Metadata <Activity>
3.2.9.2.4 Add Variance <Activity>
3.2.9.2.5 Apply Darks <Activity>
3.2.9.2.6 Defringe <Activity>
3.2.9.2.7 Flatten <Activity>
3.2.9.2.8 Identify Calibration Product <Activity>
3.2.9.2.9 Linearity <Activity>
3.2.9.2.10 Remove Bias <Activity>
3.2.9.2.11 Remove Overscans <Activity>
3.2.9.2.12 Remove Saturation <Activity>
3.2.9.2.13 Transform Metadata <Activity>
3.2.9.2.14 Validate Metadata <Activity>
3.2.9.3 CCD Assembly Pipeline <Activity>
3.2.9.3.1 CCD Assembly <Activity>
3.2.9.3.2 SDQA for CCD Assembly <Activity>


3.2.9.3.3 Identify Defects <Activity>
3.2.9.4 CR Split Handling Pipeline <Activity>
3.2.9.4.1 Background Estimation 1 <Activity>
3.2.9.4.2 Background Estimation 2 <Activity>
3.2.9.4.3 Find and Mask CRs 1 <Activity>
3.2.9.4.4 Find and Mask CRs 2 <Activity>
3.2.9.4.5 Mask and Sum <Activity>
3.2.9.4.6 Simple Image Differencing <Activity>
3.2.9.5 Image Characterization Pipeline <Activity>
3.2.9.5.1 Aperture Correction <Activity>
3.2.9.5.2 Bright Star Detection <Activity>
3.2.9.5.3 Bright Star Measurement <Activity>
3.2.9.5.4 CCD Photometric Calibration <Activity>
3.2.9.5.5 Exposure Generation <Activity>
3.2.9.5.6 PSF Determination <Activity>
3.2.9.5.7 WCS Determination <Activity>
3.2.9.5.8 WCS Verification <Activity>
3.2.10 Moving Object Pipelines (Day and Night)
3.2.10.1 Identify Moving Objects <UseCase>
3.2.10.1.1 Running MOP before Transients <Issue>
3.2.10.2 Mask Moving Objects from Image <UseCase>
3.2.10.3 Mask Moving Objects Pipeline <Activity>
3.2.10.3.1 Acquire Visit Metadata <Activity>
3.2.10.3.2 Mask Footprints <Activity>
3.2.10.4 Compute Coarse Ephemerides for Night <Activity>
3.2.10.5 Night MOPS Pipeline <Activity>
3.2.10.6 DayMOPS Pipeline <Activity>
3.2.10.6.1 Inter Night Linking <Activity>
3.2.10.6.2 Intra Night Linking <Activity>
3.2.10.6.3 Orbit Determination <Activity>
3.2.10.6.4 Orbit Management <Activity>
3.2.10.6.5 Setup DayMOPS <Activity>
3.2.11 Object Characterization Pipeline
3.2.11.1 Generate Galaxy Models <UseCase>
3.2.11.2 Galaxy Model Generation Pipeline <Activity>
3.2.11.2.1 Detect Transforms <Activity>
3.2.11.2.2 Forced Photometry <Activity>
3.2.11.2.3 Multifit <Activity>
3.2.11.2.4 Postage Stamp Generation <Activity>
3.2.12 Photometric Calibration Pipeline
3.2.12.1 Difference Image Forced Photometry <UseCase>
3.2.12.2 Recalibrate Data Release Photometry <UseCase>
3.2.12.3 Difference Forced Photometry Pipeline <Activity>
3.2.12.3.1 Acquire Visit Metadata <Activity>
3.2.12.3.2 Measure Sources <Activity>


3.2.12.4 Photometric Calibration Pipeline <Activity>
3.2.12.4.1 Assess TOA Mag Distributions <Activity>
3.2.12.4.2 Calculate Photometric Calibration SDQA Metrics <Activity>
3.2.12.4.3 Compare Grey Atm with IR Camera Data <Activity>
3.2.12.4.4 Correct for Atmospheric Extinction <Activity>
3.2.12.4.5 Find Non-gray Extinction <Activity>
3.2.13 Single Frame Measurement Pipeline
3.2.13.1 Measure Single Frame Sources <UseCase>
3.2.13.2 Single Frame Source Measurement Pipeline <Activity>
3.2.13.2.1 Acquire Visit Metadata <Activity>
3.2.13.2.2 Compute Source Sky Coordinates <Activity>
3.2.13.2.3 Detect Sources <Activity>
3.2.13.2.4 Measure Sources <Activity>
3.2.14 Science Data Quality Analysis Toolkit
3.2.15 Science Data Quality Assessment Pipeline
3.2.15.1 Assess Data Quality <UseCase>
3.2.15.1.1 Assess Data Quality for Nightly Processing at Archive <UseCase>
3.2.15.1.1.1 Analyze Astrometric Quality <UseCase>
3.2.15.1.1.2 Analyze Image Quality <UseCase>
3.2.15.1.1.3 Analyze Object Properties Quality <UseCase>
3.2.15.1.1.4 Analyze Orbit Quality <UseCase>
3.2.15.1.1.5 Analyze Outliers <UseCase>
3.2.15.1.1.6 Analyze Photometric Quality <UseCase>
3.2.15.2 SDQA Interactive Environment <UseCase>
3.2.15.2.1 Observatory Operations (Camera Scientist/Pipeline Controller/Scheduler) <Actor>
3.2.15.2.2 Adjust SDQA Thresholds <UseCase>
3.2.15.2.3 Analyze SDQA Metrics using SQuAT <UseCase>
3.2.15.2.3.1 display filtered SDQA Results <Object>
3.2.15.2.3.2 display partial SDQA results <Object>
3.2.15.2.3.3 display SDQA metric set up <Object>
3.2.15.2.3.4 display SDQA Results <Object>
3.2.15.2.3.5 display task selection <Object>
3.2.15.2.3.6 filter SDQA results <Object>
3.2.15.2.3.7 format output <Object>
3.2.15.2.3.8 generate SDQA Results <Object>
3.2.15.2.3.9 query database <Object>
3.2.15.2.4 Check Basic Integrity of Catalog <UseCase>
3.2.15.2.5 Check Coadd QA Diagnostics <UseCase>
3.2.15.2.6 Compare Uncertainties in Object Properties with Expectation <UseCase>
3.2.15.2.7 Compute Completeness and Reliability <UseCase>
3.2.15.2.8 Correlate SDQA metric with other data <UseCase>
3.2.15.2.9 Correlate SDQA metrics <UseCase>


3.2.15.2.10 Examine Data Quality Status <UseCase>
3.2.15.2.11 Generate Data Quality Alarm <UseCase>
3.2.15.2.12 Modify Operations Based on Data Quality Status <UseCase>
3.2.15.2.13 Override Data Quality Status/Add Comments <UseCase>
3.2.15.2.14 Perform Large-Scale Sanity Checks <UseCase>
3.2.15.2.15 Placeholder for Additional Use Cases <UseCase>
3.2.15.2.16 Present Default Detailed SDQA Information <UseCase>
3.2.15.2.17 Present Default Summary SDQA Information <UseCase>
3.2.15.2.18 Review Customized Statistical Summary <UseCase>
3.2.15.2.19 Review Data for Specific Period of Time <UseCase>
3.2.15.2.20 Review data for specific region of focal plane <UseCase>
3.2.15.2.21 Review Data from Facilities Database <UseCase>
3.2.15.2.22 Review Default SDQA Data <UseCase>
3.2.15.2.23 Review Histograms <UseCase>
3.2.15.2.24 Review Histograms Generic <UseCase>
3.2.15.2.25 Review Time Series <UseCase>
3.2.15.2.26 Select SDQA Display Type <UseCase>
3.2.15.2.27 Specify SDQA Task <UseCase>
3.2.15.2.27.1 SDQA Results <Object>
3.2.15.2.28 Summarize Quality of Observing Conditions <UseCase>
3.2.15.2.29 View Calibration Data <UseCase>
3.2.15.2.30 View Data from Ancillary Telescope <UseCase>
3.2.15.2.31 View Processed Image <UseCase>
3.2.15.2.32 View Raw Exposure Image <UseCase>
3.2.15.2.33 View Selected Catalog Data <UseCase>
3.2.16 Community Science Subsystems
3.2.16.1 LSST Science Use Cases
3.2.16.1.1 Analyze Color-Color Diagram <UseCase>
3.2.16.1.2 Create Cleaned Color-Magnitude Diagram <UseCase>
3.2.16.1.2.1 can we do this? <Issue>
3.2.16.1.3 Create Color-Magnitude Diagram <UseCase>
3.2.16.1.3.1 generate Color Magnitude Table <Object>
3.2.16.1.4 Create Corrected Color-Magnitude Diagram <UseCase>
3.2.16.1.5 Create Stellar Color-Color Diagram <UseCase>
3.2.16.1.6 Derive Galaxy Luminosity Function <UseCase>
3.2.16.1.7 Derive Stellar Luminosity Function <UseCase>
3.2.16.1.8 Discover groups and clusters of galaxies <UseCase>
3.2.16.1.9 Extract Time Sequence of Images <UseCase>
3.2.16.1.10 Extract Time Series for Objects <UseCase>
3.2.16.1.10.1 Source vs. Object? <Issue>
3.2.16.1.11 Find all Lensed Quasar Candidates <UseCase>
3.2.16.1.12 Generate photometric redshift for a galaxy <UseCase>
3.2.16.2 Example Complex Science Use Cases
3.2.16.3 Example Simple Science Use Cases
3.2.17 Classification


3.2.17.1 Classify Objects <UseCase>
3.2.17.1.1 Classify Extended Objects <UseCase>
3.2.17.1.2 Classify Stars <UseCase>
3.2.17.1.3 Classify Variable Objects <UseCase>
3.2.18 Alert Generation Pipeline
3.2.18.1 Generate Alerts from Visit <UseCase>
3.2.18.2 Alert Generation Pipeline <Activity>
3.2.19 Alert/Notification Toolkit
3.2.19.1 Create Alert Category <UseCase>
3.2.19.2 Create an Alert Filter <UseCase>
3.2.19.3 Deliver Alerts <UseCase>
3.2.19.4 Process Subscription Requests <UseCase>
3.2.19.5 Record Alert <UseCase>
3.2.19.6 Retrieve Alerts <UseCase>
3.2.19.7 Subscribe to Alert Category <UseCase>
3.3 Middleware Use Cases
3.3.1 Pipeline Construction Toolkit
3.3.1.1 Construct Pipeline <UseCase>
3.3.1.1.1 Create Component <UseCase>
3.3.1.1.1.1 Componentize and Add to Component Library <UseCase>
3.3.1.1.1.2 Create Component Algorithm <UseCase>
3.3.1.1.1.3 Define Component Data Structures <UseCase>
3.3.1.1.1.4 Define Component Interface <UseCase>
3.3.1.1.2 Create Pipeline <UseCase>
3.3.1.1.2.1 Browse Component Library <UseCase>
3.3.1.1.2.2 Define Execution Environment <UseCase>
3.3.1.1.2.3 Define Pipelines/Tools Associations <UseCase>
3.3.1.1.2.4 Define Processing Steps <UseCase>
3.3.1.1.2.5 Publish Pipeline Configuration <UseCase>
3.3.1.1.2.6 Save Pipeline Configuration <UseCase>
3.3.1.1.2.7 Target to Execution Environment <UseCase>
3.3.1.1.2.7.1 Pipeline Developer <Actor>
3.3.1.1.2.7.2 Data Products Access List <Class>
3.3.1.1.2.7.3 Pipeline Services Library (per target) <Class>
3.3.1.1.2.7.4 Pipeline Workflow Database <Class>
3.3.1.1.2.7.5 Target <Class>
3.3.1.1.2.7.6 Target Executable (binary) <Class>
3.3.1.1.2.7.7 Target Platform Specification Database <Class>
3.3.2 Pipeline Execution Services
3.3.2.1 Run a Pipeline <UseCase>
3.3.2.1.1 Updating Sources & Sinks <Issue>
3.3.2.1.2 Preload a Database <UseCase>
3.3.2.1.3 Stage a Named Collection <UseCase>
3.3.2.1.4 Pipeline Execution and Monitoring <UseCase>
3.3.2.1.4.1 Create Slice Intracommunicator <UseCase>


3.3.2.1.4.2 Initialize Pipeline Events <UseCase>
3.3.2.1.4.3 Load Pipeline Policy <UseCase>
3.3.2.1.4.4 Execute Processing Stage <UseCase>
3.3.2.1.4.4.1 Initialize Processing Stage <UseCase>
3.3.2.1.4.4.2 Process a Data Input Through A Stage <UseCase>
3.3.2.1.4.4.2.1 Perform InterSlice Communication <UseCase>
3.3.2.1.4.4.2.1.1 Post Received Data to Clipboard <UseCase>
3.3.2.1.4.4.2.1.2 Retrieve Shared Data from Clipboard <UseCase>
3.3.2.1.4.4.2.1.3 Transmit Data between Slices <UseCase>
3.3.2.1.4.4.2.2 Process <UseCase>
3.3.2.1.4.4.2.3 Post Process <UseCase>
3.3.2.1.4.4.2.4 Pre-Process <UseCase>
3.3.2.1.4.4.3 Terminate Processing Stage <UseCase>
3.3.2.1.4.5 Create Slices <UseCase>
3.3.2.1.4.6 Record Pipeline Provenance <UseCase>
3.3.2.1.5 Checkpoint/Restart Pipeline <UseCase>
3.3.2.1.5.1 Checkpoint/Restart Between Processing Steps <UseCase>
3.3.2.1.5.2 Checkpoint/Restart within Processing Step <UseCase>
3.3.2.1.6 Clean Up after Execution <UseCase>
3.3.2.1.7 Configure Pipeline <UseCase>
3.3.2.1.7.1 Retrieve Default Pipeline Policies <UseCase>
3.3.2.1.7.2 Distribute Programs to Processing Nodes <UseCase>
3.3.2.1.7.3 Initialize Processing Nodes <UseCase>
3.3.2.1.7.4 Select Input Sources and Output Sinks <UseCase>
3.3.2.1.7.5 Set Monitoring and Control Parameters <UseCase>
3.3.2.1.8 Monitor Pipeline Execution <UseCase>
3.3.2.1.8.1 Detect Failure <UseCase>
3.3.2.1.8.2 Display Pipeline Status <UseCase>
3.3.2.1.8.3 Recover from Hardware Failure <UseCase>
3.3.2.1.8.4 Recover from Software Failure <UseCase>
3.3.2.1.9 Record Pipeline Execution Status <UseCase>
3.3.2.1.10 Stage Input Data <UseCase>
3.3.2.1.11 Stop Pipeline Execution <UseCase>
3.3.2.1.11.1 Shutdown Slices <UseCase>
3.3.3 Control and Management Services
3.3.3.1 Control DM System <UseCase>
3.3.3.1.1 Fetch Template Images & Updated Orbit Catalog from Archive <UseCase>
3.3.3.1.2 Produce Nightly DMS Summaries <UseCase>
3.3.3.1.3 Send Raw Image Data to Archive <UseCase>
3.3.3.1.4 Initialize DMS <UseCase>
3.3.3.1.5 Monitor DM System Health and Status <UseCase>
3.3.3.1.6 Publish Data Products <UseCase>
3.3.3.1.6.1 Preserve/Retire Data Product <UseCase>
3.3.3.1.6.2 Publish in Archive Center <UseCase>


3.3.3.1.6.3 Register in VO <UseCase>
3.3.3.1.6.4 Replicate in Data Centers <UseCase>
3.3.3.1.7 Reprocess Observations <UseCase>
3.3.3.1.7.1 Analyze Re-processing Needed <UseCase>
3.3.3.1.7.2 Create Re-processing Strategy <UseCase>
3.3.3.1.8 Stop DMCS <UseCase>
3.3.3.2 Event Handling
3.3.3.2.1 Create Event Receiver <UseCase>
3.3.3.2.2 Create Event Transmitter <UseCase>
3.3.3.2.3 Create Timer <UseCase>
3.3.3.2.4 Establish Event Receiver In Event System <UseCase>
3.3.3.2.5 Establish Event Transmitter In Event System <UseCase>
3.3.3.2.6 Initialize Event Monitor <UseCase>
3.3.3.2.7 Matching Receive Event <UseCase>
3.3.3.2.7.1 Publish Event <UseCase>
3.3.3.2.8 Process Incoming Event <UseCase>
3.3.3.2.9 Publish Event Using Event System <UseCase>
3.3.3.2.10 Receive Event <UseCase>
3.3.3.2.11 Record Event <UseCase>
3.3.3.2.12 Retrieve Event Using Event System <UseCase>
3.3.3.2.13 Retrieve Matching Event Using Event System <UseCase>
3.3.3.2.14 Run Event Monitor <UseCase>
3.3.3.2.15 Run Fault Monitor <UseCase>
3.3.3.2.16 Subscribe to an Event Topic <UseCase>
3.3.4 Science Database and Data Access Services
3.3.4.1 Catalog Construction Toolkit
3.3.4.1.1 Construct Catalog <UseCase>
3.3.4.1.1.1 Create Catalog <UseCase>
3.3.4.1.1.1.1 Compile and Publish Catalog <UseCase>
3.3.4.1.1.1.2 Create Loader <UseCase>
3.3.4.1.1.1.3 Define Access Constraints <UseCase>
3.3.4.1.1.1.4 Define Validation Constraints <UseCase>
3.3.4.1.1.1.5 Initialize Catalog Contents <UseCase>
3.3.4.1.1.1.6 Select Data Types <UseCase>
3.3.4.1.1.2 Create Data Type <UseCase>
3.3.4.1.1.2.1 Browse Data Types <UseCase>
3.3.4.1.1.2.2 Compile and Publish Data Type <UseCase>
3.3.4.1.1.2.3 Define Attributes <UseCase>
3.3.4.1.1.2.4 Define Data Associations and Indexing <UseCase>
3.3.4.1.1.2.5 Define Physical Storage <UseCase>
3.3.4.2 Query Services
3.3.4.2.1 Formulate and Submit Query <UseCase>
3.3.4.2.2 Process Query <UseCase>
3.3.4.2.2.1 Log Queries <UseCase>
3.3.4.2.2.2 Catch Hostile Queries <UseCase>


3.3.4.2.2.3 Estimate Query Cost <UseCase>
3.3.4.2.2.4 Access Database Systems <UseCase>
3.3.4.2.2.5 Access File Systems <UseCase>
3.3.4.2.2.6 Process Query Results <UseCase>
3.3.4.2.3 SQL Syntax <Issue>
3.3.4.3 Data Ingest (Database Services)
3.3.4.3.1 Run Data Ingest <UseCase>
3.3.4.3.1.1 Run Catalog Ingest Service <UseCase>
3.3.4.3.1.1.1 Initialize Data Ingest <UseCase>
3.3.4.3.1.1.2 Register Data Ingest Converter <UseCase>
3.3.4.3.1.1.3 Unregister Data Ingest Converter <UseCase>
3.3.4.3.1.1.4 Control Ingest Tables <UseCase>
3.3.4.3.1.1.5 Verify Input Data <UseCase>
3.3.4.3.1.1.6 Load Data Chunk to Database <UseCase>
3.3.4.3.1.1.7 Expose Ingested Data <UseCase>
3.3.4.3.1.1.8 Cleanup <UseCase>
3.3.4.3.1.1.9 Finish Nightly Ingest <UseCase>
3.3.4.3.1.1.9.1 Finish Nightly Ingest at Base Camp <UseCase>
3.3.4.3.1.1.9.2 Finish Nightly Ingest at Main Archive <UseCase>
3.3.4.3.1.1.10 Shutdown Data Ingest <UseCase>
3.3.4.3.1.1.11 Record event <Object>
3.3.4.3.1.1.12 Get Catalog Ingest Configuration <Object>
3.3.4.3.1.2 Run File Ingest Service <UseCase>
3.3.4.3.1.2.1 Assign Identifiers to Dataset <UseCase>
3.3.4.3.1.2.2 Copy Data to Long-Term Storage <UseCase>
3.3.4.3.1.2.3 Ensure Data is Staged for Ingest <UseCase>
3.3.4.3.1.2.4 Expose Ingested Data Files and Metadata <UseCase>
3.3.4.3.1.2.5 Extract & Verify Metadata From Dataset <UseCase>
3.3.4.3.1.2.6 Replicate Data Files and Metadata to Mirror Sites <UseCase>
3.3.4.3.2 Run Data Ingest Converter <UseCase>
3.3.4.3.2.1 Initialize Data Ingest Converter <UseCase>
3.3.4.3.2.2 Convert Input Data <UseCase>
3.3.4.3.2.3 Shutdown Data Ingest Converter <UseCase>
3.3.4.4 Data Access Framework
3.3.4.4.1 LsstData and Citizen Use Cases
3.3.4.4.1.1 Lsst Application <Actor>
3.3.4.4.1.2 Configure LsstData Support <UseCase>
3.3.4.4.1.2.1 Policy Factory <Object>
3.3.4.4.1.3 Obtain an LsstData realization <UseCase>
3.3.4.4.2 Persistence Use Cases
3.3.4.4.2.1 Define persistence policies <UseCase>
3.3.4.4.2.2 Execute persistence <UseCase>
3.3.4.4.2.3 Execute retrieval <UseCase>
3.3.4.4.2.4 Format and send Persistable object to Storage(s) <UseCase>


3.3.4.4.2.5 Obtain and configure Persistence object <UseCase>
3.3.4.4.2.6 Obtain and configure Storage object(s) for persistence <UseCase>
3.3.4.4.2.7 Persist Data from Pipeline <UseCase>
3.3.4.4.2.8 Retrieve Persistable object from Storage(s) <UseCase>
3.3.4.4.2.9 Specify additional object metadata <UseCase>
3.3.4.4.3 Provenance Use Cases
3.3.4.4.3.1 Re-create Science Exposure <UseCase>
3.3.4.5 Data Access for Pipelines
3.3.4.5.1 Configure Object Catalog for association <UseCase>
3.3.4.5.2 Generate list of Sky Patches <UseCase>
3.3.4.5.2.1 Sky Coverage <Object>
3.3.4.5.3 Initialize Catalogs <UseCase>
3.3.4.5.4 Prepare Data Access for Pipeline <UseCase>
3.3.4.5.5 Retrieve Image Fragments from Image Collection <UseCase>
3.3.4.5.6 Retrieve Image from Image Collection <UseCase>
3.3.4.5.6.1 Map to physical location <Object>
3.3.4.5.7 Retrieve Template/Co-Add covering an area <UseCase>
3.3.4.5.8 Setup access to Co-add/Template Collection <UseCase>
3.3.4.5.9 Setup access to Image Collection <UseCase>
3.3.5 Security and Access Control Services
3.3.5.1 Administer Certificates <UseCase>
3.3.5.2 Administer Groups and Users <UseCase>
3.3.5.3 Authenticate <UseCase>
3.3.5.4 Authorize <UseCase>
3.3.5.5 Configure Security Profiles and Policies <UseCase>
3.3.5.6 Enforce Security Policies <UseCase>
3.3.6 Sys Admin and Opns Services
3.3.6.1 Administer and Maintain Applications Software <UseCase>
3.3.6.2 Administer and Maintain Data <UseCase>
3.3.6.3 Administer and Maintain Systems <UseCase>
3.3.7 User Interface/Visualization Services
3.3.7.1 Dynamic Display of Data and Meta Data <UseCase>
3.3.7.2 Static Display of Data and Meta Data <UseCase>


1 Introduction

This document is an export of the Use Cases and Activities in the LSST Data Management UML Model. The UML Model is the baseline repository for all LSST Data Management System (DMS) software engineering specifications, including Use Cases, Activities, the Domain Model, Robustness Diagrams, and the Logical Model (Class and Sequence Diagrams).

LSST DM uses the Iconix Process for software engineering, depicted below.

The Use Case / Activity Model specifies the "dynamic" behavior of the DMS, i.e. its processing requirements. A companion document is the DMS Domain Model, which expresses the "static" data model of the DMS. From the Use Case / Activity and Domain Models, the design is developed and expressed in the Robustness, Sequence, and Class Diagrams. Finally, the design is used both to generate code in a forward engineering process and to update the design with programming changes in a reverse engineering process.

1.1 Purpose of Document

The purpose of this document is to present the Use Case / Activity Model for the LSST Data Management System (DMS). This model defines the primary processing requirements for the DMS.

1.2 Glossary

Group | Term | Definition
Business | Accounting Periods |
Technical | Association | A relationship between two or more entities. Implies a connection of some type - for example, one entity uses the


Technical | Class | A logical entity encapsulating data and behavior. A class is a template for an object - the class is the design, the object the runtime instance.

Technical | Component Model | The component model provides a detailed view of the various hardware and software components that make up the proposed system. It shows both where these components reside and how they inter-relate with other components. Component requirements detail what responsibilities a component has to supply functionality or behavior within the system.

Business | Customer | A person or a company that requests an entity to transport goods on their behalf.

Technical | Deployment Architecture | A view of the proposed hardware that will make up the new system, together with the physical components that will execute on that hardware. Includes specifications for machine, operating system, network links, backup units, etc.

Technical | Deployment Model | A model of the system as it will be physically deployed.

Technical | Extends Relationship | A relationship between two use cases in which one use case 'extends' the behavior of another. Typically this represents optional behavior in a use case scenario - for example, a user may optionally request a list or report at some point in performing a business use case.

Technical | Includes Relationship | A relationship between two use cases in which one use case 'includes' the behavior of another. This is indicated where there are specific business use cases which are used from many other places - for example, updating a train record may be part of many larger business


Technical | Use Case | Each Use Case has a description which describes the functionality that will be built in the proposed system. A Use Case may 'include' another Use Case's functionality or 'extend' another Use Case with its own behavior. Use Cases are typically related to 'actors'. An actor is a human or machine entity that interacts with the system to perform meaningful work.

2 Application Overview

This section describes the scope and context of the LSST Data Management System.

2.1 Define the Scope

The principal functions of the DMS are to:

- Process the incoming stream of images generated by the camera system during observing to generate and archive the nightly data products.

- Periodically process the accumulated nightly data products to measure the properties of fainter objects and to classify objects based on their time-dependent behavior. The results of such a processing run form a data release (DR), which is a static, self-consistent data set for use in performing scientific analysis of LSST data and publication of the results.

- Make all LSST data publicly available through an interface that utilizes, to the maximum possible extent, community-based standards such as those being developed by the Virtual Observatory (VO).

2.2 Context

The LSST Data Management System is one of the three main subsystems of the LSST. It accepts raw science images from the Camera Data Acquisition subsystem (DAQ) and control information and metadata from the Observatory Control System (OCS).


Use Cases

3 DMS Use Case and Activity Model

Use Case Model Packages - Use Case Diagram

[Use case diagram: Use Case Model Packages. Author: Jeff Kantor; Version: 1.0; Created: 2/12/2001; Updated: 2/21/2011. Elements shown: The LSST Data Management Subsystem, the Application Layer, the Middleware Layer, the Actors package, and the external actors LSST User, Observatory Control System, VO Registry, and Camera Simulator, together with the packages Image Processing Pipeline, Difference Imaging Pipeline, Community Science Subsystems, Pipeline Construction Toolkit, Association Pipeline, Deep Detection Pipeline, Security and Access Control Services, Alert/Notification Toolkit, Query Services, Moving Object Pipelines (Day and Night), Data Ingest (Database Services), Catalog Construction Toolkit, Sys Admin and Opns Services, User Interface/Visualization Services, Distributed Processing Services, Data Access Services, Other Services, Calibration Products Production Subsystem, Data Release Production Subsystem, Alert Production Subsystem, Alert Generation Pipeline, Astrometric Calibration, Classification, Image Coaddition Pipelines, Science Data Quality Analysis Toolkit, Photometric Calibration Pipeline, Event Handling, Data Access Framework, and Data Access for Pipelines.]

Use Case Model Packages - Use Case Diagram

3.1 Actors


Actors - Logical Diagram

[Logical diagram: Actors, showing the human and system actors of the DMS: Data Management System Administrator, Observatory Operations, Pipeline Operator, LSST Operations, Observatory Control System, Public Interface User, VO Registry, Science User, Pipeline Creator, Catalog Creator, LSST User, Camera Simulator, VOEvent Subscriber, Alert Category Author, and VOEvent Author.]

Actors - Logical Diagram

3.1.1 Alert Category Author <Actor>

This is a user that sets up LSST Alert Categories, allowing for later Subscriptions to these Categories.

3.1.2 Camera <Actor>

This actor represents the Camera subsystem of the LSST.

3.1.3 Catalog Creator <Actor>

This actor is any user that has the access necessary to create a new catalog type or a new instance of an existing catalog type and to cause that instance to be populated with data.

3.1.4 Data Management System Administrator <Actor>

3.1.5 LSST Operations <Actor>

This actor is any user that performs an operational role in the LSST Observatory, including


3.1.6 LSST User <Actor>

3.1.7 Observatory Control System <Actor>

This actor represents the overall master control system that coordinates the operation of all LSST subsystems.

3.1.8 Observatory Operations <Actor>

This actor has authority to permit LSST Data Products to be released external to the project.

3.1.9 Pipeline Creator <Actor>

This actor is any user that has the access necessary to create a new component or pipeline type or a new instance of an existing component or pipeline type and to cause that instance to be available for execution.

3.1.10 Pipeline Operator <Actor>

This actor is any user with access to cause pipelines to execute, to terminate, or to be stopped and started.

3.1.11 Public Interface User <Actor>

This actor represents all users/systems that access LSST public interfaces.

3.1.12 Science User <Actor>

This actor is any user who has access to LSST Data Products, Pipelines, or both.

3.1.13 Simulator <Actor>

This actor represents any source of simulated LSST science data, including images, meta-data, catalog data, alerts, etc.


3.1.14 VO Registry <Actor>

This actor represents any system that hosts a publicly accessible VO Registry.

3.1.15 VOEvent Author <Actor>

Author of VOEvent types offered by a Producer/Publisher.

3.1.16 VOEvent Subscriber <Actor>

This is a user/system that subscribes to LSST Alerts via the VOEvent Services.

3.2 Application Use Cases and Activities

3.2.1 Alert Production Subsystem

The Alert Production Subsystem includes software programs, configuration files, databases and data files, unit tests, component integration tests, and documentation implementing the Alert Production Pipelines. This Alert Production element is deployed at both the Base Center and the Archive Center.

DMS Alert Production - Activity Diagram


[Activity diagram: DMS Alert Production, from Begin Alert Production to End Alert Production. Pipelines shown: Compute Coarse Ephemerides for Night and Night MOPS Pipeline (from Moving Object Pipelines (Day and Night)); Per-CCD Processing «parallel» (from Data Release Production Subsystem), containing the ISR Pipeline, CCD Assembly Pipeline, CR Split Handling Pipeline, and Image Characterization Pipeline (from Image Processing Pipeline); AP Difference Imaging Pipeline (from Difference Imaging Pipeline); Night Source Association (from Association Pipeline); Alert Generation Pipeline (from Alert Generation Pipeline); and the Alert Production SDQA Monitoring Pipeline and SDQA Alert Production Summary Pipeline (from Science Data Quality Assessment Pipeline), which collect the SDQA quantities emitted by each pipeline. Color key: Green - DC3a pipelines enhanced during DC3b PT1.0; Orange - DC3a pipeline enhanced during DC3b PT1.1; Turquoise - Original DC3a pipelines; Yellow - Pipelines defined for Alert Production but not prototyped.]


DMS Alert Production - Activity Diagram

3.2.1.1 Process an Observing Night <UseCase>

DESCRIPTION: Process an Observing Night - Manage the execution of the nightly pipelines. This is an abstract use case that is realized differently depending on whether it is run at the Base Camp or the Archive Center.

BASIC COURSE:

3.2.1.1.1 Process Raw Images to Alerts <UseCase>

BASIC COURSE:

Upon signal from the OCS that observing will soon start, the DMS launches the Nightly Pipelines by invoking the "Run Nightly Pipelines" Use Case.

The OCS invokes the "Capture CCD Images" UC.

An initial quality assessment is run by invoking the "Assess CCD Image Data Quality" UC.

If the data is valid, the "Transfer Data to Base Facility" UC is invoked.

When the transfer is complete, the Pipeline Manager (created via "Run Nightly Pipelines") invokes the "Feed Transferred Data to Pipelines" UC.

The raw data is also transferred to the Archive Center for archiving, where the alert-producing pipelines are re-run and additional analysis pipelines are executed.

As Alerts are generated by the Pipelines, the PM invokes the "Transfer Alert Data to Archive Center" UC.
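The sequence of use-case invocations above can be sketched as a small driver. This is an illustrative stand-in, not DMS code: the quoted names come from the basic course, except "Transfer Raw Data to Archive Center", which is a hypothetical label for the raw-data archiving step.

```python
# Illustrative sketch only: records the order of use-case invocations in the
# "Process Raw Images to Alerts" basic course. No real DMS interfaces are used.

def process_observing_night(data_valid=True):
    invoked = []
    invoked.append("Run Nightly Pipelines")          # on OCS start-of-observing signal
    invoked.append("Capture CCD Images")             # invoked by the OCS
    invoked.append("Assess CCD Image Data Quality")  # initial quality assessment
    if data_valid:
        invoked.append("Transfer Data to Base Facility")
        # The Pipeline Manager (created by "Run Nightly Pipelines") feeds the data in:
        invoked.append("Feed Transferred Data to Pipelines")
        # Raw data also goes to the Archive Center for archiving and re-processing
        # (hypothetical use-case label, not in the model):
        invoked.append("Transfer Raw Data to Archive Center")
        # As alerts emerge from the pipelines:
        invoked.append("Transfer Alert Data to Archive Center")
    return invoked

order = process_observing_night()
```

Note that when the initial quality assessment rejects the data, the transfer and pipeline-feeding steps are skipped entirely.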

Process Raw Images to Alerts - Use Case Diagram


[Use case diagram: Process Raw Images to Alerts. Package: Alert Production Subsystem; Author: Tim Axelrod; Version: 1.0. Use cases shown: Prepare for Observing, Detect Sources in a Visit, Associate Visit Sources with Objects, Generate Alerts from Visit, «Controller» Process Raw Images to Calibrated Images, Find Known Moving Objects in Visit, «Controller» Detect Sources, «Controller» Associate Sources with Objects, «Controller» Process Raw Images to Alerts, «Controller» Mask Moving Objects from Image, and «Controller» Assess Data Quality for Nightly Processing at Base, drawn from the Image Processing, Difference Imaging, Association, Alert Generation, Moving Object, and Science Data Quality Assessment pipeline packages and connected by «invokes» and «precedes» relationships.]

Process Raw Images to Alerts - Use Case Diagram

3.2.1.1.1.1 Moving Object overlays star <Issue>What happens when a moving object, whose position is predicted correctly by MOPS, passes in front of a star?

3.2.1.1.1.2 When does Template Image get rotated? <Issue>

3.2.1.1.1.3 Prepare for Observing <UseCase>

BASIC COURSE:

Prior to the observing night, a sequence of preparation steps has been performed:

1. All fields F that are possible targets for tonight are obtained from the Scheduler.
2. For each field F:
   1. Identify the current template image T for the field F.
   2. If T is not at Base, fetch it from Archive.
   3. If the version of T is later than the version used for DIASources, delete DIASources for F (note that this implies that a DIASource must be mappable to F).
   4. If DIASources for F does not contain at least the latest N nights, fetch the missing nights from Archive.
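The preparation loop above can be sketched as a pure planning function. Everything here -- the dictionary-based data model, the action tuples, and the treatment of the latest N nights as simple indices 0..N-1 -- is an illustrative assumption, not the DM middleware interface:

```python
def prepare_for_observing(fields, templates_at_base, archive_templates,
                          dia_versions, dia_nights, n_nights):
    """Return the fetch/delete actions needed before an observing night.

    fields            -- candidate field ids from the Scheduler (step 1)
    templates_at_base -- {field: template version held at the Base}
    archive_templates -- {field: current template version at the Archive}
    dia_versions      -- {field: template version used for its DIASources}
    dia_nights        -- {field: set of recent nights of DIASources at Base}
    """
    actions = []
    for f in fields:
        t_version = archive_templates[f]                    # step 2.1
        if templates_at_base.get(f) != t_version:           # step 2.2
            actions.append(("fetch_template", f, t_version))
        if t_version > dia_versions.get(f, t_version):      # step 2.3: stale
            actions.append(("delete_diasources", f))
        missing = set(range(n_nights)) - dia_nights.get(f, set())
        if missing:                                         # step 2.4
            actions.append(("fetch_nights", f, sorted(missing)))
    return actions
```

A field whose template and DIASources are already current contributes no actions, so the function doubles as a readiness check.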

3.2.1.1.1.4 Visit Image Processing <UseCase>

BASIC COURSE:

1. The two raw images from the visit, I1 and I2, are processed through the detrending stages of the Image Processing Pipeline (IPP), producing calibrated images C1 and C2. These have been corrected for instrumental response, had the atmospheric fringe image subtracted, and have been astrometrically and photometrically calibrated in a preliminary way (final calibration will occur at the Archive Center during Data Release processing). These steps make use of the calibration products produced at the Archive Center by the Calibration Pipeline, and an LSST catalog of photometric and astrometric standards.

2. The template image for the field, T, which has been produced at the Archive Center, is subtracted from C1 and C2, producing difference images D1 and D2. Image subtraction requires that the template image be registered to the input image, and the PSFs matched. Note that T incorporates the information from a stack of images, and both cosmic-ray events and moving objects have been removed from it by median filtering. The processing of steps 1 and 2 is logically independent for the two images I1 and I2, which allows the processing for I1 to be completed while I2 is still being exposed.

ISSUE: When does T get rotated to the PA of visit V?

3. Image D2 is registered to D1 and added to it, producing D+. This is the last step of the IPP. Note the implicit assumption that the PSF is stable between the two images of the visit. If this is not true in practice, this step will need a slight modification during commissioning.
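Assuming perfectly registered arrays and eliding PSF matching, steps 1-3 reduce to per-pixel arithmetic. This toy sketch (the detrending model is deliberately simplified to a bias subtraction and flat-field division) shows the data flow from I1, I2 to D+:

```python
import numpy as np

def process_visit(i1, i2, bias, flat, template):
    """Toy version of steps 1-3; registration and PSF matching are elided."""
    # Step 1: instrument-signature removal, producing calibrated images C1, C2
    c1 = (i1 - bias) / flat
    c2 = (i2 - bias) / flat
    # Step 2: template subtraction, producing difference images D1, D2
    d1 = c1 - template
    d2 = c2 - template
    # Step 3: D2 is (notionally) registered to D1 and added, producing D+
    return d1 + d2
```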

3.2.1.2 Alert Production <Activity>

DESCRIPTION: Alert Production - Process raw images to alerts.

ALGORITHM:

1. Compute Coarse Ephemerides for Night
2. Run Alert Production SDQA Monitoring Pipeline
3. Run ISR Pipeline
4. Run CCD Assembly Pipeline
5. Run CRSplit Handling Pipeline
6. Run Image Characterization Pipeline
7. Run AP Difference Imaging Pipeline
8. Run Night MOPS Pipeline
9. Run AP Source Association
10. Run Alert Generation Pipeline
11. Run SDQA Alert Production Summary Pipeline
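A minimal orchestration sketch for this eleven-step sequence, assuming each stage either completes or raises; the `run_stage` callable and stage-name strings stand in for the real pipeline middleware and are illustrative only:

```python
ALERT_PRODUCTION_STAGES = [
    "Compute Coarse Ephemerides for Night",
    "Alert Production SDQA Monitoring Pipeline",
    "ISR Pipeline",
    "CCD Assembly Pipeline",
    "CRSplit Handling Pipeline",
    "Image Characterization Pipeline",
    "AP Difference Imaging Pipeline",
    "Night MOPS Pipeline",
    "AP Source Association",
    "Alert Generation Pipeline",
    "SDQA Alert Production Summary Pipeline",
]

def run_alert_production(run_stage):
    """Run each stage in order; stop and report on the first failure."""
    for stage in ALERT_PRODUCTION_STAGES:
        try:
            run_stage(stage)
        except Exception as exc:
            return ("failed", stage, str(exc))
    return ("ok", None, None)
```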

EXCEPTIONS:

3.2.1.2.1 Process Exposures <ExpansionRegion>

INPUTS:

OUTPUTS:

GIVEN:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.2.2 Data Release Production Subsystem

includes software programs, configuration files, unit tests, component integration tests, and documentation implementing the Data Release Production Pipelines and Application Framework. This WBS element is deployed only at the Archive Center.

DR Production with Control Flow - Activity Diagram

act DR Production with Control Flow

[Activity diagram: from Start DR Production, forks and syncs sequence the following «structured» pipelines, grouped in «parallel» expansion regions (Per-CCD Processing, Per-Detection Processing, Per-Sky Tile Processing), through to End DR Production: ISR Pipeline, CCD Assembly Pipeline, CR Split Handling Pipeline, and Image Characterization Pipeline (from the Image Processing Pipeline); Single Frame Source Measurement Pipeline; PSFMatch, TemplateGen, and Deep Coadd Generation Pipelines (from the Image Coaddition Pipelines); Difference Imaging Pipeline; DayMOPS and Mask Moving Objects Pipelines (from the Moving Object Pipelines (Day and Night)); Deep Detection Pipeline; Source Association, DiaSource Association, and Object Merge/Association Pipelines (from the Association Pipeline); Difference Forced Photometry and Photometric Calibration Pipelines (from the Photometric Calibration Pipeline); Astrometric Model Generation Pipeline (from Astrometric Calibration); and Galaxy Model Generation Pipeline (from the Object Characterization Pipeline). Data flow between pipelines as typed object nodes, e.g. RawImage :Segment, InstrumentCal :Calibration Product, calibratedExposure :Calibrated Exposure, diaSources :DIA Source, template :Template Exposure, deepCoAdds :Coadded Exposure, and FinalObjects :AstroObject.]

DR Production with Control Flow - Activity Diagram

3.2.2.1 Produce a Data Release <UseCase>

DESCRIPTION: Produce a Data Release - reprocess archive exposure data.

BASIC COURSE:

Invoke: Create Data Release Policy
Invoke: Monitor Data Release Progress

Data Release Production Operations Management - Use Case Diagram

uc Data Release Production Operations Management (Data Release Production Subsystem, v1.0, Robyn Allsman)

[Use case diagram: the DR Operator actor is associated with the Create Data Release Policy and Monitor Data Release Progress use cases.]

Data Release Production Operations Management - Use Case Diagram

uc Produce a Data Release

[Use case diagram: the «Controller» Produce a Data Release use case is connected by «invokes» and «precedes» relationships to the «Controller» use cases Acquire Calibration Data, Process Raw Images to Calibrated Images, Detect Sources, Associate Sources with Objects, Detect Deep Sources, Create Deep Coadds, Identify Moving Objects, Mask Moving Objects from Image, Generate Astrometric Models, Recalibrate Data Release Photometry, and Assess Data Quality for Data Release, and to Difference Image Forced Photometry, Generate Galaxy Models, and Measure Single Frame Sources.]

Produce a Data Release - Use Case Diagram

3.2.2.1.1 Create Data Release Policy <UseCase>

BASIC COURSE:

The DR operator enters the parameters required to perform the Data Release Processing:

· Cutoff date for the DR (DRDate)
· Survey area to include
· Processing steps to include

3.2.2.1.2 Monitor Data Release Progress <UseCase>

BASIC COURSE:

3.2.2.2 Data Release Production <Activity>

DESCRIPTION: Data Release Production - reprocess archive exposure data.

ALGORITHM:

1. Run ISR Pipeline
2. Run CCD Assembly Pipeline
3. Run CRSplit Handling Pipeline
4. Run Image Characterization Pipeline
5. Run Single Frame Source Measurement Pipeline
6. Run PSF Match Pipeline
7. Run TemplateGen Pipeline
8. Run Difference Imaging Pipeline
9. Run DayMOPS Pipeline
10. Run Mask Moving Objects Pipeline
11. Run Deep Coadd Generation Pipeline
12. Run Deep Detection Pipeline
13. Run Source Association Pipeline
14. Run DiaSource Association Pipeline
15. Run Difference Forced Photometry Pipeline
16. Run Astrometric Model Generation Pipeline
17. Run Galaxy Model Generation Pipeline
18. Run Object Merge/Association Pipeline
19. Run Photometric Calibration Pipeline

EXCEPTIONS:

DR Production with Object Flow - Activity Diagram

act DR Production with Object Flow

[Activity diagram: the same pipelines as in "DR Production with Control Flow", here shown with object flows routed through «cache» and «datastore» nodes: Raw Image Cache, Instrument Calibration Repository, Post-ISR Image Cache, Assembled Image Cache, Visit Image Cache, Calibrated Exposure Cache, Astrometric Standard Catalog, Photometric Standard Catalog, Exposure Catalog, SkyMapImage Cache, Template Co-Add Store, Detection Co-Add Store, Deep Detection Source Collection, SFM Source Cache, Diff Image Cache, DIA Source Cache, Masked Exposure Cache, Ephemerides Cache, Solar System Catalog, Transient Object Cache, Source Match Cache, Astrometric Model Cache, Object Measurement Cache, Source Catalog, and Object Catalog.]

======== COLOR KEY =========
GREEN - PT1 required;
ORANGE - PT1 stretch; PT2 required;
WHITE - PT3 required
BLUE - External Input required

============== INPUT DATA ==============

Astrometric and Photometric Standards for PT1:
* Sim Data
  * Tim provides criteria for extract cuts to Lynne
  * KT provides DM schema to Lynne
  * Lynne extracts data according to selection criteria and generates export DB according to schema
* CFHTLS Data
  * Tim extracts data according to selection criteria and generates export DB according to schema

=============== INPUT DATA ================

RAW IMAGE Cache:
* Sim Raw Images: see http://dev.lsstcorp.org/trac/wiki/DC3bSimInputData
* CFHTLS Raw Images: see http://dev.lsstcorp.org/trac/wiki/DC3bCFHTLSInputData

======== Source Association =======

PT1: SrcAssoc will use sources provided by SFM.

PT2: SrcAssoc will be modified from the PT1 version to use sources provided by Deep Detection.

NOTE: The output of 'Mask Moving Objects Pipeline' should probably be 'MaskedImage' and not 'CalibratedExposure'.

DR Production with Object Flow - Activity Diagram

3.2.2.2.1 Per-CCD Processing <ExpansionRegion>

Per-CCD Processing is a pipeline orchestration construct which groups pipelines operating on the same temporal or spatial domain or object size.

3.2.2.2.2 Per-Detection Processing <ExpansionRegion>

Per-Detection Processing is a pipeline orchestration construct which groups pipelines operating in the same temporal or spatial domain.

3.2.2.2.3 Per-Sky Tile Processing <ExpansionRegion>

Per-Sky Tile Processing is a pipeline orchestration construct which groups pipelines operating in the same temporal or spatial domain.

3.2.3 Calibration Products Production Subsystem

includes software programs, configuration files, unit tests, component integration tests, and documentation implementing the Calibration Products Production, which provides the following capabilities:

· Pre-sequence Exposure Catalog
· Produce Crosstalk Matrix
· Produce Illumination Correction Exposure
· Produce Master Bias Exposure
· Produce Master Dark Current Exposure
· Produce Master Flat Exposure
· Produce Master Fringe
· Produce Pupil Ghost Exposure

Calibration Products Production - Activity Diagram

act Calibration Products Production (Calibration Products Production Subsystem, v1.0, doug)

[Activity diagram: between Begin Calibration Products Production and End Calibration Products Pipeline, the Pre-sequence Exposure Catalog activity and the Produce Crosstalk Matrix, Produce Illumination Correction Exposure, Produce Master Bias Exposure, Produce Master Dark Current Exposure, Produce Master Flat Exposure, Produce Pupil Ghost Exposure, and Produce Master Fringe pipelines run, joined at Sync Calibration Products Production.]

Calibration Products Production - Activity Diagram

3.2.3.1 Produce Calibration Data Products <UseCase>

DESCRIPTION: Produce Calibration Data Products - generate the calibration products used to remove the instrument signature from exposures.

BASIC COURSE:

Invoke: Acquire Calibration Data

Produce Calibration Data Products - Use Case Diagram

uc Produce Calibration Data Products

[Use case diagram: «Controller» Produce Calibration Data Products invokes «Controller» Acquire Calibration Data and «Controller» Assess Data Quality for Calibration Products (from the Science Data Quality Assessment Pipeline); Acquire Calibration Data invokes the «System» use cases Build Super Flat, Build Illumination Correction, Build Monochromatic Dome Flat, Build Bias Frame, Build Crosstalk Correction Matrix, Build Fringe Frames, Build Bad Pixel Image, Build Dark Image, and Prepare Standards and References.]

Produce Calibration Data Products - Use Case Diagram

3.2.3.1.1 Acquire Calibration Data <UseCase>

Controller for running the Calibration Pipeline

BASIC COURSE:

Invoke: Build Illumination Correction
Invoke: Build Super Flat
Invoke: Prepare Standards and References
Invoke: Build Bad Pixel Image
Invoke: Build Bias Frame
Invoke: Build Crosstalk Correction Matrix
Invoke: Build Dark Image
Invoke: Build Monochromatic Dome Flat
Invoke: Build Fringe Frames

ALTERNATE COURSES:

3.2.3.1.1.1 Prepare Standards and References <UseCase>

Prepare Standards and References

BASIC COURSE:

This use case is done only once during commissioning, and then only if references or standards are updated.

The System configures the Photometric Objects Catalog.
The System configures the Astrometric Objects Catalog.
The System configures the Fake Objects Catalog.
The System configures the Solar System Objects Catalog.

ALTERNATE COURSES:

3.2.3.2 Calibration Products Production <Activity>

DESCRIPTION: Calibration Products Pipeline - generate the calibration products used to remove the instrument signature from exposures.

ALGORITHM:

1. Pre-Sequence Exposure Catalog
2. Run Produce Crosstalk Matrix Pipeline
3. Run Produce Illumination Correction Exposure Pipeline
4. Run Produce Master Bias Exposure Pipeline
5. Run Produce Master Dark Current Exposure Pipeline
6. Run Produce Master Flat Exposure Pipeline
7. Run Produce Master Fringe Pipeline
8. Run Produce Pupil Ghost Exposure Pipeline

EXCEPTIONS:

3.2.4 Association Pipeline

DESCRIPTION: Correlates difference sources with known objects -- either fixed or moving objects predicted to be within the FOV -- to determine whether new objects have been detected, in which case a new object is created, stored, and made available for association in future visits. The associations can be used to avoid issuing an alert for a difference source matching a known moving, variable, or transient object.

Nightly Source Association - Activity Diagram

act Nightly Source Association

[Activity diagram: the «structured» Night Source Association activity runs from Start Nightly Source Association to End Nightly Source Association through the stages Preprocess AstroObject Catalog, Calculate Object Zone Indices for Visit, Match DiaSources to AstroObjects, and Update AstroObject and DiaSource Catalogs.]

Nightly Source Association - Activity Diagram

act Object Merge/Association Pipeline

[Activity diagram: the «structured» Object Merge/Association Pipeline runs from Start Object/Merge Association Pipeline to End Object/Merge Association Pipeline through Acquire Visit metadata, Run NightMOPS, Associate Moving Objects to Sources, Merge Source/Object Pairs, and Associate Sources with Transient Objects.]

Object Merge/Association Pipeline - Activity Diagram

act Source Association

[Activity diagram: the «structured» Source Association Pipeline takes sources :Source and DeepDetection :Source Collection as inputs and produces associatedSources :Source, running from Start SourceAssociation to End SourceAssociation through Acquire Metadata, Source Clustering, and Source Cluster Attributes.]

Source Association - Activity Diagram

3.2.4.1 Associate Visit Sources with Objects <UseCase>

DESCRIPTION: Associate Visit Sources with Objects -

BASIC COURSE:

1. The incoming DIA Source Collection, S+, is cross matched with the MOPS Predicted Source Collection. The ID of the matching MOPS source (which may be NULL) is entered into the record for each Source. Note that both here and in Step 2 the difference image context makes magnitude information useless in performing the cross-match; it must be based on position alone.

2. The DIA Source Collection, S+, is cross matched with the Object Catalog. The ID of the matching Object (which may be NULL) is entered into the record for each Source.

3. The DIA Source Catalog and Object Catalog are updated: If the Object ID for a source in S+ is NULL, a new entry is made for it in the Object Catalog and the resulting ID set in S+. The collection S+ is added to the DIA Source Catalog. Note that as a moving object moves across the sky, new Objects will be formed each time it is detected. These Object entries will be pruned after MOPS processing is complete, and the Sources will be linked instead to a single entry in the MovingObject Catalog (NOTE possible DB implementation issues)

4. The sources in S+, all of which now have an associated object, are passed to the Source Router, which implements these rules:
a. Negative excursions are sent to Alert Processing.
b. Positive excursions with a NULL MOPS ID are sent to Alert Processing; those with a non-NULL MOPS ID are sent to MOPS. If the object associated with the source is not classified as a variable, the source is sent to MOPS.
c. Fast movers are sent to MOPS.
d. Flashes are sent to Alert Processing.
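The router rules a-d can be captured in a small dispatch function. The source representation here (an excursion kind plus a MOPS ID and a variability flag) is a hypothetical simplification of the real Source record, not the DM schema:

```python
def route_source(kind, mops_id=None, object_is_variable=False):
    """Return the set of destinations for one associated source."""
    dests = set()
    if kind == "negative":                 # rule a
        dests.add("ALERT")
    elif kind == "positive":               # rule b
        dests.add("ALERT" if mops_id is None else "MOPS")
        if not object_is_variable:         # non-variable objects also go to MOPS
            dests.add("MOPS")
    elif kind == "fast_mover":             # rule c
        dests.add("MOPS")
    elif kind == "flash":                  # rule d
        dests.add("ALERT")
    return dests
```

Note that a positive excursion can legitimately be routed to both destinations, which is why the function returns a set rather than a single value.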

3.2.4.2 Associate Sources with Objects <UseCase>

DESCRIPTION: Associate Sources with Objects

BASIC COURSE:

Run for each visit (2 successive exposures) so that the Moving Object Pipeline can detect and remove moving objects. All further steps assume that there are no moving objects left.

Invoke "Run the Moving Object Pipeline"

Invoke "Associate Subtracted Image Catalog"

3.2.4.2.1 Number of Object Catalogs <Issue>Do we require a separate master catalog for Subtracted Images and non-subtracted images, or a

3.2.4.3 Night Source Association <Activity>

DESCRIPTION: Night Source Association Pipeline - Correlates sources with known objects -- either fixed or moving objects predicted to be within the FOV -- to determine whether new sources have been detected.

INPUTS:
- DiaSource

OUTPUTS:
- updated AstroObject catalog
- updated DiaSource catalog

ALGORITHM (PT1):
1. Process stage: Preprocess AstroObject Catalog

2. Process stage: Calculate Object Zone Indices for Visit

3. Process stage: Match DiaSources to AstroObjects

4. Process stage: Update AstroObject and DiaSource Catalogs

EXCEPTIONS:
1. NightSourceAssoc policy not found.

2. Stage not found

3. Stage terminated in error
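The central "Match DiaSources to AstroObjects" stage amounts to a positional nearest-neighbour search within a policy-supplied tolerance. This flat-sky toy (real matching would use the zone index and proper spherical distances; all names are illustrative) sketches it, with unmatched sources mapping to None so the update stage can create new AstroObjects:

```python
import math

def match_diasources(dia_sources, astro_objects, tol_deg):
    """Match each (id, ra, dec) DIA source to the nearest object within tol_deg."""
    matches = {}
    for sid, sra, sdec in dia_sources:
        best, best_d = None, tol_deg
        for oid, ora, odec in astro_objects:
            # Small-angle, flat-sky distance with an RA cos(dec) correction
            d = math.hypot((sra - ora) * math.cos(math.radians(sdec)),
                           sdec - odec)
            if d <= best_d:
                best, best_d = oid, d
        matches[sid] = best                # None => candidate new object
    return matches
```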

Nightly Source Association Pipeline Robustness Diagram - Analysis Diagram

analysis Nightly Source Association Pipeline Robustness Diagram

[Robustness diagram (WBS 02C.03.02 Association Pipeline; author Tim Axelrod, version 1.0, created 4/10/2009 3:43:53 PM, updated 2/15/2011 8:27:36 AM): on a "new FOV received" event from the OCS Event Queue, after night MOPS and detection are complete, the controllers Prepopulate Sky Tiles with Astro Objects, Find Tiles that Overlap Field of View, Match DIA Sources to AstroObjects, and Update AstroObject and DIA Source Catalogs operate on the AstroObject Catalog (sorted by RA into a sorted AstroObject Collection), the Declination Stripe Tessellation (via Build Destination Stripe Tesselation), an Ephemeris Collection, the Sky Tile Collection, the Field of View, the DIA Source Collection, and the DIA Source Catalog. The Association Policy Reader obtains the tile size and matching tolerance from the Association Policy, and status is reported via the Pipeline Event Queue. The description, inputs, outputs, algorithm, and exceptions are those of the Night Source Association activity above.]

Nightly Source Association Pipeline Robustness Diagram - Analysis Diagram

3.2.4.3.1 Calculate Object Zone Indices for Visit <Activity>


Calculate Object Zone Indices for Visit (controller: Find Tiles that Overlap Field of View)

INPUTS:
- visitId of current visit.
- RA of the current telescope pointing.
- Dec of the current telescope pointing.
- MJD of current exposure.
- Name of filter of current exposure ('u', 'g', 'r', 'i', 'z', or 'y').

OUTPUTS:
- Zone index for objects in the FOV.

GIVEN (from policy):
- Size of LSST FOV (radius).
- Sky partitioning parameters.
- Location of chunk files/chunk delta files.

ALGORITHM:
Given the bounding circle for the current FOV and the sky partitioning parameters, compute the set of chunks overlapping the FOV.
For each chunk C overlapping the FOV:
    Read chunk file for C.
    Read chunk delta file for C.
end

Create zone index for objects (used for spatial matching in "Process Visit").
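The zone index built in this step can be sketched as follows (illustrative Python; the zone height, data layout, and function names are assumptions, not the actual LSST association code):

```python
# Illustrative sketch of a zone index: objects are binned into fixed-height
# declination zones; within each zone they are kept sorted by RA, so a
# radius search only touches a few zones and a narrow RA interval.
# Zone height and tuple layout are assumptions, not the LSST implementation.
from bisect import insort
from collections import defaultdict

ZONE_HEIGHT_DEG = 0.05  # assumed zone height (a policy parameter in practice)

def zone_of(dec_deg, zone_height=ZONE_HEIGHT_DEG):
    """Zone number for a declination in [-90, 90] degrees."""
    return int((dec_deg + 90.0) // zone_height)

def build_zone_index(objects, zone_height=ZONE_HEIGHT_DEG):
    """objects: iterable of (objectId, ra_deg, dec_deg).
    Returns {zone: [(ra, dec, objectId), ...]} with each list RA-sorted."""
    index = defaultdict(list)
    for obj_id, ra, dec in objects:
        insort(index[zone_of(dec, zone_height)], (ra, dec, obj_id))
    return index

index = build_zone_index([(1, 10.2, -5.02), (2, 10.0, -5.01), (3, 200.0, 40.0)])
```

Objects 1 and 2 land in the same zone (and are stored RA-sorted within it); object 3 lands in a distant zone and is never touched by a search near the first two.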

3.2.4.3.2 Match DiaSources to AstroObjects <Activity>

Match DIA Sources to AstroObjects (controller: Match DIA Sources to AstroObjects)

INPUTS:
- (optional) Match radius R for difference source to object matching.
- Difference sources for visit from detection pipeline.
- Moving object predictions for visit from moving object pipeline.
- Zone index for objects from "Prepare for Visit".

OUTPUTS:
- List of difference source to object matches.
- List of moving object prediction to difference source matches.
- List of difference sources to create new objects from.

GIVEN (from policy):
- Default match radius R for difference source to object matching.
- Maximum semi-major axis length for moving object prediction error ellipses.
- Clamp values for semi-major and semi-minor axis lengths.


ALGORITHM:
Wait for event to signal detection pipeline completion for current visit.
Read difference sources for current visit from database.
Create zone index for difference sources.
For each difference source S, find all objects within distance R of S (using zone indexes for difference sources and objects).
Write list of difference source to object matches to database.
Wait for event to signal moving object pipeline completion for current visit.
Read moving object predictions from database.
Discard predictions with error ellipses that are too large.
Clamp semi-major/semi-minor axis lengths of prediction error ellipse (DC2 only).
For each remaining moving object prediction P, find all difference sources that did not match a known variable object and are within the positional error ellipse of P.
Write list of moving object prediction to difference source matches to database.

Write list of difference sources which did not match any object to the database.
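The per-source matching loop above can be sketched with a zone index (illustrative Python; the flat-sky small-angle distance, zone layout, and names are assumptions, and RA wrap-around is ignored for brevity):

```python
# Illustrative zone-index radius match: for each difference source, scan only
# the declination zones that can hold neighbours within R, then an RA-sorted
# slice of each zone. Not the LSST code; names and layouts are assumptions.
import math
from bisect import bisect_left, bisect_right

ZONE_HEIGHT = 0.05  # degrees, assumed

def zone_of(dec_deg):
    return int((dec_deg + 90.0) // ZONE_HEIGHT)

def match_source(src_ra, src_dec, index, radius_deg):
    """Return objectIds within radius_deg of (src_ra, src_dec)."""
    matches = []
    z_lo = zone_of(src_dec - radius_deg)
    z_hi = zone_of(src_dec + radius_deg)
    # RA search window widens toward the poles
    dra = radius_deg / max(math.cos(math.radians(src_dec)), 1e-9)
    for z in range(z_lo, z_hi + 1):
        zone = index.get(z, [])
        lo = bisect_left(zone, (src_ra - dra,))
        hi = bisect_right(zone, (src_ra + dra,))
        for ra, dec, obj_id in zone[lo:hi]:
            dist = math.hypot((ra - src_ra) * math.cos(math.radians(src_dec)),
                              dec - src_dec)
            if dist <= radius_deg:
                matches.append(obj_id)
    return matches

# toy index: {zone: RA-sorted [(ra, dec, objectId), ...]}
toy_index = {}
for obj_id, ra, dec in [(1, 10.00, -5.010), (2, 10.02, -5.012), (3, 50.0, -5.010)]:
    toy_index.setdefault(zone_of(dec), []).append((ra, dec, obj_id))
for zone in toy_index.values():
    zone.sort()

hits = match_source(10.01, -5.011, toy_index, radius_deg=0.05)
```

Only the two nearby objects are returned; the third shares a zone but falls outside the RA window, so its distance is never computed.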

3.2.4.3.3 Preprocess AstroObject Catalog <Activity>

Preprocess AstroObject Catalog

INPUTS:
- Object catalog from deep detection.
- Height of stripes (H).
- Width of chunks.

OUTPUTS:
- Chunk files containing object attributes necessary for association.

ALGORITHM:
Create a Declination Stripe Tessellation of height H.
For each stripe S:
    Retrieve objects inside S.
    Sort retrieved objects in right ascension.
    For each object in S, in order of right ascension:
        Determine the chunk the object belongs to.
        Append values of objectId, ra, decl, and (u|g|r|i|z|y)VarProb to the chunk file.
    end
end

NOTES: H is currently chosen such that ~100 chunks overlap the full 10 deg² LSST FOV. This minimizes I/O for objects lying outside a visit FOV and allows for parallelization of chunk file reads/writes. This step does not need to be run every night; it must be run at least once per deep detection cycle.
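The stripe/chunk partitioning described above can be sketched as follows (illustrative Python; the stripe height, chunk width, and function names are stand-ins for the policy-driven values, not the LSST implementation):

```python
# Illustrative declination-stripe/chunk partitioning: the sky is cut into
# fixed-height declination stripes, each stripe into chunks of roughly fixed
# angular width. Parameter values here are assumptions.
import math

STRIPE_HEIGHT_DEG = 1.4  # assumed H
CHUNK_WIDTH_DEG = 1.4    # assumed nominal chunk width at the equator

def stripe_of(dec_deg, h=STRIPE_HEIGHT_DEG):
    """Declination stripe index for dec in [-90, 90]."""
    return int((dec_deg + 90.0) // h)

def chunks_in_stripe(stripe, h=STRIPE_HEIGHT_DEG, w=CHUNK_WIDTH_DEG):
    """Chunks per stripe: fewer near the poles, where RA circles shrink."""
    dec_mid = -90.0 + (stripe + 0.5) * h
    return max(1, math.ceil(360.0 * math.cos(math.radians(dec_mid)) / w))

def chunk_of(ra_deg, dec_deg):
    """(stripe, chunk) that a position falls in."""
    stripe = stripe_of(dec_deg)
    n = chunks_in_stripe(stripe)
    return stripe, int(ra_deg // (360.0 / n)) % n
```

Varying the chunk count per stripe keeps chunks roughly equal-area, which is what makes per-chunk files a reasonable unit of parallel I/O.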


3.2.4.3.4 Update AstroObject and DiaSource Catalogs <Activity>

Update AstroObject and DIA Source Catalog (controller: Update AstroObject and DIA Source Catalogs)

INPUTS:
- Difference sources for current visit from detection pipeline.
- Moving object predictions for current visit from moving object pipeline.
- List of difference source to object matches (from "Match DIA Sources to AstroObjects") for current visit.
- List of moving object prediction to difference source matches (from "Process Visit") for current visit.
- List of ids for difference sources to create new objects from (from "Process Visit") for current visit.

OUTPUTS:
- Updated historical Object and DIASource catalog.
- Updated chunk delta files.

GIVEN (from policy):
- Sky partitioning parameters.
- Location of chunk delta files.

ALGORITHM:
For each new object:
    Determine the chunk the new object belongs to (using object position and sky partitioning parameters).
    Append values of objectId, ra, decl, and (u|g|r|i|z|y)VarProb for the new object to the chunk delta file.
end
For each chunk overlapping the current FOV:
    Relinquish ownership of the chunk to any interested party (see "Prepare for Visit").
end
For each difference source:
    If the difference source matched 1 or more objects, set the objectId of the difference source to the id of the closest matching object.
    Else set the objectId of the difference source to the id of the object to be created from that source.
end
Append difference sources for current visit to the historical difference source table.
Update the latest observation time and number of observations for each matched object in the historical Object catalog.
Create new objects from difference sources that did not match a historical object.
Append new objects to the historical Object catalog.
Append list of difference source to object matches to a historical table (for debugging).
Append list of moving object prediction to difference source matches to a historical table (for debugging).
Append list of ids for difference sources to create new objects from to a historical table (for debugging).
Drop all per-visit tables created by the association pipeline, detection pipeline, and moving object pipeline.
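The objectId assignment rule in this step (closest match wins; an unmatched source spawns a new object) can be sketched as follows (illustrative Python; data shapes and names are assumptions, not the LSST schema):

```python
# Illustrative objectId assignment: a difference source that matched one or
# more objects takes the id of the closest match; an unmatched source gets
# a freshly allocated object id. Shapes and the id scheme are assumptions.
import itertools

def assign_object_ids(matches_by_source, next_object_id=1000000):
    """matches_by_source: {sourceId: [(distance, objectId), ...]} (may be empty).
    Returns (assignment {sourceId: objectId}, list of newly created objectIds)."""
    counter = itertools.count(next_object_id)
    assignment, new_ids = {}, []
    for src_id, matches in matches_by_source.items():
        if matches:
            assignment[src_id] = min(matches)[1]  # closest match wins
        else:
            obj_id = next(counter)                # unmatched: create a new object
            assignment[src_id] = obj_id
            new_ids.append(obj_id)
    return assignment, new_ids

assignment, new_ids = assign_object_ids({"s1": [(0.2, 7), (0.1, 9)], "s2": []})
```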


3.2.4.3.5 Find Tiles that Overlap Field of View <Class>

Calculate Object Zone Indices for Visit

INPUTS:
- visitId of current visit.
- RA of the current telescope pointing.
- Dec of the current telescope pointing.
- MJD of current exposure.
- Name of filter of current exposure ('u', 'g', 'r', 'i', 'z', or 'y').

OUTPUTS:
- Zone index for objects in the FOV.

GIVEN (from policy):
- Size of LSST FOV (radius).
- Sky partitioning parameters.
- Location of chunk files/chunk delta files.

ALGORITHM:

Given bounding circle for current FOV and sky partitioning parameters, compute the set of chunks overlapping the FOV.

For each chunk C overlapping the FOV:
    Read chunk file for C.
    Read chunk delta file for C.
end

Create zone index for objects (used for spatial matching in "Process Visit").

EXCEPTIONS:

NOTES:

3.2.4.3.6 Match DIA Sources to AstroObjects <Class>

Match DIA Sources to AstroObjects

INPUTS:
- (optional) Match radius R for difference source to object matching.
- Difference sources for visit from detection pipeline.
- Moving object predictions for visit from moving object pipeline.
- Zone index for objects from "Prepare for Visit".

OUTPUTS:
- List of difference source to object matches.
- List of moving object prediction to difference source matches.
- List of difference sources to create new objects from.

GIVEN (from policy):
- Default match radius R for difference source to object matching.
- Maximum semi-major axis length for moving object prediction error ellipses.
- Clamp values for semi-major and semi-minor axis lengths.

ALGORITHM:

Wait for event to signal detection pipeline completion for current visit.
Read difference sources for current visit from database.
Create zone index for difference sources.
For each difference source S, find all objects within distance R of S (using zone indexes for difference sources and objects).
Write list of difference source to object matches to database.
Wait for event to signal moving object pipeline completion for current visit.
Read moving object predictions from database.
Discard predictions with error ellipses that are too large.
Clamp semi-major/semi-minor axis lengths of prediction error ellipse (DC2 only).
For each remaining moving object prediction P, find all difference sources that did not match a known variable object and are within the positional error ellipse of P.
Write list of moving object prediction to difference source matches to database.
Write list of difference sources which did not match any object to the database.

EXCEPTIONS:

NOTES:

3.2.4.3.7 Prepopulate Sky Tiles with Astro Objects <Class>

Preprocess AstroObject Catalog

INPUTS:
- Object catalog from deep detection.
- Height of stripes (H).
- Width of chunks.

OUTPUTS:
- Chunk files containing object attributes necessary for association.

GIVEN:


ALGORITHM:

Create a Declination Stripe Tessellation of height H.

For each stripe S:
    Retrieve objects inside S.
    Sort retrieved objects in right ascension.
    For each object in S, in order of right ascension:
        Determine the chunk the object belongs to.
        Append values of objectId, ra, decl, and (u|g|r|i|z|y)VarProb to the chunk file.
    end
end

EXCEPTIONS:

NOTES: H is currently chosen such that ~100 chunks overlap the full 10 deg² LSST FOV. This minimizes I/O for objects lying outside a visit FOV and allows for parallelization of chunk file reads/writes. This step does not need to be run every night; it must be run at least once per deep detection cycle.

3.2.4.3.8 Update AstroObject and DIA Source Catalogs <Class>

Update AstroObject and DIA Source Catalog

INPUTS:
- Difference sources for current visit from detection pipeline.
- Moving object predictions for current visit from moving object pipeline.
- List of difference source to object matches (from "Match DIA Sources to AstroObjects") for current visit.
- List of moving object prediction to difference source matches (from "Process Visit") for current visit.
- List of ids for difference sources to create new objects from (from "Process Visit") for current visit.

OUTPUTS:
- Updated historical Object and DIASource catalog.
- Updated chunk delta files.

GIVEN (from policy):
- Sky partitioning parameters.
- Location of chunk delta files.

ALGORITHM:


For each new object:
    Determine the chunk the new object belongs to (using object position and sky partitioning parameters).
    Append values of objectId, ra, decl, and (u|g|r|i|z|y)VarProb for the new object to the chunk delta file.
end

For each chunk overlapping the current FOV:
    Relinquish ownership of the chunk to any interested party (see "Prepare for Visit").
end

For each difference source:
    If the difference source matched 1 or more objects, set the objectId of the difference source to the id of the closest matching object.
    Else set the objectId of the difference source to the id of the object to be created from that source.
end

Append difference sources for current visit to the historical difference source table.
Update the latest observation time and number of observations for each matched object in the historical Object catalog.
Create new objects from difference sources that did not match a historical object.
Append new objects to the historical Object catalog.
Append list of difference source to object matches to a historical table (for debugging).
Append list of moving object prediction to difference source matches to a historical table (for debugging).
Append list of ids for difference sources to create new objects from to a historical table (for debugging).
Drop all per-visit tables created by the association pipeline, detection pipeline, and moving object pipeline.

EXCEPTIONS:

NOTES:

3.2.4.3.9 Build Destination Stripe Tessellation <Object>

Initializes a new Declination Stripe Tessellation.

3.2.4.4 DiaSource Association Pipeline <Activity>

DESCRIPTION: DiaSource Association (DiaSourceAssoc) Pipeline handles association of DiaSources that have not already been associated with MovingObjects by DayMOPS. It associates them with prototype Objects from Deep Detection and Multifit. Any unassociated DiaSources are taken to be transients and new Objects are created for them.

ALGORITHM:
1. Process stage: ObjectDiaSource


2. .......tbd......

EXCEPTIONS:
1. DiaSourceAssoc policy not found.

2. Stage not found

3. Stage terminated in error

3.2.4.5 Object Merge/Association Pipeline <Activity>

DESCRIPTION: Object Merge/Association Pipeline (formerly ObjectAssoc) - Associates transient objects created by DiaSourceAssoc with Sources (via a spatial match), and Sources with moving objects. Merges transient, moving, and deep-detection objects together with object measurements and astrometry to produce final objects for ingestion into the data release Object catalogs.

INPUTS:
* movingObjects (Solar System Object)
* objectMeasurements (AstroObjectModel)
* objectAstrometricModel (Astrometric Model)
* sources (Source)
* transientObjects (Detection Collection)

OUTPUTS:
* finalObjects (AstroObject)

ALGORITHM:
1. Process stage: Acquire visit metadata
2. Process stage: Run nightMops Pipeline
3. Process stage: Associate Moving Objects to Sources (MovingObjectSource)

EXCEPTIONS:
1. MopsSrcAssoc policy not found.

2. Stage not found

3. Stage terminated in error


NOTES (Serge): There's a non-zero chance that we may want to re-associate sources with non-transient objects (as in SourceAssociation).

Why? Because after multi-fit, my understanding is that we'll have some kind of proper motion measurements for stars, and we could presumably then produce more accurate associations.

Also, SourceAssociation knows nothing about moving objects or transient objects (I'm not sure whether SourceAssociation can count on the sources getting passed in to have been cleaned of moving object sightings). If not, it could for example assign a source to a non-moving object that will later also get associated to a moving object.

3.2.4.6 Source Association Pipeline <Activity>

DESCRIPTION: Source Association Pipeline - Correlates difference sources with known objects -- either fixed or moving objects predicted to be within the FOV -- to determine whether new objects have been detected, in which case a new object is created, stored, and made available for association in future visits. The associations can be used to avoid issuing an alert for a difference source matching a known moving, variable, or transient object.

Until deep detection is implemented, Source Association Phase 1 creates Objects by clustering sources.

INPUTS:
- Source
- Detection

OUTPUTS:
- Astrometric model for each Object

ALGORITHM (PT1):
1. Process stage: Acquire Metadata
2. Process stage: Source Clustering
3. Process stage: Source Cluster Attributes

EXCEPTIONS:
1. SourceAssoc policy not found.
2. Stage not found.
3. Stage terminated in error.

3.2.4.6.1 Acquire Metadata <Activity>

ALGORITHM:


EXCEPTIONS:

3.2.4.6.2 Source Cluster Attributes <Activity>

INPUTS:
- Clustered sources
- 'Bad' sources

OUTPUT:
- Objects

ALGORITHM:
1. Compute SourceClusterAttributes for each input cluster.
2. Create source clusters for bad sources.
3. Compute SourceClusterAttributes for each bad source cluster.

EXCEPTIONS:

3.2.4.6.3 Source Clustering <Activity>

The algorithm visits each source S in the sky-tile (in some arbitrary order). If the epsilon neighborhood of a source S contains at least MinPts other sources and S has not already been placed into a cluster:
1. Create a new cluster C.
2. Add all the epsilon-neighbors of S that don't already belong to a cluster to C.
3. Recursively perform step 2 for each epsilon-neighbor S' of S whose epsilon neighborhood contains at least MinPts other sources.

If the epsilon neighborhood of S contains fewer than MinPts other sources, S is called a noise source and is discarded.
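This clustering rule is essentially DBSCAN; a minimal sketch follows, assuming planar Euclidean distances (the pipeline works on the sphere) and illustrative names:

```python
# Illustrative DBSCAN-style clustering: a source with at least min_pts other
# sources in its epsilon neighborhood seeds a cluster that grows through
# other such "core" sources; sources with no dense neighborhood are noise.
import math

def neighbors(i, pts, eps):
    return [j for j, q in enumerate(pts)
            if j != i and math.dist(pts[i], q) <= eps]

def cluster_sources(pts, eps, min_pts):
    """Return (clusters, noise) as lists of point indices."""
    labels = {}
    clusters = []
    for i in range(len(pts)):
        if i in labels or len(neighbors(i, pts, eps)) < min_pts:
            continue
        cluster, stack = [], [i]
        while stack:                      # iterative form of the recursion
            j = stack.pop()
            if j in labels:
                continue
            labels[j] = len(clusters)
            cluster.append(j)
            nbrs = neighbors(j, pts, eps)
            if len(nbrs) >= min_pts:      # only core sources keep expanding
                stack.extend(nbrs)
        clusters.append(sorted(cluster))
    noise = [i for i in range(len(pts)) if i not in labels]
    return clusters, noise

pts = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0), (5.0, 5.0)]
clusters, noise = cluster_sources(pts, eps=0.15, min_pts=1)
```

The first three points chain into one cluster through overlapping epsilon neighborhoods; the isolated point has an empty neighborhood and is reported as noise.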

INPUTS:
- List of source sets

OUTPUT:
- List of invalid sources (out of range)
- List of bad sources


ALGORITHM:
1. Remove sources with invalid positions.
2. Discard sources outside the current sky-tile.
3. Segregate 'bad' sources; keep for reporting.
4. Cluster remaining sources.

EXCEPTIONS:

3.2.5 Astrometric Calibration

includes software programs, configuration files, unit tests, component integration tests, and documentation implementing the Astrometric Calibration Pipeline with the following capabilities:

· Associate each Source with a Deep Detection Object
· Calibrate Object Astrometry

Astrometric Calibration - Activity Diagram


act Astrometric Calibration: Start Astrometric Calibration -> «structured» Astrometric Model Generation Pipeline (Acquire Sky Tile Metadata -> Associate Detections and Sources -> Calibrate Astrometry; pins: Sources :Source in, Model :AstrometricModel out) -> End Astrometric Calibration.

3.2.5.1 Generate Astrometric Models <UseCase>

DESCRIPTION: Generate Astrometric Models -

BASIC COURSE:

3.2.5.2 Astrometric Model Generation Pipeline <Activity>


DESCRIPTION: Astrometric Model Generation Pipeline (formerly Astrometric Calibration Pipeline): Each slice operates on all Sources within a given sky tile across all epochs. Before calibration can be performed, each Source must be associated with a deep Detection.

INPUTS:
- Associated Sources, Detections

OUTPUTS:
- Object AstrometricModel

ALGORITHM:
1. Process stage: Acquire Sky Tile Metadata
2. Process stage: Associate Detections and Sources
3. Process stage: Calibrate Astrometry

EXCEPTIONS:
1. AstroCal policy not found.
2. Stage not found.
3. Stage terminated due to error.

3.2.5.2.1 Acquire Sky Tile Metadata <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.5.2.2 Associate Detections and Sources <Activity>

ALGORITHM:
1. Iterate until convergence:
   1.1 Build transforms.
   1.2 Iterate on Exposure per CCD group:
       1.2.1 Apply Transforms.
       1.2.2 Fit for Astrometric Parameters.
   1.3 Evaluate for Convergence.

EXCEPTIONS:
1. Failure to converge.


3.2.5.2.3 Calibrate Astrometry <Activity>

ALGORITHM:
For each object:

- Get the Time/Sky Coordinate List

- Fit the Time/Sky Coordinate List for the astrometric parameters, creating a new Astrometric Model, which contains a goodness-of-fit metric as well as the model parameters

- Update the Object Catalog with the new Astrometric Model

EXCEPTIONS:
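The per-object fit described above can be sketched as a least-squares linear model (reference position plus proper motion) with a chi-squared goodness-of-fit (illustrative Python; parallax terms and spherical geometry are omitted, and all names are assumptions, not the actual AstrometricModel interface):

```python
# Illustrative astrometric fit: least-squares line through each coordinate's
# time series, plus a chi-squared goodness-of-fit stored with the parameters.

def fit_line(t, y):
    """Least-squares fit y = y0 + mu * t; returns (y0, mu)."""
    n = len(t)
    st, sy = sum(t), sum(y)
    stt = sum(ti * ti for ti in t)
    sty = sum(ti * yi for ti, yi in zip(t, y))
    mu = (n * sty - st * sy) / (n * stt - st * st)
    return (sy - mu * st) / n, mu

def fit_astrometric_model(epochs):
    """epochs: [(mjd, ra_deg, dec_deg), ...] for one object."""
    t = [e[0] for e in epochs]
    ra0, mu_ra = fit_line(t, [e[1] for e in epochs])
    dec0, mu_dec = fit_line(t, [e[2] for e in epochs])
    chi2 = sum((ra - (ra0 + mu_ra * ti)) ** 2 + (dec - (dec0 + mu_dec * ti)) ** 2
               for ti, ra, dec in epochs)
    return {"ra0": ra0, "mu_ra": mu_ra, "dec0": dec0, "mu_dec": mu_dec,
            "chi2": chi2}

model = fit_astrometric_model([(0.0, 10.0, 5.0), (1.0, 10.001, 5.0),
                               (2.0, 10.002, 5.0)])
```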

3.2.6 Deep Detection Pipeline

includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Generate Chi-Squared Co-add
· Detect Bright Sources
· Measure Bright Sources (position, brightness, shape, orientation, errors on those parameters)
· Populate Object Catalog

Deep Detection - Activity Diagram


act Deep Detection: Start DeepDetection -> «structured» Deep Detection Pipeline (Acquire Sky Tile Metadata -> Generate ChiSquared Coadd (from Image Coaddition Pipelines) -> Detect Sources -> Measure Sources) -> End DeepDetection.

3.2.6.1 Detect Deep Sources <UseCase>

DESCRIPTION: Detect Deep Sources -

BASIC COURSE:


- Optimally combine information from multiple images of a field to yield object detections to maximal depth.

- Generate detailed shape measures.

- Make best possible measurements of multiband photometric magnitudes to enable accurate Photo-Z

- Measure proper motions and parallaxes

- Populate the Object Catalog with the results

3.2.6.2 Deep Detection Pipeline <Activity>

DESCRIPTION: Deep Detection Pipeline (DeepDet) handles the generation of the deep panchromatic chi-square co-add and detection and measurement on that co-add. Each slice operates on a single sky tile.

INPUTS:
- calibratedExposure, PSF (optional)

OUTPUTS:
- coadd, weightMap

ALGORITHM:
1. Process stage: Acquire SkyTileMetadata
2. Process stage: Generate ChiSquared Coadd
3. Process stage: Detect Sources
4. Process stage: Measure Sources

EXCEPTIONS:
1. DeepDet policy not found.
2. Stage not found.
3. Stage terminated in error.
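The panchromatic chi-squared co-add at the heart of this pipeline can be sketched as follows (illustrative Python on plain nested lists; the real pipeline operates on PSF-matched, background-subtracted, noise-calibrated Exposures):

```python
# Illustrative chi-squared co-add: each per-band image is divided by its
# noise, squared, and summed pixel-by-pixel, so a source significant in ANY
# band stands out in the combined image.

def chi_squared_coadd(images, sigmas):
    """images: list of equally shaped 2-D lists (one per band);
    sigmas: per-band noise levels. Returns the chi-squared image."""
    ny, nx = len(images[0]), len(images[0][0])
    coadd = [[0.0] * nx for _ in range(ny)]
    for img, sigma in zip(images, sigmas):
        for y in range(ny):
            for x in range(nx):
                coadd[y][x] += (img[y][x] / sigma) ** 2
    return coadd

# two 1x2 "bands": the second pixel holds signal in both bands
coadd = chi_squared_coadd([[[0.0, 2.0]], [[0.0, 3.0]]], sigmas=[1.0, 1.0])
```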


3.2.6.2.1 Acquire Sky Tile Metadata <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.6.2.2 Detect Sources <Activity>

INPUTS:
- Calibrated science Exposure(s) (including background)

OUTPUTS:
- Background-subtracted Exposure used in the detection
- Measured background
- PSF used to smooth the exposure before detection
- PositiveFootprintSet
- NegativeFootprintSet

ALGORITHM:
1. If more than one input exposure, create an ExposureStack.

2. Create background Exposure from input Exposure (or ExposureStack)

3. Make a smoothing PSF according to the PSF policy

4. Perform detections using backgroundExposure, PSF and Detection Policy generating FootprintSets

5. Copy detectionMask to the input Exposures

EXCEPTIONS:
1. Failure to find exposure: exit in error.
2. Failure to find PSF policy.
3. Failure to find SourceDetection policy.
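The detect step (smooth with a PSF-like kernel, then collect connected pixel regions above a positive threshold and below a negative one) can be sketched in one dimension (illustrative Python; the boxcar kernel and 1-D footprints are simplifications of the real PSF convolution and 2-D FootprintSets):

```python
# Illustrative detection: smooth, then find connected above-threshold runs
# (footprints); a negative threshold yields the NegativeFootprintSet analogue.

def smooth(pixels, half_width=1):
    """Boxcar stand-in for convolution with a PSF-like kernel."""
    n = len(pixels)
    out = []
    for i in range(n):
        window = pixels[max(0, i - half_width):min(n, i + half_width + 1)]
        out.append(sum(window) / len(window))
    return out

def footprints(pixels, threshold):
    """Connected runs above threshold (below it, if threshold < 0)."""
    above = [(p > threshold) if threshold >= 0 else (p < threshold)
             for p in pixels]
    spans, start = [], None
    for i, flag in enumerate(above + [False]):  # sentinel closes a trailing run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            spans.append((start, i - 1))
            start = None
    return spans

smoothed = smooth([0, 0, 9, 9, 0, 0, -9, -9, 0])
positive = footprints(smoothed, 2.0)   # PositiveFootprintSet analogue
negative = footprints(smoothed, -2.0)  # NegativeFootprintSet analogue
```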

3.2.6.2.3 Measure Sources <Activity>

INPUTS:- Exposure- Footprint Collection

OUTPUTS:- Source Collection

45

Page 59: Introduction - DocuShare · Web view3.3.2.1.5.2Checkpoint/Restart within Processing Step 209 3.3.2.1.6Clean Up after Execution 209 3.3.2.1.7Configure

LSST Data Management UML Use Case and Activity Model LDM-134

7/12/2011

ALGORITHM (PT1):
1. Merge positive and negative detection sets (aka Footprint Collection).
2. Determine Measure Algorithm to use from Measure Policy.
3. Measure Sources using Exposure, PSF, Footprint Collection (is this done via the pre-PT1 method below?).

EXCEPTIONS:
1. Measure Policy file not found.
2. Exposure file not found.
3. Footprint Collection not found.
4. PSF not found.

Pre-PT1:
1. The WCS is extracted from the Exposure.
2. For each Footprint in the Footprint Collection:
   2.1 Measure its position and flux in the Exposure.
   2.2 Create a new Source to contain the measured properties.
   2.3 Calculate the ra and dec from the pixel coordinates using the WCS.
   2.4 Add the Source to the Source Collection.

NOTES: An Exposure may be a DIAExposure. A Source may be a DIASource.
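The pixel-to-sky conversion in step 2.3 of the pre-PT1 course can be sketched with a plain linear CD-matrix WCS (illustrative Python; the real WCS uses a TAN projection, and every value here is made up):

```python
# Illustrative linear WCS: offset from the reference pixel CRPIX, scaled by
# the CD matrix (degrees/pixel), added to the reference sky position CRVAL.

def pixel_to_sky(x, y, wcs):
    """Return (ra, dec) in degrees for a pixel position."""
    dx, dy = x - wcs["crpix"][0], y - wcs["crpix"][1]
    cd = wcs["cd"]
    ra = wcs["crval"][0] + cd[0][0] * dx + cd[0][1] * dy
    dec = wcs["crval"][1] + cd[1][0] * dx + cd[1][1] * dy
    return ra, dec

wcs = {"crpix": (2000.0, 2000.0),
       "crval": (150.0, 2.5),
       "cd": [[-5.6e-5, 0.0],      # ~0.2 arcsec/pixel, RA axis flipped
              [0.0, 5.6e-5]]}
ra, dec = pixel_to_sky(2100.0, 1900.0, wcs)
```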

3.2.7 Difference Imaging Pipeline

includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Subtract Template and Science Exposure
· Detect Difference Image Sources (DIASources)
· Measure DIASources

AP Difference Imaging Pipeline - Activity Diagram


act AP Difference Imaging Pipeline: Start AP Difference Imaging Pipeline -> «structured» AP Difference Imaging Pipeline -> «structured» Difference Imaging Pipeline -> End AP Difference Imaging Pipeline.

3.2.7.1 Detect Sources in a Visit <UseCase>

DESCRIPTION: Detect Sources in a Visit -

BASIC COURSE:
- The Detection Pipeline (DP) begins by detecting sources in D+, producing source collection S+. Note that the input image is a difference image, so the sources in it (which may have negative flux in some cases) are mostly point sources or cosmic rays, the principal exception being streaks from rapidly moving objects.

- The source collection S+, and the images D1 and D2, are used by the next processing stage, “Classify Sources”, in the DP to classify the type of the sources. To do this, the information in S+ is used to extract for each source a small subimage from each of D1 and D2 containing the source. The subimages are saved in an image collection for later use. The source fluxes and shapes in these subimages are analyzed according to the scheme in Table XX. Cosmic rays are recognized as such by their shape in a single image or, if present in both images, because they have shapes which do not match. A small fraction of cosmic rays will result in a PSF-like shape. If present in only a single image, these will be misclassified as a Flash. As a result, if Alerts are generated from Flashes a significant noise level will be present.

- The probability of two independent cosmic rays producing a PSF shape at the same location in the image is very small (the actual probability will not be known until commissioning data is available), and therefore the resulting contamination of “Positive Excursions” will also be small. Cosmic ray sources are removed from the DIA Source Collection and placed in the Junk Source Collection. The Junk Source Collection is utilized by the DQA system and periodically emptied. The remaining sources in the DIA Source Collection are tagged with their type and passed to the Association Pipeline.

3.2.7.2 Detect Sources <UseCase>

DESCRIPTION: Detect Sources -

BASIC COURSE:

Invoke: Determine Sky.

Invoke: Build PSF Model.

Invoke: Detect Sources.

Invoke: Measure Sources.

Invoke: Refine WCS.

Invoke: Determine ZeroPoint.

Invoke: Create Image Source Catalog.

Write PSF map, PSF Model, WCS, Image Mask.

ALTERNATE COURSES:
Any stage fails: throw exception.

Can not store results: throw exception.

3.2.7.3 AP Difference Imaging Pipeline <Activity>

DESCRIPTION: Alert Production Difference Imaging Pipeline - a wrapper around Difference Imaging Pipeline strictly to get around LsstTools limit of one input and one output to an activity. Alert Production uses this wrapper to invoke Difference Imaging Pipeline whereas Data Release Production invokes it directly.

ALGORITHM:


EXCEPTIONS:

3.2.7.4 Difference Imaging Pipeline <Activity>

DESCRIPTION: Difference Imaging Pipeline handles computation of Difference Exposures by subtracting Template Exposures from Exposures. It also detects and measures DiaSources on the resulting Difference Images. Each slice operates on a single CCD Exposure.

INPUTS:
- calibratedExposure
- TemplateExposure

OUTPUTS:
- DifferenceExposure
- DIASources
- Spatial Kernels

ALGORITHM:
1. Process stage: IdentifySkyTile
2. Process stage: Extract Template
3. Process stage: Image Differencing
4. Process stage: Difference Detection
5. Process stage: Difference Measurement

EXCEPTIONS:
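The Image Differencing stage (convolve the warped template with a matching kernel so its PSF matches the science exposure, then subtract) can be sketched in one dimension (illustrative Python; the pipeline solves for a spatially varying kernel rather than assuming a fixed one):

```python
# Illustrative PSF-matched image differencing: convolve the template with a
# matching kernel, then subtract it from the science image. A constant
# source leaves a (near-)zero difference image.

def convolve(signal, kernel):
    """'Same-size' convolution with zero padding at the ends."""
    k2 = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, kv in enumerate(kernel):
            idx = i + j - k2
            if 0 <= idx < len(signal):
                acc += signal[idx] * kv
        out.append(acc)
    return out

def difference_image(science, template, kernel):
    matched = convolve(template, kernel)
    return [s - m for s, m in zip(science, matched)]

template = [0.0, 0.0, 4.0, 0.0, 0.0]          # sharp PSF, same total flux
science  = [0.0, 1.0, 2.0, 1.0, 0.0]          # broader PSF
diff = difference_image(science, template, kernel=[0.25, 0.5, 0.25])
```

With a kernel that exactly maps the template PSF onto the science PSF, an unvarying source subtracts to zero, which is why kernel fitting is the critical step in practice.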

Difference Imaging Pipeline - Activity Diagram


act Difference Imaging Pipeline: Begin Difference Imaging Pipeline -> Identify Sky Tiles -> Extract Template (reads template :CalibratedExposure from the «datastore» Difference Image Cache; outputs templateExposure :Calibrated Exposure) -> Image Differencing (inputs: calibratedExposure :CalibratedExposure[1..*], templateExposure; outputs: differenceExposure :Difference Exposure, outputSpatialKernel :Spatial Model) -> Difference Detection (differenceExposure -> differenceDetections :Detection) -> Difference Measurement (differenceDetections -> outputDIASources :DIA Source) -> End Difference Imaging Pipeline.

Difference Imaging Pipeline - Activity Diagram


3.2.7.4.1 Identify Sky Tiles <Activity>

Identify Sky Tiles -

INPUTS:

SkyMapImage Stack

OUTPUTS:

list of SkyMapImages

ALGORITHM:

1) Extract visit metadata from process startup.

EXCEPTIONS:

1) Visit metadata not found.

NOTES:

3.2.7.4.2 Extract Template <Activity>

INPUTS:
- Template SkyMapImage Collection

OUTPUTS:
- Template Exposure

ALGORITHM:
1. Extract Template SkyMapImage from Template SkyMapImage Collection
2. Convert template SkyMapImage into Exposure

EXCEPTIONS:
1. Extract Template policy not found
2. Template SkyMapImage not found.


3.2.7.4.3 Image Differencing <Activity>

INPUTS:
- Template Exposure (to be convolved)
- Science Calibrated Exposure with reference PSF (aka Calibrated Exposure)
- Policy

OUTPUTS:
- Difference Exposure
- Spatial Convolution Kernel (LinearCombination)
- Differential Background Model (Polynomial Function 2)
- Spatial Cell Set: Kernel Candidates, SDQA elements

ALGORITHM:
1. Warp the template Exposure to align with the science Calibrated Exposure.

2. Detect Footprints on the warped template Exposure and add to FootprintList

3. Create Kernel Candidate by extracting sub-images from template Exposure and science Exposure using Footprint

4. Assign Kernel Candidate to Spatial Cell

5. Build single Kernel for each Kernel Candidate in each Spatial Cell

6. Build spatial Kernel and differential Background Model with valid Spatial Cells' Kernel Candidates

7. Create Difference Exposure using template Exposure and science Calibrated Exposure, Spatial Kernel, and differential Background Model

8. Create difference imaging SDQA Ratings from Difference Exposure, Spatial Cells and Spatial Kernel

EXCEPTIONS:
2. No Footprints Found

5. - 6. No "good" Kernel Candidates

6. Unable to determine Spatial Kernel
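The final subtraction in step 7 can be sketched numerically: once steps 1-6 have produced a spatial matching kernel and a differential background, the difference image is the science image minus the convolved template minus the background. This is a minimal numpy sketch; the function names and the tiny direct convolution are illustrative, not the pipeline's actual implementation (which uses spatially varying kernels).

```python
import numpy as np

def convolve2d_same(image, kernel):
    """Direct same-size 2-D correlation for small kernels (equals
    convolution for symmetric kernels, the common case here)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(image, dtype=float)
    for dy in range(kh):
        for dx in range(kw):
            out += kernel[dy, dx] * padded[dy:dy + image.shape[0],
                                           dx:dx + image.shape[1]]
    return out

def difference_exposure(science, template, matching_kernel, background):
    """Step 7: D = science - (template convolved with kernel) - background."""
    return science - convolve2d_same(template, matching_kernel) - background
```

With an identity (delta-function) kernel and zero background this reduces to a plain pixel-by-pixel subtraction, which is a convenient sanity check.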


3.2.7.4.4 Difference Detection <Activity>

INPUT:
- Difference image Exposure
- Policy

OUTPUT:
- FootprintSet of detected Footprints

ALGORITHM:
1. Create positive-going detection Threshold from Policy

2. Create positive-going FootprintSet from Threshold and difference Exposure

3. Create negative-going detection Threshold from Policy

4. Create negative-going FootprintSet from Threshold and difference Exposure

5. Merge positive-going and negative-going FootprintSets into output FootprintSet

EXCEPTIONS:

3.2.7.4.5 Difference Measurement <Activity>

INPUT:
- Difference image Exposure
- Difference image FootprintSet
- Policy

OUTPUT:
- Difference image DiaSource vector

ALGORITHM:
1. Create DiaSource from each Footprint in FootprintSet

2. Create Measurement object for each measurement type specified in Policy

3. Add Algorithm to Measurement for each measurement algorithm specified in Policy

4. Apply each Measurement to DiaSource

5. Add DiaSource to vector


EXCEPTIONS:

NOTES: Assumes the design review of meas_algorithms is implemented in DC3b. Uses RHL notes at Trac MeasAlgorithmsDesignII.
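The Measurement-object pattern of steps 2-4 (one configurable measurement per type, each applied to every DiaSource) can be sketched as a dictionary of measurement callables. The record layout and the two sample measurements here are illustrative assumptions, not the meas_algorithms API.

```python
import numpy as np

def total_flux(image, pixels):
    """Sum of pixel values over a footprint."""
    return float(sum(image[y, x] for y, x in pixels))

def centroid(image, pixels):
    """Flux-weighted centroid of a footprint (absolute flux as weight)."""
    w = np.array([abs(image[y, x]) for y, x in pixels])
    ys = np.array([y for y, _ in pixels])
    xs = np.array([x for _, x in pixels])
    return float((w * ys).sum() / w.sum()), float((w * xs).sum() / w.sum())

def measure_dia_sources(diff_image, footprints, measurements):
    """Steps 1-5: build one DiaSource record per footprint by applying each
    configured measurement (here a plain dict of name -> callable)."""
    sources = []
    for pixels in footprints:
        record = {"npix": len(pixels)}          # step 1: one record per Footprint
        for name, func in measurements.items():  # steps 2-4: apply measurements
            record[name] = func(diff_image, pixels)
        sources.append(record)                   # step 5: add to vector
    return sources
```

Keeping measurements as pluggable callables mirrors the Policy-driven configuration: which measurement types run is data, not code.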

3.2.7.5 Commissioning vs. Main Processing <Issue>

One will be for a "commissioning" run, where we need to populate the PSF Standards Catalog, Photometric Standards Catalog, and Astrometric Standards Catalog. This information will not be known a priori. This pipeline will have more responsibilities than the main pipeline: it needs to "select" as opposed to "use" these Catalogs.

The other will be for nightly operations, assuming these Catalogs exist.

This is a recommendation for a commissioning pipeline.

3.2.7.6 Complex Shape Determination <Issue>

How to deal with shapes that are not well fit by common symmetric models.

In particular, how to detect arcs; wavelets are one possibility.

3.2.7.7 Inter-CCD/Raft/Focal Plane Communication <Issue>

Amps in CCD may share similar PSF Model, Sky, ZeroPoint.

CCDs in Raft may share similar PSF Model, Sky, ZeroPoint.

Rafts in Focal Plane may share similar PSF Model, Sky, ZeroPoint.

A more complex model can be built with additional constraints from outside a single data unit.

3.2.8 Image Coaddition Pipelines

Includes software programs, configuration files, unit tests, component integration tests, and documentation implementing the Image Coaddition Pipelines with the following capabilities:


· Mask Moving Object Footprints in Exposure
· Generate Deep Co-add
· Generate Co-add without Outlier Rejection
· PsfMatch Multiple Exposures
· Generate Template Exposure

Deep Coadd Generation Pipeline - Activity Diagram

[Activity diagram: Start Deep Coadd Generation Pipeline → Acquire Sky Tile Metadata → Convert Exposure to SkyMapImage → Generate ChiSquared Coadd → End Deep Coadd Generation Pipeline. Input: CoAddTile. Outputs: CoAdd :CoaddedExposure, deepCoAdds :CoaddedExposure.]


[Activity diagram: StartPSFMatchPipeline → Acquire Sky Tile Metadata → Match PSF → Warp Exposure to SkyMapImage → Build Exposure Stack → Reject Outliers → End PSFMatch Pipeline. Inputs: inputExposure :CalibratedExposure, inputPSFModel :Spatial Model, desiredPSFModel :Spatial Model, skyMapScheme :SkyMap Scheme. Output: skyMapImage :SkyMapImage.]

PSFMatch Pipeline - Activity Diagram

[Activity diagram: Begin TemplateGen → Generate Sky Tile → Coadd without Outlier Rejection → End TemplateGen. Input: list :SkyMapImage per single :Sky Tile. Output: Template :Template Exposure.]

TemplateGen Pipeline - Activity Diagram

3.2.8.1 Create Deep Coadds <UseCase>

DESCRIPTION: Create Deep Coadds -

BASIC COURSE:

3.2.8.2 Deep Coadd Generation Pipeline <Activity>


DESCRIPTION: Deep Coadd Generation Pipeline (aka CoaddGen) handles the co-addition of masked Exposures to form deep templates. There will be 6 of these, one per filter. Each slice operates on a single science CCD Exposure, adding it to the appropriate filter's deep co-add.

INPUTS:
- PSF Matched Exposure

OUTPUTS:
- Coadd Exposure
- weightMap
- weight

ALGORITHM:
1. Process stage: Acquire SkyTile Metadata

2. Process stage: Generate ChiSquared Coadd

3. Process stage: Convert Exposure To SkyMapImage

EXCEPTIONS:
1. CoaddGen policy not found

2. Stage not found

3. Stage terminated in error

3.2.8.2.1 Acquire Sky Tile Metadata <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.8.2.2 Convert Exposure to SkyMapImage <Activity>

ALGORITHM:

EXCEPTIONS:


3.2.8.2.3 Generate ChiSquared Coadd <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.8.3 PSFMatch Pipeline <Activity>

DESCRIPTION: PSFMatch Pipeline - Psf-match one exposure to another. Exposures must have the same WCS to get reasonable results.

INPUTS:
- calibratedExposure
- skyMapScheme
- input PSFModel
- desired PSFModel

OUTPUTS:
- coadd SkyMapImage

ALGORITHM:
1. Process stage: Acquire SkyTile Metadata

2. Process stage: Match PSF

3. Process stage: Warp Exposure to SkyMap Image

4. Process stage: Build Exposure Stack

5. Process stage: Reject Outliers

EXCEPTIONS:

3.2.8.3.1 Acquire Sky Tile Metadata <Activity>

Acquire Sky Tile Metadata

INPUTS:


OUTPUTS:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.2.8.3.2 Build Exposure Stack <Activity>

Build Exposure Stack - Add warped and psf-matched exposures to a coadd

INPUTS:

OUTPUTS:

ALGORITHM:

EXCEPTIONS:

3.2.8.3.3 Match PSF <Activity>

Match PSF - A set of Calibrated Exposures is PSF-matched to a PSF Spatial Model.

Because the primary use is image subtraction, the important characteristics of the PSF are:
- should be compact and normal-looking
- must vary slowly, without jumps or discontinuities
- must be the same for faint and bright sources

PSF matching works best if images are convolved (when their PSF is narrower than the model PSF) rather than deconvolved (to match a narrower PSF model), because deconvolution introduces noise.

INPUTS:

- set of Calibrated Exposures
- input PSF Spatial Model
- desired PSF Spatial Model

OUTPUTS:


PSF-matched Exposure

ALGORITHM:

1. Compute a coadd PSF Spatial Model from the input Calibrated Exposures; it should be wider than the PSF Spatial Model of most or all of the input images.

2. For each input image:

2.1. PSF-match the input Calibrated Exposure to the coadd PSF Spatial Model; i.e., compute a kernel that matches the Calibrated Exposure's PSF Spatial Model to the desired PSF Spatial Model.

EXCEPTIONS:

NOTES:

Re step 2.1: Each input image will already have a PSF Spatial Model that suffices for this, from another part of the pipeline.
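Why the coadd PSF must be wider than the inputs (step 1) is easiest to see in the purely Gaussian case, where convolving Gaussians adds their widths in quadrature. This is a minimal sketch under that Gaussian assumption; the function name is illustrative.

```python
import math

def gaussian_matching_sigma(sigma_in, sigma_out):
    """Width of the Gaussian kernel that convolves a Gaussian PSF of width
    sigma_in into one of width sigma_out (widths add in quadrature:
    sigma_out^2 = sigma_in^2 + sigma_k^2). If the input PSF is already as
    wide as the target, matching would require deconvolution, which the
    text above warns introduces noise, so we refuse."""
    if sigma_in >= sigma_out:
        raise ValueError("desired PSF must be wider than the input PSF")
    return math.sqrt(sigma_out**2 - sigma_in**2)
```

Choosing the coadd model wider than every input keeps every per-exposure matching kernel a convolution, never a deconvolution.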

3.2.8.3.4 Reject Outliers <Activity>

Reject Outliers - Combine a list of masked images computing an outlier-rejected mean.

INPUTS:

OUTPUTS:

ALGORITHM:

EXCEPTIONS:
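The outlier-rejected mean described above can be sketched as a per-pixel iterative sigma clip over the image stack. The clipping parameters are assumptions (in the pipeline they would come from Policy), and mask planes are omitted.

```python
import numpy as np

def outlier_rejected_mean(images, n_sigma=3.0, n_iter=3):
    """Combine a stack of same-shape images into a per-pixel mean,
    iteratively rejecting samples more than n_sigma standard deviations
    from the current per-pixel mean."""
    stack = np.asarray(images, dtype=float)   # shape (n_images, h, w)
    good = np.ones(stack.shape, dtype=bool)
    for _ in range(n_iter):
        kept = np.where(good, stack, np.nan)
        mu = np.nanmean(kept, axis=0)
        sd = np.nanstd(kept, axis=0)
        # keep samples within n_sigma of the current mean (epsilon floor
        # so an all-identical pixel column keeps all its samples)
        good = np.abs(stack - mu) <= n_sigma * np.maximum(sd, 1e-12)
    return np.nanmean(np.where(good, stack, np.nan), axis=0)
```

A single cosmic-ray-like spike in one input image is rejected and the remaining images dominate the mean at that pixel.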

3.2.8.3.5 Warp Exposure to SkyMapImage <Activity>

Warp Exposure to SkyMapImage - Warp one exposure to match a reference exposure.

INPUTS:

- CalibratedExposure
- SkyMapScheme


OUTPUT:

SkyMapImage

ALGORITHM:

1. Determine the SkyMapIdCollection that overlaps the Calibrated Exposure, including border cases (sky pixels with partial overlap of the Calibrated Exposure boundaries) and possibly even the pixel data list.

2. For each SkyMapPixelId in the SkyMapImage:

2.1. Compute the associated SkyMapImage pixel value from the Calibrated Exposure using a Lanczos kernel (or similar)

2.2. If value is valid, add it to the SkyMapImage

2.3. If the value is invalid, set the pixel to the 'Edge' invalid value and add it to the SkyMapImage.

3. Return the sky pixel data list (i.e., the SkyMapImage)

EXCEPTIONS:

NOTES:

This is not symmetrical with "Warp SkyMapImage to Exposure." One could imagine a variant that takes a sky pixel data list and updates it (and sets missing pixels to 'edge'). However, the use case as described more closely meets our needs (see the coadd use cases), and it's not clear we need both.
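Step 2's per-pixel resampling with a Lanczos kernel can be sketched as below. The kernel order `a=3` and the function names are illustrative assumptions; the real pipeline also propagates mask and variance planes, which are omitted here.

```python
import numpy as np

def lanczos_kernel(dx, a=3):
    """1-D Lanczos-a interpolation weight at pixel offset dx.
    The kernel is separable: the 2-D weight is lanczos(dy) * lanczos(dx)."""
    x = np.asarray(dx, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

def lanczos_interpolate(image, y, x, a=3):
    """Step 2.1: resample one sky-map pixel value at fractional position
    (y, x). Returns None for the 'Edge' case of step 2.3, where the kernel
    support falls outside the Calibrated Exposure."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    ys = np.arange(y0 - a + 1, y0 + a + 1)
    xs = np.arange(x0 - a + 1, x0 + a + 1)
    if ys[0] < 0 or xs[0] < 0 or ys[-1] >= image.shape[0] or xs[-1] >= image.shape[1]:
        return None
    w = np.outer(lanczos_kernel(ys - y, a), lanczos_kernel(xs - x, a))
    return float((w * image[np.ix_(ys, xs)]).sum() / w.sum())
```

At integer positions the kernel collapses to a delta function and the original pixel value is returned exactly, a useful sanity check for any resampler.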

3.2.8.4 TemplateGen Pipeline <Activity>

DESCRIPTION: Template Generation (TemplateGen) Pipeline - builds a coadded SkyMapImage to be used to construct a Template Exposure.

INPUTS:
- list of Calibrated Exposures

OUTPUTS:
- Coadd SkyMapImage (instance of SkyMapDataList)
- Weight Map (instance of SkyMapDataList)

ALGORITHM:
1. Process stage: Generate Sky Tile


2. Process stage: Coadd without Outlier Rejection

EXCEPTIONS:
1. TemplateGen policy not found

2. Stage not found

3. Stage terminates in error

3.2.8.4.1 Coadd without Outlier Rejection <Activity>

INPUTS:
- list of Calibrated Exposures

OUTPUTS:
- Coadd SkyMapImage (instance of SkyMapDataList)
- Weight Map (instance of SkyMapDataList)

ALGORITHM:
1. Create blank Coadd SkyMapImage and weight map
2. For each Calibrated Exposure: compute weight; add the sky map data list to the Coadd SkyMapImage; update weight map
3. Divide final Coadd SkyMapImage by weight map

EXCEPTIONS:
1. Coadd without OR policy not found
2. Input list of CalibratedExposures not found

NOTES: Weight is 1/mean variance, where mean variance is computed with outlier rejection.
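Steps 1-3 amount to an inverse-variance-weighted mean. A minimal numpy sketch, with each exposure weighted by 1/(mean variance) per the NOTES (the outlier rejection inside the mean-variance estimate is omitted here for brevity):

```python
import numpy as np

def coadd_without_outlier_rejection(exposures, variances):
    """Accumulate weighted images and a weight map, then normalize."""
    coadd = np.zeros_like(exposures[0], dtype=float)       # step 1
    weight_map = np.zeros_like(exposures[0], dtype=float)
    for image, variance in zip(exposures, variances):      # step 2
        weight = 1.0 / np.mean(variance)                   # weight = 1/mean variance
        coadd += weight * image
        weight_map += weight
    return coadd / weight_map, weight_map                  # step 3
```

Keeping the weight map as a separate output (rather than only the normalized coadd) lets later stages combine this coadd with others without re-reading the inputs.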

3.2.8.4.2 Generate Sky Tile <Activity>

ALGORITHM:

EXCEPTIONS:


3.2.9 Image Processing Pipeline

Includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Acquire and Assemble Image
· Remove Instrument Signature
· Determine WCS
· Determine PSF
· Create Difference Image
· Generate Image Quality Metrics

CCD Assembly Pipeline - Activity Diagram

[Activity diagram: Start CCDAssembly → CCD Assembly → Identify Defects → SDQA for CCD Assembly → End CCDAssembly.]

[Activity diagram: StartISR → Acquire Visit Metadata → Transform Metadata → Validate Metadata → Identify Calibration Product → Remove Cross-Talk → Remove Saturation → Remove Overscans → Remove Bias → Add Variance → Apply Darks → Linearity → Defringe → Flatten → SDQA for ISR → EndISR. Inputs: RawImage :Segment, InstrumentCal :CalibrationProduct. Output: PostISRImage.]

ISR Pipeline - Activity Diagram


[Activity diagram: Start Image Characterization Pipeline → Bright Star Detection → Bright Star Measurement and PSF Determination → Aperture Correction → WCS Determination → WCS Verification → CCD Photometric Calibration → Synch → Exposure Generation → End Image Characterization Pipeline. Inputs: postIsrImage, astrometricStandards :Astrometric Standard, photometricStandards :Photometric Standard. Output: calibratedExposure :CalibratedExposure.]

Image Characterization Pipeline - Activity Diagram

[Activity diagram: Start CR Split Handling → Fork1 → Background Estimation 1 and 2 → Simple Image Differencing → Find and Mask CRs 1 and 2 → Fork1 Sync → Mask and Sum → End CR Split Handling Pipeline. Inputs: visitImage1, visitImage2. Output: visitSum.]

CR Split Handling Pipeline - Activity Diagram


3.2.9.1 Process Raw Images to Calibrated Images <UseCase>

DESCRIPTION: Process Raw Images to Calibrated Images -

BASIC COURSE:

Process Raw Images to Calibrated Images - Use Case Diagram

[Use case diagram: the «Controller» use case Process Raw Images to Calibrated Images invokes «Controller» Remove Instrument Signature and «Controller» Determine WCS for a Science Image, together with the «System» use cases Mask Cosmic Rays, Generate Image Quality Metrics, Add Fake Objects, Mask Satellite Trails, Measure Bright Sources, Determine Rough Photometric ZP, and Build PSF Model, linked by «invokes» and «precedes» relationships. Prototype is TBD. (Package: Image Processing Pipeline, Version 1.0, Author: LSST_EA user.)]


3.2.9.1.1 Required TF <Requirement>

There are only of order 5 ops per pixel to debias, flatfield, and fringe-correct, so the number of ops per image is about 1.5 x 10^10. If the 1 TF required for crosstalk is applied to this as well, only about 15 msec are added to the processing time. Negligible.
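The arithmetic can be checked directly. The 3.2-gigapixel image size below is an assumption chosen to be consistent with the ~1.5 x 10^10 ops quoted above (5 ops/pixel x ~3 x 10^9 pixels):

```python
# Back-of-envelope check of the requirement above.
pixels_per_image = 3.2e9      # assumed focal-plane pixel count
ops_per_pixel = 5             # debias + flatfield + fringe, per the text
total_ops = ops_per_pixel * pixels_per_image   # ~1.6e10 ops per image

teraflops = 1.0e12            # the 1 TF crosstalk budget, reused here
seconds = total_ops / teraflops                # ~0.016 s, i.e. ~15 msec
```

At 16 ms per image this is indeed negligible against per-visit cadence, which is the point of the requirement.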

3.2.9.1.2 Build PSF Model <UseCase>

Build PSF Model -

BASIC COURSE:

System retrieves Detection Pipeline Policy from Policy Library.

System gets CCD Image, (includes Image Mask, WCS, Sky Region and Noise Image) from Image Stream

System queries PSF Standards Catalog for PSF stars in the Sky Region of the CCD Image returning an Object Collection.

System creates a Source Amplifier Pixels Collection of PSF stars in the CCD Image, using the WCS to convert from sky coordinates to x,y position.

System resamples each Source Amplifier Pixels Collection (recentering on the PSF center and possibly compensating for WCS distortion).

System builds spatially varying PSF Model for Amplifier Pixels, ignoring Amplifier Pixels in Image Mask.

System logs date and return status (by middleware).

ALTERNATE:

Too few PSF stars for model: throw exception (e.g. System reduces the complexity of PSF fit and calls PSF Model; grab PSF from neighboring CCDs??).

PSF Model fails: Throw a pipeline-failed exception and log the failure of the processing step. Refer to Policy for an optional procedure (e.g., a simple FWHM measure assuming a Gaussian profile).

PSF fit RMS exceeds Policy setting: refer to Policy options (e.g., trying a higher or lower order).

3.2.9.1.3 Add Fake Objects <UseCase>


Add Fake Objects -

BASIC COURSE:

Get policy
Get list of objects per policy
Iterate the list: add object based on PSF

ALTERNATE COURSES:

3.2.9.1.4 Determine WCS for a Science Image <UseCase>

Determine WCS for a Science Image -

BASIC COURSE:

System creates an instance of the "Determine WCS Controller" processing step.

Controller processing step gets the policy for determining WCS from the policy library, gets the observing configuration from the image metadata, and computes sky regions for the science image and for all contained CCD images.

System loops over all CCD images within the science image

Get CCD Image
Invoke "Determine WCS for a CCD Image"
Put CCD Image

End Loop

System determines global WCS fit: Invoke "Determine Global WCS Fit"

System re-computes sky regions for the science image and for all contained CCD images.

ALTERNATE COURSES:

Unable to get CCD Image, Unable to Put CCD Image: throw a pipeline failed exception

Pipeline Failed Exception Caught: abort the processing step; increment the fail count; get the alternate parameter set from the policy;


if the fail count doesn't exceed the allowed maximum then retry the processing step with a new parameter set

3.2.9.2 ISR Pipeline <Activity>

DESCRIPTION: Instrument Signature Removal Pipeline removes the instrument signature and standardizes the input image and metadata into LSST format. Each slice operates on a single amplifier image.

INPUTS:
- raw Image
- instrument calibration data: bias, dark, flat

OUTPUTS:
- maskedImage

ALGORITHM:
1. Process stage: Acquire Visit Metadata
2. Process stage: Transform Metadata
3. Process stage: Validate Metadata
4. Process stage: Identify Calibration Product
5. Process stage: Remove Cross-Talk
6. Process stage: Remove Saturation
7. Process stage: Remove Overscans
8. Process stage: Remove Bias
9. Process stage: Add Variance
10. Process stage: Apply Darks
11. Process stage: Linearity
12. Process stage: Defringe
13. Process stage: Flatten
14. Process stage: SDQA for ISR

EXCEPTIONS:
1. ISR Policy not found

2. Stage not found

3. Stage terminated in error
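Nearly every ISR stage below follows the same shape: check a Processed flag in the ObservationMetadata, exit if set, otherwise apply the correction and record the flag. That makes each stage idempotent, so a re-run after checkpoint/restart is a no-op. A minimal sketch, assuming a hypothetical dict-based data model (the real Exposure/metadata classes differ):

```python
def run_isr_stage(exposure, flag, apply):
    """Common shape of the ISR stages: skip if the Processed flag is set,
    otherwise apply the correction and set the flag.

    exposure: dict with 'metadata' (dict) and 'image' keys (hypothetical).
    flag:     e.g. 'BiasCorrectionProcessed', 'LinearityProcessed'.
    apply:    callable implementing the stage's correction.
    """
    if exposure["metadata"].get(flag):      # steps 1-2: already done, exit
        return exposure
    exposure["image"] = apply(exposure["image"])
    exposure["metadata"][flag] = True       # record completion
    return exposure
```

Running the same stage twice leaves the image unchanged the second time, which is exactly the behavior the per-stage "if flag exists, exit algorithm" steps specify.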

3.2.9.2.1 Remove Cross-Talk <Activity>

ALGORITHM:

EXCEPTIONS:


3.2.9.2.2 SDQA for ISR <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.9.2.3 Acquire Visit Metadata <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.9.2.4 Add Variance <Activity>

INPUTS:
- Exposure
- ISR Policy
- Variance Policy

OUTPUTS:
- modifies Exposure to add variance

ALGORITHM:
1. get ObservationMetadata from Exposure
2. if VarianceProcessed exists in ObservationMetadata, exit algorithm
3. get Variance Policy from ISR Policy
4. get MaskedImage from Exposure
5. ... add variance to MaskedImage
9. update ObservationMetadata to add VarianceProcessed
10. update Exposure with modified MaskedImage and modified ObservationMetadata

EXCEPTIONS:


3.2.9.2.5 Apply Darks <Activity>

INPUTS:
- Calibrated Exposure
- ISR Policy
- Dark Calibration Exposure

OUTPUTS:
- dark-corrected Calibrated Exposure

ALGORITHM:
1. get ObservationMetadata from Calibrated Exposure
2. if DarkCorrectionProcessed flag exists in ObservationMetadata, then exit algorithm
3. get Dark Exposure from ISR Policy
4. get DarkObservationMetadata from Dark Exposure
5. get Scaling from ObservationMetadata
6. get DarkScaling from DarkObservationMetadata
7. calculate Scale from Scaling and DarkScaling
8. get MaskedImage from Calibrated Exposure
9. get DarkMaskedImage from Dark Exposure
10. scale the MaskedImage based on Scale and DarkMaskedImage
11. add DarkCorrectionProcessed flag to ObservationMetadata
12. update the Calibrated Exposure with the modified MaskedImage and modified ObservationMetadata

EXCEPTIONS:

3.2.9.2.6 Defringe <Activity>

ALGORITHM:

EXCEPTIONS:


3.2.9.2.7 Flatten <Activity>

INPUT:
- Exposure
- ISR Policy

OUTPUTS:
- modified Exposure

ALGORITHM:

EXCEPTIONS:

3.2.9.2.8 Identify Calibration Product <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.9.2.9 Linearity <Activity>

INPUTS:
- Exposure
- ISR Policy
- Linearity Table

OUTPUTS:
- modified Exposure which has been linearized

ALGORITHM:
1. get ObservationMetadata from Exposure
2. if LinearityProcessed flag exists in ObservationMetadata, exit algorithm
3. get Linearity Table from ISR Policy
4. get Gain from ObservationMetadata
5. get MaskedImage from Exposure
6. apply Linearity Table and Gain to MaskedImage


7. add LinearityProcessed flag to ObservationMetadata
8. update Exposure with modified MaskedImage and modified ObservationMetadata

EXCEPTIONS:
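The core of steps 4-6 is a table lookup with interpolation: apply the gain, then map measured counts through the Linearity Table. A minimal numpy sketch; the table contents and function name are illustrative assumptions.

```python
import numpy as np

def linearize(image, gain, table_measured, table_corrected):
    """Steps 4-6: apply Gain, then map counts through the Linearity Table.
    table_measured/table_corrected are parallel arrays of measured vs.
    true counts (hypothetical values); intermediate counts are linearly
    interpolated between table entries."""
    electrons = np.asarray(image, dtype=float) * gain
    return np.interp(electrons, table_measured, table_corrected)
```

With an identity table the image passes through unchanged, which is a convenient regression check for detectors that are already linear.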

3.2.9.2.10 Remove Bias <Activity>

INPUTS:
- Calibrated Exposure
- ISR Policy
- Bias Exposure

OUTPUTS:
- bias-corrected Calibrated Exposure

ALGORITHM:
1. get ObservationMetadata from Calibrated Exposure
2. if BiasCorrectionProcessed flag exists in ObservationMetadata, exit algorithm
3. get Bias Exposure from ISR Policy
4. get BiasMaskedImage from Bias Exposure
5. get MaskedImage from Calibrated Exposure
6. subtract the BiasMaskedImage from MaskedImage
7. add BiasCorrectionProcessed flag to ObservationMetadata
8. update Calibrated Exposure with modified MaskedImage and modified ObservationMetadata

EXCEPTIONS:

3.2.9.2.11 Remove Overscans <Activity>

INPUTS:
- Exposure
- ISR Policy
- Overscan Policy

OUTPUTS:


- modifies Exposure to remove overscan regions

ALGORITHM:
1. get ObservationMetadata from Exposure
2. if OverscanProcessed exists in ObservationMetadata, exit algorithm
3. get Overscan Policy from ISR Policy
4. get MaskedImage from Exposure
5. get OverscanRegion from ObservationMetadata, if it exists; else get from Overscan Policy
6. build Overscan BoundingBox based on Overscan Region
7. get the OverscanFitType from the Overscan Policy
8. based on the OverscanFitType {mean, median, poly}:
8.1 remove the overscan from the MaskedImage
9. update ObservationMetadata to remove Overscan and add OverscanProcessed
10. update Exposure with modified MaskedImage and modified ObservationMetadata

EXCEPTIONS:
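Steps 5-8 can be sketched as follows for an amplifier whose overscan sits in trailing columns. The {mean, median, poly} choices mirror the OverscanFitType above; the column layout and function name are illustrative assumptions.

```python
import numpy as np

def remove_overscan(image, overscan_cols, fit_type="median"):
    """Estimate the per-row overscan level from columns
    [overscan_cols[0], overscan_cols[1]) and subtract it from the science
    region (the columns before overscan_cols[0]), then trim the overscan.
    fit_type in {mean, median, poly}; 'poly' smooths the row levels with a
    low-order polynomial along the parallel direction."""
    data = image[:, :overscan_cols[0]]                 # science region
    overscan = image[:, overscan_cols[0]:overscan_cols[1]]
    rows = np.arange(image.shape[0])
    if fit_type == "mean":
        level = overscan.mean(axis=1)
    elif fit_type == "median":
        level = np.median(overscan, axis=1)
    elif fit_type == "poly":
        coeffs = np.polyfit(rows, overscan.mean(axis=1), deg=2)
        level = np.polyval(coeffs, rows)
    else:
        raise ValueError(f"unknown OverscanFitType: {fit_type}")
    return data - level[:, None]
```

The per-row (rather than single scalar) level is what lets the overscan track slow drifts in the serial-register bias during readout.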

3.2.9.2.12 Remove Saturation <Activity>

INPUTS:
- Exposure
- ISR Policy
- InterpolateFlag

OUTPUTS:
- modifies Exposure to correct for saturation

ALGORITHM:
1. get ObservationMetadata from Exposure
2. if SaturationCorrectionProcessed flag exists in ObservationMetadata, exit algorithm
3. get MaskedImage from Exposure
4. get Saturation Policy from ISR Policy
5. get Saturation Threshold from Saturation Policy
6. create DetectionSet from MaskedImage and Saturation Threshold
7. create Footprint Collection from the DetectionSet
8. get SaturatedPixelMask from MaskedImage
9. get GrowSaturated from Saturation Policy
10. get InterpolateFlag from ISR Policy
11. for each Footprint in the Footprint Collection:
11.1 create a SaturatedFootprint based on GrowSaturated
11.2 update SaturatedPixelMask using SaturatedFootprint


11.3 if InterpolateFlag indicates so:
11.3.1 create a Defect based on the BoundingBox of the SaturatedFootprint
11.3.2 add the Defect to the DefectMap
12. update Mask with the SaturatedPixelMask
13. if InterpolateFlag indicates so:
13.1 get DefaultFWHM from the ISR Policy
13.2 create a PSF based on DefaultFWHM
13.3 interpolate over MaskedImage based on PSF and DefectMap
14. add SaturationCorrectionProcessed flag to ObservationMetadata
15. update Exposure with modified MaskedImage and modified ObservationMetadata

EXCEPTIONS:
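The threshold-and-grow core of steps 3-12 can be sketched in miniature: threshold the image, then dilate the saturated footprints by GrowSaturated pixels (saturated charge bleeds into neighbors, so the mask must be wider than the above-threshold region). The interpolation of step 13 is omitted; names and the 4-connected dilation are illustrative.

```python
import numpy as np

def mask_saturation(image, threshold, grow=1):
    """Return a boolean saturated-pixel mask: pixels at or above the
    Saturation Threshold, grown by `grow` pixels (GrowSaturated)."""
    saturated = image >= threshold
    grown = saturated.copy()
    for _ in range(grow):
        padded = np.pad(grown, 1)
        # 4-connected dilation: a pixel is masked if it or any neighbor is
        grown = (padded[1:-1, 1:-1] | padded[:-2, 1:-1] | padded[2:, 1:-1]
                 | padded[1:-1, :-2] | padded[1:-1, 2:])
    return grown
```

A single saturated pixel with grow=1 yields a plus-shaped 5-pixel mask, the smallest example of the footprint growth the algorithm describes.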

3.2.9.2.13 Transform Metadata <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.9.2.14 Validate Metadata <Activity>

Make sure that images are the same size, were derived from the same chunk of the focal plane, etc.

Things to check:
* Image Size (all)
* From the same piece of the focal plane (illum, flat)
* From the same piece of silicon (bad pixel mask, bias)
* Through the same filter (dome flat)
* Appropriate for the date range (anything time variable; dflats, etc.)

Not performed in DC3

INPUTS:
- Exposure
- ISR Policy

ALGORITHM:


EXCEPTIONS:

3.2.9.3 CCD Assembly Pipeline <Activity>

DESCRIPTION: CCD Assembly Pipeline assembles amplifiers from ISR into CCD Exposures.

INPUTS:
- segment of maskedImage

OUTPUTS:
- assembled maskedImage

ALGORITHM:
1. Process stage: Assemble Ccd

2. Process stage: Identify Defects

3. Process stage: SDQA for CCD Assembly

EXCEPTIONS:
1. CcdAssemble policy not found

2. Stage not found

3. Stage terminated in error

3.2.9.3.1 CCD Assembly <Activity>

INPUTS:
- Calibrated Exposure
- CCD Assembly Policy

ALGORITHM:

EXCEPTIONS:


3.2.9.3.2 SDQA for CCD Assembly <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.9.3.3 Identify Defects <Activity>

INPUTS:
- Calibrated Exposure
- ISR Policy
  - BadPixelMaskName
  - InterpolateFlag

OUTPUTS:
- modified Calibrated Exposure with Bad Pixels removed

ALGORITHM:
1. get ObservationMetadata from Calibrated Exposure
2. if BadPixelsProcessed flag exists in ObservationMetadata, exit algorithm
3. get MaskedImage from Calibrated Exposure
4. get BadPixelMaskName from ISR Policy
5. get BadPixelMaskPlane from MaskedImage based on BadPixelMaskName
6. get DefectMap from MaskedImage
7. for each Defect in DefectMap
   7.1 get the Defect BoundingBox
   7.2 build a Footprint from the Defect BoundingBox
   7.3 mask out the Bad Pixels in the BadPixelMaskPlane using the Defect's Footprint
8. update the BadPixelMaskPlane in the MaskedImage
9. get InterpolateFlag from ISR Policy
10. if InterpolateFlag so indicates, update the MaskedImage by interpolating over the DefectMap
11. add BadPixelsProcessed flag to ObservationMetadata
12. update Calibrated Exposure with modified MaskedImage and modified ObservationMetadata

EXCEPTIONS:
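The masking-and-interpolation core of the steps above can be sketched compactly. The `BAD_BIT` value, the (x0, y0, x1, y1) bounding-box representation of a Defect, and the column-median fill are illustrative assumptions, not the stack's actual defect or interpolation machinery.

```python
import numpy as np

BAD_BIT = 1 << 1  # hypothetical "bad pixel" mask-plane bit

def mask_defects(image, mask, defects, interpolate=True):
    """Set BAD_BIT over each inclusive defect box (x0, y0, x1, y1); if
    requested, replace bad pixels with the column-wise median of the
    good pixels in the same column (a stand-in for real interpolation)."""
    for x0, y0, x1, y1 in defects:
        mask[y0:y1 + 1, x0:x1 + 1] |= BAD_BIT
    if interpolate:
        bad = (mask & BAD_BIT) != 0
        for col in range(image.shape[1]):
            good = ~bad[:, col]
            if good.any() and (~good).any():
                image[~good, col] = np.median(image[good, col])
    return image, mask

img = np.array([[0.0, 1.0, 2.0],
                [3.0, 100.0, 5.0],
                [6.0, 7.0, 8.0]])
msk = np.zeros((3, 3), dtype=np.int32)
img, msk = mask_defects(img, msk, [(1, 1, 1, 1)])
```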


3.2.9.4 CR Split Handling Pipeline <Activity>

DESCRIPTION: The Cosmic Ray Split Handling Pipeline handles cosmic ray removal using the two "split" exposures per visit, algorithmically rejecting cosmic rays on each one and then differencing them to find more. Background estimation is in this pipeline because it is required before CrReject.

Cosmic rays (CRs) are detected in two stages: in the first pass sources that are not similar in shape to a PSF are recorded. When pairs of images for a visit are processed together, a difference image is created from which additional CRs are recorded.

Following CR detection, a flag is set in the quality mask of the calibrated images for all pixels that are affected by CRs, and the corresponding pixels in the individual science frames are then interpolated over. The pair of visit exposures are then averaged to create a single science image.

Each slice operates on a visit pair of CCD images, assembling them into a single CCD exposure.

INPUTS:
- maskedImage1
- maskedImage2

OUTPUTS:
- maskedImage

ALGORITHM:

1. Process stage: Background Estimation {1, 2}

2. Process stage: Find and Mask CRs {1, 2}

3. Process stage: Simple Image Differencing

4. Process stage: Difference Detection

5. Process stage: Mask and Sum

EXCEPTIONS:
1. CrSplit Policy not found

2. Stage not found

3. Stage terminates in error

NOTES: SimpleDiffIm should not need to use a Kernel, since the seeing is presumed to be identical across the two exposures.


Similarly, SumStage should not need to do PSF-matching.
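The note above is the key simplification: with identical seeing across the pair, a plain pixel difference suffices. A hedged NumPy sketch of the difference/detect/mask/sum idea follows; the MAD-based n-sigma threshold and the min-of-pair replacement are illustrative choices, not the CrReject algorithm itself.

```python
import numpy as np

CR_BIT = 1 << 2  # hypothetical cosmic-ray mask-plane bit

def reject_crs_and_sum(img1, img2, nsigma=5.0):
    """A cosmic ray hits only one of the two split exposures, so it
    shows up as a large pair difference. Flag such pixels, take the
    minimum of the pair there (the CR-free value), average elsewhere."""
    diff = img1 - img2
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))  # robust scatter
    crs = np.abs(diff) > nsigma * max(sigma, 1e-12)
    summed = 0.5 * (img1 + img2)
    summed[crs] = np.minimum(img1, img2)[crs]
    mask = np.where(crs, CR_BIT, 0)
    return summed, mask

img1 = np.ones((5, 5))
img2 = np.ones((5, 5))
img1[2, 2] = 100.0  # a cosmic ray in the first split only
summed, crmask = reject_crs_and_sum(img1, img2)
```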

3.2.9.4.1 Background Estimation 1 <Activity>

INPUTS:
- Background policy
- Exposure
- subtractFlag

OUTPUTS:
- background
- backgroundSubtractedExposure

ALGORITHM:
1. Acquire parameters in backgroundPolicy.
2. If subtract is true, make a copy of the exposure and subtract the background.
3. Return background, backgroundSubtractedExposure.

EXCEPTIONS:
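A minimal sketch of the estimate-then-subtract flow above, assuming the background is a coarse grid of superpixel medians expanded by nearest-neighbour replication; a real implementation would use spline interpolation and read the bin size from the Background policy. Function names and the even-divisibility assumption are illustrative.

```python
import numpy as np

def estimate_background(image, nbin=4):
    """Median in an nbin x nbin grid of superpixels, expanded back to
    full resolution. Assumes both dimensions divide evenly by nbin."""
    ny, nx = image.shape
    by, bx = ny // nbin, nx // nbin
    tiles = image.reshape(nbin, by, nbin, bx)
    grid = np.median(tiles, axis=(1, 3))           # per-superpixel medians
    return np.kron(grid, np.ones((by, bx)))        # expand to full size

def subtract_background(image, nbin=4, subtract=True):
    bg = estimate_background(image, nbin)
    return bg, (image - bg if subtract else image)

img = np.full((8, 8), 10.0)
img[0, 0] = 100.0  # one bright source should not bias the median
bg, sub = subtract_background(img)
```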

3.2.9.4.2 Background Estimation 2 <Activity>

INPUTS:
- Background policy
- Exposure
- subtractFlag

OUTPUTS:
- background
- backgroundSubtractedExposure

ALGORITHM:
1. Acquire parameters in backgroundPolicy.
2. If subtract is true, make a copy of the exposure and subtract the background.
3. Return background, backgroundSubtractedExposure.

EXCEPTIONS:


3.2.9.4.3 Find and Mask CRs 1 <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.9.4.4 Find and Mask CRs 2 <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.9.4.5 Mask and Sum <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.9.4.6 Simple Image Differencing <Activity>

INPUTS:
- Two calibrated science Exposures

OUTPUTS:
- Difference Exposure

ALGORITHM:

EXCEPTIONS:


3.2.9.5 Image Characterization Pipeline <Activity>

DESCRIPTION: Image Characterization Pipeline handles image characterization, including determination of PSF and WCS. It also handles initial CCD-exposure-level photometric calibration and background subtraction. It combines all of this information into a single Exposure. The CalibSources used for the determinations are saved. Each slice operates on a single CCD image and produces a science CCD exposure that is used throughout the rest of the Data Release Production.

INPUTS:
- Photometric Standards
- Astrometric Standards
- maskedImage

OUTPUTS:
- calibratedExposure

ALGORITHM:
1. Process stage: Bright Star Detection

2. Process stage: Bright Star Measurement

3. Process stage: PSF Determination

4. Process stage: Aperture Correction

5. Process stage: WCS Determination

6. Process stage: WCS Verification

7. Process stage: CCD Photometric Calibration

8. Process stage: Exposure Generation

EXCEPTIONS:
1. ImChar Policy not found

2. Stage not found

3. Stage terminated in error

3.2.9.5.1 Aperture Correction <Activity>

ALGORITHM:
1. Perform aperture photometry on the PSF stars, using a pre-configured radius (3.0 arcsec).
2. Determine the aperture correction, defined as the ratio Flux(PSF)/Flux(Aper), using a second-order polynomial to account for the spatial variation across the image. The aperture correction will be applied to all point sources that are identified.

EXCEPTIONS:
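The second-order polynomial fit of step 2 reduces to a linear least-squares problem in the six quadratic basis terms. A sketch, with hypothetical function names and no outlier rejection:

```python
import numpy as np

def fit_aperture_correction(x, y, psf_flux, aper_flux):
    """Fit Flux(PSF)/Flux(Aper) at the PSF-star positions with a
    second-order polynomial in (x, y) via linear least squares."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    ratio = np.asarray(psf_flux, float) / np.asarray(aper_flux, float)
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coeffs, *_ = np.linalg.lstsq(A, ratio, rcond=None)
    return coeffs

def apply_aperture_correction(coeffs, x, y):
    return (coeffs[0] + coeffs[1] * x + coeffs[2] * y +
            coeffs[3] * x * x + coeffs[4] * x * y + coeffs[5] * y * y)

# synthetic stars on a grid with a known spatially varying ratio
xs, ys = np.meshgrid(np.arange(4.0), np.arange(4.0))
x, y = xs.ravel(), ys.ravel()
true_ratio = 0.9 + 0.01 * x - 0.005 * y
coeffs = fit_aperture_correction(x, y, true_ratio * 100.0, np.full(16, 100.0))
val = apply_aperture_correction(coeffs, 2.0, 2.0)
```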

3.2.9.5.2 Bright Star Detection <Activity>

INPUTS:
- Calibrated science Exposure(s) (including background)

OUTPUTS:
- background-subtracted Exposure used in the detection
- measured background
- PSF used to smooth the exposure before detection
- PositiveFootprintSet
- NegativeFootprintSet

ALGORITHM:
1. If more than one input exposure, create ExposureStack.

2. Create background Exposure from input Exposure (or ExposureStack)

3. Make a smoothing PSF according to the PSF policy

4. Perform detections using backgroundExposure, PSF and Detection Policy generating FootprintSets

5. Copy detectionMask to the input Exposures

EXCEPTIONS:
1. Failure to find exposure: exit in error
2. Failure to find PSF policy
3. Failure to find SourceDetection policy
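The footprint-generating step 4 above amounts to thresholding followed by connected-component labelling. A self-contained sketch using a 4-connected flood fill (the smoothing of step 3 and the negative-footprint set are omitted; a real implementation would use the stack's detection classes):

```python
import numpy as np

def detect_footprints(image, threshold):
    """Collect 4-connected footprints of above-threshold pixels."""
    above = image > threshold
    seen = np.zeros_like(above, dtype=bool)
    footprints = []
    ny, nx = image.shape
    for y in range(ny):
        for x in range(nx):
            if above[y, x] and not seen[y, x]:
                stack, pixels = [(y, x)], []
                seen[y, x] = True
                while stack:  # iterative flood fill
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        py, px = cy + dy, cx + dx
                        if (0 <= py < ny and 0 <= px < nx and
                                above[py, px] and not seen[py, px]):
                            seen[py, px] = True
                            stack.append((py, px))
                footprints.append(pixels)
    return footprints

img = np.zeros((5, 5))
img[1, 1] = img[1, 2] = 10.0   # one two-pixel source
img[3, 3] = 8.0                # one single-pixel source
fps = detect_footprints(img, 5.0)
```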

3.2.9.5.3 Bright Star Measurement <Activity>

INPUTS:
- Exposure
- Footprint Collection

OUTPUTS:
- Source Collection

ALGORITHM (PT1):
1. Merge positive and negative detection sets (aka Footprint Collection)

2. Determine Measure Algorithm to use from Measure Policy

3. Measure Sources using Exposure, PSF, Footprint Collection (is this done via the pre-PT1 method below?)

EXCEPTIONS:
1. Measure Policy file not found
2. Exposure file not found
3. Footprint Collection not found
4. PSF not found

ALGORITHM (pre-PT1):
1. The WCS is extracted from the Exposure.
2. For each Footprint in the Footprint Collection:
   2.1 Measure its position and flux in the Exposure.
   2.2 Create a new Source to contain the measured properties.
   2.3 Calculate the ra and dec from the pixel coordinates using the WCS.
   2.4 Add the Source to the Source Collection.

EXCEPTIONS:

NOTES: An Exposure may be a DIAExposure. A Source may be a DIASource.
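The pre-PT1 loop body can be sketched as follows. The flux-weighted centroid and the linear WCS (a hypothetical dict with `crpix`, `crval`, and a CD matrix in degrees per pixel) are illustrative stand-ins for the stack's measurement and WCS classes.

```python
import numpy as np

def measure_source(image, footprint, wcs):
    """Sum the flux over a footprint of (y, x) pixels, compute the
    flux-weighted centroid, and map pixel -> (ra, dec) through a
    linear WCS approximation."""
    ys, xs = zip(*footprint)
    vals = image[list(ys), list(xs)]
    flux = float(vals.sum())
    cy = float((np.array(ys) * vals).sum() / flux)
    cx = float((np.array(xs) * vals).sum() / flux)
    dpix = np.array([cx, cy]) - wcs["crpix"]
    ra, dec = wcs["crval"] + wcs["cd"] @ dpix
    return {"flux": flux, "x": cx, "y": cy, "ra": float(ra), "dec": float(dec)}

img = np.zeros((5, 5))
img[2, 2] = 4.0
img[2, 3] = 4.0
wcs = {"crpix": np.array([2.0, 2.0]),         # reference pixel (x, y)
       "crval": np.array([10.0, 20.0]),       # reference (ra, dec), degrees
       "cd": np.array([[1e-3, 0.0], [0.0, 1e-3]])}
src = measure_source(img, [(2, 2), (2, 3)], wcs)
```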

3.2.9.5.4 CCD Photometric Calibration <Activity>

INPUTS:
- sourceMatchSet

OUTPUTS:
- zeropoint
- zeropoint uncertainty

ALGORITHM:
1. Convert fluxes to magnitudes
2. Fit to get the zeropoint (fit a polynomial to the dataset in a manner that is highly insensitive to outliers)

EXCEPTIONS:
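The outlier-insensitive fit of step 2 can be illustrated in its simplest form: a constant zeropoint estimated with iterative sigma clipping. This replaces the polynomial fit the text describes with a degree-zero model, purely for illustration.

```python
import numpy as np

def fit_zeropoint(instr_flux, ref_mag, nsigma=3.0, niter=3):
    """Convert instrumental fluxes to magnitudes, then estimate the
    zeropoint as the sigma-clipped mean of (ref_mag - instr_mag)."""
    instr_mag = -2.5 * np.log10(instr_flux)
    zp = ref_mag - instr_mag            # per-star zeropoint estimates
    keep = np.ones(zp.size, dtype=bool)
    for _ in range(niter):
        centre = np.median(zp[keep])
        scatter = 1.4826 * np.median(np.abs(zp[keep] - centre))  # MAD sigma
        keep = np.abs(zp - centre) <= nsigma * max(scatter, 1e-9)
    return (float(np.mean(zp[keep])),
            float(np.std(zp[keep]) / np.sqrt(keep.sum())))

# nine stars consistent with zeropoint 30.0 plus one 5-mag outlier
flux = np.full(10, 100.0)
ref = np.full(10, 25.0)
ref[0] = 20.0
zp_val, zp_err = fit_zeropoint(flux, ref)
```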


3.2.9.5.5 Exposure Generation <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.9.5.6 PSF Determination <Activity>

The size and shape of the point-spread function is determined from well-isolated, relatively bright stellar sources in the visit image. The procedure is the following:
1. From a list of candidate detected sources, retain only those with fluxes that exceed a relatively bright (configurable) threshold.
2. Measure the second moments of these sources and exclude those that deviate significantly from the central locus (which is assumed to be populated by point sources).
3. Perform PSF photometry on the remaining sources.

INPUTS:
- lists of (exposure, sourceCollection) pairs

OUTPUTS:
- PSF

ALGORITHM (pre-PT1):

1. Select Sources from the Source Collection which are brighter than a Threshold in the PSF Determination Policy.

2. Identify those which are consistent with being a linear superposition of the PSF (which will in general be spatially varying).

3. Iteratively adjust the model of the PSF and the list of input components until a satisfactory PSF model has been arrived at.

EXCEPTIONS:
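The candidate selection in steps 1 and 2 of the procedure can be sketched as a flux cut followed by a robust cut on second-moment size around the stellar locus. The function name, the use of a single scalar size per source, and the MAD-based clip are illustrative assumptions.

```python
import numpy as np

def select_psf_candidates(fluxes, sizes, flux_threshold, nsigma=2.0):
    """Keep sources brighter than flux_threshold whose second-moment
    size lies within nsigma robust sigma of the median size of the
    bright sample (the stellar locus)."""
    fluxes = np.asarray(fluxes, dtype=float)
    sizes = np.asarray(sizes, dtype=float)
    bright = fluxes > flux_threshold
    med = np.median(sizes[bright])
    scatter = 1.4826 * np.median(np.abs(sizes[bright] - med))
    on_locus = np.abs(sizes - med) <= nsigma * max(scatter, 1e-9)
    return bright & on_locus

# three stars, one faint source, one bright galaxy (size well off the locus)
sel = select_psf_candidates(fluxes=[1000, 1200, 900, 50, 1100],
                            sizes=[2.0, 2.1, 2.0, 2.0, 6.0],
                            flux_threshold=100)
```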


3.2.9.5.7 WCS Determination <Activity>

The per-CCD WCS solution is based on the Astrometry.net code. Briefly, the approach is to find asterisms composed of a few stars (typically 4) in the image, and search for similar asterisms (i.e., with similar relative geometry, invariant to position, rotation, and scale) in a reference catalog. This generates hypotheses about where the image might be on the sky. Each hypothesis is checked by predicting where other stars should be found, and evaluating this prediction using Bayesian decision theory. If the image already has a complete WCS, it is possible to skip the first stage and go straight to evaluating whether it is correct. Knowledge about the plate scale and an estimate of the pointing can be used to constrain (and thus speed up) the search.

WCS Determination: given an initial guess at a Wcs (hidden inside an exposure) and a set of sources (sourceSet), use astrometry.net to confirm the Wcs, then calculate distortion terms.

INPUTS:
- policy
- exposure
- sourceSet
- filterName
- trimFlag

OUTPUTS:
- matchList
- WCS

ALGORITHM:
1. Extract an initial-guess WCS if available
2. Do a blind solve if we're told to, or if we don't have an input WCS
3. Generate a list of catalogue objects in the field
4. Generate a list of matching objects
5. Compute SIP distortion corrections

EXCEPTIONS:
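Step 4 (building the matchList once a WCS maps sources onto the sky) reduces to a nearest-neighbour pairing within a match radius. A flat-sky sketch follows; a real matcher would use proper spherical geometry and a spatial index rather than a linear scan, and the function name is illustrative.

```python
import numpy as np

def match_to_catalog(source_radec, catalog_radec, radius_deg):
    """Pair each source (ra, dec) with the nearest catalogue object
    within radius_deg, using a small-angle flat-sky approximation."""
    matches = []
    for i, (ra, dec) in enumerate(source_radec):
        d = np.hypot((catalog_radec[:, 0] - ra) * np.cos(np.radians(dec)),
                     catalog_radec[:, 1] - dec)
        j = int(np.argmin(d))
        if d[j] <= radius_deg:
            matches.append((i, j))
    return matches

sources = np.array([[10.0, 0.0], [11.0, 0.0]])
catalog = np.array([[10.0001, 0.0], [50.0, 50.0]])
matches = match_to_catalog(sources, catalog, radius_deg=0.01)
```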

3.2.9.5.8 WCS Verification <Activity>

ALGORITHM:

EXCEPTIONS:


3.2.10 Moving Object Pipelines (Day and Night)

includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Classify Orbits
· Generate Alerts
· Find Known Objects in Image
· Find Tracklets
· Link Tracklets
· Orbit Maintenance

DayMOPS - Activity Diagram

(Activity diagram: Start DayMOPS → Setup DayMOPS → Intra Night Linking → Inter Night Linking → Orbit Determination → Orbit Management → End DayMOPS. Inputs: diaSources : DIASource; outputs: associatedDiaSources : DIASource, predictedEphemerides : Ephemerides.)

(Activity diagram: Start Mask Moving Objects → Acquire Visit Metadata → Mask Footprints → End Mask Moving Objects. Inputs: inputExposure : CalibratedExposure, movingSources : DIASource; output: outputExposure : Exposure.)

Mask Moving Objects - Activity Diagram

3.2.10.1 Identify Moving Objects <UseCase>

DESCRIPTION: Identify Moving Objects -

BASIC COURSE:

Invoke: Find Known Objects in Image to "clean" the Subtracted Image Source Catalogs, leaving only the sources which have not been matched with an orbit.

Invoke: Find Tracklets to link sources over a small time window (determined by Policy).

Invoke: Link Tracklets to link tracklets over a larger time window. These have enough observations to determine orbits, which are stored in an Orbit Catalog.

Invoke: Orbit Maintenance to do [we're not sure what].


Invoke: Classify Orbits and Generate Alerts.

3.2.10.1.1 Running MOP before Transients <Issue>

The Moving Object Pipeline should be run before finding SNe to reduce noise and errors due to asteroids (assigning them as transients).

3.2.10.2 Mask Moving Objects from Image <UseCase>

DESCRIPTION: Mask Moving Objects from Image -

BASIC COURSE:
Given the coordinates and time of an observation, calculate the ephemerides for objects in the Orbit Catalog which will appear in the Image.

3.2.10.3 Mask Moving Objects Pipeline <Activity>

DESCRIPTION: Mask Moving Objects (MaskMovers) Pipeline handles the masking of moving objects from science CCD Exposures that are to be used for deep detection and multifit measurement. Each slice operates on a single science CCD Exposure as well as (some type of) footprints for all moving objects that appear within that Exposure.

ALGORITHM:
1. Process stage: Acquire Visit Metadata

2. Process stage: MaskFootprints

EXCEPTIONS:
1. MaskMovers policy not found

2. Stage not found

3. Stage terminated in error

3.2.10.3.1 Acquire Visit Metadata <Activity>

ALGORITHM:


EXCEPTIONS:

3.2.10.3.2 Mask Footprints <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.10.4 Compute Coarse Ephemerides for Night <Activity>

DESCRIPTION: Compute Coarse Ephemerides for Night -

INPUTS:
- Solar System Object Catalog
- MJD of current date and time
- Policy
  - LSST Observatory Code

OUTPUTS:
- List of predicted positions at last midnight, next midnight, and the midnight after, for each Solar System Object in the Solar System Object Catalog

ALGORITHM:
1. Compute the MJD of the three midnights.
2. For the Orbit of each Solar System Object in the Solar System Object Catalog:
   2.1 Compute Ephemerides at each midnight MJD

EXCEPTIONS:
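The loop structure above can be sketched as follows. The linear ra/dec-rate "orbit" dict is a hypothetical placeholder for a real orbit propagator; only the control flow (three midnight MJDs, one ephemeris per orbit per midnight) mirrors the algorithm. MJD days begin at midnight UTC, so integral MJDs are midnights.

```python
import math

def coarse_ephemerides(orbits, mjd_now):
    """Evaluate each object's position at the last midnight, the next
    midnight, and the one after. Each orbit is a hypothetical dict with
    a reference position and constant sky rates."""
    last_midnight = math.floor(mjd_now)
    midnights = [last_midnight, last_midnight + 1, last_midnight + 2]
    results = {}
    for orbit in orbits:
        results[orbit["id"]] = [
            (mjd,
             orbit["ra0"] + orbit["ra_rate"] * (mjd - orbit["epoch"]),
             orbit["dec0"] + orbit["dec_rate"] * (mjd - orbit["epoch"]))
            for mjd in midnights]
    return results

orbit = {"id": 1, "ra0": 10.0, "dec0": 0.0,
         "ra_rate": 0.1, "dec_rate": 0.0, "epoch": 55000.0}
res = coarse_ephemerides([orbit], mjd_now=55000.5)
```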

Compute Coarse Ephemerides for Night - Analysis Diagram

(Analysis diagram: Get current time → Get Next Solar System Object → Compute Ephemerides. Classes: Solar System Object Catalog, Solar System Object, Ephemerides, Time. WBS: 02C.03.06 Moving Object Pipelines (Day and Night).)

3.2.10.5 Night MOPS Pipeline <Activity>

DESCRIPTION: Night MOPS - Predict locations of known objects expected to appear in difference image.

INPUTS:
- RA of the current telescope pointing
- Dec of the current telescope pointing
- MJD of current exposure
- NightMOPS Policy
  - Size of LSST FoV (diameter)
  - LSST observatory code
  - Maximum allowable positional error ellipse size

OUTPUTS:
- List of predicted positions of solar system objects in the current exposure FoV (field of view)

ALGORITHM:
Compute current FoV Bounding Box.

Fetch three night positions (i.e. at midnight of the previous night, current midnight, next midnight) for all known Solar System Objects, as produced by "Compute Coarse Ephemerides for Night".


Select Solar System Objects whose Orbit can intersect current FoV bounding Box.

For each Solar System Object from the above step, compute precise Ephemerides (with positional uncertainties) at the MJD of the current exposure.

If the Ephemerides position is inside the current FoV Bounding Box, append it to the return list.

EXCEPTIONS:

No known solar system object: return empty list.

No known solar system orbit can intersect current FoV Bounding Box: return empty list.

Orbit does not have a covariance matrix: use the maximum allowable error ellipse size as the positional uncertainty (in the ephemerides computation).
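The selection logic above can be sketched by interpolating the three stored midnight positions to the exposure MJD and testing against the field of view, padded by the positional uncertainty. The quadratic interpolation, circular FoV test, and data layout are illustrative assumptions; a real implementation would use precise ephemeris computation and spherical geometry.

```python
import numpy as np

def predict_in_fov(night_positions, mjd, fov_center, fov_radius_deg, max_err_deg):
    """night_positions maps object id -> three (mjd, ra, dec) tuples
    (the coarse midnight ephemerides). Interpolate to `mjd` and keep
    objects within fov_radius_deg + max_err_deg of fov_center."""
    ra0, dec0 = fov_center
    selected = []
    for obj_id, samples in night_positions.items():
        mjds, ras, decs = (np.array(c) for c in zip(*samples))
        t0 = mjds[0]  # shift the time origin for a well-conditioned fit
        ra = np.polyval(np.polyfit(mjds - t0, ras, 2), mjd - t0)
        dec = np.polyval(np.polyfit(mjds - t0, decs, 2), mjd - t0)
        sep = np.hypot((ra - ra0) * np.cos(np.radians(dec0)), dec - dec0)
        if sep <= fov_radius_deg + max_err_deg:
            selected.append((obj_id, float(ra), float(dec)))
    return selected

positions = {1: [(55000, 10.0, 0.0), (55001, 10.1, 0.0), (55002, 10.2, 0.0)],
             2: [(55000, 50.0, 50.0), (55001, 50.0, 50.0), (55002, 50.0, 50.0)]}
sel = predict_in_fov(positions, 55000.5, fov_center=(10.0, 0.0),
                     fov_radius_deg=0.1, max_err_deg=0.0)
```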

Night MOPS Pipeline - Analysis Diagram

(Analysis diagram restating the description above. WBS: 02C.03.06 Moving Object Pipelines (Day and Night).)

3.2.10.6 DayMOPS Pipeline <Activity>

DESCRIPTION: DayMOPS Pipeline - generates the predicted ephemerides for the current exposure's sky position

INPUTS:
- exposure metadata
- DiaSources
- Solar System Objects

OUTPUTS:
- moving DiaSources
- predicted Ephemerides

ALGORITHM:
1. Setup DayMOPS

2. Process Stage: Intra Night Linking

3. Process Stage: Inter Night Linking

4. Process stage: Orbit Determination

5. Process stage: Orbit Management

EXCEPTIONS:

3.2.10.6.1 Inter Night Linking <Activity>

Inter Night Linking - Look for inter-night tracks between unprocessed tracklets and tracklets from previous nights.

INPUTS:

OUTPUTS:

ALGORITHM:

EXCEPTIONS:

3.2.10.6.2 Intra Night Linking <Activity>

Intra Night Linking - Link detections from the oldest unprocessed night to form tracklets.

INPUTS:

OUTPUTS:


ALGORITHM:

EXCEPTIONS:

3.2.10.6.3 Orbit Determination <Activity>

Orbit Determination - Attempt to find orbits for all unprocessed tracks

INPUTS:

OUTPUTS:

ALGORITHM:

EXCEPTIONS:

3.2.10.6.4 Orbit Management <Activity>

Orbit Management - Find and merge redundant orbits

INPUTS:

OUTPUTS:

ALGORITHM:

EXCEPTIONS:

3.2.10.6.5 Setup DayMOPS <Activity>

Setup DayMOPS - Determine linking work to be done and set up temporary tables

INPUTS:

OUTPUTS:


ALGORITHM:

EXCEPTIONS:

3.2.11 Object Characterization Pipeline

includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Project a Deep Detection over all overlapping Exposures
· Extract Postage Stamp Exposure Stacks for each Deep Detection
· Fit each Deep Detection's astrometric model on its Exposure Stack
· Use Forced Photometry to Measure the Source on each Exposure in the Exposure Stack

Galaxy Model Generation Pipeline - Activity Diagram

(Activity diagram: Start Multifit Measurement Pipeline → Detect Transforms → Postage Stamp Generation → Multifit → Forced Photometry → End Multifit Measurement Pipeline, within a parallel Per-Detection Processing region. Inputs: detection : Source Collection, calibratedExposures : ExposureStack; outputs: Object Astrometric Model : Astrometric Model, Object Measurements : AstroObjectModel, forcedSources.)


3.2.11.1 Generate Galaxy Models <UseCase>

DESCRIPTION: Generate Galaxy Models

BASIC COURSE:

3.2.11.2 Galaxy Model Generation Pipeline <Activity>

DESCRIPTION: Galaxy Model Generation (formerly Multifit) Pipeline handles the projection of a detection from a coadd onto all the exposures that overlap it, the use of the list of projected detections to input a stack of "postage stamp" images, and the fitting of a model using the multifit algorithm. This includes forced photometry of the resulting shape (prototype Object) on each image of the stack to form ForcedSources. Each slice operates on a single detection. For efficiency, slices should be handed detections that come from the same region of sky.

Unlike normal input stages, the stage for getting a stack of postage stamps on the clipboard determines what to input from data on its clipboard, not from policy.

INPUTS:
- exposure subregion list
- detection model (updated in place)

OUTPUTS:
- ChiSquare
- covarianceMatrix
- forcedSources

ALGORITHM:

1. Process stage: Detect Transforms
2. Process stage: Postage Stamp Generation
3. Process stage: Multifit
4. Process stage: Forced Photometry

EXCEPTIONS:
1. Multifit policy not found

2. Stage not found

3. Stage terminated in error


3.2.11.2.1 Detect Transforms <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.11.2.2 Forced Photometry <Activity>

Measures the model(s) on each Exposure provided in the ExposureStack.

INPUTS:
- Model(s): point source and/or small galaxy model
- ExposureStack

ALGORITHM:

EXCEPTIONS:

3.2.11.2.3 Multifit <Activity>

INPUTS:
- initial model
- ExposureList

OUTPUTS:
- fit model
- double : sgChisq
- double : psChisq

ALGORITHM:

EXCEPTIONS:


3.2.11.2.4 Postage Stamp Generation <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.12 Photometric Calibration Pipeline

includes software programs, configuration files, unit tests, component integration tests, and documentation implementing the Photometric Calibration Pipeline with the following capabilities:

· Photometric Calibration
· Difference Image Forced Photometry

Difference Forced Photometry - Activity Diagram

(Activity diagram: Start Difference Forced Photometry → Acquire Visit Metadata → Measure Sources → End Difference Forced Photometry, with predictedDiaSource : DIASource supplied by the Night MOPS Pipeline (from Moving Object Pipelines (Day and Night)). Inputs: predictedEphemerides : Ephemerides, diffImage : DifferenceExposure; output: forcedDiaSources : DIASource.)

(Activity diagram: Start Photometric Calibration Pipeline → Find Non-gray Extinction → Correct for Atmospheric Extinction → fork to Assess TOA Mag Distributions and Compare Grey Atm with IR Camera Data → Calculate Photometric Calibration SDQA Metrics → End Photometric Calibration Pipeline. Inputs: Objects : AstroObject, Exposures : Calibrated Exposure, Sources.)

Photometric Calibration Pipeline - Activity Diagram

3.2.12.1 Difference Image Forced Photometry <UseCase>

DESCRIPTION: Difference Forced Photometry performs forced photometry on difference images to form Forced DIA Sources.

BASIC COURSE:

3.2.12.2 Recalibrate Data Release Photometry <UseCase>

DESCRIPTION: Photometric Calibration performs the global photometric calibration of all DM data products.

BASIC COURSE:

3.2.12.3 Difference Forced Photometry Pipeline <Activity>

DESCRIPTION: Difference Forced Photometry (formerly DiffPhotom) Pipeline handles forced photometry on difference images to form ForcedDiaSources. Each slice operates on a single CCD difference image as well as the MovingObjects and transient Objects that occur within it.

INPUTS:
* difference Image (DifferenceExposure)
* moving Object Detection (Ephemerides)
* unassociatedObjectList (DetectionCollection)
* calibratedExposure

OUTPUTS:
* forced DiaSource (DIASource)

ALGORITHM:
1. Process stage: Acquire Visit Metadata


2. Process stage: Run NightMOPS Pipeline

3. Process stage: Measure Sources

EXCEPTIONS:
1. DiffPhotom policy not found

2. Stage not found

3. Stage terminated in error

3.2.12.3.1 Acquire Visit Metadata <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.12.3.2 Measure Sources <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.12.4 Photometric Calibration Pipeline <Activity>

DESCRIPTION: Photometric Calibration (PhotoCal) Pipeline handles the global photometric calibration of all data products. It is expected to extract data from the database.

INPUTS (all from the DB):
- Sources
- ForcedSources
- DiaSources
- ForcedDiaSources
- CalibSources
- Objects
- MovingObjects
- Exposures

OUTPUTS:
- updates for all inputs in the DB

ALGORITHM:


1. Process stage: Acquire visit Metadata

2. Process stage: Find Non-gray extinction

3. Process stage: Correct for Atmospheric Extinction

4. Fork:

4.1 Process Stage: Compare Grey Atm with IR Camera Data

4.2 Process Stage: Assess TOA Magnitude Distribution

5. Process Stage: Calculate Photometric Calibration SDQA metrics.

EXCEPTIONS:
1. PhotoCal policy not found

2. Stage not found

3. Stage terminated in error

3.2.12.4.1 Assess TOA Mag Distributions <Activity>

Assess TOA Mag Distributions -

INPUTS:

OUTPUTS:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.2.12.4.2 Calculate Photometric Calibration SDQA Metrics <Activity>

Calculate Photometric Calibration SDQA Metrics -

INPUTS:

OUTPUTS:

ALGORITHM:


EXCEPTIONS:

NOTES:

3.2.12.4.3 Compare Grey Atm with IR Camera Data <Activity>

Compare Grey Atm with IR Camera Data -

INPUTS:

OUTPUTS:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.2.12.4.4 Correct for Atmospheric Extinction <Activity>

Correct for Atmospheric Extinction -

INPUTS:

OUTPUTS:

ALGORITHM:

Query the Object Catalog for all Calibration Standard Objects within the Data Release area, producing the Calibration Standards List.

Perform the specified number of iterations of the entire gray extinction algorithm, as follows:

For each Observing Filter in the List:

1. Select Calibration Objects from the Object Catalog, producing the Calibration Object List
2. Invoke Construct Least Squares System
3. Solve the Least Squares System, resulting in a Gray Extinction Surface for each Exposure
4. Apply the Gray Extinction Surface to Sources in the Source Catalog
5. Update Astro Object summary properties in the Object Catalog

EXCEPTIONS:

Least Squares System fails to solve: throw LSQSolveFailure Exception


NOTES:

Original Use Case notes:

System gets # of iterations and Observing Filter List from Data Release Policy.

Invoke Find Non Gray Extinction

Query the Object Catalog for all Calibration Standard Objects within the Data Release area, producing the Calibration Standards List.

Perform the specified number of iterations of the entire gray extinction algorithm, as follows:

For each Observing Filter in the List:

1. Select Calibration Objects from the Object Catalog, producing the Calibration Object List
2. Invoke Construct Least Squares System
3. Solve the Least Squares System, resulting in a Gray Extinction Surface for each Exposure
4. Apply the Gray Extinction Surface to Sources in the Source Catalog
5. Update Astro Object summary properties in the Object Catalog

Failure: Least Squares System fails to solve: throw LSQSolveFailure Exception
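The per-exposure least-squares step above can be sketched as follows; this is a minimal illustration assuming a planar gray-extinction surface fit in detector (or sky) coordinates, not the actual production algorithm, which may use a higher-order surface. The `LSQSolveFailure` exception mirrors the failure case named above.

```python
import numpy as np

class LSQSolveFailure(Exception):
    """Raised when the least-squares system cannot be solved."""

def fit_gray_extinction(x, y, dmag):
    """Fit a planar gray-extinction surface dmag = a + b*x + c*y by
    least squares over one exposure's calibration sources.

    x, y are source positions; dmag is observed minus standard
    magnitude. Returns the coefficients (a, b, c)."""
    A = np.column_stack([np.ones_like(x), x, y])
    coeffs, _, rank, _ = np.linalg.lstsq(A, dmag, rcond=None)
    if rank < 3:
        raise LSQSolveFailure("Least Squares System fails to solve")
    return coeffs

def apply_gray_extinction(coeffs, x, y, mags):
    """Apply the fitted surface: correct source magnitudes."""
    a, b, c = coeffs
    return mags - (a + b * x + c * y)
```

In the use case above, the solve is run per Observing Filter and the resulting surface applied to every Source of each Exposure before updating Object summary properties.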

3.2.12.4.5 Find Non-gray Extinction <Activity>

Find Non-gray Extinction -

Get the DR time range from the Photometric Calibration Controller

INPUTS:

OUTPUTS:

ALGORITHM:
Retrieve from the Atmospheric Data Catalog a list of all Aux Telescope Spectra that fall within the DR time range, generating the Aux Telescope Spectrum List

For each Spectrum in the List, retrieve from the Object Catalog the SED for the associated Object and attach it to the Spectrum

Aggregate the Spectra in the list into groups for each of which a single atmospheric model will be created, creating a set of Spectrum Lists

Iterate over the set:


1. Get from the Atmospheric Data Catalog the MODTRAN Model Parameters from the previous element of the set, if there is one. If not, use a nominal parameter set from Policy.

2. Initialize the MODTRAN Model with the atmospheric parameters

3. Fit the MODTRAN Model to the Spectra

EXCEPTIONS:

NOTES:
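The chained-seeding iteration above (seed each group's fit with the previous group's fitted parameters, falling back to nominal parameters from Policy for the first group) can be sketched as follows. `fit_model` is a stand-in for the actual MODTRAN fit, and the function signatures are illustrative assumptions:

```python
def fit_atmosphere_models(spectrum_groups, fit_model, nominal_params):
    """Fit one atmospheric model per group of spectra.

    Step 1: seed with the previous group's parameters, or with the
    nominal Policy parameters for the first group.
    Steps 2-3: initialize the model with the seed and fit it to the
    group's spectra via `fit_model(seed, spectra)`."""
    results = []
    previous = None
    for spectra in spectrum_groups:
        seed = previous if previous is not None else nominal_params
        fitted = fit_model(seed, spectra)
        results.append(fitted)
        previous = fitted  # chain the seed forward
    return results
```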

3.2.13 Single Frame Measurement Pipeline

includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Detect and Measure Sources on a Single Frame
· Apply Aperture Correction to Detected Sources
· Convert Coordinates of Detected Sources into Sky Coordinates



act Single Frame Measurement

«structured» Single Frame Source Measurement Pipeline
input: inputExposure (CalibratedExposure)
output: outputSources (Source)

Start Single Frame Measurement
Acquire Visit Metadata
Detect Sources
Measure Sources
Compute Source Sky Coordinates
End Single Frame Measurement

Note: Go to code for partitioning into stages.

Single Frame Measurement - Activity Diagram


3.2.13.1 Measure Single Frame Sources <UseCase>

DESCRIPTION: Measure Single Frame Sources -

BASIC COURSE:

ALTERNATE COURSES:

3.2.13.2 Single Frame Source Measurement Pipeline <Activity>

DESCRIPTION: Single Frame Source Measurement Pipeline - measures Sources on a single frame. Each slice operates on a single science CCD Exposure.

INPUTS:
- calibratedExposure
- PSF

OUTPUTS:
- Sources
- positiveDetection
- background model

ALGORITHM:
1. Process stage: Acquire Visit Metadata

2. Process Stage: Detect Sources

3. Process stage: Measure Sources

4. Process stage: Compute Source Sky Coordinates

EXCEPTIONS:
1. SSF Policy not found

2. Stage not found

3. Stage terminated in error

3.2.13.2.1 Acquire Visit Metadata <Activity>

ALGORITHM:

EXCEPTIONS:


3.2.13.2.2 Compute Source Sky Coordinates <Activity>

ALGORITHM:

EXCEPTIONS:
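The Compute Source Sky Coordinates stage converts detector pixel positions into sky positions using the exposure's WCS. A minimal sketch, assuming a simple linear WCS (CRPIX/CRVAL/CD matrix) as a small-field approximation; production code would use a full FITS WCS with tangent-plane projection and distortion terms:

```python
import math

def pixel_to_sky(x, y, crpix, crval, cd):
    """Convert a pixel position to (RA, Dec) in degrees.

    crpix: reference pixel (x0, y0); crval: (RA0, Dec0) in degrees;
    cd: 2x2 matrix in degrees per pixel. The cos(Dec) factor converts
    the RA offset from great-circle degrees to coordinate degrees."""
    dx, dy = x - crpix[0], y - crpix[1]
    xi = cd[0][0] * dx + cd[0][1] * dy    # intermediate world coords (deg)
    eta = cd[1][0] * dx + cd[1][1] * dy
    dec = crval[1] + eta
    ra = crval[0] + xi / math.cos(math.radians(crval[1]))
    return ra, dec
```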

3.2.13.2.3 Detect Sources <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.13.2.4 Measure Sources <Activity>

ALGORITHM:

EXCEPTIONS:

3.2.14 Science Data Quality Analysis Toolkit

includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Correlate SDQA Metric with Other Data
· Correlate SDQA Metrics
· Select SDQA Display Type
· Analyze SDQA Metrics

3.2.15 Science Data Quality Assessment Pipeline


includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· SDQA Pipeline per Exposure
· Retrieve SDQA Flags and Status for all Exposures
· Compute Selected Summary SDQA Information
· SDQA per Amplifier

act Alert Production SDQA Monitoring

Start Alert Production SDQA Monitoring

End Alert Production SDQA Monitoring

SDQA per Exposure Pipeline

Alert Production SDQA Monitoring - Activity Diagram


act DC3 SDQA Pipeline per Exposure

Begin: DC3 SDQA Pipeline Post-Alert Processing, per Exposure

End: DC3 SDQA Pipeline Post-Alert Processing, per Exposure

DESCRIPTION:
DC3 Post-Alert Production SDQA Processing, per Exposure

INPUTS:

OUTPUTS:

GIVEN:
Completion of the exposure's Alert Production processing

ALGORITHM:

Compute SDQA Metrics for Pipeline Products:
* Check for Kernel Failures
* Compare Median Delivered Seeing with Specification

Compute SDQA Summary Status for Pipeline Products:
* Compute Delivered Seeing Metric Summaries
* Compute Kernel Failure Metric Summaries
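The delivered-seeing metric steps above reduce to computing a summary (e.g. a median) over per-CCD measurements and comparing it with the specification. A minimal sketch; the function name and spec value are illustrative, not the actual SDQA API:

```python
import statistics

def seeing_summary(seeing_fwhm, spec_arcsec):
    """Summarize per-CCD delivered seeing (FWHM, arcsec) for one
    exposure and compare the median with the specification.

    Returns (median_seeing, passes_spec)."""
    median = statistics.median(seeing_fwhm)
    return median, median <= spec_arcsec
```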

EXCEPTIONS:

NOTES:


Name: DC3 SDQA Pipeline per Exposure
Author: Robyn Allsman
Version: 1.0
Created: 3/19/2009 2:19:21 PM
Updated: 12/8/2010 9:23:00 AM

DC3 SDQA Pipeline per Exposure - Activity Diagram


act Generate SDQA Summary for Entire Alert Production Run

Retrieve SDQA Flags and Statuses for all Exposures

Compute Selected Summary SDQA Information

Start: Generate SDQA Summary for Entire Run

End: Generate SDQA Summary for Entire Run

DESCRIPTION:
Calculate the SDQA Summary information over the entire Alert Production Run.

INPUTS:

OUTPUTS:

GIVEN:

ALGORITHM:
Retrieve selected SDQA flags and SDQA status information for all Exposures in the Run.

Generate Summary.

EXCEPTIONS:

NOTES:

Name: Generate SDQA Summary for Entire Alert Production Run
Author: Robyn Allsman
Version: 1.0
Created: 3/19/2009 2:09:58 PM
Updated: 11/3/2010 5:07:14 PM

Generate SDQA Summary for Entire Alert Production Run - Activity Diagram


act SDQA Pipeline per Exposure

Generate SDQA Flags and SDQA Status for Pipeline Products

DESCRIPTION:
Run the SDQA pipeline to retrieve SDQA Metric Rating values, compare them with thresholds, set SDQA Flags, and set the SDQA Status for the Exposure.

INPUTS:
SDQA Metric Rating values and thresholds

OUTPUTS:
SDQA Flags and SDQA Status for the Exposure

GIVEN:
The Exposure has been pipeline-processed, and SDQA Metric Rating values (supplied by the processing pipelines) have been computed and persisted.

ALGORITHM:

Determine if Rating value is "alarmable" (e.g. read from list or table)

If it is, then compare the Rating to the yellow and red thresholds (e.g. from a list or table)

Assign red or yellow alarm SDQA Flag

Take alarm action (e.g. color display, send a page, send an email)

Assign SDQA Status to the Exposure (using the SDQA Flag information and SDQA Status criteria in a list or table)
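The flag-assignment steps above can be sketched as follows; the threshold table layout, flag names, and status roll-up rule are illustrative assumptions, not the actual SDQA schema:

```python
def assign_sdqa_flag(metric, rating, thresholds, alarmable):
    """Assign an SDQA flag for one metric rating.

    Skip non-alarmable metrics, then compare the rating against the
    (yellow, red) threshold pair for this metric. Returns "red",
    "yellow", or "green"."""
    if metric not in alarmable:
        return "green"
    yellow, red = thresholds[metric]
    if rating >= red:
        return "red"
    if rating >= yellow:
        return "yellow"
    return "green"

def assign_exposure_status(flags):
    """Roll per-metric flags up into an exposure-level SDQA status:
    any red flag fails the exposure, any yellow marks it questionable."""
    if "red" in flags:
        return "fail"
    if "yellow" in flags:
        return "questionable"
    return "pass"
```

A red flag would additionally trigger the alarm action described above (color a display, send a page or email).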

EXCEPTIONS:

NOTES:

Begin: Run SDQA Pipeline per Exposure

End: Run SDQA Pipeline per Exposure

Name: SDQA Pipeline per Exposure
Author: Vince Mannings
Version: 1.0
Created: 3/17/2009 3:36:32 PM
Updated: 12/8/2010 9:48:44 AM

Retrieve SDQA Metric Rating Values

SDQA Pipeline per Exposure - Activity Diagram

3.2.15.1 Assess Data Quality <UseCase>

02C.01.02.01 Science Data Quality Assessment Pipeline

DESCRIPTION:

Assess Data Quality -

BASIC COURSE:

Choose:
invoke: Assess Data Quality for Calibration Products
invoke: Assess Data Quality for Data Release
invoke: Assess Data Quality for Nightly Processing at Base
invoke: Assess Data Quality for Nightly Processing at Archive


ALTERNATE COURSES:

uc Assess Data Quality

«Controller» Assess Data Quality for Nightly Processing at Base
«Controller» Assess Data Quality for Data Release
«Controller» Assess Data Quality for Calibration Products
«Controller» Assess Data Quality for Nightly Processing at Archive

Observatory Control System

LSST Operations

SDQA Interactive Environment

Feedback To

Informs

Informs

Informs

Informs

Assess Data Quality - Use Case Diagram

3.2.15.1.1 Assess Data Quality for Nightly Processing at Archive <UseCase>

Assess Data Quality for Nightly Processing at Archive

On completion of a pre-defined number of observing nights or on command by Observatory Operations, the DMS does a complete assessment of the overall state of the LSST Data Products and produces Data Product Quality Reports. This assessment looks at the SRD-required observatory and mission satisfaction metrics, such as fields visited in each filter, % of raw images within photometric/astrometric specifications, etc.

BASIC COURSE:


Choose:
invoke: Analyze Astrometric Quality
invoke: Analyze Image Quality
invoke: Analyze Object Properties Quality
invoke: Analyze Orbit Quality
invoke: Analyze Outliers
invoke: Analyze Photometric Quality

ALTERNATE COURSES:

uc Assess Data Quality for Nightly Processing at Archive

LSST User (from Actors)

«System» Analyze Image Quality
«System» Analyze Photometric Quality
«System» Analyze Astrometric Quality
«System» Analyze Object Properties Quality
«System» Analyze Outliers
«System» Analyze Orbit Quality

«Controller» Assess Data Quality for Nightly Processing at Archive

«invokes»

«invokes»«invokes»

«invokes»

«invokes»

«invokes»

Assess Data Quality for Nightly Processing at Archive - Use Case Diagram

3.2.15.1.1.1 Analyze Astrometric Quality <UseCase>

Analyze Astrometric Quality

BASIC COURSE:

Choose:


invoke: Analyze astrometric solutions in the image WCS; Flag outliers

invoke: Analyze proper motion/parallax solutions in the object database; Flag outliers

ALTERNATE COURSES:

3.2.15.1.1.2 Analyze Image Quality <UseCase>

Analyze Image Quality

BASIC COURSE:

Analyze a wide variety of possible image quality issues:

1. Quality of calibration steps (flatfield, debias, defringe, etc)

2. Analyze artifacts: cosmic rays; CCD traps, bad columns, etc.; satellite trails; stray light

3. Analyze telescope optical performance: PSF shape over the field and associated wavefront parameters

4. Determine atmospheric seeing parameters - including spatial correlation

Flag Outliers

ALTERNATE COURSES:

3.2.15.1.1.3 Analyze Object Properties Quality <UseCase>

Analyze Object Properties Quality

BASIC COURSE:

Analyze the quality of object properties: shape, type classification, deblending, photo-z

Flag Outliers

ALTERNATE COURSES:


3.2.15.1.1.4 Analyze Orbit Quality <UseCase>

Analyze Orbit Quality

BASIC COURSE:

Analyze quality of fit of orbits to observations

Flag Outliers

ALTERNATE COURSES:

3.2.15.1.1.5 Analyze Outliers <UseCase>

Analyze Outliers

BASIC COURSE:

Given a set of object properties identified as "outliers", determine all observations (including calibrations) that have contributed to this set of data, calling on the system's provenance tracking information.

Perform interactive analysis on this chain of data, calling upon all available information from the Facility DB as well as the images involved, the object databases, and information from auxiliary sensors such as the cloud camera

ALTERNATE COURSES:

3.2.15.1.1.6 Analyze Photometric Quality <UseCase>

Analyze Photometric Quality

BASIC COURSE:

Analyze photometric quality using several methods 1. Lightcurve analysis 2. CMD analysis 3. Global consistency of standards

Flag Outliers

ALTERNATE COURSES:


3.2.15.2 SDQA Interactive Environment <UseCase>

02C.01.02.01 Science Data Quality Assessment Pipeline

DESCRIPTION:

SDQA Interactive Environment

BASIC COURSE:

Choose:
invoke: Adjust SDQA Thresholds
invoke: Analyze SDQA Metrics using SQuAT
invoke: Check Basic Integrity of Catalog
invoke: Check Coadd QA Diagnostics
invoke: Compare Uncertainties in Object Properties with Expectation
invoke: Compute Completeness and Reliability
invoke: Correlate SDQA metric with other data
invoke: Correlate SDQA metrics
invoke: Examine Data Quality Status
invoke: Generate Data Quality Alarm
invoke: Modify Operations based on Data Quality Status
invoke: Override Data Quality Status/Add Comments
invoke: Perform Large-Scale Sanity Checks
invoke: Present Default Detailed SDQA Information
invoke: Present Default Summary SDQA Information
invoke: Review Data for Specific Period of Time
invoke: Review Data from Facilities Database
invoke: Review Default SDQA Data
invoke: Review Histograms
invoke: Review Histograms Generic
invoke: Review Time Series
invoke: Review data for specific region of focal plane
invoke: Select SDQA Display Type
invoke: Select SDQA Task
invoke: Summarize Quality of Observing Conditions
invoke: View Calibration Data
invoke: View Data from Ancillary Telescope
invoke: View Processed Image
invoke: View Raw Exposure Image
invoke: View Selected Catalog Data

ALTERNATE COURSES:


uc SDQA Analysis Overview

SDQA Analyst

A

Analyze SDQA Metrics using SQuAT

Override Data Quality Status/Add Comments

Modify Operations Based on Data Quality Status

Adjust SDQA Thresholds

Examine Data Quality Status

Generate Data Quality Alarm

Present Default Detailed SDQA Information

Present Default Summary SDQA Information

Observatory Operations (Camera Scientist/Pipeline Controller/Scheduler)

Post DC3 Specification Changes:

Post-Alert Production Processing merges:
* Compute Metrics for Pipeline Products
* Compute Selected Summary SDQA Information

These operations should probably be performed as a Pipeline run on completion of the Alert Production.

Post-Alert Production SDQA Processing and Alert Production SDQA Processing are ACTIVITIES in DC3/Pipelines and Productions/SDQA/Alert Production/

Name: SDQA Analysis Overview
Author: Debra Levine
Version: 1.0
Created: 3/19/2009 11:44:31 AM
Updated: 12/8/2010 9:22:59 AM

DC3 SDQA per Exposure Pipeline

SDQA per Exposure Pipeline

«precedes»

«precedes»

«precedes»

«precedes»

«precedes»

«precedes»

«precedes»

«invokes»

«precedes»

«invokes»

«precedes»

«precedes»

«precedes»

SDQA Analysis Overview - Use Case Diagram


uc DC3 SDQA Analysis Overview

SDQA Analyst

A

Analyze SDQA Metrics using SQuAT

Name: DC3 SDQA Analysis Overview
Package: Science Data Quality Assessment Pipeline
Version: 1.0
Author: Robyn Allsman

DC3 SDQA Analysis Overview - Use Case Diagram


uc Review Data Release

Generate Data Quality Alarm

Compute Completeness and Reliability

Perform Large-Scale Sanity Checks

Compare Uncertainties in Object Properties with Expectation

Summarize Quality of Observing Conditions

Check Basic Integrity of Catalog

Check Coadd QA Diagnostics

SDQA Analyst

Name: Review Data Release
Package: Science Data Quality Assessment Pipeline
Version: 1.0
Author: Robyn Allsman

Review Data Release - Use Case Diagram


uc Analyze SDQA Data using SQuAT

SDQA Analyst

Select SDQA Display Type

Review Time Series
Review Histograms
Review data for specific region of focal plane
Review Data for Specific Period of Time

View Processed Image

View Raw Exposure Image

Review Customized Statistical Summary

View Data from Ancillary Telescope

View Selected Catalog Data

View Calibration Data

Review Data from Facilities Database

Correlate SDQA metrics

Correlate SDQA metric with other data

Review Histograms Generic

Placeholder for Additional Use Cases

A

Analyze SDQA Metrics using SQuAT

«Manual» Review Default SDQA Data

«invokes»

«invokes»

«invokes»

«invokes»«invokes»

«invokes»

«invokes»

«invokes»

«invokes»

«invokes»

«invokes» «invokes»«invokes» «invokes»

«invokes»

Analyze SDQA Data using SQuAT - Use Case Diagram

3.2.15.2.1 Observatory Operations (Camera Scientist/Pipeline Controller/Scheduler) <Actor>

This Actor represents any automated or human process that occurs when data quality is judged to be non-nominal. Examples include changing the observing cadence, troubleshooting the camera or telescope, omitting data from the pipeline, flagging catalog data as non-optimal, troubleshooting the pipeline, writing documentation, etc.

3.2.15.2.2 Adjust SDQA Thresholds <UseCase>

Adjust SDQA Thresholds

INPUTS:

OUTPUTS:

GIVEN:


BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.2.15.2.3 Analyze SDQA Metrics using SQuAT <UseCase>

Analyze SDQA Metrics using SQuAT

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

System displays the Task Selection Page.

SDQA Analyst selects the SDQA Task "Analyze SDQA Metrics".

System displays the SDQA Metric Set Up Page.

SDQA Analyst selects the SDQA Metrics, Sky Region, Focal Plane Region, Time Range, and output format.

System: queries the SDQA Data Archive, generates SDQA Results, formats them for output, and displays them on the SDQA Results Page.

ALTERNATE COURSES:

1.1 SDQA Analyst selects an SDQA Metric that has not yet been populated (e.g., the time range is later than the last data release and the metrics are data-release oriented).

1.2 System generates partial SDQA Results and displays warning that not all SDQA Results are available.

2.1 SDQA Analyst reselects results filter and output format to analyze SDQA Results.

2.2 System displays filtered SDQA Results.


analysis Analyze SDQA Metrics using SQuAT

Activities: display task selection; display SDQA metric set up; display SDQA Results; format output; query database; generate SDQA Results; display partial SDQA results; filter SDQA results; display filtered SDQA Results

Pages (from SDQA Metrics): SQuAT: Task Selection Page; SQuAT: SDQA Metric Set Up Page; SQuAT: SDQA Results Page; SQuAT: Partial SDQA Results Page; SQuAT: SDQA Results Filter Page; SQuAT: Filtered SDQA Results Page


Object instances: :SDQA Metric, :Sky Region, :Focal Plane Region, :Time Range, :Output Format

SDQA Analyst

Name: Analyze SDQA Metrics using SQuAT
Author: Russ Laher
Version: 1.0
Created: 8/25/2008 10:53:42 AM
Updated: 12/8/2010 9:23:00 AM

Analyze SDQA Metrics using SQuAT - Analysis Diagram

3.2.15.2.3.1 display filtered SDQA Results <Object>
Implementation approach: Java classes.

3.2.15.2.3.2 display partial SDQA results <Object>
Implementation approach: Java classes.

3.2.15.2.3.3 display SDQA metric set up <Object>


Implementation approach: Java Swing classes and database stored functions.

3.2.15.2.3.4 display SDQA Results <Object>
Implementation approach: Java classes.

3.2.15.2.3.5 display task selection <Object>
Implementation approach: Java Swing classes.

3.2.15.2.3.6 filter SDQA results <Object>
Implementation approach: Java classes.

3.2.15.2.3.7 format output <Object>
Implementation approach: Java Swing classes and Java JFreeChart library.

3.2.15.2.3.8 generate SDQA Results <Object>
Implementation approach: Java Swing classes and database stored functions.

3.2.15.2.3.9 query database <Object>
Implementation approach: Database stored function(s).

3.2.15.2.4 Check Basic Integrity of Catalog <UseCase>

Check Basic Integrity of Catalog

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:
For example: retrieve, plot and summarize seeing and sky brightness throughout the period covered by the Data Release.

3.2.15.2.5 Check Coadd QA Diagnostics <UseCase>

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

Review QA diagnostics for coadded quantities in catalog.

Two flavors:

1. All QA diagnostics generated during pipeline processing

2. QA diagnostics included in catalog (subset/summary of above)

ALTERNATE COURSES:

NOTES:

3.2.15.2.6 Compare Uncertainties in Object Properties with Expectation <UseCase>

Compare Uncertainties in Object Properties with Expectation

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

Review uncertainties (summaries thereof) on photometry, astrometry, shapes, etc. Do we have precision expected for depth reached? Are we on track to meet the SRD requirements? (Probably checked frequently, not just at data release.)


3.2.15.2.7 Compute Completeness and Reliability <UseCase>

Compute Completeness and Reliability

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:
Determine completeness and reliability of the Catalog for both point sources and extended sources.

ALTERNATE COURSES:

NOTES:

3.2.15.2.8 Correlate SDQA metric with other data <UseCase>

Correlate SDQA metric with other data

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:
Correlate SDQA metric with other information, e.g. seeing from the ancillary telescope, voltages from the facilities database, number of sources extracted for a given exposure from the DM database, etc.

3.2.15.2.9 Correlate SDQA metrics <UseCase>

Correlate SDQA Metrics

INPUTS:


OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

Correlate SDQA metrics with each other: e.g. noise level with number of poor extractions.
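Correlating two metric series, such as per-exposure noise level against number of poor extractions, typically starts with a correlation coefficient. A minimal Pearson-correlation sketch using only the standard library; a real implementation would pull the series from the SDQA database:

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length SDQA metric series.

    Returns a value in [-1, 1]; +1 means the metrics rise together,
    -1 means one rises as the other falls."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    sd_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (sd_a * sd_b)
```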

3.2.15.2.10 Examine Data Quality Status <UseCase>

Examine Data Quality Status

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:
Either the actor queries to analyze data quality results, or in response to an alarm examines the data quality that raised the alarm.

ALTERNATE COURSES:

NOTES:

3.2.15.2.11 Generate Data Quality Alarm <UseCase>

Generate Data Quality Alarm

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:
If a DQ metric exceeds thresholds by certain amounts in specific cases, the Observatory Operations Actor should be sent an alarm. This could be something colored red on a display, or a page sent to a list of specific recipients.

ALTERNATE COURSES:


No alarm is generated.

NOTES:

3.2.15.2.12 Modify Operations Based on Data Quality Status <UseCase>

Modify Operations Based on Data Quality Status

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:
If data are not of adequate quality, initiate action as described under the Observatory Operations Actor.

ALTERNATE COURSES:

NOTES:

3.2.15.2.13 Override Data Quality Status/Add Comments <UseCase>

Override Data Quality Status/Add Comments

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:
A function of SQuAT (TBD).

If the automatically assigned quality should be overridden, use the tool to assign a new quality to an image, set of images, catalog entry (source/object), or set of catalog entries. Apply a comment.

ALTERNATE COURSES:

Selection not available, or user has insufficient privilege to perform the task.

NOTES:

3.2.15.2.14 Perform Large-Scale Sanity Checks <UseCase>


Perform Large-Scale Sanity Checks

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:
Using large numbers of catalog entries, perform statistical and astrophysical sanity checks.

ALTERNATE COURSES:

NOTES:

3.2.15.2.15 Placeholder for Additional Use Cases <UseCase>

Placeholder for Additional Use Cases

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.2.15.2.16 Present Default Detailed SDQA Information <UseCase>

Present Default Detailed SDQA Information

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:
Display information in a user-friendly format.


3.2.15.2.17 Present Default Summary SDQA Information <UseCase>Present Default Summary SDQA Information

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

Display summary information in a human-friendly format.

3.2.15.2.18 Review Customized Statistical Summary <UseCase>Review Customized Statistical Summary

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:Not in DC3

For selected metric, region of focal plane and time range, summary statistics are presented.

ALTERNATE COURSES:

NOTES:Example: list the following

a color-coded pass/fail icon
median PSF width
design specification for median PSF width
difference (measured - design)
mean, standard deviation, etc.


3.2.15.2.19 Review Data for Specific Period of Time <UseCase>Review Data for Specific Period of Time

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

Specify a metric and a time range.
Display the resulting data as a plot or image.

ALTERNATE COURSES:

NOTES:

3.2.15.2.20 Review data for specific region of focal plane <UseCase>DESCRIPTION:Review data for specific region of focal plane

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:Not in DC3

Select region of focal plane.
Select metric or metrics.
Plot or display in image form.

ALTERNATE COURSES:

NOTES:

3.2.15.2.21 Review Data from Facilities Database <UseCase>Review Data from Facilities Database


INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:Not in DC3

Select data from FDB to view.
Optionally select metric to view with FDB info.
Plot or display as image.

ALTERNATE COURSES:

NOTES:

3.2.15.2.22 Review Default SDQA Data <UseCase>Manually review Default SDQA Data

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

Using SQuAT, select a standard canned report.

Review the report.

Go into 'Analyze SDQA Metrics' if more investigation is needed.

ALTERNATE COURSES:

NOTES:

3.2.15.2.23 Review Histograms <UseCase>DESCRIPTION:Review Histograms

INPUTS:


OUTPUTS:

GIVEN:

BASIC COURSE:For selected metric, region of focal plane and time range, plot histogram(s) and indicate the requirement threshold as an overlay.

ALTERNATE COURSES:Example: make a histogram of point-source PSF width. Overlay lines on the plot that correspond to the determined median and the design specification from the Science Requirements Document.

NOTES:
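The histogram-with-overlay example above can be sketched numerically. This is a minimal illustration with invented PSF-width values and an assumed design-specification threshold; it computes the bin counts and the median that the overlay lines would mark, not an actual SQuAT query.

```python
import statistics

# Hypothetical sample of point-source PSF widths in arcseconds; in practice
# these would come from an SDQA metrics query for the selected region of the
# focal plane and time range.
psf_widths = [0.62, 0.64, 0.65, 0.66, 0.67, 0.68, 0.69, 0.70, 0.71, 0.73]

DESIGN_SPEC = 0.70  # assumed design-specification value for median PSF width

def bin_counts(values, lo, hi, nbins):
    """Histogram counts over [lo, hi) with nbins equal-width bins."""
    width = (hi - lo) / nbins
    counts = [0] * nbins
    for v in values:
        i = int((v - lo) / width)
        if 0 <= i < nbins:
            counts[i] += 1
    return counts

counts = bin_counts(psf_widths, 0.60, 0.76, 4)
median = statistics.median(psf_widths)
# A plot would overlay vertical lines at `median` and `DESIGN_SPEC`.
within_spec = median <= DESIGN_SPEC
```

The pass/fail icon in the summary report would simply reflect `within_spec`.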

3.2.15.2.24 Review Histograms Generic <UseCase>Review Histograms Generic

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:Not in DC3

This is a placeholder for an idea to present QA information graphically, for example, plot the average PSF measure as color/brightness vs. time horizontally and CCD vertically.

ALTERNATE COURSES:

NOTES:

3.2.15.2.25 Review Time Series <UseCase>Review Time Series

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:For selected metric, region of focal plane and time range, time series plots are displayed with


requirement threshold overlay.

ALTERNATE COURSES:

NOTES:Example: plot point-source PSF width as a function of time. Overlay a line on the plot that corresponds to the design specification from the Science Requirements Document.
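The time-series check in the example above reduces to flagging epochs that cross the overlaid threshold. A minimal sketch with made-up data and an assumed threshold value:

```python
# Illustrative time series of PSF width per exposure; flag epochs that exceed
# the (assumed) design-specification threshold, which the plot would draw as a
# horizontal overlay line.
DESIGN_SPEC = 0.70  # assumed threshold, arcseconds

# (time offset, PSF width) pairs -- invented values for illustration only
series = [(0.00, 0.66), (0.01, 0.69), (0.02, 0.72), (0.03, 0.68), (0.04, 0.75)]

violations = [t for t, width in series if width > DESIGN_SPEC]
fraction_ok = 1.0 - len(violations) / len(series)
```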

3.2.15.2.26 Select SDQA Display Type <UseCase>Select SDQA Display Type.

INPUTS:

OUTPUTS:

GIVEN:Analyst is using SQuAT to perform these tasks

BASIC COURSE:

Select metric of interest (e.g. DC3 SDQA metric).

Select region of focal plane (e.g. segment, CCD, raft, or full focal plane).

Select time range (e.g. full night, one exposure, etc.).

Select display type (histogram, xy plot, image).

Display chosen metric in chosen style.

ALTERNATE COURSES:

NOTES:

3.2.15.2.27 Specify SDQA Task <UseCase>Specify SDQA Task

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:


On the SQuAT Task Selection Page, SDQA Analyst selects an SDQA Task from a menu and then clicks the apply button. System responds by invoking the Set Up and Execute SDQA Task.

ALTERNATE COURSES:Fault occurs: Try again later.

NOTES:

Specify SDQA Task - Analysis Diagram

[Diagram summary: The SDQA Analyst moves through the SQuAT pages Task Selection, SDQA Metric Set Up, SDQA Results, Partial SDQA Results, Filter SDQA Results Selection, and Filtered SDQA Results, via the steps display Task Selection, display SDQA Metric Set Up, generate SDQA Results, format SDQA Results, and filter SDQA Results, operating on the SDQA Metric and SDQA Results objects. Name: Specify SDQA Task; Author: Russ Laher; Version: 1; Created: 8/21/2008 4:06:26 PM; Updated: 4/9/2009 2:43:35 PM.]

3.2.15.2.27.1 SDQA Results <Object>Internal data structure that holds SDQA database query returned results.


3.2.15.2.28 Summarize Quality of Observing Conditions <UseCase>DESCRIPTION:Summarize Quality of Observing Conditions

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:For example, retrieve, plot, and summarize seeing and sky brightness throughout the period covered by the Data Release.

3.2.15.2.29 View Calibration Data <UseCase>View Calibration Data

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.2.15.2.30 View Data from Ancillary Telescope <UseCase>View Data from Ancillary Telescope

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:


ALTERNATE COURSES:

NOTES:Not in DC3

3.2.15.2.31 View Processed Image <UseCase>View Processed Image

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:
Select processed image or group of images.
Display image.
Manipulate display parameters, or convert to FITS and pass to a FITS viewer.

ALTERNATE COURSES:

NOTES:

3.2.15.2.32 View Raw Exposure Image <UseCase>View Raw Exposure Image

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:
Select a raw image or group of raw images.
Display image.
Manipulate display parameters (e.g. color table, range stretch, flux value for selection); alternatively, convert to FITS and pass to DS9.

ALTERNATE COURSES:

NOTES:

3.2.15.2.33 View Selected Catalog Data <UseCase>


View Selected Catalog Data

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.2.16 Community Science Subsystems

Community Science Subsystems

* LSST Science Use Cases
* Example Complex Science Use Cases
* Example Simple Science Use Cases

Community Science Packages - Use Case Diagram


uc Community Science Packages

[Diagram summary: The Science User (from Actors) invokes use cases from the following packages.]

Example Complex Science Use Cases:
+ Calculate the Two Point Correlation Function of Galaxy Groups
+ Classify / Analyze Eclipsing Binaries
+ Search For Microlensed SN
+ Search For Planetary Transits
+ Search for SN Light Echoes

Example Simple Science Use Cases:
+ Execute Cone Search
+ Find All g-band Stacked Images That Cover RA/DEC Region
+ Find All Galaxies With Given Properties
+ Find All RR Lyrae Within RA/DEC Region
+ Query Catalog
+ Query Image Archive

LSST Science Use Cases:
+ Analyze Color-Color Diagram
+ Create Cleaned Color-Magnitude Diagram
+ Create Color-Magnitude Diagram
+ Create Corrected Color-Magnitude Diagram
+ Create Stellar Color-Color Diagram
+ Derive Galaxy Luminosity Function
+ Derive Stellar Luminosity Function
+ Discover groups and clusters of galaxies
+ Extract Time Sequence of Images
+ Extract Time Series for Objects
+ Find all Lensed Quasar Candidates
+ Generate photometric redshift for a galaxy

Query Services (from Science Database and Data Access Services):
+ SQL Syntax
+ Formulate and Submit Query
+ Process Query

Community Science Packages - Use Case Diagram

3.2.16.1 LSST Science Use Cases

LSST Science Use Cases

LSST Science Use Cases - Use Case Diagram


uc LSST Science Use Cases

[Diagram summary: The LSST User invokes the «Business» use cases Extract Time Series for Objects, Create Color-Magnitude Diagram, Create Cleaned Color-Magnitude Diagram, Create Corrected Color-Magnitude Diagram, Create Stellar Color-Color Diagram, Analyze Color-Color Diagram, Derive Stellar Luminosity Function, Derive Galaxy Luminosity Function, Generate photometric redshift for a galaxy, and Find all Lensed Quasar Candidates. A «web page» Get Color Magnitude collects the inputs: Sky Region, Color, Magnitude, Output format (ASCII, FITS, graphic), CMD Processing Level {raw, corrected, cleaned}, URI of External Input Table for Cleaned CMD, and Star Certainty Percent. Name: LSST Science Use Cases; Package: LSST Science Use Cases; Version: 1.0; Author: LSST_EA user.]

LSST Science Use Cases - Use Case Diagram


3.2.16.1.1 Analyze Color-Color Diagram <UseCase>

Analyze Color-Color Diagram

BASIC COURSE:

System:
invokes the use case "Create Color-Color Diagram"
obtains LF of unresolved galaxies
cleans faint end of the color-color diagram (statistically) using the LF of unresolved galaxies
runs isochrone fitting algorithm
derives ages, metallicities, distance, reddening for stellar populations

ALTERNATE COURSES:

System returns "Error" screen if isochrone fitting algorithm fails to find a solution

3.2.16.1.2 Create Cleaned Color-Magnitude Diagram <UseCase>

Create Cleaned Color-Magnitude Diagram

BASIC COURSE:

User requests Cleaned Color-Magnitude Diagram

System:
obtains Luminosity Function (LF) of unresolved galaxies
cleans faint part of the CMD (statistically) using the LF of unresolved galaxies
displays the Create Color Magnitude Result Screen, which contains the URL pointer to the files and/or diagram with the CMD

ALTERNATE COURSES:

System displays Create Color Magnitude Error Screen, if LF not found

Create Cleaned Color-Magnitude Diagram Robustness - Analysis Diagram


analysis Create Cleaned Color-Magnitude Diagram Robustness

[Diagram summary: restates the basic and alternate courses above, with analysis steps obtainLF and clean, and an attached open issue "can we do this?".]

Create Cleaned Color-Magnitude Diagram Robustness - Analysis Diagram

3.2.16.1.2.1 can we do this? <Issue>Is the LF for galaxies available?


3.2.16.1.3 Create Color-Magnitude Diagram <UseCase>

Create Color-Magnitude Diagram

BASIC COURSE:

User specifies on the Create Color Magnitude Input Screen:
sky region boundaries
color and magnitude for CMD
star certainty percent
return format (ASCII file, FITS table, diagram, etc.)

System:
validates user input parameters
queries the Object Catalog to produce a ColorMagnitudeTable that satisfies the constraints, for objects whose classification is stellar at equal or greater than the star certainty percent
generates the requested output format from the ColorMagnitudeTable
displays the Color Magnitude Result Screen, which displays the graphic and optionally contains the URI pointer to the files

ALTERNATE COURSES:

System displays the Create Color Magnitude Error Screen if sky region boundaries are not provided or are incorrect, or if CMD axes are not defined.

System displays Sky Region Too Large Screen (too many sources)?

Create Color-Magnitude Diagram Robustness - Analysis Diagram


analysis Create Color-Magnitude Diagram Robustness

[Diagram summary: GetUserInput and ValidateInputParameters act on the Color Magnitude Input Webpage; DisplayError routes to the Color Magnitude Error Webpage; generate Color Magnitude Table queries the catalog; ConvertToReturnFormat writes to the Filesystem; DisplayResult presents the Color Magnitude Result Webpage.]

Create Color-Magnitude Diagram Robustness - Analysis Diagram

3.2.16.1.3.1 generate Color Magnitude Table <Object>SELECT where .....
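The model leaves the SELECT elided. For illustration only, the query and its effect might look like the sketch below; the table, column names, and schema are assumptions, not the actual LSST schema, and the pure-Python filter just mirrors what such a query would return.

```python
# Hypothetical stand-in for the elided SELECT: filter Object Catalog rows to a
# sky region, keep sufficiently star-like objects, and emit (color, magnitude)
# pairs. All names here are invented for illustration.
QUERY_SKETCH = """
SELECT magG - magR AS color, magR AS magnitude
FROM ObjectCatalog
WHERE ra BETWEEN :raMin AND :raMax
  AND decl BETWEEN :decMin AND :decMax
  AND starProbability >= :starCertainty
"""

def color_magnitude_table(rows, ra_range, dec_range, star_certainty):
    """Pure-Python equivalent of the query sketch; rows are
    (ra, dec, mag_g, mag_r, star_probability) tuples."""
    (ra_min, ra_max), (dec_min, dec_max) = ra_range, dec_range
    return [
        (g - r, r)
        for ra, dec, g, r, p in rows
        if ra_min <= ra <= ra_max
        and dec_min <= dec <= dec_max
        and p >= star_certainty
    ]

rows = [
    (10.1, -5.0, 21.3, 20.8, 0.95),  # star-like, inside region -> kept
    (10.2, -5.1, 19.9, 19.1, 0.40),  # likely a galaxy -> excluded
    (40.0, -5.0, 18.0, 17.5, 0.99),  # outside the RA range -> excluded
]
cmd = color_magnitude_table(rows, (10.0, 11.0), (-6.0, -4.0), 0.9)
```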

3.2.16.1.4 Create Corrected Color-Magnitude Diagram <UseCase>

Create Corrected Color-Magnitude Diagram

BASIC COURSE:

User requests a Corrected Color-Magnitude Diagram from a given ColorMagnitudeTable

System invokes Create Color-Magnitude Diagram:
get CMD Analysis Policy from Policy Library
bin stellar photometry by magnitude
correct raw ColorMagnitudeTable: collect completeness data from Detection Efficiency Catalog; statistically correct raw Color-Magnitude table
generate the requested output format from the ColorMagnitudeTable
display the Corrected CMD Result Screen, which contains the URL pointer to the files and/or diagram with the CMD

ALTERNATE COURSES:


displays Corrected CMD Error Screen on error

Create Corrected Color-Magnitude Diagram Robustness - Analysis Diagram


analysis Create Corrected Color-Magnitude Diagram Robustness

[Diagram summary: restates the basic and alternate courses above, with analysis steps getPolicy, binByMagnitude, correctCMD, prepareResult (to the Corrected CMD Result Screen and Filesystem), and handleError (to the Corrected CMD Error Screen).]

Create Corrected Color-Magnitude Diagram Robustness - Analysis Diagram


3.2.16.1.5 Create Stellar Color-Color Diagram <UseCase>

Create Stellar Color-Color Diagram

BASIC COURSE:

User specifies on Color Color Input Screen:
sky region boundaries
colors for X- and Y-axes
star certainty percent
return format (ASCII file, FITS table, diagram, etc.)

System:
validates user input parameters
queries the Object Catalog to produce a ColorColorTable that satisfies the constraints, for objects whose classification is stellar at >= the star certainty percentage
generates the requested output format from the ColorColorTable
displays the Color Color Result Screen, which contains the URI pointer to the files and/or color-color diagram

ALTERNATE COURSES:

System displays the "Incorrect input" screen if sky region boundaries are not provided or are incorrect, or if axes are not defined.

System warns if the sky region is too large (too many sources)?

Create Stellar Color-Color Diagram Robustness - Analysis Diagram


analysis Create Stellar Color-Color Diagram Robustness

[Diagram summary: restates the basic and alternate courses above, with analysis steps GetUserInput and ValidateInputParameters on the Color Color Input Screen, doQuery, ConvertToReturnFormat (to the Filesystem), DisplayResult (Color Color Result Screen), and DisplayError (Color Color Error Screen).]

Create Stellar Color-Color Diagram Robustness - Analysis Diagram

3.2.16.1.6 Derive Galaxy Luminosity Function <UseCase>


Derive Galaxy Luminosity Function

BASIC COURSE:

User specifies:
sky region boundaries
filter information
output format (table, plot, etc.)

System:
query object catalog; retrieve the photometry for objects satisfying the sky region and filter constraints
bin photometry by magnitude
perform a luminosity function completeness analysis: collect completeness data from Detection Efficiency Catalog; correct binned data using completeness data; fit model to observed data and corrected data; infer completeness data from observed data and correct; compare completeness corrections from the efficiency pipeline with those inferred from binned data, and raise a flag if significantly different
obtain the best luminosity function; fit analytic model to the distribution; derive IMF

shows a Luminosity Function Result screen, which has the output as specified by the user

ALTERNATE COURSES:

system shows Luminosity Error Result screen if:
the specified sky region is invalid (too large, too small, etc.)
filter information is not provided

system shows Luminosity No Result screen if: there are no objects in the specified sky region/filter combination

3.2.16.1.7 Derive Stellar Luminosity Function <UseCase>

Derive Stellar Luminosity Function

BASIC COURSE:

User specifies:
sky region boundaries
filter information
output format (table, plot, etc.)


System:
query object catalog; retrieve the photometry for objects satisfying the sky region and filter constraints
bin stellar photometry by magnitude
perform a luminosity function completeness analysis: collect completeness data from Detection Efficiency Catalog; correct binned data using completeness data; fit model to observed data and corrected data; infer completeness data from observed data and correct; compare completeness corrections from the efficiency pipeline with those inferred from binned data, and raise a flag if significantly different
obtain the best luminosity function; fit analytic model to the distribution; derive IMF

shows a Luminosity Function Result screen, which has the output as specified by the user

ALTERNATE COURSE:

system shows Luminosity Error Result screen if:
the specified sky region is invalid (too large, too small, etc.)
filter information is not provided

system shows Luminosity No Result screen if: there are no objects in the specified sky region/filter combination
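The completeness-analysis step above amounts to dividing the raw per-magnitude-bin counts by the detection efficiency for that bin. A minimal sketch with invented numbers; real efficiencies would come from the Detection Efficiency Catalog:

```python
# Illustrative per-magnitude-bin counts and detection efficiencies
raw_counts = {20: 100, 21: 80, 22: 40}       # objects counted per bin
efficiency = {20: 1.00, 21: 0.80, 22: 0.50}  # fraction of objects detected

def correct_counts(raw, eff):
    """Completeness-corrected counts; skips bins with zero efficiency."""
    return {m: raw[m] / eff[m] for m in raw if eff.get(m, 0) > 0}

corrected = correct_counts(raw_counts, efficiency)
# corrected counts are the input to the subsequent analytic model fit
```

Comparing `corrected` against a model fit to the raw counts is what would trigger the "significantly different" flag mentioned above.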

3.2.16.1.8 Discover groups and clusters of galaxies <UseCase>

Discover groups and clusters of galaxies

BASIC COURSE:

User specifies:
sky region boundaries
output format (table, plot, etc.)

System:
select all galaxies in the specified region from the Object Catalog
characterize the distribution of galaxies in terms of the 2-point correlation function
identify local maxima in the distribution
check if the local maxima in the source density distribution are statistically significant
extract photo-Zs from the Object Catalog
estimate distances with photo-Zs and remove outliers
correct local maximum density with completeness data from efficiency pipeline
calculate richness of overdensity; classify as group/cluster and how rich


shows a Galaxy List screen, which has the output as specified by the user

ALTERNATE COURSES:

system shows a Galaxy Discovery FYI Screen if:
there is no photo-Z in the object catalog
no galaxies are found
there are no data for the specified sky region

3.2.16.1.9 Extract Time Sequence of Images <UseCase>

Extract Time Sequence of Images

BASIC COURSE:

User enters the following in the Image Sequence User Input screen:
Sky Region
Observing Filter
a prefix to be used in naming the returned URL
a format for the image cited in the returned URL

System:
validates the specified sky region and filter
queries the image metadata catalogue using the parameters from the Image Sequence User Input screen
retrieves the CCD Image list which overlays the user's specified parameters
extracts the CCD Image Collection from the LSST Image Archive
converts the extracted CCD Image Collection into the format specified, and renames the images to include prefix, time, and filter
places the converted images into a parent directory
displays an Image Sequence Success screen which shows: a) a URI for the directory containing the converted images; b) the number of files in that directory; c) a disclaimer that sampling is irregular in space and time

ALTERNATE COURSES:

sky region invalid

observing filter invalid

sky region completely outside LSST survey area: Image Sequence Failure screen returns "sky region not in data" message


system unavailable: Image Sequence Failure screen returns "system unavailable" message

Extract Time Sequence of Images Robustness - Analysis Diagram

[Diagram summary: Get Region, Filter, etc. from the Image Sequence User Input Screen; Validate Sky Region and Validate Observing Filter (bad input routes "bad sky region" or "bad filter selection" notices to the Image Sequence Failure Screen, as does "no images found"); get CCD image IDs; Get CCD images and metadata; Convert and Rename Image Files; Store in Directory; Display URI, File Count, and Disclaimer on the Image Sequence Success Screen.]

3.2.16.1.10 Extract Time Series for Objects <UseCase>

Extract Time Series for Objects

BASIC COURSE:

User enters the following in the GUI screen "Get objects":
sky region boundaries OR LSST object identifier(s)
filter information
time interval


output format (file, diagram, etc.)
quantities to return

System gets time series from object catalog and returns quantities specified by user, in user-specified format.

ALTERNATE COURSES:

System returns "System unavailable" screen, if unavailable

System returns "No object found" screen, when no object satisfies search criteria

3.2.16.1.10.1 Source vs. Object? <Issue>Object Catalog should have time series for all objects!

3.2.16.1.11 Find all Lensed Quasar Candidates <UseCase>

Find all Lensed Quasar Candidates

BASIC COURSE:

User specifies on Quasar Lens Candidate Screen:
Sky Region
Search Radius (for searching for multiple images of same Quasar)
quasar certainty percent
return format (ASCII, FITS, graph, etc.)

System:
validates the user input
selects all variable Quasars from Object Catalog within specified Sky Region whose quasar classification >= quasar certainty percent
identifies all multiple Quasar candidates from the quasar list
obtains the light curves from the Object Catalog for the multiple Quasar candidates
performs time-delay analysis
generates Lensed System Tables for each successful lensed candidate with time delays and uncertainties
displays the Lensed Quasar Result Screen which contains the URI pointer to the tables and/or diagram with light curves


ALTERNATE COURSES:

Return No Object Message on Lensed Quasar Result Screen if:
no Quasars in the specified Sky Region
no multiple Quasars within the specified Search Radius
no lensed systems found
Search Radius outside of allowed bounds
Sky Region outside of allowed bounds
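A toy version of the time-delay analysis step: estimate the lag between two quasar-image light curves sampled on the same uniform grid by minimizing the mean squared difference over integer-sample shifts. This is only a sketch with invented data; a real analysis would handle irregular sampling, magnitude offsets, and measurement uncertainties.

```python
def estimate_delay(curve_a, curve_b, max_shift):
    """Return the shift s (in samples) minimizing mean((a[i] - b[i+s])^2)."""
    best_shift, best_score = None, None
    for s in range(-max_shift, max_shift + 1):
        total, n = 0.0, 0
        for i, a in enumerate(curve_a):
            j = i + s
            if 0 <= j < len(curve_b):
                total += (a - curve_b[j]) ** 2
                n += 1
        if n and (best_score is None or total / n < best_score):
            best_shift, best_score = s, total / n
    return best_shift

# curve_b repeats curve_a delayed by two samples (illustrative data)
curve_a = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 0.0]
curve_b = [0.0, 0.0, 0.0, 1.0, 2.0, 3.0, 2.0, 1.0]
delay = estimate_delay(curve_a, curve_b, 3)
```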

Find all Lensed Quasar Candidates Robustness - Analysis Diagram


analysis Find all Lensed Quasar Candidates Robustness

[Diagram summary: restates the basic and alternate courses above, with analysis steps getUserInput and validate user input on the Quasar Lensed Candidate Screen, select variable quasars, select quasar multiplets, obtain light curves of quasar multiplets, perform time-delay analysis, ConvertToReturnFormat (to the Filesystem), display on the Lensed Quasar Result Screen, and displayError to the Lensed Quasar Error Screen.]

Find all Lensed Quasar Candidates Robustness - Analysis Diagram


3.2.16.1.12 Generate photometric redshift for a galaxy <UseCase>

Generate Photometric Redshift for a Galaxy

BASIC COURSE:

User specifies LSST galaxy object

System:
retrieves passband magnitudes from Object Catalog for the specified object
calculates colors from magnitudes
looks up photo-Z from Standard Photo-Z Color Lookup Table
stores the photo-Z in the Object Catalog for the object

ALTERNATE COURSES:

System throws an exception if there is insufficient passband magnitude information.
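The lookup step can be sketched as a nearest-neighbour match of the galaxy's measured colors against the lookup table. The table entries and magnitudes below are invented for illustration; they are not the Standard Photo-Z Color Lookup Table.

```python
# Hypothetical (g-r, r-i) -> photo-Z lookup table, invented for illustration
PHOTO_Z_TABLE = {
    (0.4, 0.2): 0.10,
    (0.8, 0.4): 0.35,
    (1.2, 0.6): 0.60,
}

def photo_z(mags):
    """mags: dict of passband magnitudes, e.g. {'g': .., 'r': .., 'i': ..}."""
    if not {'g', 'r', 'i'} <= mags.keys():
        # mirrors the alternate course: insufficient passband information
        raise ValueError("insufficient passband magnitude information")
    colors = (mags['g'] - mags['r'], mags['r'] - mags['i'])
    nearest = min(PHOTO_Z_TABLE,
                  key=lambda c: (c[0] - colors[0]) ** 2 + (c[1] - colors[1]) ** 2)
    return PHOTO_Z_TABLE[nearest]

z = photo_z({'g': 21.55, 'r': 20.80, 'i': 20.35})
```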

Generate photometric redshift for a galaxy Robustness - Analysis Diagram


analysis Generate photometric redshift for a galaxy Robustness

[Diagram summary: restates the basic and alternate courses above, with analysis steps Get Galaxy Object on the Photo-Z Input Screen and Retrieve Passband Magnitudes.]

Generate photometric redshift for a galaxy Robustness - Analysis Diagram

3.2.16.2 Example Complex Science Use Cases

Example Complex Science Use Cases

The distinction between "simple" and "complex" use cases is that a simple use case can be performed with a single query of the LSST catalogs. A complex use case requires processing LSST data to derive quantities that are not stored in the catalogs and/or requires joining LSST data with data from other VO resources.

Example Complex Science Use Cases - Use Case Diagram


uc Example Complex Science Use Cases

«Business» Search For Microlensed SN
«Business» Search for SN Light Echoes
«Business» Search For Planetary Transits
«Business» Calculate the Two Point Correlation Function of Galaxy Groups
«Business» Classify / Analyze Eclipsing Binaries

Actor: Science User

Example Complex Science Use Cases - Use Case Diagram

3.2.16.3 Example Simple Science Use Cases

Example Simple Science Use Cases

The distinction between "simple" and "complex" use cases is that a simple use case can be performed with a single query of the LSST catalogs. A complex use case requires processing LSST data to derive quantities that are not stored in the catalogs and/or joining LSST data with data from other VO resources.


Example Simple Science Use Cases - Use Case Diagram

uc Example Simple Science Use Cases

«Business» Find All RR Lyrae Within RA/DEC Region
«Business» Find All Galaxies With Given Properties
«Business» Find All g-band Stacked Images That Cover RA/DEC Region

Actor: Science User

Example Simple Science Use Cases - Use Case Diagram

3.2.17 Classification

3.2.17.1 Classify Objects <UseCase>

Classify Objects

BASIC COURSE:

Retrieve Policy from Policy Library.

Classify one object according to rules in Policy:

-- for each filter, invoke Forced Photometry (from the Detection Pipeline) on the deepest stack at the location of the new object.


--classify according to Policy rules. This may include invoking the Classify Variable Objects use case.

--Send alert if an Alert Rule is matched.

ALTERNATE COURSES:

At any step: throw exception

Classify Objects - Use Case Diagram

uc Classify Objects

«Controller» Classify Objects
«System» Classify Variable Objects
«System» Classify Stars
«System» Classify Extended Objects

(«invokes» relationships connect Classify Objects to each classifier)

Classify Objects - Use Case Diagram


analysis Classify Objects Robustness

classify variable stars
classify moving objects
classify extended objects

Name: Classify Objects Robustness
Author: Jeff Kantor
Version: 1.0
Created: 10/16/2004 8:08:42 AM
Updated: 12/7/2010 12:04:55 PM

Classify Objects Robustness - Analysis Diagram

3.2.17.1.1 Classify Extended Objects <UseCase>

Classify Extended Objects

BASIC COURSE:

ALTERNATE COURSES:

3.2.17.1.2 Classify Stars <UseCase>

Classify Stars

BASIC COURSE:


NOTE: Will use multi-color photometry, proper motion, and time variability

ALTERNATE COURSES:

3.2.17.1.3 Classify Variable Objects <UseCase>

Classify Variable Objects

BASIC COURSE:

System:
- gets light curves from the Object Catalog
- folds the light curves and fits a non-parametric model to them
- computes light curve characteristics (mean, amplitude, color, power spectrum, Fourier coefficients, etc.)
- classifies the type of object along with a certainty
- writes the classification and certainty (and the mean, amplitude, color, power spectrum, Fourier coefficients, etc.) into the Object Catalog

ALTERNATE COURSE:

Write "Too few detections" flag into Object Catalog

System returns "System unavailable"
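The classification course above can be sketched as follows. The minimum-detection threshold and the single amplitude-based classification rule are toy assumptions standing in for the real classifier; only the flow (check detections, compute characteristics, write classification and certainty, or flag "Too few detections") follows the text.

```python
# Sketch of the Classify Variable Objects courses; the rule and the
# threshold are illustrative assumptions, not the real algorithm.
def classify_variable_object(obj, min_detections=10):
    mags = obj["light_curve"]
    if len(mags) < min_detections:
        # ALTERNATE COURSE: write "Too few detections" flag
        obj.setdefault("flags", []).append("Too few detections")
        return None
    mean = sum(mags) / len(mags)
    amplitude = max(mags) - min(mags)
    # toy rule standing in for the real light-curve classifier
    label = "RR Lyrae candidate" if amplitude > 0.5 else "low-amplitude variable"
    certainty = min(1.0, amplitude)
    # stand-in for writing classification and characteristics to the catalog
    obj.update(classification=label, certainty=certainty,
               mean=mean, amplitude=amplitude)
    return label
```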

Classify Variable Objects - Analysis Diagram


analysis Classify Variable Objects

BASIC COURSE

System:
- gets light curves from the Object Catalog
- folds the light curves and fits a non-parametric model to them
- computes light curve characteristics (mean, amplitude, color, power spectrum, Fourier coefficients, etc.)
- classifies the type of object along with a certainty
- writes the classification and certainty (and the mean, amplitude, color, power spectrum, Fourier coefficients, etc.) into the Object Catalog

ALTERNATE COURSE

Write "Too few detections" flag into Object Catalog

System returns "System unavailable"

get light curves
fold and fit light curves
compute light curve characteristics
classify light curves
write classification and certainty
write too few points flag

Classify Variable Objects - Analysis Diagram

3.2.18 Alert Generation Pipeline


Includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Create Alert Category
· Subscribe to Alert Category
· Create an Alert Filter
· Deliver Alerts

Alert Generation Pipeline - Activity Diagram

act Alert Generation Pipeline

Begin: Run Alert Generation Pipeline

End: Run Alert Generation Pipeline

Process Alerts

WBS: 02C.03.03 Alert Generation Pipeline

DESCRIPTION: Alert Generation Pipeline - Post Alerts on selected Objects found during processing.

ALGORITHM:
1. Run Process Alerts

EXCEPTIONS:

NOTES: Needs an SDQA component, e.g. an automated check of the SDQA status of the image data from which alerts are generated.

Name: Alert Generation Pipeline
Author: doug
Version: 1.0
Created: 1/7/2009 10:50:22 AM
Updated: 2/4/2011 12:38:38 PM

Alert Generation Pipeline - Activity Diagram

3.2.18.1 Generate Alerts from Visit <UseCase>

DESCRIPTION: Generate Alerts from Visit - The nightly processing of a visit concludes with Alert Processing, which is performed on the full object entry. Each object's characteristics, including its prior history, are checked against a set of rules that determines whether or not an alert will be issued, and if so, the type of alert. If an alert is to be generated, the object information, together with the subimages containing the sources, is packaged into an alert structure, which is sent to the Archive center for dissemination to the community using the VOEvent mechanism.


Alert Processing opens the Source Collection (SC) sent by the Source Router. For each Source S in SC:

1. Load the associated Object O. Note that this implies that there must be a map from S to O. O is guaranteed to have N epochs of DIASources available if it has been observed at least that many times by the survey to date (see section on preparation for observing).

2. Whether or not an Alert is generated for S will depend on a potentially large number of factors which will be referenced by a set of rules. These rules may call on all the data associated with O such as time history, classification, and shape, as well as those of S itself. The rules are certain to change over time as we (and the community) gain experience, so it is important that the Alert generator be able to accept the rules in some readily changeable form.

3. If an Alert is to be generated, an Alert will be created with all relevant data, including the two small images containing S, and sent to the Archive center for distribution via the VOEvent protocol.
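The steps above can be sketched as follows. Holding the rules as data keeps them "readily changeable", as step 2 requires; the rule schema, the specific rules shown, and the alert-structure fields are illustrative assumptions, not the actual LSST rule format.

```python
# Sketch of per-Source rule evaluation; the rule schema and the example
# rules are assumptions for illustration only.
ALERT_RULES = [
    {"name": "sudden brightening",
     "test": lambda src, obj: src["mag"] < obj["mean_mag"] - 1.0,
     "alert_type": "transient"},
    {"name": "previously unseen object",
     "test": lambda src, obj: obj["n_epochs"] == 0,
     "alert_type": "discovery"},
]

def process_source(source, obj, rules=ALERT_RULES):
    """Return an alert structure for the first matching rule, else None."""
    for rule in rules:
        if rule["test"](source, obj):
            # package relevant data, including the source cutout images,
            # for distribution via the VOEvent protocol
            return {"type": rule["alert_type"], "rule": rule["name"],
                    "source": source, "object": obj,
                    "cutouts": source.get("cutouts", [])}
    return None
```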

3.2.18.2 Alert Generation Pipeline <Activity>

DESCRIPTION: Alert Generation Pipeline - Post Alerts on selected Objects found during processing.

ALGORITHM:
1. Run Process Alerts

EXCEPTIONS:

NOTES: Needs an SDQA component, e.g. an automated check of the SDQA status of the image data from which alerts are generated.

3.2.19 Alert/Notification Toolkit

Includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Create Alert Category
· Create an Alert Filter
· Retrieve Alerts
· Subscribe to Alert Category
· Process Subscription Request
· Deliver Alerts
· Record Alert


Alerting - Use Case Diagram

uc Alerting

«Business» Create Alert Category
«System Interface» Deliver Alerts
«Business» Create an Alert Filter
«Business» Subscribe to Alert Category
«System» Process Subscription Requests
«System» Record Alert
«Business» Retrieve Alerts
Generate Alerts from Visit

Actors: Public Interface User, Observatory Operations, Alert Category Author

(«precedes» and «invokes» relationships connect these use cases)

Alerting - Use Case Diagram

3.2.19.1 Create Alert Category <UseCase>

DESCRIPTION: Create Alert Category -

BASIC COURSE:

The Alert Category Author defines an Alert Category in terms of data elements (astronomical characteristics, classification) and availability (frequency, latency).


3.2.19.2 Create an Alert Filter <UseCase>

DESCRIPTION: Create an Alert Filter -

BASIC COURSE:

The Public Interface User defines a "fine-grained" filter to determine how frequently or under what additional conditions the Alert should be delivered.

3.2.19.3 Deliver Alerts <UseCase>

DESCRIPTION: Deliver Alerts -

BASIC COURSE:

For all newly generated Alerts:
The System examines each Alert and determines which Subscribers to deliver it to.
The System places the Alert into a queue for delivery.
The System delivers the Alerts in the queue to their destinations.

NOTE: Is this done on a per-Alert basis or in batches?
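The match-queue-deliver course above can be sketched as follows. Per-Alert delivery is assumed here since the text leaves per-Alert vs. batch open; the class, its method names, and the category/filter subscription shape are illustrative assumptions.

```python
from collections import deque

# Sketch of the Deliver Alerts course: match each new Alert against
# subscriptions, queue it, then drain the queue.
class AlertDeliverer:
    def __init__(self):
        self.subscriptions = []   # (subscriber, category, optional filter)
        self.queue = deque()
        self.delivered = []

    def subscribe(self, subscriber, category, alert_filter=None):
        self.subscriptions.append((subscriber, category, alert_filter))

    def accept(self, alert):
        """Examine an Alert and queue it for each matching Subscriber."""
        for subscriber, category, flt in self.subscriptions:
            if alert["category"] == category and (flt is None or flt(alert)):
                self.queue.append((subscriber, alert))

    def deliver(self):
        """Deliver everything in the queue to its destination."""
        while self.queue:
            self.delivered.append(self.queue.popleft())
```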

3.2.19.4 Process Subscription Requests <UseCase>

DESCRIPTION: Process Subscription Requests -

BASIC COURSE:

The System incorporates the Subscription Request and any associated Alert Filter into the delivery mechanisms.

NOTE: This may require an automated or manual approval process.

3.2.19.5 Record Alert <UseCase>


DESCRIPTION: Record Alert -

BASIC COURSE:

For each newly generated Alert, the System records the Alert in the Alert Archive.

NOTE: How often is this done, per Alert or per batch?

3.2.19.6 Retrieve Alerts <UseCase>

DESCRIPTION: Retrieve Alerts -

BASIC COURSE:

The Public Interface User queries the Alert Archive to retrieve current or past Alerts

3.2.19.7 Subscribe to Alert Category <UseCase>

DESCRIPTION: Subscribe to Alert Category -

BASIC COURSE:

The Public Interface User accesses the available Alert Categories and selects an Alert Category

ALTERNATE COURSES:

1. The Public Interface User optionally defines a "fine-grained" filter to determine how frequently or under what additional conditions the Alert should be delivered.
2. Invoke "Create an Alert Filter".

3.3 Middleware Use Cases


3.3.1 Pipeline Construction Toolkit

Software programs, database tables, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Pipeline
· Stage
· Slice
· Clipboard
· Queue
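The toolkit vocabulary above suggests stages that share data through a clipboard as a pipeline runs them in order. The sketch below shows one way those pieces could fit together; the APIs and the two example stages are illustrative assumptions, not the actual toolkit interfaces.

```python
# Sketch of the Pipeline / Stage / Clipboard pattern; interfaces are
# assumptions for illustration only.
class Clipboard(dict):
    """Shared key/value store handed from stage to stage."""

class Stage:
    def process(self, clipboard):
        raise NotImplementedError

class Pipeline:
    def __init__(self, stages):
        self.stages = stages

    def run(self, clipboard):
        for stage in self.stages:
            stage.process(clipboard)
        return clipboard

class LoadStage(Stage):
    def process(self, cb):
        cb["pixels"] = [3, 1, 4, 1, 5]   # stand-in for reading image data

class StatsStage(Stage):
    def process(self, cb):
        cb["total"] = sum(cb["pixels"])
```

In this picture a Slice would run one such Pipeline instance over its share of the data, and a Queue would feed clipboards between pipelines.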

3.3.1.1 Construct Pipeline <UseCase>

DESCRIPTION: Construct Pipeline -

BASIC COURSE:
The Pipeline Creator creates Components that can be assigned to Processing Steps in Pipelines.
Invoke "Create Component".

The Pipeline Creator creates a Pipeline for later execution with appropriate data sets.
Invoke "Create Pipeline".

ALTERNATE COURSES:

Construct Pipeline - Use Case Diagram


uc Construct Pipeline

«Controller» Construct Pipeline
«Controller» Create Component
«Controller» Create Pipeline

Actor: Pipeline Creator

(«invokes» relationships connect Construct Pipeline to Create Component and Create Pipeline)

Construct Pipeline - Use Case Diagram

3.3.1.1.1 Create Component <UseCase>

BASIC COURSE:
The Pipeline Creator requests the creation of a new Component.

The DMS creates a new empty Component.

The Pipeline Creator defines the input and output interfaces of the Component.
Invoke "Define Component Interface".

The Pipeline Creator defines the algorithm of the Component.
Invoke "Create Component Algorithm".

The Pipeline Creator defines the data structures used within the Component.
Invoke "Define Component Data Structures".

The Pipeline Creator embeds the Component into the Pipeline Construction Environment, making it available to assign to Pipeline Processing Steps.
Invoke "Componentize and Add to Component Library".

ALTERNATE COURSES:
The Pipeline Creator requests to delete or edit an existing Component.

The DMS retrieves the requested Component.

Create Component - Use Case Diagram


uc Create Component

«Controller» Create Component
«System» Componentize and Add to Component Library
«Business» Create Component Algorithm
«Business» Browse Component Library
«Business» Define Component Interface
«Business» Define Component Data Structures

(«invokes» relationships connect Create Component to the other use cases)

Create Component - Use Case Diagram

3.3.1.1.1.1 Componentize and Add to Component Library <UseCase>

BASIC COURSE:

TBD - write this use case when pipeline construction middleware is selected.

3.3.1.1.1.2 Create Component Algorithm <UseCase>

BASIC COURSE:
TBD - write this use case when pipeline construction middleware is selected. Should cover integration of existing modules as well as newly created ones.

3.3.1.1.1.3 Define Component Data Structures <UseCase>


BASIC COURSE:
TBD - write this use case when pipeline construction middleware is selected.

3.3.1.1.1.4 Define Component Interface <UseCase>

BASIC COURSE:

TBD - write this use case when pipeline construction middleware is selected.

3.3.1.1.2 Create Pipeline <UseCase>

BASIC COURSE:
The Pipeline Creator requests the creation of a new Pipeline.

The DMS creates a new empty Pipeline

The Pipeline Creator defines the Execution Environment of the Pipeline.
Invoke "Define Execution Environment".

The Pipeline Creator defines the Processing Steps of the Pipeline.
Invoke "Define Processing Steps".

The Pipeline Creator indicates they are done creating the Pipeline.

The DMS saves the Pipeline and any associated Policies.
Invoke "Save Pipeline Configuration".

The Pipeline Creator publishes the Pipeline into the Pipeline Construction Environment, making it available for execution.
Invoke "Publish Pipeline Configuration".

The Pipeline Creator selects and prepares the Pipeline for execution scheduling.
Invoke "Target to Execution Environment".

ALTERNATE COURSES:
The Pipeline Creator requests to delete or edit an existing Pipeline.

The DMS retrieves the requested Pipeline.

Create Pipeline - Use Case Diagram


uc Create Pipeline

«Business» Define Processing Steps
«Controller» Create Pipeline
«Business» Define Execution Environment
«Business» Browse Component Library
«Business» Save Pipeline Configuration
«System» Publish Pipeline Configuration
«System» Target to Execution Environment
«Business» Define Pipelines/Tools Associations

Actor: VO Registry (from Actors)

(«invokes» relationships connect Create Pipeline to the other use cases)

Create Pipeline - Use Case Diagram

3.3.1.1.2.1 Browse Component Library <UseCase>

BASIC COURSE:

TBD - write this use case when pipeline construction middleware is selected.

3.3.1.1.2.2 Define Execution Environment <UseCase>

BASIC COURSE:

3.3.1.1.2.3 Define Pipelines/Tools Associations <UseCase>


BASIC COURSE:
TBD - write this use case when pipeline construction middleware is selected.

3.3.1.1.2.4 Define Processing Steps <UseCase>

A high level interface that allows a user to specify and construct a pipeline, meaning:
- select a sequence of data processing components captured as processing steps
- define the policies and parameters that control the pipeline's execution

Note: May want to maintain some metadata on performance and run-time profiling data in the components that can be used for runtime estimates.

BASIC COURSE:
Invoke Browse Component Library to select Components.
Specify the sequence of execution of the Components.
The DMS creates a Processing Step for each Component in the sequence and associates the Component with that Processing Step.
Reject attempts to sequence Components whose inputs and outputs are incompatible.
For every Execution Parameter of a Component:
- Decide whether the Parameter is a constant or should be made user-settable.
Verify that all Parameters are dealt with.
The Pipeline Creator indicates that the Pipeline is defined and ready to save.
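The interface check in the course above can be sketched as follows: each Component declares its inputs and outputs, and a sequence is rejected when a Component needs data that neither the pipeline input nor an earlier step produces. The Component class and the name-set interface model are illustrative assumptions.

```python
# Sketch of sequencing Components into Processing Steps with an
# input/output compatibility check; the data model is an assumption.
class Component:
    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = set(inputs)
        self.outputs = set(outputs)

def define_processing_steps(components, pipeline_inputs=()):
    available = set(pipeline_inputs)
    steps = []
    for comp in components:
        missing = comp.inputs - available
        if missing:
            # reject incompatible sequences, per the basic course
            raise ValueError("cannot sequence %r: missing inputs %s"
                             % (comp.name, sorted(missing)))
        steps.append({"step": len(steps) + 1, "component": comp.name})
        available |= comp.outputs
    return steps
```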

Define Processing Steps Robustness - Analysis Diagram


analysis Define Processing Steps Robustness

DEFINITION
A high level interface that allows a user to specify and construct a pipeline, meaning:
- select a sequence of data processing components
- define the parameters that control the pipeline's execution

Note: May want to maintain some metadata on performance and run-time profiling data in the components that can be used for runtime estimates.

BASIC COURSE:
The Pipeline Creator creates a new Pipeline.

Invoke Browse Component Library to select Components.
Specify the sequence of execution of the Components.
Reject attempts to sequence Components whose inputs and outputs are incompatible.
For every execution Parameter of a Component:
- Decide whether the Parameter is a constant or should be made user-settable.
Verify that all Parameters are dealt with.
The Pipeline is defined and ready to save.

ALTERNATE COURSES:

Browse Component Library
Specify Pipeline Components
Processing Step Sequence Order
Sanity Check Component Sequence/Parameters
Write Verified Component Sequence
Specify Pipeline Interface

Actor: Pipeline Creator


Define Processing Steps Robustness - Analysis Diagram

3.3.1.1.2.5 Publish Pipeline Configuration <UseCase>

BASIC COURSE:
TBD - write this use case when pipeline construction middleware is selected.

3.3.1.1.2.6 Save Pipeline Configuration <UseCase>

BASIC COURSE:
TBD - write this use case when pipeline construction middleware is selected.

3.3.1.1.2.7 Target to Execution Environment <UseCase>

BASIC COURSE:

Target to Execution Environment Robustness - Activity Diagram


act Target to Execution Environment Robustness

Pipeline Workflow Database
«Controller» Create Pipeline
Link Pipeline and Services
Target
Target Executable (binary)
Pipeline Services Library (per target)
Extract Components and Compile
Generate Data Products List
Data Products Access List
Target to Execution Environment
Create Target Executable

(an «invokes» relationship connects these elements)

Target to Execution Environment Robustness - Activity Diagram

3.3.1.1.2.7.1 Pipeline Developer <Actor>

This actor is any user that has the access necessary to create a new component or pipeline type, or a new instance of an existing component or pipeline type, and to cause that instance to be available for execution.

3.3.1.1.2.7.2 Data Products Access List <Class>

List of all the Data Products that will need to be staged by Pipeline Management subsystem prior to executing the target binary executable.

3.3.1.1.2.7.3 Pipeline Services Library (per target) <Class>

The executable logic for the Middleware Services interface, including:


Data Access Interface
Logging Interface
Alerting Interface
Event Interface
...

3.3.1.1.2.7.4 Pipeline Workflow Database <Class>

Database that contains the pipeline source code and header files along with the call graph information for the workflow.

3.3.1.1.2.7.5 Target <Class>

Target Type Specification. This is a designator that tells the targeting system for which platform to build an executable pipeline. Generated by the Pipeline Management and Control Subsystem.

3.3.1.1.2.7.6 Target Executable (binary) <Class>

Executable binary that can be loaded by the target platform loader or runtime system.

3.3.1.1.2.7.7 Target Platform Specification Database <Class>

Database that contains the set of platform specific entities, including:

+ Target source language compilers and linkers
+ Platform Type and Resources - Cluster, Grid, Server, Laptop

3.3.2 Pipeline Execution Services

Includes software programs, database tables, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:


· Pipeline Controllers/Managers
· Logging and Trace Exceptions
· Inter-process messaging (MPI)

3.3.2.1 Run a Pipeline <UseCase>

DESCRIPTION: Run a Pipeline - This covers the execution of a pipeline from configuration to clean-up. At a minimum, this use case requires as input the name of the pipeline to run (e.g. Calibration, Detection, etc.) and the collection of data to operate on. This use case handles application-level fault tolerance logic, e.g. recognizing that output data quality is not sufficient, changing parameters, and then re-executing.

This abstract use case is the parent of all Application Run XYZ Pipeline use cases and describes the Middleware behavior associated with them. See Application Pipeline Inheritance use case in Application Use Cases.

INPUTS:
- name of Pipeline

BASIC COURSE:
The DMS invokes this use case upon request by the OCS or the Pipeline Operator, taking as input the name of the pipeline to run and the input data collection identifiers to operate on. In the case of reprocessing, it may also take an enumeration of the output products to generate.

The DMS creates a Pipeline Manager (PM) to manage the instantiation and execution of the pipeline.

The PM invokes "Record Pipeline Execution Status". The PM continues to record status until the pipeline has been completed or stopped.

The PM creates an instance of the requested Pipeline by invoking Configure Pipeline, passing along the request inputs and, possibly, the platform to target the Pipeline for. It gets back a reference to the configured Pipeline, including a description of the required input data.

The PM invokes the Stage Input Data use case (passing in the collection names that will be operated on and name of the pipeline to be run).

The PM invokes the Pipeline Execution and Monitoring use case.

ALTERNATE COURSES:

If the PM receives an exception from the PCS indicating that the input collection cannot be applied to the named Pipeline, the PM will either:
-- attempt to tweak the input collection to apply and reattempt to configure, or
-- return the exception to the actor for further action.
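The application-level fault-tolerance logic named in the description (recognize insufficient output data quality, change parameters, re-execute) can be sketched as a retry loop. All of the callables and parameter names here are hypothetical stand-ins, not actual DMS interfaces.

```python
# Sketch of the quality-driven retry loop; run_pipeline, quality_ok and
# adjust are hypothetical stand-ins supplied by the caller.
def run_with_quality_retry(run_pipeline, params, quality_ok, adjust,
                           max_attempts=3):
    for attempt in range(1, max_attempts + 1):
        output = run_pipeline(params)
        if quality_ok(output):
            return output, attempt
        params = adjust(params)   # e.g. tweak thresholds before retrying
    raise RuntimeError("output quality insufficient after %d attempts"
                       % max_attempts)
```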


--------------------------------------------------------------------

On an execute command from the OCS or the Pipeline Operator, the DMS starts the pipeline execution process.

Invoke "Record Pipeline Execution Status". The DMS continues to record status until the pipeline has been completed or stopped.

The DMS creates a Job for the pipeline execution.

Invoke "Set Monitoring and Control Parameters".

Invoke "Select Input Sources and Output Sinks".

Invoke "Distribute Programs to Processing Nodes".

Invoke "Initialize Processing Nodes".

Invoke "Stage Input Data"

The DMS schedules the Job for execution.

At the scheduled time, the DMS starts the Job.
Invoke "Start Pipeline Execution and Monitoring".

While the Job is executing, the DMS monitors the progress of the Job.
Invoke "Monitor Pipeline Execution".

When the Job finishes execution, the DMS cleans up after the Job.
Invoke "Clean Up After Execution".

The DMS stops recording Pipeline Execution Status.

ALTERNATE COURSES:

If the Job's Monitoring and Control Parameters require checkpointing, the DMS will pause the Job at specified times or events and will perform a checkpoint/restart.
Invoke "Checkpoint/Restart Pipeline".

If the DMS receives a Stop Command from the OCS or the Pipeline Operator, the DMS stops the Job.

Invoke "Stop Pipeline Execution".

Run a Pipeline - Use Case Diagram


uc Run a Pipeline

«Controller» Run a Pipeline
«Controller» Monitor Pipeline Execution
«Controller» Checkpoint/Restart Pipeline
«System» Stop Pipeline Execution
«System» Clean Up after Execution
«System» Checkpoint/Restart within Processing Step
«System» Checkpoint/Restart Between Processing Steps
«System» Record Pipeline Execution Status
«Controller» Configure Pipeline
«Controller» Pipeline Execution and Monitoring
«System» Shutdown Slices
«System» Record Event
«Controller» Stage Input Data

(«invokes» and «precedes» relationships connect these use cases)

Name: Run a Pipeline
Package: Pipeline Execution Services
Version: 1.0
Author: Jeff Kantor

Run a Pipeline - Use Case Diagram

3.3.2.1.1 Updating Sources & Sinks <Issue>


3.3.2.1.2 Preload a Database <UseCase>

BASIC COURSE:
The input is a description of the database data needed for a pipeline, either in the form of a set of SQL queries or broad descriptors that identify tables and spatial identifiers for the data needed from those tables.
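Turning a broad descriptor into one of the SQL queries the course mentions can be sketched as follows. The descriptor shape and the column names (ra, decl) are illustrative assumptions, not the actual LSST database schema.

```python
# Sketch of expanding a broad descriptor (table plus spatial region)
# into a SQL query; the schema details are assumptions.
def descriptor_to_sql(descriptor):
    sql = "SELECT * FROM %s" % descriptor["table"]
    region = descriptor.get("region")   # (ra_min, ra_max, dec_min, dec_max)
    if region:
        ra_min, ra_max, dec_min, dec_max = region
        sql += (" WHERE ra BETWEEN %g AND %g AND decl BETWEEN %g AND %g"
                % (ra_min, ra_max, dec_min, dec_max))
    return sql
```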

3.3.2.1.3 Stage a Named Collection <UseCase>

BASIC COURSE:
Input is the collection identifiers of the collection to move, a logical name for the destination, and the name of a strategy.
Map the given strategy name to a policy file. Load the policy file into memory. Note retrieval protocol preferences (e.g. gridftp, http, scp).
Determine if the collections are local or must be retrieved from the archive. If local, interpret the ids as a directory path and determine file membership.
If non-local, consult the replica location service for the locations of collection components, preferring locations supporting preferred protocols. Ensure the strategy can be applied to the collection.
Determine specific destination endpoints from the strategy.
Schedule the data flow using preferred protocols, connecting retrieval endpoints to destination endpoints. (The strategy will build in parallel servers for transfer.) Include filters (e.g. packers/unpackers) on each end as required by the strategy. Monitor transfers.
Signal the consumer upon completion.

ALTERNATE COURSES:
If the collection cannot be supported by the strategy, throw an exception.

If a file transfer fails, schedule a new one, possibly with different source or destination protocol or platform. If the transfer continues to fail, throw an exception.

Some exceptions may need to be delivered to the (asynchronous) consumer via the event channel.
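The replica-selection and retry behavior described above can be sketched as follows. The replica and policy dictionary shapes, and the injected transfer callable, are illustrative assumptions; a real implementation would speak the listed protocols (gridftp, http, scp) directly.

```python
# Sketch of the staging course: pick a replica whose location supports a
# preferred protocol, and fall back to another replica when a transfer
# fails; data shapes are assumptions for illustration.
def choose_replica(replicas, preferred_protocols):
    for proto in preferred_protocols:
        for replica in replicas:
            if replica["protocol"] == proto:
                return replica
    return replicas[0]

def stage_collection(replicas, policy, transfer):
    replica = choose_replica(replicas, policy["protocols"])
    tried = {id(replica)}
    while True:
        try:
            return transfer(replica)
        except IOError:
            # ALTERNATE COURSE: retry from a different source
            remaining = [r for r in replicas if id(r) not in tried]
            if not remaining:
                raise IOError("transfer failed for all replicas")
            replica = remaining[0]
            tried.add(id(replica))
```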

3.3.2.1.4 Pipeline Execution and Monitoring <UseCase>

OUTPUTS:
- Pipeline object

GIVEN:


- Pipeline (configuration) policy

BASIC COURSE:

The master process(es) for the pipeline are executed on the target platform, either directly or through a scheduler on that platform. Initiation of the master process creates a Pipeline instance.

The Pipeline loads the configuration by invoking the "Load Stage Policies" use case.

The Pipeline engages the provenance system to record the configuration and platform description by invoking the "Record Pipeline Provenance" use case.

The Pipeline registers to send and receive events from the PipelineManager by invoking the "Initialize Pipeline Events" Use Case.

The processes that represent the slices of the pipeline are created across the nodes of the pipeline platform by invoking the "Create Slices" use case, which handles the initialization of the slices, making them ready to process data.

The Pipeline creates each of the stages that make up the pipeline as described in the pipeline policy. The stages are initialized and executed according to the "Execute Processing Stage" use case.

ALTERNATE COURSES:

Pipeline Execution and Monitoring - Use Case Diagram


uc Pipeline Execution and Monitoring

«Controller» Pipeline Execution and Monitoring
«Controller» Execute Processing Stage
«System» Create Slices
«System» Record Pipeline Provenance
«System» Initialize Pipeline Events
«System» Load Pipeline Policy
«System» Create Slice Intracommunicator

(«invokes» and «precedes» relationships connect these use cases)

Name:Package:Version:Author:

Pipeline Execution and MonitoringPipeline Execution Services1.0Ray Plante

Pipeline Execution and Monitoring - Use Case Diagram

3.3.2.1.4.1 Create Slice Intracommunicator <UseCase>

Create Slice Intracommunicator

Create a Slice intracommunicator that respects the relationship between the Slices.

INPUTS:

OUTPUTS:


GIVEN:

A topology attribute must be specified within the Pipeline policy to designate that there will be inter-Slice communication, with the Slices having a relationship given by the specified topology.

BASIC COURSE:

The Slices read the topology attribute from the Pipeline Policy and generate a new Intracommunicator, built off of the default base intracommunicator. The new intracommunicator understands the arrangement/layout of the Slices (i.e., which Slices are the neighbors of a given Slice).
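As a concrete illustration, the neighbor-layout logic of such an intracommunicator can be sketched in Python. The "ring" topology name and the function shape are assumptions for illustration only; the model does not fix a topology vocabulary or communicator API.

```python
def slice_neighbors(rank, num_slices, topology="ring"):
    """Return the neighbor ranks of a Slice under the given topology.

    A simplified stand-in for the intracommunicator's layout logic; the
    real topology schema in the Pipeline policy is not specified here.
    """
    if num_slices < 2:
        return []                      # a lone Slice has no neighbors
    if topology == "ring":
        # Each Slice talks to its predecessor and successor, wrapping around.
        return [(rank - 1) % num_slices, (rank + 1) % num_slices]
    raise ValueError("unsupported topology: %s" % topology)
```

A real implementation would build this layout into the communicator itself (e.g., an MPI Cartesian communicator) rather than compute it per call.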

ALTERNATE COURSES:

It is possible that there may be multiple intracommunicators, with the Slices fragmenting into intracommunicating subsets.

NOTES:

3.3.2.1.4.2 Initialize Pipeline Events <UseCase>

Initialize Pipeline Events

This describes how a pipeline sets itself up to send and receive events for communicating with components outside the pipeline. There should be a straightforward way to subscribe to all events intended for consumption external to the Pipeline (as opposed to those intended for internal consumption).

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

The pipeline subscribes to events that come from the orchestration layer (handled via the "Run a Pipeline" use case) by invoking the general use case "Subscribe to an Event".

The pipeline sets up channels for sending standard pipeline events that communicate logging messages and uncaught exceptions to DM components by invoking the general use case "Create Event Transmitter".

The pipeline consults the policy for each of the stages for a description of the events that the stage needs information from and subscribes to them by invoking "Subscribe to an Event".

The pipeline also consults the policies for descriptions of events that stages wish to emit. Channels for these events are initiated by invoking "Create Event Transmitter".
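The basic course above can be sketched with an in-memory stand-in for the event system. The `EventBroker` class, the topic names, and the `listensTo`/`emits` policy keys are all assumptions for illustration; the real "Subscribe to an Event" and "Create Event Transmitter" APIs differ.

```python
class EventBroker:
    """In-memory stand-in for the DM event system (assumed API)."""

    def __init__(self):
        self._subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):       # cf. "Subscribe to an Event"
        self._subscribers.setdefault(topic, []).append(callback)

    def transmitter(self, topic):               # cf. "Create Event Transmitter"
        def send(event):
            for cb in self._subscribers.get(topic, []):
                cb(event)
        return send


def initialize_pipeline_events(broker, stage_policies, handler):
    """Wire up a pipeline's event channels as in the basic course above."""
    # Subscribe to events from the orchestration layer.
    broker.subscribe("orchestration", handler)
    # Channels for standard pipeline events (logging, uncaught exceptions).
    log_send = broker.transmitter("pipeline.log")
    exc_send = broker.transmitter("pipeline.exception")
    # Per-stage subscriptions and transmitters, driven by each stage's policy.
    stage_send = {}
    for policy in stage_policies:
        for topic in policy.get("listensTo", []):
            broker.subscribe(topic, handler)
        for topic in policy.get("emits", []):
            stage_send[topic] = broker.transmitter(topic)
    return log_send, exc_send, stage_send
```

If any of these channels cannot be created, the pipeline should raise an exception, matching the alternate course below.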


ALTERNATE COURSES:

If the connections to the Event System cannot be made, an exception must be thrown. Pipeline execution should not continue unless the event system can be initiated.

NOTES:

Initialize Pipeline Events - Use Case Diagram

[Use Case Diagram: Initialize Pipeline Events. Package: Pipeline Execution Services. Version: 1.0. Author: Robyn Allsman. Initialize Pipeline Events (from Event Handling) invokes Subscribe to an Event Topic (from Event Handling) and Create Event Transmitter.]

3.3.2.1.4.3 Load Pipeline Policy <UseCase>

Load Pipeline Policy

This use case describes how a pipeline's policy data are loaded into the memory of a pipeline. The policy data include not only general pipeline policies but also the policies for each of the stages.


INPUTS:

OUTPUTS:

GIVEN:

Policy files for the pipeline have been deployed on the platform where the pipeline is being run.

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.3.2.1.4.4 Execute Processing Stage <UseCase>

Execute Processing Stage

This describes how a processing stage is created and engaged to process a stream of data. A stage has representations in processes across multiple nodes of the platform where the pipeline is running, including a master pipeline process and the processes hosting each of the component slices.

INPUTS:

OUTPUTS:

GIVEN:

Pipeline Manager has already targeted and deployed the pipeline on all processing nodes that will be engaged, including all policy data.

Pipeline Manager has already created the processes that host the Pipeline and the Slices.

The Event System has been set up.

BASIC COURSE:

The stage is initialized inside all processes that will run the stage by invoking (within the Pipeline and each of the Slices) the "Initialize Processing Stage" use case.

The Pipeline requests new data from the input queue. When new data arrives, the Stage invokes within the Pipeline the "Process a Data Input Through a Stage" use case. When the data processing is done, it returns to the input queue for more data.

When a signal arrives from the PM that no more data will be arriving, the "Terminate Processing Stage" use case is invoked in the Pipeline.
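The initialize / loop-over-input / terminate structure of this course can be sketched as follows. The `Stage` method names and the sentinel shutdown value are assumptions for illustration; the framework's actual Stage interface is defined elsewhere.

```python
import queue

SHUTDOWN = object()  # stands in for the PM's "no more data" signal

def run_stage(stage, input_queue, output_queue):
    """Drive one processing stage over a stream of data (simplified sketch).

    `stage` is assumed to expose initialize(), process(data), and
    terminate(); the real Stage interface is defined by the DM framework.
    """
    stage.initialize()                      # "Initialize Processing Stage"
    while True:
        data = input_queue.get()            # request new data from the queue
        if data is SHUTDOWN:                # signal from the PM
            break
        # "Process a Data Input Through a Stage", then return for more data.
        output_queue.put(stage.process(data))
    stage.terminate()                       # "Terminate Processing Stage"
```

In the real framework the loop is driven per-Slice and per-Pipeline rather than by a single function, but the control flow is the same.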


ALTERNATE COURSES:

NOTES:

Execute Processing Stage - Use Case Diagram

[Use Case Diagram: Execute Processing Stage. Package: Pipeline Execution Services. Version: 1.0. Author: Tim Axelrod. The «Controller» use case Execute Processing Stage invokes Initialize Processing Stage, Process a Data Input Through A Stage, and Terminate Processing Stage; Process a Data Input Through A Stage invokes Pre-Process, Process, Post Process, and Perform InterSlice Communication, with «precedes» relationships ordering the steps.]

3.3.2.1.4.4.1 Initialize Processing Stage <UseCase>

Initialize Processing Stage

Do any one-time setup of the stage. The details of this will depend on the specific implementation of the stage.

INPUTS:

OUTPUTS:

GIVEN:


BASIC COURSE:

Within both the Pipeline and Slices, the Stage extracts the stage policy data and uses them to configure the Stage. If any data is missing, stage-specific defaults are used where possible.

ALTERNATE COURSES:

NOTES:

3.3.2.1.4.4.2 Process a Data Input Through A Stage <UseCase>

Process a Data Input Through A Stage

Apply the stage's processing algorithm to one data input.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

If this stage has been configured to receive event data, the Pipeline awaits the arrival of the configured event. Once received, the data is incorporated into the data on the input queue.

The Pipeline invokes the "Pre-Process" use case.

The Pipeline looks at the data emanating from the Pre-Process operation for any data that needs to be shared with the Slices. That data is broadcast to the Slices, where it is incorporated into the input queues for the Slices.

The Pipeline signals each of the Slices to invoke the "Process" use case.

The Slices look at the data emanating from the Process operation for data to be shared with the Pipeline. That data is sent back to the Pipeline and incorporated into the Pipeline's data queue.

The Pipeline invokes the "Post-Process" use case. The resulting data is placed on the output queue.

If so configured, the Pipeline persists and/or checkpoints data in the queue.

ALTERNATE COURSES:

NOTES:

3.3.2.1.4.4.2.1 Perform InterSlice Communication <UseCase>

Perform InterSlice Communication


Perform InterSlice communication of shareable Clipboard data.

INPUTS:

OUTPUTS:

GIVEN:

A topology for the Slices must be provided in Pipeline policy. Application Stages post data that is to be shared between Slices to the Clipboard and mark the data as shareable.

BASIC COURSE:

A Pipeline instructs Slices to share Clipboard data as configured by policy.
- Slices first obtain the shared keys from the Clipboard and then pull the associated data off of the Clipboard.
- Slices obtain the list of neighbors with whom they need to communicate.
- The low-level InterSlice communication of the types is then performed using the Slice intracommunicator.
- Slices post the received data onto their Clipboards.
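The retrieve/transmit/post sequence can be simulated in a few lines. The dict-based Clipboard, the `sharedKeys` entry, and the `neighborN.` key prefix are illustrative assumptions; the real Clipboard and intracommunicator APIs are not shown here.

```python
def perform_interslice_communication(clipboards, neighbors):
    """Simulate one round of inter-Slice sharing (illustrative only).

    `clipboards` maps a Slice rank to a dict standing in for that Slice's
    Clipboard; the reserved "sharedKeys" entry names the shareable data.
    """
    # Each Slice first pulls its shareable data off its Clipboard.
    outgoing = {
        rank: {key: cb[key] for key in cb.get("sharedKeys", [])}
        for rank, cb in clipboards.items()
    }
    # Data is exchanged with each neighbor and posted back, with keys
    # prefixed so received entries cannot collide with local ones.
    for rank, cb in clipboards.items():
        for nbr in neighbors[rank]:
            for key, value in outgoing[nbr].items():
                cb["neighbor%d.%s" % (nbr, key)] = value
```

Here the "transmission" is a direct dictionary copy; a real pipeline would serialize the data and move it through the Slice intracommunicator instead.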

ALTERNATE COURSES:

NOTES:

Perform InterSlice Communication - Use Case Diagram

[Use Case Diagram: Perform InterSlice Communication. Package: Pipeline Execution Services. Version: 1.0. Author: Greg Daues. The «Controller» use case Perform InterSlice Communication invokes Retrieve Shared Data from Clipboard, Transmit Data between Slices, and Post Received Data to Clipboard, with «precedes» relationships ordering them.]


3.3.2.1.4.4.2.1.1 Post Received Data to Clipboard <UseCase>

Post Received Data to Clipboard

Post the shareable data received from neighbor Slices to the Clipboard.

INPUTS:

OUTPUTS:

GIVEN:

Successful completion of the low-level interSlice communication and associated data serialization/deserialization is a prerequisite.

BASIC COURSE:

Shareable data received from neighbor Slices is posted to the Clipboard of the current Slice. This must be done in a manner that avoids conflicts with existing keys on the Clipboard, so as to distinguish between the local Slice data and that sent from neighbor Slices.

ALTERNATE COURSES:

NOTES:

3.3.2.1.4.4.2.1.2 Retrieve Shared Data from Clipboard <UseCase>

Retrieve Shared Data from Clipboard

Retrieve the data to be shared between Slices.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

The shared keys are first retrieved from the Clipboard. These are subsequently used to retrieve from the Clipboard the data to be transmitted between Slices. The PtrType to the data is provided to effect the low-level communication.

ALTERNATE COURSES:

NOTES:

3.3.2.1.4.4.2.1.3 Transmit Data between Slices <UseCase>

Transmit Data between Slices


Transmit the shareable Clipboard data from each Slice to its neighbors and vice-versa.

INPUTS:

OUTPUTS:

GIVEN:

Infrastructure for InterSlice communication, such as an intracommunicator for Slices, must exist. The shareable data must have already been retrieved from the Clipboard.

BASIC COURSE:

The list of neighbors of a Slice is retrieved. The data to be transmitted to a neighbor Slice is serialized for transport. The data is then transmitted (sent), typically in a nonblocking manner. A Slice then receives data from the neighbor Slice and deserializes it into a form suitable for operating on the data and posting to its Clipboard.
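The serialize/deserialize step of this course can be sketched as a round trip. pickle is assumed here purely for illustration; the model does not prescribe a wire format, and an MPI-based pipeline might instead hand typed buffers directly to the Slice intracommunicator.

```python
import pickle

def serialize_for_transport(shared_data):
    """Serialize shareable Clipboard data for inter-Slice transport.

    pickle is an illustrative choice, not the format the DM framework uses.
    """
    return pickle.dumps(shared_data)

def deserialize_from_transport(raw_bytes):
    """Recover transmitted data in a form suitable for the Clipboard."""
    return pickle.loads(raw_bytes)
```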

ALTERNATE COURSES:

NOTES:

3.3.2.1.4.4.2.2 Process <UseCase>

Process

This use case handles the parallel processing part of the Stage. The details depend on the particular algorithm being implemented. If the stage does not need a parallel processing step, this can be skipped.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

Within each slice, the implementation pulls data from the input queue. This may include data that was calculated in the pre-process step. The algorithm is applied to the data, and the output data is sent to the output queue. As part of the process of posting the data, the Stage may indicate which data should be collected and made available to the post-process step.

ALTERNATE COURSES:

NOTES:


3.3.2.1.4.4.2.3 Post Process <UseCase>

Post Process

This describes the serial processing that can be done after the completion of the parallel processing. The details of this use case depend on the specific algorithm being applied. If the algorithm does not require this type of processing, this use case does nothing.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

Within the Pipeline's master process, data is pulled off the input queue. The Stage implementation looks for stage-specific data from the set pulled from the input queue.

The Stage performs serial processing required after the parallel processing.

If the processing produces output data, this data is put onto the output queue. 

ALTERNATE COURSES:

NOTES:

3.3.2.1.4.4.2.4 Pre-Process <UseCase>

Pre-Process

This describes the serial processing that can be done in advance of the parallel processing. The details of this use case depend on the specific algorithm being applied. If the algorithm does not require this type of processing, this use case does nothing.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

Within the Pipeline's master process, data is pulled off the input queue. The Stage implementation looks for stage-specific data from the set pulled from the input queue.

The Stage performs serial processing required in advance of the parallel processing.

If the processing produces output data, this data is put onto the output queue. As part of this operation, the Stage can indicate which of the data should be shared with all of the Slices.


ALTERNATE COURSES:

NOTES:

3.3.2.1.4.4.3 Terminate Processing Stage <UseCase>

Terminate Processing Stage

This use case covers clean-up and shutdown operations that must be done when all data have been processed.

INPUTS:

OUTPUTS:

GIVEN:

The Pipeline has received a shutdown signal.

BASIC COURSE:

This signal is transmitted to all of the Slices. The Pipeline and Slices each carry out any stage-specific operations.

ALTERNATE COURSES:

NOTES:

3.3.2.1.4.5 Create Slices <UseCase>

Create Slices

Set up each process representing a slice and get it ready to process data.

GIVEN:

The Policy files have been deployed across the nodes of the pipeline platform.

INPUTS:

OUTPUTS:

BASIC COURSE:

Get the allocation of slices to nodes and data chunks to slices by reading parameters from the Pipeline Policy.


For each slice, execute the slice process on its assigned node, passing in the pipeline policy file, which will:
* load pipeline policies into memory by invoking "Load Pipeline Policy"
* set up interstage buffers
* register containers (e.g. a working directory) for input and output data
* invoke "Initialize Data Ingest Converter"
* invoke the slice portions of the "Execute Processing Stage" use case
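The allocation step that precedes the launches above can be sketched as a pure function over policy data. The policy keys used here (`nodes`, `slicesPerNode`, `dataChunks`) are illustrative assumptions; the real Pipeline Policy schema is defined elsewhere in the framework.

```python
def allocate_slices(policy):
    """Derive slice-to-node and chunk-to-slice assignments from policy.

    A simplified sketch: slices are numbered consecutively and packed
    onto nodes in order, and data chunks are dealt round-robin over them.
    """
    nodes = policy["nodes"]
    per_node = policy["slicesPerNode"]
    slice_to_node = {
        s: nodes[s // per_node] for s in range(len(nodes) * per_node)
    }
    chunk_to_slice = {s: [] for s in slice_to_node}
    for i, chunk in enumerate(policy["dataChunks"]):
        chunk_to_slice[i % len(slice_to_node)].append(chunk)
    return slice_to_node, chunk_to_slice
```

The returned assignments would then drive the per-node process launches described above.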

ALTERNATE COURSES:

NOTES:

3.3.2.1.4.6 Record Pipeline Provenance <UseCase>

Record Provenance

Record provenance related to the execution of the pipeline as a whole (e.g. pipeline version, start time, etc.)

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.3.2.1.5 Checkpoint/Restart Pipeline <UseCase>

Checkpoint/Restart Pipeline

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

On detecting a Checkpoint Event within a Processing Step, the DMS does this form of Checkpoint: invoke "Checkpoint/Restart within Processing Step".

On detecting a Checkpoint Event between Processing Steps, the DMS does this form of Checkpoint: invoke "Checkpoint/Restart Between Processing Steps".

ALTERNATE COURSES:

NOTES:

3.3.2.1.5.1 Checkpoint/Restart Between Processing Steps <UseCase>

Checkpoint/Restart between Processing Steps

BASIC COURSE:

On detecting a Checkpoint Event between Processing Steps, the DMS does not pause Job execution, but copies the output of the completed Processing Step to a Checkpoint.

ALTERNATE COURSES:

3.3.2.1.5.2 Checkpoint/Restart within Processing Step <UseCase>

Checkpoint/Restart within Processing Step

BASIC COURSE:

On detecting a Checkpoint Event within a Processing Step, the DMS pauses Job execution, copies intermediate data results and job state to a Checkpoint and then resumes Job Execution.

ALTERNATE COURSES:

3.3.2.1.6 Clean Up after Execution <UseCase>

Clean Up after Execution

This handles the post-processing details, including confirming that products have been ingested, provenance recording is complete, and pipeline processes have been shut down.

INPUTS:

OUTPUTS:

GIVEN:


BASIC COURSE:

On detecting a Job Complete or Job Stopped Event, the DMS will execute the clean-up steps specified in the Job's Monitoring and Control Parameters.

ALTERNATE COURSES:

NOTES:

3.3.2.1.7 Configure Pipeline <UseCase>

Create an instance of a Pipeline to operate on a given collection of data. By instance, we don't necessarily mean instantiating a software object so much as assembling the specific parameter values to execute a particular run of a pipeline, and putting that parameter data into place.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

This use case is invoked by a Pipeline Manager (PM), taking as inputs the name of the Pipeline to instantiate, the target platform, the input data collection to operate on, and possibly (in the case of reprocessing) the desired output data products. A data collection may refer to existing data in the archive (in the case of reprocessing), a virtual collection (previously built products that need to be re-created based on provenance data), or a future data collection (e.g. a night's worth of raw data yet to be collected).

The PM creates a Pipeline Configurator (PCfg), passing in the above inputs, which handles creating the Pipeline instance.

The PCfg retrieves from the Pipeline Construction System a Pipeline template corresponding to the input pipeline name and target platform. This template includes an integrated set of Policies.

Operating on the template as a whole, the PCfg:
-- invokes the Select Input Sources and Output Sinks use case.
-- invokes the Set Monitoring and Control Parameters use case.
-- invokes the Distribute Programs to Processing Nodes use case.
-- invokes the Initialize Processing Nodes use case.

The PCfg returns to the PM a reference to the configured pipeline which includes a description of the required input data.


ALTERNATE COURSES:

The PM may have the ability (at the prompting of the OCS or Pipeline Operator) to manipulate monitoring or control parameters after the pipeline has started.

If an exception is thrown from the Select Input Sources and Output Sinks use case, the exception is passed to the PM.

NOTES:

Configure Pipeline - Use Case Diagram

[Use Case Diagram: Configure Pipeline. Package: Pipeline Execution Services. Version: 1.0. Author: Ray Plante. The «Controller» use case Configure Pipeline invokes Retrieve Default Pipeline Policies, Select Input Sources and Output Sinks, Set Monitoring and Control Parameters, Distribute Programs to Processing Nodes, and Initialize Processing Nodes, with «precedes» relationships ordering them.]

3.3.2.1.7.1 Retrieve Default Pipeline Policies <UseCase>

Retrieve Default Pipeline Policies


This use case describes how to retrieve previously saved sets of policies for configuring a pipeline.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

This use case takes a logical name for a pipeline as an input.

The PolicyLibrary is given the logical name of the pipeline, which it maps to a top level PolicyFile.

The PolicyLibrary opens the top level PolicyFile and finds references to other included PolicyFiles. Those files are (recursively) retrieved, opened, and searched for included policy files.

The PolicyLibrary returns the Policy data either as a list of required policy files used to configure the pipeline (where the top-level PolicyFile is obviously marked, e.g., first in the list) or as a Policy object tree in memory, with all of the included policy files dereferenced.
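The recursive dereferencing of included PolicyFiles can be sketched as follows. The dict-based library and the "@name" include convention are assumptions standing in for the PolicyLibrary's actual lookup and file syntax.

```python
def resolve_policy(name, library):
    """Recursively dereference included PolicyFiles (simplified sketch).

    `library` maps logical policy names to dicts; a string value of the
    form "@other" is treated as an include of another policy file.
    """
    policy = {}
    for key, value in library[name].items():
        if isinstance(value, str) and value.startswith("@"):
            # An included policy file: retrieve and resolve it in turn.
            policy[key] = resolve_policy(value[1:], library)
        else:
            policy[key] = value
    return policy
```

This corresponds to the in-memory Policy object tree form of the return value; the list-of-files form would instead accumulate file names during the recursion.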

ALTERNATE COURSES:

NOTES:

3.3.2.1.7.2 Distribute Programs to Processing Nodes <UseCase>

Distribute Programs to Processing Nodes

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

The PCfg requests from the DMS the destination filesystems for the target platform and maps them onto the Pipeline Policies.

The PCfg will write out the modified Policy parameters into Policy Files for distribution to the processing nodes.


The PCfg copies all needed executables and Policy Files to the target filesystems.

ALTERNATE COURSES:

If a Processing Node that was identified for execution is unavailable, the DMS will attempt to replace it with another available Processing Node.

If no replacement Processing Nodes are available, the DMS will record a Pipeline Hold Event. The DMS places the Job in a Hold Queue.

Some platforms (e.g. external community grids) may require that some of the deployment be done as the initial part of the job that gets submitted to the platform's scheduler. In this case, "copying" may mean packaging up the configuration and pointers to data into a deployable script that will be run upon execution of the job.

NOTES:

3.3.2.1.7.3 Initialize Processing Nodes <UseCase>

Initialize Processing Nodes

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

The PCfg initializes each Processing Node by:
- Validating the Program deployment on the Node
- Setting the communications ports between the Node and other Nodes with which it must communicate

ALTERNATE COURSES:

NOTES:

Some of the initializing of the nodes may actually happen transparently as part of the initial execution of the pipeline. Most notably, an MPI-based pipeline would rely on the MPI infrastructure to set up the communication ports when the pipeline is executed.

3.3.2.1.7.4 Select Input Sources and Output Sinks <UseCase>


Select Input Sources and Output Sinks

This use case determines the required input data for each Processing Step that makes up a particular instance of a Pipeline as well as what output products will be produced. This use case ensures that the proper intermediate products are produced and are set as inputs to the Processing Stages that need them.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

The PCfg maps the requested input collection onto the Policies of the initial Processing Stage in the Pipeline to determine what output products will be produced as well as what additional input data is required. The Policy will distinguish between output products that are considered intermediate (which will be deleted before the Pipeline is finished) and those that should be ingested into the archive.

For each subsequent Processing Stage, the PCfg maps all input data as well as output data from previous steps into the current Stage's Policy to determine the Stage's input and output data.

The PCfg records all the input data required from the Pipeline as a whole as well as the output data that need to be ingested into the archive after the pipeline completes.
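The chaining described above can be sketched as a pass over the stages that accumulates the pipeline's external inputs and its products. The `needs`/`makes` policy keys are illustrative assumptions; the real Policy schema is richer than this.

```python
def plan_pipeline_io(input_collection, stage_policies):
    """Chain stage I/O to find the pipeline's external inputs and outputs.

    Products made by an earlier stage satisfy later stages internally;
    anything a stage needs that is not yet available must come from
    outside the pipeline.
    """
    available = set(input_collection)
    external_inputs, produced = set(), set()
    for policy in stage_policies:
        for need in policy["needs"]:
            if need not in available:
                external_inputs.add(need)  # required from outside
                available.add(need)
        for made in policy["makes"]:
            produced.add(made)
            available.add(made)
    return external_inputs, produced
```

A further pass would separate `produced` into intermediate products to delete and products to ingest into the archive, as the basic course requires.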

ALTERNATE COURSES:

If specific output products are included in the request (as for reprocessing), the pipeline and data employed will be trimmed to avoid recreating data that is not needed.

An exception is thrown if the input data collection cannot be applied to the current platform; this is the conclusion when the data cannot be mapped onto the Policies.

NOTES:

I'm not sure if the following should be considered distinct from cancelling a Pipeline Execution and configuring a new one:

At any time prior to the start of Pipeline execution, the DMS permits the Pipeline Operator or OCS to reset the Job's Input Sources and Output Sinks.

If "by hand" re-configuring of input and output data is allowed, keeping consistency across Steps is likely to be more error-prone.


-- Ray

3.3.2.1.7.5 Set Monitoring and Control Parameters <UseCase>

Set Monitoring and Control Parameters

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

The inputs to this use case are the logical name for the pipeline, the target platform, the input sources and output sinks.

The PCfg retrieves the default template parameters for a pipeline by invoking "Retrieve Default Pipeline Policies", passing in the logical name of the Pipeline.

The PCfg modifies the default Monitoring and Control Parameters in the Pipeline Policies as needed for the data collection being operated on and the platform being used. The typical changes made include:
o incorporating the selected data sources and sinks
o setting the rank of parallelism, the assignment of pipeline slices to nodes, and the assignment of data chunks to slices.

The PCfg may expose the adapted Policy parameters into the DMS's control interface so that they can be modified arbitrarily.

ALTERNATE COURSES:

NOTES:

3.3.2.1.8 Monitor Pipeline Execution <UseCase>

Monitor Pipeline Execution

INPUTS:

OUTPUTS:

GIVEN:


BASIC COURSE:

While the Job is executing, the DMS accesses the recorded execution status and displays it according to the Job's Monitoring and Control Parameters. Invoke "Display Pipeline Status".

[Note: "Record Provenance" is now handled within executing pipeline stages.]

While the Job is executing, the DMS monitors the Processing Nodes for Failure Events or time-outs. Invoke "Detect Failure".

ALTERNATE COURSES:

If the DMS detects a Hardware Failure, it attempts to recover. Invoke "Recover from Hardware Failure".

If the DMS detects a Software Failure, it attempts to recover. Invoke "Recover from Software Failure".

NOTES:

Monitor Pipeline Execution - Use Case Diagram

[Use Case Diagram: Monitor Pipeline Execution. Package: Pipeline Execution Services. Version: 1.0. Author: Ray Plante. The «Controller» use case Monitor Pipeline Execution invokes Detect Failure, Display Pipeline Status, Recover from Hardware Failure, and Recover from Software Failure. Note: the recovery use cases may recursively invoke Execute the Pipeline or portions of it as a result of recovery.]

3.3.2.1.8.1 Detect Failure <UseCase>

Detect Failure

INPUTS:

OUTPUTS:

GIVEN:


BASIC COURSE:

If a Processing Node experiences a hardware or software failure and is able to record a Failure Event, it does so. If the DMS detects that a processing node has failed without a recorded Event (e.g., via timeout of a connection), the DMS will record a Failure Event.

ALTERNATE COURSES:

NOTES:

3.3.2.1.8.2 Display Pipeline Status <UseCase>

Display Pipeline Status

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

While the Job is executing, the DMS accesses the recorded execution status and displays it anywhere in the DMS in real time. The status is available remotely from any facility in the DMS, assuming communications links are available from that facility to the monitoring facility. The displayed information and refresh rate, and a default display location and format, are specified in the Job's Monitoring and Control Parameters.

ALTERNATE COURSES:

NOTES:

3.3.2.1.8.3 Recover from Hardware Failure <UseCase>

Recover from Hardware Failure

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

The DMS examines the Event Database to determine the time and nature of the failure. If a corrective action is possible, the DMS attempts to correct the failed hardware. If the action corrects the failure, the DMS records a Recovered Failed Hardware Event. Invoke "Recover from Software Failure".

ALTERNATE COURSES:

If no corrective action is available, the DMS attempts to find a replacement for the failed hardware. If the DMS finds a replacement and configures it successfully, the DMS records a Replaced Failed Hardware Event. Invoke "Recover from Software Failure".

If no replacement is available, the DMS records an Unrecoverable Hardware Failure Event.

NOTES:

3.3.2.1.8.4 Recover from Software Failure <UseCase>

Recover from Software Failure

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

The DMS examines the Event Database to determine the time and nature of the failure. If a corrective action is possible, the DMS attempts to correct the failed software. If the action corrects the failure, the DMS records a Recovered Failed Software Event.

ALTERNATE COURSES:

If no corrective action is available, the DMS attempts to re-run the failed software. If a re-run succeeds, the DMS records a Re-run Failed Software Event. The DMS will attempt to re-run the software up to the number of times specified in the Job's Monitoring and Control Parameters.

If the re-run attempts are unsuccessful, the DMS records an Unrecoverable Software Failure Event.
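The re-run policy above can be sketched as follows; this is a minimal illustration, and the function name and returned Event strings are ours, not from the LSST codebase.

```python
def recover_by_rerun(step, max_reruns):
    """Re-run a failed step up to max_reruns times (the limit comes from the
    Job's Monitoring and Control Parameters); return the Event name to record."""
    for _ in range(max_reruns):
        try:
            step()                      # re-run the failed software
            return "Re-run Failed Software Event"
        except Exception:
            continue                    # try again, up to the limit
    return "Unrecoverable Software Failure Event"
```

In this sketch any exception counts as a failed re-run; a real implementation would distinguish recoverable from fatal errors before retrying.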

NOTES:

3.3.2.1.9 Record Pipeline Execution Status <UseCase>


Record Pipeline Execution Status

This use case handles the recording of the state of a Pipeline via Events.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

A Pipeline Manager (PM) manages the evolution of a Pipeline instance. Whenever the Pipeline instance changes state, the System records the execution state by invoking "Record an Event". Recording of state ends when the PM determines that the Pipeline execution (through clean-up) is complete.

ALTERNATE COURSES:

NOTES:

Record Pipeline Execution Status Robustness - Analysis Diagram


[Analysis diagram: Record Pipeline Execution Status Robustness (created 9/22/2005, updated 10/28/2010). Basic course shown: the System creates a Job controlling a Pipeline; whenever the Pipeline changes state, the System records the execution state as an Event in the Event Database; the Job ends. Steps: create a Job; get State; record pipeline execution; create Event; insert Event into Event Database.]

Record Pipeline Execution Status Robustness - Analysis Diagram

3.3.2.1.10 Stage Input Data <UseCase>

This handles the staging of input file data needed by a pipeline, which may be carried out well in advance of the pipeline's execution.

BASIC COURSE:

The PMC reacts to an event (perhaps a scheduled event) to stage data for a given pipeline on a collection of data. The collection could refer to data that does not exist yet (because it hasn't been observed; see ALTERNATE COURSE).

The PMC passes the collection ID(s) and the name of the pipeline to be run to the Pipeline Construction System (PCS) and gets back:

· a list of additional collection IDs referring to other data the pipeline will need as input
· a description of the database information needed, either in the form of a set of SQL queries or broad descriptors that identify tables and spatial identifiers for data needed from those tables


The PMC invokes the Stage a Named Collection Use Case, using the list of collection IDs received from the PCS; that list and the pipeline name are passed to the Data Access Framework for file data staging.

The PMC invokes the Preload a Database Use Case, using the description of the database information needed.

ALTERNATE COURSES:

If the collection ID refers to raw data that hasn't been observed yet (e.g. the next night's observations), the PMC must request from the OCS the schedule for those future observations. The PMC transforms the schedule into a list of more specific collection IDs (e.g. corresponding to the fields being observed). This specific list is sent to the PCS as described above.

If an exception is caught (either directly or via the event channel), the transfer can be rescheduled for another time. Continued failure should result in an exception.

3.3.2.1.11 Stop Pipeline Execution <UseCase>

Stop Pipeline Execution

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

On detecting a Stop Event, the DMS stops the Job and takes the actions specified in the Job's Monitoring and Control Parameters for Stop Events.

Invoke Shutdown Data Ingest Converter (see Start Pipeline Execution and Monitoring in the Processing Step Sandbox; it invokes Create Slices, which invokes Initialize Data Ingest Converter; we probably should do the same here)

ALTERNATE COURSES:

NOTES:

Stop Pipeline Execution - Use Case Diagram


[Use case diagram: Stop Pipeline Execution (package Pipeline Execution Services, v1.0, author Robyn Allsman). «System» Stop Pipeline Execution «invokes» «System» Shutdown Slices.]

Stop Pipeline Execution - Use Case Diagram

3.3.2.1.11.1 Shutdown Slices <UseCase>

Shutdown Slices

Shutdown the processes and shared memory on each node

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

For each slice, on its assigned node:

· tear down interstage buffers
· de-register containers (e.g. a working directory) for input and output data
· invoke Shutdown Data Ingest Converter (see Start Pipeline Execution and Monitoring in the Processing Step Sandbox; it invokes Create Slices, which invokes Initialize Data Ingest Converter; we probably should do the same here)

ALTERNATE COURSES:

NOTES:


3.3.3 Control and Management Services

This component includes software programs, database tables, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Event services
· Orchestration services

3.3.3.1 Control DM System <UseCase>

02C.07.01 Control and Management Services

DESCRIPTION:

This use case describes the overall control and management modes of the Data Management Subsystem. It is a pure controller use case and describes how the DMS simultaneously operates to collect data, reduce it, re-process it, and publish it. It also describes how the DMS may be started/initialized or stopped, which typically occurs only during commissioning and for scheduled major maintenance periods.

BASIC COURSE:

On command from the Data Management System Administrator, each DM facility is started and initialized independently as documented in system administration procedures. Connectivity is then established between the facilities and with the OCS. Invoke "Initialize DMCS".

On command from the OCS, the DMS receives and processes data from the instrument. Invoke "Process an Observing Night".

On completion of a pre-defined number of observing nights or on command by Observatory Operations, the DMS does a complete assessment of the overall state of the LSST Data Products and produces Data Product Quality Reports. Invoke "Assess Data Product Quality".

After review of the Data Product Quality Reports, Observatory Operations determines whether a new Data Products Release is warranted. If so, Observatory Operations initiates the process to release the Data Products. Invoke "Publish Data Products".

After each assessment of Data Product quality, the DMS makes an automated assessment of the need or desirability of reprocessing LSST Data. This can occur due to an increase in the volume or quality of available raw data, or the appearance of improved processing algorithms. Depending on the amount of re-processing, the DMS may initiate the re-processing automatically, or request approval from Observatory Operations to initiate the re-processing. If authorized, the re-processing starts. Invoke "Reprocess Observations".

At all times during DMS operation, the DMS collects system health and status across all facilities, and logs this into the Event Database. Invoke "Monitor DM System Health and Status".

On command from the DMS Administrator, the DMS logs a stop Event and initiates shutdown at each facility. Invoke "Stop DMCS".

ALTERNATE COURSES:

TBD

Control DM System - Use Case Diagram


[Use case diagram: Control DM System. Actors: Observatory Control System, Data Management System Administrator, Observatory Operations, VO Registry. Use cases shown: «Controller» Control DM System, Process Raw Images to Alerts, Reprocess Observations, Publish Data Products, Process an Observing Night, Produce Calibration Data Products, Produce a Data Release; «System» Initialize DMS, Stop DMCS, Monitor DM System Health and Status, Fetch Template Images & Updated Orbit Catalog from Archive, Send Raw Image Data to Archive, Produce Nightly DMS Summaries; connected by «invokes» and «precedes» relationships.

Note: All Control and Management use cases contain alternates for special mode operations: Commissioning (use of simulated interfaces); Maintenance (Infrastructure Upgrade, Infrastructure Service); Emergency (Mountain/Base Connectivity Lost, Base/Archive Connectivity Lost).]

Control DM System - Use Case Diagram


3.3.3.1.1 Fetch Template Images & Updated Orbit Catalog from Archive <UseCase>

Fetch Template Images & Updated Orbit Catalog from Archive -

BASIC COURSE:

After the data is reduced by the Nightly Pipelines and Analysis Pipelines at the Archive Center, the updated Orbit Catalog and any required Template Images are transferred to the Base Facility for subsequent Observing Nights.

ALTERNATE COURSES:

3.3.3.1.2 Produce Nightly DMS Summaries <UseCase>

Produce Nightly DMS Summaries -

BASIC COURSE:

After the data is reduced by the Nightly Pipelines and Analysis Pipelines at the Archive Center, the DMS generates Nightly Data Product Summaries and transfers them back to the Base Facility for scheduling of subsequent Observing Nights.

ALTERNATE COURSES:

3.3.3.1.3 Send Raw Image Data to Archive <UseCase>

Send Raw Image Data to Archive - Capture data transferred from the Base Facility and stage it in the appropriate place for subsequent processing.

BASIC COURSE:

ALTERNATE COURSE:

3.3.3.1.4 Initialize DMS <UseCase>

Initialize DMS

BASIC COURSE:


On command from the DMS Administrator, each DM facility initiates an automated startup and initialization procedure independently (as documented in system administration procedures). This starts the DMCS at each facility. The DMCS initializes and runs diagnostics on all equipment. The DMS then establishes connectivity between the facilities and with the OCS.

ALTERNATE COURSES:

3.3.3.1.5 Monitor DM System Health and Status <UseCase>

Monitor DM System Health and Status -

BASIC COURSE:

At all times during DMS operation, the DMS collects system health and status across all facilities and records this into the Event Database. All significant state changes of the DMS are recorded, including pipeline Jobs started and completed, equipment or software failure and recovery Events, and data transmissions from facility to facility. Continuous and periodic evaluations are made of system resource use and availability, including CPU/cluster, disk and secondary storage, and network delivered bandwidth and errors. Any occurrence of these statistics falling outside pre-defined tolerances is also recorded, and Observatory Operations is notified according to pre-defined risk levels.
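The out-of-tolerance check described above can be sketched as a simple bounds comparison; the statistic names and (low, high) bounds in this example are hypothetical, not actual DMS parameters.

```python
def out_of_tolerance(stats, tolerances):
    """Return the statistics that fall outside their (low, high) bounds.

    stats:      mapping of statistic name -> measured value
    tolerances: mapping of statistic name -> (low, high) pre-defined bounds
    """
    return {
        name: value
        for name, value in stats.items()
        if name in tolerances
        and not (tolerances[name][0] <= value <= tolerances[name][1])
    }

# Illustrative readings: 3% free disk is below the (hypothetical) 10% floor.
bounds = {"disk_free_pct": (10, 100), "cpu_load": (0.0, 0.9)}
readings = {"disk_free_pct": 3, "cpu_load": 0.5}
```

Anything returned by this check would be recorded as an Event and routed to Observatory Operations according to risk level.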

ALTERNATE COURSES:

3.3.3.1.6 Publish Data Products <UseCase>

Publish Data Products -

BASIC COURSE:

After review of the Data Product Quality Reports, Observatory Operations determines whether a new Data Products Release is warranted. If so, Observatory Operations initiates the process to release the Data Products. The release process creates the Data Product Release along with associated descriptive documentation in the Archive. Invoke "Publish in Archive Center".

The DMS updates references and links to the most current release to point to the new release.


The DMS produces master images for production of other release media (e.g. DVD) as necessary.

The DMS registers the release with the VO Registry. Invoke "Register in VO".

The DMS initiates replication of the release to any Data Centers that are to host the new release. Invoke "Replicate in Data Centers".

Based on the Data Product Retention Policy, if any Data Products are determined to be ready for preservation or retirement, the DMS moves the Data Products to the appropriate storage and changes their status accordingly. Invoke "Preserve/Retire Data Product".

ALTERNATE COURSES:

[Use case diagram: Publish Data Products. Actors: Observatory Operations, Observatory Control System, VO Registry. «Controller» Publish Data Products «invokes»: «System» Publish in Archive Center, «System» Replicate in Data Centers, «System Interface» Register in VO, «System» Preserve/Retire Data Product.]

Publish Data Products - Use Case Diagram

3.3.3.1.6.1 Preserve/Retire Data Product <UseCase>

Preserve/Retire Data Product


BASIC COURSE:

ALTERNATE COURSES:

3.3.3.1.6.2 Publish in Archive Center <UseCase>

Publish in Archive Center

BASIC COURSE:

ALTERNATE COURSES:

3.3.3.1.6.3 Register in VO <UseCase>

Register in VO

BASIC COURSE:

ALTERNATE COURSES:

3.3.3.1.6.4 Replicate in Data Centers <UseCase>

Replicate in Data Centers

BASIC COURSE:

ALTERNATE COURSES:


3.3.3.1.7 Reprocess Observations <UseCase>

Reprocess Observations

BASIC COURSE:

After each assessment of Data Product quality, the DMS makes an automated assessment of the need or desirability of reprocessing LSST Data. This can occur due to an increase in the volume or quality of available raw data, or the appearance of improved processing algorithms. Depending on the amount of re-processing, the DMS may initiate the re-processing automatically, or request approval from the Pipeline Operator to initiate the re-processing. If authorized, the re-processing starts.

ALTERNATE COURSES:

[Use case diagram: Reprocess Observations. Actors: Pipeline Operator, Observatory Control System. «Controller» Reprocess Observations «invokes» «System» Analyze Re-processing Needed and «System» Create Re-processing Strategy; a «precedes» relationship orders the two.]

Reprocess Observations - Use Case Diagram

3.3.3.1.7.1 Analyze Re-processing Needed <UseCase>

Analyze Re-processing Needed

BASIC COURSE:


After each assessment of Data Product quality, the DMS makes an automated assessment of the need or desirability of reprocessing LSST Data.

The DMS assesses the increase in the volume or quality of available raw data, or the appearance of improved processing algorithms.

Based on the existing Data Product Quality statistics, the above factors, and the Data Product Provenance, the DMS identifies Data Products needing re-processing and stores this information in a Processing List.

ALTERNATE COURSES:

3.3.3.1.7.2 Create Re-processing Strategy <UseCase>

Create Re-Processing Strategy

BASIC COURSE:

For each item in the Processing List, the DMS examines Provenance, determines the point of re-processing, and creates a corresponding Policy.

ALTERNATE COURSES:

3.3.3.1.8 Stop DMCS <UseCase>

Stop DMCS

BASIC COURSE:

On command from the DMS Administrator, the DMS records a stop Event and initiates an automated shutdown procedure at each facility. Each facility executes the automated shutdown process, and the shutdown is then verified by Observatory Operations.

ALTERNATE COURSES:


3.3.3.2 Event Handling

[Use case diagram: Event Handling. «System» use cases: Record Event, Subscribe to an Event Topic, Create Event Transmitter, Create Event Receiver, Establish Event Transmitter In Event System, Establish Event Receiver In Event System, Publish Event, Publish Event Using Event System, Matching Receive Event, Receive Event, Retrieve Event Using Event System, Retrieve Matching Event Using Event System; connected by «invokes» relationships.]

Event Handling - Use Case Diagram


[Use case diagram: Event Monitoring. «Controller» use cases: Run Event Monitor, Run Fault Monitor; «System» use cases: Process Incoming Event, Initialize Event Monitor, Create Timer, Publish Event Using Event System; connected by «invokes» and «instanceOf» relationships.]

Event Monitoring - Use Case Diagram

3.3.3.2.1 Create Event Receiver <UseCase>

This use case describes how to set up a subscriber inside a process wishing to receive Events. The invoker needs to know the Event Broker’s host name and the logical name of the Event Topic that will be used to retrieve messages.

BASIC COURSE:

The process creates an Event Receiver using the host name of an Event Broker, and the Event Topic name to subscribe to. On successful completion, all Events sent to that Event Topic can be read by the Event Receiver for the life of the Event Receiver. The Events are read from the Event Receiver in the order they arrive.

Done.

ALTERNATE COURSES:

If a host name is given for a host that does not have an Event Broker, an exception is thrown.

3.3.3.2.2 Create Event Transmitter <UseCase>


This use case describes how to set up an Event Transmitter inside a process wishing to send events. The invoker needs to know the Event Broker’s host name and the logical name of the event topic to publish to.

BASIC COURSE:

The process creates an Event Transmitter using the host name of an Event Broker, and the Event Topic name to publish to. All Events sent from this Event Transmitter are sent only to this Event Topic, and can be read by all current subscribers to that topic.

Done.

ALTERNATE COURSES:

If a host name is given for a host that does not have an Event Broker, an exception is thrown.
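The Transmitter/Receiver pairing in these two use cases can be illustrated with an in-memory stand-in for the Event Broker; a real deployment would use a message bus, and all class names here are illustrative, not the actual API.

```python
import queue

class EventBroker:
    """In-memory stand-in for the Event Broker: routes Events by topic."""
    def __init__(self):
        self._topics = {}                     # topic name -> subscriber queues

    def subscribe(self, topic):
        q = queue.Queue()
        self._topics.setdefault(topic, []).append(q)
        return q

    def publish(self, topic, event):
        for q in self._topics.get(topic, []):  # all current subscribers
            q.put(event)

class EventTransmitter:
    """Sends Events only to the one Event Topic it was created with."""
    def __init__(self, broker, topic):
        self.broker, self.topic = broker, topic

    def publish(self, event):
        self.broker.publish(self.topic, event)

class EventReceiver:
    """Reads Events from its Event Topic in arrival order."""
    def __init__(self, broker, topic):
        self._queue = broker.subscribe(topic)

    def receive(self, timeout=None):
        try:
            return self._queue.get(timeout=timeout)  # blocks if no timeout
        except queue.Empty:
            return None                       # "null event" on timeout
```

The `receive(timeout=...)` behaviour sketches the timeout/null-event semantics described later in "Receive Event".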

3.3.3.2.3 Create Timer <UseCase>

Create Timer - A Timer is an alarm which is set to expire within a certain time period if an event matching a Condition doesn't arrive.

BASIC COURSE:

The Event Monitor creates a Timer with a time value.

If the timer alarm expires before the Condition is met, the Timer's Action is executed.

If the Condition is met before the time expires, the Condition's Action is executed.
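A minimal sketch of the Timer behaviour described above, built on Python's threading.Timer; the class and method names are assumptions, not the actual Event Monitor API.

```python
import threading

class ConditionTimer:
    """Run on_timeout if condition_met() is not called within `seconds`;
    otherwise cancel the alarm and run on_condition."""
    def __init__(self, seconds, on_timeout, on_condition):
        self._on_condition = on_condition
        self._timer = threading.Timer(seconds, on_timeout)
        self._timer.start()

    def condition_met(self):
        """Call when an Event matching the Condition arrives in time."""
        if self._timer.is_alive():
            self._timer.cancel()              # alarm never fires
            self._on_condition()

    def wait(self):
        """Block until the underlying timer thread has finished."""
        self._timer.join()
```

A Fault Monitor (see "Run Fault Monitor" below) would use something like this to watch for an end-of-transfer Event within a deadline.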

ALTERNATE COURSES:

3.3.3.2.4 Establish Event Receiver In Event System <UseCase>

Establish Event Receiver in Event System - This use case describes how to set up a receiving channel in the Event System inside a process. The invoker gives the Event System the name of the system hosting the Event Broker, and the topic name to listen to.

BASIC COURSE:

The process wishing to receive events on an event channel retrieves a reference to the Event System.


The process gives the Event System the host name and the topic name that will be later used to receive events and invokes the "Create Event Receiver" use case.

The resulting Event Receiver is kept by the Event System for subsequent use.

Done.

ALTERNATE COURSE:

An exception is thrown if the broker at the given host name cannot be contacted, or does not exist.

3.3.3.2.5 Establish Event Transmitter In Event System <UseCase>

Establish Event Transmitter in Event System - This use case describes how to set up a publishing channel in the Event System inside a process wishing to send events. The invoker gives the Event System the name of the system hosting the event broker, and the topic name to which events can be published. In later calls, processes only need to refer to the topic name on which to publish when sending events. References to publishing topic names are kept internally in the Event System.

BASIC COURSE:

The process wishing to transmit events to an Event Topic retrieves a reference to the global Event System.

The process gives the Event System the host name and the Event Topic that will be later used to transmit Events and invokes the "Create Event Transmitter" use case.

The resulting Event Transmitter is stored by the Event System for subsequent use by the process when it publishes Events.

Done.

ALTERNATE COURSE:

An exception is thrown if the broker at the given host name cannot be contacted, or does not exist.

An exception is thrown if the Event Topic has not been registered with the Event System.

3.3.3.2.6 Initialize Event Monitor <UseCase>


Initialize Event Monitor - This use case describes how the Event Monitor is initialized to wait for incoming traffic.

BASIC COURSE:

The Event Monitor reads an Event Monitor Configuration file. The Event Monitor also initializes the incoming traffic stream, and waits for the initial Event data to arrive.

ALTERNATE COURSES:

3.3.3.2.7 Matching Receive Event <UseCase>

Matching Receive Event - This use case describes how to receive Events which match a name and value pair in the Event payload. An optional timeout value can be given to match name/value pairs within the specified time period.

BASIC COURSE:

The process gives a name and value to be matched to an Event Receiver. The Event Receiver looks in the Event Cache to see if there are any previously received Events that match that pair. If there is a match, this event is returned.

If there is no match in the Event Cache, the Event Receiver listens for new Events from the Event Broker. When an Event is received, the name/value pair is matched against it. If there is a match, the event is returned to the caller.

If there is no match, the event is added to the Event Cache, and the Event Receiver again listens for Events from the Event Broker, repeating the steps above.
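The cache-then-listen matching loop above can be sketched as follows; `next_event` stands in for the connection to the Event Broker, and removing a matched Event from the cache is our assumption (the use case leaves this unspecified).

```python
from collections import deque

def matching_receive(cache, name, value, next_event):
    """Return the first Event whose payload has payload[name] == value.

    cache:      deque of previously received, unmatched Events
    next_event: callable that blocks and returns the next Event from the broker
    """
    for event in list(cache):                 # check the Event Cache first
        if event.get(name) == value:
            cache.remove(event)               # assumed: consume on match
            return event
    while True:                               # then listen for new Events
        event = next_event()
        if event.get(name) == value:
            return event
        cache.append(event)                   # keep for later matching calls
```

Note that non-matching Events are never discarded; they accumulate in the cache for later matching calls, so a real implementation would also need a cache-eviction policy.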

ALTERNATE COURSES:

3.3.3.2.7.1 Publish Event <UseCase>

Publish Event - This use case describes how to publish an Event to an Event Topic.

BASIC COURSE:

An Event of the appropriate type is created and given a data payload.

The process wishing to publish an Event gives the event to an Event Transmitter created with the event topic the process wishes to publish to. The Event Transmitter immediately sends the Event to the Event Broker. The Event Broker distributes the event to all current subscribers to that event topic.

Done.

ALTERNATE COURSE:

If a connection to the Event Broker is broken, an exception is thrown.

3.3.3.2.8 Process Incoming Event <UseCase>

Process Incoming Event - This use case describes how an incoming Event flows through the Event Monitor.

BASIC COURSE:

An Event arrives at the Event Monitor. The Event is matched against the sets of Conditions specified when the Event Monitor was initialized.

If a Condition matches, its Action is executed. If the Condition is part of a list of Condition/Action pairs, the next Condition/Action pair will be used when the next Event comes in.
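The Condition/Action sequencing described above can be sketched as a list of predicate/callback pairs consumed one pair per matching Event; the class name and exact sequencing policy are illustrative assumptions.

```python
class ConditionSequence:
    """A list of Condition/Action pairs; after a pair matches and its Action
    runs, the next incoming Event is checked against the next pair."""
    def __init__(self, pairs):
        self._pairs = list(pairs)             # [(condition, action), ...]
        self._index = 0

    def process(self, event):
        """Return True if the current Condition matched this Event."""
        if self._index >= len(self._pairs):
            return False                      # sequence exhausted
        condition, action = self._pairs[self._index]
        if condition(event):
            action(event)
            self._index += 1                  # advance to the next pair
            return True
        return False
```

An Event Monitor would typically hold several such sequences plus standalone Condition/Action pairs, feeding each incoming Event to all of them.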

ALTERNATE COURSES:

3.3.3.2.9 Publish Event Using Event System <UseCase>

Publish Event Using Event System - This use case describes how to publish an event using the Event System. The invoker supplies the Event to be published, and the Event Topic to publish to.

BASIC COURSE:

The process gives the Event System an Event Topic name and an Event to publish on that topic. The Event System matches the topic name with the list of Event Transmitters that have previously been created, and uses that Event Transmitter to publish the Event.

Done.

ALTERNATE COURSE:


If an Event Transmitter with the given Event Topic name does not exist, an exception is thrown.

If a connection to the Event Broker is broken, an exception is thrown.
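The topic-name lookup and error behaviour above can be sketched as a small registry; `EventSystem` and its methods here are simplified stand-ins for the real interface, with the transmit side reduced to a callable.

```python
class EventSystem:
    """Registry mapping Event Topic names to previously created transmitters."""
    def __init__(self):
        self._transmitters = {}               # topic name -> transmit callable

    def register_transmitter(self, topic, transmit):
        """Stand-in for "Establish Event Transmitter In Event System"."""
        self._transmitters[topic] = transmit

    def publish(self, topic, event):
        """Publish via the transmitter registered for `topic`."""
        if topic not in self._transmitters:
            raise KeyError("no Event Transmitter for topic %r" % topic)
        self._transmitters[topic](event)
```

Receivers would be registered and looked up the same way for "Retrieve Event Using Event System".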

3.3.3.2.10 Receive Event <UseCase>

Receive Event - This use case describes how to retrieve an Event, optionally within a certain time limit.

BASIC COURSE:

The process uses an Event Receiver to request a new event. The Event Receiver blocks until a new event arrives, and the event is returned to the calling process.

If the process calls the Event Receiver with a timeout value, the Event Receiver will wait up to that amount of time. If an Event hasn't been received within this time period, a null event is returned.

ALTERNATE COURSE:

3.3.3.2.11 Record Event <UseCase>

Record Event - Describes the process for creating, publishing, and recording an Event message.

BASIC COURSE:

The process invokes the "Establish Event Transmitter In Event System" use case to create an Event Transmitter in the Event System.

The process creates an Event Object of the proper type and loads it with the desired data to record.

The process invokes the "Publish Event Using Event System" use case to send the Event.

Each process which subscribes to this event topic receives the message.

The events sent to this topic are recorded to a database via an external process which records all events.

Done.
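The flow above can be sketched end to end, with the invoked use cases reduced to plain data structures and callables; in the real system the Event would travel through the Event Broker, and persistence would be handled by a separate recorder process subscribed to the topic.

```python
def record_event(event_system, topic, payload, recorder):
    """Create an Event, publish it on `topic`, and hand it to the recorder.

    event_system: dict of topic name -> list of delivered Events (stand-in
                  for the Event System's publish path)
    recorder:     callable standing in for the external process that records
                  all events to the Event Database
    """
    event = {"topic": topic, "data": payload}         # create the Event Object
    event_system.setdefault(topic, []).append(event)  # "Publish Event Using Event System"
    recorder(event)                                   # external recording process
    return event
```

Subscribers to the topic would see the same Event; only the database write happens in the dedicated recorder.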

ALTERNATE COURSE:


3.3.3.2.12 Retrieve Event Using Event System <UseCase>

Retrieve Event Using Event System - This use case describes how to retrieve a message from the Event System. The caller only needs to supply the Event Topic to receive messages from. An optional timeout value can be given so that the call returns if no event arrives within the specified time period.

BASIC COURSE:

The process requests the Event System receive a message on an event topic channel. The Event System searches for an Event Receiver with that Event Topic, and uses it to retrieve the next incoming event, which is returned to the invoking process.

ALTERNATE COURSE:

If no Event Receiver matches the Event Topic, an exception is thrown.

3.3.3.2.13 Retrieve Matching Event Using Event System <UseCase>

Retrieve Matching Event Using Event System - This use case describes how to receive Events which match a name and value pair in the Event payload through the Event System. An optional timeout value can be given to match name/value pairs within the specified time period.

BASIC COURSE:

The process gives a Topic Name, along with a name and value pair to be matched to the Event System. The Topic Name is matched to the Event Receiver which has previously been registered to the Event System.

The matching Event Receiver is used to invoke the "Matching Receive Event" use case.

The matching Event is returned to the invoking process.

ALTERNATE COURSE:

If the Topic Name does not match those listed in the Event System, an exception is thrown.

3.3.3.2.14 Run Event Monitor <UseCase>

Run Event Monitor - The Event Monitor watches Event streams and performs actions based on information it receives from those incoming events. This use case describes the execution path of the Event Monitor.

BASIC COURSE:

The Event Monitor invokes "Initialize Event Monitor" to read the Event Monitor configuration file to set up data structures for evaluating incoming events, and to begin monitoring event topic(s).

The Event Monitor invokes "Create Timer" when events are required to appear within a certain amount of time.

The Event Monitor invokes "Process Incoming Event" when a new Event arrives. This is done continuously for each new Event until the Event Monitor is stopped.

ALTERNATE COURSES:

3.3.3.2.15 Run Fault Monitor <UseCase>

Run Fault Monitor - The Fault Monitor is a type of Event Monitor that is configured to issue Fault Events based on conditions it detects in the system. Multiple Fault Monitors may exist.

Typical Conditions the Fault Monitor will watch for are: single exception events; sequences of Events that must occur within a certain amount of time; an Event that contains a parameter that is out of the range it expects.

Typical Actions that the Fault Monitor will take: issuing Fault Events; sending an e-mail to an administrator; displaying an error message on an operator console.

BASIC COURSE:

This is a Basic Course that outlines how an operator is alerted to a failed data transfer:

The Fault Monitor invokes the "Initialize Event Monitor" use case to read in its configuration files.

It begins to retrieve incoming Events by invoking the "Process Incoming Event" use case.

An Event indicating the start of a data transfer matches one of the Conditions in the Fault Monitor. The Fault Monitor invokes the "Create Timer" use case to set an alarm that will go off if an Event indicating the end of transmission isn't received within a set time.

Events continue to arrive, but the Timer alarm goes off before the Event indicating "end of transmission" arrives.

A Fault Event is issued by the Fault Monitor indicating that the data transfer did not complete within the specified time.

Done.

ALTERNATE COURSES:
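The timer-armed fault detection described in this use case can be sketched as a minimal Python model. The event names, timeout value, and method names are illustrative assumptions, not the DMS interfaces.

```python
class FaultMonitor:
    """Minimal model of timer-armed fault detection; names are illustrative."""
    def __init__(self, timeout):
        self.timeout = timeout      # seconds allowed for the transfer
        self._deadline = None       # armed timer, if any
        self.faults = []            # Fault Events issued so far

    def process_incoming_event(self, event, now):
        if event == "transfer started":
            # Condition matched: arm a timer for the end-of-transmission event.
            self._deadline = now + self.timeout
        elif event == "transfer finished":
            self._deadline = None   # transfer completed in time; cancel timer

    def check_timer(self, now):
        # Called periodically; issues a Fault Event if the alarm goes off.
        if self._deadline is not None and now > self._deadline:
            self.faults.append("data transfer did not complete in time")
            self._deadline = None

monitor = FaultMonitor(timeout=10.0)
monitor.process_incoming_event("transfer started", now=0.0)
monitor.check_timer(now=5.0)    # still within the allowed window
monitor.check_timer(now=15.0)   # alarm fires: no "transfer finished" seen
```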

3.3.3.2.16 Subscribe to an Event Topic <UseCase>

Subscribe to an Event Topic - Describes how an event consumer registers itself to receive events.

BASIC COURSE:

The process wishing to consume Events on an Event Topic invokes the "Establish Event Receiver In Event System" use case.

From that point on, each time the process wishes to receive an event, it invokes the "Receive Event Using Event System" use case to retrieve Events.

ALTERNATE COURSES:

If a process desires to retrieve only Events that match certain criteria, it invokes the "Retrieve Matching Event Using Event System" use case to retrieve Events.

3.3.4 Science Database and Data Access Services

This WBS element includes software programs, database tables, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Database Services
· Query Services
· Catalog Construction
· Virtual Observatory Interfaces

These services provide the ability to ingest, index, federate, query, and administer DMS data products on distributed, heterogeneous storage system and data server architectures. All services will be implemented to provide reasonable fault-tolerance and autonomous recovery in the event of software and hardware failures.


3.3.4.1 Catalog Construction Toolkit

This WBS element includes software programs, database tables, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Create Data Type
· Create Meta Data
· Create Catalog

3.3.4.1.1 Construct Catalog <UseCase>

02C.06.06 Catalog Construction Toolkit

DESCRIPTION:

Construct Catalog -

BASIC COURSE:

The Pipeline Creator creates Data Types that can be members of Catalogs. Invoke "Create Data Type".

The Pipeline Creator creates a Catalog for later population with appropriate data sets. Invoke "Create Catalog".

ALTERNATE COURSES:

Construct Catalog - Use Case Diagram (participants: Catalog Creator (from Actors); «Controller» Construct Catalog, which «invokes» «Controller» Create Data Type and «Controller» Create Catalog).

3.3.4.1.1.1 Create Catalog <UseCase>

Create Catalog -

BASIC COURSE:

The Catalog Creator requests the creation of a new Catalog.

The DMS creates a new empty Catalog.

The Catalog Creator defines the Data Types that represent the "columns" or "tables" of the Catalog. Invoke "Select Data Types".

The Catalog Creator defines the validation constraints that are to be maintained in ensuring the quality and integrity of the Catalog data. Invoke "Define Validation Constraints".

The Catalog Creator defines the access constraints that determine what users and processes can access the Catalog. Invoke "Define Access Constraints".

The Catalog Creator indicates they are done creating the Catalog.


The DMS saves the Catalog and publishes the Catalog into the Catalog Construction Environment, making it available for population. Invoke "Compile and Publish Catalog".

The Catalog Creator creates a loader for Catalog initialization. Invoke "Create Loader".

The Catalog Creator initializes the Catalog by invoking the Loader on an initial data set. Invoke "Initialize Catalog Contents".

ALTERNATE COURSES:

The Catalog Creator requests to delete or edit an existing Catalog.

The DMS retrieves the requested Catalog.

Create Catalog - Use Case Diagram (participants: «Controller» Create Catalog; «Business» Browse Data Types, Select Data Types, Define Validation Constraints, Define Access Constraints, Create Loader; «System» Compile and Publish Catalog, Initialize Catalog Contents; related by «invokes»).

3.3.4.1.1.1.1 Compile and Publish Catalog <UseCase>

Compile and Publish Catalog -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.


ALTERNATE COURSES:

3.3.4.1.1.1.2 Create Loader <UseCase>

Create Loader -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:

3.3.4.1.1.1.3 Define Access Constraints <UseCase>

Define Access Constraints -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:

3.3.4.1.1.1.4 Define Validation Constraints <UseCase>

Define Validation Constraints -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:

3.3.4.1.1.1.5 Initialize Catalog Contents <UseCase>

Initialize Catalog Contents -


BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:

3.3.4.1.1.1.6 Select Data Types <UseCase>

Select Data Types -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:

3.3.4.1.1.2 Create Data Type <UseCase>

Create Data Type -

BASIC COURSE:

- The Catalog Creator requests the creation of a new Data Type. The DMS creates a new empty Data Type.

- The Catalog Creator defines the attributes of the Data Type. Invoke "Define Attributes".

- The Catalog Creator defines the associations and indexing of the Data Type. Invoke "Define Data Associations and Indexing".

- The Catalog Creator defines the pipelines and tools that create and manipulate the Data Type. Invoke "Define Pipeline/Tools Associations".

- The Catalog Creator defines the physical storage for the Catalog. Invoke "Define Physical Storage".

- The Catalog Creator compiles the Data Type into the DMS, making it available to be used in Catalogs. Invoke "Compile and Publish Data Type".

ALTERNATE COURSES:


The Catalog Creator requests to delete or edit an existing Data Type. The DMS retrieves the requested Data Type.

Create Data Type - Use Case Diagram (participants: «Controller» Create Data Type; «Business» Define Attributes, Define Data Associations and Indexing, Define Physical Storage (from Pipeline Construction Toolkit), Define Pipelines/Tools Associations, Browse Data Types; «System» Compile and Publish Data Type; related by «invokes»).

3.3.4.1.1.2.1 Browse Data Types <UseCase>

Browse Data Types -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:


3.3.4.1.1.2.2 Compile and Publish Data Type <UseCase>

Compile and Publish Data Type -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:

3.3.4.1.1.2.3 Define Attributes <UseCase>

Define Attributes -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:

3.3.4.1.1.2.4 Define Data Associations and Indexing <UseCase>

Define Data Associations and Indexing -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:

3.3.4.1.1.2.5 Define Physical Storage <UseCase>

Define Physical Storage -

BASIC COURSE:

TBD - write this use case when catalog construction middleware is selected.

ALTERNATE COURSES:


3.3.4.2 Query Services

This WBS element includes software programs, database tables, configuration files, unit tests, component integration tests, and documentation that implement Query Services. It includes off-the-shelf query capabilities and custom parallel queries.

Querying - Use Case Diagram (participants: «Business» Execute Cone Search, Query Catalog, Query Image Archive, Formulate and Submit Query; «System» Process Query, Catch Hostile Queries, Estimate Query Cost, Log Queries, Access Database Systems, Access File Systems, Process Query Results; related by «invokes»).

3.3.4.2.1 Formulate and Submit Query <UseCase>

02C.06.05 Query Services

DESCRIPTION:

Formulate and Submit Query -

BASIC COURSE:

- User browses list of Data Products.
- User makes a selection.
- Based on selection, user is offered a choice of Query Interface.
- User selects Query Interface.
- User selects attributes of interest from Meta Data.
- Query formulated.
- Invoke Process Query.

ALTERNATE COURSES:

Formulate Query Robustness - Analysis Diagram (steps: browse Data Product, select Data Product; browse Query Interface, select Query Interface; browse attributes from Metadata, select attributes from Metadata; formulate Query Interface; formulate Query; «System» Process Query).

3.3.4.2.2 Process Query <UseCase>

02C.06.05 Query Services

DESCRIPTION:

Process Query -

BASIC COURSE:

- Invoke "Catch Hostile Queries", abort the query if the query identified as "hostile"

244

Page 258: Introduction - DocuShare · Web view3.3.2.1.5.2Checkpoint/Restart within Processing Step 209 3.3.2.1.6Clean Up after Execution 209 3.3.2.1.7Configure

LSST Data Management UML Use Case and Activity Model LDM-134

7/12/2011

- Determine what resource types the query needs (catalog, files)- Check if needed resources are available, fail if they are not- Invoke "Estimate Query Cost"

- Inform user if query was classified as "expensive/long", abort/continue depending on user input

- Execute the query, this will invoke "Access File Systems" and/or "Access Database Systems"- While the query runs, monitor execution time, abort if time limit exceeded- Invoke "Process Query Results"- Return results to the user- The query is logged (Log Queries use case)

ALTERNATIVE COURSES:
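The basic course above can be sketched as a single control-flow function. The helper callables, the log format, and the cost classes are illustrative assumptions, not the DMS query API.

```python
log_entries = []  # stand-in for the Log Queries use case

def process_query(query, *, is_hostile, estimate_cost, confirm_expensive, run_query):
    """Control-flow sketch of Process Query; helper callables are assumptions."""
    if is_hostile(query):                       # Catch Hostile Queries
        log_entries.append((query, "aborted: hostile"))
        return None
    cost = estimate_cost(query)                 # Estimate Query Cost
    if cost == "long" and not confirm_expensive(query):
        log_entries.append((query, "aborted: user declined long query"))
        return None
    rows = run_query(query)                     # Access Database/File Systems
    log_entries.append((query, f"ok: {len(rows)} rows"))
    return rows

rows = process_query(
    "SELECT * FROM Object LIMIT 10",
    is_hostile=lambda q: "DROP" in q.upper(),   # illustrative hostility test
    estimate_cost=lambda q: "short",
    confirm_expensive=lambda q: False,
    run_query=lambda q: [("obj1",), ("obj2",)],
)
```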

3.3.4.2.2.1 Log Queries <UseCase>

BASIC COURSE:

Log each query. Logging should include:
- number of rows returned
- columns touched
- elapsed execution time
- ...

3.3.4.2.2.2 Catch Hostile Queries <UseCase>

BASIC COURSE:

ALTERNATIVE COURSES:

3.3.4.2.2.3 Estimate Query Cost <UseCase>

Wrap tools provided by the RDBMS (for instance EXPLAIN in MySQL/Oracle) and provide an estimate of how long the query will take, e.g. short, medium, or long.
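A minimal sketch of such a wrapper, assuming the RDBMS optimizer (e.g. the rows column of a MySQL EXPLAIN plan) has already produced an estimated row count; the thresholds are illustrative assumptions.

```python
def classify_query_cost(estimated_rows, short_limit=10_000, long_limit=10_000_000):
    """Bucket an optimizer row estimate into coarse cost classes.

    The thresholds are illustrative; a real wrapper would obtain
    estimated_rows from the RDBMS query plan (e.g. EXPLAIN output).
    """
    if estimated_rows < short_limit:
        return "short"
    if estimated_rows < long_limit:
        return "medium"
    return "long"
```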

3.3.4.2.2.4 Access Database Systems <UseCase>


BASIC COURSE:

- Access database
- If requested data exceeds max limit, abort

ALTERNATIVE COURSES:

3.3.4.2.2.5 Access File Systems <UseCase>

BASIC COURSE:

- Access files
- If requested data exceeds max limit, abort

ALTERNATIVE COURSES:

3.3.4.2.2.6 Process Query Results <UseCase>

BASIC COURSE:

ALTERNATIVE COURSE:

3.3.4.2.3 SQL Syntax <Issue>

Can we use a typed SQL statement as the form of Query creation?

3.3.4.3 Data Ingest (Database Services)

This WBS element includes software programs, database tables, configuration files, unit tests, component integration tests, and documentation that implement Database Services. The element includes off-the-shelf DBMS capabilities.

3.3.4.3.1 Run Data Ingest <UseCase>


02C.06.03 Database Services

DESCRIPTION:

Run Data Ingest - ingest newly created data products into the archive

BASIC COURSE:

This use case is triggered by a monitoring event that indicates that data has been completely processed by a pipeline and that one or more data products are available for ingest.

- For catalog products, invoke "Run Catalog Ingest Service" UC

- For image and other file-based products, invoke "Run File Ingest Services" UC.

ALTERNATE COURSE:

Run Data Ingest - Use Case Diagram (participants: «Controller» Run Data Ingest, which «invokes» «Controller» Run File Ingest Service and «Controller» Run Catalog Ingest Service).

3.3.4.3.1.1 Run Catalog Ingest Service <UseCase>

Run Catalog Ingest Service - manages the Data Ingest Service process.

BASIC COURSE:

Run Data Ingest invokes Run Catalog Ingest Service

- Invoke Initialize Data Ingest

- Listen for "register" events from Data Ingest Converters, invoke Register Data Ingest Converter for each such event.


- On "start loading data" event from Run Nightly Pipelines:1) stop listening for "register" events2) periodically invoke Control Ingest Tables3) start receiving input data from registered Data Ingest Converters through the buffer

- Each time new data arrives: 1) read Input Data 2) invoke Verify Input Data 3) invoke Load Data Chunk to Database 4) invoke Expose Ingested Data

- On "unregister" signal from Data Ingest Converter invoke Unregister Data Ingest Converter

- Periodically invoke Cleanup

- On signal from Run Nightly Pipelines invoke Finish Nightly Ingest

- On shutdown signal from Run Nightly Pipelines invoke Shutdown Data Ingest

ALTERNATE COURSES:
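The event-driven control flow above can be modeled as a small state machine. The event names follow the course; the class shape and payloads are illustrative assumptions.

```python
class CatalogIngestService:
    """State-machine sketch of the ingest service event loop (illustrative)."""
    def __init__(self):
        self.converters = set()   # registered Data Ingest Converters
        self.loading = False      # set on "start loading data"
        self.loaded_chunks = []   # chunks passed through Verify/Load/Expose

    def handle_event(self, event, payload=None):
        if event == "register" and not self.loading:
            self.converters.add(payload)        # Register Data Ingest Converter
        elif event == "start loading data":
            self.loading = True                 # stop listening for "register"
        elif event == "data" and self.loading:
            self.loaded_chunks.append(payload)  # Verify + Load + Expose
        elif event == "unregister":
            self.converters.discard(payload)    # Unregister Data Ingest Converter

service = CatalogIngestService()
service.handle_event("register", "dic-1")
service.handle_event("start loading data")
service.handle_event("register", "dic-2")      # ignored: registration closed
service.handle_event("data", {"chunk": 1})
service.handle_event("unregister", "dic-1")
```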

Run Catalog Ingest Service - Use Case Diagram (participants: «Controller» Run Data Ingest, «Controller» Run Catalog Ingest Service; «System» Initialize Data Ingest, Shutdown Data Ingest, Load Data Chunk to Database, Control Ingest Tables, Verify Input Data, Expose Ingested Data, Cleanup, Finish Nightly Ingest, Finish Nightly Ingest at Base Camp, Finish Nightly Ingest at Main Archive; related by «invokes», «invokes periodically», «invokes once», «precedes», and «realize»).


Run Catalog Ingest Service Robustness Diagram - Analysis Diagram (elements: Run Catalog Ingest Service; Get Catalog Ingest Configuration; Create; Load Schema; Read Data; Query Data; Verify Input Data; Record event; «System» Register Data Ingest Converter; «System» Unregister Data Ingest Converter; «System» Load Data Chunk to Database; Expose Ingested Data; «System» Finish Nightly Ingest; «System» Shutdown Data Ingest).

3.3.4.3.1.1.1 Initialize Data Ingest <UseCase>

Initialize Data Ingest - Initializes the data ingest service.

BASIC COURSE:

Start Catalog Ingest Service

- The Catalog Ingest Service gets Ingest-related configuration from the Pipeline Policy file. (see the notes of the Get Catalog Ingest Configuration control objects for more details)

- The CIS gets the configuration "how long to keep ingest databases from previous nights".

- On each Ingest Node the Catalog Ingest Service: -- Checks available space -- Creates new database (the name should be unique for each night) -- Loads schema

- Records Event In Event Database "CIS started"


ALTERNATE COURSES:

Initialize Data Ingest Robustness Diagram - Analysis Diagram (steps: Initialize CIS; Get Catalog Ingest Configuration; Check Available Space; Create; Load Schema).

3.3.4.3.1.1.2 Register Data Ingest Converter <UseCase>

Register Data Ingest Converter -

BASIC COURSE:

On an event from Data Ingest Converter:

- Set up an Ingest Table

- Set up a temporary, in-memory table for verifying input data

- Set up ProcessingSlice-Converter and Converter-Database buffers

ALTERNATE COURSES:

Register Ingest Stream Robustness Diagram - Analysis Diagram (steps: Register DIC; Create Table; Setup Buffer).

3.3.4.3.1.1.3 Unregister Data Ingest Converter <UseCase>

Unregister Data Ingest Converter -

BASIC COURSE:

- Delete the temporary, in-memory table used for verifying input data

- Destroy ProcessingSlice-Converter and Converter-Database buffers

ALTERNATE COURSES:

Unregister Data Ingest Converter Robustness Diagram - Analysis Diagram (steps: Unregister DIC; Delete Table; Destroy Buffer).

3.3.4.3.1.1.4 Control Ingest Tables <UseCase>

Control Ingest Tables - Monitors whether tables are locked by readers for too long; if they are, takes appropriate action.

BASIC COURSE:

For each ingest node for each table:

- Periodically check whether the table is locked. If locked, force-stop the transaction.

- "Periodic" should be once per image (close to the time when the new data chunk is expected to arrive).

ALTERNATE COURSES:
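A hedged sketch of the periodic lock check above, assuming lock start times are tracked per table; the bookkeeping structure is an assumption, not a real DBMS interface.

```python
def control_ingest_tables(lock_times, now, max_lock_seconds):
    """Force-stop transactions holding a table lock for too long.

    lock_times maps table name -> lock start time, or None if unlocked
    (an assumed bookkeeping structure for illustration).
    """
    stopped = []
    for table, locked_since in lock_times.items():
        if locked_since is not None and now - locked_since > max_lock_seconds:
            lock_times[table] = None   # force-stop the offending transaction
            stopped.append(table)
    return stopped

locks = {"Object": 0.0, "Source": 90.0, "Alert": None}
stopped = control_ingest_tables(locks, now=100.0, max_lock_seconds=15.0)
```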

3.3.4.3.1.1.5 Verify Input Data <UseCase>

Verify Input Data - Verifies if input data is 'sane'

BASIC COURSE:

- Check for obvious problems with data format

- Load data to the temporary, in-memory table


- Check whether loading succeeded

- Truncate the temporary table

- Report back if loading was ok or not

ALTERNATE COURSES:
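The verify-by-loading approach above can be sketched with an in-memory SQLite table; the staging schema and column names are illustrative assumptions.

```python
import sqlite3

def verify_input_data(rows):
    """Try loading rows into a temporary in-memory table; report success.

    The staging schema (object_id, flux) is an illustrative assumption.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TEMP TABLE staging (object_id INTEGER NOT NULL, flux REAL)")
    try:
        conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
        ok = True
    except sqlite3.Error:
        ok = False            # obvious problem with the data format
    finally:
        conn.execute("DELETE FROM staging")   # truncate the temporary table
        conn.close()
    return ok                 # report back whether loading succeeded
```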

3.3.4.3.1.1.6 Load Data Chunk to Database <UseCase>

Load Data Chunk to Database -

BASIC COURSE:

- Disable indices

- Load data to database

- Enable indices

- Validate data. Validation includes: -- catching duplicates (if not done during building indices) -- checking referential integrity (if not done during building indices) -- checking for abnormal values -- others (to be decided)

ALTERNATE COURSES:

In case of problems (e.g. duplicates found when building the primary key, or a referential integrity problem when building foreign keys), delete the questionable rows and log the problem.
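The disable-indices/load/enable-indices/validate sequence, including the alternate course of deleting duplicate rows and logging the problem, can be sketched with SQLite; the table and index names are illustrative assumptions.

```python
import sqlite3

def load_data_chunk(conn, rows):
    """Disable indices, bulk load, validate, rebuild indices (illustrative schema)."""
    problems = []
    conn.execute("DROP INDEX IF EXISTS idx_object_id")               # disable indices
    conn.executemany("INSERT INTO detections VALUES (?, ?)", rows)   # load data
    # Validate: catch duplicates before rebuilding the unique index.
    dupes = conn.execute(
        "SELECT object_id FROM detections GROUP BY object_id HAVING COUNT(*) > 1"
    ).fetchall()
    for (object_id,) in dupes:
        # Alternate course: delete questionable rows and log the problem.
        problems.append(f"duplicate object_id {object_id}; questionable rows deleted")
        conn.execute("DELETE FROM detections WHERE object_id = ?", (object_id,))
    conn.execute("CREATE UNIQUE INDEX idx_object_id ON detections(object_id)")  # enable
    return problems

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE detections (object_id INTEGER, flux REAL)")
problems = load_data_chunk(conn, [(1, 0.5), (2, 0.7), (2, 0.9)])
```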

Load Data Chunk to Database Robustness Diagram - Analysis Diagram (steps: Load Data Chunk to Database; Disable Indices; Load Data Chunk; Enable Indices; Validate Data).

3.3.4.3.1.1.7 Expose Ingested Data <UseCase>

Expose Ingested Data -

BASIC COURSE:

Send an Event to the Association Pipeline indicating that data is available for consumption.

It is expected that querying will finish before the next input data arrives (loading and reindexing should take < 3 sec, which leaves ~12 sec for querying relatively small tables). If it takes too long, the query will be terminated.

3.3.4.3.1.1.8 Cleanup <UseCase>

Cleanup - Periodically some things will need to be cleaned up (garbage collection)

BASIC COURSE:


For each ingest node: remove ingest databases that are too old. The definition of "too old" is based on configuration fetched from policy during CIS initialization.

ALTERNATE COURSES:
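A minimal sketch of the retention-based cleanup, assuming per-night ingest databases are tracked by night number; the naming scheme and mapping are illustrative assumptions.

```python
def cleanup_ingest_databases(databases, current_night, keep_nights):
    """Drop per-night ingest databases older than the retention policy.

    databases maps database name -> night number; the naming and night
    numbering are illustrative assumptions.
    """
    too_old = [name for name, night in databases.items()
               if current_night - night > keep_nights]
    for name in too_old:
        del databases[name]        # remove the old ingest database
    return too_old

dbs = {"ingest_n100": 100, "ingest_n104": 104, "ingest_n105": 105}
removed = cleanup_ingest_databases(dbs, current_night=105, keep_nights=3)
```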

3.3.4.3.1.1.9 Finish Nightly Ingest <UseCase>

Finish Nightly Ingest - This is a generic use case.

BASIC COURSE:

Check if all Data Ingest Converters unregistered, and if not, unregister them. This means that DICs should register once per night, even if they run continuously.

- do something with the Ingest Tables (transfer data, backup, move somewhere else) - this is implemented in the use cases that derive from this one

- send event "nightly ingest finished successfully", plus some statistics

ALTERNATE COURSES:

Finish Nightly Ingest Robustness Diagram - Analysis Diagram (steps: Finish Nightly Ingest; Unregister DIC; Move Data Somewhere; Send "Ingest Finished ok" signal).

3.3.4.3.1.1.9.1 Finish Nightly Ingest at Base Camp <UseCase>


Finish Nightly Ingest at Base Camp - this is a realization of a generic use case Finish Nightly Ingest

BASIC COURSE:

- mark the database as "backup". It will be kept for a few days, then deleted

ALTERNATE COURSES:

3.3.4.3.1.1.9.2 Finish Nightly Ingest at Main Archive <UseCase>

Finish Nightly Ingest at Main Archive - This is a realization of the generic use case Finish Nightly Ingest.

BASIC COURSE:

Merge Ingest Tables

ALTERNATE COURSES:

3.3.4.3.1.1.10 Shutdown Data Ingest <UseCase>

Shutdown Data Ingest -

BASIC COURSE:

Wait until all data is loaded into database. When all done then:

- Invoke Finish Nightly Ingest
- Clean up temporary data on Ingest Nodes
- Return Ingest Nodes
- Record Event in Event Database "CIS terminated"
- Terminate self

ALTERNATE COURSES:

Shutdown Data Ingest Robustness Diagram - Analysis Diagram (steps: Shutdown Data Ingest Service; «System» Finish Nightly Ingest; Cleanup Temporary Data; Return Node).

3.3.4.3.1.1.11 Record event <Object>

- start
- stop
- Nightly Ingest Finished

3.3.4.3.1.1.12 Get Catalog Ingest Configuration <Object>

- mapping of ccd --> Ingest Node. This determines which nodes will be used
- how long should we keep Databases with ingested data from previous nights

3.3.4.3.1.2 Run File Ingest Service <UseCase>

Run File Ingest Service - this service loads file-based data into the archive. The file-based datasets this service operates on will appear as, or be transformed into, file-based data products available from the archive.

BASIC COURSE:

ALTERNATE COURSES:

Run File Ingest Service - Use Case Diagram (participants: «Controller» Run File Ingest Service; «System» Ensure Data is Staged for Ingest, Extract & Verify Metadata From Dataset, Assign Identifiers to Dataset, Copy Data to Long-Term Storage, Replicate Data Files and Metadata to Mirror Sites, Expose Ingested Data Files and Metadata; related by «invokes» and «precedes»).

3.3.4.3.1.2.1 Assign Identifiers to Dataset <UseCase>

Assign Identifiers to Dataset - an archive-unique identifier (or identifiers) is assigned to the dataset to enable unambiguous referencing of the dataset. Association of the dataset with logical collections is handled as part of this use case.

BASIC COURSE:

ALTERNATE COURSES:


3.3.4.3.1.2.2 Copy Data to Long-Term Storage <UseCase>

Copy Data to Long-Term Storage - Data is copied to safe, long-term storage. When this use case is complete, the data is considered safe from reasonable system failures. Note that not all data products will be preserved in long-term storage. Instead, some will be temporarily cached in near-line storage but eventually deleted; after this, such data will be regenerated on-the-fly as it is needed.

BASIC COURSE:

ALTERNATE COURSES:

3.3.4.3.1.2.3 Ensure Data is Staged for Ingest <UseCase>

Ensure Data is Staged for Ingest - Ensure files to be ingested have been staged to storage systems accessible to ingest service.

BASIC COURSE:

ALTERNATE COURSES:

3.3.4.3.1.2.4 Expose Ingested Data Files and Metadata <UseCase>

Expose Ingested Data Files and Metadata - the dataset metadata are committed to the archive database and the datasets are made available to archive users.

BASIC COURSE:

ALTERNATE COURSES:

3.3.4.3.1.2.5 Extract & Verify Metadata From Dataset <UseCase>

Extract & Verify Metadata from Dataset - metadata that will be stored in the archive holdings database is extracted from the file-based dataset and/or accompanying meta-files.

BASIC COURSE:


ALTERNATE COURSES:

3.3.4.3.1.2.6 Replicate Data Files and Metadata to Mirror Sites <UseCase>

Replicate Data Files and Metadata to Mirror Sites - data products are distributed to mirror sites that desire them. In particular, replication to the Data Access Center is necessary to consider the stored products as "safe".

BASIC COURSE:

ALTERNATE COURSES:

3.3.4.3.2 Run Data Ingest Converter <UseCase>

02C.06.03 Database Services

DESCRIPTION:

Run Data Ingest Converter -

BASIC COURSE:

Create Slices invokes Initialize Data Ingest Converter (once for each Processing Slice)

- Crunch sends data through communication buffer. Data is converted by the Data Ingest Converter and loaded to Database by the Ingest Service.

- Stop Pipeline Execution invokes Shutdown Data Ingest Converter for each slice.

ALTERNATE COURSES:

Run Data Ingest Converter - Use Case Diagram (participants: «Controller» Run Data Ingest Converter, «Controller» Run Catalog Ingest Service; «System» Initialize Data Ingest Converter, Convert Input Data, Register Data Ingest Converter, Unregister Data Ingest Converter, Shutdown Data Ingest Converter; related by «invokes» and «precedes»).


Run Data Ingest Converter Robustness Diagram - Analysis Diagram (steps: Initialize DIC; «System» Register Data Ingest Converter; Write Data; Read Data; Convert; Write Data; Shutdown DIC).

3.3.4.3.2.1 Initialize Data Ingest Converter <UseCase>

Initialize Data Ingest Converter -

BASIC COURSE:

Create Slices invokes Initialize Data Ingest Converter (for each Processing Slice)

- Start Data Ingest Converter

- Register the just-started Converter with the Catalog Ingest Service.
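As an illustration only, the two steps above can be sketched in Python; the class and function names here are assumptions, not the actual DC3 middleware API:

```python
# Illustrative sketch of Initialize Data Ingest Converter; all names are
# assumptions, not the real DC3 middleware.
import queue

class CatalogIngestService:
    """Stand-in ingest service that records each registered converter and
    the communication buffer it will read from."""
    def __init__(self):
        self.converters = {}

    def register(self, converter_id, buffer):
        self.converters[converter_id] = buffer

def initialize_data_ingest_converter(slice_id, service):
    # 1. Start the Data Ingest Converter (represented here by its id)
    converter_id = f"dic-{slice_id}"
    # 2. Set up the ProcessingSlice <-> Converter communication buffer
    buffer = queue.Queue()
    # 3. Register the just-started Converter with the Catalog Ingest Service
    service.register(converter_id, buffer)
    return converter_id, buffer

service = CatalogIngestService()
for slice_id in range(3):  # Create Slices invokes this once per Processing Slice
    initialize_data_ingest_converter(slice_id, service)
```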

ALTERNATE COURSES:


analysis Initialize Data Ingest Converters Robustness Diagram — diagram elements: Initialize DIC, Register DIC. Diagram note:

DESCRIPTION: Initializes the Data Ingest Converter.

BASIC COURSE:

Create Slices invokes Initialize Data Ingest Converter (for each Processing Slice) and sends the Data Ingest Converter

- Start Data Ingest Converter

- Set up the communication buffer between the Processing Slice and the Data Ingest Converter

- Register the Data Ingest Converter with the Catalog Ingest Service

- Set up the communication buffer between the Data Ingest Converter and the Catalog Ingest Service (for instance, the Catalog Ingest Service might store the name of the directory used for communication with the Converter).

Initialize Data Ingest Converters Robustness Diagram - Analysis Diagram

3.3.4.3.2.2 Convert Input Data <UseCase>

Convert Input Data - Guts of the Data Ingest Converter

BASIC COURSE:

Crunch use case writes data to the ProcessingSlice-IngestConverter buffer.

For each chunk of new data:

- Data Ingest Converter reads data from the buffer

- Converts the data

- Writes the data to the IngestConverter-Database buffer
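A minimal sketch of this read/convert/write loop, with plain lists standing in for the real inter-process communication buffers (all names are illustrative):

```python
# Illustrative sketch of the Convert Input Data loop; buffers are plain lists
# standing in for the real communication buffers.
def convert_input_data(slice_buffer, db_buffer, convert):
    """Drain chunks from the ProcessingSlice-IngestConverter buffer, convert
    each chunk, and write the result to the IngestConverter-Database buffer."""
    while slice_buffer:                    # for each chunk of new data
        chunk = slice_buffer.pop(0)        # read data from the buffer
        db_buffer.append(convert(chunk))   # convert, then write onward

incoming = [{"ra": 1.0}, {"ra": 2.0}]      # written by the Crunch use case
outgoing = []                              # read by the Ingest Service
# a trivial "conversion": render each record as a CSV-style line for ingest
convert_input_data(incoming, outgoing, lambda rec: f"{rec['ra']:.3f}")
```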

ALTERNATE COURSES:


analysis Convert Input Data Robustness Diagram — diagram elements: Read Data, Convert, Write Data.

Convert Input Data Robustness Diagram - Analysis Diagram

3.3.4.3.2.3 Shutdown Data Ingest Converter <UseCase>

Shutdown Data Ingest Converter - Shuts down Data Ingest Converter

BASIC COURSE:

Stop Pipeline Execution invokes Shutdown Data Ingest Converter for each Processing Slice:

- Wait until all data in the communication buffers has been processed

- Unregister the converter with the Data Ingest Service by invoking Unregister DIC

- Stop the Data Ingest Converter
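A hedged sketch of the three shutdown steps (drain, unregister, stop); the registry and buffer types are illustrative assumptions:

```python
# Illustrative sketch of Shutdown Data Ingest Converter; all names are
# assumptions, not the real DC3 middleware.
import queue

def shutdown_data_ingest_converter(buffer, registry, converter_id, process_chunk):
    # 1. Wait until all data in the communication buffer has been processed
    while not buffer.empty():
        process_chunk(buffer.get())
    # 2. Unregister the converter with the ingest service (Unregister DIC)
    registry.discard(converter_id)
    # 3. Stop the Data Ingest Converter (nothing left to do in this sketch)

buf = queue.Queue()
for chunk in ("a", "b"):
    buf.put(chunk)
registry = {"slice-0"}
processed = []
shutdown_data_ingest_converter(buf, registry, "slice-0", processed.append)
```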

ALTERNATE COURSES:


analysis Shutdown Data Ingest Converter Robustness Diagram — diagram elements: Shutdown DIC, Unregister DIC.

Shutdown Data Ingest Converter Robustness Diagram - Analysis Diagram

3.3.4.4 Data Access Framework

implements all structured data, including the schema for Source, Object, Deep Object, Orbit Catalogs and all Meta-Data.

3.3.4.4.1 LsstData and Citizen Use Cases


uc LSSTData object creation — diagram elements: Lsst Application (actor), «System» Configure LsstData Support, «System» Obtain an LsstData realization, linked by «invokes» relationships. Diagram notes:

This set of Use Cases addresses the creation of a specific realization of an LsstData object, fully configured with support for persistence, security, release process, etc. In particular, they elaborate the separation of the configuration of an LsstData object instance's support classes from the construction of the object itself.

Which of the support objects contained by an LsstData object instance can be configured? Data Properties?

LSSTData object creation - Use Case Diagram

3.3.4.4.1.1 Lsst Application <Actor>

For example, a pipeline stage.

3.3.4.4.1.2 Configure LsstData Support <UseCase>

02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:Configure LsstData Support

Given a specific LsstData realization, change the support contained in the object.

Update/replace the Support objects (Persistence, Security, etc.) associated with the LsstData object according to the content of a specific Policy object that describes a set of support strategies for the object.


INPUTS:- specific LsstData realization- specific Policy object

OUTPUTS:- altered Support objects (Persistence, Security, etc.)

GIVEN:

ALGORITHM:

The actor passes the LsstData object to a Support Factory along with the Policy object. Via the content of the Policy object, the actor can control the extent of change to the Support for the LsstData realization (e.g. Persistence only or all of it at once).
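A minimal sketch of the algorithm, assuming hypothetical Support and SupportFactory classes with a dict standing in for the Policy object:

```python
# Hypothetical sketch: a Policy (here a dict) drives a Support Factory that
# replaces only the support objects the Policy mentions.
class Support:
    def __init__(self):
        self.persistence = "none"   # default: no-op persistence
        self.security = "world"     # default: world access

class LsstData:
    def __init__(self):
        self.support = Support()

class SupportFactory:
    @staticmethod
    def configure(lsst_data, policy):
        # the Policy controls the extent of change, e.g. persistence only
        for key, value in policy.items():
            setattr(lsst_data.support, key, value)
        return lsst_data

img = LsstData()
SupportFactory.configure(img, {"persistence": "fits"})   # persistence only
```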

EXCEPTIONS:

NOTES:

analysis Configure LsstData Support — diagram elements: Lsst Application, configure support via Policy, Policy Factory, linked by a «call» relationship; the attached note restates the ALGORITHM above.

Configure LsstData Support - Analysis Diagram

3.3.4.4.1.2.1 Policy Factory <Object>A constructor???

A loader? i.e. load from some persistent store???

3.3.4.4.1.3 Obtain an LsstData realization <UseCase>

02C.06.01 Catalogs and MetaData: Data Access Framework


DESCRIPTION:

Obtain an LsstData Realization - Create a specific LsstData realization.

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

Call the specific type's constructor:

- The actor calls the appropriate constructor for the desired LsstData type. This will result in an instance having default support objects attached to it that are appropriate for the constructor used. (For example, the Persistence object may do nothing, assuming the instance will be temporary, and the Security object may allow world access.)

ALTERNATE COURSES:

Create a specific LsstData object using an existing dataset in the archive:

- A Policy object is used to control the actions of the appropriate Data Access service. The Data Access service loads the content of the LsstData object and returns the realization. Support will already be configured in. In particular, the Persistence object will be configured to save any updates to the dataset (assuming the user has permission).
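The two courses can be contrasted in a short sketch; the constructor defaults and the archive-loading helper below are assumptions, not the real framework:

```python
# Illustrative sketch of the basic and alternate courses; names are assumptions.
class LsstData:
    """Basic course: the constructor attaches default support objects."""
    def __init__(self, content=None):
        self.content = content
        # defaults suited to a temporary instance: no-op persistence,
        # world-readable security
        self.support = {"persistence": "none", "security": "world"}

def load_from_archive(dataset, archive, policy):
    """Alternate course: a Data Access service loads the content and returns
    a realization with support already configured in."""
    data = LsstData(content=archive[dataset])
    data.support = {"persistence": "save-updates", **policy}
    return data

tmp = LsstData()                                    # basic course
archive = {"calexp-001": b"...pixels..."}
img = load_from_archive("calexp-001", archive, {"security": "owner"})
```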

NOTES:


analysis Obtain an LsstData realization — diagram elements: Lsst Application, LsstData object constructor, Data Access Framework, linked by «call» relationships; the attached notes restate the BASIC and ALTERNATE COURSES above.

Obtain an LsstData realization - Analysis Diagram

3.3.4.4.2 Persistence Use Cases

uc Persist Data from Pipeline — diagram elements: Persist Data from Pipeline, Define persistence policies, Execute persistence, linked by «invokes» and «precedes» relationships.

Persist Data from Pipeline - Use Case Diagram


uc Persist Persistable object — diagram elements: LSST Application, Execute persistence, Obtain and configure Persistence object, Obtain and configure Storage object(s) for persistence, Specify additional object metadata, Format and send Persistable object to Storage(s), linked by «invokes» and «precedes» relationships.

Persist Persistable object - Use Case Diagram

uc Retrieve Persistable object — diagram elements: LSST Application, Execute retrieval, Obtain and configure Persistence object, Obtain and configure Storage object(s) for persistence, Specify additional object metadata, Retrieve Persistable object from Storage(s), linked by «invokes» and «precedes» relationships.

Retrieve Persistable object - Use Case Diagram


3.3.4.4.2.1 Define persistence policies <UseCase>

02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Define persistence policies -

INPUTS:- name of object on clipboard

OUTPUTS:- data for persistence- persistence Policy file(s)

GIVEN:

ALGORITHM:

- Define data for persistence - specify name of object on clipboard for persistence by writing policy file(s)

- Define persistence location - specify path name or database location (host, port and schema) by writing policy file(s).

- Define additional data - specify which data is to be copied from event on a clipboard by writing policy file(s)

- Define formatter specific parameters - Specify any additional parameters such as template table names required by formatter, by writing policy file(s)
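As a sketch of what such policy file(s) might carry, here is the same information expressed as a Python mapping; every key name below is an illustrative assumption, not the real Policy schema:

```python
# Illustrative persistence Policy contents; all key names are assumptions.
persistence_policy = {
    # name of the object on the clipboard to persist
    "outputItems": {
        "calexp": {
            "storage": "FitsStorage",               # or a database location
            "location": "/data/run42/calexp.fits",  # path (host/port/schema for a DB)
        }
    },
    # additional data to copy from the event on the clipboard
    "additionalData": ["visitId", "ccdId"],
    # formatter-specific parameters, e.g. a template table name
    "formatterParameters": {"templateTable": "Source_tmpl"},
}
```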

EXCEPTIONS:

NOTES:

3.3.4.4.2.2 Execute persistence <UseCase>

02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Execute persistence - Pipeline output stage implicitly performs this step.

INPUTS:- persistence Policy

OUTPUTS:


- object persisted to Storage(s)

GIVEN:

ALGORITHM:

1. Obtain and configure Persistence object.

2. Obtain and configure Storage object(s) for persistence.

3. If indicated by persistence Policy, specify additional object metadata.

4. Format and send Persistable object to Storage(s).
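The four steps might look like this in outline; the Persistence and Storage classes below are stand-ins, not the real framework API:

```python
# Illustrative outline of the four Execute persistence steps.
class Storage:
    def __init__(self, location):
        self.location, self.saved = location, None

class Persistence:
    def __init__(self, policy):                    # configured from the Policy
        self.policy = policy

    def persist(self, persistable, storages, metadata):
        for storage in storages:                   # format and send
            storage.saved = {"data": persistable, "meta": metadata}

policy = {"location": "/data/run42/out.fits", "extraMeta": True}
persistence = Persistence(policy)                  # 1. Persistence object
storages = [Storage(policy["location"])]           # 2. Storage object(s)
meta = {"visitId": 1234} if policy["extraMeta"] else {}   # 3. extra metadata
persistence.persist({"pixels": [0, 1]}, storages, meta)   # 4. send to Storage(s)
```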

EXCEPTIONS:

NOTES:

uc Execute persistence — diagram elements: Execute persistence, Obtain and configure Persistence object, Obtain and configure Storage object(s) for persistence, Specify additional object metadata, Format and send Persistable object to Storage(s), linked by «invokes» and «precedes» relationships.

Execute persistence - Use Case Diagram

3.3.4.4.2.3 Execute retrieval <UseCase>

02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Execute retrieval -


INPUTS:- persistence Policy

OUTPUTS:- object retrieved from Storage(s)

GIVEN:

ALGORITHM:

1. Obtain and configure Persistence object.

2. Obtain and configure Storage object(s) for persistence.

3. If indicated by persistence Policy, specify additional object metadata.

4. Retrieve Persistable object from Storage(s).

EXCEPTIONS:

NOTES:

3.3.4.4.2.4 Format and send Persistable object to Storage(s) <UseCase>

02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Format and send Persistable object to Storage(s) -

INPUTS:- persistence Policy

OUTPUTS:- Persistable object to Storage(s)

GIVEN:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.3.4.4.2.5 Obtain and configure Persistence object <UseCase>


02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Obtain and configure Persistence object -

INPUTS:

OUTPUTS:

GIVEN:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.3.4.4.2.6 Obtain and configure Storage object(s) for persistence <UseCase>

02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Obtain and configure Storage object(s) for persistence -

INPUTS:

OUTPUTS:

GIVEN:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.3.4.4.2.7 Persist Data from Pipeline <UseCase>


02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Persist Data from Pipeline -

INPUTS:

OUTPUTS:

GIVEN:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.3.4.4.2.8 Retrieve Persistable object from Storage(s) <UseCase>

02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Retrieve Persistable object from Storage(s) -

INPUTS:- object's persistence Policy

OUTPUTS:- retrieved object from Storage(s)

GIVEN:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.3.4.4.2.9 Specify additional object metadata <UseCase>


02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Specify additional object metadata -

INPUTS:- object metadata- object's persistence Policy

OUTPUTS:- updated persistence Policy

GIVEN:

ALGORITHM:

EXCEPTIONS:

NOTES:

3.3.4.4.3 Provenance Use Cases

uc Provenance Use Cases — diagram elements: LSST Application, Re-create Science Exposure (from Pipeline Execution Services), «System» Record Pipeline Provenance, linked by «invokes» relationships.

Provenance Use Cases - Use Case Diagram


3.3.4.4.3.1 Re-create Science Exposure <UseCase>

02C.06.01 Catalogs and MetaData: Data Access Framework

DESCRIPTION:

Re-create Science Exposure -

INPUTS:- Science Exposure identification

OUTPUTS:- re-created Science Exposure on Clipboard

GIVEN:- Science Exposure provenance from DB

BASIC COURSE:- Database provides provenance information.

- Pipeline stage is configured using provenance information to re-create Science Exposure.

- Pipeline stage is executed.

- Resulting Science Exposure remains on Clipboard.
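A toy sketch of the course above, with a dict standing in for the provenance database and a bias subtraction standing in for the configured pipeline stage (all names are illustrative):

```python
# Illustrative sketch: re-create a Science Exposure from recorded provenance.
def configure_stage(prov):
    """Stand-in for configuring a pipeline stage from provenance: re-apply
    the recorded bias subtraction."""
    bias = prov["bias"]
    return lambda raw: [pix - bias for pix in raw]

def recreate_science_exposure(exposure_id, provenance_db, clipboard):
    prov = provenance_db[exposure_id]        # database provides provenance
    stage = configure_stage(prov)            # stage configured from provenance
    clipboard["scienceExposure"] = stage(prov["rawExposure"])  # stage executed
    # the resulting Science Exposure remains on the Clipboard

db = {"sci-7": {"rawExposure": [10, 12, 11], "bias": 2}}
clip = {}
recreate_science_exposure("sci-7", db, clip)
```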

ALTERNATE COURSES:

NOTES:

3.3.4.5 Data Access for Pipelines

includes software programs, database tables, configuration files, unit tests, component integration tests, and documentation that implement the Image Archive. The Image Archive includes all Images processed by the DMS. These include:

· Raw Science Images - The raw science image is a container for pixel data.

· Calibrated Science Images - The Image Processing Pipeline transforms a Raw Science Image into a Calibrated Science Image.

· Calibrated Science Images will not be stored, but the Image Archive and Image Processing Pipeline will recreate them on demand from the raw science image and associated calibration data.

· Subtracted Science Images - Subtracted Science Images are used to enable detection of transient sources. A Subtracted Science Image is created by the Image Processing Pipeline from two input images. Subtracted Science Images will not be stored, but the Image Archive and Image Processing Pipeline will recreate them on demand from the two input images.

uc Prepare Data Access for Pipeline — diagram elements: Prepare Data Access for Pipeline, Configure Object Catalog for association, Initialize Catalogs, Generate list of Sky Patches, linked by «invokes» relationships.

Prepare Data Access for Pipeline - Use Case Diagram


uc Stage Input Data — diagram elements: «Controller» Stage Input Data (from Pipeline Execution Services), «System» Preload a Database (from Pipeline Execution Services), «System» Stage a Named Collection (from Pipeline Execution Services), Setup access to Image Collection, Setup access to Co-add/Template Collection, Retrieve Template/Co-Add covering an area, Retrieve Image from Image Collection, Retrieve Image Fragments from Image Collection, linked by «invokes» relationships. (Name: Stage Input Data; Package: Data Access for Pipelines; Version: 1.0; Author: Kian-Tat Lim)

Stage Input Data - Use Case Diagram

3.3.4.5.1 Configure Object Catalog for association <UseCase>

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:Configure Object Catalog for association -

INPUTS:- policy to configure DbStorage

OUTPUTS:
- per-run Object Table
- per-run Merge Table
- per-run Object View

GIVEN:

BASIC COURSE (for DC3 only):

- Use a Policy to configure a DbStorage.

- Using the DbStorage, create the per-run Object Table, Merge Table, and Object View.
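Using sqlite3 as a stand-in for DbStorage, the per-run setup might look as follows; the table names and columns are illustrative, not the LSST schema:

```python
# Sketch of the DC3 per-run table setup; sqlite3 stands in for DbStorage.
import sqlite3

def configure_object_catalog(db, run_id):
    cur = db.cursor()
    # per-run Object Table and Merge Table
    cur.execute(f"CREATE TABLE Object_{run_id} (objectId INTEGER, ra REAL, decl REAL)")
    cur.execute(f"CREATE TABLE ObjectMerge_{run_id} (objectId INTEGER, sourceId INTEGER)")
    # per-run Object View over the Object Table
    cur.execute(f"CREATE VIEW ObjectView_{run_id} AS "
                f"SELECT objectId, ra, decl FROM Object_{run_id}")

db = sqlite3.connect(":memory:")
configure_object_catalog(db, "run42")
```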

ALTERNATE COURSES:

NOTES:


analysis Configure Object Catalog for association — object flow: Policy → Initialize DbStorage with Policy → DbStorage → Create per-run tables → Per-Run Object Table, Per-Run Merge Table, Per-Run Object View; the attached note restates the use case text above. (Name: Configure Object Catalog for association; Package: Data Access for Pipelines; Version: 1.0; Author: Kian-Tat Lim)

Configure Object Catalog for association - Analysis Diagram

3.3.4.5.2 Generate list of Sky Patches <UseCase>

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:

Generate list of Sky Patches -

INPUTS:- Exposure Catalog- Cluster Topology

OUTPUTS:
- lists of Sky Patches

GIVEN:

BASIC COURSE:

Using the Exposure Catalog, determine the total Sky Coverage.

Using the Cluster Topology, fragment the Sky Coverage into a set of lists of Sky Patches, one list per node, so that each node's patches are close together on the sky.
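One simplistic way to realize the fragmentation step, assuming the patches are already ordered so that contiguous runs are nearby on the sky:

```python
# Illustrative sketch: split the total sky coverage into one list of patches
# per node, keeping each node's patches contiguous (and thus close together).
def generate_sky_patch_lists(patches, n_nodes):
    per_node = (len(patches) + n_nodes - 1) // n_nodes   # ceiling division
    return [patches[i:i + per_node] for i in range(0, len(patches), per_node)]

coverage = [f"patch{ra:03d}" for ra in range(10)]      # from the Exposure Catalog
lists = generate_sky_patch_lists(coverage, n_nodes=3)  # from the Cluster Topology
```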

ALTERNATE COURSES:

NOTES:

analysis Generate list of Sky Patches — object flow: Exposure Catalog → Compute coverage of the sky → Sky Coverage → Generate patches → List of Sky Patches; the attached note restates the use case text above. (Name: Generate list of Sky Patches; Package: Data Access for Pipelines; Version: 1.0; Author: Kian-Tat Lim)


Generate list of Sky Patches - Analysis Diagram

3.3.4.5.2.1 Sky Coverage <Object>

3.3.4.5.3 Initialize Catalogs <UseCase>

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:

Initialize Catalogs -

INPUTS:- policy containing DbStorage configuration

OUTPUTS:- Nightly Catalog tables- Data Release tables

GIVEN:

BASIC COURSE:

- Use a Policy to configure a DbStorage.

- Using the DbStorage, create the Nightly and/or Data Release Catalog tables.

ALTERNATE COURSES:

NOTES:


analysis Initialize Catalogs — object flow: Policy → Initialize DbStorage with Policy → DbStorage → Create tables → Nightly Catalogs, Data Release Catalogs; the attached note restates the use case text above. (Name: Initialize Catalogs; Package: Data Access for Pipelines; Version: 1.0; Author: Kian-Tat Lim)

Initialize Catalogs - Analysis Diagram

3.3.4.5.4 Prepare Data Access for Pipeline <UseCase>

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:

Prepare Data Access for Pipeline -

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:

- invokes Configure Object Catalog for association

- invokes Initialize Catalogs

- invokes Generate list of Sky Patches

ALTERNATE COURSES:

NOTES:

3.3.4.5.5 Retrieve Image Fragments from Image Collection <UseCase>

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:

Retrieve Image Fragments from Image Collection -

INPUTS:- Sky Patch- Exposure Catalog

OUTPUTS:- Image Fragments

GIVEN:

BASIC COURSE:

Given a Sky Patch:

- Compute a list of all Image Fragments covering the Sky Patch using the Exposure Catalog.

- Retrieve the Image Fragments using the ExposureFormatter.
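A toy version of the two steps, with a 1-D interval standing in for real sky geometry and a dict for the Exposure Catalog (all names are illustrative):

```python
# Illustrative sketch: find image fragments overlapping a sky patch, then
# "retrieve" them (a formatted string stands in for the ExposureFormatter).
def fragments_covering(patch, exposure_catalog):
    lo, hi = patch
    return [fid for fid, (fmin, fmax) in exposure_catalog.items()
            if fmin < hi and fmax > lo]         # 1-D interval overlap test

catalog = {"frag-a": (0.0, 1.0), "frag-b": (0.9, 2.0), "frag-c": (3.0, 4.0)}
ids = fragments_covering((0.5, 1.5), catalog)
images = [f"<pixels of {fid}>" for fid in ids]  # stand-in for retrieval
```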

ALTERNATE COURSES:

NOTES:


analysis Retrieve Image Fragments from Image Collection — object flow: Sky Patch, Exposure Catalog → Compute list of fragments → List of Image Fragment Ids → Retrieve image fragments → Image Fragment; the attached note restates the use case text above. (Name: Retrieve Image Fragments from Image Collection; Package: Data Access for Pipelines; Version: 1.0; Author: Kian-Tat Lim)

Retrieve Image Fragments from Image Collection - Analysis Diagram

3.3.4.5.6 Retrieve Image from Image Collection <UseCase>

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:

Retrieve Image from Image Collection -

INPUTS:- Image identifier (Raw Exposure Id or Science Exposure Id)

OUTPUTS:- Exposure

GIVEN:- policy containing Logical Location mapping to Physical Location??

BASIC COURSE:

Given an Image identifier (Raw Exposure Id or Science Exposure Id):

- Map the identifier to a Logical Location.


- (Non-DC3) Map the Logical Location to a Physical Location (e.g. using iRODS).

- Retrieve the Image using the ExposureFormatter.
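The identifier-to-location chain might be sketched like this; the path layout is invented, and the identity mapping mirrors the DC3 note below:

```python
# Illustrative sketch of the mapping chain; the path scheme is an assumption.
def logical_location(image_id):
    kind, num = image_id.split("-")
    return f"{kind}/{int(num):06d}.fits"        # identifier -> Logical Location

def physical_location(logical, mapper=None):
    # Non-DC3: a mapper (e.g. iRODS) would translate; DC3 uses the identity
    return mapper(logical) if mapper else logical

loc = physical_location(logical_location("raw-42"))
```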

ALTERNATE COURSES:

NOTES:

analysis Retrieve Image from Image Collection — object flow: Raw Exposure Id / Science Exposure Id → Map to logical location → Logical Location → Map to physical location → Physical Location → Retrieve image → Exposure; the attached note restates the use case text above. (Name: Retrieve Image from Image Collection; Package: Data Access for Pipelines; Version: 1.0; Author: Kian-Tat Lim)

Retrieve Image from Image Collection - Analysis Diagram

3.3.4.5.6.1 Map to physical location <Object>

Identity mapping (in FitsStorage) for DC3.


3.3.4.5.7 Retrieve Template/Co-Add covering an area <UseCase>

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:

Retrieve Template/Co-Add covering an area -

INPUTS:- Geometry (Bounding Box or Footprint)- Co-Add Catalog ??- Template Exposure ??

OUTPUTS:- Template Image- Co-Add Image

GIVEN:

BASIC COURSE:

Given a Geometry (Bounding Box or Footprint):

- Compute a list of all Template Fragments covering the Geometry.

- Retrieve the Template Fragments using the ExposureFormatter.

- Transform and crop the Template Fragments to produce a Template Image.
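A 1-D toy version of the three steps, where pixel index ranges stand in for real geometry (all names are illustrative):

```python
# Illustrative sketch: assemble a template image over a bounding box from
# fragments; 1-D pixel ranges stand in for real sky geometry.
def retrieve_template(bbox, fragments):
    lo, hi = bbox
    covering = {span: pix for span, pix in fragments.items()
                if span[0] < hi and span[1] > lo}      # compute fragment list
    out = []
    for (flo, fhi), pix in sorted(covering.items()):   # retrieve, then crop
        out.extend(p for i, p in enumerate(pix) if lo <= flo + i < hi)
    return out

frags = {(0, 4): [0, 1, 2, 3], (4, 8): [4, 5, 6, 7]}
template = retrieve_template((2, 6), frags)
```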

ALTERNATE COURSES:Same process holds for Co-Add Images, using the Co-Add Catalog.

NOTES:


analysis Retrieve Template/Co-Add covering an area — object flow: Geometry → Compute template fragment list → Template Fragment Id → Retrieve template fragments → Template Fragment → Transform and crop → Template Image; the attached note restates the use case text above. (Name: Retrieve Template/Co-Add covering an area; Package: Data Access for Pipelines; Version: 1.0; Author: Kian-Tat Lim)

Retrieve Template/Co-Add covering an area - Analysis Diagram

3.3.4.5.8 Setup access to Co-add/Template Collection <UseCase>

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:

Setup access to Co-add/Template Collection -

INPUTS:- pre-generated Template Images- pre-generated Co-add Images ??

OUTPUTS:- symlinks in per-run filesystem

GIVEN:

BASIC COURSE:


FOR DC3:

- Symbolically link pre-generated template images into the per-run filesystem tree.

- (Eventually may require more work, such as establishing a connection to a template service or configuring iRODS.)
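The DC3 course amounts to a single symlink per collection; a sketch using temporary directories (the paths are illustrative):

```python
# Illustrative sketch of the DC3 symlink approach: link pre-generated
# templates into a per-run filesystem tree.
import os
import tempfile

src = tempfile.mkdtemp(prefix="templates-")   # pre-generated template images
run_dir = tempfile.mkdtemp(prefix="run42-")   # per-run filesystem tree
link = os.path.join(run_dir, "templates")
os.symlink(src, link)                         # Symlink filesystem directory
```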

ALTERNATE COURSES:

NOTES:

analysis Setup access to Co-add/Template Collection — diagram element: Symlink filesystem directory; the attached note restates the use case text above. (Name: Setup access to Co-add/Template Collection; Package: Data Access for Pipelines; Version: 1.0; Author: Kian-Tat Lim)

Setup access to Co-add/Template Collection - Analysis Diagram

3.3.4.5.9 Setup access to Image Collection <UseCase>

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:

Setup access to Image Collection -

INPUTS:

OUTPUTS:


GIVEN:

BASIC COURSE:

FOR DC3:

- Symbolically link the image directory into the per-run filesystem tree.

- (Eventually may require more work, such as establishing a connection to an image retrieval service or configuring iRODS.)

ALTERNATE COURSES:

NOTES:

Setup access to Image Collection - Analysis Diagramanalysis Setup access to Image Collection

Symlink filesystemdirectory

WBS:

02C.06.02 Image Archive: Data Access for Pipeline

DESCRIPTION:

Setup access to Image Collection -

INPUTS:

OUTPUTS:

GIVEN:

BASIC COURSE:FOR DC3:- Symbolically link image directory into per-run filesystem tree.

- (Eventually may require more work, suchas establishing a connection to an image retrieval service or configuring iRODS.)

ALTERNATE COURSES:

NOTES:

Name: Setup access to Image Collection
Package: Data Access for Pipelines
Version: 1.0
Author: Kian-Tat Lim

Setup access to Image Collection - Analysis Diagram


3.3.5 Security and Access Control Services

includes software programs, database tables, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Security (Authorization/Authentication)· Access Control (Enforcement)

Security - Use Case Diagram

Actor: Data Management System Administrator

Use Cases:

· «Business» Configure Security Profiles and Policies
· «Business» Administer Groups and Users
· «Business» Administer Certificates
· «System» Enforce Security Policies (this use case is implicitly invoked by many other use cases, whenever an authentication or authorization is required)
· «System» Authenticate
· «System» Authorize

Security - Use Case Diagram

3.3.5.1 Administer Certificates <UseCase>

02C.07.04 Security and Access Control Services


DESCRIPTION:

Administer Certificates -

INPUTS:

OUTPUTS:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.3.5.2 Administer Groups and Users <UseCase>

02C.07.04 Security and Access Control Services

DESCRIPTION:

Administer Groups and Users -

INPUTS:

OUTPUTS:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.3.5.3 Authenticate <UseCase>

02C.07.04 Security and Access Control Services

DESCRIPTION:

Authenticate - describes how an actor obtains an authenticated identity to start an authenticated session, and how that identity is passed to the services being protected.

BASIC COURSE:


ALTERNATE COURSES:

3.3.5.4 Authorize <UseCase>

02C.07.04 Security and Access Control Services

DESCRIPTION:

Authorize - describes how the system ensures that an actor can carry out a requested operation on a protected service.
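Taken together, Authenticate and Authorize might be sketched as below. The signed-token format, password table, and ACL structure are illustrative assumptions only; the model does not specify LSST's actual mechanism (which may instead rest on certificates, per 3.3.5.1).

```python
import hashlib
import hmac
import secrets

SECRET = secrets.token_bytes(32)  # per-deployment signing key (illustrative)

def authenticate(user, password, password_db):
    """Return a signed identity token if the credentials match, else None.
    password_db maps user -> sha256 hex digest of the password (assumed)."""
    if password_db.get(user) != hashlib.sha256(password.encode()).hexdigest():
        return None
    sig = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def authorize(token, operation, acl):
    """Check that the token carries a valid identity and that this identity
    may perform the requested operation on the protected service."""
    user, _, sig = token.partition(":")
    expected = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or corrupted identity
    return operation in acl.get(user, set())
```

The point of the sketch is the division of labor the two use cases describe: Authenticate mints an identity once per session, and Authorize checks that identity on every protected operation.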

BASIC COURSE:

ALTERNATE COURSES:

3.3.5.5 Configure Security Profiles and Policies <UseCase>

02C.07.04 Security and Access Control Services

DESCRIPTION:

Configure Security Profiles and Policies -

INPUTS:

OUTPUTS:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.3.5.6 Enforce Security Policies <UseCase>

02C.07.04 Security and Access Control Services


DESCRIPTION:

Enforce Security Policies -

INPUTS:

OUTPUTS:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:

3.3.6 Sys Admin and Opns Services

includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Provide an interface to formatted data describing the current operational state of the LSST DMS, including equipment health and workload, pipeline processing status, data query and ingest workload, data transfer workload, and DMS performance snapshots and trends.

· Provide an interface to formatted data describing the internal state of LSST data archives and catalogs, including data integrity, usage patterns, and logical and physical schemas.

· Provide an interface to formatted data describing configuration of the LSST DMS subsystem.
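The first capability above (formatted data describing the current operational state) might be served as a snapshot like the sketch below. Every field name here is an illustrative assumption, not the actual DMS schema.

```python
import json
import time

def dms_status_snapshot(pipelines, transfers):
    """Assemble a formatted snapshot of DMS operational state as JSON.
    pipelines maps pipeline name -> status string; transfers is a count.
    (Field names and shapes are hypothetical.)"""
    snapshot = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "pipelines": [
            {"name": p, "status": s} for p, s in sorted(pipelines.items())
        ],
        "active_transfers": transfers,
    }
    return json.dumps(snapshot, indent=2)
```

Emitting a machine-readable format keeps the interface usable both by dashboards (the UI services of 3.3.7) and by trend analysis over archived snapshots.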

Sys Admin and Opns Services - Use Case Diagram


Actor: Data Management System Administrator

Use Cases:

· «Business» Administer and Maintain Systems
· «Business» Administer and Maintain Data
· «Business» Administer and Maintain Applications Software
· «System» Monitor DM System Health and Status

Sys Admin and Opns Services - Use Case Diagram

3.3.6.1 Administer and Maintain Applications Software <UseCase>

02C.07.06 System Administration and Operations Services

DESCRIPTION:

Administer and Maintain Applications Software -

INPUTS:

OUTPUTS:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:


3.3.6.2 Administer and Maintain Data <UseCase>

02C.07.06 System Administration and Operations Services

DESCRIPTION:

Administer and Maintain Data -

INPUTS:

OUTPUTS:

BASIC COURSE:

NOTE: This use case will include:
- Backup
- Replication
- Hot Spot Detection and Recovery
- Manual Overwrites (e.g. killing queries, recover from unusual problems)
- Parsing query logs and looking for access patterns
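The last item, parsing query logs for access patterns, might start with something as simple as tallying which tables queries touch. The log format and regular expression here are hypothetical, not the actual DMS query-log schema.

```python
import re
from collections import Counter

# Assumed log shape: free text containing "SELECT ... FROM <table>"
LOG_LINE = re.compile(r"SELECT .* FROM\s+(\w+)", re.IGNORECASE)

def table_access_counts(log_lines):
    """Tally which tables user queries touch most often, a crude
    proxy for access patterns (and a hot-spot detection input)."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m:
            counts[m.group(1).lower()] += 1
    return counts
```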

ALTERNATE COURSES:

NOTES:

3.3.6.3 Administer and Maintain Systems <UseCase>

02C.07.06 System Administration and Operations Services

DESCRIPTION:

Administer and Maintain Systems - This is administration of infrastructure and middleware services.

INPUTS:

OUTPUTS:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:


3.3.7 User Interface/Visualization Services

includes software programs, configuration files, unit tests, component integration tests, and documentation that implement the following capabilities:

· Create and configure displays and plots of static data and metadata in graphical and tabular forms.

· Create and configure static and continuously updating displays of real-time data streams in graphical and tabular forms.

User Interface Services - Use Case Diagram

Use Cases:

· «Business» Static Display of Data and Meta Data
· «Business» Dynamic Display of Data and Meta Data

User Interface Services - Use Case Diagram

3.3.7.1 Dynamic Display of Data and Meta Data <UseCase>

02C.07.05 User Interface/Visualization Services

DESCRIPTION:

Dynamic Display of Data and Meta Data -

INPUTS:

OUTPUTS:

BASIC COURSE:


ALTERNATE COURSES:

NOTES:

3.3.7.2 Static Display of Data and Meta Data <UseCase>

02C.07.05 User Interface/Visualization Services

DESCRIPTION:

Static Display of Data and Meta Data -

INPUTS:

OUTPUTS:

BASIC COURSE:

ALTERNATE COURSES:

NOTES:
