TRANSCRIPT
27 February 2003 Richard W. Morris <[email protected]> 1
Knowledge Management Status Report to eRA Project Team
Transition to Pre-Production Phase
Life Cycle of Disruptive Technologies
STAGE 1: DRIVER: CONCEPT; GOAL: DEMONSTRATION
STAGE 2: DRIVER: TRUE BELIEVERS; GOAL: NICHE APPLICATIONS
STAGE 3: DRIVER: COMPETITION; GOAL: MATURITY / DIFFUSION
Myers et al., “Practitioner’s View: Evolutionary Stages of Disruptive Technologies,” IEEE Transactions, v. 49, no. 4, Nov. 2002
Aims Today
1. Where we have been.
2. Where we are going.
3. How we’ll get there (if we answer a few questions).
Where we have been.
Concept Phase
[Diagram: systems engineering life cycle: conceptual phase → design phase → pre-production phase; steps: system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce)]
Source: Management of Systems Engineering, Wilton P. Chase
Conceptual Phase
[Diagram: conceptual phase (MITRETEK): system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce); vendor tools shown: collexis, semio, stratify, inxight, i411]
Source: Management of Systems Engineering, Wilton P. Chase
Design Phase
[Diagram: systems engineering life cycle with the design phase highlighted: conceptual phase → design phase → pre-production phase; steps: system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce)]
Source: Management of Systems Engineering, Wilton P. Chase
Design Phase
[Diagram: conceptual phase and design phase, each cycling through system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce); Pilot #1: situational awareness; Pilot #2: reviewer selection]
Source: Management of Systems Engineering, Wilton P. Chase
Project Management Plan
[Diagram: project timeline across the systems engineering life cycle: conceptual phase (Dec. 1, 2002) → design phase (Feb. 1, 2003) → pre-production phase (Sept. 1, 2003); steps: system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce)]
Source: Management of Systems Engineering, Wilton P. Chase
Next Steps: Pre-Production
[Diagram: systems engineering life cycle with the pre-production phase highlighted: conceptual phase → design phase → pre-production phase; steps: system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce)]
Source: Management of Systems Engineering, Wilton P. Chase
Where we are going.
NIH KM Overview: Core KM Prototypes
Expanded NIH KM Overview: Complete System
Metrics for GRS
Estimated Benefits: reduced cycle time; improved quality and consistency of referrals; time saved by the organization
Methods Used: sampling; survey; interviews; internal logs

System Performance
  Key Measures: recall; precision; reviewer selection; system response time; scalability (number of research proposals, number of reviewers); institute routing
  Key Outputs: time spent “selecting” candidate reviewers; time spent “screening” candidate reviewers; number of conflicts identified; percentage of candidate reviewers chosen; percentage of correct institute routing
  Key Outcomes: user satisfaction; time saved by the organization in “selecting” and “screening” candidate reviewers; savings or improvements in organizational quality and efficiency; time saved in institute routing

System Usage
  Key Measures: system down time; scalability (number of users, frequency of use); user feedback (real-time); usability survey (time-lag); training time / learning curve
  Key Outputs: usefulness survey; feedback results; duration of learning curve; duration of training time
  Key Outcomes: user satisfaction; savings or improvements in organizational quality and efficiency; time saved by the organization; reduced training time or learning curve

System Operation & Maintenance
  Key Measures: frequency of updates; system downtime; Help Desk support
  Key Outputs: number of Help Desk support requests
  Key Outcomes: user satisfaction; reallocation of Help Desk resources; recency of information
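The recall and precision measures listed under System Performance reduce to a set comparison between the candidate reviewers the system proposes and those a human expert judges appropriate. A minimal sketch, assuming simple reviewer-ID lists (the function name and sample data are illustrative, not part of GRS):

```python
def precision_recall(selected, relevant):
    """Precision and recall for candidate reviewer selection.

    selected: reviewer IDs the system proposed.
    relevant: reviewer IDs a human expert judged appropriate.
    """
    selected, relevant = set(selected), set(relevant)
    true_positives = len(selected & relevant)
    precision = true_positives / len(selected) if selected else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    return precision, recall

# Example: the system proposes 4 candidates; 3 of the 5 appropriate
# reviewers are among them.
p, r = precision_recall(["a", "b", "c", "d"], ["a", "b", "c", "e", "f"])
print(p, r)  # 0.75 0.6
```

High precision with low recall would mean the system proposes few wrong reviewers but misses many appropriate ones; the baseline measures above are what make such trade-offs visible.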
Metrics for GTS
Estimated Benefits: situational awareness; discovery of patterns and trends; informed decision making; time saved by the organization
Methods Used: sampling; survey; interviews; internal logs

System Performance
  Key Measures: system response time; scalability (number of research proposals); proposal analysis
  Key Outputs: time spent in understanding proposals; time spent in analyzing and identifying relationships among concepts; percentage of successful document categorization; time spent in analyzing and identifying distributions
  Key Outcomes: user satisfaction; time saved by the organization; awareness of relationships among proposals; improvements in document categorization; visual awareness of distributions, patterns and trends

System Usage
  Key Measures: system down time; scalability (number of users, frequency of use); user feedback (real-time); usability survey (time-lag); training time / learning curve
  Key Outputs: usefulness survey; feedback results; duration of learning curve; duration of training time
  Key Outcomes: user satisfaction; savings or improvements in organizational quality and efficiency; time and reduced cost saved by the organization; reduced training time or learning curve; visual identification of concept relationships

System Operation & Maintenance
  Key Measures: frequency of updates; system downtime; Help Desk support
  Key Outputs: number of Help Desk support requests
  Key Outcomes: user satisfaction; reallocation of Help Desk resources
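The “percentage of successful document categorization” output is, operationally, the fraction of proposals the system routes to the category a human would assign. A hedged sketch (function name and category labels are illustrative only):

```python
def categorization_rate(assigned, expected):
    """Fraction of documents whose system-assigned category matches
    the human-assigned (expected) category; illustrative only."""
    if not expected:
        return 0.0
    matches = sum(1 for a, e in zip(assigned, expected) if a == e)
    return matches / len(expected)

# Example: 3 of 4 proposals land in the category a reviewer expected.
rate = categorization_rate(["AIDS", "cancer", "cardio", "AIDS"],
                           ["AIDS", "cancer", "cardio", "neuro"])
print(rate)  # 0.75
```

Tracking this rate against the conceptual-phase baseline is what turns “improvements in document categorization” from an impression into a measured outcome.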
How we’ll get there.
1. Understand impact of disruptive technologies.
2. Use KM to align workflows and data flows.
3. Answer the hard, but practical questions.
Life Cycle of Disruptive Technologies
STAGE 1: DRIVER: CONCEPT; GOAL: DEMONSTRATION
STAGE 2: DRIVER: TRUE BELIEVERS; GOAL: NICHE APPLICATIONS
STAGE 3: DRIVER: COMPETITION; GOAL: MATURITY / DIFFUSION
Myers et al., “Practitioner’s View: Evolutionary Stages of Disruptive Technologies,” IEEE Transactions, v. 49, no. 4, Nov. 2002
workflows
[Diagram: workflow: inputs (data, docs) → capture → manage → release → outputs (data, docs, disks), with a display component; decisions and tasks appear on both the input and output sides]
data flows
[Diagram: data flows overlaid on the workflow: inputs (data, docs) → capture → manage → release → outputs (data, docs, disks); processing steps: prep/input, extract/merge, store, process, collate/sort; display component; decisions and tasks on both sides]
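The capture → manage → release flow in the diagram can be sketched as three composable stages. The stage names follow the diagram; the function bodies are purely illustrative assumptions, not the actual KM implementation:

```python
def capture(inputs):
    """prep/input: normalize raw inputs (data, docs) as they arrive."""
    return [item.strip().lower() for item in inputs]

def manage(items):
    """extract/merge + store: index the captured items for later use."""
    return {i: item for i, item in enumerate(items)}

def release(store):
    """collate/sort: assemble stored items into ordered outputs."""
    return [store[key] for key in sorted(store)]

outputs = release(manage(capture(["  Proposal B ", "Proposal A"])))
print(outputs)  # ['proposal b', 'proposal a']
```

The point of separating the stages is the alignment argument above: workflows (who decides, which tasks) and data flows (where data is prepped, stored, and collated) can be examined and changed independently.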
KEY QUESTIONS
Assessment: Do we have a credible means of verifying best practices? Do we have baselines, and are we ready to assess impacts?
Inputs: Is the XML corpus ready? (If so, when and where?) Do we have the needed data sets?
Staff: Does the contractor have the needed resources and skill sets? Do we have the staff to manage and oversee the project?
Management: Does the contractor have the needed resources and skill sets? Do we have the staff to manage and oversee the project?
Budget: Do we have funds and a plan for a full-scale implementation? Do we have Phase 2 funds for pre-production piloting?
Readiness: Do we have the pilot sites identified, with buy-in? Is the organization ready?
IC of the Future
The IC of the future must serve the needs of several end-users:
– experimental biologists,
– clinical researchers,
– science administrators, and
– even public health officials.
Biology today is quantitative; it depends on computers for the
– production,
– analysis, and
– management of scientific data.
SJ Wiback and BO Palsson, Biophysical Journal, 8:2002
END
Expanded NIH KM Overview: Assisted Specialized Taxonomy Generation
Expanded NIH KM Overview: Research Proposal Archiving & Collaborative Resources
project plan
[Diagram: systems engineering life cycle: conceptual phase → design phase → pre-production phase; steps: system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce)]
Source: Management of Systems Engineering, Wilton P. Chase
PM plan
[Diagram: systems engineering life cycle: conceptual phase → design phase → pre-production phase; steps: system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce); stakeholder labels: internal, leadership, end-user]
Source: Management of Systems Engineering, Wilton P. Chase
KM Project Overview
[Diagram: systems engineering life cycle: conceptual phase → (pre-)production phase, spanning the design phase; steps: system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce)]
Source: Management of Systems Engineering, Wilton P. Chase
Conceptual Phase
[Diagram: conceptual phase: system requirements → functional requirements → performance requirements → detailed design → build / test prototype → assess results → modify (produce); Pilot #1: situational awareness; Pilot #2: reviewer selection (MITRETEK); vendor tools shown: collexis, semio, stratify, inxight, i411]
Source: Management of Systems Engineering, Wilton P. Chase