Scoring 1

Upload: reginald-reed

Post on 11-Jan-2016


Page 1: Scoring 1. Scoring Categories 1 – 6 (Process Categories). Examiners select a score (0-100) to summarize their observed strengths and opportunities for improvement.

Scoring 1

Page 2:

Scoring Categories 1 – 6 (Process Categories)

• Examiners select a score (0-100) to summarize their observed strengths and opportunities for improvement (OFIs)

• Scoring Guidelines are provided for Approach, Deployment, Learning, and Integration

• Scores are assigned for each ITEM

Page 3:

Scoring Guidelines for Approach

SCORE | PROCESS

0%-5%: No SYSTEMATIC APPROACH is evident; information is ANECDOTAL. (A)

10%-25%: The beginning of a SYSTEMATIC APPROACH to the BASIC REQUIREMENTS of the Item is evident. (A)

30%-45%: An EFFECTIVE, SYSTEMATIC APPROACH, responsive to the BASIC REQUIREMENTS of the Item, is evident. (A)

50%-65%: An EFFECTIVE, SYSTEMATIC APPROACH, responsive to the OVERALL REQUIREMENTS of the Item, is evident. (A)

70%-85%: An EFFECTIVE, SYSTEMATIC APPROACH, responsive to the MULTIPLE REQUIREMENTS of the Item, is evident. (A)

90%-100%: An EFFECTIVE, SYSTEMATIC APPROACH, fully responsive to the MULTIPLE REQUIREMENTS of the Item, is evident. (A)

Page 4:

[Figure: diagram mapping Category/Item questions to the organization at three requirement levels: BASIC, OVERALL, and MULTIPLE.]

Page 5:

Scoring Guidelines for DEPLOYMENT

SCORE | PROCESS

0%-5%: Little or no DEPLOYMENT of an APPROACH is evident.

10%-25%: The APPROACH is in the early stages of DEPLOYMENT in most areas or work units, inhibiting progress in achieving the BASIC REQUIREMENTS of the Item.

30%-45%: The APPROACH is DEPLOYED, although some areas or work units are in early stages of DEPLOYMENT.

50%-65%: The APPROACH is well DEPLOYED, although DEPLOYMENT may vary in some areas or work units.

70%-85%: The APPROACH is well DEPLOYED, with no significant gaps.

90%-100%: The APPROACH is fully DEPLOYED, without significant weaknesses or gaps in any areas or work units.

Page 6:

Scoring Guidelines for LEARNING

SCORE | PROCESS

0%-5%: An improvement orientation is not evident; improvement is achieved through reacting to problems.

10%-25%: Early stages of a transition from reacting to problems to a general improvement orientation are evident.

30%-45%: The beginning of a SYSTEMATIC APPROACH to evaluation and improvement of KEY PROCESSES is evident.

50%-65%: A fact-based, SYSTEMATIC evaluation and improvement PROCESS and some organizational LEARNING are in place for improving the efficiency and EFFECTIVENESS of KEY PROCESSES.

70%-85%: Fact-based, SYSTEMATIC evaluation and improvement and organizational LEARNING are KEY management tools; there is clear evidence of refinement and INNOVATION as a result of organizational-level ANALYSIS and sharing.

90%-100%: Fact-based, SYSTEMATIC evaluation and improvement and organizational LEARNING are KEY organization-wide tools; refinement and INNOVATION, backed by ANALYSIS and sharing, are evident throughout the organization.

Page 7:

Scoring Guidelines for INTEGRATION

SCORE | PROCESS

0%-5%: No organizational ALIGNMENT is evident; individual areas or work units operate independently.

10%-25%: The APPROACH is ALIGNED with other areas or work units largely through joint problem solving.

30%-45%: The APPROACH is in early stages of ALIGNMENT with your basic organizational needs identified in response to the Organizational Profile and other Process Items.

50%-65%: The APPROACH is ALIGNED with your organizational needs identified in response to the Organizational Profile and other Process Items.

70%-85%: The APPROACH is INTEGRATED with your organizational needs identified in response to the Organizational Profile and other Process Items.

90%-100%: The APPROACH is well INTEGRATED with your organizational needs identified in response to the Organizational Profile and other Process Items.

Page 8:

Overall Score for Categories 1 – 6 Items

• An applicant’s score for approach [A], deployment [D], learning [L], and integration [I] depends on their ability to demonstrate the characteristics associated with that score

• Although there are 4 factors to score for each Item, only one overall score is assigned

• The examination team selects the “best fit” score, which is likely to be between the highest and lowest score for the ADLI factors

Page 9:

“Best Fit” Example

Consider two applicants, both scoring 60% for A, D, and I, and 20% for L:

Applicant | Market Growth | Technology | Competitors
A | 10%/year | Stable | Stable
B | 2X/year | Major change | Many new

• Item Score:
  • Applicant A: 50% (Learning could lead to incremental results improvements)
  • Applicant B: 30% (Learning needed to sustain the organization)
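The "best fit" rule can be sketched in a few lines of Python. This is an illustration only (the helper names are invented here, and real scoring is examiner judgment, not a formula): the single Item score should fall between the lowest and highest ADLI factor scores, on the 5-point scale examiners use.

```python
def best_fit_range(adli_scores):
    """Return the (low, high) band the overall Item score should fall in.

    A single "best fit" score is assigned per Item, normally between the
    lowest and highest of the four ADLI factor scores.
    """
    values = list(adli_scores.values())
    return min(values), max(values)

def is_plausible_item_score(adli_scores, item_score):
    """Check that a proposed overall score lies in the best-fit range
    and on the 5-point scale (0, 5, 10, ..., 100)."""
    low, high = best_fit_range(adli_scores)
    return low <= item_score <= high and item_score % 5 == 0

# The two applicants from the example: A, D, I at 60%, L at 20%.
applicant = {"A": 60, "D": 60, "L": 20, "I": 60}
print(best_fit_range(applicant))               # (20, 60)
print(is_plausible_item_score(applicant, 50))  # True (Applicant A's 50%)
print(is_plausible_item_score(applicant, 30))  # True (Applicant B's 30%)
```

Note that both 50% and 30% are "plausible" by this check; which one fits best depends on context (here, Applicant B's volatile market), which is exactly why the team, not a formula, selects the score.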

Page 10:

6. Independent Review: LeTCI – Results Items

Objective: Be able to evaluate a Results Item for independent review

Page 11:

Using a Table of Expected Results to Identify Category 7 OFIs

Purpose: Identify results that the applicant hasn't provided.

Why? Because applicants like to show results that are favorable, but may not always show the results that are important.

Page 12:

Table of Expected Results

Title of Expected Result | Source Reference | Category 7 Reference | Result Reference
(Describe expected result and segments) | (List where in the application you learned this was important to report in Cat 7) | (List where in Cat 7 the result belongs) | (Identify the page or Figure that contains the results)

Examples:

Status of Action Plans (% of Action Plans on target) | OP pg. 7 | 1, 2.1a2 | Not found
Workforce segments (expect to see segmentation by FT, PT, and on-call staff) | OP p. 6 | 7.4 | Found in 7.4-4, 6, 10; not found in 7.4-7, 8, 9

Page 13:

Process for Using a Table of Expected Results

1. As you read the application, watch for processes the applicant cites as important, measures that are discussed, segments that are used, etc.

2. As you find examples of these, enter them in columns 1-3 of the table of expected results.

3. When you review Category 7, check it against the expected results and complete column 4.

4. Based on the comments in column 4 of the table, draft an OFI (or OFIs) citing the missing expected results.

Note: These are results for which the applicant has introduced an expectation in the Organizational Profile and Categories 1-6. They are NOT results that the examiner would like to see.
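The steps above amount to building a small lookup structure and filtering it. A minimal sketch in Python (the class, field, and function names are illustrative assumptions, not part of the WSQA materials, and the sample rows are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class ExpectedResult:
    title: str       # column 1: describe the expected result and segments
    source_ref: str  # column 2: where the application said this was important
    cat7_ref: str    # column 3: where in Category 7 the result belongs
    result_ref: str  # column 4: filled in while reviewing Category 7

def draft_ofi_candidates(table):
    """Return the expected results never located in Category 7 --
    candidates for a 'missing expected results' OFI."""
    return [row for row in table if row.result_ref.lower().startswith("not found")]

# Hypothetical rows, filled in during steps 1-3 of the process above.
table = [
    ExpectedResult("Workforce segments (FT, PT, on-call)", "OP p. 6", "7.4",
                   "Found in 7.4-4"),
    ExpectedResult("Status of Action Plans", "OP pg. 7", "7.1",
                   "Not found"),
]
print([row.title for row in draft_ofi_candidates(table)])
# ['Status of Action Plans']
```

The point of the structure is step 4: the OFI is grounded in the applicant's own stated expectations (columns 1-3), not in results the examiner happens to want.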

Page 14:

Evaluation Factors for Category 7 (Results)

• Le = Performance Levels: Numerical information that places an organization's results on a meaningful measurement scale. Performance levels permit evaluation relative to past performance, projections, goals, and appropriate comparisons.

• T = Trends: Numerical information indicating the direction, rate, and breadth of performance improvements. A minimum of 3 data points is needed to begin to ascertain a trend; more data points are needed to define a statistically valid trend.

• C = Comparisons: Establishing the value of results by their relationship to similar or equivalent measures. Comparisons can be made to results of competitors, industry averages, or best-in-class organizations.

• I = Integration: Connection to important customer, product and service, market, process, and action plan performance measurements identified in the Organizational Profile and in Process Items.

• G = Gaps: Absence of results addressing specific areas of Category 7 Items, including the absence of results on key measures discussed in Categories 1-6.
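The T factor's "minimum of 3 data points" rule can be made concrete with a short sketch (illustrative only; the function name and classification labels are assumptions, and it assumes higher values are better for the measure):

```python
def trend_direction(points):
    """Classify the direction of a reported results series.

    Returns None when fewer than 3 data points are reported (too few to
    begin to ascertain a trend), otherwise 'improving' if every step is
    non-decreasing, 'adverse' if every step is non-increasing, else 'mixed'.
    """
    if len(points) < 3:
        return None
    deltas = [b - a for a, b in zip(points, points[1:])]
    if all(d >= 0 for d in deltas):
        return "improving"
    if all(d <= 0 for d in deltas):
        return "adverse"
    return "mixed"

print(trend_direction([72, 75]))          # None: too few points for a trend
print(trend_direction([72, 75, 81, 84]))  # improving
print(trend_direction([84, 80, 77]))      # adverse
```

Even with three points this is only a suggestion of direction; as the bullet notes, a statistically valid trend needs more data.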

Page 15:

Results Evaluation Factors

[Figure: line chart of MCB Percentile from 93-94 through 03-04 (y-axis: percentile, 45-95) plotted against the Top 10% and National Mean comparison lines; annotations mark the performance Level (Le), the MCB Trend, and the Comparisons, with an arrow denoting the "good" trend direction.]

Page 16:

Results Strengths and OFIs

• Strengths are identified if the following are observed:
  • Performance levels [Le] are equivalent to or better than comparatives and/or benchmarks,
  • Trends [T] show consistent improvement, and
  • Results are linked [Li] to key requirements

• Opportunities for improvement are identified if:
  • Performance levels [Le] are not as good as comparatives
  • Trends [T] show degrading performance
  • Comparisons are not shown
  • Results are not linked [Li] to key requirements
  • Results are not provided [G] for key processes and/or action items
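The bullets above can be read as a simple decision rule. The sketch below is only an illustration of that reading (the function name, argument names, and the priority order among OFI conditions are assumptions; actual findings are written as examiner comments, not computed):

```python
def classify_result(level, comparison, trend, linked, provided):
    """Tag one reported measure as strength or OFI evidence.

    level/comparison: numeric values on the same scale (higher is better);
    trend: 'improving', 'adverse', or None; linked: tied to key requirements;
    provided: whether the result is reported at all.
    """
    if not provided:
        return "OFI: result not provided [G]"
    if comparison is None:
        return "OFI: no comparison shown"
    if trend == "adverse":
        return "OFI: degrading performance [T]"
    if level < comparison:
        return "OFI: below comparatives [Le]"
    if not linked:
        return "OFI: not linked to key requirements [Li]"
    return "Strength: at/above comparatives, improving, and linked"

print(classify_result(82, 75, "improving", True, True))
print(classify_result(70, 75, "improving", True, True))
```

A real evaluation weighs all the factors together rather than stopping at the first failed check; the sketch just shows that each bullet maps to an observable condition on the data.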

Page 17:

Scoring a Results Item

• Scoring a Results Item is similar to scoring a Process Item

• Scoring Guidelines are provided for Le, T, C, and I

• A single "best fit" score is selected for each Results Item

Page 18:

Results Scoring for Level [Le]

SCORE | RESULTS

0%-5%: There are no organizational PERFORMANCE RESULTS, or poor RESULTS in areas reported.

10%-25%: A few organizational PERFORMANCE RESULTS are reported; there are some improvements and/or early good PERFORMANCE LEVELS in a few areas.

30%-45%: Improvements and/or good PERFORMANCE LEVELS are reported in many areas addressed in the Item requirements.

50%-65%: Improvement TRENDS and/or good PERFORMANCE LEVELS are reported for most areas addressed in the Item requirements.

70%-85%: Current PERFORMANCE LEVELS are good to excellent in most areas of importance to the Item requirements.

90%-100%: Current PERFORMANCE LEVELS are excellent in most areas of importance to the Item requirements.

Page 19:

Results Scoring for Trend [T]

SCORE | RESULTS

0%-5%: TREND data either are not reported or show mainly adverse TRENDS.

10%-25%: Little or no TREND data are reported, or many of the trends shown are adverse.

30%-45%: Early stages of developing TRENDS are evident.

50%-65%: No pattern of adverse TRENDS and no poor PERFORMANCE LEVELS are evident in areas of importance to your organization's KEY MISSION or business requirements.

70%-85%: Most improvement TRENDS and/or current PERFORMANCE LEVELS have been sustained over time.

90%-100%: Excellent improvement TRENDS and/or consistently excellent PERFORMANCE LEVELS are reported in most areas.

Page 20:

Results Scoring for Comparison [C]

SCORE | RESULTS

0%-5%: Comparative information is not reported.

10%-25%: Little or no comparative information is reported.

30%-45%: Early stages of obtaining comparative information are evident.

50%-65%: Some TRENDS and/or current PERFORMANCE LEVELS, evaluated against relevant comparisons and/or BENCHMARKS, show areas of good to very good relative PERFORMANCE.

70%-85%: Many to most reported TRENDS and/or current PERFORMANCE LEVELS, evaluated against relevant comparisons and/or BENCHMARKS, show areas of leadership and very good relative PERFORMANCE.

90%-100%: Evidence of industry and BENCHMARK leadership is demonstrated in many areas.

Page 21:

Results Evaluation: Key Concepts

• Integration: results align with key factors, e.g.,
  • Strategic challenges
  • Workforce requirements
  • Vision, mission, values

• Results presented for:
  • Key processes
  • Key products and services
  • Strategic accomplishments

• What examples can you think of?
  • Strong integration?
  • Not-so-strong integration?

Page 22:

Results Evaluation: Key Concepts

• Strong integration: results presented for
  • Key areas addressing strategic challenges
  • Key competitive advantages
  • Key customer requirements

• Not-so-strong integration:
  • Results missing for the above
  • Results presented that the Examiner can't match to Process Items or the Organizational Profile

Page 23:

Results Scoring for Integration [I]

SCORE | RESULTS

0%-5%: RESULTS are not reported for any areas of importance to your organization's KEY MISSION or business requirements.

10%-25%: RESULTS are reported for a few areas of importance to your organization's KEY MISSION or business requirements.

30%-45%: RESULTS are reported for many areas of importance to your organization's KEY MISSION or business requirements.

50%-65%: Organizational PERFORMANCE RESULTS address most KEY CUSTOMER, market, and PROCESS requirements.

70%-85%: Organizational PERFORMANCE RESULTS address most KEY CUSTOMER, market, PROCESS, and ACTION PLAN requirements.

90%-100%: Organizational PERFORMANCE RESULTS fully address KEY CUSTOMER, market, PROCESS, and ACTION PLAN requirements.

Page 24:

“Key” Results

• Refers to the elements or factors most critical to achieving the intended outcome

• In terms of results, look for:
  • Those responsive to the Criteria requirements
  • Those most important to the organization's success
  • Those that are essential for the organization to pursue or monitor in order to achieve its desired outcome

Page 25:

THANK YOU!!

Your support and participation as Examiners help us all by helping WSQA fulfill its mission!