THE GP, THE CONTRACTOR & THE EMPLOYER
Marion Foster, March 2011
TRANSCRIPT
GP Partner as a Contractor and Employer
OBJECTIVES
UNDERSTAND:
The duties, rights and responsibilities
The risks
ABRIDGED PRESENTATION — FOR ILLUSTRATIVE PURPOSES ONLY
© 2009, Clifford & Garde
Major Malfunction
Package
CAN GENERAL PRACTICE LEARN FROM THE CHALLENGER DISASTER?
THE LEGACY OF CHALLENGER
The Rogers Commission, which investigated the incident, determined:
The SRB joint failed when jet flames burned through both o-rings in the joint
NASA had long known about recurrent damage to o-rings
Increasing levels of o-ring damage had been tolerated over time
Based upon the rationale that “nothing bad has happened yet” - complacency
KEY ORGANIZATIONAL CULTURE FINDINGS – What NASA Did Not Do
1. Maintain Sense Of Vulnerability
2. Combat “Normalization Of Deviance”
3. Establish an Imperative for Safety
4. Perform Valid/Timely Hazard/Risk Assessments
5. Ensure Open and Frank Communications
6. Learn and Advance the Culture
MAINTAINING A SENSE OF VULNERABILITY
NASA’s successes (the Apollo program, et al.) had created a “can do” attitude that minimized the consideration of failure
Near-misses were regarded as successes of a robust system rather than near-failures
A weak sense of vulnerability can lead to taking future success for granted… and to taking greater risks
NASA’s “can do” attitude often made it hard for individuals (even groups) to step forward and say “this can’t be done.” The imperative of “we must succeed” had overwhelmed the consideration of “we could fail.”
COMBATING NORMALIZATION OF DEVIANCE
“This history portrays an incremental descent into poor judgment.”
Diane Vaughan, The Challenger Launch Decision
Each successful mission reinforced the perception that foam shedding was unavoidable… either unlikely to jeopardize safety or an acceptable risk.
Foam shedding, which violated the shuttle design basis, had been normalized
Challenger parallel… tolerance of damage to the primary o-ring… led to tolerance of failure of the primary o-ring… which led to tolerance of damage to the secondary o-ring… which led to DISASTER
ESTABLISH AN IMPERATIVE FOR SAFETY
“When I ask for the budget to be cut, I’m told it’s going to impact SAFETY on the Space Shuttle … I think that’s a bunch of crap.” Daniel S. Goldin, NASA Administrator, 1994
Burden of proof
The technical staff for both Challenger and Columbia were put in the position of having to prove that management’s intentions were unsafe.
The traditional approach - to assume that a problem existed, then seek the sound technical evidence and analysis necessary to prove (if possible) that the problem did not exist.
This reversed their normal role of having to prove mission safety.
ESTABLISH AN IMPERATIVE FOR SAFETY
“International Space Station deadline: 19 Feb 04” – a desktop screensaver at NASA
As with Challenger, future NASA funding required meeting an ambitious launch schedule
Conditions/checks, once “critical,” were now waived
A significant foam strike on a recent mission was not resolved prior to Columbia’s launch
Priorities conflicted… and production won over safety
PERFORM VALID/TIMELY HAZARD/RISK ASSESSMENTS
“Any more activity today on the tile damage or are people just relegated to crossing their fingers and hoping for the best?”
Email Exchange at NASA
“… hazard analysis processes are applied inconsistently across systems, subsystems, assemblies, and components.”
CAIB Report, Vol. 1, p. 188
NASA lacked consistent, structured approaches for identifying hazards and assessing risks
Many analyses were subjective, and many action items from studies were not addressed
In lieu of proper risk assessments, many identified concerns were simply labeled as “acceptable”
Invalid computer modeling of the foam strike was conducted by “green” analysts
ENSURE OPEN AND FRANK COMMUNICATIONS
I must emphasize (again) that severe enough damage… could present potentially grave hazards… Remember the NASA safety posters everywhere around stating, “If it’s not safe, say so”? Yes, it’s that serious.
Memo that was composed but never sent
Management adopted a uniform mindset that foam strikes were not a concern and was not open to contrary opinions.
The organizational culture:
Did not encourage “bad news”
Encouraged 100% consensus
Emphasized only “chain of command” communications
Allowed rank and status to trump expertise
LEARN AND ADVANCE THE CULTURE
The organizational dysfunctions that had been identified in the Challenger incident, and which persisted through the Columbia incident, strongly suggest that NASA had not learned from its mistakes…
NASA had not learned from the lessons of Challenger
Communications problems still existed
Experts with divergent opinions still had difficulty getting heard
Normalization of deviance was still occurring
Schedules often still dominated over safety concerns
Hazard/risk assessments were still shallow
Abnormal events were not studied in sufficient detail, or trended to maximize learning
What is Safety Culture and Why is it Important?
Safety culture is the critical barrier that protects workers, the public, and the environment from the inherent risks or dangers in the organisation’s work.
WHAT IS RISK?
The Chinese word for risk, Wei Ji, combines “Wei” (danger) with “Ji” (opportunity)
REMEMBER: RISK HAS AN UPSIDE!
WHAT ARE OUR HIGH RISK AREAS?
System failures: negligence claims
Over ¼ of 200 cases settled:
Repeat prescribing – 11.5%
Dealing with results – 7%
General office systems – 3.5%
Cervical cytology – 1.5%
Recall / protocols / referrals – 5%
(MDDUS statistics)
EXAMPLES OF LEGISLATION
Freedom of Information Act 2002
The Medicines Act 1968
The Misuse of Drugs Act 1971
The Children (Scotland) Act 1995
Health and Safety at Work Act 1974
Data Protection Act 1998
Access to Health Records Act 1990
Business Continuity Planning
• Low likelihood
• Very High impact
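The low-likelihood / very-high-impact framing above is the standard likelihood × impact scoring used in risk assessments. The sketch below is a minimal illustration of that idea, not part of the presentation; the 1–5 rating scales and the band thresholds are assumptions chosen for the example.

```python
# Minimal likelihood x impact risk-scoring sketch (illustrative only;
# the 1-5 scales and band thresholds are assumptions, not from the talk).

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
IMPACT = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}

def risk_score(likelihood: str, impact: str) -> int:
    """Score = likelihood rating x impact rating (range 1..25)."""
    return LIKELIHOOD[likelihood] * IMPACT[impact]

def risk_band(score: int) -> str:
    """Map a score to an action band (thresholds are assumed)."""
    if score >= 15:
        return "high - act now"
    if score >= 8:
        return "medium - plan mitigation"
    return "low - monitor"

# A fire or flood at the premises: low likelihood, very high impact.
score = risk_score("unlikely", "catastrophic")
print(score, risk_band(score))
```

Note that a purely multiplicative score can under-rank low-likelihood events with extreme impact (the example above lands in the middle band), which is one reason business continuity planning treats impact as the overriding criterion rather than the combined score alone.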
BUSINESS CONTINUITY MANAGEMENT
CIVIL CONTINGENCIES ACT 2004
• Loss of premises - fire / flood
• Loss of staff - Key People / Mass sickness / Flu
• Loss of IT / Patient Records / QoF Data / Key documents
• Loss of power
• Restricted access to building
• Partnership split
• Loss of telephones
Corporate Governance
Business Continuity Management:
• Site Recovery planning
• Business Continuity planning
• Work Area Recovery planning
• Human Resource planning
• Technology Recovery planning
• Crisis Management planning
Managing the crisis:
• Essential staff
• Staff relocation
• Business as usual
• Site salvage / restoration
Context
Scenarios