

    Equitable Job Evaluation Project Overview Report

Department of Labour, September 2009



Contents

Background
  History and rationale
  Development and testing
  Key EJE design features
Implementation
  Beta release
  Conditions of use
  EJE Review
  Training modules
Pay investigations
  EJE projects
    Special Education Support Workers Pay Investigation
    Community Support Workers
    Department of Conservation (DOC) roles
    PSA project
  Roles evaluated with EJE
  Project terminated
Implementation issues for consideration in the review
  Training
  Process
  Factor plan
  Comparisons with other systems
  Becoming more efficient in implementation
  Identifying comparator occupations
  The time projects have taken
  Using EJE without its own market database of points and pay
  The future of EJE
Conclusion
Appendix 1: The Gender-inclusive Job Evaluation Standard, and Spotlight: A Skills Recognition Tool


    Background

    History and rationale

The Equitable Job Evaluation (EJE) System was developed as a result of Cabinet decisions implementing the Pay and Employment Equity Plan of Action 2004-2009. The Plan of Action followed the findings of the Tripartite Pay and Employment Taskforce, which commenced in 2003 and reported in 2004. One of the Taskforce findings was that undervaluation of female-dominated occupations was one of three main causes of the gender pay gap. The Taskforce recommended, and the Government agreed, that a gender-neutral job evaluation tool would facilitate evaluation of jobs free of gender bias. The Department of Labour was tasked with the development of the tool.

A Request for Quotation for the development of the gender-neutral job evaluation tool was issued for competitive tender on the Government procurement site, GETS (closed 5 November 2004). Four expressions of interest were received. The approach developed and negotiated involved the contributions of two providers with complementary expertise in job evaluation and gender bias, and that approach was the basis of the contracts with Watson Wyatt and Top Drawer Consultants. It was recognized that acceptance of the legitimacy of the tool for a range of stakeholders would depend on the tool being developed within the contexts of mainstream job evaluation provision in New Zealand and historic and current concerns about gender bias in job evaluation both internationally and in New Zealand.

A project team was formed, comprising consultants from Watson Wyatt (later taken over by Mercer), Top Drawer Consultants and Pulse HR, a representative of the State Services Commission, and the Director and Senior Adviser from the Pay and Employment Equity Unit, Department of Labour.

Various uses of EJE were envisaged. They included: stand-alone use of the tool as a job evaluation system for an organization or occupation; a tool for reflecting on the gender-inclusiveness of other systems in use for whole organizations or units, or particular occupations; and a tool for producing information about relative job values for use in bargaining and/or in claims for funding for remuneration that more fully and fairly reflects the value of jobs in female-dominated occupations.

Twenty-five of the 38 Public Service reviews raised concerns about, and agreed to address, gender bias in job evaluation, through the use of EJE and/or the Gender-inclusive Job Evaluation Standard (see below).

Development and testing

The EJE tool was developed and tested between 2005 and 2006. The EJE tool consists of:

1. A factor plan that includes:

   12 factors clustered in three factor families, with varying numbers of factor levels

   Guidance notes for using the factors

   Details of factor weightings and the conversion of raw scores to weighted scores (an illustrative sketch of this conversion follows the list of components below)

   Details on how to take organizational size into account in the scoring.


2. An EJE Users' Guide that provides information on how to set up a gender-neutral process and includes things such as selecting a job evaluation committee, the role of the committee chair, selecting benchmark jobs, collecting job information and conducting an evaluation.

3. An EJE questionnaire. This is recommended as the process for collecting information from job holders since it reflects the metrics of the factor plan.

    4. Four training modules on using the EJE system.
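The conversion of raw scores to weighted scores referred to in component 1 is a purely mechanical step. The short Python sketch below illustrates the arithmetic only; the factor names, level scores and weights are invented for the example, the real 12 factors and their fixed weightings are defined in the EJE Factor Plan, and the organisational-size adjustment is omitted.

```python
# Illustrative sketch only: the factors, level scores and weights below are
# invented. The real EJE factor plan defines 12 factors, their level scores,
# fixed weightings and an organisational-size adjustment (omitted here).

# Hypothetical raw level scores assigned by an evaluation committee.
raw_scores = {"knowledge": 4, "interpersonal skills": 3, "physical demands": 2}

# Hypothetical fixed weightings; in EJE the weightings are transparent and
# documented in the Factor Plan rather than chosen project by project.
weights = {"knowledge": 20, "interpersonal skills": 15, "physical demands": 10}

# Convert each raw level score to a weighted score, then sum to a total
# points value used to compare job sizes.
weighted_scores = {f: raw_scores[f] * weights[f] for f in raw_scores}
total_points = sum(weighted_scores.values())

print(weighted_scores)  # per-factor weighted scores
print(total_points)     # overall points for the job
```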

The project team initially researched national and international experience of features of job evaluation design and implementation which increase and mitigate the likelihood of gender bias. These gender-neutral principles were used to inform the EJE development process. This work was also used in the development of the Gender-inclusive Job Evaluation Standard (8007/2006) by Standards NZ, which provides a blueprint for all job evaluation systems to meet on a voluntary basis.[1] Appendix 1 briefly sets out the content and background of the Standard, and introduces another resource complementary to EJE.

In addition to the expertise of the project team, the development process included:

A review of local and overseas job evaluation systems, especially those specifically designed to minimize gender bias[2]

National and international peer review

Statistical analysis of results

Four targeted trials.

Each trial resulted in modification and refinement of the factor plan and the associated EJE material.

1. The first draft factor plan was tested by evaluating eleven typical jobs using position descriptions alone. This trial focused on the usability of the system and whether the features of the different jobs could be measured by the draft factors.

2. The second trial used position descriptions and EJE questionnaires completed by thirty job holders across the Public Service. This provided more comprehensive information on the ability of the draft factor plan to capture a wider range of work, and an opportunity to check the interpretation of the factors and the factor levels, to identify work dimensions that were not adequately captured, to more closely focus the scope of some factors, and to consider any issues of the interrelationship of the factors.

3. The third trial included testing the data gathering process, the application of the factor plan to 17 jobs within a single organisation, and the use of the system by a job evaluation committee from the organisation itself. It also allowed for comparison of the rank order of jobs produced by EJE with that which was established using another system within the organisation, and consideration of whether any differences were explainable.

4. The final trial examined the performance of EJE in evaluating occupational hierarchies in job families within the health and education sector. This trial also allowed a final review of language and level distinction.

[1] There was no existing Standard on gender bias and job evaluation, and the Standard has since attracted interest in South Africa, Spain, Australia and the United Kingdom. The Pay and Employment Equity Unit developed three resources on meeting the Standard: A Guide to the Gender-inclusive Job Evaluation Standard; Gender Bias in Job Evaluation: A Resource Collection; and Dorfox Meets the Standard: Gender-inclusive Job Evaluation.

[2] For example, in the United Kingdom, the job evaluation systems developed for local government and for the National Health Service were specifically designed to minimise gender bias. In April 2009, a decision was handed down in the United Kingdom case Hartley and Others v Northumbria Healthcare NHS Foundation Trust. It was found that Agenda for Change, the job evaluation system used in the health sector, was not affected by sex discrimination. The decision provides a comprehensive consideration of gender bias in job evaluation.

Key EJE design features

Complete and validated job data provided by jobholders

Transparent and fixed weightings[3]

Questionnaire that reflects what is to be measured

Full rationales for scoring decisions

Consistent processes, and

Training to avoid bias.

The project team was confident that the last version of EJE met the design principles of gender neutrality developed at the outset. Further modifications could only be achieved after real-life use of the tool. While EJE is a New Zealand system built for use in contemporary organizations,[4] the 12 factors are not a complete departure from other job evaluation systems.

    Implementation

Beta release

Cabinet originally agreed that EJE would be released in a beta release, a testing phase which could be concluded following an evaluation of the system after evaluation of a sufficiently large number of jobs [Cab 06 34/8]. That decision and others relating to the Pay and Employment Equity Plan of Action have been rescinded [Cab 09 16/12].

The use of a beta release phase is common in implementing a new job evaluation system. It recognizes that there may be a need to modify certain features of the scheme in the light of its operation and results. In the case of EJE, the beta release phase accommodates the reality that, while four tests had been conducted, there is some unavoidable artificiality in how the job evaluation process operates when its results are not being implemented. This artificiality affects data gathering, the operation of the job evaluation process, the way job holders, managers, and evaluators participate in the process, and the management of the results of the evaluation. Certain aspects of the operation of the scheme can only become apparent when a sufficiently large and diverse range of jobs is evaluated. The beta release allows the use of the tool, while recognizing that there may be some changes in the tool and some results may need to be revisited, without obstructing the use of the tool and the implementation of results.

Since the beta release phase involves monitoring of results, and is an important phase in quality assurance of the tool, standards were set regarding how the tool would be used. The beta release process was agreed in letters between the Department of Labour and the State Services Commission.

[3] The fixed weightings are set out in the Factor Plan. Consistent with other job evaluation systems, the weightings are based on full-time equivalent staff numbers and operational budgets.

[4] For example, the system recognises job features such as multicultural skills and leadership through influence.

Conditions of use

EJE has been provided to users with a Conditions of Use Agreement which requires, among other things, the use of the tool in its entirety, and requires users to provide the results of EJE evaluations to the DoL and to train participants appropriately for the EJE projects. In response to a high level of interest in the tool, EJE was also provided on a 'for information' basis to interested people, with the proviso that use of EJE for job evaluation without a Conditions of Use agreement contravened the basis on which the 'for information' copy is made available.[5]

EJE Review

A monitoring committee comprising employer and union representatives from the Public Service and the public health and education sectors was established by the Department of Labour to review EJE results as they emerged and to conduct the evaluation to conclude the beta release. The committee met once, on 10 May 2007. A review and monitoring framework was commissioned from Strategic Pay, using an Excel-based tool to record scores, evaluation rationales and remuneration information. The committee agreed to the EJE monitoring and review framework.

Training modules

Top Drawer Consultants and Working Wisdom were contracted to develop the EJE training modules. Three modules were initially developed:

1. Introduction to EJE (half day)
2. Data Gatherer training (one day)
3. Evaluator training (one day)

The training is a mix of presentation and practical experience of (for example) data gathering. A further stand-alone module was later developed by Top Drawer Consultants and Pulse HR: Minimising Bias in Human Resources Practice.

The three core EJE modules were piloted with volunteers (mainly from the Public Service Association and HR staff from Government departments). A training opportunity was provided to consultants and others to become competent in providing EJE training or to assist with EJE implementation within organizations.

As required by the EJE Conditions of Use agreement, training has been provided to data gatherers and the job evaluation committee for the Ministry of Education pay investigation of education support workers. Training for data gatherers was provided to the PSA for its internal project evaluating the jobs of its own staff, and for the Child, Youth and Family pay investigation of social workers.

[5] Following the disestablishment of the Pay and Employment Equity Unit, the basis on which EJE is provided has changed and is set out at the end of this report.


In addition to formal training sessions, there have been numerous brief presentations to seminars and groups to introduce EJE and give an outline of the system and its implementation. It became clear that there is a need to have some flexibility about the content and duration of the training, depending on existing levels of skills and knowledge about gender and job evaluation.

Pay investigations

The Plan of Action included a provision for pay investigations and remedial pay settlements.[6] A pay investigation was a process of systematic enquiry into all the factors affecting remuneration of female-dominated occupations (those where 70% or more of the people in the occupation are women). The investigation focused on factors that influence job size, such as skills, knowledge, responsibilities, demands and conditions. It also included other factors that affect pay rates, such as market influences, performance payments and other employment conditions.

Cabinet had determined that pay investigations could arise from agency or sector Pay and Employment Equity Review response or action plans; from a request to participate in a sector/cross-sector pay investigation; or in bargaining. Cabinet had approved terms of reference and guidelines for conducting a pay investigation [Cab 05 34/8]. Comparator occupations from outside the organization or bargaining unit, within the sector or across the sector, could be used if suitable internal comparators could not be identified.

The conduct of a pay investigation was seen as likely to include evaluating a sample of jobs to assess the relative size, content and contribution of target and comparator roles. It was anticipated that EJE would be the evaluation instrument, although, by agreement of the parties to the pay investigation, it was possible to use another job evaluation system providing it met the Gender-inclusive Job Evaluation Standard. It was intended that any remedial pay settlement arising from a pay investigation would be delivered mainly through existing budget planning and management mechanisms.

Two pay investigations were initiated as a result of organizational pay and employment equity review response plans. These were of social workers in Child, Youth and Family (CYF) (part of the Ministry of Social Development) and special education support workers in the Ministry of Education. Both investigations used EJE as the key evaluation tool, with another job evaluation system used to provide a comparison with the EJE outcomes.

The Ministry of Education, with the NZEI, completed the investigation process in January 2009. The results are now being used within the bargaining process. The investigation in CYF was not completed by the time the Government discontinued pay investigations early in February 2009. The Ministry of Education pay investigation (as it related to EJE) is described in more detail below.

    EJE projects

Special Education Support Workers Pay Investigation

The Pay and Employment Equity Review in the Ministry of Education suggested that the work of Special Education Support Workers may be undervalued because the work is primarily undertaken by women and has many of the characteristics found in research and pay equity cases to be associated with gender-related undervaluation. There are three support worker roles: Behavioural Support Workers, Communication Support Workers and Educational Support Workers. The comparator roles chosen were Corrections Officers and Hospital Orderlies. These were male-dominated occupations in the public sector, with the same skill level as education support workers, as measured by the Australian and New Zealand Standard Classification of Occupations (ANZSCO). The pay investigation was conducted by Janice Burns (Top Drawer Consultants) and Lyndy Young (Pulse HR), both of whom had been on the EJE development team.

[6] Pay investigations have been discontinued and the related Cabinet decisions have been rescinded [Cab 09 16/12].

The full report can be found on NZEI's website:

    http://www.nzei.org.nz/Group+Special+Education/Support+Workers+Pay+Investigation.html

The pay investigation used all the components of the EJE system, including the EJE questionnaire. Once the data had been gathered and validated, the information was evaluated using EJE. The job data was also provided to Mercer to undertake an evaluation using the Compers system.[7] This allowed comparison of the rank order generated by the EJE evaluation and the Compers evaluation. If the rank orders had been substantially different, there would have been an opportunity to analyse the ways in which the EJE factors (in terms of what they capture and measure) could have affected the scores. The EJE and Mercer evaluations used the same job information and produced substantially similar results. Both evaluations supported the hypothesis that the Special Education Support Workers' jobs were of similar job size to Corrections Officers, who are paid considerably more, and of larger size than Hospital Orderlies, who are also paid more.

Community Support Workers

Community support workers are employed by private sector and not-for-profit organizations. These organizations are funded by the government to provide 24-hour, seven-day-a-week care in community houses for people with intellectual disabilities (they may also have physical and behavioural disabilities). The PSA and employers agree that the community support workers are underpaid for the skill level of the work they do.

The PSA commissioned Janice Burns of Top Drawer Consultants to undertake an EJE evaluation to compare community support workers with other occupations of similar ANZSCO[8] classification and entry requirements. The project was supported by the employers. Two comparators were selected: therapy assistants (female-dominated) and corrections officers (male-dominated). Community support worker and therapy assistant roles have not previously been sized through job evaluation.

The results supported the view that the jobs are of substantially similar size. It is likely that EJE captured and measured aspects of the community support worker role (including services to people and emotional and physical demands) that have previously not been acknowledged in pay setting. This was a small-scale exercise involving nine jobs, and the data gathering and evaluation were done by the consultant. However, in line with the Gender-inclusive Job Evaluation Standard, the evaluation was independently validated by another consultant.

[7] Compers is owned by Mercer through a merger with Watson Wyatt. The Ministry of Education used Compers as its job evaluation system. Compers is no longer supported by Mercer.

[8] Community support workers and the comparator occupations are classified at level 4. There are five skill levels in ANZSCO, defined in terms of formal education and training, previous experience and on-the-job training. Skill level 4 is a level of skill commensurate with a NZ level 2 or 3 qualification or at least one year of relevant experience.

    Department of Conservation (DOC) roles

This project was undertaken by David Shannon, Senior Consultant, (then) Mercer. Its purpose was to consider whether use of the Equitable Job Evaluation System would produce different relativities among certain roles from those produced by using Compers, the Department's existing system. The project arose partly in response to questions raised during the Department's Pay and Employment Equity Review as to whether the Department's existing job evaluations were free of gender bias.

The DOC roles involved in this project were:

Ranger, Visitor Centre

Ranger (Service)

Business Services Officer (Finance)

Ranger (Biodiversity Assets).

The first three roles were female-dominated, and the Ranger (Biodiversity Assets) role was male-dominated. All positions are grade 7 in Compers. The rank order of the positions was substantially similar in the EJE and Compers evaluations, and the points values in EJE were fairly similar for all the positions. The two systems did not produce different rank orders in points for the male- and female-dominated jobs.

While two of the specific ranger roles in this exercise are female-dominated, overall the ranger occupation in DOC, at the time of the Pay and Employment Equity Review, was male-dominated (80%). Business services roles at level 1 are female-dominated (83%) and male-dominated at level 2 (60%).

Experience of developing EJE would indicate that EJE would not be expected to alter significantly the established relativities of work such as ranger, in that front-line service work is (with exceptions for particular roles within the occupation) not a core component. The same is true of the finance role. While EJE would offer some potential gains in terms of recognition of aspects of the ranger job such as physical skills, these roles would not usually score highly on the more distinctive people/service factors.

The results would seem to confirm that while EJE is likely to alter the traditional relativities between some jobs, there are many jobs for which it will produce a result similar to that of other job evaluation systems.

    PSA project

In 2007 the PSA undertook a Pay and Employment Equity review. The review was committed to in bargaining between the PSA staff union representatives and the PSA. The PSA Joint Union Management Committee (JUMC) sponsored the review. The JUMC is made up of the National Secretariat, a representative of the Assistant Secretaries, the Union Group (UG) and the PSA Member Employee Group (MEG). The JUMC agreed on the following recommendations in the PAEE report:

1. developing job descriptions for all 20 PSA positions using the EJE questionnaire and the job evaluation record

2. EJE training for participants on the job evaluation committee

3. undertaking an Equitable Job Evaluation (EJE) exercise of all 20 positions within the PSA.


    A working party was established to carry out these tasks.

Developing Job Descriptions

The first phase of the project involved developing gender-neutral job descriptions for every job in the PSA covered by the Collective Agreement. The working party drafted gender-neutral job descriptions for every job within the PSA. The positions are:

Assistant Secretary

Financial Controller

Human Resources Advisor

Policy Advisor

Legal officer

Asset Manager

Communications Advisor

Organiser

Knowledge management and information Advisor

IT database development

Senior Finance Officer

IT systems officer

IT service network database

PSA asset and administration officer

On line organiser

Finance systems and processes officer

Membership officer

Organising administrator

Database development administrator

Support Administration

Collecting job data

After consultation with staff on the job descriptions, the new job descriptions were approved and job holders were interviewed using the Equitable Job Evaluation Questionnaire. The outcome of the interviews was reviewed by the National Secretaries and Assistant Secretaries:

Is the material an accurate reflection of the job?

Is there anything missing that should be added?

    Their comments were available to the working party.

Evaluation Committee

The evaluation committee was made up of the members of the working party and an external consultant with in-depth knowledge of and experience in EJE. In line with the EJE Conditions of Use Agreement with the Department of Labour, the evaluation committee was trained in using the Equitable Job Evaluation factor plan. The training covered:

    the ground rules for working together on the evaluation team;

    an in-depth discussion of EJE factors and their underlying concepts;

    avoiding bias in the evaluation process;

    the evaluation process.

The committee did not undertake evaluations of their own roles; this was done by the external consultant. In addition, the committee undertook the evaluation to the point of determining the level score only. This meant that, prior to applying the factor weightings,[9] it is not possible to see the final relationships between the jobs. It avoids any tendency to make jobs come out where it 'feels right'. The factor weights were applied by the external consultant after the committee finished all the evaluations; this is simply a mechanical process. On completion of the evaluations the committee reviewed all the level scores for consistency.

The committee evaluated all jobs on the information that was available from the job descriptions and the questionnaires. The committee evaluated the jobs on the basis of this information and what the PSA would require of a competent job holder, not on the job holder who currently occupies the job. Job evaluation does not factor in length of service. That is a separate process determined by the progression structure of the remuneration system.

The evaluation process took four days. It is important to check that jobs evaluated at different points in time have been treated consistently. Often, over time, evaluation committees increase their understanding of the factors and the way levels are assigned.

All staff received a copy of the job scores for their own role, a copy of the weighted scores chart for all jobs, and the evaluators' written rationales for reaching those scores.

    Review process

In line with the requirements of the Gender-inclusive Job Evaluation Standard, a review/appeal process was established for jobholders. This allowed staff to request a review of their job evaluation score if they felt key information had been omitted. Four jobholders sought a review and, as a result, job scores were amended.

Outcome

The final job scores were discussed at the JUMC. The final EJE results showed that some jobs have substantially changed their internal relativity. These are mainly female-dominated jobs, and their pay ranges were adjusted to reflect the EJE results. Some jobs have moved downwards in relation to other jobs. This may be addressed if a vacancy arises. Another outcome was that the administrative jobs are all about the same size.

Roles evaluated with EJE

In the projects completed to date the following roles were evaluated with EJE:

Special Education Support Workers (Behavioural support workers, Education support workers and Communications support workers)

    Corrections Officers

    Hospital Orderlies

    Therapy Assistants

    Community Support Workers

    Three DOC Ranger roles

    Finance role within DOC

    Assistant Secretary

    Financial Controller

    Human Resources Advisor

    Policy Advisor

    Legal officer

    Asset Manager

Communications Advisor

Organiser

Knowledge management and information Advisor

IT database development

Senior Finance Officer

IT systems officer

IT service network database

PSA asset and administration officer

On line organiser

Finance systems and processes officer

Membership officer

Organising administrator

Database development administrator

Support Administration.

[9] These weightings are transparent and are documented in the EJE Factor Plan.

Project terminated

As mentioned above, the CYF social worker pay investigation was terminated in February 2009. Interviews had taken place with a sample of jobholders in each of the five social work roles, and planning was underway to establish the comparator roles.

    Implementation issues for consideration in the review

The discussion below describes some of the issues and ideas that arose during the implementation of EJE training, data gathering, the evaluation process, and the description and interpretation of the EJE factor plan and guidance notes. While further use of the EJE system would likely refine these ideas or suggestions, they are probably typical of the issues that would be considered in the review process once the beta phase was completed.

Training

Experience has shown that the content and duration of the EJE training needs to be sensitive to the skills and experience of the audience. However, it was also clear that people may not know what they don't know in terms of understanding the potential for gender bias in job evaluation and HR processes generally.

    Examples of adaptations made to the material are:

Training Ministry of Education data gatherers, who were HR advisors with experience of interviewing and some general job evaluation knowledge. More emphasis was placed in the course on understanding gender bias. The session time was reduced to a half day on condition that they had personally completed the questionnaire.

Training PSA data gatherers: they too had good interviewing skills and all had some familiarity (some in depth) with job evaluation and with issues of bias. The training took half a day, with an hour's follow-up later. None of the participants had completed the EJE questionnaire, and so extra time was allocated to going through the questionnaire.

The modules cover a lot of material and have relied on participants doing some substantial 'homework' prior to the course, such as reading the factor plan and/or completing the EJE questionnaire for their own job. This homework did not happen. The modules may need to be revised to allow for this reality, perhaps by spending more time on the meanings of the factors or going through the questionnaire to demonstrate why particular questions are being asked.


The success of the data gatherer and the evaluation training depends on having sufficient time for practice. In any revision or adaptation this should not be compromised.

Process

In the completed EJE projects, from data gathering to evaluations, the process as specified in the EJE Users' Guide generally worked well.

Some particular processes worked well and could be emphasized or added to the Guide as tips. These were:

Starting the data gathering interview by asking the job holder for a general overview of their job (perhaps by asking about a typical day) provides the interviewer with some useful vocabulary and a slightly wider knowledge base from which to probe

Having the job information validated and signed off meant that there was confidence that it was complete and accurate. This supported the practice of using only the validated information in decision-making. It also meant that it was not necessary to ask the job holder to answer questions from the committee

The chair of the committee was one of the consultants. They did not have voting rights but provided independent clarification of factors and factor level distinctions when required

The committee adopted a practice of 'parking' a decision on a particular factor when there was not sufficient agreement. Often, in discussion of other factors or on further reading of the questionnaire or job descriptions, agreement was possible

On the initial reading of the questionnaire, committee members were encouraged to use post-it notes to mark text where the information provided in one factor was pertinent to another factor

While the knowledge factor is the first one in the questionnaire, it is also one of the more conceptually complex factors. The committee decided to score this factor after the other skills factors had been scored

It is important to read and re-read the guidance notes for the factors; this allows correction of any misunderstanding or 'group think' about what the factor really means

The committee confirmed that it is essential to take the time to develop the scoring rationale once the factor has been evaluated. It is impossible to re-create this later. The committee articulated its reasons and one person wordsmithed the final text, which was read to the committee for agreement or modification.

EJE is a new system and it took the committee some time to come to grips with the factors and their meaning. To ensure that everyone understood the factors, it was decided that the initial reading of the job information would occur at the committee. Members became more confident during the process. At this point in a committee's development it becomes possible to read the material independently prior to the meeting, and even to begin scoring, as long as clear rationales are kept at the point of scoring. This would substantially reduce the time it takes to evaluate the jobs. Independent scoring does increase the requirement of the chair to be alert to bias in individual decision-making by observing patterns of scoring and rationales.

Factor plan

Experience working with EJE is limited. However, the experience does suggest consideration of changes to the factors or the factor guidelines. Some of the changes are in factor wording, refining factor concepts, or in the number of, or distinction between, factor levels. Some of the minor changes could be made in advance of any review because they provide greater clarity for use, rather than any substantive change.

Comparisons with other systems

To date, EJE results have been compared with the outcomes of one other system, Compers, now owned by Mercer (previously by Watson Wyatt). This system is being phased out in New Zealand.

In the special education support workers pay investigation, the EJE data (completed EJE questionnaire and agreed job description) was used by the Mercer consultant for their Compers evaluation. Given the emphasis that the EJE system places on having complete information about jobs, this was important and may have contributed to the similarity of the job rankings between the two systems.

It is anticipated that EJE will be used as a gender neutrality check of other systems within some organisations, perhaps by taking a small sample of the roles, evaluating them using EJE and comparing rank order. Where there are differences, it should be possible to isolate the job characteristics that are being counted (or not) in one system relative to the other system. Where these coincide with gender difference, further exploration will be warranted.
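Comparing rank orders in this way is straightforward once each system has produced a total points score for the same sample of roles. The sketch below uses invented roles and points (they are not EJE or Compers results) to show one simple way such a comparison might be done.

```python
# Illustrative sketch only: the roles and points below are invented, not actual
# EJE results or results from any other system. It compares the rank orders
# produced by two job evaluation systems for the same sample of roles.

# Hypothetical total points for the same roles under two systems.
eje_points = {"role A": 420, "role B": 380, "role C": 300, "role D": 250}
other_points = {"role A": 410, "role B": 300, "role C": 320, "role D": 240}

def rank_order(points):
    """Return roles sorted from largest to smallest evaluated size."""
    return sorted(points, key=points.get, reverse=True)

eje_ranking = rank_order(eje_points)
other_ranking = rank_order(other_points)

print("EJE rank order:  ", eje_ranking)
print("Other rank order:", other_ranking)

# Flag roles whose position differs between the two rankings; these are the
# jobs whose characteristics warrant closer examination, especially where
# the differences coincide with the gender dominance of the roles.
for role in eje_points:
    diff = eje_ranking.index(role) - other_ranking.index(role)
    if diff != 0:
        print(f"{role}: ranked {abs(diff)} place(s) differently between systems")
```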

Becoming more efficient in implementation

EJE is in its beta release phase. It may be that some efficiencies in process can be achieved in the light of further experience. However, in the testing phase it has been seen as important to test all the elements that experience elsewhere, and research and jurisprudence, suggest contribute to best-practice gender-neutral job evaluation.

The questions below illustrate some areas where experience (mainly of the two pay investigations) would indicate the need for some efficiencies, and further guidelines/advice on making these.

Identifying comparator occupations

The general principle was to use male-dominated occupations that are largely public-sector based, have their pay set in the public sector market, and are at the same ANZSCO skill level. There was some anxiety about using occupations that were apparently dissimilar in the nature of the work.

The use of male-dominated comparators has been seen as necessary in the early stages of implementation of EJE, to support arguments that differences in remuneration among jobs that are of similar job-evaluated size are gender-related. It would also be possible to use female-dominated comparators where it could be demonstrated that evaluation of them was free of gender bias.


A difficulty that arises in choosing appropriate comparators is that the more dissimilarities there are in the work of the male- and female-dominated occupations, the more likely it is that the differences in remuneration may be seen as appropriately and legitimately attributable to differences in the nature of the work, irrespective of comparability of measured job size. However, it is likely that there will be significant differences between female- and male-dominated jobs because of the way occupational segregation has developed in the labour market, and has been reflected in job evaluation and wage-fixing systems over time.

The use of specific male-dominated comparators to establish gender-neutral job evaluation may not be necessary, since job evaluation systems themselves build in comparisons among jobs of similar size irrespective of similarity of job content. Once there is confidence in the job evaluation system, and it can be demonstrated to be gender neutral, it can be sufficient that evaluations are carried out using the gender-neutral job evaluation system. In the UK, Canada and the USA, among other countries, the use of a gender-neutral job evaluation system on its own provides evidence about the relative value of jobs. In the equal remuneration principles established in the Australian states of New South Wales and Queensland, comparators specifically were not required.

In operational terms, some delays were occasioned in seeking the participation of external organizations in providing comparator occupations. This is probably related to concerns about the possible industrial implications for the organization providing the comparators, as well as about the resources involved in participating.

The demands on comparator organisations were very similar to those on the target organization: provide contact people to locate jobholders; provide time for interviews (up to two hours); provide other people, managers and staff, to validate the job information; and provide details of remuneration setting and any factors that impact on this. Once comparator sites were located, job holders themselves were extremely cooperative and helpful.

Choosing comparators at the national level could not recognise or take account of organisational or local complexities, such as (in the case of one DHB) the pressure on orderlies to manage the move to the new hospital block, or the media attention to a series of events at one prison that made staff wary of extra engagement.

The time projects have taken

The projects have had considerably more complex governance arrangements than job evaluation generally does, partly in response to the perceived need to manage the risks of a new tool and new process, and concerns about managing the possible resource implications. This also meant that key decisions took considerable time.

The administrative support required for the projects was substantial: organising data gathering interviews, negotiating participation of job holders and managers, and negotiating participation of comparator organizations and job holders and their managers.

In both pay investigations the working groups decided that the sample of target groups had to meet particular criteria, including geographic spread. That was because in some cases there was a belief that the job was implemented differently in different locations. In other cases the decision was to assure job holders that the process was inclusive. The result was that a great deal of travel was necessary, sometimes for only one appointment. This increased the time and cost of the projects.

An underlying principle of EJE is that gender neutrality in job evaluation requires a significant level of participation of employees, in both data gathering and in evaluation.

Some job evaluation practitioners and/or systems rely on existing job descriptions, which may be incomplete, out of date and/or not endorsed as accurate by job holders and their managers. Using the EJE job information questionnaire to collect job information is time-consuming. Experience with the major UK job evaluation schemes aimed at improving gender neutrality is that the quality of the evaluations depends heavily on the quality of the job information, and endorsement of job information by job holders and their managers is a critical element of that.

Some job evaluations are carried out solely by a single consultant. While that can be quick, there is a potential for cutting corners and for bias. The Gender-inclusive Job Evaluation Standard P8007/2006 advises that a committee be used or, if the project is a small one, that there be a quality assurance process involving another evaluator.[10]

Using EJE without its own market database of points and pay

EJE can be used without a database of its own by drawing on a range of market information for some jobs to provide anchor points for an EJE points-to-pay line.

For example, some jobs considered unlikely to be affected by the differences arising from specific features of EJE can be used as reference points for one or more market surveys. Organisations often do refer to more than one market survey in setting pay rates.
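A minimal sketch of this approach is shown below. The anchor jobs, points and pay figures are invented, and a real exercise would follow the organisation's own pay-line methodology and chosen market surveys; the example only illustrates how a few anchor points can define a points-to-pay line from which indicative pay for other EJE-evaluated roles can be read.

```python
# Illustrative sketch only: the anchor jobs, points and pay rates are invented.
# A few anchor jobs whose market pay is known define a simple points-to-pay
# line; indicative pay for other EJE-evaluated roles is then read off the line.

# Anchor jobs: (EJE points, market pay rate taken from survey data)
anchors = [(250, 42_000), (400, 55_000), (550, 68_000), (700, 82_000)]

# Ordinary least-squares fit of pay = intercept + slope * points.
n = len(anchors)
mean_points = sum(p for p, _ in anchors) / n
mean_pay = sum(pay for _, pay in anchors) / n
slope = (
    sum((p - mean_points) * (pay - mean_pay) for p, pay in anchors)
    / sum((p - mean_points) ** 2 for p, _ in anchors)
)
intercept = mean_pay - slope * mean_points

def indicative_pay(points):
    """Read an indicative pay rate off the fitted points-to-pay line."""
    return intercept + slope * points

# Indicative pay for a newly evaluated role scoring 480 EJE points.
print(round(indicative_pay(480)))
```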

Concerns about how to use the results of EJE while its own market database was being developed, and also an interest in gathering data on how EJE evaluations compare with evaluations using other systems, have meant that EJE was used in conjunction with other systems in most projects. The community support worker project involved use of EJE alone, and the results were for use in bargaining and in discussions with funding bodies.

The future of EJE

Every effort was made to minimise the risk of gender bias during the design of the EJE tool, associated resources and implementation process. Maintenance of this gender neutrality in the use of EJE relies on some of the integrity supports being in place. The Pay and Employment Equity Unit (PEEU) has managed access to, use of, and further development of EJE through mechanisms such as the Conditions of Use Agreement and the Monitoring and Review Committee.

The PEEU has now been disestablished. In order to maintain the integrity of EJE, the following issues/processes will need to be supported in an alternative manner:

    - Ownership of EJE

- Conditions of Use Agreement

- Providing the EJE materials

- Training

- Review/refinement

- Building up a database

- Monitoring EJE outcomes

[10] Sue Hastings, a UK job evaluation expert, has questioned whether it is possible to support a claim that a job evaluation process is gender neutral if a joint employer/employee job evaluation committee has not been used.

The Department of Labour will continue to deliver the EJE tool. The Department's website will display information about the tool, and a Conditions of Use Agreement that must be completed to order the tool. The tool will then be delivered by email.

Periodically, users will be requested to provide the information they have agreed to provide about the project(s) they are undertaking, and the information will be entered into the EJE monitoring tool.[11] The Monitoring and Review Committee will be convened annually to review the evaluation records and scores. If there is a need for specialist job evaluation advice, it can be requested from the panel of job evaluation providers who have experience and/or training with EJE and have agreed to provide advice. The Monitoring and Review Committee will have the responsibility of conducting a review of the beta release of the tool once there have been several hundred evaluations, and of recommending to the Department of Labour any changes the Committee considers necessary.

Conclusion

There has been limited use of EJE to date. Experience in using the tool is that it can contribute to full and fair description and analysis of jobs, especially in service sector occupations. Participants in the job evaluation projects have valued the contribution of the EJE language and concepts to capturing job elements in better ways. The extent of the contribution will depend on the level of ongoing use and the management of the quality and integrity of the system.

[11] The EJE Monitoring Tool is stored by the Department of Labour and the current database of job evaluation scores (for ongoing updates and committee monitoring) includes thirty-one roles.


Appendix 1: The Gender-inclusive Job Evaluation Standard, and Spotlight: A Skills Recognition Tool

The Gender-inclusive Job Evaluation Standard was developed during 2006, adopted by the New Zealand Standards Council, and published by Standards New Zealand in December 2006. The Standard's development committee included representatives of employers and unions, experts in job evaluation and in gender issues, and the Human Resources Institute of New Zealand. The job evaluation providers on the committee (Hay, Mercer and Strategic Pay) included the major providers of job evaluation in New Zealand. They have all undertaken to meet the Standard, as have other providers. The companies have provided statements on how their systems meet the Standard and, where clients request it, statements on how the processes for particular evaluation projects meet the Standard. These statements provide valuable input for clients. As the Standard is a voluntary one, responsibility for demonstrating how the Standard is met lies with those who claim to meet it, and responsibility for assessing their claims with those to whom they make the claim (most commonly human resources managers). Some job evaluation providers have advised that they now provide training and/or briefing on gender-neutral job evaluation for their own consultants and for participants in job evaluation projects.

    The Standard is presented in four sections:

    A description of job evaluation and of the Standard

    An outline of how gender bias can arise in job evaluation

Requirements and optional guidance for planning and preparing job evaluation projects

Requirements and optional guidance for evaluation of jobs and reviewing evaluations, including appeal and review procedures, and the issue of slotting.

Spotlight: A Skills Recognition Tool was developed and tested in New Zealand public sector workplaces by an Australian and New Zealand team led by Dr Anne Junor, University of New South Wales. The tool is intended to improve recognition of skills, especially those in service sector occupations, and to inform a range of human resources management processes including recruitment, writing position descriptions, learning and development, and job evaluation. It complements other skills, job description and job analysis instruments, and focuses specifically on the types of skills that are often overlooked, especially in human services work and in jobs in the lower levels of organisational hierarchies. The main types of skills often overlooked are the skills of combining activities in work streams, and those involved in the sensitive, responsive and integrated delivery of appropriate services to people.

Spotlight provides a taxonomy of three sets of under-recognised tacit work skills, each divided into three skill elements, and five experience-based skill levels at which each skill element can be used. It can be used to describe the performance of work in any job at any functional level. It has a set of pre-classified, empirically derived work activity descriptors through which the skill elements and levels can be recognised. Based on this set of descriptors, it provides a job analysis questionnaire for use in identifying the implicit demand for those skills in any job, and a skills audit questionnaire for use by individuals and teams to identify their level of proficiency in using these skills. It includes a cross-referencing system whereby personal attributes and employability skills can be defined more precisely and at different levels of workplace learning (the skills of experience), specifically focusing on attributes, customer focus, problem-solving, teamwork and leadership. It also incorporates a succinct graphical technique for representing the combination of tacit work process skills and levels required by a job and/or within an individual's capabilities at a point in time. Several briefings on the tool have been provided and its application is being explored in some community sector settings.
