Development of a Promotional 1
Running head: Development Of A Promotional Assessment Center
Development of a Promotional Assessment Center
in the Springfield Fire Department
Jerrold E. Prendergast
Springfield Massachusetts Fire Department
I hereby certify that this paper constitutes my own language, where the language of
others is set forth, quotation marks so indicate and that appropriate credit is given where I have
used the language, ideas, expressions, or writings of another.
Signed ____________________________________
Abstract
A properly developed and implemented promotional assessment center can offer a superior
evaluation method for determining the most qualified Fire Lieutenant candidates. The problem is
that, having identified an assessment center as a better process for selecting future Fire
Lieutenants, the SFD does not know how to develop the assessment center process. The purpose of
this ARP is to make recommendations to the Fire Commissioner as to how the SFD should develop an
assessment center for Fire Lieutenant candidates.
The evaluative research method will be used to answer the following research questions:
what are the components of a promotional assessment center; what would be the internal barriers
to the successful utilization of an assessment center in the SFD; how do similar sized fire
departments in the US that utilize assessment centers conduct their assessment centers; and how
should the SFD ensure the successful implementation of the promotional assessment center?
The results indicate that, by following long-established guidelines that detail the
common components that should be included in an assessment center, the SFD can develop and
implement a successful assessment center to determine the most qualified Fire Lieutenant
candidates. The research suggests that the majority of the US fire departments surveyed have
included many of these common components in their own promotional assessment centers; however, it
is unclear whether this is due to any familiarity with the guidelines. Internal barriers to the
implementation of a promotional assessment center that should be considered include poor
planning, misuse of results, negative employee perception, and failure to maintain the relevancy
of the assessment center.
Table of Contents
Certification Statement
Abstract
Introduction
Background and Significance
Literature Review
Procedures
Results
Discussion
Recommendations
Reference List
Appendix A: Springfield FD Organizational Chart
Appendix B: HRD Fire Lieutenant Reading List
Appendix C: HRD Promotional Education/Experience Work Sheet
Appendix D: US Fire Departments with Staffing Levels in the Range of 200-400
Appendix E: Request to Participate in Questionnaire
Appendix F: Questionnaire and Results
Introduction
Among the many responsibilities of a fire service manager, one of the most important and
most difficult is the selection of individuals for promotion to their first fire officer
position, since selecting someone other than the most qualified candidate can have detrimental
effects on the future development of both the officer and the department (Bachtler & Brennan,
1995). A careful and thorough review of each candidate's knowledge, skills, and abilities can
greatly assist in making an informed decision regarding which candidate(s) are the most
qualified to assume the duties of an officer.
In general, qualified promotional candidates are identified through one or more
evaluation methods. Some of the common methods include seniority, performance success,
merit, military service, written exams, interviews, evaluations, recommendations, and assessment
centers (Compton & Granito, 2002). The SFD currently uses a combination of a written
exam and an interview panel to select its Fire Lieutenant candidates; however, in its effort to
improve decisions regarding promotional candidates, the SFD has identified the assessment
center as a better method for accomplishing this.
The problem is that, having identified an assessment center as a better process for selecting
future Fire Lieutenants, the SFD does not know how to develop the assessment center process.
The purpose of this ARP is to make recommendations to the Fire Commissioner as to how the
SFD should develop an assessment center for Fire Lieutenant candidates. The evaluative
research method will be used to answer the following research questions: what are the
components of a promotional assessment center; what would be the internal barriers to the
successful utilization of an assessment center in the SFD; how do similar sized fire departments
in the US that utilize assessment centers conduct their assessment centers; and how should the
SFD ensure the successful implementation of the promotional assessment center?
Background and Significance
Established as a fire department in 1794, today the SFD is a career fire department
composed of 25 civilians, 173 firefighters, 36 Fire Lieutenants, 14 Fire Captains, 8 District Fire
Chiefs, 2 Deputy Fire Chiefs, and 1 Fire Commissioner (Appendix A). The SFD provides
emergency and prevention services to the 153,000 residents of the City of Springfield, who reside
within 32.2 square miles (US Census Bureau).
Promotions in the SFD are governed by a civil service merit system, which is
administered by the Commonwealth of Massachusetts Human Resources Division (HRD) in
accordance with Massachusetts General Law Chapter 31. To assist in implementing its
statutory duty, HRD has promulgated standards that each "civil service department" must follow
when promoting individuals within its respective department. According to HRD, of the 360
fire departments in the Commonwealth of Massachusetts (Commonwealth), 28% are civil service
departments (HRD, 2010).
The current promotional process for Fire Lieutenant in the SFD is a combination of a
civil service multiple-choice written exam and a department interview panel. The written exam,
consisting of 100 multiple-choice questions, is conducted under the supervision of HRD and
is generally held every two years at a neutral site within the Commonwealth. Six months
prior to the administration of the exam, participants are provided with an HRD-generated "reading
list" indicating the texts that the exam questions will cover (Appendix B). At the time the
written exam is administered, participants provide information relative to their education and
experience (Appendix C) that will be weighted and combined with the results of their written
exam. As of July 2011, the weights given to the written exam and the education and experience
rating are 80% and 20%, respectively. The minimum passing net score, which is a product of both
the written exam score and the education and experience rating, is 70%.
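The weighting arithmetic described above can be sketched in a few lines of Python. The 80/20 weighting and the 70% passing threshold come from the text; the function names and sample scores are illustrative only and are not part of any HRD procedure:

```python
def net_score(written_pct: float, ed_exp_pct: float) -> float:
    """Combine the written-exam score and the education/experience
    rating using the 80/20 weighting in effect as of July 2011."""
    return 0.80 * written_pct + 0.20 * ed_exp_pct

def passes(written_pct: float, ed_exp_pct: float) -> bool:
    """The minimum passing net score is 70%."""
    return net_score(written_pct, ed_exp_pct) >= 70.0

# Illustrative candidate: 85% written, 60% education/experience
score = net_score(85.0, 60.0)  # 0.8*85 + 0.2*60 = 80.0, a passing score
```

Note that a strong written score dominates the result under this weighting; a candidate with a weak written score cannot make up the difference through education and experience alone.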
Subsequent to the administration of the exam, HRD reviews the exam for
irregularities, including but not limited to biases and questions that may have more than one
correct answer. After this review, participants receive their net scores, and HRD generates an
"eligibility" list containing the names, ranked by score, of each participant who passed the
written exam. When the SFD determines that Fire Lieutenant vacancies exist and are to
be filled, a written request is submitted to HRD for a "certified" list of candidates.
The certified list is generated from the eligibility list using the formula 2n+1, with "n"
representing the number of vacancies to be filled. The formula ensures that the
certified list contains twice the number of candidates, plus one more candidate, for the
vacancies to be filled. For example, if the SFD elects to fill 3 Fire Lieutenant vacancies, the
certified list would contain the names of a minimum of 7 candidates ranked, in descending order,
by their net score. Candidates who have the same net score are considered to be "tied" and as a
result would appear on the certified list as having equal standing. The certified list is then
sent to the appointing authority for use in the next steps of the promotional process: evaluation
and selection.
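The 2n+1 rule can be illustrated with a short Python sketch. The names and scores below are hypothetical, and a real certified list could exceed 2n+1 names when ties occur at the cutoff:

```python
def certified_list(eligibility, vacancies):
    """Return the top 2n+1 candidates from an eligibility list that
    is already ranked in descending order by net score.

    Ties at the cutoff would share equal standing on the real list,
    which is why 2n+1 is a minimum rather than an exact size."""
    return eligibility[: 2 * vacancies + 1]

# Hypothetical eligibility list of (name, net score), ranked descending
ranked = [("A", 95), ("B", 93), ("C", 90), ("D", 90),
          ("E", 88), ("F", 85), ("G", 82), ("H", 80)]

# Filling 3 vacancies yields a certified list of 2*3 + 1 = 7 names
top = certified_list(ranked, 3)
```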
The evaluation and selection stage of the promotional process is handled by the
appointing authority. To begin this stage, the appointing authority, in this case the
Fire Commissioner, using the 2n+1 rule, notifies the candidates in writing that a vacancy is
going to be filled and that, if a candidate wants to be considered for the promotion, he or she
must report in person to the Office of the Fire Commissioner within a certain time period,
generally two weeks, and so indicate on the certified list by signing it. Once all eligible
candidates have been given the opportunity to indicate a willingness to sign the certification,
the appointing authority notifies the candidates, in writing, of the date and time of their oral
interviews.
The SFD currently uses a panel-type oral interview that is generally made up of at least
2 command officers. The panel uses both behavioral and situational questions to
evaluate each candidate's knowledge, skills, and abilities. Responses are rated on a scale of 1-5,
with 1 being poor and 5 being outstanding. At the end of each interview, each panelist tallies
the candidate's ratings on each question and gives each candidate an interview score. The
interview score is used along with a review of other dimensions, including but not limited to
attendance, education, and work experience. After this review, the appointing authority
makes a selection based on their opinion of who is the most qualified candidate. The entire
promotional process described can take 6 or more months to complete.
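As a minimal sketch of the tally described above, assuming (as one plausible reading) that a panelist's interview score is the sum of that panelist's 1-5 ratings across all questions; the sample ratings are invented:

```python
def interview_score(ratings):
    """Tally one panelist's 1-5 ratings across all questions for a
    single candidate (1 = poor, 5 = outstanding)."""
    if not all(1 <= r <= 5 for r in ratings):
        raise ValueError("ratings must be on the 1-5 scale")
    return sum(ratings)

# One panelist's ratings for a candidate across five questions
score = interview_score([4, 3, 5, 4, 4])  # 4+3+5+4+4 = 20
```

An average across questions, or across panelists, would serve the same purpose; the text does not specify how the individual panelists' tallies are combined.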
A public safety efficiency study completed by Carroll Buracker & Associates in 2006
described the SFD promotional process as "dated and inconsistent with state of the art
promotional practices utilized in other fire departments." To date, there has been very little
progress in modernizing the process. The SFD continues to rely on the results of a civil service
multiple-choice written exam and an oral-interview process that, as of the date of this ARP, has
never been professionally validated. This lack of progress, coupled with the recent findings in
Ricci v. DeStefano, 129 S. Ct. 2658 (2009), a Supreme Court decision in which the City of New
Haven was found to have discriminated against a group of officer candidates, has highlighted
the need for the SFD to revisit its promotional process and make the changes needed to bring it
up to modern standards. To ignore this opportunity for change would do a disservice to the
promotional process in the SFD and call into question the credibility and reliability of the
entire process for future Fire Lieutenant candidates seeking promotion.
This ARP is related to the course content presented in the Executive Fire Officer course
Executive Leadership. More specifically, it is related to the succession planning material
presented in Module 7 of the Executive Leadership Student Manual, which asserts that "the
time spent in developing a selection process helps improve the deployment of human resources
and provides for better fits" (USFA, 2010). The ARP also impacts 4 of the 5 USFA strategic goals,
since a process that identifies the most qualified officer candidate has the probability of
positively influencing an organization's decisions involving prevention, mitigation, response and
recovery, and professional status.
Literature Review
With its over-reliance on HRD for the development and administration of testing
procedures, the SFD has gained no appreciable experience in designing and implementing its
own promotional testing model. This lack of experience has created a disadvantage for the SFD
in that any effort to adopt a different promotional model will first require an understanding
of what an assessment center is and what its different components are. Just as important will be
forecasting any internal barriers, both personnel and structural, that could affect the success
of the changeover to an assessment center. Finally, in order to avoid any common mistakes that
could accompany the change to an assessment center, the SFD should, prior to finalizing any
development efforts, take advantage of the experience of other fire departments that have
conducted assessment centers.
The assessment center is not a new concept. The origins of the assessment center process
can be traced to the early part of the twentieth century, when it was used for officer
selections by both the German Army in WWI (Ward, 2006) and British intelligence during
WWII (Schwabe, 1982). Contemporary use of assessment centers includes both the private and
the public sector. The American Telephone and Telegraph Company began using the assessment
center process for its manager selections (Schwabe, 1982), and the cities of Loveland, CO,
Richmond, VA, and Gillette, WY have all used assessment centers for filling vacancies in
various high-level positions, including but not limited to their city managers and police and
fire chiefs (Taylor, n.d.).
The assessment center is a consistent process that offers advantages considered superior
to those of traditional multiple-choice exams. While multiple-choice exams do offer certain
advantages over other types of assessments, including cost effectiveness, less time to complete,
and the capability of being machine graded (Kuechler & Simkin, 2003), one of their major
disadvantages is that they use just a single method of evaluation. Another disadvantage of
written exams with regard to promotions is that a candidate may perform well on the written
exam but, when faced with the "real-life" situations that confront the company officer, not
perform as well (Ward, 2006). Kouwe (1993) viewed assessment centers used for promotions as
an objective method for "evaluating the performance of promotional applicants within the
context of the actual activities they will perform…" (p. 29). Thornton III and Rupp (2008)
detail the assessment center further by stating that the assessment center is a comprehensive
measurement for assessing the future potential of aspiring employees by evaluating how well
they display those behaviors that have been determined to be necessary if one is to be successful
in the new position. Byham (2011) states that, by using more than one method of evaluation,
assessment centers allow for a more comprehensive analysis of the behaviors deemed most
relevant for a candidate to be successful in the position he or she is vying for. Simply put,
assessment centers, by using several different but related assessment procedures, allow for a
more complete picture of a candidate's ability to succeed in the position contemplated (Byham,
2011; Thornton III, 1988).
To ensure that an assessment center is valid, certain common components should be
present in the process. Since 1975, the International Task Force on Assessment Center
Guidelines (ITFACG) has published the report Guidelines and Ethical Considerations for
Assessment Center Operations (GECACO), which provides guidance for organizations and
professionals wishing to develop and use the assessment center process. In the latest revision,
Rupp, Reynolds, et al. (2009) state that an assessment center must contain the following key
components: (1) a job analysis; (2) a classification of behaviors to be measured; (3) assessment
center techniques; (4) multiple assessments; (5) simulations; (6) multiple trained assessors; (7)
a method for capturing candidate behaviors; (8) individual reports on each candidate; and (9) a
final report containing the notes from each assessor (pp. 245-246). Caldwell, Gruys, &
Thornton III (2003) believed that, to give an assessment center its best chance of being "valid
and predictive," it should be based on the GECACO (p. 10).
One of the most important steps in the development of an assessment center is
the job analysis. Prior to the development of an assessment center, it is essential that a
thorough understanding of the essential behaviors of the target position be captured. Conducting
a comprehensive job analysis (JA) allows the organization to identify these essential behaviors
(Caldwell, 2003; Terpak, 2009). Furthermore, when conducting the JA, the organization should be
looking for those general behaviors that can be used to determine the "dimensions, competencies,
attributes and job performance" deemed a must for the candidate to be successful in fulfilling
the new role (Rupp et al., 2008, p. 245). The criticality of
the JA process is further emphasized by both Neidig and Neidig (1988), who found that, as far as
ensuring that an assessment center is relevant, there is no more important step than the job
analysis, and Sackett (1999), who points out that the job analysis is such a critical step that
an organization's failure to properly conduct the JA invites questions about the validity of the
whole process. Others (Cosner & Baumgart, 2000), stressing the importance of the JA, recommend
that, in addition to reviewing the essential behaviors of the targeted position, to ensure that
the JA is complete there must be a review of the department and/or unit as well as the community
that the department/unit serves.
Within the JA step there are many methods for determining the essential behaviors of a
position. These methods include a review of the position's job description, interviews
with those who hold or have served in the position, and direct observation of incumbents
while they perform the tasks required of the position (Adler, Gold, Gowing, & Morris, 2008).
In their study, A Survey of Assessment Center Practices in Organizations in the United States,
Spychalski, Quinones, Gaugler, and Pohley (1997) found that, in addition to job description
reviews and interviews, many organizations used job questionnaires filled out by both the
incumbent and their supervisor. Neidig and Neidig (1988) prescribed a simpler method: inventory
all of the tasks of the position, rate the tasks from most to least important, and identify the
skills, knowledge, and abilities for those tasks that are the most important to the position.
Once completed, the results of the JA should provide the organization with comprehensive data on
the behaviors of the targeted position.
From the perspective of accuracy, it is important that only relevant and essential behaviors
of the targeted position be captured. Spychalski et al. (1997) determined that it was common for
many organizations, in the job analysis step, to make the mistake of identifying a large number
of behaviors as essential. Identifying too many behaviors can impact the accuracy of the results
of the assessment center. Gaugler and Thornton III (1989) found that, to ensure the accuracy of
assessors' ratings, it is recommended as a best practice that an assessment center limit the
number of behaviors to be evaluated. Due to the possibility of inconsistent assessor ratings and
an increase in the number of assessment center exercises, Thornton III (1992) found that, as a
guide, assessors should not be asked to rate on more than 5-7 dimensions.
Developing exercises for an assessment center is another essential step of the assessment
center process. Caldwell et al. (2003) found several common errors in assessment center design
and list poor exercise development as one of the most common critical errors, due to the failure
of organizations to ensure that the exercises properly measure the behaviors or dimensions
established in the JA stage. These thoughts are echoed by Thornton III (2004), who found that
failing to properly design exercises can lead to inefficiency and, if the exercises discriminate
against a protected group, can also lead to legal challenges to the assessment center. Exercises
that require simulations are considered by many to be the most effective way of determining
future capability and, according to Spychalski et al. (1997), are used by a majority of
assessment center processes. The reason for their popularity is expressed by Byham (n.d.), who
offers that simulation exercises are designed to elicit those behaviors that are the most
relevant to the prospective position being considered. Thornton III (2004) believed that
exercises involving job simulation are essential to any successful assessment center. Spychalski
et al. (1997) list some of the core exercises in use, including in-baskets, leaderless group
discussions, role playing, and interviews. Taylor (n.d.) adds that the use of presentations,
both oral and written, is also common in many assessment centers.
In-basket exercises are meant to reflect what one would find in the "office in-basket" of
someone holding the prospective position and are generally comprised of several different
communications, including but not limited to reports, memos, phone messages, and other types of
written correspondence (Taylor, n.d.). The participant is given a set time to prioritize and
take action on each of the items and to explain the rationale for those actions. Due to the
increased need for personnel to work as part of a team, the leaderless group discussion
(LGD) has become an important part of the assessment center. In an LGD exercise, assessment
center participants are presented with an issue, given limited instruction, and then, with or
without predetermined roles, asked to solve the issue as a team while operating under time
constraints and the requirement that the group reach a decision by consensus (Schwabe, 1982).
Michelson, Maher, and Curran (2001) state that the value of the LGD is that it allows each
participant to demonstrate dimensions that may indicate a future ability to lead a group,
including but not limited to interpersonal skills, initiative, and persuasiveness. In the role
playing exercise, Taylor
(n.d.) states that candidates are asked to pretend they are serving in the sought-after position
and interact with an individual whom they may encounter during the normal course of their duties.
Many find the idea of role playing difficult. Rowe (2001) found that many candidates are not at
ease in the role playing exercise and have complained that, of all of the exercises they are
asked to perform, they are the most uncomfortable in the role play. Part of the reason for the
dislike of the role play may be participants' confusion concerning what is being asked of
them. Thornton III and Rupp (2004) suggest that participants may feel more at ease if they
display the behaviors according to their usual style rather than taking the term role playing
literally and attempting to "play a role." The interview exercise is generally the last of the
exercises used, and it allows the observers to ask clarifying questions of each candidate that
may assist them in rating the candidate's performance (Taylor, n.d.). The decision regarding
which of the aforementioned exercises to use, and how many, may be a difficult one. Gaugler et
al. (1987) found that the accuracy of the assessment center in predicting the most qualified
candidate was a product of both the number and the type of exercises used. With regard to
promotional assessment centers, Thornton III (1992) recommends approximately 3-5 exercises that
measure global traits. Once an organization has determined the type and number of exercises, the
next step in the assessment center process involves evaluating the candidates as they perform
each of the exercises.
The evaluation step is accomplished by selecting and assigning trained assessors whose
primary role is to observe and report how each candidate performs compared to a
predetermined set of performance indicators (Lowe, 2006). When selecting assessors,
organizations must thoroughly examine the backgrounds of those being considered. As
part of the initial screening of would-be assessors, Taylor (n.d.) suggests that, generally, the
individual should have a background in the position being evaluated and serve as a supervisor
of the position. Similar findings are offered by Rupp et al. (2008), who indicate that when
selecting individuals to serve as assessors there should be a primary focus on having a diverse
representation among assessors, including selecting individuals from different racial and ethnic
backgrounds as well as diverse experience levels. Rupp et al. also suggest, again for the
sake of diversity, that managers, psychologists, and supervisors be included in the assessor
pool, with the special note that, when selecting supervisors to serve as assessors, it is a best
practice not to include a participant's direct supervisor. Spychalski (1997) found that
supervisors serving in the assessor role were generally two levels above the level that the
candidate was seeking. Assessors need not come from the organization seeking to conduct an
assessment center. Byham's (n.d.) opinion regarding the selection of in-house assessors is that,
due to constraints of qualifications and time, organizations may not be able to field an ample
number of assessors within their own ranks. Under these circumstances, Taylor (n.d.) states that
organizations may benefit more by using assessors from outside agencies, because representatives
from outside organizations may increase the perception of the fairness and validity of the
results of the assessment center. However, Thornton III and Hanson (2009) provide a contrary
opinion: they believe that external observers could have a negative impact on the assessment
center due to their lack of detailed knowledge of the position being assessed and of the
organization.
In addition to reviewing the backgrounds of assessors, the number of assessors that
will be needed and the type of training to provide them must also be considered. There are many
considerations when trying to determine the number of assessors needed. The GECACO recommends
that multiple assessors be used for the exercise component and that the factors to be considered
in determining the number needed include but are not limited to "the experience of the assessors,
the amount of training provided and the type of exercises used" (Rupp et al., 2008). Spychalski
et al. (1997) found that promotional assessment centers evaluate more candidates than other
types because they use fewer exercises, which allows assessors to evaluate more candidates.
The proper training of assessors also plays a major role in the whole assessment center
process and has an impact on the credibility of the entire process. Improper training can affect
the ability of the assessors to objectively and effectively rate the performance of the
candidates (Caldwell et al., 2003). According to Thornton III and Hanson (2009), in order to
ensure consistency in the development of assessor training programs, organizations should take
into account several factors, including the length of the training, the format of the training,
content, the location and scheduling of the training, and assessor performance and feedback.
Another opinion regarding assessor training is that the training of assessors plays an "integral
part of the assessment center program," and Rupp et al. (2008) offer that, to ensure quality, it
is important that an assessor understand the "purpose of the assessment center, be able to
recognize, observe, rate, and report candidate behavior, administer exercises, be consistent in
their role playing parts, and provide meaningful feedback" (p. 248). Consistency in the ratings
of assessors is a concern of Taylor's (n.d.), who recommends that assessors, regardless of their
current skill and knowledge level, be required to undergo mandatory training.
There are many internal obstacles or barriers that can affect an organization's ability to
successfully use an assessment center process. According to Stiber (n.d.), "the difference
between an effective and ineffective …assessment program lies in the quality of planning…"
Thornton III (1992) also found, in his experience, that poor planning was a major flaw in
assessment center failures, including failing to garner the overall support of those serving at
the highest levels of the organization, including their commitment to dedicate the time and
finances necessary to use and sustain the process; failing to ensure that there is a continual
cadre of line staff to keep the effort going; and a general failure to properly understand what
it takes to conduct an efficient and effective assessment center. Jeanneret (1989) (as cited in
Thornton III, 1992, p. 217), in a study of an individual organization's assessment center
efforts, found that there must be a plan in place to ensure that any changes in job
responsibilities or duties are reflected in the assessment center, or the organization faces the
risk that the relevance of the assessment center becomes an issue. Failing to properly plan for
other qualification analyses can
affect assessment center use as well. Using the assessment center as the only measurement of
qualifications can also affect its success. Thornton III and Byham (1982) believed that the
effective utilization of an assessment center occurred not when the process was used in
isolation but when it was used as part of an overall selection system. Lowry (1996) recommended
that a plan for a systems approach to selection should ensure that the assessment center process
is part of an overall plan that includes other dimensions such as interviews, background checks,
and, in some cases, medical exams.
There are several factors that need to be considered when an organization is ready to
implement an assessment center. Primary among these considerations is the reaction of the
employees in the organization who are going to be affected by the new initiative. Schwabe
(1999) found that employee reactions can be both positive and negative and that, regardless of
the nature of the reaction, it is critical that the organization "keep employees informed and
involved in the assessment center from the start" (p. 19).
The key to the successful implementation of an assessment center, especially in union
environments, is the organization's commitment to communicating with its employees prior to
any implementation effort. The Bureau of National Affairs (2008) found that many collective
bargaining agreements (CBAs) contain language governing promotions, including the process
used to determine ability. Depending on the restrictions a CBA places on an organization,
implementing a measurement method that is contrary to the CBA may require extra effort in the
form of collective bargaining. Absent promotion restrictions in a CBA, organizations have been
granted wide latitude in determining the method for measuring ability, as long as the
measurement tool produces results that are "fair and non-discriminatory" (Elkouri & Elkouri,
2003, p. 882).
Additional requirements to the implementation of an assessment center may have to be
met in organizations that are governed by state civil service agencies. Under the Massachusetts
civil service system, fire departments can elect to use either a traditional, multiple-choice exam
or an assessment center as part of their promotional process. According to HRD, there are three
types of assessment centers allowed under civil service: the weighted-graded, the sole ranking,
and the post-list (HRD, 2007). The weighted-graded assessment center process incorporates the
results of the assessment center, the multiple-choice exam, and the experience and education
worksheet (Appendix E) into one final score. The sole ranking assessment center
allows for the generation of the final score based on only the assessment center and the
experience and education worksheet. In both the weighted-graded and the sole ranking
assessment center process, it is a requirement that the requesting agency hire a qualified
consultant, who is approved by HRD, to develop and conduct the assessment center portions of
the process. The final type of assessment center permitted by HRD is the post-list assessment
center (PLAC) that allows a fire department to conduct their own assessment center without
hiring a consultant. The PLAC requires candidates to take and pass the multiple-choice exam and
then, using the 2n+1 formula, allows the fire department to use the assessment center along with
the interview as part of the final selection process. Information available from HRD indicates
that, overall, agencies use the assessment center option infrequently in Massachusetts, with the
sole-ranking model used only slightly more often than the weighted-graded model (L. Columbre,
personal communication, September 24, 2011). The multiple-choice exam continues to serve as
the predominant testing procedure used by most civil service fire departments in
Massachusetts.
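The scoring mechanics described above can be illustrated with a short sketch. This is only a hypothetical illustration: the component weights, candidate names, and scores below are invented for the example (the actual weights are set by HRD), while the 2n+1 rule follows the description given in this paragraph.

```python
# Hypothetical sketch of the weighted-graded scoring model and the 2n+1
# certification rule. The weights and candidate data are illustrative
# assumptions, not actual HRD values.

def weighted_graded_score(assessment, exam, worksheet,
                          weights=(0.4, 0.4, 0.2)):
    """Combine the assessment center, multiple-choice exam, and
    experience/education worksheet scores into one final score."""
    w_a, w_e, w_w = weights
    return w_a * assessment + w_e * exam + w_w * worksheet

def certified_candidates(ranked_names, vacancies):
    """Under the 2n+1 formula, n vacancies permit consideration of the
    top 2n + 1 candidates from the ranked eligibility list."""
    return ranked_names[:2 * vacancies + 1]

scores = {
    "Candidate A": weighted_graded_score(90, 84, 75),
    "Candidate B": weighted_graded_score(82, 95, 80),
    "Candidate C": weighted_graded_score(88, 80, 70),
    "Candidate D": weighted_graded_score(70, 75, 60),
}
ranked = sorted(scores, key=scores.get, reverse=True)
print(certified_candidates(ranked, 1))  # one vacancy -> top 3 considered
```

With three vacancies, the same rule would certify the top seven names; a post-list assessment center would then be applied within that certified group.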
Procedures
Research into assessment centers began at the Learning Resources Center (LRC) located
at the NFA. A review of published literature on assessment centers at the LRC included books,
periodicals, and web-based material. A similar review continued at the author’s local library. It
was determined that, to ensure the recommendations to the Fire Commissioner as to how the
SFD should develop an assessment center for Fire Lieutenant candidates were comprehensive,
in addition to the review of published literature on assessment centers, a questionnaire designed
to elicit the promotional assessment center experience of other US fire departments would be
required.
The first step in the questionnaire process was to identify a group of fire departments with
staffing levels similar to the SFD’s and designate them as questionnaire recipients. Using the
SFD FY 2010 staffing level of 245 as a guide, along with information provided by both
Neroulas and Roche (2009) and the website usfiredept.com, research was conducted into the
staffing profiles of US fire departments to identify those that most closely resembled the SFD.
It was clear from the initial research that, to ensure the questionnaire reached enough fire
departments that could comment on assessment center use, a staffing range would better serve
the intent of the questionnaire. Research indicated that there were 81 fire departments with
between 200 and 300 staff and 120 fire departments with staffing levels between 200 and 400.
Due to the uncertainty surrounding the number of fire departments that utilize the assessment
center, it was determined that fire departments with staffing levels in the range of 200-400
would provide an improved opportunity to capture a sufficient number of departments with
promotional assessment center experience.
There were 132 fire departments identified with staffing levels in the range of 200-400
(Appendix D). Of the 132 fire departments, the majority (97%) were municipal or county fire
departments, with the remainder (3%) representing federal or state fire resources. The mean,
median, and mode staffing levels of the 132 fire departments were 288, 260, and 250,
respectively. Using information provided by usfiredept.com, fire department websites, and
municipal websites, those fire departments with email information available were sent email
requests to participate in the questionnaire process, and the remainder were sent requests by
way of the US Postal Service.
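The three descriptive statistics reported above can be reproduced with Python's standard statistics module. The staffing list below is a small contrived example chosen so that its mean, median, and mode match the reported values of 288, 260, and 250; it is not the actual 132-department data set.

```python
# Contrived staffing data whose mean, median, and mode match the values
# reported for the 132 departments; the real data set is not reproduced here.
import statistics

staffing = [250, 250, 250, 255, 265, 300, 330, 404]

print(statistics.mean(staffing))    # arithmetic mean: 288
print(statistics.median(staffing))  # median: 260 (average of 255 and 265)
print(statistics.mode(staffing))    # mode (most frequent value): 250
```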
The request to participate in the questionnaire took the form of a
correspondence (Appendix E) and indicated that the questionnaire was part of an NFA EFOP
ARP and that the results would be used to assist the SFD in the development and implementation
of its own assessment center. The request also provided a web link to the online
questionnaire and notified recipients that the questionnaire results may also become part of the
permanent collection of ARPs located at the NFA. The questionnaire consisted of 10 multiple-
choice questions and addressed the promotional assessment center process employed by the
organization. The format of the questionnaire was designed using the website
Surveymonkey.com. Of the 120 fire departments that were sent survey requests, 80 completed
the survey.
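For context, the figures in this paragraph imply roughly a two-thirds response rate; the calculation is a one-liner:

```python
# Response rate implied by the figures above: 80 completed surveys out
# of 120 requests sent.
sent, completed = 120, 80
response_rate = completed / sent * 100
print(f"Response rate: {response_rate:.1f}%")  # 66.7%
```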
Results
The purpose of the research presented in this ARP is to provide guidance to the SFD
as to how it can develop its own assessment center for filling future Fire Lieutenant
vacancies. A review of published literature, coupled with the experience of other fire departments
in the US, suggests that the successful development and implementation of a promotional
assessment center is contingent on both adherence to several assessment center concepts and
the successful forecasting of obstacles within the organization contemplating the implementation
of an assessment center.
Research indicates that a promotional assessment center should contain several essential
components if it is to be considered a valid assessment center. Professional guidance provided
by the GECACO stresses that these components include: a complete job analysis of the position,
so that the essential behaviors can be captured and classified; assessment center exercises that
provide for multiple assessments of these behaviors, including job simulation exercises;
evaluations conducted by multiple trained assessors who utilize a common method for capturing
candidate behaviors; and both individual candidate reports and a final report containing the
notes from each assessor (Rupp, Reynolds, et al., 2009, pp. 245-246). Earlier
research conducted by Lowry (1996) found that the majority of surveyed organizations used a
job analysis (80%), but that fewer (45%) provided final reports. Subsequent research by
Spychalski, Quinones, Gaugler, and Pohley (1997) found that most of the organizations they
surveyed reported using a job analysis (93%) and measuring relevant behaviors (81%); most
used multiple assessments, including in-baskets (82%), though fewer (54%) reported using
simulations; and most (82%) provided assessor training of some kind.
The results of the 10-part questionnaire (Appendix F) suggest that, of the fire departments
that utilize assessment centers for promotion, the majority mirror the
GECACO guidelines.
Question 1 of the questionnaire was designed to find out how many fire departments use
promotional assessment centers. Of the 80 respondents to the questionnaire, approximately two-thirds
(68.2%) reported that their departments used an assessment center for selection and/or promotion
decisions.
Question 2 requested information regarding the components that make up each
department’s promotional assessment center. All (100%) reported that their departments utilize
multiple assessors; a majority (89.7%) indicated that multiple assessment exercises are part of the
center, and the same share (89.7%) stated that they use simulations as part of these assessments.
Most (62.1%) indicated that their assessors receive training; slightly more (65.5%) reported
providing assessor feedback to participants. Fewer (62.1%) reported the use of a job analysis
step, and fewer still (58.6%) reported validating the process.
Question 3 inquired into the process used to determine who would participate
in the promotional assessment center. Most (67.7%) answered that a written exam is used, fewer
(19.4%) indicated that they measured seniority as a criterion, and fewer still (9.7%) used an
interview process to determine eligibility. Some fire departments (35.5%) used some other
measure for determining participation.
Question 4 dealt with assessor selection and specifically requested information regarding
who serves as an assessor in the department’s assessment center. The majority (80.6%) stated
that their department used personnel from other fire departments. Less than half
(41.9%) reported that the assessment center used supervisors more than one rank above the
position being assessed, and about one-third (32.3%) used supervisors only one rank above.
None of the responding departments indicated the use of psychologists as assessors.
Question 5 asked for information regarding the type of assessments or exercises used in
the department’s assessment center. Most of the respondents (87.1%) indicated that their
assessment center included a role-play interview exercise, and many (77.4%) responded
that the process included both an interview and oral presentations. Seventy-one percent stated
that an in-basket exercise is used; written presentations were reported as part of the exercise
battery by a little more than half (58%); and far fewer (16.1%) reported using the leaderless
group exercise. Twenty-six percent listed “other” exercises as being part of their department’s
assessment center.
Question 6 was designed to elicit information regarding the costs, per participant, of
conducting an assessment center. Eighteen percent indicated that their department spends up to
$100 for each participant in their assessment center. Slightly more (21.4%) believed that their
department’s costs were in the range of $101-$500 per participant. Costs in the
$501-$1,000 range were reported by a few departments (17.9%), and very few (13%) spend
over $1,000 per person to conduct an assessment center. More than one-third (35.7%) do not
know how much is spent on each candidate who participates in their organization’s promotional
assessment center.
Question 7 asked how long, in days, the assessment center takes.
About half (51.6%) of the promotional assessment centers take three or more days. Slightly
fewer (45.2%) take 1-2 days, and very few (3.2%) take less than one day.
Question 8 asked departments to indicate where the assessment center usually takes
place. Many (61.3%) stated that the promotional assessment center is conducted in a fire
department facility. The remainder (38.7%) reported that the process occurs somewhere other
than the fire department.
Question 9 dealt with the length of time it takes to deliver the results of the assessment
center. Slightly less than half of respondents (45.2%) take more than four days to deliver results,
while about one-third (32.3%) reported that the delivery of results took 2-4 days, and fewer
(22.6%) reported that results were delivered in one day or less.
Question 10 asked respondents whether those who were not
selected were given feedback on their performance. The majority of departments (83.9%)
indicated that they do provide this feedback to participants. The rest (16.1%) indicated that they
do not provide feedback to those who were not selected.
The literature suggests that the successful utilization of an assessment center also depends
to some degree on the ability of the organization to identify and address any internal barriers that
could impact its use of the process. Internal barriers, like poor planning, misperceptions, and
failing to update the assessment center, are within the organization’s ability to influence.
Findings point to the major role that proper planning plays in the successful adoption of
assessment centers. The importance of planning in the development and implementation of an
assessment center is stressed by Maher (1985), who found that, due to the complexities involved
in conducting assessment centers, including but not limited to the number of candidates to be
measured and the number and types of exercises, organizations should reserve 6-12 months of
planning time before administering any assessment center. Research provides some specific
examples of how poor planning resulted in the failure of some assessment centers. Thornton III
(1992) found that failing to enlist the early support of key decision makers, including upper
management; failing to educate all involved regarding the time and financial requirements;
using assessment center results for other than their intended purpose; and failing to ensure that
the dimensions measured are relevant to success in the position are all examples of poor
planning.
Another internal obstacle to overcome is organizational and individual misperceptions
about what assessment centers are. Some organizations, unfamiliar with the limits of assessment
centers, have mistakenly relied only on the results derived from the assessment center process in
making promotional decisions. Thornton III and Byham (1982) point out that organizations using
assessment centers as the sole determiner of qualifications fail to comprehensively measure
ability; instead of using an assessment center in this manner, they propose that real
success is obtained when the assessment center is used as part of an overall promotional system.
The systems approach to promotions is shared by Lowry (1996), who stated that other
dimensions, such as interviews and background checks, should be considered when making
promotion decisions. Individual perceptions can also affect the successful use of an assessment
center, especially when the organization is changing the way things have always been done.
Kinicki and Kreitner (2003) determined that employee reaction to change in an organization is
very common and in some cases can result in resistance and the subsequent failure of an
organization’s initiatives, no matter how “technically or administratively perfect” the change is
(p. 330). Schwabe (1982) found that employees tend to react negatively when assessment centers
are introduced as a change and that these negative reactions can have a detrimental impact on the
effectiveness of an assessment center (p. 19). Stiber (n.d.) states that the degree of success
experienced by an organization’s implementation effort is directly related to the level of
communication the organization engaged in prior to implementing the assessment center,
including addressing such issues as the reasons for implementing the assessment center and how
it will benefit the employees and the organization. These communications take on a heightened
importance in environments that involve unionized employees, since many CBAs dictate
how promotions will be made (Bureau of National Affairs, 2008). In these circumstances the
implementation of a promotional assessment center may require the organization to engage in
collective bargaining.
Ensuring that the duties and responsibilities of the targeted position are current and
reflected in the exercises addresses another barrier to the successful utilization of the assessment
center. Jeanneret (1989) (as cited in Thornton III, 1992) found that performing periodic reviews
of the targeted position’s duties and responsibilities ensures that the assessment center exercises
are current, and that a failure to ensure that the exercises reflect a contemporary view of the
position, and of what it takes to perform in the position successfully, may impact the relevance
of not only the exercise portion of the assessment center but the whole process in general. A
similar opinion is offered by Caldwell et al. (2003), who found that when the organization
designing the exercises bases them on the actual duties and responsibilities of the position, it
can in most cases produce the desired results. Once any barriers are identified and a strategy is
developed to eliminate or minimize them, an organization can move on to planning the
implementation of the assessment center.
Discussion
The findings represented in this ARP show that an organization willing to commit the
time and financial resources can develop a successful promotional assessment center. Research
suggests that the most effective promotional assessment centers are those that are part of an
overall selection system and contain the essential components. The experience of fire
departments in the US shows that the majority of surveyed fire departments that use promotional
assessment centers do include, to some extent, the key components provided by the ITFACG.
The successful utilization of an assessment center depends a great deal on the ability of the
organization to forecast and resolve the internal barriers that could result from the introduction of
the assessment center. Finally, the results indicate that, to achieve the successful
implementation of an assessment center, the key decision makers and users of the assessment
center must be brought in early enough that they are informed and involved throughout the
entire process.
It was somewhat satisfying to learn that approximately two-thirds (68.2%) of the fire
departments that completed the questionnaire indicated that they use an assessment center for
promotions. Previous research by Yeager (as cited in Lowry, 1996) into assessment center use by
both police and fire departments found that only 44% of the departments surveyed reported using
assessment centers. The current survey results indicate an increased awareness within the fire
service of the benefits that an assessment center can have in selecting supervisory personnel.
It is important that all parties involved in the development of an assessment center
understand what an assessment center is and what it is not. In its purest form, an assessment
center is an evaluation method used to determine the most qualified candidate through multiple
testing techniques designed to “simulate” actual job duties and responsibilities (Taylor, n.d.).
At the same time, an assessment center is not a panacea. To ensure that the most qualified
candidate is selected, it is important that assessment center results be coordinated with the
results of other measurements, including but not limited to oral interviews, written tests, and
experience and background checks (Thornton III & Byham, 1982; Lowry, 1996). Survey results
agree with the literature in that 89.7% of respondent fire departments indicated that simulations
are used as part of their assessment center process. Also, a majority of fire department
respondents use other means of evaluation in conjunction with assessment center results,
including written exams, interviews, and seniority.
The current assessment center guidelines provided by Rupp et al. (2009) evolved from
previous guidelines and were developed primarily to provide continued professional guidance
and standards to developers and users of assessment centers. One of the keys to the
developmental success of a promotional assessment center for Lieutenants in the SFD lies in the
willingness of the SFD to incorporate as many of the standards outlined by Rupp et al. as
possible into its own assessment center. A majority of the other fire departments surveyed
reported that their promotional assessment centers included many of the recommendations in
Rupp et al.; however, it is difficult to determine whether the inclusion of these recommendations
is a result of familiarity with the guidelines. The importance of following some standardization
is also revealed in cases where the assessment center is challenged legally and the defending
organization must show that the design, use, and results of its assessment center were not
arbitrary, capricious, or discriminatory (Thornton III, 1992).
Like many other organizational initiatives, efforts toward the successful implementation
of a promotional assessment center can be impacted by either internal or external barriers.
External barriers, such as budget cuts, may have an impact on the organization and its
initiatives, but they are generally outside of the control and influence of the organization.
Internal barriers, like poor planning, misperceptions, and failing to update the assessment
center, are within the organization’s ability to influence.
Caldwell et al. (2003) found that poor planning was one of the major reasons for the
failure of assessment centers. Proper planning for an assessment center involves considering
several issues (Thornton III, 1992) and can take anywhere from 6-12 months (Maher, 1985).
Another internal barrier to the implementation of an assessment center involves using assessment
center results for purposes other than those for which they were intended (Stiber, n.d.). Assessment center
results should not be the only measurement relied on to make promotional decisions. For an
assessment center to provide the most value, it should be used as part of an overall system of
evaluation that includes other types of measurements such as interviews, written tests, and
experience and background checks (Thornton III & Byham, 1982; Lowry, 1996). A systematic
approach to promotion, in which the results of an assessment center are coupled with the results
of other factors, provides the most effective way of determining the most-qualified candidate
(Thornton III & Byham, 1982; Thornton III, 1992) while at the same time ensuring that the
selection system can withstand any claims that it is arbitrary, capricious, or discriminatory
(Thornton III & Byham, 1982).
Reaction to change is common in organizations, and failing to forecast its impact can
affect the success of organizational initiatives (Kinicki & Kreitner, 2003). The implementation
of a promotional assessment center is the kind of change that employees will sometimes react
negatively to. To better manage the reactions that may surface as a result of the implementation
of an assessment center, Stiber (n.d.) believes that organizations should communicate with
employees early in the process about issues such as the objective of the assessment center, why
it is needed, and how both the organization and the employees will benefit from its
implementation. Collective bargaining may be needed in organizations operating under a CBA
that governs the promotional process.
It was disappointing to learn that only 63% of promotional assessment centers made use
of the JA step. This is in sharp contrast to the literature, which points out how critical it is to
conduct a thorough review of the essential duties and responsibilities of the position being
evaluated (Caldwell, 2003; Terpak, 2009). The lack of a comprehensive JA, coupled with the
finding that even fewer (59%) spend any time validating their assessment center, indicates that some of the
fire departments may be conducting assessment centers without the knowledge provided by
the GECACO. The survey did not include any inquiries regarding the maintenance of the JA;
however, it can fairly be assumed that, for those departments that are not conducting any
analysis of the targeted job duties and responsibilities, any changes in these duties and
responsibilities may not be reflected in future assessment centers for the position. Organizations
must understand that without the completion of a JA, considered to be an essential step in the
development of an assessment center, the credibility of the entire assessment center can be in
doubt (Sackett, 1999).
The development of an effective assessment center is a complex effort. However, the
value that it can contribute to an organization’s promotional system may outweigh the time and
effort required to produce it. Throughout the entire effort, it is wise to ensure that the key
personnel who will be needed to make the assessment center successful understand the time
and commitment that is necessary. Without their commitment, the assessment center risks
failure. Regarding the design, there is a wealth of knowledge available, both from those in the
human resources field and from those in the fire service, that can greatly assist an
organization’s assessment center efforts. Concurrent with the design of the assessment center,
organizations should look ahead and try to identify any barriers, internal to the organization,
that could impact the subsequent implementation of the assessment center. Developing
strategies to manage these barriers may make the implementation process that much easier.
Recommendations
Research suggests that, with a properly designed and executed plan, the SFD can develop
and implement a successful promotional assessment center. It is the purpose of this ARP to
provide the Fire Commissioner of the SFD with the following recommendations regarding how
the SFD should develop an assessment center for Fire Lieutenant candidates.
Lieutenants in the SFD are covered by a CBA; therefore, the SFD has to ensure that its
promotional procedures are consistent with said CBA. In addition to the CBA concerns,
promotions are also subject to MGL provisions that impose additional requirements upon the
SFD. It is highly recommended that the Fire Commissioner discuss the contemplated use of a
post-list promotional assessment center for future Lieutenant vacancies with the city’s labor
relations counsel to determine what obligations there will be under the CBA and/or the MGL.
The rationale behind recommending the post-list promotional assessment center is (1) the
consultant costs associated with adopting either the weighted-graded or sole ranking assessment
center, and (2) the fact that it does not affect the ranking of the candidates. Prior to taking any
further action, it is recommended that the Fire Commissioner first satisfy any of the legal and/or
collective bargaining obligations that will arise as a result of the implementation of the
assessment center.
Once the collective bargaining and legal concerns are addressed, the Fire Commissioner
should organize a cross-functional team to spearhead the development, implementation, and
support of the promotional assessment center effort. Ideally, the team should include, at a
minimum, representatives from the SFD, the union, human resources, and law. If finances
allow, an independent assessment center consultant (IACC) should be retained as part of the
cross-functional team. The IACC would coordinate the efforts of the team and need only be
retained for the initial development and implementation efforts. If an IACC is not retained,
then, to ensure that best practices are included in the development and implementation efforts,
the team should be encouraged to follow the recommendations provided by the ITFACG. In addition to the
development and implementation responsibilities, the team should be charged with producing
and communicating the purpose for the assessment center, determining the methods for
evaluating the success of the assessment center, and identifying the short- and long-term needs
required to maintain the assessment center. The end product for the team should be a document
containing recommendations for the development, implementation, and support of the
promotional assessment center that can be presented to the Fire Commissioner for review and
subsequent approval.
Once the team’s recommendations have been approved, the Fire Commissioner is
urged to make them part of the SFD’s promotional system governing Lieutenant promotions.
This will ensure that employees are aware of the entire process and of the expectations that the
SFD has of its future Lieutenant candidates.
Reference List
Adler, S., Gold, M., Gowing, M., & Morris, G. (2008). The Next Generation of
Leadership Assessments: Some Case Studies [Electronic version]. Public
Personnel Management, 37(4), Winter 2008.
Bachtler, J., & Brennan, T. (1995). The Fire Chief’s Handbook. PennWell Publishing
Company: Saddle Brook, NJ
Bureau of National Affairs (2008). Grievance Guide 12th Edition. BNA Books:
Arlington, VA
Byham, W. (n.d.). The Assessment Center Method, Applications, and
Technologies [Electronic version]. Development Dimensions: Pittsburgh, PA
Caldwell, C., Gruys, M., & Thornton III, G. (2003). Ten Classic Assessment Center
Errors: Challenges to Selection Validity [Electronic version]. Public Personnel
Management, 32(1), p. 73.
Compton, D. & Granito, J. (2002). Managing Fire and Rescue Services. ICMA:
Washington D. C.
Cosner, T., & Baumgart, W. (2000). An Effective Assessment Center Program: Essential
Components [Electronic version]. FBI Law Enforcement Bulletin, 69(6), June
2000.
Elkouri, F., & Elkouri, E. A. (2003). How Arbitration Works, 6th Edition. BNA Books: Washington, D.C.
Gaugler, B., & Thornton III, G. (1989). Number of Assessment Center Dimensions as a
Determinant of Assessor Accuracy [Abstract]. Journal of Applied Psychology, 74(4),
August 1989, p. 611.
Human Resources Division (2007). Assessment Centers-Use in Civil Service Promotions.
Retrieved from www.mass.gov/Eoaf/docs/hrd/cs/publications/ac/cs_ac_info.doc on July
22, 2011.
Human Resources Division (2010). Fire Departments covered by Civil Service as of
October 13, 2010. Retrieved from
www.mass.gov/Eoaf/docs/hrd/cs/publications/fire_departments_covered_by_civil
_service.rtf on July 2, 2011
Kinicki, A. & Kreitner, R. (2003). Organizational Behavior: Key Concepts, Skills & Best
Practices. McGraw-Hill Companies: New York, NY
Kouwe, P. (1993). Assessment Center Processes for Officer Selection. The Voice, June 1993,
22(6), pp. 29-30.
Kuechler, W., & Simkin, M. (2003). How Well Do Multiple Choice Tests Evaluate Student
Understanding in Computer Programming Classes? [Electronic version]. Journal of
Informational Systems Education, 14(4).
Michelson, E., Mahar, P., & Curran, B. (2001). Preparing for Promotion: A Guide for Public
Safety Assessment Centers, 2nd Edition. Law Tech Publishing LTD: San
Clemente, CA
Neidid, P. & R. (1988). Developing an Assessment Center [Electronic Version]. Fire
Chief. April 1988.
Neroulas, E., & Roche, K. (2010). 2009 National Run Survey-Department Profiles
[Electronic version]. Firehouse Magazine, June 2010. Retrieved from
www.firehouse.com/files/article_pdfs/Profiles610.pdf on July 5, 2011.
Rowe, T. (2006). A Preparation Guide for the Assessment Center. Charles C. Thomas
Publishing: Springfield, IL
Rupp, D., Reynolds, et al. (2009). Guidelines and Ethical Considerations for Assessment Center
Operations. International Task Force on Assessment Center Guidelines. International
Journal of Selection and Assessment [Electronic version], 17(3). Blackwell
Publishing: Malden, MA.
Schwabe, J. (1982). Assessing employee potential: using assessment centers in local
government. International City Management Association: Washington D.C.
Stiber, A. (n.d.). Implementation Planning for Multirater Assessment Success. Retrieved from www.ddiworld.com/DDIWorld/media/white-papers/implementationplanningformultirater_wp_ddi.pdf?ext=.pdf on July 19, 2011.
Spychalski, A., Quinones, M., Gaugler, B., & Pohley, K. (1997). A Survey of Assessment Center Practices in Organizations in the US [Electronic Version]. Personnel Psychology, 50.
Taylor, M. (n.d.). Assessment Centers for Hiring and Development. ICMA: Washington, D.C.
Terpak, M. (2009). Promotional Assessment Centers: Understanding the Process. Retrieved from Fire Engineering on July 22, 2011.
Thornton III, G., Byham, W. (1982). Assessment Centers and Managerial Performance.
Academic Press: London, England
Thornton III, G. (1992). Assessment Centers in Human Resource Management. Addison- Wesley
Publishing Company, Inc.: Reading, MA
Thornton, III, G., Mueller-Hanson, R (2004). Developing Organizational
Simulations: A Guide for Practitioners and Students. Lawrence Erlbaum
Associates: Mahwah, New Jersey
Thornton, III, G., Rupp, D. (2006). Assessment Centers in Human Resource
Management. Lawrence Erlbaum Associates: Mahwah, New Jersey
US Census Bureau. (2010). City of Springfield, MA Quick links [Data file]. Retrieved from www.quickfacts.census.gov/qfd/states/25/2567000lk.html on July 1, 2011.
United States Fire Administration (2010). Executive Leadership Student Manual, 5th Edition. Emmitsburg, MD
US Fire Departments (n.d.). USA Fire Departments by Career Firefighters [Data file]. Retrieved from www.usfiredept.com/usa-fire-departments-carreers.html?page=2 on July 1, 2011.
Ward, M. (2006). Fire Officer Principles and Practices. Jones and Bartlett Publishing:
Sudbury, MA
Yeager, S. (1986). Use of Assessment Centers by Metropolitan Fire Departments in North America [Abstract] [Electronic version]. Public Personnel Management, 15 (Spring).
Appendix A
Springfield Fire Department Organization Chart
Appendix B
HRD Fire Lieutenant Reading List
TO: Fire Department Appointing Authorities/Fire Chiefs
FROM: Director, Civil Service
DATE: May 21, 2009
SUBJECT: READING LISTS FOR THE PROMOTIONAL EXAMINATIONS

FIRE LIEUTENANT
FIRE CAPTAIN

EXAMINATION DATE – SATURDAY, NOVEMBER 21, 2009
Candidates are responsible for reading the texts and other materials listed below and all pages of this announcement. Please note carefully which edition and date of publication is listed for each item. Questions will cover the reference in its entirety unless otherwise indicated. All examination questions will be based on these materials.
LIEUTENANT AND CAPTAIN
Fire and Emergency Services Company Officer, Fourth Edition, 2007. (IFSTA)
Lieutenants: Part A, Chapters 1-21 and Part B, Chapters 22, 23, 24, 26, 31, and 32.
Exclude appendices and glossary.
Captains: Entire book except appendices and glossary.
NOTE FOR ALL APPLICANTS: Chapter 23 (only pp. 537-558, start at beginning of p. 537 and stop at p. 548 before “Provincial and Territorial Governments”; start at p. 549 “Agencies of State and Provincial Governments” and stop at p. 558 before “Canadian Federal Government”).
Fire Inspection and Code Enforcement, Seventh Edition, 2009. (IFSTA)
Chapters 3-13 and 17, and all glossary terms related to these chapters. Exclude appendices.
First Responder, Bergeron, J. D. and Le Baudour, C., Eighth Edition, 2009. (Brady Prentice Hall)
Chapters 1-7: Entire chapters.
Chapter 8: Only pp. 206-234 (start at p. 206 “Cardiopulmonary Resuscitation (CPR)” and stop at end of p. 209; start at p. 230 “CPR--Responsibilities of the Emergency Medical Responder” and stop at the end of p. 231; start at p. 232 “Automated External Defibrillation” and stop at the end of p. 234).
Chapters 9-16: Entire chapters.
Hazardous Materials for First Responders, Third Edition, 2004. (IFSTA)
Entire book except appendices, metric equivalents, glossary, tables, and figures.
National Incident Management System, FEMA P-501, December 2008. (Federal Emergency Management Agency, U.S. Department of Homeland Security)
Entire book, including Appendix B, Glossary of Key Terms, and Acronyms.
Exclude Appendix A.
LIEUTENANT AND CAPTAIN (Cont.)
Massachusetts General Laws, Chapter 148 (as amended through the release date of this reading list).
Pumping Apparatus Driver/Operator Handbook, Second Edition, 2006. (IFSTA)
Chapter 4: Entire chapter.
Chapter 6: Entire chapter.
Chapter 7: Only pp. 165-168 (start at p. 165 “Solid Stream Nozzles” and stop at p. 166 before “Determining the Flow From a Solid Stream Nozzle”; start at p. 167 “Fog Stream Nozzles” and stop at p. 168 before “Constant Flow Nozzles”).
Chapter 8: Only pp. 184-212 (start at p. 184 “Total Pressure Loss: Friction Loss and Elevation Pressure Loss” and stop at p. 187 before “Determining Your Own Friction Loss Coefficients” ; start at p. 190 “Determining Elevation Pressure” and stop at p. 212 before “Determining Net Pump Discharge Pressure”). Exclude metric pages.
Chapter 9: Only pp. 258-259 (start at p. 258 “Hand Method” and stop at p. 259 before “Equation H (5-inch hose)”).
Chapter 11: Only pp. 323-351 (start at p. 323 “Operating from a Pressurized Water Supply Source” and stop at the
end of p. 351).
Chapter 15: Entire chapter.
Of the lettered formulas, you are required to know ONLY Formulas A, B, C, D, F, G, and I. You must also be able to apply Formula A. To do this, you need to know the coefficients in Table 8.3 (p. 186) for 1 ¾” hose with 1 ½” couplings and for 2 ½” hose; and the coefficients in Table 8.4 (p. 198) for two 2 ½” lines and for two 3” lines with
2 ½” couplings. NOTE: You will NOT be required to know or APPLY any formulas OTHER THAN those listed above.
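For candidates checking their hand calculations, Formula A (friction loss) reduces to simple arithmetic: FL = C x Q^2 x L, where Q is flow in hundreds of gpm and L is hose length in hundreds of feet. The sketch below is illustrative only; the two coefficients shown are assumptions based on commonly published IFSTA figures and must be verified against Tables 8.3 and 8.4 of the handbook before use.

```python
# Illustrative sketch of IFSTA "Formula A" friction loss: FL = C * Q^2 * L.
# Q = flow in hundreds of gpm; L = hose length in hundreds of feet.
# Coefficients below are ASSUMED values; confirm against Table 8.3 (p. 186).

HOSE_COEFFICIENTS = {
    '1 3/4" hose (1 1/2" couplings)': 15.5,  # assumed from IFSTA Table 8.3
    '2 1/2" hose': 2.0,                      # assumed from IFSTA Table 8.3
}

def friction_loss(coefficient: float, flow_gpm: float, length_ft: float) -> float:
    """Return friction loss in psi for a single hoseline."""
    q = flow_gpm / 100.0   # convert to hundreds of gpm
    l = length_ft / 100.0  # convert to hundreds of feet
    return coefficient * q ** 2 * l

# Example: 200 ft of 2 1/2" hose flowing 300 gpm
fl = friction_loss(HOSE_COEFFICIENTS['2 1/2" hose'], 300, 200)
print(round(fl, 1))  # 2.0 * 3^2 * 2 = 36.0 psi
```

The same function covers any of the lettered coefficient lookups; only the table value changes per hose configuration.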
LIEUTENANT ONLY
Building Construction Related to the Fire Service, Second Edition, 1999. (IFSTA)
Engine Company Fireground Operations, Richman, H., Third Edition, 2007. (Jones and Bartlett)
Ladder Company Fireground Operations, Richman, H., Third Edition, 2007. (Jones and Bartlett)
CAPTAIN ONLY
Collapse of Burning Buildings, Dunn, V., 1988. (Fire Engineering)
Entire book except appendix.
Strategic and Tactical Considerations on the Fireground, Smith, J. P., Second Edition, 2008. (Prentice-Hall, Inc.)
Entire book except Chapter 2.
Supervision Today, Robbins, S. P., and DeCenzo, D. A., Sixth Edition, 2010. (Prentice-Hall, Inc.)
Chapter 1: entire chapter, except section “Where Do Supervisors Come From?” (p. 11).
Chapter 3: only p. 62 (entire page) and pp. 65-70 (start at p. 65 “Key Planning Guides” and stop at the end of Step 5 on p. 70).
Chapter 7: entire chapter.
Chapter 8: only pp. 191-205 (start at p. 191 “What is Motivation?” and stop at p. 205 before “Should Employees be Paid for Performance or Time on the Job?”).
Chapters 9, 10, 11, 14, 15, and 16: entire chapters.
(Exclude sections on enhancing understanding and developing your supervisory skills at the end of each chapter, and sections on news flashes.)
Appendix C
HRD Promotional Education/Experience Work Sheet
Appendix D
US Fire Departments with Staffing Levels in the Range of 200-400
Department Name – Career Firefighters
New Haven Fire Department – 400
City of Saint Paul Fire Department – 400
Akron Fire Department – 392
Little Rock Fire Department – 391
Syracuse FD – 391
Columbia Fire Department – 385
Chattanooga Fire Department – 384
Navy Regional Fire Rescue Hampton Roads – 384
Brevard County Fire Rescue – 382
Sarasota County Fire Department – 380
Unified Fire Authority – 379
Fort Lauderdale Fire Rescue – 378
Macon-Bibb County Fire Dept – 372
Columbus Department of Fire and EMS – 370
Wichita Fire Department – 370
Mesa Fire Department – 369
Chesapeake Fire Department – 361
Laredo Fire Department – 357
Hartford Fire Department – 354
Bridgeport Fire Department – 350
Loudoun County Fire, Rescue, & EMS – 350
Newport News Fire Department – 350
Anchorage Fire Department – 345
Huntsville Fire & Rescue – 344
Navy Regional Fire Rescue Hampton Roads – 343
St. Lucie County Fire District – 342
Pasco County Emergency Services – 340
City of Dayton Fire Department – 340
Riverside County Fire Department – 336
Salt Lake City Fire Department – 332
Fort Wayne Fire Department – 330
City of Knoxville Fire Department – 330
Kalamazoo Department of Public Safety – 327
Horry County Fire/Rescue – 327
Saint Petersburg Fire & Rescue Station – 324
Plano Fire Department – 321
Augusta Fire Department – 312
Howard County Fire & Rescue – 312
Hawaii County Fire Department – 308
Aurora Fire Department – 307
West Metro Fire Rescue – 300
Des Moines Fire Department – 300
Reno Fire Department – 300
Lubbock Fire Department – 300
Madison Fire Department – 299
Winston Salem Fire Department – 296
Fayetteville Fire and Emergency Management Department – 294
North Hudson Regional Fire & Rescue – 293
City Of Spokane Fire Department – 292
Clayton County Fire Department – 291
Gary Fire Department – 290
Arlington County Fire Department – 290
Irving Fire Dept – 287
City of Fresno Fire Department – 284
Contra Costa County Fire Protection District – 283
Waterbury Fire Department – 280
Cambridge Fire Department – 280
Lincoln Fire & Rescue – 280
Evansville Fire Department – 274
Atlantic City Fire Department – 273
Rockford Fire Department – 272
Elizabeth Fire Department – 269
Tualatin Valley Fire & Rescue – 265
Arlington Texas Fire Department – 265
Savannah Fire & Emergency Services – 262
Durham Fire Department – 261
Department of Fire, Emergency Services, and Buildings and Codes – 260
Charleston Fire Department – 260
Roanoke Fire-EMS – 260
Marion County Fire Rescue – 255
Paterson Fire Department – 255
Hialeah Fire Department – 254
Hall County Fire Services – 253
Osceola County Emergency Services Department – 252
New Bedford Fire Department – 252
Frederick County Division of Fire and Rescue Services – 252
Shasta Trinity National Forest – 250
City of Sunnyvale Department of Public Safety – 250
San Bernardino County Fire Department – 250
Santa Ana Fire Department – 250
Prince William County Department of Fire & Rescue – 250
Tallahassee Fire Department – 249
Henry County Fire Department – 249
City of South Bend Indiana Fire Department – 248
Topeka Fire Department – 244
Trenton Fire Department – 242
Maui Department of Fire Control Engine #10 – 241
Fort Rucker Fire & Emergency Services – 240
Indian River County Fire Rescue – 240
Seminole County EMS/Fire/Rescue – 240
Boise City Fire Department – 240
Stockton Fire Department – 239
Stamford Fire Rescue Department – 238
East Bank Consolidated Fire Department – 238
Grand Rapids Fire Dept – 234
City of Scottsdale Fire Department – 233
Portland Fire Department – 232
Polk County Fire Services – 230
Lansing Fire Department – 230
Garland Fire Department – 230
Amarillo Fire Department – 229
Beaumont Fire/Rescue Services – 229
Martin County Fire Rescue – 228
Pembroke Pines Fire & Rescue – 227
Anaheim Fire Department – 226
Alameda County Fire Department – 225
Hampton Fire & Rescue – 225
Lafayette Fire Department – 224
Hollywood Fire Rescue & Beach Safety Department – 223
Fall River Fire Department – 222
Nevada Division Of Forestry – 220
Portsmouth Fire, Rescue and Emergency Services – 220
Suffolk Fire Department – 220
Manchester Fire Department – 219
Tuscaloosa Fire and Rescue Service – 218
Springfield Fire Department – 215
Plumas National Forest – 214
Warwick Fire Department – 213
Springfield Fire Department – 212
Glendale Fire Department – 208
Federal Fire Department Hawaii – 208
Aurora Fire Department Central Station – 207
Asheville Fire and Rescue Department – 207
Cary Fire Department – 206
Wilmington Fire Department – 206
City of Camden Fire Department – 204
Peoria Fire Dept – 201
Cranston Fire Department – 201
Santa Barbara County Fire Department – 200
Miami Beach Fire Department – 200
City of Lowell Fire Department – 200
Lynn Fire Department – 200
Appendix E
Request to Participate in Questionnaire
Dear Sir or Madam,
I am conducting research into assessment centers for my final applied research project for the Executive Fire Officer Program at the National Fire Academy (Academy). As part of my research I am conducting a survey of the assessment center practices of fire departments in the US.
I respectfully request your time in completing a 10-question survey regarding your department’s experience with assessment centers. The survey can be accessed at www.surveymonkey.com/s/C7D6RZ2 and should take only a few minutes. The survey results will not identify your individual response; however, the overall results may become a permanent part of the Academy’s applied research project collection and will help guide the Springfield Fire Department’s efforts to develop an assessment center for Fire Lieutenants.
I want to thank you for your assistance in this important research.
Very truly yours,
Jerrold E. Prendergast
Chief of Administration
Appendix F
Questionnaire and Results
1. Does your Department use assessment centers for employee selection/promotion?
o Yes (68.2%)
o No (31.8%)
2. Does your assessment center process involve any of the following? (Check all that
apply)
o Job analysis (62.3%)
o Multiple assessments (89.7%)
o Simulations (89.3%)
o Multiple assessors (100%)
o Assessor training (62.1%)
o Feedback (65.5%)
o Validation of the whole process (58.6%)
o Don’t know (0%)
3. How does your department determine the candidates who will participate in the
assessment center?
o Interview (9.7%)
o Written exam (67.7%)
o Seniority (19.4%)
o Other (35.5%)
4. Generally, who serves as an assessor in your department's assessment center?
o Department supervisor, 1-rank above (41.9%)
o Department supervisor, 2 ranks above (32.3%)
o Psychologists (0%)
o FD personnel from outside Department (80.6%)
5. What types of exercises are used in your assessment center?
o In-basket exercise (71%)
o Leaderless group exercise (16.1%)
o Role play exercise (87.1%)
o Interview exercise (77.4%)
o Oral presentation (77.4%)
o Written presentation (58.1%)
o Other (25.8%)
6. What is the average cost per participant of conducting your assessment center?
o $0-100 (17.9%)
o $101-500 (21.4%)
o $501-1,000 (17.9%)
o Over $1,000 (7.1%)
o Don’t know (35.7%)
7. How much time does it take to conduct your department's assessment center?
o < 1-day (3.2%)
o 1-2-days (45.2%)
o 3 or more days (51.6%)
o Don’t know (0%)
8. Where is your assessment center conducted, generally?
o Fire Department location (61.3%)
o Off-site (38.7%)
9. How long does it take to deliver the results of the assessment center?
o 0-1 day (22.6%)
o 2-4 days (32.3%)
o >4 days (45.2%)
o Don’t know (0%)
10. Once the results of the assessment center are final is there any feedback for those that
were not selected?
o Yes (83.9%)
o No (16.1%)