
Student Performance

The Effect of Differing Audience Response System Question Types on Student Attention in the Veterinary Medical Classroom

Bonnie R. Rush • McArthur Hafen, Jr. • David S. Biller • Elizabeth G. Davis • Judy A. Klimek • Butch Kukanich • Robert L. Larson • James K. Roush • Thomas Schermerhorn • Melinda J. Wilkerson • Brad J. White

ABSTRACT
The purpose of this study was to evaluate the ability of specific types of multiple-choice questions delivered using an Audience Response System (ARS) to maintain student attention in a professional educational setting. Veterinary students (N = 324) enrolled in the first three years of the professional curriculum were presented with four different ARS question types (knowledge base, discussion, polling, and psychological investment) and no ARS questions (control) during five lectures presented by 10 instructors in 10 core courses. Toward the end of the lecture, students were polled to determine the relative effectiveness of specific question types. Student participation was high (76.1% ± 2.0), and most students indicated that the system enhanced the lecture (64.4%). Knowledge base and discussion questions resulted in the highest student-reported attention to lecture content. Questions polling students about their experiences resulted in attention rates similar to those without use of ARS technology. Psychological investment questions, based on upcoming lecture content, detracted from student attention. Faculty preparation time for three ARS questions was shorter for knowledge base questions (22.3 min) compared with discussion and psychological investment questions (38.6 min and 34.7 min, respectively). Polling questions required less time to prepare (22.2 min) than discussion questions but were not different from other types. Faculty stated that the investment in preparation time was justified on the basis of the impact on classroom atmosphere. These findings indicate that audience response systems enhance attention and interest during lectures when used to pose questions that require application of an existing knowledge base and allow for peer interaction.

Key words: audience response systems, clickers, classroom learning

INTRODUCTION
Audience Response Systems (ARS) are designed to facilitate student participation in a large classroom lecture setting. Instructors embed prepared multiple-choice questions in their presentations, to which students respond using wireless remote keypads. Student responses are collected instantaneously, and on instructor signal, the distribution of student responses is displayed as a histogram in a PowerPoint slide. Students and faculty gain immediate information regarding the student body's knowledge base or opinions. In many instances, ARS are used for non-graded activities; however, the system captures individual student data, which allows instructors to assign point values for correct answers, develop timed activities, formulate team activities, link individual student responses over multiple questions, or track student attendance.1 Most systems have the capability to allow faculty to create questions during the lecture in response to learning issues that arise. The technology was designed to compensate for the passive, one-way communication inherent in lecturing and the difficulty students experience in maintaining concentration during a standard 50-minute lecture. Professional students have an additional challenge of maintaining concentration for three or four consecutive 50-minute lectures, four or five days per week, for two or three years.

ARS have been used in most professional disciplines in higher education, including the instruction of physicians,2–5 dentists,6–8 pharmacists,9,10 and veterinarians.11,12 Converging evidence has indicated that use of ARS in large classroom settings generally improves student enjoyment, attendance, and some student outcomes. Instructors typically respond favorably to ARS and describe classroom use of the technology as revealing and motivating.1 ARS are described as promoting active learning through enhanced student engagement, participation, and appreciation of lecture material. Although ARS use is clearly enjoyable for students and faculty, improved retention of lecture content is less predictable; benefit has been demonstrated in some studies4,13,14 but not in others.11,15

Most existing publications on ARS in the classroom are descriptive reports.1,9,16,17 Best-practice recommendations indicate that question frequency should be between two and five questions per 50 minutes of classroom instruction and advise allowing 2 to 3 minutes of classroom time to deliver and review each question.1,16,18,19 Suggested categories of ARS questions include opinion polling, small-group problem-solving, knowledge-base application, forward-looking, quiz for credit, and initiation of discussion.1,17 Until recently, there were little to no data regarding the effectiveness of specific question types in engaging students or promoting attention during the lecture period.9

JVME 37(2) © 2010 AAVMC 145

In 2008, Crossgrove and Curran14 demonstrated that student performance is significantly higher on examination questions covering material taught with knowledge-base ARS questions. Retention of knowledge four months after course completion was enhanced for some students but not for others. In 2009, Smith et al.13 identified that questions requiring peer discussion enhance student understanding, even when none of the students in a peer discussion group originally knows the correct answer. Both of these reports demonstrated enhanced active learning using ARS in a large classroom setting.

In veterinary medical education, the use of ARS has been favorably received by both students and lecturers. Two reports from North American colleges of veterinary medicine identified ARS as facilitating an interactive learning environment for students and faculty.11,12 Elective dermatology students reported enhanced participation, independent thought, and enjoyment with the use of ARS in the classroom.11 Despite positive student reviews, contributions to short- or long-term retention of the knowledge base could not be demonstrated as a benefit of the system. When used in a core, non–technical-skills professional development course, students indicated that the system encouraged discussion and enhanced lecture content. Faculty stated that student learning feedback was valuable, and the system was an efficient tool for taking attendance and delivering quizzes.12

Not all reports on ARS are favorable. Negative student reactions are linked to ineffectual questions or inappropriate use of the technology. Specifically, students have stated that ARS distracts from lecture content when the learning value of individual questions is unclear, questions are posed for the sake of using novel technology, question frequency is excessive, and questions are used to gather data for future years.1,16 Superfluous use of ARS technology does not contribute to an interactive learning environment. Literature reviewing ARS use in higher education has made general recommendations for careful pre-lecture planning and thoughtful question design to avoid negative student response; however, specific recommendations for effective use based on objective data are limited.

The objectives of the reported study were to evaluate the ability of specific types of multiple-choice questions to maintain veterinary medical student engagement and attention during a large classroom lecture, with the goal of contributing to best-practice recommendations for application of ARS technology for instructors in a professional educational setting.

MATERIALS AND METHODS
Three hundred twenty-four veterinary medical students enrolled in the first three years of the professional curriculum were invited to participate. All students were issued ResponseCard radio frequency audience response devices (classes of 2009 and 2010) or virtual audience response keypads (vPad) on institutional laptops (class of 2011) for use with TurningPoint software. (The approximate cost to equip one classroom with TurningPoint software, 100 response cards, and one receiver was $4,150 in 2009.) Participation was voluntary and anonymous. This study was approved by the Kansas State University Committee on Research Involving Human Subjects Institutional Review Board.

Ten faculty instructors participated in the study: four faculty members presented to the class of 2011, and three faculty members each presented to the classes of 2010 and 2009. Before their participation, instructors received specific training in using TurningPoint software. Curricular courses presented by faculty during this investigation included anatomy, physiology, immunology, pharmacology, radiology, theriogenology, small-animal medicine, small-animal surgery, equine medicine, and production medicine. Participating faculty selected five lectures to serve as investigational lectures. Investigational lectures were delivered throughout the 2007–2008 and 2008–2009 academic years and were not necessarily delivered in consecutive lectures. Investigational lectures consisted of a 50-minute lecture with the insertion of three multiple-choice ARS questions (five answer choices per question), spaced 7 to 10 minutes apart, maintaining the same question category during the entire lecture period. Content question categories included (1) knowledge base, (2) discussion, (3) polling, (4) psychological investment, and (5) no questions (control). A description of each question type is provided in Box 1. The order of lectures by question category was randomized across five lecture periods for each instructor. Before presenting ARS questions in lectures, each instructor received direction on developing specific ARS question types.

Box 1: Categories of Content Questions

1. Knowledge base questions require recall of lecture content presented earlier in the lecture period or on a previous day.

2. Psychological investment questions (also called forward-looking, "film at 11:00," or teaser questions) are developed from content to be covered during the ensuing 5–15 minutes of lecture time and require students to formulate an educated guess based on existing knowledge of the topic.

3. Discussion questions require application and integration of previous lecture content and were designed to be sufficiently difficult to distribute student responses across the distracters. After display of the histogram of student responses but before revealing the correct answer, students were asked to discuss the question with peers for one minute and re-answer the question. A second (post-discussion) histogram was displayed, the correct answer revealed, and the question discussed.

4. Polling questions inquire about student background, interests, and experiences with topics related to the lecture content.

5. No ARS questions were used during one investigational lecture hour to serve as control responses, and the five assessment questions (see Box 2) were asked in the same manner as other question types.

Knowledge base, psychological investment, and discussion questions had one correct answer and four distracters. Student responses to content questions were collected and displayed as a histogram. Correct answers were revealed after classroom responses were displayed, and frequently selected distracters were discussed. There was no correct answer to polling questions; however, responses were revealed in all cases and discussed when appropriate.

About 30–40 minutes after the start of the lecture period, instructors posed five assessment questions (see Box 2) to evaluate student engagement in response to various question types. Students were not aware that the focus of the study was variation in question type. Students were invited to exchange response devices before responding to assessment questions to ensure anonymity. Students did not receive additional points for correct answers or attendance. Student responses to assessment questions were not displayed or discussed. Faculty were unaware of the results of the assessment questions. After completion of the lecture, instructors completed a survey (see next section) of their impression of the classroom response to various content question types (knowledge base, discussion, psychological investment, polling).

Faculty Questions
Faculty were asked to complete this questionnaire after each lecture on the basis of their own impression of the specific content question types (knowledge base, discussion, psychological investment, polling).

1. What percentage of the students appeared engaged in the lecture immediately before this question?
• < 25%
• 26%–50%
• 51%–75%
• > 75%.

2. Students appear fully engaged for the duration of this lecture.
• Strongly agree
• Agree
• Neutral
• Disagree
• Strongly disagree.

3. Estimate the additional preparation time to include three ARS questions in this lecture.

4. Do you believe the investment in preparation time was justified on the basis of student response and classroom atmosphere? Yes/No

Statistical Analysis
Descriptive statistics were performed using commercial software (JMP 7.0.1). Logistic regression was used to evaluate potential relationships between question type and the probability that students would select a specific response (i.e., "actively listening") to an assessment question when accounting for lack of independence resulting from instructor and year in school. Therefore, all results are the predicted probability that a student would select this distracter on the basis of question type. All analysis was performed with a commercially available statistics package (SAS/STAT 9.2).
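The core of this approach can be illustrated in outline. The authors' analysis was run in SAS/STAT and accounted for instructor and year in school; the sketch below is a deliberately simplified, standard-library-only illustration of the underlying idea (a logistic model of the probability of selecting "actively listening" as a function of question type) fitted to simulated data. The function names, the gradient-descent fit, and the simulated response rates are illustrative assumptions, not the published analysis.

```python
import math
import random

def fit_logistic(X, y, lr=0.5, epochs=3000):
    """Fit a logistic regression by batch gradient descent (illustration only)."""
    w = [0.0] * len(X[0])
    b = 0.0
    m = len(X)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid
            for j, xj in enumerate(xi):
                gw[j] += (p - yi) * xj
            gb += p - yi
        w = [wj - lr * gj / m for wj, gj in zip(w, gw)]
        b -= lr * gb / m
    return w, b

def predict(w, b, x):
    """Predicted probability of selecting 'actively listening'."""
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# Simulated responses: x = 1 for a knowledge-base lecture, 0 for a
# psychological investment lecture; rates loosely follow Table 1.
random.seed(1)
X, y = [], []
for _ in range(400):
    qtype = random.randint(0, 1)
    p_true = 0.66 if qtype else 0.49
    X.append([float(qtype)])
    y.append(1 if random.random() < p_true else 0)

w, b = fit_logistic(X, y)
p_know = predict(w, b, [1.0])
p_psych = predict(w, b, [0.0])
print(f"P(actively listening | knowledge base) ~ {p_know:.2f}")
print(f"P(actively listening | psych investment) ~ {p_psych:.2f}")
```

A production analysis would additionally model instructor and year in school as random effects, which this sketch omits.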

RESULTS

Response Rate
The mean (±SE) overall student response rate to assessment questions was 76.1% (±2.0) of enrolled students when accounting for year in school, instructors, and question type. There was no difference (p = 0.13) in response rates on the basis of question type and no significant interaction (p > 0.05) between question type and year in school. The class of 2009 (third-year curriculum) had higher (p < 0.05) overall participation rates (87.2%) than both the class of 2010 (71.8%) and the class of 2011 (69.2%).

Box 2: Assessment Questions

1. Which of the following best describes your focus on the lecture immediately before this question?
• Actively listening
• Scanning ahead at upcoming information
• Reviewing previous lecture material
• Passively listening
• Thinking about or doing something else entirely.

2. How are you currently feeling during this lecture?
• Very interested
• Somewhat interested
• Neither bored nor interested
• Somewhat bored
• Very bored.

3. What percentage of time during this lecture do you believe you were fully focused on lecture content?
• < 25%
• 26%–50%
• 51%–75%
• > 75%.

4. Why did you participate in the audience response questions presented during this lecture?
• I wanted to participate.
• I had to participate.
• I had nothing else to do.
• I didn't participate.

5. The ARS questions in this lecture have enhanced my attention during this lecture.
• Strongly agree
• Agree
• Neutral
• Disagree
• Strongly disagree.

Question 1: Which of the Following Best Describes Your Focus on the Lecture Immediately Before This Question?
Students (all classes, all question types) selected "actively listening" 56.6% of the time and "thinking about or doing something else entirely" 9.6% of the time in response to the first assessment question. The evaluation of the proportion of students responding with "actively listening" or "doing something else entirely" to question 1 revealed significant differences by question type (Table 1). Presentation of knowledge and discussion questions (see definitions in Box 1) during the lecture resulted in a higher percentage (p < 0.05) of students responding that they were actively listening (66.0% and 64.7%, respectively) compared with polling, psychological investment, or no ARS questions (57.2%, 49.2%, and 56.0%, respectively). More students responded that they were doing something else entirely when psychological investment questions (students must predict the correct answer on the basis of upcoming lecture content) were asked compared with discussion, knowledge, or polling questions.

Question 2: How Are You Currently Feeling During This Lecture?
Student selection of "very interested" (26.5%) and "somewhat interested" (42.7%) was consistently higher than selection of "bored" (14.1%) or "very bored" (9.4%) for all classes. The percentage of students responding that they were "very interested" varied by question type (Figure 1). Presentation of knowledge and discussion questions resulted in a higher (p < 0.05) percentage of students responding that they were "very interested" (30.1% and 27.3%, respectively) compared with polling or psychological investment questions (21.8% and 19.6%, respectively). In control lectures (no ARS), fewer students responded that they were very interested (22.6%) compared with responses to knowledge-base questions, but responses did not differ from other question types.

Question 3: What Percentage of Time During This Lecture Do You Believe You Were Fully Focused on Lecture Content?
The most common response regarding percentage of time focused on lecture content was "greater than 75%," which was selected by 48.6% of students regardless of question type; students selected "less than 25%" of time focused on lecture 10.2% of the time. The percentage of students responding that the percentage of time spent focused on lecture content was "greater than 75%" or "less than 25%" differed by question type (Table 2). Discussion and knowledge-base questions resulted in a higher percentage of students indicating "greater than 75%" compared with polling and psychological investment. Discussion questions resulted in a lower percentage of students with poor focus (less than 25%) on lecture content than any other question type.

Question 4: Why Did You Participate in the Audience Response Questions Presented During This Lecture?
Similar to other investigation questions, students in all classes responded with "I wanted to participate" more frequently when knowledge base and discussion questions were presented during lecture than polling or psychological investment questions (Figure 2). This question was not posed during lectures with no ARS questions (control lectures).

Table 1: Percentage of students responding that they were actively listening or paying attention to something else entirely on the basis of question type

Question type   Actively listening        Doing something else entirely
                % students    SE (%)      % students    SE (%)
Discussion      64.7a         7.9         5.5c          2.7
Knowledge       66.0a         7.8         5.7c          2.8
No ARS          56.0b         8.6         9.1a,b        4.3
Polling         57.2b         8.5         7.3b,c        3.5
Psych           49.2c         8.7         11.5a         5.2

Note: Superscripts represent significant differences in mean percentage of students responding by question type. ARS = Audience Response System; Psych = psychological investment.

Figure 1: Mean percentage (±SEM) of students responding "very interested" to question 2, "How are you currently feeling during this lecture?" Content questions with the same superscript are not significantly different from each other. ARS = audience response system.

Question 5: The ARS Questions in This Lecture Have Enhanced My Attention During This Lecture
Overall, students responded with "agree" or "strongly agree" 64.4% of the time and selected "disagree" or "strongly disagree" 13.1% of the time. The percentage of students who answered "strongly agree" was higher for discussion and knowledge questions than for polling and psychological investment questions (Figure 3). This question was not posed during lectures with no ARS questions (control lectures).

Discussion Questions
The percentage of correct responses before and after peer discussion was available for 22 of 30 questions posed by faculty during discussion questions. During a discussion question, faculty posed a complex knowledge application question to students, revealed the classroom response, did not reveal the correct answer, and allowed students to discuss the question with peers for 1 minute. The question was then re-presented to the students for their final response. After accounting for year and instructor, the mean improvement in correct responses was 14.0% (SE = 0.2). Of the 22 questions for which data were available, students improved the correct response rate on 18 questions and performed less favorably on the second attempt on four occasions.

Faculty Responses
In response to the question "What percentage of the students appeared engaged in lecture immediately before this question?" faculty indicated "greater than 75%" on 10 occasions, "51%–75%" on 30 occasions (the most frequent response), and "26%–50%" on five occasions. Faculty did not select "less than 25%." In response to the statement "Students appear fully engaged for the duration of this lecture," faculty responded with "strongly agree" on five occasions, "agree" on 24 occasions (the most frequent response), "neutral" on four occasions, and "disagree" on four occasions. Faculty did not select "strongly disagree." An effect of question type on faculty perception of student engagement was not demonstrated. Data were not available for five post-lecture faculty surveys.

The average time estimated by faculty to prepare three ARS questions of the same type was shorter (p < 0.05) for knowledge-base questions (22.3 minutes) compared with discussion and psychological investment questions (38.6 minutes and 34.7 minutes, respectively). Time to prepare polling questions (22.2 minutes) was less than that to prepare discussion questions, but not different from other types. Faculty responded "yes" on 35 of 37 occasions when asked "Do you believe the investment in preparation time was justified based on student response and classroom atmosphere?" Two "no" responses were associated with psychological investment questions.

Table 2: Model-adjusted mean percentage of students answering that they spent "greater than 75%" or "less than 25%" of the lecture fully focused on lecture content

Question type   Greater than 75% fully focused   Less than 25% fully focused
                % students    SE                 % students    SE
Discussion      53.5a         6.6                4.3b          1.6
Knowledge       54.3a         6.6                6.9a          2.5
No ARS          48.4a,b       6.6                7.3a          2.6
Polling         45.5b         6.6                8.7a          2.9
Psych           42.8b         6.6                8.3a          2.9

Note: Model accounted for year in school and included a random effect for repeated measures on instructor. Superscripts differing by column indicate significant differences between rows. ARS = Audience Response System; Psych = psychological investment.

Figure 2: Mean percentage (±SEM) of students indicating "I wanted to participate" in response to question 4, "Why did you participate in the ARS questions?" Content questions with the same superscript are not significantly different from each other.

Figure 3: Mean percentage (±SEM) of students responding "strongly agree" to question 5, "The ARS questions in this lecture have enhanced my attention during this lecture." Content questions with the same superscript are not significantly different from each other.

DISCUSSION
In this investigation, ARS enhanced student-reported attention during lecture when used to pose questions that required application of an existing knowledge base and allowed for peer interaction. Questions that polled students about their background or experiences resulted in student attention rates similar to those with no use of ARS technology at all. Questions that required students to predict the correct answer on the basis of upcoming lecture content (psychological investment) detracted from student attention. Students reported disengagement from lecture content when they did not have adequate baseline knowledge to answer the question. These general trends in student responses to specific question types were consistent across three classes of veterinary medical students at different stages of their professional education, with questions delivered by 10 instructors in 10 courses across a broad range of the professional curriculum.

Question Types
Knowledge-base questions resulted in a higher percentage of student-reported interest, attention, focus, and willingness to participate than did polling or psychological investment questions. These questions ask students to recall and apply lecture content presented earlier in the lecture period or during the previous day of lecture. Faculty reported that knowledge-base questions require less time to prepare for first-time presentation than discussion or psychological investment questions. In addition to enhancing attention, knowledge-base questions have been shown to enhance in-class learning.14 A recent report demonstrated higher student performance on examination questions covering material taught with knowledge-base ARS questions in two undergraduate courses.14 The improvement in examination performance was more dramatic for non-major students taking introductory biology than for biology majors enrolled in a genetics course. Retention of the knowledge base was enhanced four months after course completion for non-major students, but retention was not enhanced for major students. Knowledge-base questions were suspected to have a greater impact on learning for students who are less familiar with lecture content (non-majors) than for students who have foundational knowledge.

Similar to knowledge-base questions, discussion questions enhanced student-reported interest, attention, focus, and willingness to participate compared with polling or psychological investment questions. Discussion questions are designed to be more difficult than knowledge-base questions and require recall, application, and integration of lecture content. After a first attempt, students discuss the question with peers for 1 minute and then individually re-answer the question. In this investigation, students performed better (14% more correct answers) on the second attempt in most cases. Occasionally, the class performed more poorly on a second attempt. Smith et al.13 demonstrated that peer discussion (undergraduate biology) improved student performance on the second attempt (17% more correct) in most cases and further improved student performance (an additional 4%) on questions posed later in the lecture period that required application of the same principles needed to solve the first question (isomorphic questions). The study concluded that peer influence by a knowledgeable student during the discussion period could not solely explain enhanced learning during lecture. Improved understanding of lecture content was demonstrated by student success with the isomorphic question, even when none of the students in the original discussion group knew the correct answer. Students valued the opportunity to talk to one another to practice cognitive skills, which prepared them to analyze subsequent problems. Difficult discussion questions are not a deterrent to student participation; students enjoy the opportunity to respond to challenging questions even if they select the wrong answer, provided that baseline knowledge is available to them. Preparation of discussion questions took the greatest amount of faculty time; successful discussion questions require integration of related lecture content and strong distracters to be effective. If more than 80% of students select the correct response on the first attempt, peer discussion is unnecessary. Effective discussion questions have first-attempt correct response rates ranging from 20% to 50%.13
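The first-attempt thresholds above can be captured in a small helper. This is an illustrative sketch, not part of the study: the function name and the "optional" middle band are assumptions added for the example; only the greater-than-80% cutoff and the 20%–50% effective range come from the text.

```python
def peer_discussion_advice(first_attempt_correct: float) -> str:
    """Advise whether a peer-discussion round is worthwhile, given the
    fraction of students answering correctly on the first attempt."""
    if first_attempt_correct > 0.80:
        return "skip"      # >80% already correct: discussion is unnecessary
    if 0.20 <= first_attempt_correct <= 0.50:
        return "ideal"     # effective discussion questions fall in this range
    return "optional"      # outside the ideal band; discussion may still help

# Hypothetical first-attempt correct rates for three ARS questions
for rate in (0.35, 0.65, 0.85):
    print(f"first-attempt correct {rate:.0%}: {peer_discussion_advice(rate)}")
```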

Polling questions did not enhance or detract from student attention. Polling the classroom provides rapid information about the students' background and experiences, which allows the lecturer to adjust the depth of content delivery.1 These questions are particularly valuable in the delivery of continuing education, in which the audience's experience level is less familiar to the lecturer. In addition, polling questions allow students to become aware of a relative strength or weakness in their own experiential background. For instance, if more than 75% of the students respond that they have seen, worked on, or owned an animal with a specific disease, students who have never heard of the condition recognize the need for attention to lecture content. Polling questions may provide a non-threatening mode of communication from instructor to student that improves their familiarity with one another, yet does not directly affect the understanding of course content.

Psychological investment (teaser) questions are designed to develop student interest in lecture content that is to be imminently delivered.1 These questions contain phrases such as "Which of the following do you suspect, predict, or believe is likely, common, indicated, contraindicated, or causes a specific disease or condition?" This question type is recommended by authors who advocate that students who commit to an answer, even when guessing, are "emotionally" or "psychologically invested" in the question and pay better attention to the discussion that follows.20,21 In our setting, psychological investment questions resulted in the lowest attention rates. Students indicated they disengaged from the lecture when these question types were used. Professional students appear to prefer to have the opportunity to answer correctly, even if the question is difficult. Rather than generating interest in lecture content, questions that required students to guess without all of the information ultimately made students feel tricked, resulting in disengagement from the lecture. One faculty participant (Thomas Schermerhorn) described the investigational lecture using psychological investment questions in this way: "After I delivered the first question, I knew the students were gone. I wanted to skip the next two questions." This was the only question type that prompted faculty to answer that the investment of time (which was higher than for knowledge-base or polling questions) was not worth the result. The use of psychological investment questions by some of the authors was common before this investigation, and these instructors have since minimized the use of these questions. The authors are not aware of other studies identifying negative student response to this question type.

JVME 37(2) © 2010 AAVMC

Participation

Student participation in this investigation was high: 76% of students enrolled across all three classes responded to content and assessment questions. Student participation was anonymous and voluntary, and there was no incentive (extra credit) to participate. Previous studies have established that students enjoy using ARS in veterinary medicine and other science-based disciplines.9–12 The histogram display of responses provides insight into the diversity of ideas and understanding in the classroom22 and reassurance to students that they are not alone when they respond incorrectly.21,23 Students particularly like the anonymity,24 the potential to reinforce learning,25 and the ability to compare one's answers with those of the rest of the class.25

The authors predicted that first-year students would demonstrate greater enthusiasm and participation with ARS, given the relative novelty of the professional classroom. However, student participation was higher with increasing experience in the professional classroom: first year < second year < third year. Although not specifically determined in this study, there may be a number of reasons why third-year students participated at a higher rate in content and assessment questions. Third-year students may have a greater appreciation for techniques that effectively enhance attention and focus, given the challenge to their attention span over their cumulative number of hours in the lecture hall. Additionally, third-year students are savvier and may recognize that ARS questions represent a window to the instructor's examination.

In most reports of ARS, student perception was documented by using a survey after course completion. In this investigation, the experience-sampling method was used to capture real-time student perception by interrupting the lecture to assess point-in-time subject responses.26 This method has been validated as a reliable way to evaluate the self-reported impact of a given communication tool. Students responding to assessment questions were unaware that question type was the variable of interest during this study. Therefore, students did not consciously vote for their favorite question type; rather, they provided an assessment of their level of interest, focus, and attention during a particular lecture. This approach was used to obtain a representative assessment of the impact of specific questions rather than the perceived popularity of different question types.

Faculty participating in the current study indicated that they enjoyed using the system and believed that the investment in preparation time was justified on the basis of the impact on classroom environment. The system provides faculty with a quick and convenient way to assess student understanding. Wood27 provided this quote regarding first-time recognition of the value of discussion questions:

For me, this was a moment of revelation . . . for the first time in over 20 years of lecturing I knew . . . that over half the class didn't "get it" . . . Because I had already explained the phenomenon as clearly as I could, I simply asked the students to debate briefly with their neighbors and see who could convince whom about which answer was correct. The class erupted into animated conversation. After a few minutes, I asked for a revote, and now over 90% gave the correct answer.

Although pressing a single button on a response card does not immediately appear to qualify as active engagement, instructors reported that students who use ARS become visibly active participants and are more likely to ask and answer questions,19,28 more likely to participate in discussion,1 and less likely to sleep during lecture.24 Some faculty did detect a negative classroom response to the psychological investment questions, but this response was not detected by all faculty.

Best Practices

The goal of the study was to gather data regarding specific question types to support best-practice recommendations for using ARS technology in a large classroom. Like most technology, ARS does not inherently impart value; this theme is frequently repeated in the ARS literature. The effectiveness or impact on learning depends heavily on the intent and thought behind question design.22,24,27 In this investigation, knowledge-base and discussion questions consistently enhanced student attention across a broad range of courses for students with diverse experience levels in the professional curriculum. Nonetheless, the authors do not recommend exclusive use of knowledge-base and discussion question types; rather, they recommend an emphasis on questions that allow students to apply, discuss, and demonstrate knowledge, and sparing use of questions that share peer experiences or require supposition and speculation.

Student populations representing other disciplines may react differently to specific question types than this population of professional veterinary medical students. Veterinary medical students are challenged by one-way delivery of a large volume of material in the same lecture hall (probably in the same chair) several hours each day. Veterinary medical students have indicated that they value teaching strategies with effective and, more important, efficient use of time. Students studying theoretical disciplines may appreciate the opportunity to speculate or debate upcoming lecture content, and students from less time-intensive venues may be more interested in responding to questions about themselves.

Best practices that appear widely accepted in the ARS literature concern the frequency of question delivery and the need for classroom review after responses are revealed. The average human attention span is no more than 20 minutes, and recall of information diminishes after 15–20 minutes.29 The periodic break achieved using an ARS question relieves cognitive fatigue, restarts the attention clock, and is recommended every 10–20 minutes during a 50-minute lecture.1,9,30 Educators recommend debriefing the class after the distribution of responses is revealed to ensure students understand the reasoning behind correct and incorrect responses.9,16,31 Most researchers in the field consider it vital that instructors act on the responses by modifying the subsequent direction of the lecture, if indicated by the distribution of responses.
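The pacing guideline above (an ARS question every 10–20 minutes of a 50-minute lecture) can be laid out mechanically when planning a session. The helper below is our own illustrative sketch, not part of any cited tool:

```python
def ars_break_points(lecture_min=50, interval_min=15):
    """Minute marks at which to pose an ARS question, spacing
    breaks at a fixed interval through a lecture of lecture_min
    minutes (one question at each interval mark, none at the end).
    """
    if not 10 <= interval_min <= 20:
        raise ValueError("literature recommends a 10-20 minute interval")
    return list(range(interval_min, lecture_min, interval_min))

print(ars_break_points())        # [15, 30, 45]
print(ars_break_points(50, 12))  # [12, 24, 36, 48]
```

A 15-minute interval yields three questions in a standard 50-minute lecture, matching the three-questions-per-lecture design used in this study.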

CONCLUSIONS

In this investigation, veterinary medical students indicated that multiple-choice questions delivered using ARS technology that require knowledge recall and allow peer discussion enhanced attention to lecture content. Questions that polled students about their background or experiences resulted in student attention rates similar to the use of no ARS technology at all. Questions that required students to predict the correct answer based on upcoming lecture content (psychological investment) detracted from student attention. Faculty stated that the investment in time to prepare ARS questions was justified on the basis of the positive impact on classroom atmosphere. These findings were generated by polling a large population of veterinary medical students across a broad range of the professional curriculum: three classes of students in 10 different courses with 10 instructors. Student participation across the pre-clinical veterinary curriculum was high. Experiential sampling indicated that ARS enhanced attention and focus during lecture when used to pose questions that required application of an existing knowledge base and allowed for peer interaction.

NOTES

a PowerPoint, Microsoft Corporation, Redmond, WA <www.microsoft.com>.

b TurningPoint, Thomson Higher Education and Turning Technologies, Youngstown, OH <http://www.turningtechnologies.com/>.

c JMP 7.0.1 and SAS/STAT 9.2, SAS Institute, Cary, NC <http://www.sas.com/>.

REFERENCES

1 Caldwell JE. Clickers in the classroom: Current research and best-practice tips. CBE Life Sci Educ 6(2):9–20, 2007. doi:10.1187/cbe.06-12-0205

2 Kaneshiro KN, Emmett TW, London SK, Ralston RK, Richwine MW, Skopelja EN, Brahmi FA, Whipple E. Use of an audience response system in an evidence-based mini-curriculum. Med Ref Serv Q 27:284–301, 2008. doi:10.1080/02763860802198861

3 Latessa R, Mouw D. Use of an audience response system to augment interactive learning. Fam Med 37:12–14, 2005.

4 Pradhan A, Sparano D, Ananth CV. The influence of an audience response system on knowledge retention: An application to resident education. Am J Obstet Gynecol 193:1827–1830, 2005. doi:10.1016/j.ajog.2005.07.075

5 Nayak L, Erinjeri JP. Audience response systems in medical student education benefit learners and presenters. Acad Radiol 15:383–389, 2008.

6 Elashvili A, Denehy GE, Dawson DV, Cunningham MA. Evaluation of an audience response system in a preclinical operative dentistry course. J Dent Educ 72:1296–1303, 2008.

7 Pileggi R, O'Neill PN. Team-based learning using audience response system: An innovative method of teaching diagnosis to undergraduate dental students. J Dent Educ 72:1182–1188, 2008.

8 Holmes RG, Blalock JS, Parker MH, Haywood VB. Student accuracy and evaluation of a computer-based audience response system. J Dent Educ 70:1355–1361, 2006.

9 Cain J, Robinson E. A primer on audience response systems: Current applications and future considerations. Am J Pharm Educ 72(4):77, 2008. <http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2576416/pdf/ajpe77.pdf>. Accessed 03/09/10.

10 Trapskin PJ, Smith KM, Armistead JA, Davis GA. Use of an audience response system to introduce an anticoagulation guide to physicians, pharmacists, and pharmacy students. Am J Pharm Educ 69:190–197, 2005.

11 Plant JD. Incorporating an audience response system into veterinary dermatology lectures: Effect on student knowledge retention and satisfaction. J Vet Med Educ 34:674–677, 2007. doi:10.3138/jvme.34.5.674

12 Molgaard LK. Using a wireless response system to enhance student learning. J Vet Med Educ 32:127–128, 2005. doi:10.3138/jvme.32.1.127

13 Smith MK, Wood WB, Adams WK, Wieman C, Knight JK, Guild N, Su TT. Why peer discussion improves student performance on in-class concept questions. Science 323:122–124, 2009. doi:10.1126/science.1165919

14 Crossgrove K, Curran KL. Using clickers in nonmajors- and majors-level biology courses: Student opinion, learning, and long-term retention of course material. CBE Life Sci Educ 7(1):146–154, 2008. doi:10.1187/cbe.07-08-0060

15 Stein PS, Challman SD, Brueckner JK. Using audience response technology for pretest reviews in an undergraduate nursing course. J Nurs Educ 45:469–473, 2006.

16 Premkumar K, Coupal C. Rules of engagement—12 tips for successful use of "clickers" in the classroom. Med Teach 30:146–149, 2008. doi:10.1080/01421590801965111


17 DeBourgh GA. Use of classroom "clickers" to promote acquisition of advanced reasoning skills. Nurse Educ Pract 8:76–87, 2008. doi:10.1016/j.nepr.2007.02.002

18 Burnstein RA, Lederman LM, DeBourgh GA. Using wireless keypads in lecture classes. Phys Teach 39:8–11, 2001.

19 Elliot C. Using a personal response system in economics teaching. Int Rev Econ Educ 1(1):80–86, 2003.

20 Wit E. Who wants to be . . . The use of a personal response system in statistics teaching. MSOR Connections 3(2):14–20, 2003.

21 Beatty ID, Gerace WJ, Leonard WJ, Dufresne RJ. Designing effective questions for classroom response system teaching. Am J Phys 74(1):31–39, 2006. doi:10.1119/1.2121753

22 Roschelle J, Penuel WR, Abrahamson L. The networked classroom. Educ Leadership 61:50–54, 2004.

23 Knight JK, Wood WB. Teaching more by lecturing less. Cell Biol Educ 4:298–310, 2005. doi:10.1187/05-06-0082

24 Robertson LJ. Twelve tips for using a computerized interactive audience response system. Med Teach 22:237–239, 2000.

25 Bunce DM, Van den Plas JR, Havanki KL. Comparing the effectiveness on student achievement of a student response system versus online WebCT quizzes. J Chem Educ 83(3):488–493, 2006. doi:10.1021/ed083p488

26 Csikszentmihalyi M, Larson RJ. Validity and reliability of the experience-sampling method. J Nerv Ment Dis 175:526–536, 1987.

27 Wood WB. Clickers: A teaching gimmick that works. Dev Cell 7:796–798, 2004, p. 797.

28 Beekes W. The "Millionaire" method for encouraging participation. Active Learn High Educ 7(1):25–36, 2006. doi:10.1177/1469787406061143

29 Middendorf J, Kalish A. The "change-up" in lectures. Natl Teach Learn Forum 5(2):1–5, 1996.

30 Ruhl KL, Hughes CA, Schloss PJ. Using the pause procedure to enhance lecture recall. Teach Educ and Special Educ 10:14–18, 1987. doi:10.1177/088840648701000103

31 Allen D, Tanner K. Infusing active learning into the large-enrollment biology class: Seven strategies, from the simple to complex. Cell Biol Educ 4:262–268, 2005. doi:10.1187/cbe.05-08-0113

AUTHOR INFORMATION

Bonnie R. Rush, DVM, MS, Dipl ACVIM, is Head of the Department of Clinical Sciences and Professor of Equine Internal Medicine, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA. E-mail: [email protected].

McArthur Hafen Jr., PhD, LCMFT, is Clinical Instructor, College of Veterinary Medicine, Kansas State University, 112 Trotter Hall, Manhattan, KS 66506 USA.

David S. Biller, DVM, Dipl ACVR, is Professor of Radiology, Department of Clinical Sciences, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA.

Elizabeth G. Davis, DVM, PhD, Dipl ACVIM (Large Animal), is Associate Professor, Department of Clinical Sciences and Department of Anatomy and Physiology, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA.

Judy A. Klimek, DVM, MS, is Associate Professor of Anatomy, Department of Anatomy and Physiology, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA.

Butch Kukanich, DVM, PhD, Dipl ACVCP, is Assistant Professor of Pharmacology, Department of Anatomy and Physiology, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA.

Robert L. Larson, DVM, PhD, Dipl ACT, Dipl ACVPM (Epidemiology), Dipl ACAN, is Professor of Production Medicine and Theriogenology, Department of Clinical Sciences and Department of Anatomy and Physiology, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA.

James K. Roush, DVM, MS, Dipl ACVS, is Professor of Small Animal Orthopedic Surgery, Department of Clinical Sciences and Department of Anatomy and Physiology, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA.

Thomas Schermerhorn, VMD, Dipl ACVIM (SAIM), is Associate Professor of Small Animal Internal Medicine, Department of Clinical Sciences and Department of Anatomy and Physiology, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA.

Melinda J. Wilkerson, MS, DVM, PhD, Dipl ACVP, is Interim Associate Dean of Academic Programs and Associate Professor of Immunology, Department of Diagnostic Medicine and Pathobiology, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA.

Brad J. White, DVM, MS, is Assistant Professor of Beef Production and Management, Department of Clinical Sciences and Department of Anatomy and Physiology, College of Veterinary Medicine, Kansas State University, Manhattan, KS 66506 USA.
