

Page 1: Facilitating Peer Review in an Online Collaborative Learning Environment for Computer ... · 2014. 1. 19. · Facilitating Peer Review in an Online Collaborative Learning Environment

Facilitating Peer Review in an Online Collaborative

Learning Environment for Computer Science Students

Cen Li, Zhijiang Dong, Roland H. Untch, and Michael Chasteen Department of Computer Science, Middle Tennessee State University, Murfreesboro, TN, USA

Abstract - High student dropout and failure rates in entry-level computer science (CS) courses have contributed to a lower number of qualified CS graduates nationwide. Among the various underlying causes of this phenomenon, several unsatisfactory behavioral traits have been identified in CS students, including lack of motivation and combativeness towards the opinions of peers. Many CS educators have attempted to address these problems by developing learning tools that increase students’ motivation through engaging educational activities. An alternative, and possibly better, solution is to combine learning activities with social activities to create an engaging learning environment that fosters collaboration among student peers. In this work, we experiment with this idea by developing a peer review system as an integral component of an online social network that offers many of the collaborative features found in popular websites such as Facebook. Results from controlled experiments and student feedback indicate that not only did students’ learning performance increase with use of the system, so did their sense of community.

Keywords: peer review; peer learning; computer supported collaborative learning; online social network; CS2

1 INTRODUCTION

Distressingly high rates of failure and dropout have been reported nationwide among students enrolled in entry-level computer science (CS) courses [13]. Studies investigating this problem have identified a lack of motivation, persistence, and passion towards course material, procrastination on assignments, unwillingness to support or aid others, and combativeness towards the opinions of peers as contributing factors [27]. To mitigate some of these factors, educational researchers have found that peer review and the fostering of learning communities can effectively enhance the learning process and make these courses more enjoyable for students ([17], [23]). However, such enhancements come with a cost of time and effort. This paper describes a successful system (PeerSpace) that, using web technology to reduce burdensome administrative overhead, integrates peer review into a peer-friendly, collaborative social learning environment. We report experimental findings that peer review, as facilitated by PeerSpace, improves student learning outcomes. Additionally, we observe that students using

PeerSpace have a demonstrably higher sense of community than those who do not. Section 2 provides background information on peer review and the PeerSpace system. It is also here that we state our conjectural thesis that peer review integrated into a peer-friendly learning environment is effective in improving student learning and promulgating a sense of community. Section 3 deals with the mechanics of peer review in PeerSpace. Section 4 contains the results of our experiments with our approach. Section 5 concludes the paper and suggests future work.

2 BACKGROUND

2.1 Peer Review

In a course setting, peer review is a process wherein students evaluate the creative work or performance of other (peer) students and give constructive feedback [24]. It presents students with the opportunity to learn through evaluation and reflection [2], thereby strengthening the overall cognitive thinking capacity of participating students.

The most common use of peer review is found in English composition courses, where peers swap essay drafts and leave feedback in the form of corrections and suggestions for the author. Students who participate in peer review have been shown to become better readers and to develop stronger self-editing skills. With guidance from their teachers, the students learn to perform reviews that are constructive and helpful rather than destructive and harsh, which benefits students receiving feedback and cultivates their ability to accept criticism from others.

Peer review has been applied to many fields of study such as biology, psychology, nursing, and computer science with successful results [24]. The most common occurrence of peer review in computer science appears in introductory CS courses ([10], [11], [18], [21], [25], [26], [28]). These courses typically appear in the first year and introduce programming in a language such as C++ or Java. Results in this area are very good and range from an increased sense of community [10] to better understanding of the material presented [18]. This success has been carried over to more advanced courses such as operating systems [6], computer architecture [7], scientific writing [1], web technologies [9], software engineering [29], parallel processing [8] and game design [19]. Peer review has also been used in atypical manners, for example, in a compiler


course [22] where groups can borrow code from other students to use as a framework for the final portion of a term project if the group’s own framework didn’t turn out as well as expected.

In a typical peer review process, each student occupies two roles: peer reviewer, evaluating other students’ work, and author, receiving feedback from peers. As a peer reviewer, each student is required to practice critical analysis skills and give constructive feedback. Additionally, peer reviewers have the opportunity to see multiple solutions to the same problem. In the context of computer science, students performing a review learn to appreciate good programming and good documentation as they strive to understand the work they are reviewing; consequently, these students are more likely to practice better programming and documentation in their future work. As an author, each student receives feedback from multiple peers and learns to accept constructive criticism and suggestions. It has been observed that students tend to be more willing to accept criticism from their peers than from the instructor. Learning from their peers’ programs and suggestions can be more effective in improving the quality of their own programs. Additionally, this process generates a greater sense of community because students are helping each other evolve into more mature and skilled programmers.

Next we review a number of existing systems developed to support peer review.

2.2 Related Work

In an effort to improve an already verified method, Hundhausen et al. moved their paper-based peer review to an online studio-based learning environment (OSBLE) [11]. The system supports both individual and group review techniques. Reviews are done individually before the group comes together to merge all the members’ reviews into one. Students can place comments on specific lines or line ranges, mark the severity of the problem, and then drag comments over to the group review if the group agrees with the assessment. Students can do individual reviews from any computer, but group reviews are done during lab time under a moderator. After the review process, students can view their feedback and resubmit their work. It is important to note that OSBLE does not support any kind of anonymous reviewing and, unlike most implementations, does not use a rubric. While this contradicts most methods, results from the system seem to have been positive. Overall, Hundhausen et al. report that the transition to an online environment sped up the review process and made reviewing more efficient for students. OSBLE is more than just a peer review tool, as it also supports basic course management. Future work has been planned to add more social features to OSBLE, such as profiles and a way for authors to rate the feedback they received.

Harri Hämäläinen et al. originally experimented with an open-source peer review system called MyReview [16] for a master’s-level course [8]. They concluded that advanced students could benefit from looking at each other’s code, but that a system not designed with code review in mind was too distracting. Therefore, Ville Hyyrynen et al. (including Hämäläinen) created a custom peer review system called MyPeerReview ([12], [15]). This design differs from the previous examples because it is built on an existing content management system (CMS), Drupal [4], which already provides basic features including users, profiles, and navigation tools. Because of this, only a module was needed to add support for assignment submission and peer review. The system itself does not support anonymous peer review; instead, students were asked, not forced, to remove identifying information from their files before making a submission. Reviews are done individually and guided by a rubric constructed inside Drupal. Procedurally, instructors must manually “open” an assignment for students to submit work and put it on “hold” when the deadline is reached. Not many features were implemented in MyPeerReview initially, but the results were positive: students felt that peer review was a worthwhile assignment, and the process of submitting and reviewing assignments went smoothly. However, several improvements were suggested, including adding a rating system, automated timing for deadlines, and a rubric closer to what the instructor uses.

Like the online studio-based learning environment (OSBLE) developed by Hundhausen et al. [11] and the MyPeerReview system developed by Ville Hyyrynen et al. ([12], [15]), the peer review system developed for this work is part of a larger system that supports many features in addition to peer review. Like the approach taken by MyPeerReview, a content management system (CMS) is used as the basis of the system. The Elgg CMS [5] was chosen because it comes with many desirable social network features such as user profiles, blogs, and forums.

2.3 PeerSpace

PeerSpace ([3], [14]) is an innovative online collaborative learning environment developed for CS education with the purpose of bringing real-world activities into a virtual environment. One major benefit is the elimination of the time and effort wasted in organizing an event around the scheduling and other conflicts found in real life. PeerSpace’s asynchronous, online environment creates a more efficient workflow that allows students to participate in tasks such as a discussion or group assignment without the instructor or other students being present.

PeerSpace is made up of individual components, called modules, that are classified as either social or learning modules. The corresponding social and learning activities supported by the modules interact as depicted in Figure 1.

The social modules provide the students with a variety of tools to support asynchronous and synchronous peer


communication. The social modules stimulate the development of long-term social connections for CS students and support features found in popular social networking sites, such as Facebook, including profiles, statuses, blogs, forums, chats, and games. These modules are designed to help the students in building peer support and trust and thus help the students feel they are part of a learning community. These are essential in making collaborative learning among student peers successful and enjoyable.

Figure 1. Module interaction model

The learning modules provide a set of tools that support collaborative learning activities such as group homework, peer code review and collaborative programming. The shared knowledge these activities instill affords the students a deeper understanding of course-related concepts and a broader knowledge of course-related topics. The idea of putting the learning activities amid the social activities strengthens the support among students and expands a student’s social network with new friends.

It is our conjecture that peer review, a proven learning approach, integrated into a peer-friendly, collaborative learning environment is effective in improving student learning while providing a constructive atmosphere for students to develop a sense of community.

3 PEER REVIEW IN PEERSPACE

3.1 System Architecture

The two main components supporting peer review in PeerSpace are the Assignment module and the Peer Review module. Both modules interact with the Subversion (SVN) server through the Grading Using Subversion (GUS) component. Each module provides two distinct sets of features for two different types of users: instructors and students. Figure 2 illustrates the architecture of the peer review tool in PeerSpace.

The Assignment module handles the process of creating and submitting assignments. Instructors can create assignments through the instructor side of the Assignment module. Once an assignment is created, students can submit one or more of those assignments. The submissions are stored in the SVN server for the instructors to retrieve and grade. The instructors can subsequently put graded/marked assignments

back on the SVN server so that students can obtain them through the Assignment module.

Figure 2. Architecture of Peer Review

3.2 Procedure to Perform Peer Review

The Peer Review module allows students to review their classmates’ assignments. First, an instructor creates a peer review assignment from an existing regular assignment for his or her class in the Peer Review module. Instructors must provide two dates for each peer review assignment: an assign date, which specifies when students’ work is assigned to reviewers, and a deadline date, which marks the end of the allotted review time. When the assign date arrives, the Peer Review module assigns students’ submissions to reviewers. During this phase, all submitted assignments are retrieved from the SVN server, and copies are stored on the PeerSpace server for the reviewers to view or download. Reviewers can then evaluate their assigned work before the deadline. All completed peer reviews are stored in the Elgg database.

For CS projects, a student typically needs to submit a number of files, including source code, data files, and executable files. For peer review, instructors can specify which of these files, plus any extra files, students should have for reviewing. Additionally, instructors can mark each file as viewable, downloadable, or both. Files marked as viewable are converted into images to prevent plagiarism. Since the peer review process is anonymous, all identifying information is removed from viewable files during conversion. Students can retrieve downloadable files directly. Furthermore, instructors can specify files to be used in the review process that were not listed in the required files for the assignment. This allows instructors, for example, to upload special test cases from their personal computers or to select a file that was added to the students’ repositories after the submission deadline.
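The paper does not describe how identifying information is detected in viewable files. As a rough sketch of the idea only (a hypothetical helper, not PeerSpace’s actual routine), a submission’s text can be scrubbed of the author’s name and of common author-header tags before it is rendered to an image:

```python
import re

def anonymize_source(source: str, author: str) -> str:
    """Blank out obvious identifying information before a source file
    is rendered to an image for anonymous review.

    Illustrative sketch only: it redacts 'Author:'/'Name:' header tags
    and any occurrence of the author's name, case-insensitively. Real
    anonymization would need to handle far more cases.
    """
    # Redact explicit header tags such as "// Author: Jane Doe"
    redacted = re.sub(r"(?im)^(\s*(?://|#|\*)?\s*(?:author|name)\s*:).*$",
                      r"\1 [redacted]", source)
    # Redact any remaining occurrences of the author's name itself
    redacted = re.sub(re.escape(author), "[redacted]", redacted,
                      flags=re.IGNORECASE)
    return redacted

demo = "// Author: Jane Doe\n// PA3 solution by jane doe\nint main() { return 0; }\n"
print(anonymize_source(demo, "Jane Doe"))
```

Rendering the scrubbed text to an image (so reviewers can read but not copy the code) would then be a separate step.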


Students may be assigned to review one, two, or three programs for each review assignment. This allows students a better chance to learn through the peer review process because they can see more than one additional response to the assignment and receive more feedback about their solutions. The form to create peer review assignments is shown in Figure 3.

Figure 3. Peer review assignment creation form
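The paper does not spell out how submissions are matched to reviewers. One simple scheme consistent with the description above (each student reviews one, two, or three peers, never their own work, and every submission is reviewed the same number of times) is a cyclic rotation over a shuffled roster. The function below is an illustrative sketch, not PeerSpace’s actual algorithm:

```python
import random

def assign_reviews(students, k, seed=None):
    """Assign each student k peers to review, with no self-reviews and
    each submission reviewed exactly k times.

    Illustrative sketch of one possible scheme: shuffle the roster,
    then have the student at position i review the submissions at
    offsets 1..k around the circle.
    """
    if not 1 <= k < len(students):
        raise ValueError("need 1 <= k < number of students")
    order = list(students)
    random.Random(seed).shuffle(order)  # seeded for reproducibility
    n = len(order)
    return {order[i]: [order[(i + j) % n] for j in range(1, k + 1)]
            for i in range(n)}

assignments = assign_reviews(["ann", "bo", "cy", "dee", "ed"], k=2, seed=42)
for reviewer, authors in assignments.items():
    assert reviewer not in authors  # no one reviews their own work
```

Because every offset 1..k is nonzero modulo n, no student is ever assigned their own submission, and each submission receives exactly k reviews.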

Instructors can guide the peer review process by providing a review rubric for students to follow when performing a review. This allows customization of peer review assignments to fit the assignment being reviewed. The rubric allows instructors to create custom point values for criteria provided in the rubric as well as section headers to control the flow of the review and comments to clarify how to assess specific parts of the assignment. Instructors can preview how their rubric will be displayed to the students.

3.3 Manage and Monitor Peer Review Assignments

Once a peer review assignment has been created, it can be assigned to the students manually or automatically. If the instructor wants to assign it immediately after creating it, it can be assigned manually; otherwise, it is assigned automatically at a pre-specified time.

After a peer review assignment has been created, it can be edited if small changes need to be made. If the peer review assignment has yet to be assigned, then instructors can alter the assign date and time. The instructor can also change the deadline of the peer review assignment.

Peer review assignments can be deleted at any time. Deleting a peer review assignment before it has been assigned simply deletes the template used to assign the peer reviews to the students. If a peer review assignment is deleted after it has been assigned, then the entire assignment is purged from the system, including all of the students’ files and peer reviews. Peer review assignments can be deleted one at a time or collectively; the collective delete operation can be used at the end of a semester to clear all student assignments from PeerSpace.

After a peer review assignment has been assigned, instructors can view the progress of the class as well as the individuals through the “Grade” view. A list of all assigned peer reviews for a particular assignment is available in the Peer Review module for the instructor to see which reviews have been completed. A fully completed review listing contains the author’s name, picture, and score the author received as well as the reviewer’s name, picture, and rating the reviewer received. The list of peer reviews can be sorted by author, score, reviewer, or rating to make it easier to view the progress and performance of the review assignments.

Figure 4. Partial view of a peer review completed by a student

The instructor can view an individual peer review assignment as soon as it has been reviewed. The instructor view contains all of the author’s work, the reviewer’s assessment and comments, as well as the score. Figure 4 shows a partial view of a peer review completed by a student.


After the review, the author has a chance to rate the feedback he or she received. The rating reflects the author’s perceived quality of the review: the author evaluates the reviewer’s accuracy, helpfulness, knowledge, and fairness. Each category used in the rating process can be given a score from one (lowest) to five (highest). Therefore, the lowest possible rating is four and the highest is twenty. Both the score and the rating are shown to the instructor when an individual peer review assignment is viewed. An example can be seen in Figure 5.

Figure 5. Results of an individual peer review assignment
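The rating arithmetic described above (four criteria, each scored 1 to 5, for totals between 4 and 20) can be written down directly. The criterion names follow the text; the function itself is an illustrative sketch:

```python
def review_rating(accuracy, helpfulness, knowledge, fairness):
    """Total an author's rating of a review: four criteria, each
    scored 1 (lowest) to 5 (highest), so totals range from 4 to 20."""
    scores = (accuracy, helpfulness, knowledge, fairness)
    if any(not 1 <= s <= 5 for s in scores):
        raise ValueError("each criterion must be scored 1-5")
    return sum(scores)

assert review_rating(1, 1, 1, 1) == 4    # lowest possible rating
assert review_rating(5, 5, 5, 5) == 20   # highest possible rating
```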

4 EXPERIMENTAL ANALYSIS

Students from two sections of CS2 participated in this study. The same instructor taught both sections, and the same set of assignments and exams was used. Each section had 27 students. At the beginning of the semester, one section was chosen as the experiment group and the other as the control group. Students in both groups had access to PeerSpace and all its tools. The only difference between the two was that students in the experiment group performed peer code review using the peer code review tool of PeerSpace, while no peer code review was performed in the control group. Peer code review started in the first part of the semester. Each student was required to review two programs submitted by other students in the class. During the semester, average scores on programming assignments, tests, and final course averages were collected and compared between the experiment and control sections. Furthermore, a Modified Group Environment Questionnaire (MGEQ) [20] was given to students in both sections at the end of the semester to collect and compare their sense of the group learning environment. The MGEQ features four subcategories that consider individual attraction to the group and group integration, along both social and learning dimensions. These categories are formed from groupings of the eighteen questions found in the MGEQ. Table 1 lists all the questions under their respective categories.

All questions use a Likert scale with “strongly agree” having a value of 1 and “strongly disagree” having a value of 5. For questions 1, 2, 3, 4, 6, 7, 11, 13, 14, 17, and 18, “strongly disagree” is desired so a higher score is better. Therefore, a lower score is better on all the remaining questions.

Each category has an overall value calculated by summing the results of its questions. Questions where “strongly agree” is desired are flipped so that all questions are uniform in their worth, which allows category-level comparisons. Both grades and survey results are analyzed using a two-tailed t-test with a threshold of 0.1.
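The scoring rule above (Likert items coded 1–5, with the items where “strongly agree” is desired flipped so that higher is uniformly better) can be sketched as follows. The question groupings follow Table 1; the code itself is illustrative, not the authors’ analysis script:

```python
# Items where "strongly disagree" is desired (higher raw score is better);
# the remaining items are flipped (score -> 6 - score) so that a higher
# value is uniformly better across all eighteen questions.
REVERSE_KEPT = {1, 2, 3, 4, 6, 7, 11, 13, 14, 17, 18}

CATEGORIES = {  # question numbers per MGEQ subcategory, as in Table 1
    "IAG-Social":   [1, 3, 5, 7],
    "IAG-Learning": [2, 4, 6, 8, 9],
    "GI-Social":    [11, 13, 15, 17],
    "GI-Learning":  [10, 12, 14, 16, 18],
}

def category_totals(responses):
    """responses: dict mapping question number -> raw Likert score (1-5).
    Returns each subcategory's total after flipping the items where
    "strongly agree" is desired."""
    def adjusted(q):
        score = responses[q]
        return score if q in REVERSE_KEPT else 6 - score
    return {name: sum(adjusted(q) for q in questions)
            for name, questions in CATEGORIES.items()}

# A respondent answering "strongly disagree" (5) to every item scores the
# maximum on the starred items and the minimum on the flipped ones.
totals = category_totals({q: 5 for q in range(1, 19)})
```

These per-student category totals are the values averaged in Tables 4 and 5.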

Table 1. MGEQ questions in four categories

A. Individual Attraction to Group (Social):
 1. I do not enjoy interacting with members of this class.*
 3. I am not going to miss the members of this class when the semester ends.*
 5. Some of my best college acquaintances/friends are from this class.
 7. I enjoy the atmosphere in other classes more than this class.*

B. Individual Attraction to Group (Learning):
 2. I’m unhappy that this class is not challenging enough.*
 4. I am unhappy with the amount of effort my fellow classmates devote to this class.*
 6. This class does not give me enough opportunities to improve my programming skills.*
 8. I like the way group activities enhance the class.
 9. This class is one of my most important college classes.

C. Group Integration (Social):
11. Members of our class would rather work alone than work in groups.*
13. Class members rarely interact with one another.*
15. Members of this class would enjoy spending time together outside of class.
17. Members of this class do not interact outside of lectures and labs.*

D. Group Integration (Learning):
10. Class members help one another.
12. Instead of blaming someone else, members of our class generally take personal responsibility if they are unable to complete an assignment.
14. Class members have conflicting aspirations for what they want out of the class.*
16. If members of this class have problems with the material, other members of the class express concern and help them with these problems.
18. Class members are reluctant to communicate with one another.*

(* marks questions where “strongly disagree” is the desired response.)

4.1 The effectiveness of peer code review on improving student learning

Table 2 presents the average scores on programming assignments, tests, and final course averages of the two groups. Statistically significantly better student learning performance was observed in the experiment class in all three areas. The programming assignments (PA) were examined individually to see if peer review had an effect from assignment-to-assignment. Because the experiment section had a higher average on every PA, the progression and details of each PA were reviewed. Table 3 contains the average of each PA for the control and experiment sections. The first peer review assignment was over PA2. Therefore, it is only to be expected that the first two PAs show little difference between


the control and experiment sections. More importantly, PA3, PA7, PA8, and PA9 all show statistically significant differences compared with the control section. As the experiment section completed more peer review assignments, the difference between the two sections grew larger. These results show that the peer code review activity led to better student performance on programming assignments as well as in overall learning in the course.

Table 2. Average scores of the experiment and the control section students

            Control Avg. (Std.)   Experiment Avg. (Std.)   T-Test
Programs    73.05 (21.84)         88.93 (12.37)            0.013
Tests       65.08 (26.76)         82.29 (17.73)            0.033
Overall     67.63 (25.03)         87.09 (11.88)            0.006

Table 3. Average scores of the 9 programming assignments in the experiment and control sections

CS2 PAs   Control Avg. (Std.)   Experiment Avg. (Std.)   Difference   T-Test
PA 1      92.22 (17.41)         94.78 (19.58)             02.56       0.691
PA 2      92.38 (08.70)         93.78 (10.82)             01.40       0.682
PA 3      89.31 (11.24)         95.72 (09.30)             06.41       0.078
PA 4      73.31 (32.96)         82.00 (34.84)             08.69       0.462
PA 5      71.38 (37.45)         85.94 (25.48)             14.57       0.190
PA 6      72.94 (37.59)         85.00 (23.46)             12.06       0.264
PA 7      58.19 (41.94)         92.39 (09.91)             34.20       0.002
PA 8      57.38 (47.78)         90.06 (24.14)             32.68       0.015
PA 9      50.31 (44.94)         80.67 (31.71)             30.35       0.028

4.2 The effectiveness of peer review on improving students’ sense of collaborative learning environment

Responses from the CS2 students in the experiment group show that, at the end of the semester, they had a significantly higher sense of learning community and a better group learning environment than those in the control group. Group environment survey results (Table 4) show that the differences observed in the mean values of Individual Attraction to the Group (social), Group Integration (social), and Group Integration (learning) are statistically significant between the two groups, i.e., the two-tailed t-test results are less than 0.1. T-tests on the individual questions show that the student responses from the experiment group are significantly higher than those from the control group on the following questions:

a. I do not enjoy interacting with members of this class.
b. I am unhappy with the amount of effort my fellow classmates devote to this class.
c. Class members help one another.
d. Members of our class would rather work alone than work in groups.
e. Class members rarely interact with one another.
f. Members of this class would enjoy spending time together outside of class.
g. If members of this class have problems with the material, other members of the class express concern and help them with these problems.
h. Members of this class do not interact outside of lectures and labs.

This shows that the students in the experiment group interacted more with their classmates, were more willing to work collaboratively, and were more willing to help each other, resulting in a better overall sense of classroom learning community.

Table 4. CS2 MGEQ subcategory results

Category                                    Control Avg. (Std.)   Experiment Avg. (Std.)   T-Test
Individual Attraction to Group (Social)     11.77 (2.12)          13.94 (2.79)             0.0290
Individual Attraction to Group (Learning)   18.23 (2.83)          19.63 (2.98)             0.2117
Group Integration (Social)                  10.00 (4.18)          15.00 (2.97)             0.0008
Group Integration (Learning)                16.30 (3.75)          18.88 (2.55)             0.0375

4.3 The effectiveness of peer review on improving students’ sense of community

Table 5 compares the students' sense of community between the experiment and the control group along two dimensions: learning (Learning in community) and social interaction (Connectedness in community). The students in the experiment group show a significantly higher sense of belonging to a learning community than the students in the control group. While the responses also show that the average sense of connectedness among the students in the experiment group is slightly lower than that of the control group, the overall sense of community in the experiment group is still higher, though not statistically significantly so.

Table 5. Comparing student responses about their sense of community between the control and the experiment groups

                             Control Avg. (Std.)   Experiment Avg. (Std.)   T-Test
Connectedness in community   32.6 (6.9)            31.9 (2.3)               0.72
Learning in community        29.8 (5.8)            33.1 (3.3)               0.06
Overall sense of community   62.4 (9.5)            65.0 (4.0)               0.32

5 CONCLUSIONS AND FUTURE WORK

In this paper, we described an automated peer review system comprising a complete suite of instructor tools to facilitate a peer review process. We promulgated an approach that combines peer review with a social network to create a centralized place for students to complete work and contribute to the community, allowing the social and learning components to reinforce each other and motivate students. We observed that this approach makes the peer review process seamless and gives both instructors and students a sense of continuity.

Additionally, we presented the results of experiments that sought to ascertain the effectiveness of peer review on student learning and on promoting collaborative learning. The results show that our approach to peer review benefits students both in learning and in fostering a peer-friendly collaborative environment. The learning performance results show that the experiment section earned significantly higher grades in all types of assignments as well as a better sense of community. We conjecture that the sense of community grew because peer review is a collaborative learning activity whose constructive criticism pulls students closer together.

As future work, we plan to give instructors the ability to manually pair peer reviewers; this would allow instructors to exercise greater control over the peer review process. Additionally, we are continuing to gather data from controlled experiments in additional course offerings and subjects with a view to establishing greater external validity of our positive results.

6 ACKNOWLEDGMENT

This work is funded by the NSF CCLI/TUES grant (DUE-0837385), NSF REU grant (IIS-1038172), and the Sponsored Research program at MTSU.

7 REFERENCES

[1] C. Bauer, K. Figl, M. Derntl, P. P. Beran, and S. Kabicher, “The student view on online peer reviews,” SIGCSE Bull. 41(3), pp. 26-30, July, 2009.

[2] B. Bloom, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain. David McKay Co., 1956.

[3] Z. J. Dong, C. Li, and R. H. Untch, “Build peer support network for CS2 students,” In Proceedings of the 49th Annual Southeast Regional Conference. Kennesaw, GA, USA, pp. 42-47, 2011.

[4] Drupal. Retrieved February 29, 2012, from Drupal: http://drupal.org.

[5] Elgg. Retrieved February 29, 2012, from Curverider Limited: http://elgg.org.

[6] E. F. Gehringer, “Electronic peer review and peer grading in computer-science courses,” SIGCSE Bull. 33(1), pp. 139-143, February 2001.

[7] E. F. Gehringer, “Building resources for teaching computer architecture through electronic peer review,” In Proceedings of the 2003 workshop on Computer architecture education: Held in conjunction with the 30th International Symposium on Computer Architecture (WCAE '03). San Diego, CA, USA, Article 9, 2003.

[8] H. Hämäläinen, J. Tarkkonen, K. Heikkinen, J. Ikonen, and J. Porras. “Use of Peer-Review System for Enhancing Learning of Programming,” Ninth IEEE International Conference on Advanced Learning Technologies (ICALT ’09), Riga, Latvia, July 15-17, pp. 658-660, 2009.

[9] H. Hämäläinen, V. Hyyrynen, J. Ikonen, and J. Porras, “MyPeerReview: an online peer-reviewing system for programming courses,” In Proceedings of the 10th Koli Calling International Conference on Computing Education Research, Berlin, Germany, pp. 94-99, 2010.

[10] C. Hundhausen, A. Agrawal, D. Fairbrother, and M. Trevisan, “Integrating pedagogical code reviews into a CS 1 course: an empirical study,” In Proceedings of the 40th ACM technical symposium on computer science education. Chattanooga, TN, USA, pp. 291-295, 2009.

[11] C. Hundhausen, A. Agrawal, and K. Ryan. “The design of an online environment to support pedagogical code reviews,” In Proceedings of the 41st ACM technical symposium on computer science education. Milwaukee, WI, USA, pp. 182-186, 2010.

[12] V. Hyyrynen, H. Hämäläinen, J. Ikonen, and J. Porras, “MyPeerReview: an online peer-reviewing system for programming courses,” In Proceedings of the 10th Koli Calling International Conference on Computing Education Research, Berlin, Germany, pp. 94-99, 2010.

[13] P. Kinnunen, and L. Malmi, “Why students drop out CS1 course?” In: Proceedings of the 2nd International Workshop on Computing Education Research. Canterbury, UK, pp. 97-108, Sept. 2006.

[14] C. Li, Z. J. Dong, R. H. Untch, M. Chasteen, and N. Reale, “PeerSpace - An Online Collaborative Learning Environment for Computer Science Students,” IEEE 11th International Conference on Advanced Learning Technologies, Athens, GA, pp. 409-411, 2011.

[15] MyPeerReview. Retrieved February 29, 2012, from SourceForge: http://sourceforge.net/projects/mypeerreview/.

[16] MyReview. Retrieved February 29, 2012, from SourceForge: http://myreview.sourceforge.net.

[17] A. Perez-Prado, and M. A. Thirunarayanan, “Qualitative comparison of online and classroom-based sections of a course: Exploring student perspectives,” Educational Media International, 9(2), pp.195-202, 2002.

[18] K. Reily, P. L. Finnerty, and L. Terveen. “Two peers are better than one: aggregating peer reviews for computing assignments is surprisingly accurate,” In Proceedings of the ACM 2009 international conference on Supporting group work (GROUP '09), Sanibel Island, FL, pp. 115-124, 2009.

[19] A. Repenning, A. Basawapatna, and K. H. Koh, “Making university education more like middle school computer club: facilitating the flow of inspiration,” In Proceedings of the 14th Western Canadian Conference on Computing Education (WCCCE '09), Roelof Brouwer, Diana Cukierman, and George Tsiknis (Eds.), Burnaby, British Columbia, USA, pp. 9-16, 2009.

[20] A. P. Rovai, “Development of an instrument to measure classroom community,” The Internet and Higher Education. 5:197-211, 2002.

[21] J. Sitthiworachart and M. Joy, “Effective peer assessment for learning computer programming,” In Proceedings of the 9th annual SIGCSE conference on Innovation and technology in computer science education. Leeds, UK, June 28-30, pp. 122-126, 2004.

[22] H. Sondergaard, “Learning from and with peers: the different roles of student peer reviewing,” In Proceedings of the 14th annual ACM SIGCSE conference on Innovation and technology in computer science education (ITiCSE '09), Paris, France. pp. 31-35, 2009.

[23] K. Swan, “Building learning communities in online courses: the importance of interaction,” Education Communication and Information, 2(1), pp. 23-49, 2002.

[24] K. Topping, “Peer Assessment Between Students in Colleges and Universities,” Review of Educational Research, 68(3), pp. 249 -276, 1998.

[25] A. Trivedi, D. C. Kar, and H. Patterson-McNeill, “Automatic assignment management and peer evaluation,” J. Comput. Small Coll. 18(4), pp. 30-37, April 2003.

[26] D. A. Trytten, “A design for team peer code review”, In Proceedings of the 36th SIGCSE technical symposium on computer science education. St. Louis, MO, USA, pp. 455-459, 2005.

[27] W. M. Waite and P. M. Leonardi, “Student culture vs. group work in computer science,” In Proceedings of the 35th SIGCSE Technical Symposium on Computer Science Education, Norfolk, VA, USA, pp. 12-16, March 2004.

[28] Y. Q. Wang, Y. J. Li, M. Collins, and P. Liu, “Process improvement of peer code review and behavior analysis of its participants,” SIGCSE Bull. 40(1), pp. 107-111, March 2008.

[29] W. J. Wolfe, “Online student peer reviews,” In Proceedings of the 5th conference on Information technology education, Salt Lake City, UT, USA, pp. 33-37, 2004.