
Animated Examples as Practice Content in a Java Programming Course
Roya Hosseini, Teemu Sirkiä, Julio Guerra, Peter Brusilovsky, Lauri Malmi

SIGCSE 2016


Hello everyone! In this talk I will present our joint work with collaborators at Aalto University in Finland.

www.techinasia.com


To learn to program, we first need to understand how a program works. Perhaps this comic is a good illustration of the importance of understanding program dynamics.

Understanding Program Dynamics


Understanding program dynamics, i.e., how program execution is carried out in computer memory, has been one of the central challenges of learning to program.

It is important to make the execution process visible to students, especially novices who are new to programming, to help them easily see and follow the important steps and avoid programming misconceptions caused by code not behaving as expected.

Many methods and tools have been developed to support students in understanding program dynamics. One advanced method is program visualization, which offers animated examples to promote a deeper understanding of code semantics.

Jeliot 3 [Moreno et al., 2004]

One well-known tool for this is Jeliot 3. As you can see from this example, it provides a visual representation of the program's execution.
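To make this concrete, here is a minimal Java program of the kind such visualizers typically animate; the class name and values are illustrative, not taken from the talk. A tool like Jeliot 3 would show each assignment, call, and return as an animated step in memory.

```java
// A small Java program of the kind a visualizer like Jeliot 3 steps through.
// Each assignment, call, and return would be shown as an animated step in memory.
public class SumExample {

    // Computes the sum of the first n positive integers with a simple loop.
    static int sumUpTo(int n) {
        int total = 0;
        for (int i = 1; i <= n; i++) {
            total += i;             // visualized as repeated updates to 'total'
        }
        return total;
    }

    public static void main(String[] args) {
        int result = sumUpTo(5);    // visualized as a call frame with parameter n = 5
        System.out.println(result); // prints 15
    }
}
```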

Python Tutor

[Guo, 2013]

Another well-known tool is Python Tutor. As you can see from this example, teachers and students can use it to write Python programs in the web browser and step forwards and backwards through the execution to view the run-time state of data structures.

Were They Successful?

So, as we saw, these are amazing tools! But the question is: were they successful in practice?


Labs

Yes, they worked excellently in labs.


Classrooms

And no, because they did not work in classrooms. Why? According to a SIGCSE working group report [Naps et al. 2003], they are usually underused in classrooms by both teachers and students.

Practice System

So, people have been looking for ways to solve this low-usage problem. One suggested solution is packaging interactive learning content into online practice systems that students can access for their own benefit, even though they receive no additional credit for it.


Teachers use it. Students don't use it.

A good example of such practice systems is CodingBat.com, an online site of interactive coding problems. Students write the code for each problem and receive immediate feedback.

This is really good, but it does not solve the problem completely! It engages teachers: they no longer need to master these tools to integrate the interactive content into their teaching. Instead, they just provide students with a link to the practice system.

But the problem of students not using the systems remains. Can we simply solve it by turning these examples from practice content into mandatory content? Apparently not, because that can lead to mindless clicking through the content to gain points without any understanding.

So, our goal was to solve this problem by motivating students to use these interactive examples and then examining their value for students' learning.

Engagement
Social Comparison
Open Student Modeling
Organized Course Structure

We introduced a novel context where interactive learning resources such as animated examples can be used. This novel context is a special practice system that relies on three main features to engage students: 1) having all content in one place, organized by course topics; 2) open student modeling, which lets students monitor their own progress; and 3) social comparison, which lets students compare their progress with the progress of other students in the class. We have evidence from past work that these features of the practice system can increase students' engagement, which can in turn positively affect their learning.

Mastery Grids [Loboda et al., 2014]

The practice system that we used in our study is called Mastery Grids, which was developed in our group. This screenshot shows the interface of Mastery Grids.

Mastery Grids organizes the content into topics that are shown as the columns of the grid. The color of a cell shows the student's progress in that topic, and it gets darker as the student completes more content within that topic.

The first row, labeled "Me", shows the progress of the student in each topic, and the other two rows provide social comparison visualizations. The last row, labeled "Group", lets the student see the progress of the class in each of the topics, and the middle row, labeled "Me vs Group", lets the student compare his/her progress with the group.


If the student clicks on a cell in the grid, she/he can see resources related to that topic. A click on a resource cell will show that resource in a new window that overlays Mastery Grids.

Mastery Grids integrates three kinds of content: problems, annotated examples, and animated examples, all hosted by external content servers.

Demo

Problems:

Problems are parameterized code evaluation exercises, which the student can repeat several times with different parameters. Problems include automatic evaluation and feedback, telling the student whether the response was correct or wrong.
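As a rough illustration (this is a hypothetical sketch, not the actual implementation behind the course's problems), a parameterized code-evaluation exercise can be thought of as a fixed template plus a randomized parameter and an automatic correctness check:

```java
import java.util.Random;
import java.util.Scanner;

// A hedged sketch of a parameterized code-evaluation exercise: the template is
// fixed, a parameter is randomized on every attempt, and the student's predicted
// result is checked automatically. Names and the concrete snippet are illustrative.
public class ParameterizedProblemSketch {

    public static void main(String[] args) {
        int a = new Random().nextInt(9) + 1;   // fresh parameter on each attempt

        System.out.println("What does this code print?");
        System.out.println("int x = " + a + ";");
        System.out.println("x = x * 2 + 1;");
        System.out.println("System.out.println(x);");

        int expected = a * 2 + 1;              // evaluate the snippet to get the answer
        int answer = new Scanner(System.in).nextInt();

        // Immediate correct/incorrect feedback, as described in the talk.
        System.out.println(answer == expected ? "Correct!" : "Incorrect, try again.");
    }
}
```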


Annotated examples:

Annotated examples use a simpler technology to provide interactive exploration of programming examples. They turn example code into a worked-out example by adding explanations to the example lines that show how a specific goal is achieved. Students' work with the examples is logged, such as which lines were clicked and how much time they spent on each line.
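For illustration, here is a sketch of what such a worked-out example might look like, with the clickable line explanations rendered as comments; the concrete code and wording are hypothetical, not taken from the course materials.

```java
// A sketch of an annotated (worked-out) example: in the actual tool the student
// clicks a line to reveal its explanation; here the explanations are shown as
// comments. The code and explanation text are illustrative only.
public class MaxOfArray {

    public static void main(String[] args) {
        int[] values = {4, 17, 9, 3};               // the array we search for the maximum

        int max = values[0];                        // assume the first element is largest
        for (int i = 1; i < values.length; i++) {   // visit every remaining element once
            if (values[i] > max) {                  // compare the current element with the best so far
                max = values[i];                    // remember the new maximum
            }
        }
        System.out.println("Maximum: " + max);      // prints "Maximum: 17"
    }
}
```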


Animated examples: Animated examples are created using a JavaScript library called jsvee. Every time a student uses the controls, a log entry with the timestamp and current step number is created.

Classroom Studies: PITT 2014 (n=56), PITT 2015 (n=33), WSSU 2015 (n=20)

Practice System: Mastery Grids
Domain: Java Programming
3 types of interactive content in 19 topics

To see if animated examples bring more advantage than examples with simpler technology, we compared them with annotated examples.

Value of animated examples

Baseline: annotated examples

Data Analysis

In this paper, we examine the value of animated program examples as practice content for both educational impact and prospects for engagement. To examine the added value of animated examples, we compare them to annotated examples, which are a more traditional way to support students in learning to understand programs.

Measures:
Engagement
Problem-solving performance
Learning gain
Grade

So we wanted to examine the value of animated examples in Mastery Grids. We analyzed the data collected from student use of the system and looked at four measures: engagement, problem-solving performance, learning gain, and grade.

Avg. % examples viewed by user — Annotated: 33% (SE=3%); Animated: 24% (SE=3%)

Animated examples were more engaging! To measure engagement, we looked at the amount of work with practice content done by the students. We calculated the average completion by dividing the number of clicks actually made by the total number of clicks needed to view the whole example. Only started examples were taken into account. For annotated examples, a click means an action to view an explanation, and for animated examples, a click means a step to move forward.
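In formula form (my notation, one plausible reading of the measure described above), the completion score for a user over the set S of examples the user started is:

```latex
\text{completion} \;=\; \frac{1}{|S|} \sum_{e \in S}
  \frac{\text{clicks made on } e}{\text{clicks needed to view all of } e}
```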

As you can see from this chart, the percentage of completion was significantly higher for animated examples. This indicates that animated examples motivated students to follow more lines, 23.6% more than annotated examples (the difference was significant using a non-parametric test).

Avg. total time (sec) spent on examples by user — Annotated: 2196.5 (SE=285.6); Animated: 1585.7 (SE=426.5)

Students not only completed more of the animations, but they also stayed longer in each animated example. As you can see from this chart, the time spent on each animated example is about twice the time spent on an annotated example. This difference was also significant (using a non-parametric test).

One thing that I want to mention is the difference between the nature of animated examples and annotated examples. In animated examples, students need to click through every line to get to a line of interest, whereas in annotated examples they can freely select any line they want. The interesting observation was that despite this, students were still willing to see all lines and completed on average 95% of the animated examples they started.

Regression Analysis
[Diagram: example attempts → knowledge (+); knowledge → example attempts (-)]

We ran step-wise regression to see the impact of examples on grade, performance, and learning gain; our factors were example attempts, Mastery Grids actions, pretest, gender, etc.

One thing I want to mention here is that there is a two-way relationship between work with examples and knowledge level. One direction is that examples do help students to increase their knowledge (a positive connection between student work with examples and performance); the other direction is negative: due to free content choice, lower knowledge and failures can lead to increased use of examples (a negative connection between examples and performance). Despite this two-way effect, regression can show which of the two directions has the more dominant effect.

IV                      Impact on #correct problems
#Animated ex.           +
Time on annotated ex.   -

Step-wise regression (N = 109)
DV: #correct problems
IV: pretest, gender, group, usage measures

This table shows the reliable predictors (significant at 0.05) related to work on examples in the best model found by step-wise regression. We found that attempting animated examples positively affected the number of correct problem attempts. Students had likely learned more from animated examples, as they answered more questions correctly. So, the positive impact of animated examples overcame the process associating examples with poor knowledge. In contrast, annotated examples appeared to be less useful for learning, allowing the reverse process to overcome the positive impact on knowledge.
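Schematically (my notation, a sketch of the model form rather than the exact fitted equation), the relationship summarized in the table is:

```latex
\#\text{correct problems} \;\approx\; \beta_0
  + \beta_1\,(\#\text{animated examples attempted})
  - \beta_2\,(\text{time on annotated examples})
  + \dots, \qquad \beta_1, \beta_2 > 0
```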

Step-wise regression (N = 75)
DV: post-test
IV: pretest, gender, group, usage measures

IV                      Impact on post-test
#Animated ex.           +
#Annotated ex.          -

The switch from annotated to animated examples turned the connection between the amount of work and the post-test score from negative to positive. Regression analysis showed that a view of an annotated example decreases the post-test score, while each explored animated example increases it. This shows that animated examples had a positive effect on students' learning, increasing their post-test scores.

IV                      Impact on grade
#Animated ex.           +
#Annotated ex.          -

Step-wise regression (N = 75)
DV: grade
IV: pretest, gender, group, usage measures

This is another case where the switch to animated examples turns the balance positive: the number of views of annotated examples had a negative effect on the course grade, while views of animated examples positively influenced the course grade. This shows that animated examples had a positive effect on students' course grades.

Technology Pays Back!
Example completion
Student performance
More work → higher performance

Compared to traditional annotated examples, animated examples:

- better engaged students, increasing their interest in completing examples
- provided a better impact on several performance measures, such as problem-solving success, post-test scores, and course grade
- turned the relationship between the amount of work with examples and performance from negative to positive

Guiding the student within an animation:
- saves time
- assures viewing of the must-see parts


- letting users skip parts of the animation that they already know
- recommending lines of the example that help the user with what they do not know

Selected References

Mastery Grids
Loboda et al. (2014). Mastery Grids: An open source social educational progress visualization. In Open Learning and Teaching in Educational Communities (pp. 235-248).

Animated Examples
Sirkiä, T. (2013). A JavaScript library for visualizing program execution. The 13th Koli Calling International Conference on Computing Education Research (pp. 189-190).

WebEx
Brusilovsky et al. (2009). Problem solving examples as first class objects in educational digital libraries: Three obstacles to overcome. Journal of Educational Multimedia and Hypermedia, 18(3) (pp. 267-288).


Jeliot 3
Moreno et al. (2004). Visualizing programs with Jeliot 3. Proc. of the Working Conference on Advanced Visual Interfaces (pp. 373-376).

Python Tutor
Guo, P. J. (2013). Online Python Tutor: Embeddable web-based program visualization for CS education. Proc. of the 44th ACM Technical Symposium on Computer Science Education (pp. 579-584).

QuizJET
Hsiao et al. (2008). Web-based parameterized questions for object-oriented programming. Proc. of the World Conference on E-Learning (pp. 17-21).

Practice System: http://people.cs.pitt.edu/~hosseini/mg.html

Contact: [email protected]