
Andrianna Jobin

K-6 Software Evaluation

Evaluating software for young learners

Agenda

1. Overview of selection process
2. Evaluation criteria
3. Justification of criteria
4. Critical analysis
5. References

Evaluation Cycle

Needs Analysis
Site Survey
Formative Evaluation
Summative Evaluation

Evaluation Cycle

We are here:
Develop Evaluation Criteria
Critical Evaluation

Current Evaluation Process

Phase 1: Site Survey & Basic Criteria
Phase 2: Detailed Evaluation Criteria & Ratings
Phase 3: Critical Analysis

Basic Criteria

AGE: Could it be used by K-6 aged students?

CLASSROOM: Could it be used in a classroom setting?

EASE OF USE: Could it be used by students on their own after the first time?

CURRICULAR VALUE: Does it teach or reinforce one of the standards or something in the curriculum?

Site Survey

www.exploratorium.edu
www.aplusmath.com
www.kidsmath.com
www.scienceu.com/geometry
www.primarygames.com
http://members.aol.com/_ht_a/iongoal/index.htm
http://education.jlab.org/indexpages/elementgames.html
www.funbrain.com
www.visualfractions.com
www.brainpop.com/
http://www.auditory-processing.com/gamegoo/gooey.html
www.mathcats.com

According to our basic criteria, we selected this one.

Categories for Detailed Evaluation

Structure
Goals
Pedagogical approaches & Learning styles
Feedback & Interaction
Motivational elements
Ease of Use
Personalization
Relevance
Curriculum and Content matter
Visual Design & Technical Consideration (??)
Interface Design (do we have/need this??)

Rating system

RATING  DESCRIPTION

5  Very strong in this area
4  Good in this area
3  Not especially good or weak in this area
2  Weak in this area
1  Very weak or totally lacking in this area

The scale is a classic 5-level rating scale that includes a middle/neutral rating, so raters are not forced into a positive or negative decision.

Overall ratings per category

CATEGORY  RATING

I     Structure
II    Goals
III   Pedagogy & Behaviorism
IV    Feedback & Interaction
V     Motivational elements
VI    Ease of Use
VII   Personalization
VIII  Relevance
IX    Curriculum and Content matter
X     Visual Design
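The deck does not say how the ten or so per-item ratings in each category are combined into the single category rating above; a simple average, rounded to the nearest point on the 1-5 scale, is one plausible roll-up. The Python sketch below illustrates only that assumption; the category names and item ratings in it are placeholders, not data from this evaluation.

```python
# Minimal sketch: roll per-item ratings (1-5 scale) up into one rating per
# category by averaging and rounding to the nearest scale point.
# Assumption: the deck does not state the roll-up rule; averaging is illustrative only.

SCALE_DESCRIPTIONS = {
    5: "Very strong in this area",
    4: "Good in this area",
    3: "Not especially good or weak in this area",
    2: "Weak in this area",
    1: "Very weak or totally lacking in this area",
}

def category_rating(item_ratings: list[int]) -> int:
    """Average the item ratings and clamp/round to the nearest point on the 1-5 scale."""
    if not item_ratings:
        raise ValueError("a category needs at least one rated item")
    average = sum(item_ratings) / len(item_ratings)
    return max(1, min(5, round(average)))

# Placeholder ratings for two categories, purely for illustration.
ratings_by_category = {
    "I. Structure": [2, 3, 4, 4, 3, 2, 3, 4, 5, 4],
    "IV. Feedback & Interaction": [5, 5, 4, 4, 3, 4, 4, 5, 3, 5],
}

for category, items in ratings_by_category.items():
    overall = category_rating(items)
    print(f"{category}: {overall} ({SCALE_DESCRIPTIONS[overall]})")
```

Any other roll-up (a weighted average, or reporting the item ratings without aggregation) would fit the same 5-level scale; the sketch only shows the mechanics.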

Detailed ratings in each category

I. STRUCTURE  RATING

1. Does the program have correct answers?
2. Does the program have easy to follow solutions?
3. Are the rules easy for the learner to follow?
4. In addition, does the program use a limited number of rules?
5. Is the program organized so the learner can anticipate what is going to happen?
6. Is the program organized prescriptively?
7. Does the learner need to come up with alternative solutions?
8. Does the program have clear boundaries?
9. Is the problem clearly stated?
10. Are the objectives clear to the learner?

Detailed ratings in each category

II. GOALS  RATING

1. Are the goals for the program well defined?
2. Have the goals of the program been attained by the user?
3. Does the learner understand the goals of the program?
4. Does the program give the learner support if he or she needs help?
5. Is the learner acquiring new skills while using the program?
6. Is the learner being challenged to think critically about the goals?
7. How does the learner demonstrate understanding of the goals?
8. What resources are needed to help the learner obtain the goals?

Detailed ratings in each category

III. PEDAGOGY & BEHAVIORISM  RATING

1. Is the program directing the learner's learning process?
2. What desired behaviors is the learner expected to perform?
3. Does the feedback reinforce the behavior the program intends for the learner to do?
4. Can the learner's desired behaviors be observed?
5. What interventions does the program have in place?
6. Does the program offer suggestions on how the learner can improve?
7. In what ways does the program offer the learner time to practice the desired result?
8. Does the program's objective fit the desired behaviors of the learner?
9. Is there any instruction that leads the learner to the desired outcome?
10. Is the feedback given throughout the program or at the end?

Detailed ratings in each category

IV. FEEDBACK & INTERACTION  RATING

1. Do users know easily if they made a mistake? Is the signal for an error (wrong answer) clear to users?
2. Are correct answers reinforced by positive feedback?
3. Based upon observation, do the children appear to enjoy the positive feedback? Does it build confidence and a feeling of success?
4. Does the feedback reinforce content?
5. Does feedback employ meaningful graphic and sound capabilities?
6. Is the correct response provided? Does the program efficiently explain why the user's answer was incorrect?
7. Does the program recommend remediation to users? Does the feedback adjust according to the child's input?
8. Do students have a chance to correct errors?
9. Is the program forgiving of input errors such as format, capitalization, etc.?
10. When students successfully complete a challenging activity, is it followed by a "fun" activity?

Detailed ratings in each category

V. MOTIVATIONAL ELEMENTS  RATING

1. Is this program enjoyable to use?
2. Based upon observation, are the graphics appealing to the children?
3. Is the theme of the program meaningful and attractive to users?
4. Do the children return to this program time after time?
5. Can users select their own level?
6. Does it encourage users to obtain the correct answer?
7. Is it responsive to a user's actions?
8. Do the program elements match users' direct experiences?
9. Does the program provide opportunities to explore and arouse curiosity?
10. Does the duration of each activity match students' attention spans?

Detailed ratings in each category

VI. EASE OF USE  RATING

1. Based upon observation, can children use the program independently after the initial use?
2. Are the skills needed to operate the program within the child's ability level?
3. Are key menus easy to find? Is getting to the first menu quick and easy?
4. Is reading ability a prerequisite for using the program?
5. Is it easy to print?
6. Is it easy to enter or exit any activity at any point?
7. Are written materials helpful for doing activities?
8. Are users given enough opportunities to review instructions on the screen, if necessary?
9. Are icons and menu bars large and easy to select with a moving cursor?
10. Do learners feel at home with the program interface? Does it have an intuitive metaphor that tells the learner how to use the interface?

Detailed ratings in each category

VII. PERSONALIZATION  RATING

Responsiveness to user preferences
1. Does the program adjust the difficulty of tasks or information according to the children's responses, giving more or less complicated tasks as appropriate?
2. Can the interface of the program be customized by user preferences?

Learner control
3. Do the children feel like they have control and interesting choices?
4. Does the program allow the children to make choices, and does it adjust subsequent choices accordingly?
5. Does the program allow students an active role in developing personal knowledge? Does it help students explore ideas and develop their own personal knowledge?
6. To what extent are learners guided in creating any content of their own?

User tracking
7. Does the program track and record student progress?
8. While using the program, can the children see which activities they have already completed and which ones are still to be done?
9. Does the program provide periodic indication of how well the child is meeting the goals?
10. When exiting the program, does it automatically save student progress? When returning to the program, can children carry on where they left off? Are they shown an overview of what has been completed and not completed?

Detailed ratings in each category

VIII. RELEVANCE  RATING

Authenticity
1. Does the program provide authentic situations and rich contexts in which to explore the subject matter?
2. Based on observation, do children think the program feels "real" and interesting? How well can students relate to it?

Practicality
3. Can children use what they learn in real life contexts?
4. Does the program show students how the learning is useful?
5. Does the program encourage children to imagine themselves in a context where they can use the information they are learning?
6. Does the program involve real life situations or problems which children this age would encounter?
7. Does the program help students apply their learning to their own lives?

Detailed ratings in each category

IX. CURRICULUM AND CONTENT  RATING

Instructions
1. Are clear instructions available?
2. How easily can instructions be bypassed?

New terms, concepts, and vocabulary
3. Are new terms defined in words understandable to a young learner?

Challenge
4. Does the content follow an age-appropriate progression of skills?
5. Does the level of difficulty increase with progress, gradually building on prior knowledge?
6. Is the pacing of the activities age-appropriate and challenging?
7. Based on observation, does the pacing of the activities maximize children's attention spans?

Detailed ratings in each category

X. VISUAL DESIGN RATING


Critical Evaluation Phase

What age group or grade level is this most appropriate for?

What is the intended purpose of the software?
Where is the software intended to be used?
Which learning theories can be found in this program?
What do you like about the software? What makes this software Way Cool?
What don't you like about the software?
Is this software appropriate for classroom use?
How can it best be used in the classroom?

Critical Evaluation Phase

What age group or grade level is this most appropriate for?

Ages 5-9 (grades K-4)

Critical Evaluation Phase

What is the intended purpose of the software?

To help children practice basic skills such as letter recognition, computer motor-skills, typing, and basic concepts such as fact vs. fiction.

Critical Evaluation Phase

Where is the software intended to be used?

It seems intended for independent use at home, but could be used as a reward or supplement in the classroom.

Critical Evaluation Phase

Which learning theories can be found in this program?

Behaviorism features strongly in this site with all of its instant audio-visual feedback and progress ratings. However, some games do suggest extension activities which would fit a constructivist model.

Critical Evaluation Phase

What do you like about the software? What makes this software Way Cool?

It is very attractive and uses metaphors kids are familiar with, such as skateboarding jumps, to make succeeding on learning tasks seem equally cool. In addition, it is cheerful and colorful with cute sounds.

Critical Evaluation Phase

What don't you like about the software?

It may be overly stimulating to some children. It may be distracting to other children in the classroom who are doing other tasks.

Critical Evaluation Phase

Is this software appropriate for classroom use? How can it best be used in the classroom?

It is appropriate for an adjunct role in the classroom, as a reward for quick workers. It would be an excellent site for teachers to familiarize their students with for extra-curricular use.

Marketing Diagram

Very good software for supplemental use!

Excellent Visuals

Age-appropriate skills practice

Engaging Audio

Use of behaviorist principles