TRANSCRIPT
EVALUATING THE USERS' EXPERIENCE
JESSICA E. LEAUANAE
University of Hawaii, 2014, M.Ed. Candidate, Educational Technology
BACKGROUND
2011-2014: Instructional Designer
2003: BA English, BYU Provo
2011: Accepted to Master’s Educational Technology Program, University of Hawaii
This presentation is in partial fulfillment of my Master’s Project for ETEC 690
TRUE?
A. If you teach someone how drinking is bad for them, they won’t become an alcoholic.
B. If someone goes to a management training class, they'll be a good manager.
C. If someone takes a really good photography class, they’ll be a good photographer.
D. None of the above.
PROJECT PURPOSE:
Evaluate the User Experience of an e-learning environment and implement
improvements using iterative rapid prototyping.
REAL LIFE CASE SCENARIO
June 2013: Creation of Online Training Site Using Rapid Prototyping
August 2013: 250+ new hires begin utilizing site
October 2013: Need for Site Improvements and Redesign
Journey Begins
SYSTEMATIC APPROACH
How can we approach improvements systematically under a tight budget and even tighter timeframes?
WHAT DO THE EXPERTS SAY?
INSTRUCTIONAL DESIGN
IDEAL
• ADDIE Method of Design
• Formative and Summative Assessments
• Large sampling
• Can be a lengthy process
• Can be costly
REALITY
• Alternative methods of ID: Rapid Prototyping
• Targeted, real-time evaluations
• Small sampling
• Fast iterative revisions based on end-user feedback
• Less costly
• Must provide Proof of Concept before investments are made
• Discount Usability Testing
• User-centered design: the user is the central focus
• Suitable for in-house e-learning modules produced by content experts
• Reduced costs, faster results, increased customer satisfaction; yields a more successful product
*Designing an Affordable Usability Test for e-Learning Modules
STEVE KRUG: DON'T MAKE ME THINK; ROCKET SURGERY MADE EASY
DONALD KIRKPATRICK: EVALUATING TRAINING PROGRAMS
• Simplistic approach based on user reactions
• Dominant model for evaluating training programs for the last 30 years
• Strategic alignment: the degree to which training programs conform to and advance the goals of the organization
• Systematic matching of goals and outcomes
*Evaluating Distance Delivery and E-Learning. Is Kirkpatrick’s Model Relevant?
IN TODAY’S DIGITAL WORLD
THE USER IS KING
{UX}
{UX}
1. Usability Testing
{UX}
1. Usability Testing
2. Learnability Testing (Learner Experience)
ETEC 690 MASTER’S PROJECT…
• Usability testing and learner experience evaluations
• Collect and analyze data
• Redesign based on user feedback
• Rapid succession of build, test, re-design, deploy
TEST
What: Test different versions of the site
TEST
• What: Test different versions of the site
• Who: Current Virtual Instructors, Lead status
PARTICIPANT DEMOGRAPHIC
48 Participants Overall
• Virtual Instructors
• 90% Female, 10% Male
• Ages 35-44
• 69% Master's Degree
• 52% Taught Online
• 94% Have taken an online course
• 92% Passed Tech Savvy Test
TEST
• What: Test different versions of the site
• Who: Current Virtual Instructors, Lead status
• Variables to test: Kirkpatrick's Learnability + Krug's Usability
KIRKPATRICK’S TRAINING EVALUATION
• 4 different evaluations measuring learners' experience: Learner reaction, Technology dimension, Course dimension, Design dimension
• 48 participants total
• 138 evaluations completed
• 3 different iterations of site tested
• Embedded online within the site
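Percent-agreement figures like those reported per dimension on the later slides can be tallied with a short script. This is a minimal sketch only; the response format, item names, and sample data below are hypothetical illustrations, not the project's actual instrument or dataset.

```python
from collections import defaultdict

# Hypothetical responses: (dimension, survey item, agreed-with-statement?).
responses = [
    ("reaction", "Navigation confusing", True),
    ("reaction", "Navigation confusing", False),
    ("design", "Enjoyed multi-modalities", True),
    ("design", "Enjoyed multi-modalities", True),
]

def summarize(responses):
    """Percent agreement for each (dimension, item) pair."""
    counts = defaultdict(lambda: [0, 0])  # key -> [agree count, total]
    for dim, item, agreed in responses:
        counts[(dim, item)][0] += int(agreed)
        counts[(dim, item)][1] += 1
    return {k: round(100 * agree / total) for k, (agree, total) in counts.items()}

print(summarize(responses))
```

With the real 138 evaluations loaded in place of the sample list, the same function yields each slide's percentage in one pass.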
KRUG’S USABILITY TESTING FRAMEWORK
1 Usability Test
• Testing 3 different site versions
• Small test group: 8 participants divided across 3 cycles
• 18 tasks to complete: think aloud
• Testing is remote via Blackboard Collaborate
• 45-60 minutes long
• Session is recorded
• Low-fidelity prototype
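One common way to score a Krug-style test is task completion rate per participant across the 18 tasks. A sketch under stated assumptions: the participant IDs and completed-task sets below are hypothetical, and this is only one possible scoring scheme, not the project's documented one.

```python
# Hypothetical task log: participant -> set of task numbers (1-18) completed.
completed = {
    "P1": {1, 2, 3, 5},
    "P2": {1, 2, 4},
}

def completion_rate(completed, n_tasks=18):
    """Percent of the think-aloud tasks each participant finished."""
    return {p: round(100 * len(done) / n_tasks) for p, done in completed.items()}

print(completion_rate(completed))
```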
USABILITY TESTING DATA COLLECTION
TEST REVISE RE-DESIGN
{X 3 versions}
DATA ANALYSIS
USABILITY & LEARNABILITY
RAPID PROTOTYPING
Learning experiences are like journeys…
HOW WAS THE JOURNEY?
JOURNEY
TRAINING SITE
1
TIME: Built in 2 months
MONEY: Lack of resources; prototype built in Google Sites with Google Apps for Education
PROCESS: Convert analog content to digital; more than a transcription of paper-based content
PEOPLE: Me. A one-man band: content developer, graphic artist, video editor, voiceover, tech support, etc.
QUALITY ASSURANCE: None, no feedback
SITE VERSION 1: WHERE DO I BEGIN?
Or do I read the Mission?
Ohhh…surveys! I like surveys, should I jump over and start completing them?
Do I start with the Training Modules?
ITERATION 1: 11.07.13
LEARNER EXPERIENCE SUMMARY
LEARNER REACTION:
58% Navigation confusing
TECHNOLOGY DIMENSION:
41% Login not intuitive
COURSE DIMENSION:
61% Good scaffolding of concepts
47% Scope and sequence appropriate
DESIGN DIMENSION:
74% Enjoyed multi-modalities
63% Video and audio syncing not aligned
39% Different video delivery is a distraction to learning
JOURNEY
TRAINING SITE
2
SITE VERSION 2: CLUTTER IS DISTRACTING
LEARNER EXPERIENCE SUMMARY
LEARNER REACTION:
49% Navigation confusing
TECHNOLOGY DIMENSION:
33% Login not intuitive
48% Technology allowed self-pacing
COURSE DIMENSION:
72% Good scaffolding of concepts
56% Scope and sequence appropriate
DESIGN DIMENSION:
74% Enjoyed multi-modalities
70% Video and audio syncing not aligned
44% Different video delivery is a distraction to learning
JOURNEY
TRAINING SITE
3
SITE ITERATION 3: CHOICES ALTER OUTCOMES
LEARNER EXPERIENCE SUMMARY
LEARNER REACTION:
27% Navigation confusing
TECHNOLOGY DIMENSION:
39% Login not intuitive
57% Fast and responsive
COURSE DIMENSION:
88% Good scaffolding of concepts
62% Scope and sequence appropriate
DESIGN DIMENSION:
62% Enjoyed multi-modalities
59% Video and audio syncing not aligned
41% Different video delivery is a distraction to learning
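The iteration-over-iteration change behind these summaries can be computed directly from the reported figures. The numbers below are the "Navigation confusing" percentages taken from the three iteration slides; the `deltas` helper itself is an illustrative assumption, not part of the project.

```python
# "Navigation confusing" percentage per site iteration, as reported on the slides.
nav_confusing = {1: 58, 2: 49, 3: 27}

def deltas(series):
    """Point change between consecutive iterations of a metric."""
    keys = sorted(series)
    return {(a, b): series[b] - series[a] for a, b in zip(keys, keys[1:])}

print(deltas(nav_confusing))  # {(1, 2): -9, (2, 3): -22}
```

The same helper applied to other items (e.g., "Good scaffolding of concepts": 61, 72, 88) shows which dimensions improved fastest across redesigns.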
JOURNEY
TRAINING SITE
4
SITE ITERATION 4: THE USER IS KING
4 versions later…I FINALLY GOT IT RIGHT!
USABILITY TEST FINDINGS
TAKEAWAYS
• Improvements based on user feedback
• Fast deployment
• CMS/LMS issues not resolved
• Journey: rough start, smoother finish
• Alternative design approach using systematic evaluation methods
Mahalo!