José Paulo Leal | Ricardo Queirós
CRACS & INESC-Porto LA
Faculdade de Ciências, Universidade do Porto
Rua do Campo Alegre, 1021, 4169-007 Porto, PORTUGAL
A programming exercise evaluation service for Mooshak
Outline
1. Introduction: Context, Motivation, Goal
2. Architecture: eLearning Frameworks, E-Framework, Evaluation service (service genre, expression and usage model)
3. Design
4. Conclusion
1. Introduction: Context
- Experience of projects with evaluation components
  - Mooshak: contest management system for ICPC contests
  - EduJudge: use of UVA programming exercise collections in LMSs
- Emergence of eLearning frameworks
  - advocate SOA approaches to facilitate technical interoperability
  - based on a survey, the most prominent is the E-Framework (E-F)
1. Introduction: Motivation
- Integration of systems for automatic evaluation of programs
  - program evaluators are complex
  - difficult to integrate in eLearning systems (e.g. LMS)
  - program evaluators should be autonomous services
- Modelling evaluation services
  - communication with heterogeneous systems: Learning Objects Repositories (LOR), Learning Management Systems (LMS), Integrated Development Environments (IDE)
  - conformance to eLearning frameworks improves interoperability
1. Introduction: Goal
Architecture:
1. Integration of the evaluation service in an eLearning network
2. Definition of an evaluation service in an eLearning framework
3. Formalisation of the concepts related to program evaluation
Design:
1. Extend an existing contest management system
2. Expose evaluation functions as services
3. Reuse existing administration functions
2. Architecture
- eLearning frameworks
  - specialized software frameworks
  - advocate SOA to facilitate technical interoperability
  - types:
    - Abstract: creation of specifications and best practices for eLearning systems (e.g. IEEE LTSA, OKI, IMS AF)
    - Concrete: service designs and/or components that can be integrated in implementations of artifacts (e.g. SIF, E-F)
- Survey: E-F and SIF are the most promising frameworks; they are the most active projects, both with a large number of implementations worldwide.
2. Architecture
- E-Framework
  - initiative established by JISC, DEEWR, NZ MoE and SURF
  - aims to facilitate system interoperability via a SOA approach
  - has a knowledge base to support its technical model
- Components (component: description; user role):
  - Service Genre: collection of related behaviours that describe an abstract capability; non-technical expert (e.g. IT Manager)
  - Service Expression: a specific way to realize a service genre with particular interfaces and standards; technical expert (e.g. Developer)
  - Service Usage Model: the relationships among technical components (services) used for applications; domain expert (e.g. Business Analyst)
http://www.e-framework.org/
2. Architecture
- support of the online community (developers wiki)
- contribution to the E-Framework: Service Genre (SG), Service Expression (SE), Service Usage Model (SUM)
2. Architecture - SG
- Text File Evaluation Service Genre
  - responsible for the assessment of a text file
    - text file with an attempt to solve an exercise
    - exercise described by a learning object
  - supports three functions: ListCapabilities, EvaluateSubmission, GetReport
2. Architecture - SG
- ListCapabilities function:
  - list all the capabilities supported by a specific evaluator
  - capabilities depend strongly on the evaluation domain
    - computer programming evaluator: programming language compiler
    - electronic circuit simulator: collection of gates that are allowed in a circuit
2. Architecture - SG
- EvaluateSubmission function:
  - requests an evaluation for a specific exercise
  - request includes:
    - reference to an exercise as a learning object held in a repository
    - text file with an attempt to solve a particular exercise
    - evaluator capability necessary for a proper evaluation of the attempt
  - response includes:
    - ticket for a later report request, or a detailed evaluation report
2. Architecture - SG
- GetReport function:
  - gets a report for a specific evaluation
  - the report included in the response may be transformed on the client side:
    - based on an XML stylesheet
    - able to filter out parts of the report
    - able to calculate a classification based on its data
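The three functions of the service genre can be sketched as an abstract interface; the function names come from the slides, but the signatures and types below are assumptions for illustration only.

```python
from abc import ABC, abstractmethod

class TextFileEvaluationService(ABC):
    """Sketch of the Text File Evaluation service genre.
    Function names are from the slides; signatures are assumed."""

    @abstractmethod
    def list_capabilities(self) -> list[str]:
        """List all capabilities supported by this evaluator
        (e.g. available compilers for a programming evaluator)."""

    @abstractmethod
    def evaluate_submission(self, exercise_ref: str, attempt: str,
                            capability: str) -> str:
        """Request an evaluation of an attempt (a text file) against an
        exercise (a learning object reference), using one capability.
        Returns a ticket for a later GetReport call, or (alternatively)
        a detailed evaluation report."""

    @abstractmethod
    def get_report(self, ticket: str) -> str:
        """Return the XML evaluation report identified by a ticket."""
```

A concrete evaluator (a programming judge, a circuit simulator) would subclass this and supply domain-specific behaviour for each method.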
2. Architecture - SE
- The Evaluate-Programming Exercise SE
  - requests
    - program source code
    - reference to a programming exercise as a Learning Object (LO)
  - resources
    - learning objects retrieved from a repository
    - LOs are archives with assets (test cases, description) and metadata
  - responses
    - XML document containing the evaluation report
    - details of test case evaluations
[Diagram: the Evaluation Engine takes source code plus an LO reference as input, retrieves the LO as a resource, and outputs an evaluation report.]
2. Architecture - SE
The E-Framework model contains 20 distinct elements to describe a service expression (SE).
Major E-Framework elements:
1. Behaviours & Requests
2. Use & Interactions
3. Applicable Standards
4. Interface Definition
5. Usage Scenarios
2. Architecture - SE
1. Behaviours & Requests
- details technical information about the functions of the SE
- the 3 types of request handled by the SE:
  - ListCapabilities: provides the client systems with the capabilities of a particular evaluator
  - EvaluateSubmission: allows the request of an evaluation for a specific programming exercise
  - GetReport: allows a requester to get a report for a specific evaluation using a ticket
2. Architecture - SE
2. Use & Interactions
- illustrates how the functions defined in the Behaviours & Requests section are combined to produce a workflow
[Diagram: (1) the Learning Management System sends the LO reference and attempt to the Evaluation Engine (correction and classification); (2) the Evaluation Engine requests the LO from the Learning Objects Repository using the LO reference; (3) the repository returns the LO; (4) the Evaluation Engine returns the report to the LMS.]
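The four-step workflow can be sketched in code. This is a minimal stand-in, not Mooshak's actual implementation: the repository is a plain dict, and "evaluation" is reduced to trivial string matching against stored test outputs.

```python
# Sketch of the LMS -> Evaluation Engine -> LOR workflow.
# All data structures below are illustrative assumptions.

def fetch_learning_object(repository: dict, lo_ref: str) -> dict:
    # Steps 2-3: the evaluation engine resolves the LO reference
    # against the repository, obtaining test cases and metadata.
    return repository[lo_ref]

def evaluate(attempt: str, lo: dict) -> dict:
    # Correction and classification: run the attempt against each
    # test case in the LO (here: a trivial expected-output match).
    results = [attempt == expected for expected in lo["tests"]]
    return {"accepted": all(results), "passed": sum(results)}

def handle_submission(repository: dict, lo_ref: str, attempt: str) -> dict:
    # Step 1: the LMS sends the LO reference and the attempt.
    lo = fetch_learning_object(repository, lo_ref)
    # Step 4: the report goes back to the LMS.
    return evaluate(attempt, lo)
```

For example, `handle_submission({"lo-1": {"tests": ["42", "42"]}}, "lo-1", "42")` yields `{"accepted": True, "passed": 2}`.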
2. Architecture - SE
3. Applicable Standards
- enumerates the technical standards used in the SE
  - content: IMS CP, IEEE LOM, EJ MD
  - interoperability: IMS DRI
2. Architecture - SE
4. Interface Definition
- formalizes the interfaces of the service expression
- syntax of requests and responses of SE functions
- functions exposed as SOAP and REST web services

Function: ListCapabilities
  SOAP: ERL ListCapabilities()
  REST: GET /evaluate/ > ERL
Function: EvaluateSubmission
  SOAP: ERL Evaluate(Problem, Attempt, Capability)
  REST: POST /evaluate/$CID?id=LOID < PROGRAM > ERL
Function: GetReport
  SOAP: ERL GetReport(Ticket)
  REST: GET $Ticket > ERL
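The REST rows of the table can be sketched as request builders. The base URL is a placeholder and the `$CID`/`LOID` values are the table's own symbolic parameters; none of this is an actual Mooshak endpoint.

```python
# Sketch: build (method, url, body) triples for the three REST
# functions listed in the Interface Definition table above.

def list_capabilities_request(base: str) -> tuple:
    # GET /evaluate/ returns an ERL document with the capabilities.
    return ("GET", f"{base}/evaluate/", None)

def evaluate_submission_request(base: str, cid: str, loid: str,
                                program: str) -> tuple:
    # POST /evaluate/$CID?id=LOID with the program as request body.
    return ("POST", f"{base}/evaluate/{cid}?id={loid}", program)

def get_report_request(ticket: str) -> tuple:
    # GET $Ticket: the ticket appears to double as the report URL.
    return ("GET", ticket, None)
```

These triples could then be dispatched with any HTTP client; only the URL shapes are taken from the table.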
2. Architecture - SE
4. Interface Definition
- Evaluation Response Language (ERL)
  - covers the definition of the response messages of the 3 functions
  - formalised in XML Schema
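A client consuming ERL responses would parse the XML report. The element and attribute names in this sketch are invented for illustration; the real vocabulary is fixed by the ERL XML Schema mentioned above.

```python
import xml.etree.ElementTree as ET

# Hypothetical ERL-like report; the actual schema differs.
SAMPLE = """\
<report>
  <test id="1" result="Accepted"/>
  <test id="2" result="Wrong Answer"/>
</report>"""

def summarize(erl_xml: str) -> dict:
    # Count test-case evaluations and how many were accepted.
    root = ET.fromstring(erl_xml)
    tests = root.findall("test")
    accepted = sum(1 for t in tests if t.get("result") == "Accepted")
    return {"tests": len(tests), "accepted": accepted}
```

This kind of summary is what a client-side XML stylesheet could equally compute when filtering the report or calculating a classification.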
2. Architecture - SE
5. Usage Scenarios
Learning (classes):
- Self-evaluation: feedback for wrong submissions
- Assignments: feedback & evaluation
- Exams: computes a grade
Competitive (contests):
- IOI: points for accepted test cases
- ICPC: penalizations for wrong submissions
- IEEEXtreme: high number of participants
2. Architecture - SUM
- Text File Evaluation SUM
  - describes the workflows within a domain
  - composed of SGs and SEs
  - template diagram from the E-F
  - two business processes:
    1. Archive Learning Objects
    2. Evaluate Learning Objects
2. Architecture - SUM
Business process: Archive Learning Objects (role: Teacher)
- searches for an exercise in a Learning Objects Repository (LOR)
- links the most appropriate one in a Learning Management System (LMS)
Business process: Evaluate Learning Objects (role: Student)
- gets the exercise from the LMS
- solves the exercise in a specialized resolution environment
- submits the resolution to an evaluation engine (EE)
- receives a notification with an evaluation report
3. Design
- Evaluation Service: design principles & decisions
  - support the E-Framework architecture
  - extend an existing contest management system: Mooshak
  - reuse existing functions rather than implement new ones
  - create a front controller for the service
  - maintain the administration web interface
  - map service concepts to Mooshak concepts
3. Design
- Evaluation Service: mapping service concepts to Mooshak
- Service -> Contest
  - only contests marked as serviceable
  - several contests served simultaneously
  - the same contest can be served and managed
- Capability -> Contest + Language
  - a service request specifies contest & language (within the contest)
  - controls the evaluation context
  - produces an evaluation report (XML)
3. Design
- Evaluation Service: mapping service concepts to Mooshak
- Service requester -> Team
  - IDs based on remote IP address & port
  - basis for authentication
  - useful also for auditing
- Learning Object -> Problem
  - LOs downloaded from remote repositories
  - converted to Mooshak problems
  - downloaded problems used as a cache
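The concept mapping above can be sketched as follows; the data structures are illustrative assumptions, not Mooshak's internal representation.

```python
# Sketch of the service-to-Mooshak concept mapping.
# Contest and language names below are made up for the example.

SERVICEABLE_CONTESTS = {"contest-A": {"languages": {"C", "Java"}}}

def resolve_capability(contest: str, language: str) -> tuple:
    # Capability -> Contest + Language: a request is valid only if
    # the contest is marked serviceable and supports the language.
    info = SERVICEABLE_CONTESTS.get(contest)
    if info is None or language not in info["languages"]:
        raise ValueError("unknown capability")
    return (contest, language)

def team_id(ip: str, port: int) -> str:
    # Service requester -> Team: identified by remote IP address
    # and port, used for authentication and auditing.
    return f"{ip}:{port}"
```

A request for an unlisted contest or language would be rejected, mirroring the rule that only serviceable contests answer evaluation requests.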
4. Conclusion
- Definition of an evaluation service
- Contribution to the E-Framework with a new Service Genre, Service Expression and Service Usage Model
- Validation of the proposed model with an extension of the Mooshak contest management system
- Current and future work:
  - first prototype already available
  - communication with repositories still in development
  - integration in a network of eLearning systems
  - full evaluation of this service planned for next fall
Questions?
Authors
José Paulo Leal
[email protected]
http://www.dcc.fc.up.pt/~zp
Ricardo Queirós
[email protected]
http://www.eseig.ipp.pt/docentes/raq
Thanks!