Evaluation of the interactivity of students in virtual learning environments using a multicriteria approach and data mining
This article was downloaded by: [Tufts University]
On: 03 December 2014, At: 14:09
Publisher: Taylor & Francis
Informa Ltd Registered in England and Wales. Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Behaviour & Information Technology
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/tbit20

Evaluation of the interactivity of students in virtual learning environments using a multicriteria approach and data mining
Angel Cobo (a), Rocio Rocha (b) & Carlos Rodríguez-Hoyos (c)
(a) Department of Applied Mathematics and Computer Science, University of Cantabria, Santander, Spain
(b) Department of Business Administration, University of Cantabria, Santander, Spain
(c) Department of Education, University of Cantabria, Santander, Spain
Accepted author version posted online: 25 Oct 2013. Published online: 06 Dec 2013.

To cite this article: Angel Cobo, Rocio Rocha & Carlos Rodríguez-Hoyos (2014) Evaluation of the interactivity of students in virtual learning environments using a multicriteria approach and data mining, Behaviour & Information Technology, 33:10, 1000-1012, DOI: 10.1080/0144929X.2013.853838

To link to this article: http://dx.doi.org/10.1080/0144929X.2013.853838

Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions



Evaluation of the interactivity of students in virtual learning environments using a multicriteria approach and data mining

Angel Cobo (a,*), Rocio Rocha (b) and Carlos Rodríguez-Hoyos (c)

(a) Department of Applied Mathematics and Computer Science, University of Cantabria, Santander, Spain; (b) Department of Business Administration, University of Cantabria, Santander, Spain; (c) Department of Education, University of Cantabria, Santander, Spain

(Received 23 January 2013; accepted 24 September 2013)

This work seeks to provide a new multicriteria approach to evaluate and classify the level of interactivity of students in learning management systems (LMS). We describe, step by step, the complete methodological development process of the evaluation model, as well as detailing the results obtained when applying it to a higher education teaching experience. This research demonstrates that the combined use of multicriteria decision methodologies and data mining proves to be particularly suitable for identifying behavioural patterns of users through the analysis of the records generated in LMS. The results reveal that behavioural patterns in LMS offer certain indicators as to students' academic performance, although the study does not permit us to state that those students who adopt passive attitudes in these spheres necessarily produce low academic performance.

Keywords: data mining; multicriteria analysis; AHP; e-learning; b-learning; learning management systems

1. Introduction

E-learning is very much present on an international scale at all education levels, as well as in the permanent occupational training run by public or private organisations. One of the education levels that has experienced the greatest growth over the last few years is higher education. E-learning is defined as an educational process in which communication is achieved by means of synchronous and asynchronous electronic tools that permit the construction and elaboration of knowledge (Garrison 2011). Meanwhile, blended learning is associated with educational processes that combine on-site training with training carried out via the Internet (Garrison and Kanuka 2004). At times, on-site meetings are programmed for activities that have an eminently practical aspect, to consolidate concepts, or at the end of the experience to recapitulate the most relevant ideas.

Various research studies have been carried out in different countries aiming to analyse the key factors and challenges of the implementation of e-learning in higher education institutions (O'Neill et al. 2004, Tuparova and Tuparov 2007). Research on e-learning shows its use is increasing, but the teaching methods employed are essentially the same as those used in traditional classroom instruction, since they reproduce traditional methodological strategies and pedagogical models (Aoki 2010). In this sense, we have developed this research with the aim of providing answers to the training needs detected in the previous research and to promote the improvement of tutoring processes in e-learning. We do so by developing a multicriteria model which allows teachers to calculate an indicator of the interactivity of those students who participate in training processes run by means of learning management systems (LMS). The model also allows teachers to better understand how their students make use of new technology, and even to categorise them in terms of their behavioural patterns. One of the substantial contributions of our work is that of obtaining indices of interactivity by means of multicriteria integration on the one hand, and, on the other, through the exploitation of data with data mining (DM) techniques.

The calculation of an interactivity index can provide parallel advantages for teachers when it comes to knowing student needs, for instance, by identifying those students who show lower levels of activity and thereby offering an individualised response to their problems. Likewise, they may identify students with high levels of interactivity who may in turn lead collaborative learning processes. The proposed approach uses different records, related to the different genres of participation, which can easily be obtained from the very same on-line training platforms. In any case, it is worth underlining here that this model is not set out to automatise the teaching work in LMS. Rather, it has been conceived to promote the development of the idea of autonomous and reflexive professionalism in teaching (Zeichner 2009), which in turn facilitates decision-making during the processes of curriculum development. This implies

∗Corresponding author. Email: [email protected]

© 2013 Taylor & Francis


conceiving training in LMS in a flexible and open manner, in such a way that teachers can be autonomous and responsible in order to redirect the ongoing learning processes during their development.

In order to achieve this objective, the work has been organised as follows: after this initial introduction, we present a review of the literature on interactivity and behavioural patterns of students in LMS. Following this, we examine the techniques used for implementing the model, with particular attention to the analytical hierarchy process (AHP) technique and unsupervised classification algorithms. Afterwards, we present the model and demonstrate its application to a specific case in a b-learning initiative. The work finishes with the main conclusions of the research.

2. Literature review

The evaluation of the interactivity of students in LMS has been one of the lines of research developed over recent years in this area of knowledge. First of all, it is necessary to point out that the very concept of interactivity has a remarkably polysemic nature (Martínez-Gutiérrez 2010). Although this concept was originally used to refer to the dialogic exchanges taking place between emitters and receivers in face-to-face communication processes, for some years now it has been used to designate the possibilities of certain technological applications or devices for simulating human communication that can be used in a synchronised or unsynchronised manner (Muirhead and Juwah 2004). There are, therefore, different views of interactivity, ranging from a more technological description of the idea (Gértrudix 2006) to those authors that stress its communicative slant (Stokes 2004).

When it comes to defining the parameters of student interactivity in e-learning processes, three basic categories of interrelationship have been analysed: the interchanges produced between students and the media; those between student and teacher; and those between students themselves (Peñalosa 2010). Although the development of interactivity in e-learning practices is necessarily linked with a technological dimension, research has shown that this is not enough to guarantee or promote student participation. Pena-Shaff and Nicholls (2004) demonstrated that the intrinsic characteristics of bulletin board discussions were insufficient for promoting participation: that participation depended mostly on other factors such as curriculum design, teaching practices or the very characteristics of the students. Research carried out by García and Suárez (2011) on unsynchronised bulletin board discussions in a higher education experience showed that interaction requires a communicational competence that goes beyond technical skills related to connectivity. Other studies have shown that allocating roles to students in unsynchronised communication processes may have positive effects when it comes to improving the construction of knowledge (Schellens and Valcke 2005, De Wever et al. 2010). Elsewhere, the possibilities offered by unsynchronised bulletin board discussions for supporting student participation in educational contexts have also been studied (Dringus and Ellis 2005). Some studies have shown that the amount of contributions to asynchronous discussion groups predicts a higher level of knowledge construction (Schellens et al. 2007).

Recent years have seen the publication of various research studies which have attempted to categorise or define behavioural patterns in e-learning experiences in terms of the levels of student interaction. Hong (2001) defines some of the parameters that can be used to measure activity levels in e-learning processes. This paper differentiates between two types of tasks: active ones (documents or messages emitted by a participant) and passive ones (the number of pages read or resources accessed). This research categorises active students as those who do not only read the documents placed in the on-line classroom or the messages sent via the communication tools, but also produce them. Talavera and Gaudioso (2004) were able to identify six student behavioural patterns in terms of their level of interactivity in an LMS. A first group was characterised as being highly collaborative, initiating debates and establishing links with their peers. A second group was made up of those who were very motivated to participate in an LMS but whose contributions hardly aroused the interest of the rest of their classmates. A third group showed a completely passive behaviour pattern and either presented problems when working in a collaborative way or were not sufficiently motivated. A fourth group was composed of those who interacted and promoted discussions on topics irrelevant to the course objectives. A fifth group comprised those with a medium level of interaction who carried out tasks quickly. Finally, a sixth group comprised those who showed intermediate levels of interaction but who frequently used the synchronous communication tools, using the file storage service and news alerts to keep up with the latest events of the course.

The above studies have shown that levels of interactivity in such experiences will depend, to a large extent, on factors such as: the communication models that underlie the educational practices designed in each case; decision-taking on curricular elements; or the teaching activity itself.

In this work, we have sought to define a single index that allows the measurement of the degree of student interaction in LMS in order to facilitate the identification of their behavioural patterns. The categorisation may be carried out automatically using statistical clustering algorithms. In short, in accordance with the basic objective of clustering techniques, each group of students will present a level of interactivity or behavioural pattern, in such a way that students in the same group will have close interactivity levels, which in turn differ from the levels of students in other groups. In that sense, this research is linked to other works that have attempted to understand the role played by students in the processes developed over the


Internet through the analysis of their interactions (Chang et al. 2011).
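The clustering idea described above — grouping students so that interactivity levels are close within a group and separated between groups — can be sketched with a minimal one-dimensional k-means. Everything in this example is an illustrative assumption rather than a detail from the paper: the index values, the choice of k = 3 groups and the quantile initialisation.

```python
import numpy as np

def cluster_by_index(values, k=3, iters=100):
    """Minimal 1-D k-means over a single interactivity index.

    A sketch of the unsupervised categorisation step; k=3
    (low/medium/high) is an assumption, not the paper's choice."""
    x = np.asarray(values, dtype=float)
    # Deterministic initialisation: spread centres over the index range.
    centres = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        # Assign each student to the nearest cluster centre.
        labels = np.argmin(np.abs(x[:, None] - centres[None, :]), axis=1)
        # Move each centre to the mean of its assigned students.
        new = np.array([x[labels == j].mean() if (labels == j).any()
                        else centres[j] for j in range(k)])
        if np.allclose(new, centres):
            break
        centres = new
    return labels, centres

# Hypothetical interactivity indices for eight students.
labels, centres = cluster_by_index([0.05, 0.08, 0.11, 0.42,
                                    0.45, 0.48, 0.81, 0.86])
```

Students in the same cluster share a behavioural pattern (a close interactivity level), which is exactly the grouping property the text attributes to clustering techniques.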

3. Methodology

Our aim is to construct a model that permits the measurement of the degree of student interactivity in an on-line teaching–learning environment, and that makes it possible to identify patterns or groups of students with similar behaviour. To achieve this objective, we have used multicriteria decision methodologies (AHP) and DM (unsupervised classification, or clustering). The former are especially useful for obtaining a single index of interactivity based on different previously selected indicators or criteria. Furthermore, they also allow us to incorporate subjective appreciations or opinions from the teacher on the indicators that may prove more significant when it comes to measuring this degree of interactivity. The unsupervised classification techniques, in turn, permit the automatic categorisation of students.

The next section includes a brief description of the two methodologies used in the model. There are also references to possible uses and applications in the study area of teaching–learning models carried out via an LMS.

3.1. Analytical hierarchy process

AHP is a multicriteria decision technique proposed by Saaty (1980) to solve problems of planning needs and management of scarce resources that, over time, has become one of the most widely used techniques in multicriteria decision-making processes. In general, this technique can be applied to solve problems that require an evaluation and measurement in which different and very often opposing criteria intervene. The AHP technique is developed through six key stages:

(1) Define the problem and establish clear objectives and expected results.

(2) Deconstruct the complex problem into a hierarchical structure of decision elements. At the highest level of the hierarchy sit the general objectives; the criteria are divided into particular objectives or subcriteria, reaching down to the lowest level, in which the alternatives are situated.

(3) Carry out pairwise comparisons between decision elements, forming comparison matrices based on the relative importance of the factors at each hierarchical level. The scale from 1 to 9 defined in Table 1 is used, in such a way that for each pair of criteria i and j a priority value a_ij is defined. In addition, the reciprocity property a_ij = 1/a_ji must be complied with and, obviously, the comparison of an element with itself must carry the value a_ii = 1. At each level of the criteria hierarchy, we obtain a square matrix.

Table 1. Definition of weights in the AHP scale.

1 — Factor i is equally important as factor j
3 — Factor i is moderately more important than factor j
5 — Factor i is significantly more important than factor j
7 — Factor i is strongly more important than factor j
9 — Factor i is extremely more important than factor j
2, 4, 6, 8 — Intermediate values

(4) Check the consistency properties of the matrices in order to guarantee that the judgements issued by the decision-makers are coherent and consistent.

(5) Estimate the relative weights of the decision elements for achieving the general objective. There are different alternatives for calculating the weights; one of the most common is the calculation of the eigenvector associated with the dominant eigenvalue of the comparison matrix.

(6) Evaluate the alternatives based on the weights of the decision elements.
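As a sketch of stages (3)–(5), the weights and the consistency check can be computed from a reciprocal comparison matrix with a few lines of linear algebra. The 3 × 3 matrix below is a hypothetical example, not one of the paper's tables; the random-index values are Saaty's standard ones.

```python
import numpy as np

# Saaty's random consistency index RI for matrices of size n.
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
                6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def ahp_weights(matrix):
    """Return (weights, consistency_ratio) for a reciprocal
    pairwise-comparison matrix, via the dominant eigenvector."""
    A = np.asarray(matrix, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # dominant eigenvalue lambda_max
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                             # normalised priority weights
    ci = (eigvals[k].real - n) / (n - 1)     # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0
    return w, cr

# Hypothetical first-level comparison of three criteria; note the
# reciprocity a_ji = 1/a_ij required by the method.
A = [[1, 3, 2],
     [1/3, 1, 1/2],
     [1/2, 2, 1]]
weights, cr = ahp_weights(A)
```

A consistency ratio at or below 0.1 (Saaty's threshold, stage 4) indicates the judgements can be accepted; otherwise the decision-maker would be asked to revise the comparisons.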

AHP methodology has been successfully used to tackle different problems related to e-learning: for example, identification of the critical factors that bear on the acceptance of e-learning systems in developing countries (Bhuasiri et al. 2012); evaluation of the externalisation of e-learning systems (Kang et al. 2011); analysis of the sustainability of on-line learning communities (Hong 2010); selection of e-learning platforms (Liu et al. 2009); design of contents for an on-line course based on the opinions of the different agents implied in the process (Sharma et al. 2011); and evaluation of the effectiveness of e-learning programmes (Tzeng et al. 2007).

3.2. DM in education

DM comprises a set of methods that permit the automatic and semi-automatic exploration and analysis of huge volumes of information. In a nutshell, DM seeks the extraction or 'mining' of knowledge from large amounts of data (Han and Kamber 2006). DM techniques have been used to try to predict behavioural patterns, generate forecasts, identify trends or changes thereto, and discover relationships within the information in order to optimise decision-making. Thus, there is no doubt as to their practical application in those processes where a large amount of data must be handled. This explains why this area of knowledge has drawn the attention of different sectors of the information industry in recent years.

DM is currently being used for a wide variety of practical applications in the education sphere. In particular, the development of LMS has meant that the different types of on-line interactions can be recorded: student–teacher, student–medium and student–student. DM techniques permit the analysis of these records in order to develop a deeper understanding of what occurs in LMS, which then


facilitates the adoption of measures leading to improvement in e-learning processes. More precisely, educational data mining (EDM) is an area of knowledge geared to the application of statistical methods, automatic learning and DM algorithms to different types of educational data (Romero and Ventura 2010). EDM has, in recent years, become an area of research involving professionals from different lines of work. Romero and Ventura (2010) have undertaken a general review of work carried out in the area of EDM in which they describe the most relevant studies, grouped together in terms of different criteria. They also identify the principal tasks and DM techniques used in education. Elsewhere, Baker (2010) suggests four key areas of application of EDM: improvement of student models; improvement of domain models; the study of the pedagogical support provided by learning software; and scientific research in student learning. This author also identifies five approaches or methods in EDM: forecasting, clustering, relationship mining, distillation of data for human judgement and discovery with models.

In recent years, we have been able to observe a trend towards the combined use of DM techniques and automatic learning for activity data analysis. Systems based on the individual treatment of user activities tend to be geared towards forecasting student performance or towards the identification of different student types. They have also focused on characteristic interaction behaviour and how this behaviour relates to acquired learning (Kardan and Conati 2011). These techniques have also been used in other ways. Anaya and Boticario (2011) used them to infer the most significant characteristics of collaborative students. García and Zorrilla (2011) used them to analyse the behaviour of students in on-line courses and their navigation procedures. Johnson et al. (2011) used them to analyse data obtained from the activity records (logs) generated by automatic tutoring systems. Palmer (2013) used student data stored in institutional systems to predict student academic performance. Gil-Herrera et al. (2011) focused on academic performance prediction using rough set theory. Bodea et al. (2012) presented an educational recommender engine to increase the formative value of e-assessment.

Along these lines, LMS generate a huge amount of data which, conveniently analysed, present enormous information possibilities. These data are not always used by teachers to the maximum in order to extract new knowledge or to propose more individualised curriculum strategies in terms of the activities each student carries out. LMS automatically store user-related data such as the temporal and physical distribution of access, activity registers and pages visited, as well as indicators of performance in the evaluation tasks and activities proposed. All these data can be analysed through DM techniques.

Any DM process in education is composed of the following basic phases or stages: data compilation; data pre-processing (in which the data are cleaned, transformed and reduced); application of DM (determining the model to use, carrying out statistical analysis and graphically visualising the data to obtain a first approximation); and, finally, interpretation and evaluation of the results obtained (Romero et al. 2007).
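The four phases above can be sketched as a small pipeline over an exported activity log. The CSV layout and column names (`student`, `messages_read`) are assumptions for illustration, not Moodle's actual export format, and the "application of DM" step here is only the first statistical approximation the text mentions.

```python
import csv
import statistics

def edm_pipeline(log_path):
    """Skeleton of the four EDM phases; column names are illustrative
    assumptions, not a real LMS export schema."""
    # Phase 1 -- data compilation: read raw activity records from the LMS.
    with open(log_path, newline="") as f:
        rows = list(csv.DictReader(f))
    # Phase 2 -- pre-processing: clean (drop malformed rows) and
    # transform (cast the activity counter to an integer).
    clean = [(r["student"], int(r["messages_read"]))
             for r in rows if r.get("messages_read", "").isdigit()]
    # Phase 3 -- apply DM: a simple statistical analysis (mean activity)
    # stands in for a full mining model in this sketch.
    mean_read = statistics.mean(c for _, c in clean)
    # Phase 4 -- interpretation: flag students whose activity falls
    # below the course mean, as a cue for individualised follow-up.
    flagged = [s for s, c in clean if c < mean_read]
    return mean_read, flagged
```

In a real application, phase 3 would be replaced by the clustering or forecasting model chosen for the study.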

4. Measurement of interactivity and student categorisation in an LMS

With the aim of constructing a model that allows us to cluster and measure the degree of student interactivity on an on-line course, and taking as reference the basic stages of the two methodologies described above, we proceeded to carry out the actions illustrated in Figure 1. Each of the stages is explained in detail below.

4.1. Definition of objectives

The model we have developed aims to achieve a double objective. First, to assign each student a unique interactivity level calculated from different records extracted from the LMS. All these indicators must be easily quantifiable and must not take into account aspects of academic performance, but simply activity. However, these indicators may be used later to try to forecast student performance based on their activity. Second, to carry out an automatic categorisation of students, identifying groups that respond to similar behavioural patterns in the LMS.

The proposed model is described in Figure 2. The input values are the different activity indicators generated by the LMS. AHP methodology will be used in order to obtain interactivity levels for students, and DM techniques will allow us to obtain groups of students with similar behavioural patterns. These groups are the output of the proposed model.

With the aim of checking the effectiveness of the model, we sought its practical application in a b-learning experience, carried out on the Moodle platform of the University of Cantabria with the participation of 343 students. They were first-year students in the module 'Optimisation Theory', in which the basic principles of mathematical modelling of optimisation problems and operations research were presented. During the course, students had access to the Moodle on-line teaching platform and, besides having access to basic teaching materials, they could also carry out a series of programmed activities (with a time period previously established by the teaching team) and non-programmed activities. This initiative provided students with a tool for improving interaction with the medium, with the teaching staff and with fellow students. It also allowed them to construct a collaborative learning environment and promoted active student participation in the different activities carried out. The module was structured in four large thematic blocks in which a set of on-site and on-line activities were programmed. On-line activities were carried out on Moodle by means of different methodological strategies for collaborative work: practical cases; project work; and self-learning activities for the use of optimisation software.


[Figure 1 depicts the stages as a flow chart. The AHP phase comprises: definition of the objective; criteria identification and hierarchical structure construction; pairwise comparison between criteria and sub-criteria; consistency analysis of the comparisons; and weights calculation. The DM phase comprises: data compilation about student activity in the virtual learning environment; data preparation (pre-processing of activity data and calculation of activity indexes); application of unsupervised classification techniques; and model evaluation and analysis of results.]

Figure 1. Stages for the construction of a model for measuring student activity and categorisation. Integration of AHP and DM methodologies.

Figure 2. Proposed evaluation and classification model of student interactivity, using activity indicators and obtaining a cluster classification of students.

4.2. Identification of criteria hierarchy

In order to achieve the proposed objectives, we decided to select quantitative indicators that were easy to observe and measure and which could give information as to the degree of use by students of the different resources offered by an on-line teaching platform. We opted to use the Moodle administration functions to obtain the following data for each student:

(1) Number of messages read in the bulletin board discussions (RM).
(2) Number of answers given to messages of other users (MR).
(3) Number of new debate topics opened (ND).
(4) Number of active days, with access to the platform (AD).
(5) Number of consultations made to curricular material (RV).
(6) Number of subscriptions made to bulletin board discussions or other LMS resources (S).
(7) Number of readings of messages and indications from teachers (RI).
(8) Number of times they have participated in on-line consultations or polls posed in the course (PP).
(9) Number of occasions they had taken self-assessment tests on the platform (ET).


(10) Number of assignments handed in via the platform (A).

As can be appreciated, the model uses quantitative indicators, but in no case did we take into account academic results. Thus, for example, we computed the attempts made by the students in the self-assessment tests but not the results obtained.
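Aggregating these ten indicators into a single interactivity index amounts to a weighted sum of normalised activity counts. The sketch below assumes min-max normalisation and made-up weights: the paper's actual weights come from the teachers' AHP comparisons, and its normalisation scheme is not specified in this excerpt.

```python
import numpy as np

# Illustrative weights for the ten indicators, in the order
# RM, MR, ND, AD, RV, S, RI, PP, ET, A (assumed values, not the
# AHP-derived weights of the paper).
WEIGHTS = np.array([0.05, 0.15, 0.20, 0.10, 0.10,
                    0.05, 0.05, 0.05, 0.10, 0.15])

def interactivity_index(raw_counts):
    """Weighted sum of min-max normalised activity counts;
    raw_counts is a (students x 10) matrix of the indicators above."""
    x = np.asarray(raw_counts, dtype=float)
    lo, hi = x.min(axis=0), x.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)   # avoid division by zero
    return ((x - lo) / span) @ WEIGHTS       # one index per student

# Two hypothetical students: an active one and a passive one.
scores = interactivity_index([[120, 15, 4, 40, 60, 3, 30, 2, 6, 5],
                              [10, 0, 0, 5, 8, 0, 4, 0, 1, 1]])
```

The resulting scores lie in [0, 1] and can be fed directly to the clustering stage of the model.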

We decided to hierarchically organise the 10 previous indicators into three main categories (see Figure 3). The first is related to aspects of social interaction. The second is interaction with the materials and resources available on the on-line course. The final category groups the indicators related to the tasks or activities proposed by the teacher through the platform.

4.3. Pairwise comparison between criteria

When evaluating the level of interaction of a student on an on-line course, each teacher may have a different appreciation as to which criterion or indicator is most significant. For instance, could we say that a student who has read all the messages sent to the bulletin boards has had a greater level of interaction than another who, despite not having read many messages, has opened new topics for debate? In short, the AHP methodology proves to be very appropriate for the teacher to make these comparative valuations. In terms of the specific context in which the experience takes place, the teachers may carry out quantitative valuations using the discrete 1–9 scale designed by Saaty in order to identify those criteria which, in their opinion, are more significant when it comes to analysing student interaction.

In this work, we asked two teachers of the module, with wide experience in e-learning models and applied mathematics teaching, to carry out these comparisons in a consensual manner using the Saaty rating scale presented in Table 1. First, we asked for a pairwise evaluation of the three main criteria (first level). For example, what is the relative importance to the teachers of the social interaction of the student as opposed to the interaction with the materials and resources of the virtual course? They were asked to choose whether social interaction is very much more important, rather more important, as important, and so on down to very much less important, than interaction with materials. Each of these judgements was assigned a number on a 1–9 scale. In addition, we must take into account that the reciprocity property a_ij = 1/a_ji must be satisfied. The results are shown in Table 2. As can be seen from the table, according to the teachers, social interaction (activity in the bulletin boards) is considered an indicator of major relevance, although to a moderate degree (values 2 and 3 on the Saaty scale), when it comes to evaluating student interaction in on-line courses. In the matrix of Table 2, for example, it is rated as 3 in the social-interaction/materials cell and consequently 1/3 in the reciprocal cell (materials/social-interaction).
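The reciprocity condition can be made concrete with a small sketch (ours, not the authors' code): only the upper-triangle judgements of Table 2 need to be elicited, and the rest of the matrix follows.

```python
import numpy as np

# Upper-triangle judgements from Table 2 (social vs. materials = 3,
# social vs. activities = 2, materials vs. activities = 1/2).
judgements = {(0, 1): 3.0, (0, 2): 2.0, (1, 2): 0.5}

A = np.ones((3, 3))                # diagonal stays 1 (self-comparison)
for (i, j), value in judgements.items():
    A[i, j] = value
    A[j, i] = 1.0 / value          # reciprocity: a_ji = 1/a_ij

print(A)
```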

Once the comparisons at the first hierarchical level were made, the same process was applied to the criteria that stem from each general criterion in the hierarchy. The comparisons obtained are shown in Table 3. For example, from these data we can see that, comparing the importance awarded by the teachers to the starting of new debates by the student (ND) with the mere reading of messages on bulletin boards (RM), the first criterion is considered very much more important (value 7 on the Saaty scale) for measuring the global interactivity of the student.

4.4. Consistency analysis of comparison judgements

Seeking to contrast the consistency of the comparative judgements between criteria made by the teachers, the AHP model proposes the calculation of a consistency ratio. In this sense, Saaty (1980) recommends a consistency ratio of 0.1 or lower so that the pairwise comparisons undertaken by the decision-maker can be considered acceptable. In the case proposed, we used the Expert Choice software to implement the AHP model, to obtain the consistency ratios and to calculate the weights of the criteria. The values obtained were as follows:

• For first-level comparisons (Table 2), we obtained a consistency ratio of 0.01.

• For second-level comparisons (Table 3), we obtained consistency ratios of 0.06, 0.04 and 0.00, respectively.

All consistency ratios were, therefore, perfectly admissible.
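For reference, the first-level consistency value can be re-derived from Table 2 with a few lines of linear algebra. This is an independent sketch, since the authors obtained the value with Expert Choice:

```python
import numpy as np

# Pairwise comparison matrix of Table 2.
A = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 0.5],
              [0.5, 2.0, 1.0]])

n = A.shape[0]
lam_max = max(np.linalg.eigvals(A).real)   # principal eigenvalue
CI = (lam_max - n) / (n - 1)               # consistency index
RI = 0.58                                  # Saaty's random index for n = 3
CR = CI / RI                               # consistency ratio
print(round(CR, 2))  # 0.01, matching the first-level value reported
```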

4.5. Calculation of the relative weights of each criterion

Once the pairwise comparisons have been performed, the AHP methodology allows us to calculate weights for each criterion reflecting its importance in achieving the final goal. After constructing a matrix A expressing the relative values of a set of criteria, the weights are obtained by finding a vector w of order n (the number of criteria) such that Aw = λw, where λ = n for a consistent matrix. However, for matrices involving human judgement, pairwise comparison matrices are inconsistent to a greater or lesser degree. In such cases, the vector w has to satisfy Aw = λ_max w, where λ_max is the greatest eigenvalue of matrix A. There are several methods for calculating this eigenvector; a comparative study is presented in Ishizaka and Lusti (2006). In this way, we can compute a list of the relative weights, importance, or value, of the factors in each level of the hierarchy of comparison matrices. At the lowest level of the hierarchy, the weights are calculated by multiplying them by the weights corresponding to the higher levels.
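A power-method sketch of this eigenvector computation, applied to the Table 2 matrix (our illustration of the procedure, not Expert Choice's actual code):

```python
import numpy as np

def ahp_weights(A, tol=1e-9):
    """Priority weights as the normalised principal eigenvector of A,
    obtained by repeatedly squaring the matrix (power method)."""
    w = np.full(A.shape[0], 1.0 / A.shape[0])
    while True:
        A = A @ A                          # raise the matrix to a higher power
        w_new = A.sum(axis=1) / A.sum()    # normalised row sums
        if np.abs(w_new - w).max() < tol:  # stable eigenvector reached
            return w_new
        w = w_new

A = np.array([[1.0, 3.0, 2.0], [1/3, 1.0, 0.5], [0.5, 2.0, 1.0]])
print(ahp_weights(A).round(3))  # first-level weights, summing to 1
```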


[Figure 3 near here: the goal, on-line interactivity evaluation, branches into three categories: Social interaction (Read messages, RM; Message/responses, MR; New debates, ND); Material and resources (Active days, AD; Resource view, RV; Subscriptions, S; Read instructions and tutor indications, RI); and Activities and assessments (Participation in polls, PP; Evaluation tests, ET; Assignments, A).]

Figure 3. Hierarchy of criteria for measuring the level of student interactivity (tasks or activities proposed by the teacher through the virtual education platform).

Table 2. Pairwise comparisons using the Saaty scale at the first hierarchical level of criteria.

                            Social       Materials      Activities and
                            interaction  and resources  assessments
Social interaction          1            3              2
Materials and resources     1/3          1              1/2
Activities and assessments  1/2          2              1

In our case, in order to calculate the weights of the 10 activity indicators, we used the Expert Choice software again. This software extracts a relative weight for each criterion by computing an eigenvector, raising the matrix to successively higher powers until any further exponentiation yields a stable eigenvector. Table 4 shows the final values obtained for the lowest-level weights of the hierarchy. As we can see, the criteria that have a greater weight when it comes to evaluating the degree of global interactivity are those that correspond to the starting of new debates in the bulletin board discussions, participation in surveys proposed by the teacher, and answers to messages and questions posed by classmates. It is worth highlighting that these do not have any bearing on the final mark obtained by the student in the subject. It may be surprising that participation in bulletin board discussions receives greater weight than the delivery of tasks and assessments. However, it should be noted that these activities are already


Table 3. Pairwise comparisons at the second hierarchical level of criteria.

     RM   MR   ND        AD   RV   S    RI        PP   ET   A
RM   1    1/5  1/7   AD  1    3    1    3     PP  1    3    3
MR   5    1    1/3   RV  1/3  1    1/2  1/2   ET  1/3  1    1
ND   7    3    1     S   1    2    1    3     A   1/3  1    1
                     RI  1/3  2    1/3  1

Table 4. Final weights assigned to the criteria, using the codes defined in Figure 3.

Criterion                   RM     MR     ND     AD     RV     S      RI     PP     ET     A
Weight (distributive mode)  0.039  0.151  0.35   0.062  0.02   0.057  0.025  0.178  0.069  0.069

stimulated enough, because they have a direct impact on the final mark.

4.6. Compilation of data and student activity indicators

The data compiled were obtained directly through the administration options of the Moodle platform. During the development of the module, a total of 112,212 activity records were registered. Each record includes the student identification, date, hour and IP address from which the access was produced, as well as the activity carried out in accordance with the classification shown in Table 5. Based on all activity records, it was possible to obtain the values of the indicators of the hierarchical model for each of the students involved, using in each case the activity records linked to the actions also shown in Table 5. For each indicator, we counted the number of records associated with each student in each of the activities shown in the table. Table 6, on the other hand, shows a statistical summary of the location and dispersion of the values of the 10 indicators over the sample of students. Among other questions, we can highlight the fact that there is a wide dispersion in the data related to the activity carried out by students in the bulletin board discussions.
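As a sketch of this counting step (the records and field layout are invented for illustration, and the action-to-criterion mapping abbreviates Table 5 to two criteria):

```python
from collections import Counter

# Hypothetical mapping of Moodle action strings to AHP criteria,
# abbreviated from Table 5.
ACTION_TO_CRITERION = {
    "forum view forum": "RM",
    "forum view discussion": "RM",
    "forum add post": "MR",
    "forum update post": "MR",
}

# Invented sample of (student id, action) log records.
records = [
    ("s1", "forum view forum"),
    ("s1", "forum add post"),
    ("s1", "forum view discussion"),
    ("s2", "forum view forum"),
]

# One count per (student, criterion) pair, skipping unmapped actions.
counts = Counter(
    (student, ACTION_TO_CRITERION[action])
    for student, action in records
    if action in ACTION_TO_CRITERION
)
print(counts[("s1", "RM")])  # 2
```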

4.7. Pre-processing of data and interactivity index calculation

Seeking to eliminate the differences of scale in the indicator values, these were normalised to give values between 0 and 1. The last row in Table 6 shows the average values of the indicators after normalisation. With the normalised indicator

Table 5. Activity indicators obtained directly from Moodle and their link with the AHP model criteria.

AHP criterion  Moodle actions registered
RM             Forum view forum, forum view discussion
MR             Forum add post, forum delete post, forum update post
ND             Forum add discussion, forum delete discussion
AD             Number of different days that appear in the student activity records
RV             Resource view
S              Forum subscribe, forum unsubscribe
RI             Forum view forum, forum view discussion linked with the news bulletin board (a bulletin board in which students can only read teachers' instructions)
PP             Choice choose, choice choose again, choice view
ET             Quiz attempt, quiz close attempt, quiz continue attempt, quiz review, quiz view
A              Assignment upload, assignment view

Table 6. Statistical summary of the indicator values.

                          RM      MR     ND    AD     RV     S     RI     PP     ET     A
Minimum                   0       0      0     1      0      0     0      0      0      0
Maximum                   3445    172    9     122    186    7     112    136    35     5
Mean                      180.77  5.330  0.54  37.07  49.44  0.20  34.49  25.80  13.25  0.65
Standard deviation        311.03  13.98  1.18  21.76  35.23  0.82  19.52  14.90  5.10   0.61
Coefficient of variation  1.721   2.62   2.17  0.59   0.71   4.10  0.57   0.578  0.39   0.94


values and the weights obtained by the AHP model, each student was assigned a unique quantitative value in the interval [0,1] as a measurement of their interactivity with the on-line course. The assigned value is based on a simple linear combination of the normalised indicator values multiplied by the respective weights, that is:

Interactivity level = 0.039*RM_s + 0.151*MR_s + 0.35*ND_s + 0.062*AD_s + 0.02*RV_s + 0.057*S_s + 0.025*RI_s + 0.178*PP_s + 0.069*ET_s + 0.069*A_s,

where s represents a student and the values of the variables in the above formula correspond to the normalised indicator values for that student. The weights were computed using the Expert Choice software and the AHP weighting methodology. The values were presented in Table 4 and were obtained by computing the eigenvectors of the pairwise comparison matrices.
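The index formula can be sketched directly (Table 4 weights; the per-student indicator values below are invented for illustration):

```python
# AHP weights from Table 4, keyed by indicator code.
WEIGHTS = {"RM": 0.039, "MR": 0.151, "ND": 0.35, "AD": 0.062, "RV": 0.02,
           "S": 0.057, "RI": 0.025, "PP": 0.178, "ET": 0.069, "A": 0.069}

def normalise(value, lo, hi):
    """Min-max normalisation to [0, 1], applied per indicator."""
    return (value - lo) / (hi - lo) if hi > lo else 0.0

def interactivity(normalised):
    """Weighted linear combination of normalised indicator values."""
    return sum(WEIGHTS[c] * normalised.get(c, 0.0) for c in WEIGHTS)

# Invented student: maximally active debate starter, inactive elsewhere.
student = {c: 0.0 for c in WEIGHTS}
student["ND"] = normalise(9, 0, 9)
print(round(interactivity(student), 2))  # 0.35
```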

The interactivity level formula is course-dependent because it depends on the pairwise judgements. According to the context of the subject and the proposed virtual activities, each teacher can have a different opinion on which activities should be considered more important.

As a reference for the levels of interaction obtained, the student with the lowest level corresponds to a value of 0.0006 and the one with the highest level to 0.5993. The average value was 0.1301 and the standard deviation 0.0873.

4.8. Application of unsupervised classification techniques

Once the global interaction level of each student was obtained, the following objective was to use it to identify groups of students with a similar behavioural pattern in the on-line course. To do so, we used clustering algorithms, which constitute one of the main techniques used for data analysis in very diverse fields (Kaufman and Rousseeuw 2008). The term clustering refers to a wide selection of techniques for delimiting groups or clusters in sets of data, in such a way that these are capable of capturing the natural structure of the data.

Many clustering algorithms have been introduced in the literature and a description of the most common can be found in Pedrycz (2005). Some algorithms require advance indication of the number of groups to be created; an example of this type is the k-means algorithm. However, bearing in mind that in the research undertaken the number of groups was not specified in advance, another algorithm was chosen. To be precise, we used the expectation maximisation (EM) algorithm. This is a probabilistic clustering method with an iterative maximum-likelihood technique, used to find an estimation of the set of parameters of problems in which certain hidden data exist. This algorithm can either decide how many clusters it is necessary to create, based on cross-validation, or the number of groups to be generated can be set in advance. We opted to use the EM algorithm due to its facility for constructing groups and because it had been used successfully in other DM projects in the education field. Thus, for example, Boticario and Anaya (2009) used this algorithm to obtain groups of students in terms of their level of collaboration, based on statistical indicators of interaction in bulletin board discussions.

In order to apply the EM algorithm to the levels of student interaction, we opted to use the implementation of the EM algorithm in the Weka software. The algorithm automatically identified four clusters of students. With the aim of analysing whether the level of student interaction bore any relation to their success rate at the end of the subject, we analysed the percentage of students who finally passed in each of the identified groups. This information is also shown in Table 7.
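The study used Weka's EM implementation; purely to illustrate the mechanics, here is a minimal one-dimensional Gaussian-mixture EM with a fixed number of clusters (the cross-validated choice of cluster count that Weka can perform is omitted), run on synthetic interactivity-like values:

```python
import numpy as np

def em_1d(x, k, iters=200):
    """Fixed-k EM for a 1-D Gaussian mixture; returns means and hard labels."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread-out initial means
    var = np.full(k, x.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each observation
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means and variances
        nk = resp.sum(axis=0)
        pi, mu = nk / len(x), (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
    return mu, resp.argmax(axis=1)

# Two synthetic groups of interactivity values around 0.05 and 0.50
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.05, 0.01, 50), rng.normal(0.50, 0.05, 20)])
mu, labels = em_1d(x, k=2)
print(np.sort(mu).round(2))
```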

As can be appreciated, the success rate appears to be higher the greater the average interaction value of the group. This is significant if we take into account the fact that, in order to calculate the interaction index, we did not consider any variable related to academic performance. It can be noted that a small group of students (the 6 students of cluster 4) had a high rate of activity, which is accompanied by a success rate of 100%. In the most numerous group (cluster 2), the interaction measured with the model is low and an average success rate was produced. This group corresponds to a prototype of passive student, more numerous than normal due to the b-learning model used. Note that participation in the virtual learning environment is not essential in order to follow the course. The group of students that demonstrated a lower involvement in the LMS (cluster 1) was made up of those who were unmotivated or little prepared to tackle the subject successfully. In fact, only 4% of the students of this group managed to finally pass the course.

Seeking to check the structure of the groups revealed by the algorithm, Figure 4 shows the level of global interaction and the cluster assigned to each of the 343 students. In order to obtain a clearer graphical representation, students were previously ranked from lesser to greater level of interaction.

5. Discussion

Results demonstrate that the constructed model allows one to analyse the degree of interactivity in an LMS. The interactivity index obtained allows us to analyse the behaviour of students and to identify those who are less active. Furthermore, the model allows one to carry out a first prediction of student performance, given that there seems to be a correlation between the level of interactivity and the rate of


Table 7. Groups of students identified automatically and success rates in the different groups.

                                       Cluster 1  Cluster 2  Cluster 3  Cluster 4
Number of students                     67         214        56         6
%                                      20         62         16         2
Average interactivity in the group     0.0378     0.1159     0.2127     0.5049
Standard deviation of interactivities  0.0192     0.0366     0.0724     0.084
Number of students who passed          3          136        49         6
Success rate (%)                       4          64         88         100

[Figure 4 near here: the 343 students (horizontal axis, ranked by interactivity) plotted against their level of interactivity (vertical axis, 0 to 0.6), with markers distinguishing clusters 1 to 4.]

Figure 4. Interactivity levels and clusters assigned to students. Four groups or prototypes of students were identified.

success. This correlation must be interpreted as an indicator of the dedication and interest of the students in the teaching–learning process, rather than as data that allows us to reliably predict their academic performance. One advantage of the incorporation of the AHP methodology in the model is that it allows us to incorporate the subjective valuations of the teachers when it comes to analysing the level of interactivity of the student in each teaching–learning process based on an LMS.

With respect to obtaining the student behavioural patterns, it is necessary to point out that the valuations must be carried out in the context of the specific experience analysed. Thus, it is important to remember that the data analysed respond to a b-learning initiative in which there were on-site classes reinforced by activities that the students carried out by means of an LMS. This, to a certain extent, may justify the identification of a relatively numerous group of students (20%) that present very low levels of interactivity. The behaviour pattern of this group of students in the on-line environment may be classified as inactive, since they either did not access the LMS or

did so only on specific occasions, normally to consult the teaching material published or to hand in tasks that had been programmed by the teacher. A tiny percentage of the students that show this pattern of behaviour (4%) passed the subject without the need to refer to the activities and resources designed to be carried out in the LMS. Therefore, we are dealing here with unmotivated students, be this due to the material itself or the methodology used, or students who have not participated in other experiences developed by means of this modality and are unaware, therefore, of the possibilities and dynamics of work in an LMS.

The second student type identified, and the most numerous (62%), encompasses students with a low level of interactivity who, despite using the virtual platform, in some cases regularly, limit themselves to consulting information and carrying out the minimum tasks or those that can only be performed using the LMS. These students showed a passive behaviour, characterised by the fact that they consulted the bulletin board discussions but hardly took the initiative to contribute or to share their knowledge with the rest of their classmates. In any case, it is worthy of note that


this group of passive students was perhaps more numerous than normal due to the b-learning model used, given that there were on-site activities in which the students could interact face to face with classmates and teachers, and so the use of the social communication tools of the LMS did not prove essential to follow the course subject. Obviously, as this was such a numerous group, there may be identifiable differences in their behaviour. However, their most defining characteristic was their passivity in using the LMS.

Within the third group, we were able to identify a significant increase in the level of interactivity. This is because these were students who had greater social interaction, reflected in the contributions made to the discussion forums answering other messages, posing new queries or asking for help from fellow students. As in the case of the passive students, they also used the platform to access material and to carry out tasks. Their access to the system was frequent and they were much more willing to participate in collaborative work activities. Thus, we are dealing here with the student type that showed an active behavioural pattern, whose motivation and attitudes placed them in a better situation to achieve the overall targets of the course, as is revealed by the high success rate they obtained (88%). This cluster was not made up of an excessively high number of students (16%), but it was large enough to give more dynamism and to boost the levels of social interactivity among the participants in the subject.

Finally, we identified a reduced group of students (2%) who showed levels of interactivity clearly higher than the rest. These were highly motivated students who regularly accessed the on-line platform, shared information with their fellows, tried to dynamise the on-line social learning spaces, carried out the activities and generated new material and debates. We could say that they took on the role of leaders within the LMS and showed behavioural traits that boosted collaboration.

Results show that the existence of communication tools in an LMS does not in itself improve or increase communication between pupils, as other studies have found (Gunawardena et al. 2001, Burr and Spennemann 2004). An important number of students in the research did not participate actively in the open dialogue channels, nor did they start new debates. Thus, the use made of the communication tools is more relevant than their pedagogical possibilities. In order to obtain higher levels of interactivity, it would be necessary to design methodological strategies that promote active student participation and the collaborative construction of learning. The calculation of the interactivity index can help the individualisation of on-line training. To achieve this, it is necessary to design flexible education processes, analyse the previous knowledge of students or identify their learning patterns (Rodríguez-Hoyos and Calvo 2011). Obtaining the interactivity index must be accompanied by other qualitative data collection strategies that facilitate global and process evaluation (Berge et al. 2000, Cenich and Santos 2005). This way, it will be possible to obtain a more complex

and precise picture of what is happening in the LMS during the teaching–learning process.

6. Conclusions

This work has put forward a model that permits us to evaluate the level of interactivity of students in an LMS by means of a combination of multicriteria decision techniques and DM. In this sense, this work opens up multiple possibilities for the development of teaching in LMS since, among other questions, it permits one to obtain a quantitative index of student interactivity that may vary in terms of the relevance that each teacher gives to the different indicators used in its construction. In this way, teaching staff have a model that can facilitate the individualisation of teaching to the particular characteristics presented by each student on the LMS. Limayem and Cheung (2011) demonstrated the importance of teachers in promoting sustainable and prolonged use of Internet-based learning technologies in university contexts. Furthermore, data can be obtained that improve decision-making in processes of formative education, destined to the reorientation of those aspects that prove to be more problematic or do not respond to the objectives defined a priori.

By contrast, although this work is not aimed at forecasting the academic performance of students in on-line training processes in terms of their level of interactivity, it has been shown that there appears to be a certain correlation between the two. Nevertheless, we have been able to identify, within the group with the highest number of students (cluster 2), that an important percentage of students who show low levels of interactivity do end their training process successfully. This means that, just as in the case of offline training processes, there is no linear relationship between teaching and learning, and we cannot accurately foresee the development and results of the programmed b-learning and e-learning processes. At the same time, these differences in student performance reveal that, in order to understand what happens in e-learning experiences, it is necessary to see the participants as the key figures, as well as placing these experiences in the social and institutional context in which they are carried out. This helps us to understand that student interactivity levels are much influenced by the fact that we are dealing with a b-learning experience in which the use of the LMS was not a necessary condition to successfully pass the subject.

Finally, this research has allowed us to affirm that student behavioural patterns in LMS are very disparate. Thus, students can show different types of behaviour that range from permanent involvement to virtual silence. Furthermore, the use of an LMS for managing learning processes requires willingness on the part of the student and a series of skills that are not always achieved to an optimum degree. In this sense, recent studies demonstrate the need to research the way students use these kinds of tools (Lust et al. 2012). In order to maximise their potential, it would be necessary to design methodological strategies that


promote active student participation and the collaborative construction of learning.

References

Anaya, A.R. and Boticario, F.G., 2011. Towards improvements on domain-independent measurements for collaborative assessment. Proceedings of the 4th international conference on educational data mining, Eindhoven, The Netherlands, 317–318.

Aoki, K., 2010. The use of ICT and e-learning in higher education in Japan. World Academy of Science, Engineering and Technology, 66, 868–872. Available from: http://waset.org/journals/waset/v42/v42-141.pdf

Baker, R.S., 2010. Data mining for education. In: B. McGaw, E. Baker, and P. Peterson, eds. International encyclopedia of education. 3rd ed. Vol. 7. Oxford: Elsevier, 112–118.

Berge, Z., Collins, M., and Dougherty, K., 2000. Design guidelines for web-based courses. In: B. Abbey, ed. Instructional and cognitive impacts of web-based education. Hershey, PA: Idea Group Publishing, 32–40.

Bhuasiri, W., et al., 2012. Critical success factors for e-learning in developing countries: a comparative analysis between ICT experts and faculty. Computers and Education, 58 (2), 843–855.

Bodea, C.N., Dascalu, M.I., and Lytras, M.D., 2012. A recommender engine for advanced personalized feedback in e-learning environments. International Journal of Engineering Education, 28 (6), 1326–1333.

Boticario, J.G. and Anaya, A., 2009. Clustering learners according to their collaboration. Proceedings of the 13th international conference on computer supported cooperative work in design, Santiago de Chile, Chile.

Burr, L. and Spennemann, D.H.R., 2004. Patterns of user behaviour in university online forums. International Journal of Instructional Technology and Distance Learning, 1 (10), 11–28.

Cenich, G. and Santos, G., 2005. A learning proposal based on a project and collaborative work: an online course experience. Revista Electrónica de Investigación Educativa [online], 7 (2). Available from: http://redie.uabc.mx/vol7no2/contenido-cenich.html [Accessed 24 September 2013].

Chang, C., Chen, G., and Wang, C., 2011. Statistical model for predicting roles and effects in learning community. Behaviour & Information Technology, 30 (1), 101–111.

De Wever, B., et al., 2010. Roles as a structuring tool in online discussion groups: the differential impact of different roles on social knowledge construction. Computers in Human Behavior, 26 (4), 516–523.

Dringus, L.P. and Ellis, T., 2005. Using data mining as a strategy for assessing asynchronous discussion forums. Computers & Education, 45 (1), 141–160.

García, A. and Suárez, C., 2011. Interacción virtual y aprendizaje cooperativo. Un estudio cualitativo. Revista de Educación, 354, 473–498. Available from: http://www.revistaeducacion.educacion.es/re354/re354_19.pdf

García, D. and Zorrilla, M., 2011. E-learning web miner: a data mining application to help instructors involved in virtual courses. Proceedings of the 4th international conference on educational data mining, Eindhoven, The Netherlands, 323–324.

Garrison, R., 2011. E-learning in the 21st century: a framework for research and practice. 2nd ed. New York: Routledge.

Garrison, R. and Kanuka, H., 2004. Blended learning: uncovering its transformative potential in higher education. The Internet and Higher Education, 7 (2), 95–105.

Gértrudix, M., 2006. Convergencia multimedia y educación: aplicaciones y estrategias de colaboración en la Red. Revista de Comunicación y Nuevas Tecnologías Icono, 14 (7), 1–17.

Gil-Herrera, E., et al., 2011. Predicting academic performance using a rough set theory-based knowledge discovery methodology. International Journal of Engineering Education, 27 (5), 992–1002.

Gunawardena, C.N., Carabajal, K., and Lowe, C.A., 2001. Critical analysis of models and methods used to evaluate online learning networks. American Educational Research Association annual meeting, Seattle, WA.

Han, J. and Kamber, M., 2006. Data mining: concepts and techniques. San Francisco: Morgan Kaufmann Publishers.

Hong, W., 2001. Spinning your course into a web classroom: advantages and challenges. International conference on engineering education, Oslo, Norway.

Hong, F.L., 2010. Determining the sustainability of virtual learning communities in e-learning platform. Proceedings of the 5th international conference on computer science and education, Miaoli, China, 1581–1586.

Ishizaka, A. and Lusti, M., 2006. How to derive priorities in AHP: a comparative study. Central European Journal of Operations Research, 14 (4), 387–400.

Johnson, M., et al., 2011. The EDM Vis tool. Proceedings of the 4th international conference on educational data mining, Charlotte, USA, 349–350.

Kang, T., et al., 2011. The study for selecting the consignment performance of e-learning of technology college. International conference on multimedia technology, Hangzhou, China, 3285–3288.

Kardan, S. and Conati, C., 2011. A framework for capturing distinguishing user interaction behaviours in novel interfaces. Proceedings of the 4th international conference on educational data mining, Eindhoven, The Netherlands, 159–168.

Kaufman, L. and Rousseeuw, P.J., 2008. Finding groups in data: an introduction to cluster analysis. New York: John Wiley & Sons.

Limayem, M. and Cheung, M., 2011. Predicting the continued use of Internet-based learning technologies: the role of habit. Behaviour & Information Technology, 30 (1), 91–99.

Liu, Q., et al., 2009. E-learning platform evaluation using fuzzy AHP. International conference on computational intelligence and software engineering, Hengyang, China.

Lust, G., et al., 2012. Content management systems: enriched learning opportunities for all? Computers in Human Behavior, 28 (3), 795–808.

Martínez-Gutiérrez, F., 2010. Interactividad. Revisión conceptual y contextual. Icono, 14 (15), 9–21.

Muirhead, B. and Juwah, C., 2004. Interactivity in computer-mediated college and university education: a recent review of the literature. Educational Technology & Society, 7 (1), 12–20.

O'Neill, K., Singh, G., and O'Donoghue, J., 2004. Implementing eLearning programmes for higher education: a review of the literature. Journal of Information Technology Education, 3 (1), 313–323.

Palmer, S., 2013. Modelling engineering student academic performance using academic analytics. International Journal of Engineering Education, 29 (1), 132–138.

Pedrycz, W., 2005. Knowledge-based clustering. New York: John Wiley & Sons.

Peñalosa, E., 2010. Evaluación de los aprendizajes y estudio de la interactividad en entornos en línea: un modelo para la investigación. Revista Iberoamericana de Educación a Distancia, 13 (1), 17–38.

Pena-Shaffa, J.B. and Nicholls, C., 2004. Analyzing student interactions and meaning construction in computer bulletin board discussions. Computers & Education, 42 (3), 243–265.

Rodríguez-Hoyos, C. and Calvo, A., 2011. The e-tutor figure: findings and conclusions of a case-study research project. Revista de Universidad y Sociedad del Conocimiento, 8 (1), 80–92.

Romero, C. and Ventura, S., 2010. Educational data mining: a review of the state of the art. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 40 (6), 601–618.

Romero, C., Ventura, S., and García, E., 2007. Data mining in course management systems: Moodle case study and tutorial. Computers & Education, 51 (1), 368–384.

Saaty, T.L., 1980. The analytic hierarchy process: planning, priority setting, resource allocation. New York: McGraw-Hill.

Schellens, T. and Valcke, M., 2005. Collaborative learning in asynchronous discussion groups: what about the impact on cognitive processing? Computers in Human Behavior, 21 (6), 957–975.

Schellens, T., et al., 2007. Learning in asynchronous discussion groups: a multilevel approach to study the influence of student, group, and task characteristics. Behaviour & Information Technology, 26 (1), 55–71.

Sharma, R., Banati, H., and Bedi, P., 2011. Incorporating social opinion in content selection for an e-learning course. Proceedings of the 6th international conference on computer science and education, Singapore, 1027–1032.

Stokes, H., 2004. La interactividad en la educación a distancia: evaluación de comunidades de aprendizaje [Interactivity in distance education: assessing learning communities]. Revista Iberoamericana de Educación a Distancia, 7 (1–2), 147–162.

Talavera, L. and Gaudioso, E., 2004. Mining student data to characterize similar behavior groups in unstructured collaboration spaces. Workshop on artificial intelligence in CSCL, 16th European conference on artificial intelligence, Valencia, Spain, 17–23.

Tuparova, D. and Tuparov, G., 2007. e-Learning in Bulgaria – the state of the art. eLearning Papers, 4, 1–20. Available from: http://openeducationeuropa.eu/en/article/e-Learning-in-Bulgaria—the-State-of-the-Art

Tzeng, G., Chiang, C., and Li, C., 2007. Evaluating intertwined effects in e-learning programs: a novel hybrid MCDM model based on factor analysis and DEMATEL. Expert Systems with Applications, 32 (4), 1028–1044.

Zeichner, K., 2009. Teacher education and the struggle for social justice. New York: Routledge.
