
Copyright © 2019 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0)

Evaluating Usage of an Analytics Tool to Support Continuous Curriculum Improvement

Isabel Hilliger1, Constanza Miranda1, Sergio Celis2, and Mar Pérez-SanAgustín1,3

1 Pontificia Universidad Católica de Chile, Santiago, Chile
2 Universidad de Chile, Santiago, Chile
3 Université Paul Sabatier Toulouse III, Institut de Recherche en Informatique de Toulouse (IRIT), Toulouse, France

{ihillige, csmirand}@uc.cl, [email protected], [email protected]

Abstract. Curriculum analytics (CA) consists of using analytical tools to collect and analyse educational data, such as program structure and course grading, to improve curriculum development and program quality. This paper presents an instrumental case study to illustrate the usage of a CA tool to help teaching staff collect evidence of competency attainment in an engineering school in Latin America. The CA tool was implemented during a 3-year continuous improvement process, in the context of an international accreditation. We collected and analysed data before and after tool implementation to evaluate its use by 124 teaching staff from 96 course sections. Data collection techniques included analysis of documentary evidence collected for the continuous improvement process and teaching staff questionnaires. Findings show that the tool supported staff tasks related to the assessment of competency attainment at a program level. However, usability and functionality issues would have to be addressed for the tool to also support course redesign by providing actionable information about students' performance at an individual level. Lessons learned imply that institutions could adapt and adopt existing CA tools to support curriculum analysis by investing not only in tool development, but also in capacity building for its use in continuous improvement processes.

Keywords: Curriculum Analytics, Case Study, Competency-based Curriculum, Continuous Improvement, Higher Education.

1 Introduction

Within higher education, different types of analytics have been implemented to tackle the painstaking work required to improve learning results at different levels [1], [2]. One of these analytical approaches is Curriculum Analytics (CA), defined as the collection, analysis and visualization of administrative curricular data, such as course enrolment and student grades, to inform and support decision making at a program level [3]. In specific educational contexts, CA has been proposed as a good solution for lightening the task workload required to identify courses where the improvement of learning results is crucial to program success [4]. This workload is particularly high in competency-based curricula because each course emphasizes specific core competencies within an academic plan [5]. Traditionally, the analysis of this type of curriculum requires assessing the alignment between program- and course-level competencies, determining whether both types of competencies are attained, and formulating action plans to improve teaching and learning [4]. The emergence of CA techniques opens new possibilities to perform these tasks in a timely and cost-effective manner [4].

However, the CA promise of supporting curriculum analysis is far from fulfilled. Several tools have been proposed to identify gateway courses in a curriculum and improve its outcomes [4], but managers and teaching staff still perceive that they lack systematic information for course improvement, such as students' academic results regarding taken courses and core competency attainment [5]. Research about CA adoption is still at an early stage, so most of the tools developed propose solutions to tackle institutional needs that are not necessarily related to any existing improvement process [6]. In some cases, using technology to help stakeholders visualize available institutional data is a good means of supporting data-driven decision-making and promoting transparency in certain educational processes. However, prior work suggests that, in order to leverage the potential of CA tools, an institution needs to be ready to address data-driven changes [6], having already implemented processes to examine what is and is not working at different levels [1].

To illustrate how the use of CA tools could systematically support continuous curriculum improvement over an extended period of time, we present an instrumental case study about a continuous improvement process implemented in an engineering school in Latin America (UC-Engineering). This process was implemented between 2015 and 2017 to comply with the North American Accreditation Board for Engineering and Technology (ABET), and in mid-2016 a CA tool was introduced to support teaching staff. To evaluate the teaching staff's usage of this tool in five competency-based programs, we collected and analysed data before and after tool implementation. The following data collection techniques were used: 1) analysis of documentary evidence reported for the continuous improvement process by 124 teaching staff in 96 course sections, and 2) teaching staff questionnaires applied to 25 out of 63 teaching assistants who interacted with the CA tool. These two types of data sources were triangulated to evaluate tool usage from the teaching staff's perspective, exploring its implications in terms of tool usefulness and ease-of-use.

2 Curriculum Analytics Tools

Over the past decade, researchers have developed CA tools to address multiple educational challenges from the perspective of different stakeholders. Table 1 summarizes some of the tools that have been documented in recent conference proceedings and journal articles, describing the institution in which they were developed and their distinctive features for different users. Concerning students, tools 1 and 2 were developed with the objective of changing their approach to study, using visualizations of their performance to motivate them to adopt help-seeking behaviours. Regarding teaching staff and managers, tools 3-10 were developed with the objective of providing information to identify students who are facing difficulties in their studies, expecting staff to reach out to their students and provide guidance. Finally, tools 11-14 were developed for students or staff, aiming to help them identify crucial courses in a curriculum and anticipate the impact of course-level improvements on competency attainment at a program level.

Most of the tools presented in Table 1 do not provide information about the methodology used to evaluate their impact on any existing institutional processes. By methodology, we mean experimental approaches to determine whether the tool is effectively fulfilling its objective [7]. For instance, little is known about how students perceived the tools developed to promote help-seeking behaviours (such as tool 1) [8]. Regarding tools that aim to identify students at risk (such as tool 9), there is no clear relationship between the use of the tool and student outcomes yet [8]. And, concerning the tools for monitoring course-level improvement and competency attainment, researchers have only just started to evaluate their adoption by collecting information about user perceptions [5], without necessarily reporting data about their actual use to drive improvements in program design or academic program delivery [9].

Building upon this latter idea, we present a case study that evaluates teaching staff usage of a CA tool in the context of a continuous improvement process that has already been installed at an institutional level. Our aim is not only to present the results of evaluating the tool from the users' perspective, but also to analyse how it was used to support an existing process for curriculum improvement.

Table 1. Curriculum analytics tools documented in the recent literature

Tool name and user | Developer | Distinctive features [References]
1. Check My Activity (focused on students) | University of Maryland, Baltimore, USA | Student visualizations of LMS logs compared to their peers and their grades [1], [8]
2. E2 Coach (focused on students) | University of Michigan, USA | Student visualizations of feedback based on peer performance [1]
3. Student Relationship Engagement System (focused on teaching staff) | University of Sydney, Australia (used in 58 courses) | Interface for customizable analysis of students' academic performance datasets and visualizations [9]
4. Risk management model (focused on managers and teaching staff) | The University of Queensland, Australia (piloting phase) | A set of risk indicators at a program and course level [9]
5. The Ribbon Tool (focused on managers and teaching staff) | UC Davis, USA (disseminated to be used by other higher education institutions) | Sankey diagram to understand students' academic trajectories [9]
6. Know Your Students (focused on teaching staff) | UC Davis, USA | Dashboard interface with students' demographic data at a course level [9]
7. Departmental Diagnostic Dashboard (focused on managers) | UC Davis, USA | Dashboard interface with students' demographic data at a course level [9]
8. Learning dashboard for Insights and Support during Study Advice (LISSA) (focused on student advisers) | KU Leuven, Belgium | Dashboard with information about student course enrolment, course credits earned, and grades of one or more students [10]
9. Course Signals (focused on teaching staff) | Purdue University, USA (licensed to Ellucian) | A set of risk indicators to classify students into subgroups [1], [8]
10. Student Flow Diagrams (focused on managers and teaching staff) | University of New Mexico, USA | Sankey diagram to understand students' academic trajectories [1]
11. Curricular Analytics (focused on managers and teaching staff) | University of New Mexico, USA | Interactive graphical representation of the curriculum [1]
12. Visualized Analytics of Core Competencies (VACCs) (focused on students) | Yuan Ze University, Taiwan (currently used at Yuan Ze University) | Visualization of competency attainment in radar charts regarding grades, credit hours and peer performance [5]
13. Competency Analytics Tool (CAT) (focused on managers and staff) | Singapore Management University, Singapore | Curriculum progression statistics based on a competency map and course information [4]
14. Course University Study Portal (CUSP) (focused on managers and teaching staff) | University of Sydney, Australia | Web-based application to model competency development in 5 maturity levels [11]

3 Methods

3.1 Study Design and Objectives

According to Zelkowitz [7], instrumental case studies are useful to determine whether a technological tool makes it easier to produce something compared to prior scenarios. Since the objective of this study is to evaluate usage of a CA tool throughout a continuous improvement process, we selected the instrumental case study as the most appropriate methodology. The case study took place at UC-Engineering between 2015 and 2017. In this case, we specifically evaluated whether the CA tool supported teaching staff efforts to collect documentary evidence to account for competency attainment.

3.2 Case Study Context and Proposition

In the early 2000s, UC-Engineering decided to accredit five academic programs through ABET, which together concentrated 35% of its undergraduate enrolment (1,500 students out of a total of 4,000): 1) Civil Engineering, 2) Electrical Engineering, 3) Computer Engineering, 4) Mechanical Engineering, and 5) Chemical Engineering. In 2015, the managers at the Office for Undergraduate Studies and the Engineering Education Unit designed a continuous improvement process to be implemented for the renewal of the ABET accreditation of these five programs. The continuous improvement process was organized in six semesters between the first semester of 2015 and the second semester of 2017. Every semester, teaching staff were required to undertake the tasks described in Fig. 1. Before the semester started, they had to participate in a workshop to revise their course syllabuses and make sure the competencies declared at a course level were aligned with those assigned at a program level. At the beginning of the semester, they had to report an assessment plan to account for competency attainment in their courses (two competencies per course in most cases). Once the semester finished, they had to report competency assessment results, which were transformed into percentages of competency attainment to be revised in an end-of-semester curriculum meeting (curriculum discussions). Additionally, they might have been required to submit a sample of the assessment methods used to assess the assigned competencies to inform curriculum decision-making (if needed).

At the end of the continuous improvement process, 104 assessment plans (http://bit.ly/2SYxWxc) and spreadsheets of competency assessment results (http://bit.ly/2VOdUKx) were expected to be reported by 124 teaching staff from 96 course sections, analysing the attainment of the 11 competencies proposed by ABET Criterion 3 (http://bit.ly/2SeVzRj).

Fig. 1. Semester tasks that teaching staff had to undertake for the continuous improvement process implemented at UC-Engineering between 2015 and 2017.

By the end of 2015, 38 assessment plans and competency assessment results had already been collected from 29 course sections for an interim report to be sent to ABET in June 2016. These documents were attached to e-mails sent by teaching staff, and then uploaded to Dropbox folders by UC-Engineering managers. For curriculum discussions, analysts of the Engineering Education Unit had to transform graded assessment results into percentages of competency attainment, so this information was only available at the end of the semester. By automating this analysis, program chairs and teaching staff would have access to competency attainment results whenever they needed them for their decision-making. This motivated UC-Engineering managers to implement a CA tool developed jointly by the University of Sydney and U-Planner (https://www.u-planner.com/en-us/home), a Latin American company that provides technological solutions and research services to higher education institutions. This tool was originally developed to facilitate the alignment between the competencies in a graduate profile and the teaching and assessment methods of the different courses within a program (see Fig. 2), and to collect documentary evidence of competency assessment at a course level as attachments. In order to include an automated visualization of competency attainment in the CA tool, U-Planner had to implement an ETL process to integrate course grading results from the LMS, and then create a report that transformed these results into percentages of competency attainment (see Fig. 3). After validating the report for 12 courses during the first semester of 2016, the CA tool was implemented to support the continuous improvement process from the second semester of 2016 onwards. At the end of 2017, the CA tool was not only used to collect all the documentary evidence required for accreditation purposes (syllabuses, course descriptions, and assessment plans), but also to visualize competency attainment results in each course for different academic periods. The percentages of competency attainment were calculated by dividing each student's grade by the maximum score of an assessment method, and then multiplying this result by 100. Subsequently, each percentage of competency attainment was classified into one of four performance levels according to thresholds established by teaching staff. These performance levels were: 1) unsatisfactory, 2) developing, 3) satisfactory, and 4) exemplary (see Fig. 3).
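To make the reported transformation concrete, the following Python sketch reproduces the calculation described above: each grade is divided by the assessment method's maximum score, multiplied by 100, and then bucketed into one of the four performance levels. The threshold values in the sketch are hypothetical placeholders, since the actual cut-offs were set by teaching staff for each course.

def attainment_percentage(grade, max_score):
    """Percentage of competency attainment for one graded assessment."""
    return grade / max_score * 100

def performance_level(percentage, thresholds=(50, 70, 90)):
    """Classify an attainment percentage into one of the four performance levels.
    The thresholds are hypothetical; teaching staff set the real cut-offs."""
    unsatisfactory_max, developing_max, satisfactory_max = thresholds
    if percentage < unsatisfactory_max:
        return "unsatisfactory"
    if percentage < developing_max:
        return "developing"
    if percentage < satisfactory_max:
        return "satisfactory"
    return "exemplary"

# Example: a student scores 5.2 out of a maximum of 7.0 on an aligned assessment.
p = attainment_percentage(5.2, 7.0)
print(round(p, 1), performance_level(p))  # 74.3 satisfactory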

Fig. 2. Screenshot of the UC-Engineering CA tool for supporting continuous improvement. Through this tool, teaching staff performed the tasks required for the continuous improvement process since the first semester of 2016.

Fig. 4 depicts the period considered for the case study, indicating the period when the tool was implemented. The tool aimed at supporting the continuous curriculum improvement process by facilitating the following activities (see Fig. 1 for the semester tasks):

(1) Filling in a course description form to broadly describe the teaching and assessment methods.
(2) Indicating the relationship between competencies at both program and course level (Fig. 2).
(3) Listing performance indicators for competencies at a program level that could be assessed at a course level.
(4) Aligning performance indicators with graded assessment methods at a course level.
(5) Generating automated reports on percentages of competency attainment (Fig. 3). This functionality is integrated with the institutional LMS to automatically capture the students' grades for aligned graded assessment methods; a sketch of such an alignment structure is given after this list.
(6) Uploading the following documentary evidence as attachments: course syllabus, assessment plans, competency assessment results, and a sample of assessment methods.
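The alignment behind activities (2) to (5) can be pictured as a simple hierarchy of program-level competencies, performance indicators, and graded assessment methods whose LMS grades are aggregated into attainment percentages. The following Python sketch shows one plausible representation of that structure; the class names, the example data, and the simple averaging rule are illustrative assumptions, not the actual data model of the U-Planner tool.

from dataclasses import dataclass, field
from typing import List

@dataclass
class AssessmentMethod:
    name: str              # e.g. a graded rubric or exam question
    max_score: float
    grades: List[float]    # grades captured automatically from the institutional LMS

@dataclass
class PerformanceIndicator:
    description: str
    methods: List[AssessmentMethod] = field(default_factory=list)

    def attainment(self) -> float:
        """Average percentage of attainment over all aligned graded methods."""
        percentages = [g / m.max_score * 100 for m in self.methods for g in m.grades]
        return sum(percentages) / len(percentages) if percentages else 0.0

@dataclass
class Competency:
    name: str                                         # program-level competency
    indicators: List[PerformanceIndicator] = field(default_factory=list)

# Example: one indicator aligned with one graded method for three students.
indicator = PerformanceIndicator(
    "Communicates constructively with other classmates",
    [AssessmentMethod("Team project rubric", 4.0, [3.0, 4.0, 2.0])])
print(round(indicator.attainment(), 1))  # 75.0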

Fig. 3. Screenshot of the automated report of percentages of competency attainment at a course level generated by the CA tool, regarding the performance indicator 'Communicates constructively with other classmates' for the effective communication competency.

Fig. 4. Summary of the continuous improvement process implemented at UC-Engineering. Light grey dots indicate the semesters where the teaching staff tasks were not supported by the CA tool, and the dark grey dots the periods where the tool was implemented as part of the process.


3.3 Case Study Participants, Data Gathering Techniques and Analysis

Between 2015 and 2017, 124 members of the teaching staff participated in the continuous improvement process implemented at UC-Engineering. These 124 teaching staff include 61 teachers (44 faculty members and 17 part-time instructors) who reported documentary evidence of competency attainment in 96 course sections, and 63 teaching assistants who supported teachers in outcome assessment tasks (see Table 2).

Table 2. Teaching staff involved throughout the continuous improvement process at UC-Engineering (before and after tool implementation)

 | Faculty members and part-time instructors | Teaching assistants
Staff involved in the continuous improvement process (N) (*) | 61 | 63
Staff involved before tool implementation (N) | 19 | -
Staff involved after tool implementation (N) (**) | 30 | 63
Staff involved before and after tool implementation (N) (**) | 12 | 0
Civil Engineering staff (N) | 13 | 22
Electrical Engineering staff (N) | 8 | 11
Computer Engineering staff (N) | 15 | 11
Mechanical Engineering staff (N) | 16 | 11
Chemical Engineering staff (N) | 9 | 8

(*) Total number of staff members who reported documentary evidence between 2015 and 2017
(**) Number of staff members who were involved as CA tool users after tool implementation (42 faculty members and part-time instructors, and 63 teaching assistants)

The evaluation process was organized into two phases. The first phase consisted in evaluating how the CA tool was used to facilitate the collection of documentary evidence for analysing competency attainment in curriculum discussions. The documentary evidence consisted of any documentation that was reported to account for competency attainment at a course level, such as: course syllabus, course description, competency assessment results, and samples of assessment methods (see Fig. 1). In order to compare the number and the type of documentary evidence generated before and after the CA tool was implemented, three researchers used a coding scheme to classify the evidence reported by each teaching staff member in each course section. This scheme was developed following a bottom-up coding approach, and each category was defined by examining the files uploaded to both Dropbox and the CA tool. From this bottom-up approach, six categories emerged, and the three researchers used them to assign scores of 0 or 1 to account for the type of documentary evidence reported every semester (see Table 3): 1) reported assessment plans, 2) reported a sample of assessment methods, 3) reported competency attainment results, 4) reported course syllabus, 5) included a course description, and 6) used the automated report of the CA tool. The total scores thus ranged from 0 to 6 for each course section, in which a score equal to 0 indicates a minimum amount and variety of evidence reported and a score equal to 6 indicates a maximum amount and variety of evidence. A minimal scoring sketch is given after Table 3.


Table 3. Coding scheme used to analyse the documentary evidence reported throughout the continuous improvement process implemented at UC-Engineering (before and after tool implementation)

Category | Category description
Reported assessment plans (0) | The teaching staff did not report an assessment plan informing how course assessment methods were used to measure competency attainment.
Reported assessment plans (1) | The teaching staff reported an assessment plan to inform how course assessment methods were used to measure competency attainment at a course section level.
Reported a sample of assessment methods (0) | The teaching staff did not report a sample of assessment methods to account for different levels of competency attainment among different students.
Reported a sample of assessment methods (1) | The teaching staff reported a sample of assessment methods to account for different levels of competency attainment, such as developing or satisfactory.
Reported competency attainment results (0) | The teaching staff did not report competency attainment results at a course section level.
Reported competency attainment results (1) | The teaching staff reported competency attainment results based on graded assessments at a course level.
Reported course syllabus (0) | The teaching staff did not report the course syllabus to complement the evidence items for the accreditation process.
Reported course syllabus (1) | The teaching staff reported the course syllabus as a complement to other evidence items reported for the accreditation process.
Included a course description (0) | The teaching staff did not include a course description among the evidence items uploaded in the CA tool.
Included a course description (1) | The teaching staff included a course description among the evidence items uploaded in the CA tool.
Reported percentages of competency attainment (0) | The teaching staff did not report percentages of competency attainment.
Reported percentages of competency attainment (1) | The teaching staff reported percentages of competency attainment at a program level.
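To make the scoring concrete, the sketch below (in Python) computes the 0 to 6 evidence score for a course section from the six binary categories listed above; the category keys and the example records are invented for illustration.

CATEGORIES = [
    "assessment_plan",
    "sample_of_assessment_methods",
    "competency_attainment_results",
    "course_syllabus",
    "course_description",
    "automated_attainment_report",
]

def evidence_score(section_record):
    """Sum the six binary categories (0/1) reported for one course section."""
    return sum(1 for c in CATEGORIES if section_record.get(c, 0))

# Invented example: one course section before and after tool implementation.
before = {"assessment_plan": 1, "competency_attainment_results": 1}
after = {c: 1 for c in CATEGORIES if c != "sample_of_assessment_methods"}
print(evidence_score(before), evidence_score(after))  # 2 5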

The second phase consisted in exploring the perceived usefulness and ease-of-use of the CA tool from the viewpoint of its users. Researchers have recently proposed instruments to evaluate the impact of analytical tools on educational practices [2], but these instruments have so far only been implemented at a course level in controlled environments. Therefore, we decided to develop a paper-based questionnaire based on the prior work of Ali et al. [12], considering that their objective was also to explore teaching staff's perspectives in a real-life context. Our questionnaire consisted of a closed-ended and an open-ended question section (http://bit.ly/2Jh3VVG). The closed-ended section consisted of a 5-point Likert scale to determine the level of staff's agreement with different items related to perceived usefulness and perceived ease-of-use, while the open-ended section included the following questions to understand usability and ease-of-use implications from an exploratory approach:

• What use would you give to the CA tool after interacting with it?
• What kind of information would you expect from this tool?
• What do you think the CA tool lacks in terms of information and functionality?

In order to gather information about the user experience once the CA tool implementation process was well advanced, we applied the questionnaire during a workshop for teaching assistants held at the beginning of the second semester of 2017 (see semester tasks in Fig. 1). A total of 25 teaching assistants who attended this workshop responded to the questionnaire voluntarily (representing 25 out of the 63 teaching assistants who had supported outcome assessment tasks since the second semester of 2016). After transcribing their responses, we estimated the percentage of respondents who agreed with each item by counting the number of respondents whose scores were equal to or higher than 4, and then dividing it by the total number of respondents (see Fig. 5).
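As a small illustration of this calculation, the following Python sketch computes the agreement percentage for a single questionnaire item from hypothetical 5-point Likert responses; the actual response data are not reproduced here.

def agreement_rate(responses):
    """Share of respondents who scored an item 4 or 5 on the 5-point Likert scale."""
    agreed = sum(1 for r in responses if r >= 4)
    return agreed / len(responses) * 100

item_responses = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]  # hypothetical responses from 10 staff
print(round(agreement_rate(item_responses)))      # 80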

Fig. 5. Percentage of teaching assistants who agreed with the questionnaire items related to the CA tool's usefulness and ease of use (N=25; see the questionnaire at http://bit.ly/2Jh3VVG).

Once all the evidence items and questionnaires were analysed, we triangulated the questionnaire results with the document analysis results from phase 1 (see Fig. 5 and Fig. 6). This process consisted of contrasting the amount and variety of documentary evidence reported by teaching staff before and after tool implementation, in addition to analysing the perspectives of questionnaire respondents on tool usage for curriculum analysis based on evidence of competency attainment.

4 Case Study Findings

The main findings of the case study are summarized in Table 4. Firstly, the CA tool helped teaching staff collect a greater number and variety of evidence, providing visualizations of competency attainment at a program level (Finding 1 in Table 4). Document analysis results show that the number and variety of evidence items reported per course section increased from two to five after the CA tool was implemented (see Fig. 6). In most cases, the three additional items were course syllabuses, course descriptions, and the percentages of competency attainment. Before the CA tool was implemented, teaching staff delegated the transformation of graded assessment results into percentages of competency attainment to professionals in the Engineering Education Unit. However, once the CA tool was implemented, 89% of the course sections relied on the automated report provided by the tool to account for competency attainment at a course level.

[Fig. 5 values: 92% agreement with 'In general, the CA tool seems useful for institutional management'; 84% with 'It is easy to learn how to use the CA tool'; 80% with 'The purpose of using the CA tool is clear and understandable'; 60% with 'The CA tool allows me to obtain information easily'; 56% with 'The CA tool allows me to obtain more information about courses than other tools'.]


This effort to collect a greater number and variety of evidence was not due to greater administrative pressure: UC-Engineering managers presented an interim report to ABET in the first semester of 2016, and all subsequent work was done solely with the motivation to continuously improve the curriculum.

Additionally, the results of the questionnaire show that 92% of respondents agreed with the item 'In general, the CA tool seems useful for institutional management' (see Fig. 5). In the open-ended questions, teaching assistants mentioned that the CA tool facilitated the use of evidence to account for the implementation of a competency-based curriculum throughout the accreditation process. They also mentioned its potential use to provide students with information about course methods and their alignment with competencies from the graduate profile.

Secondly, teaching staff identified usability and functionality issues that affect the use of evidence to inform course redesign and students' understanding of performance (Finding 2 in Table 4). The results of the teaching staff questionnaires show that 56% agreed with the item 'The CA tool allows me to obtain more information about courses than other tools (such as the institutional LMS and a web application to search for course information).' In the open-ended sections, respondents claimed that the CA tool had usability issues. For example, respondents indicated that the tool views had too many tabs and fields, which hindered the loading of information. Respondents also mentioned functionality issues, such as the lack of feedback about the quality of the course information uploaded and the documentary evidence reported. Also, the tool lacks actionable information on student performance to provide timely and specific feedback throughout the semester.

Furthermore, questionnaire respondents indicated that, although they were able to upload the documentary evidence to account for competency attainment, they perceived that the CA tool site was not intuitive enough. The fields to be completed, as well as the files to be uploaded, were not clearly explained. This may partly explain why only 16% of course sections reported samples of assessment methods. These results suggest that the CA tool could be improved by including 'help messages' or indications related to the process within the platform.

Table 4. Main findings about the usefulness and the ease-of-use of the CA tool

Findings | Document analysis and questionnaire results | Supporting data
1. The CA tool helped teaching staff collect a greater number and variety of evidence, providing visualizations of competency attainment at a program level. | Course sections reported 3 additional evidence items for the accreditation process after the CA tool was implemented. | Document analysis results (Fig. 6)
 | 89% of course sections used the automated reporting feature of the CA tool to account for competency attainment at a course level (40 out of 45). | Screenshot of the automated report of competency attainment (Fig. 3)
 | 92% of teaching staff agreed with the questionnaire item 'In general, the CA tool seems useful for institutional management' (23 out of 25). | Teaching staff questionnaire (Fig. 5)
2. Teaching staff identified usability and functionality issues in the CA tool that affect the use of evidence to inform course redesign and students' understanding of performance. | 56% of teaching staff agreed with the questionnaire item 'The CA tool allows me to obtain more information about courses than other tools' (14 out of 25). | Teaching staff questionnaire (Fig. 5)
 | 16% of course sections reported samples of assessment methods after the CA tool was implemented (7 out of 45). | Document analysis results (Fig. 6)

Fig. 6. Average number of evidence items per course section submitted each semester (range 0 to 6), from the first semester of 2015 to the second semester of 2017, comparing evidence submitted via Dropbox with evidence submitted via the Curriculum Analytics Tool.

5 Discussion and Limitations

This case study showed the results of the usage of a CA tool in a long-term continuous improvement process. The main finding shows that the use of the CA tool helped teaching staff collect a greater number and variety of documentary evidence of competency attainment at a program level, allowing staff to upload documents that were previously shared or stored somewhere else, such as course syllabuses.

Furthermore, the CA tool provided staff with a visualization of competency attainment throughout the whole continuous improvement process, information that was previously shown and discussed only in curriculum meetings at the end of the semester. This finding not only reflects that CA tools could be a good solution to reduce the task workload of curriculum analysis [2], but also that this type of solution provides visualizations of student performance that are typically hidden in institutional processes. This is particularly valuable for competency-based curricula because it provides an overview of the competencies attained by the students at a program level. Teaching staff usually face difficulties in competency assessment because competencies tend to be abstract and complicated at a course level, while managers deal with the complexity of understanding learning results in a hierarchical structure of courses [3]. With this new tool, managers and teaching staff could have more actionable information for making better decisions to reinforce the required competencies at a program level.

Although this work has illustrated that a CA tool can support curriculum analysis in the context of an accreditation process, there are still teaching staff needs that have not been met by the tool under study. Finding 2 in Table 4 indicates that there are usability and functionality issues that prevent teaching staff from using the tool to obtain information to redesign courses and to understand students' performance at an individual level. These are staff needs that have already motivated the development of other tools [8]. In order to solve these issues, we could specify redesign requirements for the CA tool using the information we have already collected to evaluate its usage. Thus, lessons learned from this case study imply that institutions could adopt and adapt existing CA tools to support curriculum analysis, investing not only in tool development and redesign, but also in capacity building for evidence-informed continuous improvement [1].

Along these lines, one of the main contributions of this paper is that it evaluates the use of a CA tool in an existing institutional process over an extended period of time, going beyond current evaluation strategies that mostly employ self-reported data without relying on a technology validation methodology [5]. The document analysis considered evidence items from the whole 3-year period in which the continuous improvement process was implemented. This long-term period ensured that enough information was collected to determine whether the CA tool facilitated teaching staff efforts to collect evidence of competency attainment before and after its implementation [7].

Yet, this study has its limitations. Questionnaire results only represented a small sample of the teaching staff members who interacted with the CA tool. To understand the contribution of the CA tool for a larger group of teachers, future work would have to explore the type of information most staff need to inform course or curriculum redesign. We anticipate this work might require monitoring learning results over an extended period of time, as we did in this case study [5], so we can expand the current knowledge on the impact of CA tools on curriculum improvement even further.

6 Conclusions

This case study illustrates the usage of a CA tool to support a continuous improvement process for a competency-based curriculum in a university setting. Findings show that CA tools support curriculum analysis when they are implemented to help teaching staff cope with tasks they are already performing for an existing institutional process, such as course planning and competency assessment. In this study, the CA tool not only facilitated the collection of documentary evidence to account for competency attainment at a program level, but also provided visualizations of competency attainment results that are usually not available to staff. Although staff felt comfortable using the tool to upload evidence of competency attainment, usability and functionality issues should still be addressed in future versions of this tool. Future work on CA adoption should not only focus on tool development, but also on the evaluation of its impact on the formulation of curriculum improvement actions.


Acknowledgements

This work was funded by the EU LALA project (grant no. 586120-EPP-1-2017-1-ES-EPPKA2-CBHE-JP). The authors would like to thank Camila Aguirre from U-Planner for collaborating with this study, and the reviewers for their constructive suggestions.

References

[1] M. D. Pistilli and G. L. Heileman, "Guiding Early and Often: Using Curricular and Learning Analytics to Shape Teaching, Learning and Student Success in Gateway Courses," New Dir. High. Educ., vol. 180, pp. 21–30, 2017.
[2] M. Scheffel, "The Evaluation Framework for Learning Analytics," Ph.D. thesis, Open Universiteit (Welten Institute – Research Centre for Learning, Teaching and Technology), the Netherlands, 2017.
[3] X. Ochoa, "Simple metrics for curricular analytics," in CEUR Workshop Proceedings, 2016, vol. 1590, pp. 20–26.
[4] S. Gottipati and V. Shankararaman, "Competency analytics tool: Analyzing curriculum using course competencies," Educ. Inf. Technol., pp. 1–20, 2017.
[5] C. Y. Chou et al., "Open Student Models of Core Competencies at the Curriculum Level: Using Learning Analytics for Student Reflection," IEEE Trans. Emerg. Top. Comput., vol. 5, no. 1, pp. 32–44, 2015.
[6] C. Klein, J. Lester, H. Rangwala, and A. Johri, "Learning Analytics Tools in Higher Education: Adoption at the Intersection of Institutional Commitment and Individual Action," Rev. High. Educ., vol. 42, no. 2, pp. 565–593, 2018.
[7] M. V. Zelkowitz, "An update to experimental models for validating computer technology," J. Syst. Softw., vol. 82, no. 3, pp. 373–376, 2009.
[8] N. Sclater, A. Peasgood, and J. Mullan, "Learning Analytics in Higher Education: A review of UK and international practice," 2016.
[9] J. Greer, M. Molinaro, X. Ochoa, and T. McKay, "Proceedings of the 1st Learning Analytics for Curriculum and Program Quality Improvement Workshop," in 6th International Learning Analytics & Knowledge Conference, 2016, pp. 1–24.
[10] S. Charleer, A. Vande Moere, J. Klerkx, K. Verbert, and T. De Laet, "Learning Analytics Dashboards to Support Adviser-Student Dialogue," IEEE Trans. Learn. Technol., vol. 11, no. 3, pp. 389–399, 2018.
[11] R. Gluga, J. Kay, and T. Lever, "Is Five Enough? Modeling Learning Progression in Ill-Defined Domains at Tertiary Level," in IllDef2010: ITS2010 Workshop on Intelligent Tutoring Technologies for Ill-Defined Problems and Ill-Defined Domains, 2010.
[12] L. Ali, M. Asadi, and D. Gašević, "Factors influencing beliefs for adoption of a learning analytics tool: An empirical study," Comput. Educ., vol. 62, pp. 130–148, 2013.