

Evaluating the effectiveness of the Faculty of Medicine’s online staff

development modules

June-Aug 2013

Selma Omer & Sarah Brien Nov 2013


Index

About MEDUSA

Purpose of the evaluation

Key questions

Data sources

Findings:

1. The users’ experience with using MEDUSA modules

2. The users’ motivations for using MEDUSA

3. How the design principle supports learning

4. The influence of MEDUSA on the users’ educator roles

Summary

Acknowledgements:

We would like to thank everyone who contributed to the design and development of our MEDUSA modules. Special thanks to Marcus Parry, Faith Hill, Kevin Galbraith and Sunhea Choi (and team), who were the “founders” of the MEDUSA modules and continue to be heavily involved in their development. We would also like to thank Sunhea for her contribution to the development of the staff survey.


About MEDUSA

The Faculty of Medicine at the University of Southampton runs a highly successful staff development programme, designed to meet the needs of our staff and of the NHS clinicians in Southampton and from across the Region who teach our medical students. Since it can be difficult for busy clinicians to attend workshops, we have developed online learning resources to improve access to staff development. These can be accessed through our online portal: MEDUSA (Medical Education Development Unit Staff-Development Access). To date we have developed and offered 9 modules on teaching, facilitating learning and assessment, shown in Table 1.

Table 1. Current MEDUSA modules

Teaching:
• Planning and delivering lectures
• The student assistantship
• The BM curriculum

Facilitating Learning:
• Diversity
• From classroom to clinical learning (CtoC)
• Supervising student projects

Assessment:
• The role of the OSCE examiner
• Assessment of Clinical Competence (Mini-CEX)
• Giving constructive feedback

Our underlying principle for developing the content of these modules is Kolb’s Experiential Learning Cycle. The learning activities represent each stage of the learning cycle: experience, reflection, conceptualisation and experimentation (Figure 1).

Figure 1. The design of MEDUSA modules, based on the structure of Kolb’s experiential learning cycle


Purpose of the evaluation:

We have been developing MEDUSA modules since 2008, and the programme is still growing, with two new modules developed each year. Although we invest considerable resources in their development, we have not formally evaluated their use. The purpose of this evaluation is to understand our users, reassess their needs and requirements, and explore their experiences with using MEDUSA. We hope that this evaluation will help us to:

Improve MEDUSA functions and its modules

Increase uptake of online staff development

Simplify MEDUSA functions and programming

Share lessons with others external to the University of Southampton

And ultimately,

Set strategic direction of staff development

Key questions:

1. What are the users’ experiences with using MEDUSA modules?

2. What are the users’ motivations for using MEDUSA?

3. Does the design principle support learning?

4. What is the benefit of MEDUSA modules? Do they change the way that users work as educators?


Data sources:

1. MEDUSA records and inbuilt evaluation

MEDUSA records data, allowing users to manage their learning and the Faculty to keep records. Between March 2011 and July 2013, 327 users completed modules; 50.2% were female and 49.2% male. Most users were non-clinical academics (57.8%), 35.2% were clinical academics, and the remaining 7% had non-teaching roles, including pastoral and research roles.

Each of our MEDUSA modules has an inbuilt evaluation survey that users are directed to once they complete the module. We had a high response rate, with 86.5% of the 327 participants who used the modules completing the evaluation survey.

2. Staff survey

To address the questions that were not captured by the inbuilt evaluation survey, we administered an online survey to further explore our staff’s motivations for using online training. The survey was distributed to Faculty of Medicine academic staff, GP trainers and NHS clinicians across the south west region between August and September 2013. Data were collected from 99 participants, of whom 57.6% were female and 42.4% male. Most were clinical teachers (44.4%), followed by non-clinical teachers (16.2%), and 39.3% had non-teaching roles, including administrative, research and pastoral roles.


Findings

1. The users’ experience with using MEDUSA modules

1.1 Module use

The 327 participants completed a total of 419 modules (Figure 2). Across the whole group, the most popular module was Diversity (completed by 182 participants); this may be because its completion became a Faculty requirement prior to the GMC visit. The CtoC module, being relatively new, was completed by only 7 participants. Additionally, we received external funding from the Higher Education Academy (HEA) to develop the CtoC module and, consequently, there are two versions: an internal version and an open-access version that does not require log-in. The data were collected from the internal version only. Please also note that the Supervising Student Projects and BM Curriculum modules were not live at the time of data analysis.

Figure 2. MEDUSA module use

Overall, the majority of our users completed 1 module (N=270, 82.6%) or 2 modules (N=41, 12.5%), and only 4.8% (N=16) completed 3 or more modules.

Figure 3. The majority of MEDUSA users complete only one module
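As a quick sanity check, the completion-frequency percentages can be recomputed from the raw counts. This is a minimal sketch in plain Python; the counts (270, 41 and 16) are taken from the text above and are assumed to cover all 327 users:

```python
# Module-completion counts reported in the text (Figure 3 data)
counts = {"1 module": 270, "2 modules": 41, "3 or more modules": 16}

total = sum(counts.values())
print(total)  # 327, matching the reported number of users

for label, n in counts.items():
    print(f"{label}: {100 * n / total:.1f}%")
```

The first two categories reproduce the quoted 82.6% and 12.5%; the last rounds to 4.9% rather than the 4.8% quoted, which appears to be a minor rounding difference in the report.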

[Figure 2 bar chart: number of participants who completed each module — values 77, 90, 29, 22, 12, 182 (Diversity) and 7 (CtoC), totalling 419 module completions.]

[Figure 3 chart: frequency of modules completed per participant — 1 module 83%, 2 modules 13%, 3 modules 2%, >3 modules 2%.]


We wanted to explore the way in which our users use MEDUSA. In the staff survey we asked MEDUSA users to describe whether they had fully completed each module, started it, only logged in, or not used it at all. The majority of participants did not respond to this question, so it is difficult to clearly establish participants’ patterns of use. The number of participants reporting having used each of the modules ranged from 1 to 28. As shown in Figure 4, although most responders (70%) reported fully completing the modules, we appear to be losing 30% of our users, who either start the modules or only log on without completing them. This raises some interesting questions as to why they do not complete the modules. Moreover, as shown in Figure 3, we have 270 users who are familiar with MEDUSA modules and have completed at least one module; what will it take to get users to revisit the site and complete more than one?

Figure 4. The majority of MEDUSA users fully complete the modules that they start

1.2 When MEDUSA is used

The staff survey also explored when participants use MEDUSA modules. As shown in Figure 5, of the 48 participants who responded, 11 did not remember when they had used the modules; of the remainder, most reported using them in their own time (n=17, 46%) or in work time (38%). We also asked staff what the ideal module completion time would be. Of the 52 participants who responded, the majority felt 10–30 minutes was ideal (73.1%), 25% preferred 30–60 minutes, and only 1% felt it should be longer than 1 hour. The inbuilt evaluation survey also asked users how long they spent completing the module. The majority of our users actually spent 10–30 minutes (52%) or 30–60 minutes (31%) completing modules. This means that we are designing the modules close to the users’ ideal time!

[Figure 4 chart: how participants used MEDUSA modules — fully completed 70%, started it 23%, only logged on 7%.]


Figure 5. MEDUSA modules offer a flexible learning approach that allows users to access staff development in their own time

1.3 Satisfaction with modules

MEDUSA users rated each module (as part of the inbuilt evaluation survey) on a 5-point scale (where 1 = very dissatisfied, 5 = very satisfied) to report their level of satisfaction with regard to: relevance; meeting of learning outcomes; maintenance of interest; amount of interaction; type of interaction; ease of navigation; and overall structure (Table 2A). The modules were rated very highly, with a median score of 4 and over 80% of participants reporting satisfied or very satisfied ratings. Refer to Table 2B for satisfaction ratings per individual module.

Table 2A. Overall satisfaction ratings (data from the 86.5% of participants who completed the inbuilt evaluation survey)

Item                        Median rating (1–5 scale)   % satisfied/very satisfied
Amount of interaction       4                           86.7
Ease of navigation          4                           82.8
Maintenance of interest     4                           81.0
Meeting learning outcomes   4                           86.5
Overall structure           4                           84.8
Relevance                   4                           87.8
Type of interaction         4                           83.9

Table 2B. Median satisfaction ratings for each individual module

Item                        Mini-CEX   Lecture   OSCE     Feedback   Assistantship   Diversity   CtoC
                            (N=40)     (N=20)    (N=62)   (N=26)     (N=8)           (N=122)     (N=5)
Amount of interaction       4          5         4        4          4               4           4
Ease of navigation          4          5         4        4          4               4           4
Maintenance of interest     4          5         4        5          4               4           4
Meeting learning outcomes   4          5         4        4          4               4           4
Overall structure           4          5         4        4          4               4           4
Relevance                   5          5         4        5          4               4           4
Type of interaction         4          5         4        4          3.5             4           4

[Figure 5 chart: when participants use MEDUSA modules — mostly in my own time 46%, mostly in work hours 38%, both in work and at home 16%.]


2. The users’ motivations for using MEDUSA

2.1 Previous experiences with staff development

In the staff survey we explored our staff’s previous experiences with the different modes of staff development available, including face-to-face and online training. Of the 99 participants, most had used face-to-face training alone (N=38, 38.4%) or a combination of online and face-to-face (N=37, 37.4%). Only 2 had used online training alone (2.0%).

Comparing the groups of clinical teachers, non-clinical teachers and those with non-teaching roles: most clinical teachers had attended face-to-face sessions (n=23, 52.3% of all clinical teachers), then a combination of online and face-to-face (n=14, 31.8%); none had attended online training only. Most non-clinical teachers had attended a combination of online and face-to-face (n=12, 75.0%), followed by face-to-face training alone (n=2, 12.5%); like clinical teachers, none had attended online training only. The non-teaching group were most likely to have attended face-to-face training (n=13, 33.3%), followed by a combination of online and face-to-face (n=11, 28.2%), then online alone (N=2, 5.1%). Clinical teachers and those with non-teaching roles therefore had a similar pattern of use compared to non-clinical teachers (Figure 6A). Overall, there was a highly significant difference between groups in the type of training undertaken (Fisher’s exact test, p=0.007).

Participants were asked to report the number of face-to-face and online training events they had attended over the previous 2 years. On average, most had attended more than 3 face-to-face training sessions and 1 or 2 online training courses. There was no group difference in the number of face-to-face (p=0.162) or online events attended (p=0.674). Figure 6B shows the median number of face-to-face and online training events attended by clinical teachers, non-clinical teachers and the non-teaching group.

Although clinical teachers attended fewer staff development training events than the other groups, they participated in more online training than face-to-face events. The other two groups attended more face-to-face events than online training. This is possibly because these groups tend to be campus based, where face-to-face training is more readily accessible to them, compared to clinical teachers, who may be distributed in various locations across the Region.


Figure 6. Participants’ experiences with staff development training events at the University of Southampton

Data from 78.8% of the staff survey participants who had participated in staff development

[Figure 6A chart: type of staff development activities attended (%) — clinical teachers: face-to-face 52.3, online 0, both 31.8; non-clinical teachers: face-to-face 12.5, online 0, both 75.0; non-teaching role: face-to-face 33.3, online 5.1, both 28.2.]

[Figure 6B chart: median number of staff development events attended over the last 2 years — clinical teachers: face-to-face 1, online 1.5; non-clinical teachers: face-to-face 3, online 2; non-teaching role: face-to-face 3, online 2.]


2.2 Reasons for using MEDUSA

In the staff survey, we asked participants whether they had used MEDUSA modules in the past. Of those who responded, 33 (38.4%) had used MEDUSA modules but the majority had not (n=53, 61.6%). We wanted to know: 1) why they did not use MEDUSA; 2) why they did use MEDUSA; and 3) what would help them use MEDUSA in the future.

Why they didn’t use MEDUSA: Participants selected between 1 and 3 reasons for not using MEDUSA modules, from up to 10 options provided on the survey. The most frequently reported reason was that they didn’t know about the modules (N=39, 39.4%), followed by not having enough time (N=9, 9.1%). Six or fewer participants selected each of the following options: they found it hard to access the modules; they didn’t think the modules would benefit them; they only like face-to-face training; or they did not like online learning (Figure 7A).

Why they used the MEDUSA modules: 43 participants selected a reason why they had used the modules, from 10 options provided. The reasons (in decreasing order) were: to know more about a topic (n=25); it was a University requirement (n=15); to develop skills (n=14); interest in the topic (n=14); it was a GMC requirement (n=12); to learn at their own pace (n=10); they needed it for their appraisal (n=8); it was recommended by a colleague (n=8); or for training other staff (n=7) (Figure 7B).

What would help them use MEDUSA in the future? 86 participants (86.9%) selected at least one of the 8 options. Participants selected the following (in decreasing order): better advertising of the modules (n=53); modules more relevant to their work (n=37); having time release (n=32); CPD being awarded (n=32); making the modules easier to access (n=25); a blended approach being used (n=25); or making the modules compulsory (n=16) (Figure 7C).


Figure 7. Users and their motivations for using MEDUSA.

(A) Data from the 50.5% of participants who completed the staff survey; the mean number of reasons selected was 1.28 (SD 0.57).

(B) Data from the 43.4% of participants who completed the staff survey; the mean number of options selected was 1.24 (SD 1.76).

(C) Data from the 86.9% of participants who completed the staff survey; the mean number of options selected was 2.2 (SD 1.59).

[Figure 7A bar chart: reasons for not using MEDUSA modules.]

[Figure 7B bar chart: reasons for using MEDUSA modules.]

[Figure 7C bar chart: what might help people use MEDUSA modules.]


2.3 Future topics

Participants who completed the staff survey suggested topics they would like for future modules, as shown in Table 3.

Table 3. Topics for future MEDUSA modules

Clarification of the Personal tutor role

Methodological research and data management-related; clinical studies etc.

Use of Wikis in assessment

Project management (PRINCE2 or Microsoft Project)

Management skills in order to progress within the University

What is expected of medical students doing research? How to encourage and promote student-designed and student-led projects

Mental health issues for students

Time management, presentation skills, career development planning, lecturing skills

Curriculum management roles

A simple overview of the teaching structure for the students. I don't have time to read the curriculum from cover to cover every time it changes.

Simpler guidelines for marking essays and projects.

Assessment theory

Learning theory and practice

Effective uses of technology to improve learning

How to teach what we are supposed to be teaching in an inspiring way. How they will be tested at the end.

Feedback to students and Motivating students

Marking the student at end of attachment

Grading students, what to look for in a Grade A student as opposed to a grade E student

Benchmarking mini-CEXs

Giving feedback

Duties as a clinical trainer in GP as per GMC guidance

Supporting students

Small group teaching/ facilitation

OSCE training and supporting students in difficulty

Conscious and unconscious bias

Appraiser/appraisee training for PPDR (or equivalent)

New educational developments

Student mental health issues (eating disorders, mood disorders)

Ethical dilemmas

Health & Safety

Topics for Admin staff


3. How the design principle supports learning

As previously described, MEDUSA modules were designed around Kolb’s learning cycle. We wanted to know whether the module design does in fact support learning: whether the different features that we had built into each module, for example videos, ‘have a go’ activities, models and cases, help to promote learning through experience, reflection, conceptualisation and experimentation.

On the inbuilt module evaluation, participants reported the features that they liked about the MEDUSA modules in open-ended comments. Analysis of the qualitative data identified 5 categories, relating to module content, delivery, activities, structure and resources provided, shown in Table 3, ranked from highest to lowest, with a description or subcategories for each. The majority of the comments (56%) related to the content, and in particular the way in which the modules conveyed cases and examples, practical tips and key concepts/theoretical models, related students’ and examiners’ views, and were thought provoking.

What is interesting is that the features that users spontaneously identified relate directly to Kolb’s learning cycle. For example, our users liked features such as the cases and examples used, and the relating of students’ and examiners’ views, which we had designed to simulate a relevant experience based on Kolb’s model. Similarly, users found the content thought provoking, so it stimulated reflection. The key concepts and models related to conceptualisation. Also among the top ranked were the activities, in particular opportunities to practise and get feedback on performance; these promoted active experimentation according to Kolb’s model.

Table 3. MEDUSA features that were liked by users

Qualitative data from a total of 368 comments reported by 225 participants who completed the inbuilt evaluation survey

Rank   %    Category     Description
1      56   Content      Cases and examples used; highlights practical tips; key concepts & models; appropriate length; relates students’ and examiners’ views; relevant, informative, realistic; thought provoking
2      35   Delivery     Animations & videos used; interactivity; multimedia; ease of use & navigation
3      23   Activities   Chance to practise; feedback on performance
4      12   Structure    Clear, concise, simple language; organised; engaging
5      3    Resources    References


4. The influence of MEDUSA on the users’ educator roles

Participants were able to report the ways in which the module would change their work as educators. The qualitative data were analysed and categorised into 7 themes based on the users’ comments. 35% of participants felt that the modules would enhance their roles as educators by increasing their awareness and helping them to gain knowledge and improve their understanding. Other themes included changing their practices, and increasing their reflection, performance and confidence.

Table 4. The users reported the various ways in which MEDUSA modules will change their work as educators

1. Awareness (18%) — raising awareness, reminding and reinforcing concepts.
“Reinforced some things I knew but do not always focus on and a good opportunity to reflect on own skills and course design”

2. Knowledge (17%) — gaining knowledge; improved understanding.
“I feel more informed, and have a better idea of standard required”

3. Change (13%) — changing practices: a shift in focus or method.
“Encouraged me to get students to discuss with each other their feedback after they get it and to offer more opportunity to discuss feedback they get on an assignment”

4. Reflection (12%) — making the user reflect on their practice.
“I think it will help me to consider again how I present things to students, to enable as wide an inclusion as possible”

5. Performance (11%) — building skill, improving performance.
“It will improve how I deliver lectures and help me to keep my audience engaged throughout so that I can maximise how much the students get out of it”

6. Confidence (8%) — improving confidence.
“I have more confidence that I'm on the right track!”

6. Application (8%) — applying learning in practice.
“I took away some useful ideas to try out with my next student…”

7. Recommend (3%) — recommending the module, using it to train others.
“It is an easily accessible resource to use as a refresher and I will use it as a point of reference for my colleagues and those who provide MiniCEX practice assessments for the students.”

Qualitative data from a total of 189 comments reported by 174 participants who completed the inbuilt evaluation survey.


Summary

We have developed a series of online staff development modules on a range of topics in teaching, facilitating learning and assessment. Over the last two years, more than 300 of our staff members have used these modules. Over 1,000 staff members and clinicians are involved in teaching our students, and our goal is for more of them to use MEDUSA modules.

We administered a survey to our teaching staff and clinicians to explore the way they use different modes of staff development and their motivations for using MEDUSA modules. Overall, our staff seem to prefer face-to-face events, or a combination of face-to-face and online training, but not online training alone. The survey data also show that our clinical teachers participate in fewer staff development events but are more likely to participate in online training compared to our non-clinical teaching staff and staff with non-teaching roles.

We found that the primary reason for using MEDUSA was to increase knowledge and awareness in a given topic. Two thirds of the staff who participated in the survey had not used MEDUSA before, the primary reason being that they had not heard of the MEDUSA modules. Consequently, the majority of survey participants felt that they would use the modules more often if they were better advertised. Clearly, one area for improvement is advertising of the modules, since lack of awareness was both the most frequent reason staff gave for not using MEDUSA and the main reason why they would use it in the future.

Our findings also showed that the modules were highly rated by participants and may influence the way that they work as educators. Additionally, placing learning theory at the centre of module design helped to meet the learning needs of our users. MEDUSA modules offer a flexible learning approach that allowed the majority of our survey participants to access staff development in their own time, during non-work hours.

We realise that MEDUSA does not replace face-to-face training, but it may provide an interactive and flexible learning approach, increasing opportunities for staff development for those who cannot easily attend face-to-face training events.