
Formative Evaluation

Francisco Rico

California State University, San Bernardino


In order to determine whether my instructional design was effective, I conducted formative testing. Formative testing is "…testing with representative users and representative tasks on a representative product where the testing is designed to guide the improvement of future iterations" (Theofanos & Quesenbery, 2005, p. 29). That definition sounds very broad, but it can be summarized as a simulation. I conducted one-on-one evaluations and had a subject matter expert (SME) review my instructional design product. Formative evaluations were conducted to ensure that learners could meet the objectives. According to Clark (2015), the main purpose of conducting a formative evaluation "…is to catch deficiencies ASAP so that the proper learning interventions can take place that allows the learners to master the required skills and knowledge."

Any time an individual is learning new concepts, it is important to build a framework of knowledge or to tap into and build on a pre-existing knowledge framework. In order to ensure that learners would be able to grasp new concepts, I needed to ensure that my product was designed to best help learners meet the stated goals. Formative evaluations not only identify issues in knowledge and content, they also allow participants to discuss "…any ongoing challenges and brainstorm ways to solve those issues" (Lauby, 2014). During formative evaluation, it may be discovered that one or many parts of the content being presented are incorrect or presented poorly. By asking participants questions connected to the summative evaluation, I was able to gather feedback and alter my product accordingly.

My participants all have their teaching credentials and are currently employed; other than that, they are rather diverse. They differ in their formal education (BA and MA), years taught (2-26), subject of expertise (various), ethnicity (mainly white), technology knowledge (little to in-depth), and age (26-53). I was able to recruit a total of nine participants to take both the pretest and the posttest. I focused on recruiting only high school history teachers; unfortunately, I found only four. My participants consist of four high school history teachers, three middle school history teachers, and two high school English teachers. I was comfortable having two English teachers as participants because the new English curriculum at our school requires the inclusion of history. Although the website is focused on helping history teachers, it could potentially benefit teachers in other subjects. Having non-history participants gives me a better overall idea of how different teachers respond to the design.

To ensure that I received the best direct feedback, I had three history teachers respond to the one-on-one questionnaire about the design of my website. The three history teachers are diverse: one is a newer middle school history teacher, another is a high school history teacher who previously taught in middle school, and the third is currently the lead history teacher at my charter school. Having a diverse group of teachers gives me a better representation of history teachers in general; relying on only one subgroup of history teachers would not provide an accurate picture of the needs of all history teachers.

I used Google Forms to collect data for my pretest, posttest, and one-on-one formative evaluations. The pretest consisted of ten Latin American history questions, while the posttest consisted of the same ten Latin American history questions plus five questions pertaining to technology. I kept the content questions identical in order to ensure that any change in scores was due to the website. The five technology questions were a mix of simple recall and questions requiring participants to know how the technology was used. To collect feedback from an SME, I asked the lead history teacher at my charter school a few simple questions through email.


The instructional material consisted of my website and the various links and content associated with it. The website uses informative and how-to videos, primary sources, descriptions, and secondary sources. All of the information on the website should be used to help teachers get the most out of it and to ensure that they meet the stated objectives.

Most of the useful information that I gained was provided by the SME. The SME commented that the website had a slight left-leaning bias and minor aesthetic issues, and that it could use personal lessons. I did not change much of the content because, although I think the information may come off as somewhat left-leaning, I feel I am presenting it clearly. I am not arguing that the right wing in each individual country is wrong; I am merely presenting the information I deem important. I went over the site and cleaned up all of the aesthetic issues. I did not create personal lesson plans that included technology, as I did not find them necessary to the overall content of the page.

The only feedback that I received from the one-on-one questionnaires was that I should add more information on other countries in the future. I agree that I should include information for additional countries, but I feel the website already contains a lot of information. One of the one-on-one participants mentioned that they did not look at the information in depth, which leads me to conclude that there is already a wealth of information on the website. In the future I want to create a sister website covering four additional Latin American countries. The lack of critique of my site led me to conclude that my website was either great or that my questionnaire could have been worded better. Richardson, Kalvaitis, and Delparte found in their study that a lack of good feedback was connected to the idea that "Many instructors and observers reported that the rigid structure of the forms was limiting" (2014, p. 201) and therefore limited the potential of feedback. I am also concerned that pushing teachers to complete the questionnaire at the end of the school year limited their desire to give in-depth critiques (Rantanen, 2013). The overall positive feedback might also relate to the idea that students respond positively to instructors they like (Bianchini, 2014). It is possible that my participants' favorable feedback about my website was influenced by their opinion of me personally.

I gave my participants a pretest and a posttest that consisted of the same Latin American history content questions. Participants took the ten-question pretest before they were given access to my website; I took the average percent correct on every question and determined that the overall average was 37%. I then gave participants access to my website and asked them to look at it over the course of a week. After the week was over, I gave participants the posttest, which consisted of the same ten content questions and five questions on technology. The overall average score on the content questions jumped to 83%, and the overall average score on the technology section was 86%.
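As a rough illustration of how figures like these could be produced, the sketch below shows one way to compute average percent-correct scores from Google Forms CSV exports. The file names, question headers, and answer keys are hypothetical placeholders, not the actual forms used in this project.

    # Minimal sketch, assuming hypothetical CSV exports and answer keys; the real
    # Google Forms exports and question wording are not reproduced here.
    import csv

    def average_percent_correct(csv_path, answer_key):
        """Average percent of correct answers across all respondents in one export."""
        scores = []
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                correct = sum(1 for question, answer in answer_key.items()
                              if row.get(question, "").strip() == answer)
                scores.append(100 * correct / len(answer_key))
        return sum(scores) / len(scores) if scores else 0.0

    # Hypothetical answer keys keyed by the forms' question headers.
    content_key = {f"History Q{i}": f"correct answer {i}" for i in range(1, 11)}
    technology_key = {f"Technology Q{i}": f"correct answer {i}" for i in range(1, 6)}

    print("Pretest content average:", average_percent_correct("pretest.csv", content_key))
    print("Posttest content average:", average_percent_correct("posttest.csv", content_key))
    print("Posttest technology average:", average_percent_correct("posttest.csv", technology_key))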

The themes that I noticed while looking over the questionnaires were "don't change a thing," "the website is helpful," and "suggestions." "Don't change a thing" comprised 54% of the responses, which is the majority and leads me to conclude that, overall, the website should remain intact. "The website is helpful" made up 25% of the responses, which means the website had a somewhat positive impact on the one-on-one participants. Finally, 17% of the responses contained suggestions, which mainly consisted of minor items that the SME also pointed out.
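Once each response has been hand-coded into a theme, a simple tally like the sketch below is one way percentages such as these could be derived. The theme labels and coded list are illustrative placeholders, not the actual questionnaire responses.

    # Minimal sketch, assuming responses have already been hand-coded into themes;
    # the coded list below is placeholder data, not the actual questionnaire responses.
    from collections import Counter

    coded_responses = [
        "don't change a thing", "the website is helpful", "don't change a thing",
        "suggestions", "don't change a thing", "the website is helpful",
    ]

    counts = Counter(coded_responses)
    total = sum(counts.values())
    for theme, n in counts.most_common():
        print(f"{theme}: {n}/{total} responses ({n / total:.0%})")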

Although my posttests had positive responses, Zint (2015) points out that "Evaluation results are likely to suggest that your program has strengths as well as limitations." As I began to examine my results, I started to think about the limitations of my posttest and one-on-one questionnaires. I concluded that there are three limitations: (1) the posttest questions consist mainly of recall, (2) the structure of the questionnaire is too rigid, and (3) I ask whether teachers feel they could meet the objectives, but I do not require them to submit an example of their technology use. In the future I need to create a posttest that requires deeper content and concept knowledge, include an additional comment section on the questionnaire, and require that teachers submit an example of their use of technology.

A strength of my project is that I was able to quickly collect feedback from my participants. According to that feedback, my website was well designed and allowed teachers to meet the objectives. I also included my email address on the website, which allows non-participants to contact me with questions and gives them the opportunity to receive feedback.

What I enjoyed most about the instructional design process was creating the website and finding the information that goes on it. I have built a site before as part of my e-portfolio, but I really enjoyed building this one. As a Mexican American, I have always been interested in the history of Mexico and other Latin American countries, and this project allowed me to explore that topic further. I was able to spend time looking at videos, lessons, and articles that I might not have had the time to look at before. I also liked looking for new technology to use as a history teacher; I enjoy playing with new tools, and creating my website gave me the chance to find and experiment with them.

Lastly, I liked receiving feedback on my project, both from the one-on-one questionnaires and from the SME. The reason was twofold: improvement ideas and positive reinforcement. I was able to get great ideas on how to adjust my website, or even completely revamp it, to make it better. I also tended to receive positive feedback, which was like a pat on the back.


The two biggest hurdles I faced while completing this assignment were making sure that I was turning in the correct thing and redoing my site. I am not sure if I was just mentally drained this quarter or if I didn't understand how the assignments were set up, but I was often confused as to exactly what I had to turn in. I overcame this obstacle by doing three things: re-reading the instructions, asking my professor, and asking my classmates. I did not find re-reading to be as helpful as asking my professor or my classmates. The other big obstacle was having to redo large parts of my website. I had a grand idea of what I wanted on my website, and when I got down to finding and organizing the information, I realized it was too much, so I had to adjust the website accordingly. Prior to that, I wanted to focus on revolutions of the 20th century, but found that to be chaotic as well. I overcame this obstacle by talking to my professor and colleagues about what was feasible. What also helped was simply doing it and not deliberating over how difficult it would be to start over.

I have gained a lot of knowledge about instructional design during this project. I am now more familiar with what it takes to create a product and with all the steps one needs to take to ensure that the final product is good. At first I did not spend much time talking to stakeholders; although I did ask some questions, I relied more on what I wanted out of the website. As the project progressed, I realized that I needed more input from stakeholders, and in the future I will seek out more stakeholder input prior to starting my website or product. I now realize how important it is to sketch out an idea of what you want to create, especially if it is a website. The hours that I wasted creating something and then having to restart could have been spent making my site even better.

Appendix

Pretest

Posttest


Needs

SME feedback


References

Bianchini, S. (2014). Feedback effects of teaching quality assessment: Macro and micro evidence. Assessment & Evaluation in Higher Education, 39(3), 380-394. doi:10.1080/02602938.2013.842957

Clark, D. R. (2015). Types of evaluations in instructional design. Retrieved June 1, 2016, from http://www.nwlink.com/~donclark/hrd/isd/types_of_evaluations.html

Lauby, S. (2014, May). Six helpful tips on how to conduct a performance review meeting. Retrieved June 4, 2016, from http://www.halogensoftware.com/blog/six-helpful-tips-on-how-to-conduct-a-performance-review-meeting

Rantanen, P. (2013). The number of feedbacks needed for reliable evaluation. A multilevel analysis of the reliability, stability and generalisability of students' evaluation of teaching. Assessment & Evaluation in Higher Education, 38(2), 224-239. doi:10.1080/02602938.2011.625471

Richardson, R., Kalvaitis, D., & Delparte, D. (2014). Using systematic feedback and reflection to improve adventure education teaching skills. Journal of Experiential Education, 37(2), 187-206. doi:10.1177/1053825913503116

Theofanos, M., & Quesenbery, W. (2005). Towards the design of effective formative test reports. Journal of Usability Studies, 1(1), 28-46. Retrieved from http://www.upassoc.org/upa_publications/jus/2005_november/formative.html

Zint, M. (2015). Evaluation: What is it and why do it? Retrieved June 4, 2016, from http://meera.snre.umich.edu/evaluation-what-it-and-why-do-it