Lecture 12: Evaluation and Monitoring of Collaborative Virtual Environments/Groupware
Dr. Xiangyu WANG


Page 1:

Lecture 12: Evaluation and Monitoring of Collaborative Virtual Environments/Groupware

Dr. Xiangyu WANG

Page 2:

Groupware Usability

• Gutwin and Greenberg (2000) define groupware usability as "... the degree to which a groupware system supports the mechanics of collaboration for a particular set of tasks."

Page 3:

Facts

• Groupware/CVEs are traditionally considered difficult to evaluate because of the effects of multiple people and the social and organizational context.

• Researchers and developers have employed a range of techniques including scientific, engineering, and social science methodologies.

Page 4:

• Pinelle and Gutwin (2000) reviewed 45 papers from the ACM Computer Supported Cooperative Work (CSCW) conference (1990-1998):
– Almost one-third of the groupware systems were not evaluated in any formal way.
– Only about one-quarter of the articles included evaluations in a real-world setting.
– A wide variety of evaluation techniques are in use.

• Their main conclusions are:
– More attention must be paid to evaluating groupware systems.
– There is room for additional evaluation techniques that are simple and low in cost.

Page 5:

Classifying groupware/CVEs evaluations

• Type of evaluation
• Characteristics of the evaluation
• Data collection techniques
• Placement of the evaluation in the software development cycle
• Type of conclusions drawn from the evaluation

Page 6:

Classifying groupware/CVEs evaluations

• Type of evaluation
– The major differentiating characteristics between the strategies are the level of experimental manipulation and the evaluation setting.

Page 7:

Classifying groupware/CVEs evaluations

• Characteristics of the evaluation: Evaluations are further classified according to the rigor of the experimental manipulation and the type and rigor of measurements.
– Quantitative vs. qualitative
– Manipulation:
• Formal / rigorous
• Minimal manipulation
• No manipulation
– Measurement:
• Formal / rigorous vs. informal

Page 8:

Classifying groupware/CVEs evaluations

• Techniques for data collection:
– User observation
– Interview
– Discussion
– Questionnaire
– Qualitative work measures
– Quantitative work measures
– Collection of archival material

Page 9:

Classifying groupware/CVEs evaluations

• Placement of evaluation in the software lifecycle
– Grudin [14] stresses the importance of evaluation over a period of time following groupware implementation. He also argues that evaluations of partial prototypes in laboratory settings are not able to address complex social and organizational issues.

Page 10:

Classifying groupware/CVEs evaluations

• Six potential placements of the evaluation were considered:
– Periodic evaluations throughout the development process
– Continuous evaluation throughout development
– Evaluation of a prototype
– Evaluation of a finished piece of software
– Periodic evaluations after software implementation
– Continuous evaluation after software implementation

Page 11:

Classifying groupware/CVEs evaluations

• Focus of the evaluation: Types of evaluation focus include:
– Organizational impact / impact on work practices
– End product produced through using the software
– Efficiency of task performance using the software
– User satisfaction with the software
– Task support provided by the software
– Specific features of the groupware interface
– Patterns of system use
– User interaction while using the software

Page 12:

Evaluation Methods

– Heuristic Evaluation
– Controlled Experiments
– Survey Methods: Surveys & Questionnaires
– Ethnographic Methods
– Logging & Automated Metrics

Page 13:

Heuristic Evaluation (HE)

• Heuristic Evaluation (HE) is a well-accepted, widely used discount evaluation technique for single-user systems.

• HE involves a group of usability evaluators inspecting the system to identify usability issues. The issues are identified with respect to a set of usability guidelines, or heuristics.

• HE is popular because it is cheap, doesn't necessarily require representatives from the user community, and is effective at finding usability problems.

Page 14:

Heuristic Evaluation (HE)

• How to perform a Heuristic Evaluation study
– Orientation: The purpose of the orientation session is to educate and train the evaluators.
– Inspection: During the evaluation process, the evaluators are asked to use and explore the interface. While using the interface, they are asked to identify usability problems that violate the heuristics.
– Debriefing:
• The purpose of the debriefing session is to analyze the usability problems that have been identified.
• Debriefing should include all evaluators, usability team members, and observers.
• The evaluators should go through the problems they have found and classify them first by which heuristic they violate, and second by the severity of the problem. Problem severity is gauged by taking into account the frequency, impact, and persistence of the problem (a small bookkeeping sketch follows).
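The lecture does not prescribe a formula for combining frequency, impact, and persistence into a severity rating. As one illustration only, here is a minimal Python sketch of the debriefing bookkeeping, assuming each factor is rated on a 0-4 scale and that a simple sum is an acceptable severity proxy; the class and field names are hypothetical.

# Minimal sketch of Heuristic Evaluation debriefing bookkeeping.
# Assumption: each factor is rated 0-4 and severity is their simple sum;
# the lecture does not prescribe a particular scoring model.
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    description: str
    heuristic: str    # which groupware heuristic is violated
    frequency: int    # 0 (rare) .. 4 (happens constantly)
    impact: int       # 0 (cosmetic) .. 4 (blocks the task)
    persistence: int  # 0 (one-off) .. 4 (recurs every session)

    @property
    def severity(self) -> int:
        return self.frequency + self.impact + self.persistence

def classify(problems):
    """Group problems by violated heuristic, most severe first within each group."""
    by_heuristic = {}
    for p in sorted(problems, key=lambda p: p.severity, reverse=True):
        by_heuristic.setdefault(p.heuristic, []).append(p)
    return by_heuristic

# Example: one problem found while inspecting a shared whiteboard.
problems = [UsabilityProblem("Telepointer has a fixed shape",
                             "gestural communication", 3, 2, 4)]
for heuristic, found in classify(problems).items():
    for p in found:
        print(f"[severity {p.severity}] {heuristic}: {p.description}")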

Page 15:

Heuristic Evaluation (HE)

• Below is a list of the groupware heuristics, with a brief description of the behaviour each heuristic relates to in a face-to-face setting.
– Provide the means for intentional and appropriate verbal communication: The dominant form of communication in groups is verbal conversation. Conversations are used to establish a common understanding of the tasks and activities that the group is participating in.
– Provide the means for intentional and appropriate gestural communication
– Provide consequential communication of an individual's embodiment
– Provide consequential communication of shared artifacts
– Provide protection
– Management of tightly and loosely coupled collaboration
– Allow people to coordinate their actions
– Facilitate finding collaborators and establishing contact

Page 16:

Heuristic Evaluation Results

• Provide the means for intentional and appropriate verbal communication:
– A chat window and the audio channel.
• Provide the means for intentional and appropriate gestural communication:
– Illustrative and emblem gestures are supported by the video stream, although making these gestures through the video comes at the cost of moving attention away from the workspace area.
– The pointing-hand telepointer icon that can be seen on the whiteboard. The telepointer is a poor expression tool, though: it has a fixed shape and there is only one of them per user. It falls far short of the expressiveness of a pair of human hands.

• …….

Page 17:

Heuristic Evaluation Source

• Heuristic Evaluation of Groupware by Gregor McEwan

• URL: http://pages.cpsc.ucalgary.ca/~mcewan/HEG.html

Page 18:

Survey Methods

• Observations

• Interviews (visiting, telephone, indirect)

• Questionnaire

• Diary

• …..

Page 19:

Questionnaires

• Before conducting a study: practicalities
– Time
– Cost
– Range
– Questions (type, wording, order)
– Demands reflection
– Instructions
– Pictures, anonymous cards
– Language, knowledge

Page 20:

Checklist for constructing a questionnaire

• Be concrete

• Think about effects due to the order of posing your questions (context)

• Ambiguity

• Assumptions

• Be careful with yes/no questions

Page 21:

Question type

• Open questions

• List

• Category

• Ranking

• Scale

• Quantity

• Table
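If these question types were encoded in a questionnaire tool, one possible representation is sketched below. This is purely illustrative: the class names, fields, and the 5-point default are assumptions, not part of the lecture material.

# Hypothetical encoding of common question types for a questionnaire tool.
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str

@dataclass
class OpenQuestion(Question):
    pass  # free-text answer

@dataclass
class CategoryQuestion(Question):
    options: list = field(default_factory=list)  # respondent picks exactly one

@dataclass
class RankingQuestion(Question):
    items: list = field(default_factory=list)    # respondent orders these

@dataclass
class ScaleQuestion(Question):
    points: int = 5                      # e.g. a 5-point agreement scale
    low_label: str = "strongly disagree"
    high_label: str = "strongly agree"

q = ScaleQuestion("The system supported our collaboration well.")
print(q.points, q.low_label, q.high_label)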

Page 22:

Appearance and layout

• Word processed

• Clear instructions

• Spacing between questions

• Keep response boxes in line (left/right)

• Guide the respondent the right way

• Promise anonymity and confidentiality

Page 23:

Administering the questionnaire

• Electronic, snail mail or face-to-face

• Self-addressed envelope

• Instruction, information letter

• Anonymous, confidential

Page 24:

Piloting the questionnaire

• How long did it take you to complete?
• Were the instructions clear?
• Were any of the questions unclear or ambiguous? If so, will you say which, and why?
• Did you object to answering any of the questions?
• In your opinion, has any major topic been omitted?
• Was the layout of the questionnaire clear/attractive?
• Any comments?

Page 25:

Ethnographic Methods

• Ethnography has been adapted from sociology and anthropology, where it is a method of observing human interactions in social settings and activities.

Page 26:

Ethnographic Methods

• There are several reasons why ethnography is of vital importance to good interface design, including these:
– An ethnographic study is a powerful assessment of users' needs.
– It uncovers the true nature of the user's job: A goal of an ethnographic study is to uncover all the tasks and relationships that combine to form a user's job.
– The open-ended and unbiased nature of ethnography allows for discovery: The unassuming nature of ethnography can often yield unexpected revelations about how a system is used.

• Drawbacks:
– Time requirements
– Presentation of results
– Scale

Page 27:

Logging and Automated Metrics

• Logging can be manual or automated.

• Automated logging involves having the computer collect statistics about the detailed use of a system. Typically, an interface log will contain statistics about:
– the frequency with which each user has used each feature in the program
– the frequency with which various events of interest have occurred

• Statistics showing the frequency of use of commands and other system features can be used to optimize frequently used features and to identify features that are rarely used or never used. In addition, an analysis of patterns of use can be made using the logging data.

• Statistics showing the frequency of various events, such as error situations and the use of online help, can be used to improve the usability of future releases of the system.
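As a concrete illustration of the kind of automated logging described above, here is a minimal Python sketch. The feature and event names are made up, and a real groupware system would persist the log rather than keep in-memory counters.

# Minimal sketch of automated interface logging: per-user feature counts
# plus counts of events of interest (errors, online-help use).
from collections import Counter
from datetime import datetime, timezone

class UsageLog:
    def __init__(self):
        self.feature_counts = Counter()  # (user, feature) -> number of uses
        self.event_counts = Counter()    # event name -> occurrences
        self.trace = []                  # timestamped entries for pattern analysis

    def log_feature(self, user, feature):
        self.feature_counts[(user, feature)] += 1
        self.trace.append((datetime.now(timezone.utc), user, feature))

    def log_event(self, event):
        self.event_counts[event] += 1

    def feature_totals(self):
        """Total use of each feature across all users."""
        totals = Counter()
        for (_, feature), n in self.feature_counts.items():
            totals[feature] += n
        return totals

    def rarely_used(self, threshold=1):
        """Candidates for redesign or removal in a future release."""
        return [f for f, n in self.feature_totals().items() if n <= threshold]

log = UsageLog()
log.log_feature("alice", "whiteboard.telepointer")
log.log_feature("bob", "chat.send")
log.log_event("error:audio_dropout")
log.log_event("help:opened")
print(log.feature_totals(), log.rarely_used())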

Page 28:

Next week: final design presentation

• This is a group presentation; make sure you all present a part of it. Each group will have 15 minutes. The format is free; PowerPoint slides would be best.
– Talk about the design concepts/brief and show screenshots of the final designs.
– Talk about the design collaboration process.

• Your presentation will help us understand whether you have gained knowledge of design collaboration and are aware of the issues related to distributed design collaboration. Thus, use this chance to show us that you understand design collaboration and what kind of features a collaborative virtual environment should have.