Affective User Interface Design that Endows Mobile Phones with Emotional Expressions
Chia-Yu Hsu*, John Kar-kin Zao*, Ming-Chuen Chuang**
* Department of Computer Science, National Chiao Tung University, [email protected], [email protected]
** Institute of Applied Arts, National Chiao Tung University, [email protected]
Abstract: This paper proposes an affective interface for mobile phones that makes users feel as if they are interacting with an individual who has a personality. Among all electronic devices, the mobile phone is the one we now rely on and interact with the most. However, the device's humanity and its emotional engagement with users have yet to be explored. By designing a set of emotion-featured elements for the interface, users can intuitively understand and therefore better respond to the device's information and status. From an analysis of in-depth interviews, we derived an affective emotional interaction model and its design principles, which offer stronger emotional engagement in interaction. Based on these principles, we explored design possibilities for the interface and tested them. The study aims to establish an affective emotional interaction and its design criteria, so that mobile phones gain a degree of personality that leads to users' stronger comprehension, deeper emotional attachment to the phone, and better behavioral engagement in interaction.
Key words: human-machine interactions, affective interfaces, affective design, interaction, affective expressions, mobile phone, users' emotion
1. Introduction
People nowadays rely more on their mobile phones, and their interactions with them have become more intimate and crucial. In particular, smart phones have brought more innovative and human-centered technology along with their development. As machines become more intelligent, new notions have started to emerge, such as Norman's Emotional Design [1] and Picard's Affective Computing [2]. These suggest that a machine should be able to express emotion in order to fully exercise its intelligence and thus be effectively utilized.
If users have emotional attachments to their mobile phones, they may be more careful while using them and more willing to repair them when they break; product turnover will therefore slow down and product loyalty will rise [3]. As a result, it is crucial for designers to strengthen consumers' affection for their products [4]. Besides, when a technological product such as a computer becomes a member of our social lives, it may help us acknowledge technology and construct interactions that encourage us to learn [5].
The research assumed that applying emotional interaction features to mobile phones can enhance users' willingness and frequency of use. Following Jordan's idea of the "living object" [6] and Norman's observation of people's lack of emotional attachment to their cell phones, we aimed to propose an affective interaction for mobile phones. Through in-depth interviews, we analyzed the key factors of how humans become emotionally attached to living or non-living objects and which of these factors can be implemented in mobile phones. Based on these factors, we identified two aspects: emotion and behavior. We then constructed a new interaction interface for mobile phones consisting of an Emotion Model and a Behavior Architecture. Finally, we tested the prototype with users and evaluated the comprehensibility and consistency of the new interface.
2. User Study
The user study has two phases. The first phase, an in-depth interview with mobile phone users, aims to understand what factors make people interact emotionally with their daily objects. In the second phase, we used the KJ method on the interview data to derive principles that can be applied to mobile phones. Because such affective products are still rare and hard to imagine, the daily objects also included the interviewees' pets.
A series of one-on-one, in-depth interviews was conducted. The interview guidelines, in brief, were as follows:
1. What is the role of the interviewee's mobile phone in his/her daily life, and how much emotion does he/she get out of the phone?
2. What is the object in the interviewee's life that he/she affectively interacts with the most, be it living or non-living?
3. We asked for more details of the interaction with the object, including the reason for having the object, the types of interactions, the emotional changes over the course of interactions, the scenarios of the interactions, the behaviors during the interactions, etc.
4. We asked interviewees to imagine a new mobile phone that can perform all the mentioned interactions, and asked for their initial thoughts and opinions about such a product.
5. We asked the interviewees' opinions on the balance between having affective interactions with phones and keeping the usability or functionality of the phones.
2.1 Participants
We recruited seven participants (three female) between 20 and 50 years old. All had previous experience with mobile phones. Their occupations were diverse, and their phones ranged from regular phones to smart phones. All participants had affective attachments to living or non-living objects (e.g., pets, cars, dolls) in their daily lives.
2.2 In-depth Interview
The interviews were semi-structured. Working from verbatim transcripts, we analyzed the emotional interactions between people and both living and non-living objects. From the results, we found that non-living objects can arouse users' emotional projection, just as living things do. However, since mobile phones are functional products in daily life, functionality and usability are the most crucial factors that make users love the devices; emotional connections are rare in this relationship. As a result, most of the interviewees claimed that they could hardly imagine their phones becoming living objects into which they might put more emotion. In addition, negative or complicated emotional behaviors may provoke users' distaste, while adding artificial intelligence may provoke users' fear of use.
2.3 KJ Method
Although the interviewees responded rather rationally, claiming they had no emotional connection with their phones, we did find some emotional interaction in their narratives. We extracted the larger intentions behind some subconscious behaviors and used the KJ method as a tool to sort out the criteria and key points for affective interaction between people and their mobile phones.
We used the KJ method to categorize users' needs from the interviews and organized the essential factors through which people emotionally interact with a target. These include: the target's active behavior; a feeling of achievement from taking care of the target; a feeling of mutual ownership; a feeling of being relied on; a feeling of being accompanied; affection accumulated over time; intimacy and friendliness; eagerness for simple interaction; the target's instant emotional feedback; the pleasure of teasing or playing with the target; learning because of the target; the owner's mental projection onto the target; specific behaviors developed with the target; and personification of the target.
After synthesizing the information into a few key points with the KJ method, we sorted the points to identify design possibilities (Figure 1). The y-axis represents design controllability: the higher a point sits, the easier it is to control by design; the lower, the more it depends on an individual's own development, though some of these can still be stimulated by means such as imitating life or personality. The x-axis represents time: points toward the Instance end denote short-term or immediate reactions in interaction, while the Constancy end denotes long-lasting characteristics of the interaction.
Figure 1. Analysis from the KJ method
3. Affective User Interface Design
Based on the conclusions of the interviews, we started to design the affective interface for mobile phones. We tried to fully interpret the key points with higher design controllability and extend their potential in order to cover those that cannot be achieved directly through design.
3.1 Design Feature
From the user study, we found that all the key factors can be categorized as either emotion-wise or behavior-wise. As a result, we proposed a system featuring a Behavior Architecture that generates the phone's contextual behaviors and an Emotion Model that gives the phone a sophisticated personality.
While considering the interaction between user and phone, we included the basic functional behaviors that people are already used to, and also took into account the new behaviors that may occur in an emotional interaction scenario. Based on the results of the KJ method, we tried to define the simplest behaviors that still generate most of the emotional connection.
We then derived the interaction behavior architecture between user and mobile phone. The architecture comprises levels from actor, functional/emotional, and action/reaction, to purpose. Behaviors can be seen as a complete set that responds to the architecture and can be altered according to different scenarios (Figure 2).
The semantic expressions users make to the phone include functional commands such as "Yes", "No", "Previous/Select Left", and "Next/Select Right"; contextual feedback, including "encouraging/discouraging" the phone's suggestions; and emotional interactions, including "Comfort", "Pat", "Tickle", and "Ignore". The phone needs to express intellectual responses, including "Agree/Disagree", "Suggest Previous/Next", and "Notify"; active expressions such as "Emotions" and "Playful (get attention)"; and emotional responses such as "Like/Dislike", "Comforted", and "Dodge".
We treat the whole screen as a poseable body and designed the expressions according to instinctive human gestures. We also mirrored the "user to phone" gestures in the "phone to user" set to make the two sets similar. This helps users easily associate expressions with the corresponding gestures, and vice versa.
Figure 2. Behavior Architecture
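The mirrored pairing of user expressions and phone responses can be sketched as a simple lookup. This is only an illustrative reconstruction: the pairings for the contextual and emotional rows are our assumptions, since the paper names the two vocabularies but not every one-to-one mapping.

```python
# Hypothetical sketch of the mirrored behavior mapping. The identifiers
# are illustrative; the paper does not specify an implementation.

BEHAVIOR_MAP = {
    # functional commands -> intellectual responses (from the paper)
    "yes":      "agree",
    "no":       "disagree",
    "previous": "suggest_previous",
    "next":     "suggest_next",
    # emotional interactions -> emotional/active responses (assumed pairs)
    "comfort": "comforted",
    "pat":     "like",
    "tickle":  "playful",
    "ignore":  "dodge",
}

def phone_response(user_expression: str) -> str:
    """Return the mirrored phone behavior for a user expression."""
    # Unrecognized input falls back to a plain notification.
    return BEHAVIOR_MAP.get(user_expression, "notify")
```

Keeping the two vocabularies in one table makes the bi-directional coherence principle of Section 3.2 easy to audit: each user expression sits next to the phone behavior it mirrors.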
We use a dimensional model as the emotion model for our prototype; it comprises two dimensions, Arousal and Valence. Different emotion states are represented as coordinates in the model. We categorize the four basic emotions in this model as joyful, angry, sad, and relaxed (Figure 3). During interaction, different actions from the user trigger movements of the coordinates along both Arousal and Valence, which changes the emotion status. As the emotion status changes, the corresponding behaviors may change as well. The research also proposes corresponding expressions for both the behaviors and the emotions derived from this behavior architecture.
Figure 3. Emotion Model
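The Arousal-Valence model above can be sketched as a small state holder whose coordinate moves in response to user actions. The update deltas and the quadrant boundaries below are illustrative assumptions; the paper defines the dimensions and the four quadrant emotions but not the numerical update rules.

```python
from dataclasses import dataclass

@dataclass
class EmotionModel:
    """Two-dimensional emotion state: (arousal, valence) in [-1, 1]."""
    arousal: float = 0.0   # -1.0 (calm)     .. +1.0 (excited)
    valence: float = 0.0   # -1.0 (negative) .. +1.0 (positive)

    # How a user action moves the coordinate (assumed values).
    ACTION_DELTAS = {
        "comfort": (-0.2, +0.3),
        "pat":     (+0.1, +0.2),
        "tickle":  (+0.3, +0.2),
        "ignore":  (-0.1, -0.3),
    }

    def apply(self, action: str) -> None:
        """Shift the coordinate for one user action, clamped to [-1, 1]."""
        da, dv = self.ACTION_DELTAS.get(action, (0.0, 0.0))
        self.arousal = max(-1.0, min(1.0, self.arousal + da))
        self.valence = max(-1.0, min(1.0, self.valence + dv))

    def basic_emotion(self) -> str:
        """Map the coordinate onto the four quadrant emotions of Figure 3."""
        if self.valence >= 0:
            return "joyful" if self.arousal >= 0 else "relaxed"
        return "angry" if self.arousal >= 0 else "sad"
```

Because behaviors are selected from the current emotion status, a dispatcher only needs to query `basic_emotion()` after each `apply()` to pick the matching expression set.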
3.2 Design Principle
The idea of the Affective User Interface is to design features that lead users to associate the interface with daily communication, such as facial expressions, language, biological features, and body language, in order to comprehend its meaning.
Starting from present mobile phone technology, the interface can be defined as a composition of several Elements, i.e., video, audio, motion, etc., and each Element comprises both behavioral and emotional attributes (Figure 4). More sophisticated and subtle semantics can be expressed by manipulating the sequence and combination of the Elements.
The design of the affective interface in this research is based on the following principles: (1) Use sensuous effects to imply a certain emotion or to emphasize the semantics, e.g., higher color saturation and a warm hue imply a more positive emotional status, and longer vibration and sound imply higher excitement. (2) Manipulate life-like features to help users understand emotion changes, e.g., simulating heartbeats, breathing, etc. (3) Imitate body language to help users understand specific semantics, e.g., treating the area of the screen as a movable body, so that distortions of the screen convey a certain body language.
Following these design principles, we developed the structure of Elements in relation to Emotions and Behaviors. On the other hand, the way the user communicates with the mobile phone also needs to be designed to correspond to the device's language in a coherent and intuitive manner. We adopted the following principles while designing the user's gestures: (1) Coherence of bi-directional communication: similar meanings expressed by both device and user, such as the user's "Yes" and the phone's "Recommend", should be consistent. This helps users intuitively adopt the language without confusion. (2) Intuitive gestures: drawing on the daily gestures humans make toward others or their pets, such as hitting to suppress or bully, or caressing to comfort and show affection.
Figure 4. Compositions of Elements
4. Design Result
From the earlier study, we concluded that both behavior and emotion are needed to make an object more affective. After exploring a suitable emotion model and emotion architecture for the mobile phone, we further developed the rules of the emotion model and the visualization of the behaviors into one prototype.
4.1 Emotion Interface Design
The interface design in this research takes the entire screen as the designable area, and all visual effects apply to the whole screen. To express one of the two emotional dimensions, Arousal, the design manipulates the frequency of the screen's pulsing to evoke "breathing" or a "heartbeat", giving users a sense of the arousal degree. As for Valence, we used different degrees of screen saturation: the higher the saturation, the higher the Valence. For better feasibility in the prototype implementation, we set 5 degrees of both Arousal and Valence, which creates a matrix of 25 different emotions. Corresponding to the matrix, we also set 5 degrees of pulsing frequency and of screen color saturation (Figure 5).
Figure 5. Emotion Model and Attributes
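The 5 x 5 discretization can be sketched as two lookup tables: the Arousal level selects the screen's pulsing ("sparkling") frequency and the Valence level selects the color saturation. The concrete frequency and saturation values below are illustrative assumptions, not taken from the prototype.

```python
AROUSAL_LEVELS = 5
VALENCE_LEVELS = 5

# Five pulsing frequencies in Hz (higher arousal -> faster "heartbeat");
# values are assumptions for the sketch.
PULSE_HZ = [0.25, 0.5, 1.0, 2.0, 4.0]
# Five saturation levels in percent (higher valence -> more saturated).
SATURATION_PCT = [20, 40, 60, 80, 100]

def display_attributes(arousal_level: int, valence_level: int) -> dict:
    """Return screen attributes for one of the 25 emotion states.

    Levels are 1-based, matching the 5-degree scales of the prototype.
    """
    assert 1 <= arousal_level <= AROUSAL_LEVELS
    assert 1 <= valence_level <= VALENCE_LEVELS
    return {
        "pulse_hz": PULSE_HZ[arousal_level - 1],
        "saturation_pct": SATURATION_PCT[valence_level - 1],
    }
```

For example, a highly aroused but negative state (`display_attributes(5, 1)`) yields the fastest pulse with the dullest colors, while `display_attributes(1, 5)` gives a slow, richly saturated screen.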
To make the emotional expression more convincing, the visual effects on the screen persist across the phone's different working modes, i.e., the desktop page, in-application, locked mode, etc., and vary with time, as human emotion does. However, to avoid disturbing the user, the pulsing effect, which expresses the Arousal dimension, should be deactivated while the user is actively using the device.
From an initial test of these emotional features, we discovered that simply manipulating the pulsing frequency and color saturation did not make the display legible enough to distinguish the different emotions. We therefore introduced different hues overlaying the entire screen, based on humankind's universal impressions of colors, to emphasize the different emotional statuses: for example, a dark red mask implies anger, and rainbow colors on the screen imply happiness. Table 1 shows an example of a design applying the proposed emotion model.
Table 1. Emotional Expression
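The hue overlay introduced above can be sketched as one more lookup keyed by the basic emotion. Only the anger (dark red) and happiness (rainbow) overlays are named in the text; the remaining entries are assumptions added to complete the sketch.

```python
# Quadrant-to-hue overlay mapping; entries marked "assumption" are not
# from the paper.
HUE_OVERLAY = {
    "joyful":  "rainbow",           # from the paper
    "angry":   "dark_red",          # from the paper
    "sad":     "desaturated_blue",  # assumption
    "relaxed": "soft_green",        # assumption
    "neutral": None,                # no overlay
}

def overlay_for(emotion: str):
    """Return the full-screen hue overlay for a basic emotion, if any."""
    return HUE_OVERLAY.get(emotion)
```

Because the overlay rides on top of the pulsing and saturation channels, the three attributes together determine the final look of each of the 25 emotion states.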
4.2 Behavior Interface Design
In the animated movie Aladdin (1992), the animators turned the carpet into a human-like, pet-like character by distorting its shape, and it was successfully affective. Norman emphasized that people tend to project their own emotions and beliefs onto anything [1]; consumers would therefore gain more fun and pleasure from a personified product. Other research has also indicated that a life-like or "living" object is one of the types that create a meaningful user-product relationship [7], and that it is also a factor influencing users' pleasure and preference. As a result, we tried to personify the phone while designing the interface.
While designing the behaviors, we found it natural to associate the upper corners of the screen with the "hands" of the device, which can "wave" to express "notifying"; the upper half of the screen can be seen as the "head" or "upper body" of the device, which can swing directionally, nod, or shake its head; and the middle part of the screen can be thought of as the "body" or "belly" of the device, which can imitate the pose some pets strike to invite caressing, or the dodging movement made when being punished (Table 2).
Table 2. Semantic Behaviors
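The screen-as-body convention above can be captured in a small region table. The region keys and behavior identifiers are illustrative names for this sketch, not from the paper.

```python
# Screen regions mapped to the body parts they play and the semantic
# behaviors each region can perform (identifiers are illustrative).
SCREEN_BODY_MAP = {
    "upper_corners": {"body_part": "hands",
                      "behaviors": ["wave"]},  # waving = notifying
    "upper_half":    {"body_part": "head / upper body",
                      "behaviors": ["swing", "nod", "shake_head"]},
    "middle":        {"body_part": "body / belly",
                      "behaviors": ["invite_caress", "dodge"]},
}

def regions_for(behavior: str):
    """Return the screen regions able to perform a given behavior."""
    return [region for region, spec in SCREEN_BODY_MAP.items()
            if behavior in spec["behaviors"]]
```

Such a table keeps each animation anchored to a consistent body metaphor, so that, for instance, "dodge" is always played by the middle of the screen.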
4.3 User Gesture Design
The design of the control method focuses on gesture control and multi-touch instead of other means on the market, for the following reasons: (1) Advanced, mature built-in motion sensors such as accelerometers have been widely adopted in mobile phones to achieve intuitive gesture control. (2) Touch screens have become standard on mobile phones, so a physical keypad is no longer considered necessary; also, since users may not be accustomed to speaking to a machine and voicing may be inconvenient in many situations, voice control is not included in our design, though it could be further developed as an alternative. As a result, we do not include a keypad or voice control in our design considerations. (3) By designing the gesture controls, we can create better consistency between the user's behavior and the movement the device expresses when the meanings are similar. However, the gesture and multi-touch controls designed for the prototype should be understood not as replacements for, but as alternatives to, conventional means; users can thus choose between existing means for specific functions and the corresponding, intuitive gestural language (Table 3).
Table 3. User Gesture
5. Evaluation
To evaluate the application of this interface design, we experimented with the prototype and tested its usability.
5.1 Participants
There were ten participants in this experiment, aged between 20 and 50, all with previous experience of mobile phones. Considering that gender may affect demand and preference for high-tech products, we recruited five males and five females. Considering that smart phone users and regular phone users may differ in their acceptance of the new design, both groups were included in the test, with 5 participants in each. Table 4 shows the participants' information.
Table 4. Participant Information
5.2 Experimental Procedure
In this experiment, we tested how users comprehend the interface and gesture designs. We asked participants to watch video clips of the mobile phone's behavioral and emotional expressions, and then to match the clips with different meanings according to their comprehension. To evaluate the results, we used a five-point Likert scale to see how well users understood and agreed with the expressions. For the gesture test, we demonstrated each gesture a user can make to the mobile phone and how the phone interprets the command. We then used a five-point Likert scale to find out whether each gesture was intuitive and understandable.
5.3 Results
Further discussion based on the results follows.
(1) Confusion matrix: Table 5 is the confusion matrix of how users comprehended the mobile phone's expressions.
Table 5. Confusion Matrix
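The bookkeeping behind such a confusion matrix is simply counting (shown, chosen) pairs across trials. The sketch below is a hypothetical reconstruction; the partial trial data illustrates the two behavioral confusions reported in the text.

```python
from collections import Counter

def confusion_matrix(trials):
    """Count (shown, chosen) pairs from a list of trial tuples."""
    return Counter(trials)

# Partial illustration using the reported results for one clip: of ten
# participants, one confused "Dislike/Frown" with "Beaten/Dodge".
trials = (
    [("Dislike/Frown", "Beaten/Dodge")]         # the one confusion
    + [("Dislike/Frown", "Dislike/Frown")] * 9  # the nine correct matches
    + [("Playful/Naughty", "Beaten/Dodge")]     # the other reported confusion
)
matrix = confusion_matrix(trials)
```

Diagonal entries (shown == chosen) measure successful comprehension; off-diagonal entries pinpoint which expression pairs need redesign.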
From this matrix, we can tell that most participants understood the expressions we designed for the mobile phone. Among the ten participants, only one confused "Dislike/Frown" with "Beaten/Dodge" and one confused "Playful/Naughty" with "Beaten/Dodge". As for the emotional expressions, two participants confused "Neutral" with "Relaxed". Apart from these five items, all expressions were successfully comprehended.
(2) Evaluation of the expressions: participants rated the degree of engagement for each item from 1 to 5, and we then calculated the average and standard deviation for each item. The results are shown in Table 6.
Table 6. Evaluation of emotional attachment of mobile phone’s expressions
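The per-item statistics used in Tables 6 and 7 reduce each clip's ten 1-to-5 ratings to an average and a standard deviation. A minimal sketch, with made-up ratings for illustration:

```python
from statistics import mean, stdev

# Hypothetical ratings from the ten participants (values invented
# for illustration; see Table 6 for the actual results).
ratings = {
    "clip_14_joyful":  [5, 5, 4, 5, 4, 5, 5, 4, 5, 5],
    "clip_12_neutral": [2, 4, 3, 5, 2, 3, 4, 2, 5, 3],
}

# Per-item (average, sample standard deviation), rounded to 2 decimals.
summary = {clip: (round(mean(r), 2), round(stdev(r), 2))
           for clip, r in ratings.items()}
```

A high average with a low deviation (as for the invented "joyful" ratings) indicates a widely shared reading of the expression, while a low average with a high deviation flags a clip whose interpretation varies across participants.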
Among the semantic behavior expressions, Table 6 shows that Clip #2 (Comforted/Satisfied), #4 (Dislike/Frown), #5 (Playful/Naughty), and #7 (Beaten/Dodge) score relatively low, between 3 (Fair) and 4 (Comprehensible), compared with the others, and their deviations are also greater. The design of these behaviors has room for improvement, and some of them may need to be customizable to fit users' preferences and habits. Apart from those with lower scores, the rest were well comprehended and can be seen as effective means of semantic expression.
As for the emotional expressions, Clip #14 (Joyful) and #15 (Sad) ranked highest in comprehensibility with the lowest deviations, so these two proved universal and successful. Clip #13 (Angry) ranked slightly lower, yet was still comprehensible with a low deviation. Clip #12 (Neutral) ranked low with the largest deviation and, according to the interviewees, was easily confused with Clip #16 (Relaxed). This is perhaps because the different degrees of color saturation are too subtle to distinguish. Therefore, greater differences between saturation levels, color overlays, or more contrastive pulsing rates may be added to the visuals for better legibility.
(3) Evaluation of behavioral engagement of user gestures: participants rated the degree of intuitiveness for each item from 1 to 5, and we then calculated the average and standard deviation for each item. The results are shown in Table 7.
Table 7. Behavioral engagement of user gestures
Among the user gesture designs, Gesture #2 (No/Cancel) has the lowest deviation, making it the least controversial design. Gestures #3 and #4 (Left, Right) are also less controversial, yet with an average of 3.8, their design can still be improved. Gestures #5 (Comfort/Caress) and #6 (Pat/Bully) have high intuitiveness but also high deviations; this may relate to interviewees' behavioral habits, so they could be customized for better usability. Gestures #1 (Yes/Okay) and #7 (Tease/Tickle) are the most controversial; even though their average intuitiveness is still above 3 (Fair), the design of these two gestures may also need adjustment. The key idea in designing the gestures is to provide consistency between the gestures and the device's behaviors. For instance, the movement with which the phone expresses "Left, Right" should be similar to the user's "Left, Right" gesture in order to cause the least confusion. Another important idea is that these gestures are designed not to replace the existing means of operation but to provide intuitive alternatives for better interaction. Also, the gestures alone cannot cover the entire complexity of users' commands. Thus, even though some participants still prefer the traditional ways of operation, this does not degrade the value of the gestures as optional alternatives.
6. Discussion
Most of the expressions designed were well comprehended with little controversy, except for four expressions: "Comforted/Satisfied", "Dislike/Frown", "Playful/Naughty", and "Beaten/Dodge". This likely results from participants' own understanding of certain body language or from differing cultural contexts. These controversial expressions can be made customizable, allowing users to set their own preferences so that the expressions match their own understanding.
As for the emotional expressions, the designs for "Joyful", "Sad", and "Angry" were comprehensible, while the designs for "Neutral" and "Relaxed" were relatively confusing. Unlike the other three emotions, which overlay a certain color or grayscale on the entire screen, the expressions of "Neutral" and "Relaxed" used merely pulsing and color saturation, which is probably the reason for their lower legibility. However, the average scores of the two were still above "Fair", meaning they were still understandable. Using greater contrasts in pulsing rate and saturation may help with this issue, or other types of visual effects could be developed for these two emotions.
7. Conclusion
Through the user interviews, we derived the following design principles for the affective interface: (1) The design of expressions, whether emotional or behavioral, should not decrease the usability or functionality of the mobile phone. (2) Consistency and comprehensibility should be carefully examined when designing either emotional expressions or semantic behaviors, so that the meaning is delivered correctly to a general audience. (3) The emotional expressions and behaviors should extend across the phone's different working states in order to fully evoke users' emotional projection, i.e., the affective expressions should persist across the locked mode, in-application mode, initial desktop, etc. (4) The dimensional system of Arousal and Valence adopted as the emotion model in this design prototype can be further developed in terms of its internal changing logic, and the design of expressions should explicitly convey the corresponding emotions.
Based on these design principles, we designed the interface prototype and tested it with mobile phone users. The evaluation tests demonstrated the applicability of the principles and of the interaction model developed from them. The evaluation shows that the affective interface we designed can provide users with higher comprehensibility of, and emotional attachment to, the mobile phone's expressions, as well as better behavioral engagement with the user gestures. We expect the design to improve the emotional connection between users and their phones, thereby bringing more joy to the interaction.
The interface elements and features constructed in this research can be further developed in terms of their ordering and composition to convey more sophisticated and precise meanings. However, some behavior designs should take individual differences into account and may allow customization to achieve better comprehension. To sum up, the design prototype and the design principles concluded in this research can serve as valuable references for developing an affective interface for mobile phones.
8. References
[1] Norman, D. A. (2004) Emotional Design: Why We Love (or Hate) Everyday Things, New York: Basic Books.
[2] Picard, R. W. (1997) Affective Computing, The MIT Press.
[3] Mugge, R., Schifferstein, H. N. J. and Schoormans, J. P. L. (2006) A Longitudinal Study on Product Attachment and Its Determinants, European Advances in Consumer Research, Vol. 7, Duluth, MN: Association for Consumer Research.
[4] Van Hinte, E. (1997) Eternally Yours: Visions on Product Endurance, Rotterdam: 010 Publishers.
[5] Marakas, G., Johnson, R. and Palmer, J.W. (2000) A Theoretical Model of Differential Social Attributions Toward Computing Technology: When the Metaphor Becomes the Model, International Journal of Human-Computer Studies.
[6] Jordan, P. W. (2000) Designing Pleasurable Products: An Introduction to the New Human Factors, London: Taylor & Francis.
[7] Battarbee, K. and Mattelmäki, T. (2002) Meaningful Product Relationships, Proceedings of the Design and Emotion Conference.