
KANSEI Engineering International Vol. 1 No.1 pp.59-66 (1999)

ORIGINAL ARTICLES

EMOTIONAL GENERATION MODEL FOR

AUTONOMOUS MOBILE ROBOT

Yoichiro MAEDA

Osaka Electro-Communication University, 18-8 Hatsu-cho, Neyagawa, Osaka 572-8530, Japan

Abstract: Several emotional models have already been proposed in recent engineering research. However, little research has been performed on transmitting emotions and feelings to a robot. Consequently, research on emotional expression lags behind other areas of robotics. The purpose of our research is to represent emotions, which are more difficult to express in a robot than logical knowledge, and to construct an emotional generation model for robot behavior. First, we developed an emotional simulator for an autonomous mobile robot that can express various emotions based on three basic emotional vectors, Joy, Anger and Sadness, described with simplified fuzzy rules. Next, we represented an emotional auto-generation model with the subsumption architecture (SSA). In this paper, we also report the results of experiments performed with a miniature robot to confirm the efficiency of this simulator and model. These experiments confirmed that various emotions can easily be generated with the emotional simulator. Singletons in the THEN part and the motor output functions that realize emotional behaviors were properly tuned by weighted addition of the three basic emotional vectors. We also confirmed that the three basic emotions, Joy, Anger and Sadness, are properly expressed with the emotional generation model.

Keywords: Emotional model, Subsumption architecture, Fuzzy rule, Robot behavior, Autonomous mobile robot

1. INTRODUCTION

Recently, research on intelligent robots applying soft computing methods has become popular. Furthermore, emotional processing and emotional models have also been proposed in current research. However, little research has addressed transmitting mental states, such as emotions and feelings, to a robot, because there are no methodologies for representing emotional states flexibly. Consequently, research on emotional expression lags behind other areas of robotics. Typical research on emotional expression is discussed in the next section.

In our laboratory, we have already reported experimental results on transmitting the low-level emotions of Braitenberg's models to a miniature robot using only fuzzy rules [1]. As a next step, we expressed three basic emotional vectors, Joy, Anger and Sadness, with fuzzy rules and developed an emotional simulator for an autonomous mobile robot that can generate other emotions by combining these vectors under the assumption that they are orthogonal [2][3]. In this paper, we also construct an emotional generation model for an autonomous mobile robot with an automatic emotion-changing mechanism based on the Subsumption Architecture (SSA) proposed by R. A. Brooks [4]. Furthermore, we report experimental results on the robot behaviors produced by these emotions.

2. TYPICAL RESEARCH ON EMOTIONAL EXPRESSION

V. Braitenberg explained in his book [5] that we can build a highly intelligent robot by the addition and combination of sensors and devices. It is very interesting that these models include some human-like emotions and concepts of soft computing, that is, genetic algorithms, neural networks, chaotic systems and fuzzy reasoning. Braitenberg's models, from lower level to higher, are as follows: 1) wandering, 2) fear and attack, 3) affection, 4) value and taste, 5) logic, 6) selection, 7) concept, 8) space, object and motion, 9) shape, 10) concept formation, 11) rule and regulation, 12) chain of thought, 13) foresight, and 14) ego and optimism.

On the other hand, research attempting to clarify emotion by means of engineering methodologies has become active with the growth of brain science. For example, T. Musha [6] tried to analyze human emotions objectively with the Emotional Spectrum Analyzing Method (ESAM). He defined an emotional matrix based on four emotional elements, anger, sadness, joy and relaxation, obtained from human brain signals. K. Yoshida et al. [7] defined physiological condition, silent reaction and speaking motion as the emotional memory and expressed emotional patterns in Cartesian coordinates. T. Gomi [8] performed experiments on the emulation of emotions using subsumption robots. T. Mochida [9] proposed a robot that can experience two states, pleasantness and unpleasantness. A frustration variable represents the state: low frustration is pleasant, while high frustration is the contrary. Simulation was conducted using a Braitenberg-style architecture expressed with a neural emotional model that alters the system's behavior as it becomes more or less frustrated.

Received August 1, 1999; Accepted October 19, 1999

3. BASIC EMOTIONAL VECTOR

In this research, we first define basic emotions and express the other emotions in terms of them, for a general representation of emotions in the robot. The basic emotions in this paper are defined as Joy (J), Anger (A) and Sadness (S) among basic human emotions. We define these independent orthogonal unit vectors as the basic emotional vectors. Figure 1 shows the concept of the basic emotional vectors. In this figure, the unit vectors of the X, Y and Z axes represent Joy (J), Anger (A) and Sadness (S), respectively. The other emotions are expressed by composing these three vectors on a spherical surface.

4. EMOTIONAL BEHAVIOR

The behavior model represented by fuzzy rules changes according to the values (the weights of J, A and S) of the levers in the control panel of the emotional simulator. We can also move the levers in real time while the simulator is running. Behavior models in the emotional simulator are represented with simplified fuzzy reasoning. The IF part of the fuzzy rules is shown in Figure 2. The membership functions of the IF part are fixed and are not influenced by the three basic emotional vectors.

Figure 1. Basic emotional vector.

The THEN part of the fuzzy rules is shown in Figure 3. Each steering angle (singleton value) and each value of the motor output functions changes according to the weight given by the levers of the basic emotional vectors.
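As an illustration of the simplified fuzzy reasoning used here, the following sketch computes a steering output as the weighted average of singleton consequents. The membership-function parameters and singleton values are hypothetical, since the paper's actual tables appear only in Figures 2 and 3.

```python
# Sketch of simplified fuzzy reasoning: fixed triangular membership
# functions in the IF part, singleton values in the THEN part, and a
# weighted-average defuzzification. The parameters below are
# illustrative, not the paper's actual rule tables.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# IF part: fuzzy sets over the light-source direction (degrees).
DIRECTION_SETS = {
    "LEFT":  lambda d: tri(d, -90.0, -45.0, 0.0),
    "FRONT": lambda d: tri(d, -45.0, 0.0, 45.0),
    "RIGHT": lambda d: tri(d, 0.0, 45.0, 90.0),
}

def steer(direction, singletons):
    """Simplified fuzzy reasoning: sum(w_i * s_i) / sum(w_i) over the
    singleton steering angles of the THEN part."""
    num = den = 0.0
    for label, mf in DIRECTION_SETS.items():
        w = mf(direction)
        num += w * singletons[label]
        den += w
    return num / den if den > 0.0 else 0.0

# Example singleton steering angles (THEN part) for one emotion model.
SINGLETONS = {"LEFT": 30.0, "FRONT": 0.0, "RIGHT": -30.0}
```

Because the consequents are singletons, defuzzification reduces to this weighted average, which is what lets the lever weights retune the THEN part in real time.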

Next, we explain the method of weighted addition. Let X, Y and Z be the lever values of the basic emotional vectors. The condition that the resulting emotional vector (x, y, z) lies on a spherical surface is given by the following equation:

x² + y² + z² = 1.    (1)

We obtain the following equation by normalization:

x_i = X / √(X² + Y² + Z²),  y_i = Y / √(X² + Y² + Z²),  z_i = Z / √(X² + Y² + Z²).    (2)

In this equation, if X = Y = Z = 0, then x_i = y_i = z_i = 0. For example, in Figure 3, the singleton value a of PSS in the THEN part is decided with the following equation:

a = p·x₁ + q·y₁ + r·z₁ = -3.7x₁ + 1.5y₁ + 0.6z₁    (3)

where p, q and r are the weight values of the basic emotional vectors in the table, and x₁, y₁ and z₁ are the degrees of emotions decided by the human operator with the levers of the control panel. The singleton values and motor output functions of the other emotions are decided in the same way.
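The normalization of equation (2) and the weighted addition of equation (3) can be sketched as follows, using the PSS example weights from equation (3); the function names are our own.

```python
import math

def normalize(X, Y, Z):
    """Project the lever values (X, Y, Z) onto the unit sphere of
    equation (1); the all-zero input maps to the origin, as in the text."""
    n = math.sqrt(X * X + Y * Y + Z * Z)
    if n == 0.0:
        return 0.0, 0.0, 0.0
    return X / n, Y / n, Z / n

def singleton_pss(X, Y, Z, p=-3.7, q=1.5, r=0.6):
    """Singleton value a = p*x1 + q*y1 + r*z1 (equation (3)) for the
    PSS consequent, using the example weights from the paper."""
    x1, y1, z1 = normalize(X, Y, Z)
    return p * x1 + q * y1 + r * z1
```

With pure Joy (X=1, Y=Z=0) this reduces to a = p, so each basic emotion recovers its own tuned singleton, and mixed lever settings interpolate between them on the sphere.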

(a) Direction (b) Distance

Figure 2. IF part in fuzzy rules.

Fuzzy rule of model J
Fuzzy rule of model A
Fuzzy rule of model S
(a) Fuzzy rule
(b) Steering angle
(c) Motor output function

Figure 3. THEN part in fuzzy rules.

Figure 4. Emotional generation mechanism.

However, the output values are sometimes discontinuous when the values of the basic emotional vectors (x_i, y_i, z_i) fall into one of the following cases: 1) one value is zero and the other two values are equal, 2) all three values are equal and nonzero, and 3) two values are zero. In these cases, a coefficient k is multiplied into equation (3) as in the following equation; in the other cases, the coefficient is set to k = 1.

(4)

5. EMOTIONAL GENERATION MODEL

In this research, we propose an emotional generation model for an autonomous mobile robot based on the Subsumption Architecture (SSA). As a mechanism for automatically deciding the basic emotional vectors, we constructed the emotional generation model shown in Figure 4. A basic emotional vector is obtained for each basic emotion: Joy, Anger and Sadness. The maximum value of the sensors is used as the input to the emotional generation mechanism, and we applied an integral mechanism that accumulates the sensor input values for the emotional expression. Furthermore, we used a non-sensitive (dead-zone) mechanism so that only emotions whose vector value exceeds a threshold are expressed.
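The chain described above (take the maximum sensor value, accumulate it, and pass only values above a threshold) can be sketched for one emotion channel as follows. The gain, decay and threshold constants are assumptions for illustration, not values from the paper.

```python
class EmotionGenerator:
    """Sketch of one channel of the emotional generation mechanism:
    the maximum sensor reading is integrated over time, and the output
    is suppressed (dead zone) until it exceeds a threshold.
    All constants are illustrative assumptions."""

    def __init__(self, gain=0.1, decay=0.95, threshold=0.3):
        self.level = 0.0          # accumulated emotional level
        self.gain = gain          # integration gain per step
        self.decay = decay        # leak so that old stimuli fade out
        self.threshold = threshold

    def step(self, sensor_values):
        # Integral mechanism: accumulate the maximum sensor input,
        # capped at 1.0 to stay on the emotional sphere's scale.
        stimulus = max(sensor_values)
        self.level = min(1.0, self.decay * self.level + self.gain * stimulus)
        # Non-sensitive (dead-zone) mechanism: express the emotion
        # only when the accumulated level exceeds the threshold.
        return self.level if self.level >= self.threshold else 0.0
```

A sustained stimulus therefore takes several control steps before the emotion is expressed, which is the intended filtering effect of the integral plus dead-zone combination.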

The emotional generation model is constructed with Augmented Finite State Machines (AFSMs). First, we added the AFSM for obstacle avoidance to the basic motor output. Next, the AFSM for emotional expression was added, using the fuzzy rules described in the previous sections. Finally, we added the AFSM for the automatic emotional generation mechanism. The final emotional generation model for robot behavior is shown in Figure 5. An AFSM with an oblique line at the top denotes the left and right motor outputs. An AFSM with a black upper-left corner denotes the emotional outputs of Joy, Anger and Sadness. The motor output values change according to the weights given by the values of the basic emotional vectors, which are decided by the output values of the emotional generation mechanism.

Figure 6. System configuration of experiment.

Figure 7. Miniature robot Khepera.
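The layering just described can be sketched as a priority-based subsumption arbiter in the spirit of Brooks' SSA. The layer behaviors below are placeholders standing in for the AFSM networks of Figure 5, and the motor values are arbitrary illustrations.

```python
# Sketch of subsumption-style arbitration: a higher layer may suppress
# the outputs of the layers beneath it. Each layer returns a
# (left, right) motor command, or None to defer to a lower layer.

def basic_motor(state):
    return (2, 2)                       # default forward motion

def obstacle_avoidance(state):
    if state.get("obstacle"):
        return (-2, 2)                  # turn away from the obstacle
    return None                         # no opinion: defer downward

def emotional_expression(state):
    if state.get("emotion") == "Anger":
        return (5, 5)                   # rush toward the light source
    return None

# Layers listed from highest priority to lowest, matching the order in
# which the AFSMs were added on top of the basic motor output.
LAYERS = [emotional_expression, obstacle_avoidance, basic_motor]

def arbitrate(state):
    """Return the motor command of the highest layer that responds."""
    for layer in LAYERS:
        cmd = layer(state)
        if cmd is not None:
            return cmd
    return (0, 0)
```

Suppression here is modeled simply as "first non-None wins"; real AFSM networks also use inhibition and timed message passing, which this sketch omits.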

6. EXPERIMENTS

First, we performed several experiments to confirm the degree to which emotional behaviors can be expressed with our emotional simulator based on the basic emotional vectors. In the next experiment, we confirmed the efficiency of the emotional generation model described with SSA.

6.1. System configuration

In this research, we performed experiments using a miniature robot, Khepera, which exchanges control commands and sensing signals with the graphic programming software LabVIEW on a computer (Gateway2000) running Windows 95 through an RS-232C serial cable (see Figure 6). All parameters and control algorithms used in this experiment are represented as a block diagram in LabVIEW. We also constructed a graphical simulator so that we can visually confirm the conditions of the sensors and motors.

Figure 7 shows the external appearance of Khepera. This robot is very small and compact, with a diameter of about 55 mm, and contains sensors, motors, control circuits and batteries in its body. It has six infrared proximity sensors in the front and two in the rear, with which it can detect obstacles at distances of about 20 mm to 50 mm.
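Communication with Khepera over the serial cable can be sketched as below. We assume the standard Khepera ASCII protocol ("D,left,right" to set motor speeds, "N" to read the proximity sensors); the port name and baud rate are assumptions, and pyserial stands in for the LabVIEW block diagram used in the paper.

```python
# Sketch of serial communication with Khepera, assuming its standard
# ASCII command protocol. Port name and baud rate are assumptions.

def speed_command(left, right):
    """Format a 'set motor speeds' command ('D,left,right')."""
    return "D,{},{}\n".format(left, right)

def parse_proximity(reply):
    """Parse a proximity reply such as 'n,12,34,...' into int values."""
    fields = reply.strip().split(",")
    return [int(v) for v in fields[1:]]

def run(port="/dev/ttyS0"):
    import serial  # pip install pyserial; stands in for LabVIEW here
    with serial.Serial(port, 9600, timeout=1) as link:
        link.write(speed_command(2, 2).encode("ascii"))
        link.write(b"N\n")  # request the proximity sensor values
        values = parse_proximity(link.readline().decode("ascii"))
        return values       # 8 readings: 6 front sensors, 2 rear
```

Keeping the command formatting and reply parsing separate from the I/O makes the protocol logic testable without the robot attached.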


Figure 5. Emotional generation model in robot behavior.

6.2. Experiment for emotional behavior

We can start an experiment by clicking the start button in the control panel of the emotional simulator. The IF part is fixed, while the THEN part and the motor output functions are tuned by the lever values of the three basic emotions. An experiment continues until the human operator pushes the stop button. We can change the values of the three basic emotional vectors in real time during the experiment. The motor output values are sent to the miniature robot through the serial cable.

In this research, we first expressed each basic emotional vector using only fuzzy rules. Next, we performed experiments on the behaviors generated by the basic emotions and several other emotions. The vector values and observed behaviors of the basic emotions (Joy, Anger and Sadness) and the other emotions are shown below, where J, A and S denote the values of the basic emotional vectors.

1) Joy [J=1, A=0, S=0]

The robot becomes cheerful and moves around a light source, wandering innocently. As the value of Joy increases, its motion becomes fast and cyclic.

2) Anger [J=0, A=1, S=0]

The robot becomes offensive and rushes toward the light source. As the value of Anger increases, its motion becomes faster. Even if the robot overruns the light, it returns there again.

3) Sadness [J=0, A=0, S=1]

The robot becomes withdrawn and tends to move toward a dark place. If it finds a light source, it moves backwards. As the value of Sadness increases, its motion becomes slower. When it comes to a light source, it begins to back away, vibrating its body.

4) Attack [J=0.1, A=0.5, S=0]

The robot rushes at the light source offensively, but never goes back after it has passed through the light.

5) Defense [J=0, A=0.1, S=0.8]

The robot defends itself from the light source; that is, it keeps facing the light. It moves backwards but never turns its back.

6) Fear [J=0.21, A=0.53, S=0.74]

The robot fears the light source and tries to run away from it. It first comes near the light, then turns toward a dark place as soon as it finds the light.

7) Love [J=0.23, A=0.1, S=0.2]

The robot performs affectionate behavior, coming into contact with the light. It approaches the light slowly and wanders around it, but never goes away from it.


8) Disgust [J=0, A=0.21, S=0.67]

The robot is disgusted by the light. If it finds a light, it moves away from it slowly.

9) Happiness [J=0.5, A=0, S=0]

The robot is happy with the light and moves around it slowly.

10) Pleasant [J=0.24, A=0.2, S=0.43]

The robot is pleased with the light and wanders around it slowly.
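The (J, A, S) settings listed above can be collected into a small lookup table. The nearest-emotion helper below is our own illustrative addition, not part of the paper's simulator.

```python
import math

# (J, A, S) lever settings for the ten behaviors listed above.
EMOTIONS = {
    "Joy":       (1.0,  0.0,  0.0),
    "Anger":     (0.0,  1.0,  0.0),
    "Sadness":   (0.0,  0.0,  1.0),
    "Attack":    (0.1,  0.5,  0.0),
    "Defense":   (0.0,  0.1,  0.8),
    "Fear":      (0.21, 0.53, 0.74),
    "Love":      (0.23, 0.1,  0.2),
    "Disgust":   (0.0,  0.21, 0.67),
    "Happiness": (0.5,  0.0,  0.0),
    "Pleasant":  (0.24, 0.2,  0.43),
}

def nearest_emotion(j, a, s):
    """Illustrative helper: label an arbitrary (J, A, S) point with the
    closest table entry by Euclidean distance in emotion space."""
    return min(EMOTIONS, key=lambda name: math.dist(EMOTIONS[name], (j, a, s)))
```

Such a table makes explicit how the composite emotions sit between the three orthogonal basis vectors.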

We performed various experiments to generate emotions. In the experiments, we put a blue colored paper on the center of Khepera to observe the robot's motion, and a CCD camera traced the locus of the robot by means of the blue paper. Experimental results from the motion tracking are shown in Figure 8. These are photographs of the display of the color image tracker; the white curved lines show the locus of the robot, and the black circle in the center shows the position of the light source. We confirmed the above-mentioned emotional behaviors from these experimental results. We consider it proper that the three basic emotions J, A and S are defined as orthogonal independent vectors. Furthermore, the efficiency of the proposed method was confirmed, because we can generate the other emotions similarly to human emotions.

6.3. Experiment for emotional generation

In the next step, we performed another experiment to confirm the efficiency of the emotional generation model. In the model proposed in this paper, the values of the basic emotional vectors Joy, Anger and Sadness are decided by light sensors, infrared sensors and line sensors, respectively. Under the eight conditions from case 1 to case 8 shown in Table 1, we performed several experiments to confirm the emotional behaviors the robot generates.

Table 1. Experimental conditions.

We again put a blue colored paper on the center of Khepera to observe the robot's motion, and a CCD camera traced the locus of the robot by means of the paper. The motion trajectories in three of the cases, Joy [case 6], Anger [case 4] and Sadness [case 1], are shown in Figure 9. These are photographs of the display of the color image tracker; the white lines show the locus of the robot, and the black circle in the center shows the position of the light source.

In this experiment, it was difficult to recognize the kind of emotion when the value of Anger was equal to that of Sadness. However, we confirmed the efficiency of the emotional generation model described with SSA. It was also confirmed that the robot motions in this experiment are almost the same as the motions in the previous experiment on emotional behavior.

7. CONCLUSIONS

In this research, we proposed an emotional generation model described with SSA for deciding the three basic emotional vectors Joy, Anger and Sadness using only simplified fuzzy rules, and we developed an emotional simulator that can easily generate various emotions by combining these vectors. The singletons in the THEN part and the motor output functions are tuned by weighted addition of the three basic emotional vectors. We confirmed that the three basic emotions are properly expressed with the emotional generation model described with SSA.

However, there were some problems: the robot does not move once it has gone away from a light source, and its motion was not smooth because the motor output functions were defined as first-order functions. In the future, we will try to propose an automatic emotional generation mechanism with a learning function.


Figure 8. Experimental results (1): 1) Joy, 2) Anger, 3) Sadness, 4) Attack, 5) Defense, 6) Fear, 7) Love, 8) Disgust, 9) Happiness, 10) Pleasant.


Figure 9. Experimental results (2): 1) Joy [case 6], 2) Anger [case 4], 3) Sadness [case 1].

REFERENCES

1. Y. Maeda and H. Tanaka; Emotional Simulator for Autonomous Mobile Robot Expressed with Basic Mental State, Proc. of 13th Fuzzy System Symp.; 743-746 (1997). [in Japanese]

2. Y. Maeda; Fuzzy Rule Expression for Emotional Behaviors in an Autonomous Mobile Robot, Proc. of 7th IFSA World Congress; 445-450 (1997).

3. Y. Maeda; Emotional Expression for Autonomous Mobile Robot, Proc. of 5th International Conference on Soft Computing (IIZUKA'98); 1, 243-246 (1998).

4. R. A. Brooks; A Robust Layered Control System for a Mobile Robot, IEEE Journal of Robotics and Automation; RA-2, 14-23 (1986).

5. V. Braitenberg; Vehicles: Experiments in Synthetic Psychology; Vieweg & Sohn Verlagsgesellschaft mbH (1987).

6. T. Musha; Measuring Emotion, Nikkei Science; 26(4) (1996). [in Japanese]

7. K. Yoshida, M. Nagamatsu and T. Yanaru; A Proposal of Emotional Processing System, Proc. of the 4th International Conference on Soft Computing (IIZUKA'96); 2, 802-805 (1996).

8. T. Gomi and K. Ide; Emulation of Emotion Using Vision with Learning, Proc. of Robot and Human Communication (RO-MAN'94) (1994).

9. T. Mochida, A. Ishiguro, T. Aoki and Y. Uchikawa; Behavior Arbitration for Autonomous Mobile Robots Using Emotion Mechanisms, Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'95); 516-521 (1995).
