
Source: sirslab.dii.unisi.it/papers/2014/Prattichizzo.AIM.2014.Other.Fin.pdf

An Object-based Mapping Algorithm to Control Wearable Robotic Extra-Fingers

Domenico Prattichizzo1,2, Gionata Salvietti2 and Monica Malvezzi1

Abstract— One of the new targets of wearable robots is not to enhance lifting strength far above human capability by wearing a bulky robot, but to support human capability within its range by wearing lightweight and compact robots. A new approach regarding robotic extra-fingers is presented here. In particular, an object-based mapping algorithm is proposed to control the robotic extra-fingers by interpreting the whole or a part of the hand motion in grasping and manipulation tasks. As a case study, the model and control of an additional robotic finger are presented. The robotic finger has been placed on the wrist, opposite to the hand palm. This solution enlarges the hand workspace, increasing the grasp capability of the user. The proposed mapping algorithm makes it possible to assist the human operator without requiring explicit commands. Rather, the motion of the extra-fingers is connected to the human hand so that the user can perceive the robotic fingers as an extension of his body.

I. INTRODUCTION

Wearable robots are expected to work closely, to interact and collaborate with people in an intelligent environment [1]. By definition, a wearable robot is a mechatronic system designed around the shape and function of the human body, with segments and joints corresponding to those of the person it is externally coupled with [2]. Such a definition was coined for exoskeletons, whose main purposes were enhancing human body force and precision capabilities [3] or helping in rehabilitation processes [4].

Recently, progress in the miniaturization and efficiency of mechanical and sensing components has extended the field of wearable robotics to new devices which can be seen as extra limbs. In [5], for instance, two additional robotic arms worn through a backpack-like harness are presented. The wearable arms were designed to assist human workers with additional limbs attached to the wearer's body. Such devices can work closely with the wearer by holding an object, positioning a work-piece, lifting a heavy box, etc. [6]. However, finding a trade-off between the wearability, efficiency and usability of such bulky extra-arms remains an issue.

We recently started investigating how to enhance the capability of the human hand by means of wearable robots [7]. The goal was to integrate the human hand with additional robotic fingers, as represented in Fig. 1 for the case of a sixth finger. The human hand is an extremely versatile tool, suited for a huge range of manipulation tasks, exhibiting

1 Università degli Studi di Siena, Dipartimento di Ingegneria dell'Informazione, Via Roma 56, 53100 Siena, Italy. {malvezzi, prattichizzo}@dii.unisi.it

2 Department of Advanced Robotics, Istituto Italiano di Tecnologia, Via Morego 30, 16163 Genoa, Italy. [email protected]

Fig. 1. The concept of extra-finger: the sixth robotic finger is fixed on the wrist, opposite to the hand palm.

flexible and efficient solutions to the features and constraints presented by different tasks [8], [9]. Adding wearable robotic fingers could give humans the possibility to manipulate objects in a more efficient way, enhancing our hand's grasping dexterity. Together with the design issues related to the portability and wearability of the devices, another critical aspect is integrating the motion of the extra-fingers with the human hand. In fact, demonstration-based algorithms such as that presented in [5] lack generality and adaptability to new tasks, while classical techniques for exoskeleton controllers, based on the reading of bio-signals such as EMG, limit wearability [10].

In this paper we present a mapping algorithm able to transfer to the extra-fingers a part of, or the whole, motion of the human hand. The algorithm extends the method proposed in [11] to the case of a human hand augmented with robotic extra-fingers. The mapping algorithm is based on the definition of a set of reference points on the so-called augmented hand, which includes the human hand and the robotic extra-fingers. A virtual sphere is defined as a function of the reference points. When the human hand fingers move, the virtual sphere is moved and deformed. The robotic extra-fingers are then actuated so that the reference points on them follow the virtual sphere transformation. The mapping algorithm can be applied considering the whole human hand in the definition and transformation of the virtual sphere, but it is also possible to use only a part of it, for example three fingers. In this case the remaining fingers, not involved in the mapping process, can be used to perform another task. As an illustrative example, let us consider the task of holding and opening a bottle. This task


is difficult to perform with only one hand. If a robotic extra-finger is available, it can be used together with the middle, ring and pinkie fingers to hold the bottle, while the index and the thumb unscrew the cap. As a case study, we developed a modular finger that can be worn on the wrist with the help of a rubber band. The structure of the modules is obtained using rapid prototyping techniques, while the active degrees of freedom (DoFs) are realized with servomotors. The main aim of the prototype described in this work is to make the human hand more symmetric, so that when the human fingers close (when the phalanxes flex), the extra-finger reflects exactly this motion. This extension increases the human hand workspace and its grasping abilities at the same time.

The paper is organized as follows. In Section II the mapping algorithm is described in detail. In Section III some numerical simulations are reported, while Section IV deals with grasping and manipulation examples with the real prototype. Finally, in Section V, conclusions and future work are outlined.

II. THE TASK-BASED MAPPING ALGORITHM

In this section we describe the general procedure proposed to control the joints of the extra-fingers. The algorithm can be extended to an arbitrary number of extra-fingers. We define two different applications of the mapping algorithm that can be used in grasping and in manipulation tasks. In particular, the whole-hand mapping algorithm considers all the human hand fingers to compute the motion of the robotic devices. This algorithm is particularly suitable for grasping tasks, especially in the approaching phase. The other application is the partial-hand mapping algorithm, which considers only some of the human fingers to compute the motion of the robotic fingers. This last approach is useful in manipulation tasks in which a part of the hand collaborates with the robotic device to hold an object, while the remaining fingers perform a different operation, e.g. unscrewing a cap while holding a bottle.

A. The whole–hand mapping algorithm

Let us define a reference frame S0 on the hand. Its origin O is in the wrist center of rotation, the z axis is perpendicular to the hand palm plane, the x axis is the intersection between the sagittal and the transverse plane, pointing towards the little finger, and the y axis is consequently defined [12]. Let us indicate with p_i^h ∈ R³, i = 1, ..., n_h the coordinates of the reference points on the human hand, expressed w.r.t. S0. In this paper we choose as reference points the five fingertips of the human hand, therefore n_h = 5. Note that the mapping algorithm proposed in [11] does not constrain the choice of the reference point positions. However, the fingertips are a preferable selection since they are at the end of the kinematic chains represented by the fingers and, thus, their positions contain information about all the joint values.

Let us assume that all the p_i^h can be measured or evaluated, for instance by means of an instrumented glove and direct kinematics computation.

Let us define as the augmented hand the system composed of the hand with its five fingers plus the artificial ones. On each robotic finger a reference point is placed at the tip. The augmented hand is then characterized by a set of reference points that is the union of the set of reference points on the human fingers and the set of points on the artificial ones. The reference points for the augmented hand are then p_i^a ∈ R³, i = 1, ..., n_a, with n_a > n_h. The first n_h reference points are the same as those of the human hand, i.e. p_i^h = p_i^a ∀i ≤ n_h, while the remaining n_a − n_h points are those relative to the extra-fingers. Let us indicate with O the minimum volume bounding sphere containing all the n_a reference points of the augmented hand. Let o^h indicate its center and let r^h be its radius. Let us define a reference frame S1 on the virtual sphere, whose origin is in the sphere center and whose axes are, in the reference starting position, parallel to the S0 axes. The reference starting position is arbitrary and does not affect the mapping procedure, as detailed in [11]. Let us assume the augmented hand is in its starting position at time instant t = t0. Assume also that at time instant t = t0 + δt the reference point coordinates p_i^h change due to the motion of the human hand. Let us indicate with ∆p_i^h ∈ R³, i = 1, ..., n_h the vectors containing such coordinate variations. This displacement produces a transformation of the virtual sphere, which in this paper we approximate as the combination of a rigid body motion and an isotropic deformation. The rigid body motion can furthermore be represented as the combination of a translation ∆o^h ∈ R³ and a rotation ∆Φ ∈ R³. The rotation term ∆Φ ∈ R³ is defined as

\[
\Delta\Phi = [\Delta\phi,\ \Delta\theta,\ \Delta\psi]^T,
\]

where ∆φ represents the rotation about the x axis, ∆θ the rotation about the y axis, and ∆ψ the rotation about the z axis. It is worth recalling that, even though rotations between reference frames are not commutative, and the rotation order is therefore important, if the rotation angles are small the rotation order is not significant. The non-rigid isotropic deformation can be described by the parameter ∆s ∈ R, defined as

\[
\Delta s = \frac{\Delta r}{r}.
\]
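The remark above, that the composition order of small rotations is not significant, can be checked numerically. The snippet below is our own illustration (not part of the paper): the commutator of two elementary rotations is second order in the angle.

```python
import numpy as np

def rot_x(a):
    """Elementary rotation of angle a about the x axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    """Elementary rotation of angle a about the y axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

delta = 1e-3  # small rotation angle [rad]
# Commutator R_x R_y - R_y R_x: its entries are O(delta^2),
# negligible w.r.t. the O(delta) rotation itself.
diff = rot_x(delta) @ rot_y(delta) - rot_y(delta) @ rot_x(delta)
assert np.max(np.abs(diff)) < 10.0 * delta**2
```

For ∆φ, ∆θ, ∆ψ on the order of a sampling-time displacement, the ordering error is thus quadratically small, which justifies treating ∆Φ as a vector.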

With these assumptions, we can express the displacement of each reference point on the human hand as follows:
\[
\Delta p_i^h = \Delta o^h + \Delta\Phi \times (p_i^h - o^h) + \Delta s\,(p_i^h - o^h). \qquad (1)
\]

It is worth observing that we approximate the transformation that the hand applies to the virtual sphere as a combination of a rigid body motion, described by the first two terms on the right side of eq. (1), and an isotropic transformation (compression or expansion) described by the third term, which takes into account the sphere radius variation. In this paper we do not consider other types of transformations for the sake of simplicity; however, the method can be extended to include a non-isotropic transformation, as described in [13], and also shear deformations [14]. Eq. (1) can be applied to all the reference points p_i^h, leading to the


following linear system

\[
\Delta p^h = A\,\Delta\xi, \qquad (2)
\]

where ∆p^h = [∆p_1^{hT}, ..., ∆p_{n_h}^{hT}]^T ∈ R^{3n_h} is a vector collecting all the reference point displacements, ∆ξ = [∆o^T, ∆Φ^T, ∆s]^T is a 7 × 1 vector containing the unknown parameters describing the sphere transformation, including the translation term ∆o^h, the rotation term ∆Φ and the isotropic deformation term ∆s, and finally A ∈ R^{3n_h × 7} is the linear system matrix, defined as

\[
A = \begin{bmatrix} A_1 \\ \vdots \\ A_{n_h} \end{bmatrix},
\]
in which each submatrix A_i ∈ R^{3×7} is defined as^1
\[
A_i = \begin{bmatrix} I & -s(p_i^h - o^h) & (p_i^h - o^h) \end{bmatrix}.
\]

The linear system in eq. (2) can be solved to find

\[
\Delta\xi = A^{+}\Delta p^h + N_A\,\psi, \qquad (3)
\]

where A⁺ denotes a generic pseudo-inverse of the matrix A, N_A ∈ R^{7×h} represents a basis of the null space of A, whose dimension is h ≥ 0, and ψ is an arbitrary h-dimensional vector parametrizing the homogeneous solution of the system. When h > 0, the vector ψ can be chosen to optimize a cost function defined on the basis of the task: e.g., when a grasping task is performed we need to assure grasp stability, maximizing the magnitude of ∆s, while in object manipulation tasks, in which the contact forces should be constant, we should maximize ∆o and ∆Φ and minimize ∆s. Once the sphere transformation parameters have been evaluated, we need to generate the command signals for the extra-finger joints. What we impose is that the reference points of the extra-fingers also move according to the transformation parameters computed on the virtual sphere. In particular, we consider that for all p_i^a with n_h < i ≤ n_a,

\[
\Delta p_i^a = \Delta o^h + \Delta\Phi \times (p_i^a - o^h) + \Delta s\,(p_i^a - o^h), \qquad (4)
\]

where the parameters ∆o^h, ∆Φ and ∆s are those computed in eq. (3). The joints of the robotic fingers are then controlled to obtain at the fingertips the displacement evaluated in eq. (4). Let n_q indicate the number of joints of the artificial fingers and let J_a ∈ R^{3(n_a − n_h) × n_q} indicate the finger Jacobian matrix; we can then evaluate the finger joint displacements as

\[
\Delta q^a = J_a^{+}\,\Delta p^a + N_J\,\chi, \qquad (5)
\]

where J_a⁺ denotes a generic pseudo-inverse of the robotic finger Jacobian matrix, N_J ∈ R^{n_q × n_χ} is a matrix whose columns form a basis of the null space of J_a, n_χ represents the dimension of the robotic

1 For any three-dimensional vector v = [v1, v2, v3]^T, s(v) indicates the skew-symmetric matrix associated with v, i.e.
\[
s(v) = \begin{bmatrix} 0 & -v_3 & v_2 \\ v_3 & 0 & -v_1 \\ -v_2 & v_1 & 0 \end{bmatrix}.
\]
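Putting eqs. (2) and (3) together, recovering the sphere transformation from the measured fingertip displacements is a small least-squares problem. The sketch below is our own NumPy illustration, not the authors' implementation: the minimum volume bounding sphere is approximated with Ritter's two-pass heuristic, and only the minimum-norm particular solution of eq. (3) is taken (ψ = 0).

```python
import numpy as np

def skew(v):
    """s(v): skew-symmetric matrix such that s(v) @ u = v x u."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def bounding_sphere(points):
    """Approximate minimum bounding sphere (Ritter's heuristic).
    points: (n, 3) array of reference points. Returns (center, radius)."""
    pts = np.asarray(points, dtype=float)
    # Take the pair of roughly farthest points as the initial diameter.
    q = pts[np.argmax(np.linalg.norm(pts - pts[0], axis=1))]
    r = pts[np.argmax(np.linalg.norm(pts - q, axis=1))]
    center, radius = (q + r) / 2.0, np.linalg.norm(q - r) / 2.0
    # Grow the sphere until every point is enclosed.
    for x in pts:
        d = np.linalg.norm(x - center)
        if d > radius:
            radius = (radius + d) / 2.0
            center = center + (1.0 - radius / d) * (x - center)
    return center, radius

def sphere_transform(p_h, dp_h, o):
    """Solve eq. (2) for Delta_xi = [Delta_o, Delta_Phi, Delta_s].

    p_h:  (n_h, 3) reference points, dp_h: (n_h, 3) displacements,
    o:    (3,) virtual sphere center."""
    # Stack the A_i = [ I  -s(p_i - o)  (p_i - o) ] blocks.
    A = np.vstack([np.hstack([np.eye(3), -skew(p - o),
                              (p - o).reshape(3, 1)]) for p in p_h])
    dxi = np.linalg.pinv(A) @ dp_h.reshape(-1)  # eq. (3) with psi = 0
    return dxi[:3], dxi[3:6], dxi[6]
```

For a non-degenerate fingertip configuration A has full column rank, so a pure isotropic expansion dp_i = ∆s (p_i − o) is recovered exactly as ∆o = 0, ∆Φ = 0 and the imposed ∆s.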

(a) The fingertips of the augmented hand are selected as reference points. The virtual sphere is defined as the minimum volume sphere containing all the reference points, t = t0.

(b) The motion of the human hand displaces the reference points and thus deforms and moves the virtual sphere. The reference angles for the joints of the extra-fingers are computed, t = t0.

(c) The extra-finger is moved according to the rigid body motion and the deformation of the virtual sphere, t = t0 + δt.

Fig. 2. The mapping algorithm.

finger redundancy space, n_χ ≥ 0, and χ ∈ R^{n_χ} is an n_χ-dimensional vector parametrizing the homogeneous part of the solution, which represents the redundant motions of the robotic fingers. Due to the computational time and the dynamics of the robotic fingers, the joint displacements ∆q^a computed at the time sample t = t0 are effectively executed at time t = t0 + δt, where δt indicates the sampling time. This delay does not affect the mapping procedure. The mapping algorithm is pictorially represented in Fig. 2. Once the robotic finger has been actuated, at time t = t0 + δt the virtual sphere is recalculated and the procedure is repeated.
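Eq. (5) is a standard differential inverse kinematics step. The sketch below is our own NumPy illustration (the Jacobian would come from the extra-finger's kinematic model; here the null-space term is written with the projector I − J⁺J, which spans the same redundant motions as a null-space basis N_J).

```python
import numpy as np

def joint_update(J, dp, chi=None):
    """Differential IK step of eq. (5): dq = J^+ dp + (null-space motion).

    J:   (3(na-nh), nq) robotic finger Jacobian.
    dp:  desired fingertip displacement from eq. (4).
    chi: optional joint-space vector selecting redundant motion; None = 0.
    """
    J_pinv = np.linalg.pinv(J)
    dq = J_pinv @ dp
    if chi is not None:
        # Projector onto the null space of J: motions that leave
        # the fingertip displacement unchanged.
        N = np.eye(J.shape[1]) - J_pinv @ J
        dq = dq + N @ chi
    return dq
```

For a Jacobian with full row rank, J @ dq reproduces the commanded fingertip displacement exactly, with or without the null-space term.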

B. The partial-hand mapping algorithm

The human hand can perform complex tasks that involve both grasping and manipulation. As an example, let us consider the task of opening a bottle. Depending on the size of the bottle and on the user's ability, either the task can be performed using only one hand (the bottle is held by the middle, ring and little fingers and the hand palm, while the thumb and the index are in charge of opening it), or the task cannot be performed using only one hand, since the three fingers and the palm are not sufficient to guarantee grasp closure properties [15]. The extra-finger could be used in this example to let the hand hold the bottle, while the thumb and the index unscrew the cap. In such a scenario, it would


be useful to relate the motion of the robotic fingers only to some of the hand fingers, for instance the middle, ring and pinkie fingers. This solution leaves the thumb and the index free to act independently without affecting grasp tightness.

Framed in the procedure previously described, we have n_h = 3 and n_a = 4. It is still possible to define a sphere passing through the reference points, and the A matrix dimensions are 9 × 7, so the linear system defined in eq. (2) can be solved to obtain the transformation parameters ∆o, ∆Φ and ∆s, which can be used to control the motion of the robotic finger. From the theoretical point of view, it is then possible to control the robotic extra-finger mapping only three of the five human fingers.

More complex manipulation tasks could require the use of three fingers, namely the thumb, index and middle, leaving the remaining two and the robotic one to hold the object. This yields n_h = 2 and n_a = 3. The sphere containing the three points is a priori not uniquely determined (to define a sphere, four points are necessary). However, among the infinite spheres that can be defined, we can choose the one with minimum radius, whose center is the geometric center of mass of the points and whose radius is the distance between the reference points and the center. The resulting A matrix has dimensions 6 × 7 and, thus, the linear system in eq. (2) admits infinite solutions. We can reformulate the problem by approximating the transformation that the human fingers apply to the virtual sphere with a rigid body motion, neglecting the deformation. In this case, the reference point displacements are related to the sphere transformation parameters by

\[
\Delta p_i^h = \Delta o + \Delta\Phi \times (p_i^h - o),
\]

for i = 1, 2, which can be rewritten as

\[
\Delta p^h = A_{rb}\,\Delta\xi_{rb},
\]

in which ∆p^h = [∆p_1^{hT}, ∆p_2^{hT}]^T, ∆ξ_rb = [∆o^T, ∆Φ^T]^T and

\[
A_{rb} = \begin{bmatrix} I & -s(p_1^h - o) \\ I & -s(p_2^h - o) \end{bmatrix} \in \mathbb{R}^{6\times 6}.
\]

The matrix A_rb is invertible unless the sphere center o is collinear with the two reference points (in that degenerate case a rotation about the axis through the points leaves both of them fixed and is thus unobservable). We can then estimate the sphere transformation parameters as
\[
\Delta\xi_{rb} = A_{rb}^{-1}\,\Delta p^h.
\]

Such parameters are then used to project the sphere transformation onto the robotic extra-fingers, as previously detailed. It is worth observing that the matrix A_rb corresponds to the transpose of the classical grasp matrix [16]. We furthermore observe that, if the augmented hand is grasping an object with two human fingers and a robotic one, we impose that the robotic finger follows the human ones without changing the internal contact forces [17].

III. NUMERICAL SIMULATION

The performance of the proposed control algorithm has been evaluated through a series of numerical simulations using SynGrasp [18], a Matlab toolbox that aims at simulating

Fig. 3. Scheme of the augmented hand kinematics: links (black lines), reference frames (red, green and blue lines), and joints (cylinders).

and analyzing hand grasping. The toolbox includes functions to define the hand kinematic structure and the contact points with a grasped object. New functions have been added to describe hand models in which extra-fingers are also considered.

Concerning the human hand, in this work we consider the 20-DoF hand kinematic model detailed in [19], [20].

In the thumb, the trapezio-metacarpal articulation (TMC) is represented with a 2-DoF joint with orthogonal axes, while the metacarpo-phalangeal (MCP) and interphalangeal (IP) articulations are represented as single-DoF revolute joints. The proximal and distal interphalangeal articulations (PIP, DIP) of the index, middle, ring and pinkie fingers are represented as one-DoF revolute joints, while the metacarpo-phalangeal (MCP) articulations, connecting the fingers to the palm, are represented as two-DoF joints, each consisting of two revolute joints with orthogonal axes. The 20-DoF structure of the human hand has been augmented with the robotic extra-finger prototype model, which has 4 DoFs: one for the finger abduction/adduction motion and three simulating the phalanges' flexion/extension motion. The resulting augmented hand structure then has 24 DoFs and is schematically represented in Fig. 3. To control the robotic extra-finger, the mapping algorithm previously illustrated has been implemented.

The numerical simulations were devoted to evaluating the role of the robotic extra-finger in grasping tasks. We considered the grasping of simple-shaped objects, namely spheres, cubes, and cylindrical pipes, of different sizes, with the radius/edge varying from 40 mm to 140 mm. We analysed and compared three configurations: (1) the human hand, (2) the complete augmented hand, (3) the augmented hand in which the robotic finger is controlled with only three fingers of the human hand, namely the middle, ring and pinkie fingers.

For each grasp, we analyzed the values of one of the grasp quality indexes, in this case the grasp isotropy index (GII) discussed in [21], defined as the ratio between the minimum σ_min,G and maximum σ_max,G singular values of the grasp



Fig. 4. Grasp isotropy index, i.e. ratio between the minimum and maximum singular values of the grasp matrix G, obtained for different object shapes and sizes, with the human hand, the augmented hand, and the augmented hand with n_h = 3.

matrix G:
\[
GII = \frac{\sigma_{min,G}}{\sigma_{max,G}}.
\]

The index GII measures the contribution of the contact forces to the total wrench exerted on the object, and varies from 0 to 1. Its optimal value, i.e. GII = 1, corresponds to an isotropic grasp, in which the magnitudes of the internal contact forces are similar. Further details on the definition and evaluation of the grasp matrix G are available in [16].
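Given a grasp matrix G, the GII is a one-line singular value computation; a minimal sketch in our own NumPy code (the toy matrix in the usage note is an illustration, not a simulation result):

```python
import numpy as np

def grasp_isotropy_index(G):
    """Grasp isotropy index: ratio of the smallest to the largest
    singular value of the grasp matrix G. Lies in [0, 1];
    1 corresponds to an isotropic grasp."""
    s = np.linalg.svd(np.asarray(G, dtype=float), compute_uv=False)
    return s.min() / s.max()
```

For example, grasp_isotropy_index(np.eye(6)) gives 1.0 (perfectly isotropic), while a matrix with singular values 2 and 1 gives 0.5.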

The results, in terms of the grasp quality index, are summarized in Fig. 4, showing the GII index as a function of the object size, for different grasping conditions and different objects. The robotic extra-finger significantly improves the grasp quality index, in particular for large objects. These results confirm the intuitive observation that an additional robotic thumb improves the grasp capabilities of the human hand by enlarging its workspace. We furthermore observe that


Fig. 5. Examples of grasps evaluated during the numerical simulations. First row: grasping with the human hand; second row: grasping with the augmented hand; third row: grasping with the augmented hand in which the index and the thumb are not active. First column: grasping a sphere; second column: grasping a cube; third column: grasping a cylindrical pipe.

activating the robotic finger using the data from only three fingers of the human hand, we obtain grasps with a slightly lower quality measure GII w.r.t. the complete augmented hand; however, such values are generally higher than those obtained with the human hand alone, especially for larger objects. Fig. 5 shows some examples of grasps realized with the human and the augmented hand.

IV. EXPERIMENTS

In this section we present a case study and a set of possible applications of a robotic extra-finger worn at the wrist, acting as a second thumb. We show a first prototype of the robotic device along with the experimental setup necessary to apply the mapping procedure described in the previous section. The extra-finger has been tested in grasping and in manipulation tasks.

A. Mechanical Design of the extra–finger

The first prototype of the extra-finger has been designed and developed considering a modular structure. The modular approach, together with the rapid prototyping technique used, guarantees high versatility, since the extra-finger can be disassembled and reassembled to form new morphologies suitable for new tasks [22]. Modularity also offers robustness and reliability, considering that robot parts are interchangeable and can be substituted in case of damage [23]. Each module


Fig. 6. The extra-finger module and structure. The blue parts are the servo motors, while the green and red parts form the module structure. Three single-DoF modules are connected through a fourth servomotor to a wrist rubber band. The fourth servomotor accounts for the abduction/adduction motion, while the three modules account for the flexion/extension motion.

TABLE I
DENAVIT-HARTENBERG PARAMETERS FOR THE EXTRA-FINGER. THE PARAMETER a = 42 MM IS THE SAME FOR ALL THE PHALANGES, DUE TO THE MODULAR STRUCTURE OF THE FINGER.

Joint  id.  a_j  alpha_j  d_j  theta_j
MCP    1    0    -pi/2    0    theta_1
MCP    2    a    0        0    theta_2
PIP    3    a    0        0    theta_3
DIP    4    a    0        0    theta_4

is obtained considering a servo motor and two 3D-printed plastic connections which define the module structure. The resulting module has one DoF and the following dimensions: 42 × 33 × 16 mm. In the left part of Fig. 6 the exploded and assembled views of the module are presented. The extra-finger device presented in this paper has a four-DoF modular structure. Three DoFs are obtained with a pitch-pitch connection of three modules. Such a connection is used to replicate the flexion/extension motion of the fingers. The device can be worn on the wrist thanks to a special bracelet consisting of a rubber band and a connection module. A fourth servo motor is contained in the connection module to add the abduction/adduction capability to the device. The complete structure of the extra-finger is reported in the right side of Fig. 6. The modules described here can be easily connected through screws to obtain more complex kinematic structures like the modular hand described in [13].

We report in Table I the Denavit-Hartenberg parameters of the modular device. Concerning the servomotor controller, we used an Arduino Uno board [24]. All the servomotors are PWM controlled.
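With the Table I parameters, the fingertip position of the extra-finger follows from chaining standard Denavit-Hartenberg transforms. A sketch in our own NumPy code (millimetre units, a = 42 mm as in Table I; the joint labels are our annotations):

```python
import numpy as np

A_LEN = 42.0  # phalanx length [mm], same for all modules (Table I)

# (a_j, alpha_j, d_j) rows of Table I; theta_j are the joint variables.
DH_ROWS = [(0.0, -np.pi / 2, 0.0),   # MCP, abduction/adduction
           (A_LEN, 0.0, 0.0),        # MCP, flexion/extension
           (A_LEN, 0.0, 0.0),        # PIP
           (A_LEN, 0.0, 0.0)]        # DIP

def dh_transform(a, alpha, d, theta):
    """Standard Denavit-Hartenberg link transform Rz(theta) Tz(d) Tx(a) Rx(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def fingertip_position(q):
    """Forward kinematics of the 4-DoF extra-finger; q: four joint angles [rad]."""
    T = np.eye(4)
    for (a, alpha, d), theta in zip(DH_ROWS, q):
        T = T @ dh_transform(a, alpha, d, theta)
    return T[:3, 3]
```

In the fully extended configuration (all joint angles zero) the fingertip lies 3a = 126 mm from the base along the first joint axis frame, matching the three stacked 42 mm modules.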

B. Experimental Setup

The mapping algorithm proposed in Sec. II assumes that the positions of the reference points can be measured during the use of the extra-finger. To this aim, a dataglove, the CyberGlove III System [25], was used to capture the motion of the human hand. We considered as reference points the fingertips of the human hand and the fingertip of the extra-finger. The positions of the reference points on the human hand are computed using a direct kinematics algorithm

(a) Box (b) Dice

Fig. 7. Different grasps performed with the aid of an extra–finger.

starting from the measures given by the glove. The reference point on the robotic device was similarly evaluated from the measured module joint angles. Both the dataglove and the Arduino Uno board are connected through serial ports to a PC where the mapping algorithm runs.

C. Possible Applications

The first set of experiments tested the capability of the second–thumb to enlarge the hand workspace in grasping tasks. We grasped objects of different sizes and shapes, considering reference points on all five fingertips of the human hand. The extra–finger was moved using the mapping algorithm previously described. Fig. 7 shows the obtained grasps of two objects. Note that the box in Fig. 7-a and the dice in Fig. 7-b could not be grasped with one hand using only the human fingers.

In Fig. 8 we report two possible applications involving a partial mapping of the hand. In the first example (Fig. 8-a), the task was to open a bottle using only one hand. In this case the middle, ring and pinkie fingertips were used as reference points for the mapping. The bottle was grasped by means of the extra–finger and the middle, ring and pinkie fingers. The index finger and the thumb were left free to move without affecting the grasp tightness, so that it was possible to hold the bottle and unscrew the cap at the same time.

In the other example, reported in Fig. 8-b, the task was to open a door using the handle while carrying a heavy bag with the hand. In this case we used as reference points all the fingers except the thumb. The user was able to turn the handle to open the door using the second–thumb while keeping hold of the bag with the hand.
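The partial mapping used in these two examples amounts to feeding the algorithm a task-dependent subset of the fingertips. A small illustrative sketch of this selection step (the fingertip coordinates and dictionary names are hypothetical, not measured data):

```python
# Fingertip positions from the glove's direct kinematics, in metres.
# Values here are purely illustrative placeholders.
fingertips = {
    'thumb':  (0.05,  0.02, 0.00),
    'index':  (0.09,  0.01, 0.00),
    'middle': (0.10,  0.00, 0.00),
    'ring':   (0.09, -0.01, 0.00),
    'pinkie': (0.08, -0.02, 0.00),
}

# Which fingertips act as reference points, per task (from Sec. IV-C).
TASK_REFERENCE_SETS = {
    'full_grasp':  ('thumb', 'index', 'middle', 'ring', 'pinkie'),
    'open_bottle': ('middle', 'ring', 'pinkie'),           # thumb/index free
    'open_door':   ('index', 'middle', 'ring', 'pinkie'),  # thumb excluded
}

def reference_points(task):
    """Fingertip positions passed to the mapping algorithm for a task."""
    return [fingertips[f] for f in TASK_REFERENCE_SETS[task]]
```

Fingers excluded from the reference set do not influence the virtual object, so they remain free for secondary actions such as unscrewing the cap.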

V. CONCLUSION

In this paper we presented a mapping algorithm to control robotic devices that can be used as extra–fingers. The control signals are computed without requiring explicit commands from the human user, but by interpreting human hand motion. The number of tracked fingers of the human hand can be selected according to the task to be fulfilled, so as to leave some fingers free to move without interfering with the device's behavior. As a case study, we developed a four-DoF device


(a) Open a bottle (b) Open the door

Fig. 8. Different tasks performed with the aid of the second–thumb.

that can be worn on the user's wrist by a rubber band. Such a device plays the role of an extra thumb, enlarging the human hand workspace and the hand dexterity. We provided several numerical and real experiments to prove the usability of the object-based mapping algorithm. We are currently improving the design and wearability of the robotic device, and we are testing multiple fingers worn simultaneously. Even if the study is in its preliminary phase, the first results, both numerical and experimental, show the feasibility and the usability of this type of device. Adding a supplementary robotic finger to the human hand improves its workspace size, allowing the user to grasp and manipulate larger objects. Furthermore, it enhances the range of tasks that can be performed by the hand. We believe that the object-based mapping algorithm introduced in this paper could represent an interesting paradigm for the control of wearable robots.

ACKNOWLEDGMENT

The authors are grateful to Irfan Hussain for his help in assembling the prototype of the extra–finger.

REFERENCES

[1] S. Mohammed and Y. Amirat, "Towards intelligent lower limb wearable robots: Challenges and perspectives - state of the art," in Robotics and Biomimetics, 2008. ROBIO 2008. IEEE International Conference on, pp. 312–317, Feb 2009.

[2] J. L. Pons et al., Wearable Robots: Biomechatronic Exoskeletons, vol. 338. Wiley Online Library, 2008.

[3] A. B. Zoss, H. Kazerooni, and A. Chu, "Biomechanical design of the Berkeley lower extremity exoskeleton (BLEEX)," IEEE/ASME Transactions on Mechatronics, vol. 11, no. 2, pp. 128–138, 2006.

[4] A. Gupta and M. K. O'Malley, "Design of a haptic arm exoskeleton for training and rehabilitation," IEEE/ASME Transactions on Mechatronics, vol. 11, no. 3, pp. 280–289, 2006.

[5] B. Llorens-Bonilla, F. Parietti, and H. H. Asada, "Demonstration-based control of supernumerary robotic limbs," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3936–3942, IEEE, 2012.

[6] F. Parietti and H. H. Asada, "Dynamic analysis and state estimation for wearable robotic limbs subject to human-induced disturbances," in IEEE International Conference on Robotics and Automation (ICRA), pp. 3880–3887, IEEE, 2013.

[7] O. A. Atassi, "Design of a robotic finger for grasping enhancement," Master's thesis, Università degli Studi di Siena, December 2011.

[8] L. A. Jones and S. J. Lederman, Human Hand Function. Oxford University Press, 2006.

[9] B. Buchholz and T. J. Armstrong, "A kinematic model of the human hand to evaluate its prehensile capabilities," Journal of Biomechanics, vol. 25, no. 2, pp. 149–162, 1992.

[10] K. Kiguchi, T. Tanaka, and T. Fukuda, "Neuro-fuzzy control of a robotic exoskeleton with EMG signals," IEEE Transactions on Fuzzy Systems, vol. 12, no. 4, pp. 481–490, 2004.

[11] G. Gioioso, G. Salvietti, M. Malvezzi, and D. Prattichizzo, "Mapping synergies from human to robotic hands with dissimilar kinematics: an approach in the object domain," IEEE Transactions on Robotics, 2013.

[12] G. Wu, F. C. Van der Helm, H. Veeger, M. Makhsous, P. Van Roy, C. Anglin, J. Nagels, A. R. Karduna, K. McQuade, and X. Wang, "ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion–part II: shoulder, elbow, wrist and hand," Journal of Biomechanics, vol. 38, no. 5, pp. 981–992, 2005.

[13] G. Gioioso, G. Salvietti, M. Malvezzi, and D. Prattichizzo, "An object-based approach to map human hand synergies onto robotic hands with dissimilar kinematics," in Robotics: Science and Systems VIII, Sydney, Australia: The MIT Press, July 2012.

[14] G. Salvietti, M. Malvezzi, G. Gioioso, and D. Prattichizzo, "On the use of homogeneous transformations to map human hand movements onto robotic hands," in Proc. IEEE Int. Conf. on Robotics and Automation, (Hong Kong, China), 2014.

[15] A. Bicchi, "On the closure properties of robotic grasping," The Int. J. of Robotics Research, vol. 14, no. 4, pp. 319–334, 1995.

[16] R. Murray, Z. Li, and S. Sastry, A Mathematical Introduction to Robotic Manipulation. 1994.

[17] A. Bicchi, "Force distribution in multiple whole-limb manipulation," in Proc. IEEE Int. Conf. Robotics and Automation, (Atlanta), pp. 196–201, 1993.

[18] M. Malvezzi, G. Gioioso, G. Salvietti, D. Prattichizzo, and A. Bicchi, "SynGrasp: a MATLAB toolbox for grasp analysis of human and robotic hands," in Proc. IEEE Int. Conf. on Robotics and Automation, (Karlsruhe, Germany), pp. 1088–1093, 2013.

[19] M. Gabiccini, A. Bicchi, D. Prattichizzo, and M. Malvezzi, "On the role of hand synergies in the optimal choice of grasping forces," Autonomous Robots, pp. 1–18.

[20] D. Prattichizzo, M. Malvezzi, M. Gabiccini, and A. Bicchi, "On motion and force controllability of precision grasps with hands actuated by soft synergies," IEEE Transactions on Robotics, in press, pp. 1–17, 2013.

[21] B.-H. Kim, S.-R. Oh, B.-J. Yi, and I. H. Suh, "Optimal grasping based on non-dimensionalized performance indices," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), vol. 2, pp. 949–956, IEEE, 2001.

[22] F. Sanfilippo, G. Salvietti, H. Zhang, H. P. Hildre, and D. Prattichizzo, "Efficient modular grasping: an iterative approach," in Proc. IEEE Int. Conf. on Biomedical Robotics and Biomechatronics, (Rome, Italy), pp. 1281–1286, 2012.

[23] M. Yim, W.-M. Shen, B. Salemi, D. Rus, M. Moll, H. Lipson, E. Klavins, and G. S. Chirikjian, "Modular self-reconfigurable robot systems," IEEE Robotics and Automation Magazine, vol. 14, no. 1, pp. 43–52, 2007.

[24] Arduino, "Arduino Uno, an open-source electronics prototyping platform."

[25] Immersion Technologies, "CyberGlove wireless system." Online: http://www.cyberglovesystems.com/.