
Decision Support Systems 50 (2010) 213–221

Contents lists available at ScienceDirect

Decision Support Systems

journal homepage: www.elsevier.com/locate/dss

A pervasive P3P-based negotiation mechanism for privacy-aware pervasive e-commerce

Ohbyung Kwon ⁎
School of Management, Kyung Hee University, Seoul, South Korea

⁎ Tel.: +82 2 961 2148; fax: +82 2 961 0515. E-mail address: [email protected].

0167-9236/$ – see front matter © 2010 Elsevier B.V. All rights reserved.
doi:10.1016/j.dss.2010.08.002

Article info

Article history: Received 12 November 2007; Received in revised form 23 July 2010; Accepted 8 August 2010; Available online 11 August 2010

Keywords: P3P; Pervasive computing; Pervasive e-commerce; Agent technology; Privacy-preserving system; Context-awareness

Abstract

Privacy management is crucial in conducting pervasive computing services. The Platform for Privacy Preferences (P3P) is one of the most significant efforts currently underway for users of Web-based services. Since users are typically nomadic in pervasive computing services, however, their specific privacy concerns change dynamically with context. This leads us to develop a dynamically adjusting P3P-based policy for a personalized, privacy-aware service as a core element of secure pervasive computing. The purpose of this paper is to propose a pervasive P3P-based negotiation mechanism that controls privacy dynamically and flexibly. To do so, we design and implement a multi-agent negotiation mechanism on top of a pervasive P3P system.


© 2010 Elsevier B.V. All rights reserved.

1. Introduction

Along with hardware, network protocols, interaction substrates, applications, and computational methods, privacy control is among the primary issues in pervasive computing. Privacy usually refers to personal information, and the invasion of privacy is usually interpreted as the unauthorized collection, disclosure, or other use of personal information as a direct result of electronic commerce transactions [47]. Privacy concerns are a significant social issue in electronic society, including e-commerce stakeholders and virtual communities [14,31]. Privacy concerns are known to influence stakeholders' decision processes [45]. In pervasive computing systems, which make use of user-related information to interact more naturally with users through a set of devices in a task environment, users' sensitive information could be collected by a variety of service providers, potentially threatening their privacy [39]. Lahlou et al. addressed the fear of filling out forms found among service users, and hence stressed the importance of enhanced privacy guidance built into the invisible computers of pervasive computing environments [27,28]. Privacy issues, as well as agent-based automation issues, are more critical in context-aware services that run in pervasive computing environments [4,24,26,43]. Privacy control might not be scalable if the information about privacy control is aggregated centrally. The unified privacy tagging project is an ongoing effort to prevent undesirable object aggregations in context-aware systems in a scalable manner [20].

To fully realize the potential of pervasive computing, application designers must incorporate users' personal privacy preferences [9]. To protect users' privacy in ubiquitous or pervasive computing settings, the Privacy Profile Negotiation Protocol (PPNP) has been proposed. PPNP, initiated by the Tohda Lab at Keio University in Japan, manages users' privacy profiles by permitting transfer of profiles only to trusted services. In a similar effort, ETH Zurich has been developing a privacy-aware system that implements P3P.

The Platform for Privacy Preferences (P3P) is an official effort of the World Wide Web Consortium (W3C) to enable Web-based service users to gain control over their private information [12]; in particular, it is designed to automatically determine with which service providers users' personal information may be shared. P3P provides a way for a Web site to encode its data-collection and data-use practices in a machine-readable XML format known as a P3P policy [10]. So far, however, efforts to balance privacy protection and service quality have proven ineffective [33].

The setup of a pervasive computing environment with P3P policies for pervasive e-commerce should be quite feasible. In recent years, many research teams have focused on P3P-based privacy-aware ubiquitous or pervasive computing systems and services [15,17,22,29,30,34,42,48,49]. Using the P3P extension framework, Langheinrich has implemented a mechanism for describing dissemination practices based on the location of the data collection for a privacy-aware system [30].

To extend privacy-aware pervasive e-commerce protocols based on P3P, however, one must consider a large number of sensors, data exchanges, and users with a variety of preferences so that P3P might be applicable; in other words, privacy concerns in pervasive e-commerce are significantly dependent upon the context of the user. A context-free


uniform privacy policy would probably work poorly in a pervasive e-commerce model. Despite the availability of P3P, there are still few mechanisms to negotiate differences in systems or to advise users how to achieve their goals [14]. Based on the outstanding research on negotiation strategies between Web sites and user agents done by Cranor and Resnick, context-awareness is essential to practical P3P-based protocols [11]. No matter what the differences among their levels of privacy preference and trust are, users may be treated with the same blanket privacy policy. This approach may disappoint loyal users, hence putting that loyalty directly at risk. To resolve these concerns, providing a personalized privacy policy that takes into account both users' individual characteristics and their point-of-service context in an automatic, dynamic, and flexible way would help realize an unobtrusive and secure pervasive computing service. As Cespedes and Smith have argued, users may fear the widespread availability and use of their personal information [8]. Moreover, the dimensionality of the issues, which include collection, errors, secondary use, and improper access, might not be absolute or fixed, as users' perceptions and surrounding conditions continually change [40]. Hence, what personal information, including contextual data, is collected and how it is used should be made explicitly known to the user and then adjusted according to the user's context.

To date, studies on privacy in pervasive computing services consider policy matching, which is useful in protecting user data. Researchers assumed in these studies, however, that no negotiation mechanism is needed when interchanging user data and services. The privacy-aware Salsa agent system is notable in that it considers the negotiation phase between users and brokers. In that system, activation of the negotiating state can occur when the broker rejects the request of a user agent accessing a service agent [44]. Salsa, however, should be extended to allow agents to negotiate over the privacy policy itself. This leads us to extend legacy P3P-based privacy management in order to enhance freedom in registering the nomadic user's personal information. To do so, the following unresolved research issues must be addressed: first, the service provider's privacy policy should be dynamically designed according to the user's current context, because the nomadic user's privacy concern structure may vary with context. This functionality would decrease the service provider's negotiation effort. Second, the provider's privacy policy should be personalized to individual users. In particular, a user's reputation value is crucial for privacy policy negotiation. Reputation has been regarded as crucial in electronic marketplaces, and hence pivotal in building privacy-aware services that enable users to manage privacy effectively [21,23,50]. Reputation is important in determining the level of risk to which individuals are exposing themselves and is therefore needed to develop an appropriate privacy policy. Reputation hence provides an operable metric for establishing trust between unknown entities [36]. Users with a better reputation will be less likely to have service providers require detailed personal information in order to provide better service. Reputation information is managed by maintaining a history of user behavior in privacy-aware ubiquitous computing systems [38]. eBay's Feedback Forum is, to our knowledge, one of the most famous systems to have successfully incorporated reputation [40]. Service providers may want the flexibility to apply different privacy policies to different users according to their reputation values.

This paper proposes the concept of P4P (Pervasive Platform for Privacy Preferences), an extension of P3P, and develops a P4P-based negotiation methodology for privacy-aware pervasive computing services. The methodology includes context-aware policy design and personalization. It also takes into consideration a user's personal profile – current location, demographic data, reputation – as context. Moreover, the proposed methodology balances privacy protection and service quality. To do so, first, we consider privacy protection by allowing users to negotiate with services over the data elements they submit according to their privacy preferences; in other words, the

user can intervene on “required data elements.” Second, giving users rights over required data elements may decrease service quality in terms of service utilization, since a lack of user-specific data may result in fewer provided services from the service provider's viewpoint. To resolve this problem, we use the reputation mechanism to selectively accept the user's privacy concerns.

The remainder of this paper is organized as follows: Section 2 reviews privacy issues and P3P-aware systems. Section 3 describes a negotiation mechanism based on the concept of P4P. In Section 4, a performance test is presented, and concluding remarks are made in Section 5.

2. Related work: privacy issues and P3P

In recent years, privacy protection has been among the most active and spotlighted issues in electronic business. Accordingly, developing solutions to these privacy concerns that are both technically and socially secure is important. For example, many commercial Web sites provide a privacy policy, because these sites require personal information such as name, e-mail address, certain preferences, and even a social security number (SSN). Specifying a site's privacy policy to inform users before they register for services has been regarded as a sound way to mollify users' privacy concerns.

The design of P3P is partially derived from the code of ethics for user agents. Based on the CMA code of ethics, it prescribes that a user agent should follow such a code in terms of notice and communication, choice and control, fairness and integrity, and security [46].

To specify a privacy policy in a complete and standardized way, policy specification languages such as EPAL and P3P have emerged. Among these, the P3P specification defines the syntax and semantics of P3P-based privacy policies. Since the specifications are in a machine-readable format, user agents can understand P3P specifications and then automate decision-making on behalf of their users when appropriate, so that users need not read the privacy policy of every site they visit. The user agent is a program that mediates interactions with services on behalf of the user according to her preferences. Various privacy-preserving systems based on P3P have been proposed over the last decade. Ackerman proposed a technical mechanism to inform users of data requests and their consequences [2]. A privacy control module can be embedded in privacy-aware systems as middleware [18]. The Personal Context Agent Networking (PeCAN) knowledge architecture consists of both client- and Web-side architectural data components and services, which inform the user of online privacy and trust within e-commerce tasks [22]. Representative commercial examples that use a P3P policy include Microsoft's Internet Explorer 6.0 and AT&T's Privacy Bird.

The flow of personal information in pervasive computing services has been explained with economic models. Acquisti described an economic model as a function of the expected benefits of completing the transaction, including the expected benefits of maintaining information privacy [3]. Jiang et al. [19] used an economics-based approach to analyze information flow in ubiquitous computing. Price and Adam [35] described a framework that allows users flexible control over releasing personal data in a ubiquitous computing environment.

However, current research studies seldom address methods for resolving user privacy concerns in a pervasive computing environment. To resolve this problem, discussions are being held to create a better, more complete specification of P3P [5]. A privacy-aware service in a pervasive computing environment for nomadic users requires amended and tailored P3P specifications.

3. System overview

3.1. Overall framework

To realize a P4P-aware pervasive computing service, an overall negotiation framework is proposed, as shown in Fig. 1. To determine

[Fig. 1. Overall negotiation framework: within a pervasive computing service zone, a nomadic user's user agent, holding the user's privacy preference, negotiates via a Negotiator with a service agent, which holds the privacy policy; context is exchanged through the P4P/service with a negotiated interface.]

Table 1. Proposed context model.

Context category       Context field                  Examples
Activity               Public / Moderate / Private    Current schedule
Location               Public / Moderate / Private    Current location
Computational entity   Available / Unavailable        Device, network, TV channel, etc.
Social                 Public / Private / Alone       Nearby person
Physical environment   Excellent / Moderate / Poor    Temperature, climate, etc.
Identity               High / Moderate / Low          Reputation


optimal service usage, the framework should take the following requirements into account:

• Enable dynamic P3P policy change according to the user's privacy preference.

• Enable dynamic P3P policy change according to the user's current context.

Since the policy reference in the P3P 1.0 specification allows the P3P policy to be located in a well-known location, we assume that the privacy preference is stored as an ontology file, so that user agents may easily access each other for a negotiation to execute a service.

In a pervasive computing service environment, the user interface for nomadic users will not be restricted to a Web browser, but may be extended to any interface that adopts advanced technologies such as multimodal interaction, augmented reality, and motion recognition. For this reason, the framework should consider a pervasive computing service zone for services with P4P-negotiated interactions. In this zone, the user's current context data can be detected by an array of sensors and then delivered to user agents through the sensor network to provide service in advance.

Entering the service zone, the user agent accesses the service list to select a service on the user's behalf. Pervasive computing services obviously need to be aware of the current regulatory regime so that they can comply with it [35].

Hence, a basic P4P interaction might proceed as follows:

• The user agent identifies the user's current context and formulates the user's dynamic privacy policy file.

• The user agent sends the user's dynamic privacy policy to the service agent to request the preference policy URI of the service agent.

• The service agent considers the user's privacy policy, customizes the P3P proposals, and then informs the user agent of the URI.

• The user agent visits the URI, and one or more P3P proposals are retrieved.

• The user agent evaluates the proposals according to the user's privacy preference rule set and determines what action to take (e.g., deny, accept, prompt, or send a counter-proposal).

• If a proposal is consistent with the user's preferences, then an agreement is reached. The agent sends the service the ID of the proposal.

• The service provides the user with the customized pervasive service.
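The interaction steps above can be sketched as a simple agent loop. The following Python is an illustrative sketch only, not the paper's implementation; all class and method names (UserAgent, ServiceAgent, policy_uri, negotiate) are our own assumptions.

```python
# Hypothetical sketch of the basic P4P interaction between a user agent
# and a service agent; names and structures are illustrative assumptions.

class ServiceAgent:
    def __init__(self, proposals):
        # proposals: mapping of proposal ID -> P3P-style proposal dict
        self.proposals = proposals

    def policy_uri(self, user_policy):
        # Customize P3P proposals for the user's dynamic policy and
        # return the URI where they can be retrieved (stubbed here).
        return "/p4p/proposals"

    def retrieve(self, uri):
        # Return the proposals published at the given URI (stubbed).
        return self.proposals


class UserAgent:
    def __init__(self, preference_rules):
        # preference_rules: callable(proposal) -> "accept" | "deny" | ...
        self.preference_rules = preference_rules

    def negotiate(self, service, context):
        # 1. Formulate the dynamic privacy policy from the current context.
        user_policy = {"context": context}
        # 2-4. Request the proposal URI and retrieve the proposals.
        uri = service.policy_uri(user_policy)
        proposals = service.retrieve(uri)
        # 5-6. Evaluate each proposal; on acceptance, send its ID back.
        for pid, proposal in proposals.items():
            if self.preference_rules(proposal) == "accept":
                return pid  # agreement reached
        return None  # no acceptable proposal: deny or counter-propose


agent = UserAgent(lambda p: "accept" if not p.get("requires_gps") else "deny")
service = ServiceAgent({"p1": {"requires_gps": True},
                        "p2": {"requires_gps": False}})
assert agent.negotiate(service, {"activity": "public"}) == "p2"
```

A real deployment would replace the stubbed URI handling with actual retrieval and add the prompt and counter-proposal actions listed above.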

3.2. Pervasive Platform for Privacy Preferences (P4P)

P4P is an extension of conventional P3P that additionally considers specifications relevant to context-sensitive privacy control. A tree structure is used to represent data elements in P3P specifications. For example, the data element “vehicle.model” is a child of the data element “vehicle.” However, since the current P3P specification does not consider contextual data, we extend it to include dynamically changing data elements using the existing notation. To represent these dynamic elements, we suggest the following method:

P3P data schema.CONTEXT.context_field.context_value.

Note that the default value for context_value is false. For instance, if a user provides only two declarations for privacy preferences on GPS position data, such as:

user.current.GPS_position.Activity.public.true and
user.current.GPS_position.Location.moderate.true,

then the user's current GPS position is required when his or her activity is public and when the location is a moderate space, respectively. Other than for those two declarations, the context value is set to false, which indicates that the data is not required in such a context. For example, if we do not declare user.current.GPS_position.Activity.public, then user.current.GPS_position.Activity.public is automatically set to false.
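The default-false semantics can be made concrete with a small lookup sketch. The declaration strings follow the paper's notation (with the trailing true/false represented as a boolean value); the parsing code itself is a hypothetical illustration.

```python
# Minimal sketch of default-false semantics for context-qualified
# data-element declarations; any declaration that is absent is False.

declarations = {
    "user.current.GPS_position.Activity.public": True,
    "user.current.GPS_position.Location.moderate": True,
}

def is_required(data_element, context_field, context_value):
    # Build the dotted key and fall back to False when undeclared,
    # i.e., the data is not required in that context.
    key = f"{data_element}.{context_field}.{context_value}"
    return declarations.get(key, False)

assert is_required("user.current.GPS_position", "Activity", "public") is True
# Undeclared combination: automatically false.
assert is_required("user.current.GPS_position", "Activity", "private") is False
```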

To deliver the context data elements to service providers, context information can be acquired from either a personal context ontology or a user agent. Table 1 lists the categories in the P4P context model. The context model partially relies on Dey's context classifications: time, identity, location, activity, and computational entity, which are widely adopted in context-aware system development [13,25]. Social context is included in the category because it is becoming more important for community-based systems [1,47]. Among these context categories, this paper focuses on identity, especially reputation, to explain the proposed methodology.
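For use by a user agent, the Table 1 context model can be encoded as plain data. The structure below is our own illustrative encoding; the categories, fields, and examples are taken from the table.

```python
# Table 1 context model encoded as a simple dictionary; the encoding is
# an illustrative assumption, the content mirrors the paper's Table 1.

CONTEXT_MODEL = {
    "Activity": ["Public", "Moderate", "Private"],              # current schedule
    "Location": ["Public", "Moderate", "Private"],              # current location
    "Computational entity": ["Available", "Unavailable"],       # device, network, TV channel
    "Social": ["Public", "Private", "Alone"],                   # nearby person
    "Physical environment": ["Excellent", "Moderate", "Poor"],  # temperature, climate
    "Identity": ["High", "Moderate", "Low"],                    # reputation
}

def validate(category, value):
    # Check that a context value is legal for its category.
    return value in CONTEXT_MODEL.get(category, [])

assert validate("Identity", "High")
assert not validate("Identity", "Unknown")
```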

3.3. Negotiation mechanism

Privacy concerns arise when there is tension for an individual between the gains earned by providing information and the need to hide personal information [37]. For user privacy, we adopt an economics-based model of privacy-efficiency trade-offs, similar to Milne and Gordon's, for the negotiation mechanism [32]. Hence, we view the P4P-based privacy-aware system as a process of social contracts between the client, as privacy data supplier, and the service provider, as demander. The first step in the process of addressing the privacy-

[Fig. 2. Trade-off relationship for negotiation: the cost of surveillance, α_{2,i} e^{β_{2,i} s_i}, rises and the cost of low service quality, α_{1,i} e^{−β_{1,i} s_i}, falls with the level of surveillance s_i; their sum y_i = f_i(s_i | c_i, p_i, o_i, r_i) is minimized at s_i*.]


efficiency trade-off is identifying the attributes that affect the overall process of social contracts. In this paper, we focus on the user's reputation. The contracting dyads then establish privacy proposals, which may result in different levels of risk: on the one hand, the client worries about low service quality and a high surveillance level; on the other hand, the service provider worries about having too low a surveillance level and its client's potential withdrawal due to poor service quality. Hence, the dyads intend to find the optimal level of surveillance that minimizes both the cost of surveillance and the cost of low service quality.

The economics-based model seeks an optimal privacy policy in terms of cost, which is calculated via the trade-off between the cost of low service quality and the cost of surveillance, i.e., delivering one's private information to the other side. The cost incurred by a negotiation breakdown resulting from the discrepancy of interests between the user and the service provider is the cost of low service quality. The cost of surveillance, on the other hand, denotes the economic and/or psychological concern incurred by delivering private data to service providers in order to be served. Privacy preferences and actual behavior often exhibit a trade-off relationship, which can sometimes cause complications for users [7].

These cost functions are suggested by the authors based on two underlying theoretical models: the economic rationality model and the bounded rationality model. Under the economic rationality model, via rational choice theory, the service provider will invest in a surveillance system in order to maximize the provider's profit [6]. The service provider is interested in and motivated by the surveillance factors that can increase the provider's service quality, and hence yield more benefit and ultimately less cost. The marginal benefit of surveillance increases while its marginal cost decreases. The inverse is also true: the cost of low service quality increases exponentially as surveillance levels decrease, and the cost of surveillance increases exponentially as lower-priority surveillance factors are considered in the surveillance system.

Secondly, Simon's bounded rationality, the basis of decision support system research, is also adopted in the model [41]. According to bounded rationality, decision makers may act differently given the same information, which here is (c_i, p_i, o_i, r_i). In a dynamic P3P context, bounded rationality is reflected in the shapes of the cost functions, which can differ from each other. We encode the decision maker's subjective values by designing the cost functions with two parameters, α and β, and we also incorporate contextual information into the model to better explain the decision maker's boundedly rational behavior.

Hence, as shown in Fig. 2, for a data element i, the cost of surveillance, y = α_{2,i} e^{β_{2,i} s_i}, and the cost of low service quality, y = α_{1,i} e^{−β_{1,i} s_i}, depend on the level of surveillance, s_i.

The negotiating method is represented as follows:

STEP 1: the user agent requests the service provider's P3P policy. The P3P policy Ψ is represented as:

Ψ = {(c_1, p_1, o_1, r_1), …, (c_i, p_i, o_i, r_i), …, (c_N, p_N, o_N, r_N)}    (1)

where N is the number of data elements included in the P3P policy, and c_i ∈ C, p_i ∈ P, o_i ∈ O = {always, opt-in, opt-out}, r_i ∈ R, where C, P, O, and R indicate the sets of categories, purposes, options, and retentions, respectively. The categories considered in this paper are described in Table 1. The purpose element describes why the data is being collected, relevant to the Web site. The retention element indicates how long the site will keep the personal information. These four factors are provided in the base P3P specification.

STEP 2: the user agent produces an optimal solution by minimizing the total cost. The total cost on the user's side is determined by the data element whose optimal cost of surveillance is greater than that of any other data element. A data element's optimal cost is derived as follows. For all i, f_i(s_i | c_i, p_i, o_i, r_i) = α_{1,i} e^{−β_{1,i} s_i} + α_{2,i} e^{β_{2,i} s_i}, since the cost of low service quality is y = α_{1,i} e^{−β_{1,i} s_i} and the cost of surveillance is y = α_{2,i} e^{β_{2,i} s_i}. Hence, the optimal level of surveillance of the ith data element, s_i*, with the given c_i*, p_i*, o_i*, r_i*, is derived as Eq. (2):

s_i* = [ln(α_{1,i} β_{1,i}) − ln(α_{2,i} β_{2,i})] / (β_{1,i} + β_{2,i})    (2)

Then the total cost on the user's side is represented as in Eq. (3):

TC_U = max{ f_1(s_1* | c_1*, p_1*, o_1*, r_1*), f_2(s_2* | c_2*, p_2*, o_2*, r_2*), …, f_N(s_N* | c_N*, p_N*, o_N*, r_N*) }    (3)

where s_i* denotes the optimal level of surveillance of the ith category.

STEP 3: suppose that TC_U = f_M(s_M* | c_M*, p_M*, o_M*, r_M*), 1 ≤ M ≤ N. Then the optimal set of user preferences, {s_1*, s_2*, …, s_i*, …, s_N* | c_1*, …, c_N*, p_1*, …, p_N*, o_1*, …, o_N*, r_1*, …, r_N*}, is passed to the Negotiator, so that the Negotiator can compare the user's preferences to those of the service agent.

STEP 4: for any category i, the service agent sets s_i = s_i*. Then the total cost of the service provider is as in Eq. (4):

TC_S = max{ g_1(p_1*, o_1*, r_1*, c_1* | s_1), …, g_j(p_j*, o_j*, r_j*, c_j* | s_j), …, g_N(p_N*, o_N*, r_N*, c_N* | s_N) }    (4)

STEP 5: the optimal set of the service provider's preferences is as in Eq. (5):

{p_1*, …, p_j*, …, p_N*, o_1*, …, o_j*, …, o_N*, r_1*, …, r_j*, …, r_N*, c_1*, …, c_j*, …, c_N* | s_1, …, s_i, …, s_N}    (5)

Then, Eq. (5) is passed to the Negotiator.
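Eq. (2) follows from setting the derivative of f_i to zero, and Eq. (3) is a maximum over per-element optimal costs, so both can be checked numerically. The sketch below implements them directly; the parameter values are made up purely for illustration.

```python
import math

# Sketch of STEPs 2-3: each data element i has a cost of low service
# quality a1*exp(-b1*s) and a cost of surveillance a2*exp(b2*s); the
# optimal surveillance level s* minimizes their sum, and the user-side
# total cost TC_U is the maximum optimal cost over all data elements.

def optimal_surveillance(a1, b1, a2, b2):
    # d/ds [a1 e^{-b1 s} + a2 e^{b2 s}] = 0 yields Eq. (2):
    # s* = [ln(a1 b1) - ln(a2 b2)] / (b1 + b2)
    return (math.log(a1 * b1) - math.log(a2 * b2)) / (b1 + b2)

def element_cost(a1, b1, a2, b2):
    # Cost of data element i evaluated at its optimal level s*.
    s = optimal_surveillance(a1, b1, a2, b2)
    return a1 * math.exp(-b1 * s) + a2 * math.exp(b2 * s)

def total_user_cost(elements):
    # Eq. (3): elements is a list of (a1, b1, a2, b2) tuples.
    return max(element_cost(*e) for e in elements)

# Made-up parameters: for a1=4, b1=1, a2=1, b2=1, Eq. (2) gives
# s* = (ln 4 - ln 1) / 2 = ln 2, and the optimal cost is 2 + 2 = 4.
elements = [(4.0, 1.0, 1.0, 1.0), (2.0, 0.5, 1.0, 0.5)]
assert abs(optimal_surveillance(*elements[0]) - math.log(2)) < 1e-12
assert abs(total_user_cost(elements) - 4.0) < 1e-9
```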


STEP 6: the Negotiator asks the User agent whether the service provider's optimal preference is acceptable. If so, stop. If not, preference relaxation is performed according to the following rules:

[Rules of preference relaxation for User agent]

Rule 1 (Reputation rule): send any privacy-free profile to prove that the user is trustworthy, so that the Service agent may optimize its preference by requiring a higher condition category value for the user. The privacy-free profile is closely related to estimating the user's reputation.
Rule 2: while keeping the total cost on the user's side (TC_U) unchanged, change the levels of surveillance (s_i) except the optimal level s_M, because s_M is the determinant of TC_U.
Rule 3: if the User agent is prompted to allow more surveillance (s_i), increasing the total cost on the user's side (TC_U), the User agent should ask the user for permission to do so.

[Rules of preference relaxation for Service agent]

Rule 1: send any privacy-free profile to prove that the service provider is trustworthy, so that the User agent can optimize its preference by requiring category values with better conditions from the service provider.
Rule 2: while keeping the total cost on the seller's side (TC_S) unchanged, change the levels of (c_j, p_j, o_j, r_j) except (c_j*, p_j*, o_j*, r_j*), because (c_j*, p_j*, o_j*, r_j*) is the current determinant of TC_S.
Rule 3: if the Service agent requires a value of (c_j, p_j, o_j, r_j) that would increase TC_S, the Service agent should inform the service provider to acquire permission.

Preference relaxation is useful for increasing the rate of service utilization while preserving the user's privacy concerns. If the service provider requires a data element that the user does not want to provide, the transaction would be closed. However, if the user is trustworthy, then the service provider may make a concession to the user's privacy concerns, either by withdrawing the original data request or by suggesting a different piece of information about which the user has less of a privacy concern.
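The user-side relaxation of STEP 6 can be read as an ordered fallback: first offer privacy-free profile data (Rule 1), then re-shuffle surveillance levels at constant TC_U (Rule 2), and only ask the user when TC_U must grow (Rule 3). The sketch below captures that control flow; all names are illustrative assumptions, not the paper's code.

```python
# Hedged sketch of the STEP 6 preference-relaxation loop on the user
# side; each rule is modeled as a callable that reports whether the
# provider's preference became acceptable after applying it.

def relax(provider_pref_ok, offer_privacy_free, reshuffle_levels, ask_user):
    if provider_pref_ok():
        return "agreed"                # acceptable as-is: stop
    for rule in (offer_privacy_free,   # Rule 1: prove trustworthiness
                 reshuffle_levels,     # Rule 2: keep TC_U fixed, vary s_i (i != M)
                 ask_user):            # Rule 3: raising TC_U needs permission
        if rule():
            return "agreed"
    return "no agreement"              # transaction is closed

# A run in which only Rule 2 (re-shuffling at constant TC_U) succeeds:
assert relax(lambda: False, lambda: False, lambda: True,
             lambda: False) == "agreed"
```

A symmetric loop, varying (c_j, p_j, o_j, r_j) at constant TC_S, would model the service-agent rules.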

4. Experimental evaluation

4.1. Setting of the experiment

The aim of this experiment is to investigate how a P4P-aware pervasive computing service adopting the proposed negotiation strategy provides value that varies by user in terms of service utilization. To illustrate the technical and operational viability of the proposed negotiation mechanism, an experimental evaluation was carefully conducted before introducing the P4P-based pervasive computing service.

We developed a prototype system, a P4P-based recommendation system, running on an IBM Pentium server with one 1.8 GHz processor, 1 GB of RAM, and a 120 GB hard disk. The negotiation mechanism was programmed in Java SDK 1.4.x running on Microsoft Windows XP Professional Edition. The experiments were conducted with PDAs, which can access the prototype system via a wireless Internet connection.

4.2. Experimental design

Each experimental design has 19 participants as subjects: nine buyers and ten sellers. Each seller plays the role of one of the actual shops: W. Bank, H. Bank, F. Café, S. Bucks Café, Post Office, S. Café, S. Office, Kiosk #1, Kiosk #2, and Kiosk #3. In all of the experiments reported here, the subject pool consisted of graduate students in the technology management or international management department at Kyung Hee University. Of the 19 participants, only one is a first-year graduate student; the others are in their second year or later. The participants were well acquainted with the service, regardless of their demographic characteristics. Moreover, the participants were quite skillful in using the mobile devices (PDAs), so usable data was successfully collected from them. Each participant received a set of instructions prior to the experiment and proceeded through the instructions at her own pace. Upon completion of the instructions, a PDA was provided to each buyer. To eliminate unanticipated bias involving PDA usage, instructions were given on how to use the PDA. After this careful explanation, the users were asked to fill in their contextual preferences for the use of personal information in the context of customer relationship management, including one-to-one marketing such as promotions or events. Four data elements are considered in the negotiation: user ID, user profile, phone number, and home address. The P4P preference rating screen is shown in Fig. 3. The user can, of course, modify the ratings at any time.

Reputation is categorized as high, moderate, and low as context. To determine the optimal level of surveillance for each data element, the values α1, β1, α2, and β2 are set by the user. Since it is difficult for the user to choose numerically appropriate values, we provided them with simple questions and a five-point Likert scale. If the buyer does not explicitly rate her preference, the values are set to the default.

The sellers are asked to fill in the reputation value for each buyer. To help them rate the reputation value, a sort of anonymous personal profile is provided, including age and income.

The buyer then runs the P4P-based service while walking around the shops with the PDA. When the user approaches a shop, he or she may get a recommendation from the corresponding seller. Fig. 4 shows the results of the P4P suggestion about which data elements are requested to make further recommendations. If the suggestion is acceptable, the user is asked to select ‘like’ and proceed. Otherwise, she would select ‘dislike.’ As shown in Fig. 4(a) and (b), the same buyer may receive different suggestions from different sellers. Woori Bank requests only the user's address to make a P4P-aware recommendation, while F. Café requires all data elements. These differences occur mainly because of varying recognition of the user's reputation value and differing negotiation strategies. Source credibility theory similarly supports the reputation mechanisms that could be applied [16]. For the experiment, three strategies are randomly selected and applied: suggestion without negotiation (NO_NEGO), suggestion with negotiation (NEGO), and suggestion with negotiation using the reputation value as context (NEGO_REPUT). During the experiment, to eliminate any bias affecting evaluation of the final result, buyers are not informed of which strategy is applied. When buyers select ‘like’ or ‘dislike’, the evaluation results are stored in the database for further performance evaluation.

4.3. Result 1: the effect of negotiation mechanism on service utilization

In this paper, two underlying theories that explain the market efficiency of web-based services are described: transaction cost theory and agency theory. Success rate is one of the most widely acknowledged metrics for evaluating transaction processing performance. Increasing the transaction success rate increases market efficiency by reducing overall transaction costs. Traditionally, a service agent in a multi-agent system attempts to increase the transaction success rate by decreasing the selling price, which does not guarantee the profitability of the service provider. However, since our P4P-based negotiation increases the rate without also increasing

Fig. 3. Screenshot of P4P Rating.

218 O. Kwon / Decision Support Systems 50 (2010) 213–221

the price level, the proposed mechanism is more profitable. The success rate in reaching a deal (sr) is described as follows:

sr = (Σ_{i=1}^{N} si) / N    (6)

where si = 1 if the ith pair of User agent and Service agent has eventually agreed to transact, and 0 otherwise; N indicates the total number of pairs.
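Eq. (6) is simply the fraction of agent pairs that reached a deal; restated as a small sketch:

```python
def success_rate(outcomes):
    """Eq. (6): outcomes[i] = 1 if the i-th pair of User agent and
    Service agent agreed to transact, 0 otherwise."""
    return sum(outcomes) / len(outcomes)

# e.g., three successful deals out of five agent pairs
print(success_rate([1, 0, 1, 1, 0]))  # 0.6
```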

One of the major issues regarding privacy-preserving systems is to increase service utilization while addressing users' privacy concerns. That users cannot selectively submit data items to be reported is mainly a product of the current limitations of Web-based systems. Even in P3P-based privacy-aware systems, only the service provider determines P3P content. We hence need to examine whether our negotiation mechanism is significantly effective in increasing market efficiency in terms of success rate (Hypothesis 1).

Hypothesis 1. A P3P-based privacy-preserving system with the proposed negotiation mechanism (NEGO) will outperform that without a negotiation mechanism (NO_NEGO) in terms of the mean success rate.

The cost of surveillance and the cost of low service quality are then determined, to automatically derive the total cost and the optimal level of surveillance for each data element. If all of the optimal solutions are implemented, the final cost is determined by selecting the maximum total cost among the data elements. Using a threshold value set by the seller, a final decision is made about whether the privacy policy is accepted.
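The accept/reject step described above can be sketched as follows; the function name, the example costs, and the threshold value are all hypothetical, and only the decision rule (final cost = maximum per-element total cost, compared against a threshold) comes from the text:

```python
def accept_policy(total_costs, threshold):
    """Decide whether the privacy policy is accepted.

    total_costs maps each requested data element to its optimal total
    cost (cost of surveillance + cost of low service quality); the
    final cost is the maximum over the data elements.
    """
    final_cost = max(total_costs.values())
    return final_cost <= threshold

costs = {"user_id": 0.2, "phone_number": 0.8, "home_address": 0.5}
print(accept_policy(costs, threshold=0.6))
# False: phone_number's total cost (0.8) exceeds the 0.6 threshold
```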

If the user does not accept the policy, then the negotiation system informs the service provider, inquiring whether the service provider is willing to relax the policy. To make this decision, the user's reputation value is considered. For an actual e-commerce system, user reputation data could be estimated partially using the data provided by financial agencies, and partially using the user profile stored in the personal ontology. However, determining the estimation function is a financial engineering or trust issue and is beyond the scope of the current research. Reputation evaluation methods such as the EigenTrust mechanism, Dempster–Shafer theory, and the Behavior Characteristics-based reputation evaluation method could be adopted in future work [25,36].

In our laboratory experiment, the reputation value was computed using the Behavior Characteristics-based reputation evaluation method: the value of reputation is derived from the user's past behavior history with our prototype system. In this method, user behavior is categorized into several types: basically stable, steadily increasing, steadily decreasing, etc. The acceptable degree of relaxation is determined using the user's reputation value: relaxing the purpose or retention, or even dropping the data element from the list. The temporarily updated P4P is then resent to the negotiator for the next round. To simplify our experiment, only two rounds were allowed for relaxation. If more relaxations were allowed, the success rate would be greater than that seen in this experiment.
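One way to categorize a behavior history into the types named above is to compare the early and late halves of the history; this is only a hedged sketch of the idea, not the actual method of the Behavior Characteristics-based evaluation, and the window logic and tolerance `eps` are assumptions:

```python
def behavior_trend(history, eps=0.05):
    """Classify a user's past behavior history (a list of outcome scores)
    by comparing the mean of its early half with that of its late half."""
    half = len(history) // 2
    early = sum(history[:half]) / half
    late = sum(history[half:]) / (len(history) - half)
    if late - early > eps:
        return "steadily increasing"
    if early - late > eps:
        return "steadily decreasing"
    return "basically stable"

print(behavior_trend([0.4, 0.5, 0.6, 0.7, 0.8, 0.9]))  # steadily increasing
```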

To test Hypothesis 1, we adopted the paired samples t-test to compare the mean success rate of the two mechanisms: NO_NEGO and NEGO. Based on the results listed in Table 2, we conclude that the negotiation mechanism significantly increases the success rate: the null hypothesis is rejected at the 1% significance level. The negotiation mechanism positively affects the usage of privacy-aware services.
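As a sanity check, the reported t-statistic in Table 2 can be recomputed from its own summary figures; the sample size n is not stated directly and is inferred here from the reported standard error, (sd/se)² ≈ 210:

```python
import math

# Paired-difference summary statistics reported in Table 2 (NO_NEGO - NEGO)
mean_diff = -0.41905
sd = 0.49458
n = 210  # inferred from se = sd / sqrt(n) = 0.03413

t = mean_diff / (sd / math.sqrt(n))
print(round(t, 3))  # -12.278, matching the reported t-value
```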

Increasing the success rate in an automated manner, as shown in Table 2, implies that the benefit of the proposed negotiation mechanism can also be explained by agency theory: the higher the success rate between user agent and service agent, the higher the efficiency of hiring computational agents. This decreases agency costs via automated transactions.

4.4. Result 2: the effect of reputation awareness on service utilization

We conducted another experiment to examine the extent to which considering user reputation influences the performance of the negotiation mechanism. A client with higher socio-economic status and credit in fact has a higher reputation, which in turn affects the usage of web-based services. Research also shows that reputation is a determinant of trust, which significantly affects e-transactions. Reputation is key for discerning with whom the client or supplier should transact.

Table 2
Results of the statistical test—paired samples t-test for Hypothesis 1.

Performance measure: mean rate of service utilization (success rate)
  Without negotiation mechanism (NO_NEGO): 23.33%
  With negotiation mechanism (NEGO): 65.24%

Paired differences (NO_NEGO–NEGO):
  Mean: −.41905; Std. deviation: .49458; Std. error mean: .03413; t: −12.278; Significance (2-tailed): 0.000***

*** p<0.01.

Fig. 4. Sample recommendation results: (a) Woori Bank; (b) F. Café.


Meanwhile, transactions have two kinds of risks: first, losing trustworthy clients by raising their privacy concerns and making them feel uncomfortable about being asked to reveal too much personal data; second, taking on risk by not requesting sufficient personal data from a client. One of the best ways to minimize these two risks, and hence increase transaction success, is to give the client some flexibility and choice—allow him or her the freedom not to disclose personal data, according to the level of reputation. The transaction decision must take reputation level into consideration: requesting more personal data from clients who have a lower reputation, and allowing clients who have a higher reputation to withhold some personal data. The proposed negotiation method includes this task.

Consequently, to examine whether the proposed negotiation method behaves intelligently according to the client's reputation level, the success rate was compared between negotiation protocols with and without consideration of reputation (Hypothesis 2):

Hypothesis 2. The P4P-based privacy-preserving system with a negotiation mechanism that considers reputation as context (NEGO_REPUT) will outperform the system that does not consider reputation (NEGO), in terms of the success rate.

We again adopted the paired samples t-test to compare the success rates of the two mechanisms: NEGO and NEGO_REPUT. Table 3 shows the results: the rate of service utilization is considerably higher when reputation is not considered. The results indicate that Hypothesis 2 seems to be rejected.

In this case, however, a higher success rate does not necessarily indicate superior performance, simply because users with a lower reputation are less likely to provide value to the service provider. Such users may even abuse the service provider's information acquired during use of the service. The service provider thus may want to grant the use of its services and information selectively, considering user reputation as a factor: a user with a higher reputation should be more than welcome, while a user with a poorer reputation may be less welcome. To determine whether simple or reputation-aware negotiations exhibit better performance, we conducted a regression analysis to examine to what extent reputation level could be a determinant. Fig. 5 shows that the success rate of the negotiation mechanism that considers user reputation is more nearly proportional to the reputation level than that of the other mechanisms. The F-values of each case are compared in Table 4. We conclude from that table that a negotiation that considers reputation is smarter than one that does not: the mechanism can selectively attract desirable users while avoiding users who have a lower reputation, thereby decreasing service risk.
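The regression comparison above can be illustrated with a minimal one-predictor OLS and the F-statistic MSR/MSE; the data below are made up for demonstration and only the procedure (regressing deal outcomes on reputation level and comparing F-values) mirrors the analysis:

```python
def ols_f(x, y):
    """F-statistic of a simple linear regression of y on x (1 regression
    d.o.f., n - 2 residual d.o.f.)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)
    alpha = my - beta * mx
    pred = [alpha + beta * xi for xi in x]
    msr = sum((p - my) ** 2 for p in pred)            # regression SS, 1 d.o.f.
    mse = sum((yi - p) ** 2 for yi, p in zip(y, pred)) / (n - 2)
    return msr / mse

# hypothetical reputation levels and deal outcomes (1 = deal reached)
x = [1, 1, 2, 2, 3, 3]
y = [0, 0, 0, 1, 1, 1]
print(ols_f(x, y))  # 8.0: outcomes tracking reputation yield a large F
```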

5. Conclusion

Pervasive P3P-based negotiation mechanisms are developed and evaluated, with the goal of increasing the possibility of a service match

Table 3
Results of the statistical test—paired samples t-test for Hypothesis 2.

Performance measure: mean rate of service utilization (success rate)
  Negotiation mechanism without considering reputation (NEGO): 65.24%
  Negotiation mechanism considering reputation (NEGO_REPUT): 60.95%

Paired differences (NEGO–NEGO_REPUT):
  Mean: .04286; Std. deviation: .50174; Std. error mean: .03462; t: 1.238; Significance (2-tailed): 0.217

Fig. 5. Comparison of the success rates by changing reputation level.

Table 4
Results of statistical test—regression analysis.

Mechanism                                               d.o.f   MSR      MSE    F-value   R-square
Without negotiation mechanism                           209     .333     .179   1.859     0.009
Negotiation mechanism without considering reputation    209     .412     .227   1.817     0.009
Negotiation mechanism considering reputation            209     10.832   .188   57.553*   0.217

* p<0.1.


by decreasing the gap between the information that service providers require and the personal information the user is willing to share. Since P4P assumes a nomadic user and a pervasive service running in any space, the specification includes contextual data elements, so that the service may keep track of the user's data, which may be dynamically changing. Negotiating, on the basis of these dynamically changing privacy preferences, what information should be passed is the core idea addressed in this paper.

Areas of future research include full-fledged development of P4P-based privacy-aware pervasive systems, provision of complete P4P categories, purposes, and retentions, and evaluation of the negotiation mechanism to optimize performance. Meanwhile, since we adopted a hedonic task, shopping, in this paper, the outcome of the case study can only be generalized to hedonic tasks. An expanded study to overcome this generalization issue remains future research. We currently plan to extend the presented mechanism so that it is available in legacy privacy-preserving systems.

Acknowledgement

This research is supported by the Ubiquitous Computing and Network (UCN) Project, Knowledge and Economy Frontier R&D Program of the Ministry of Knowledge Economy (MKE) in Korea, and is a result of subproject UCN 10C2-T2-11T.

References

[1] G.D. Abowd, Social disclosure of place: from location technology to communication practices, Lecture Notes in Computer Science 3468 (2005) 134–151.
[2] M.S. Ackerman, Privacy in pervasive environments: next generation labeling protocols, Personal and Ubiquitous Computing 8 (6) (2004) 430–439.
[3] A. Acquisti, Protecting privacy with economics: economic incentives for preventive technologies in ubiquitous computing environments, Workshop on Socially-Informed Design of Privacy-enhancing Solutions in Ubiquitous Computing, Proceedings of the UbiComp 2002, Göteborg, Sweden, 2002.
[4] C. Adams, V. Katos, Privacy challenges for location aware technologies, IFIP International Federation for Information Processing 191 (2005) 303–310.
[5] R. Agrawal, J. Kiernan, R. Srikant, Y.R. Xu, XPref: a preference language for P3P, Computer Networks 48 (5) (2005) 809–827.
[6] G.S. Becker, G.N. Becker, The Economics of Life, McGraw-Hill, 1997.
[7] B. Berendt, O. Günther, S. Spiekermann, Privacy in e-commerce: stated preferences vs. actual behavior, Communications of the ACM 48 (4) (2005) 101–106.
[8] F.V. Cespedes, H.J. Smith, Database marketing—new rules for policy and practice, Sloan Management Review 34 (4) (1993) 7–22.
[9] L.F. Cranor, M. Langheinrich, M. Marchiori, J. Reagle, The Platform for Privacy Preferences 1.0 (P3P1.0) Specification, W3C Recommendation, April 2002, HTML version at www.w3.org/TR/P3P/.
[10] L.F. Cranor, P. Resnick, Protocols for automated negotiations with buyer anonymity and seller reputations, Netnomics 2 (1) (2000) 1–23.
[11] L.F. Cranor, Web Privacy with P3P, O'Reilly, 2002.
[12] M.J. Culnan, How did they get my name—an exploratory investigation of consumer attitudes toward secondary information use, MIS Quarterly 17 (3) (1993) 341–361.
[13] A.K. Dey, Context-aware computing: the CyberDesk project, AAAI 1998 Spring Symposium on Intelligent Environments, Technical Report SS-98-02, 1998, pp. 51–54.
[14] M. Duckham, L. Kulik, A formal model of obfuscation and negotiation for location privacy, Pervasive 2005, Munich, Germany, 2005, pp. 152–170.
[15] J.B. Earp, A.I. Anton, L. Aiman-Smith, W.H. Stufflebeam, Examining internet privacy policies within the context of user privacy values, IEEE Transactions on Engineering Management 52 (2) (2005) 227–237.
[16] M.A. Ekstrom, H.C. Bjornsson, C.I. Nass, A reputation mechanism for business-to-business electronic commerce that accounts for rater credibility, Journal of Organizational Computing and Electronic Commerce 15 (1) (2005) 1–18.
[17] J. Goecks, E. Mynatt, Enabling privacy management in ubiquitous computing environments through trust and reputation systems, Proceedings of CSCW 2002, New Orleans, LA, 2002.
[18] D. Hong, M. Yuan, V.Y. Shen, Dynamic privacy management: a plug-in service for the middleware in pervasive computing, Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, Salzburg, Austria, 2005, pp. 1–8.
[19] X. Jiang, J.I. Hong, J.A. Landay, Approximate information flows: socially-based modeling of privacy in ubiquitous computing, The Fourth International Conference on Ubiquitous Computing, Göteborg, Sweden, 2002.
[20] X. Jiang, J. Landay, Modeling privacy control in context-aware systems, IEEE Pervasive Computing 1 (3) (2002) 59–63.
[21] A. Josang, R. Ismail, C. Boyd, A survey of trust and reputation systems for online service provision, Decision Support Systems 43 (2) (2007) 618–644.
[22] D.N. Jutla, P. Bodorik, Y.J. Zhang, PeCAN: an architecture for users' privacy-aware electronic commerce contexts on the semantic web, Information Systems 31 (4–5) (2006) 295–320.
[23] D.S. Kamvar, M.T. Schlosser, H. Garcia-Molina, The EigenTrust algorithm for reputation management in P2P networks, Twelfth International World Wide Web Conference, Budapest, Hungary, 2003.
[24] D.J. Kim, D.L. Ferrin, H.R. Rao, A trust-based consumer decision-making model in electronic commerce: the role of trust, perceived risk, and their antecedents, Decision Support Systems 44 (2) (2008) 544–564.
[25] O. Kwon, N. Sadeh, Applying case-based reasoning and multi-agent intelligent system to context-aware comparative shopping, Decision Support Systems 37 (2) (2004) 199–213.
[26] O. Kwon, The potential roles of context-aware computing technology in optimization-based intelligent decision-making, Expert Systems with Applications 31 (3) (2005) 629–642.
[27] O. Kwon, Multi-agent system approach to context-aware coordinated web services under general market mechanism, Decision Support Systems 41 (2) (2006) 380–399.
[28] S. Lahlou, F. Jegou, European disappearing computer privacy design guidelines V1.0, Ambient Agoras, Report D15.4, Disappearing Computer Initiative, Oct. 2003.
[29] S. Lahlou, M. Langheinrich, C. Röcker, Privacy and trust issues with invisible computers, Communications of the ACM 48 (3) (2005) 59–60.
[30] M. Langheinrich, Privacy by design—principles of privacy-aware ubiquitous systems, Proceedings of the Ubicomp 2001, Atlanta, GA, 2001, pp. 273–291.
[31] M. Langheinrich, A privacy awareness system for ubiquitous computing environments, Proceedings of the Ubicomp 2002, 2002, pp. 237–245.
[32] A.P. Meyer, Privacy-aware mobile agent: protecting privacy in open systems by modelling social behaviour of software agents, ESAW 2003, London, UK, 2003, pp. 123–135.
[33] G.R. Milne, M.E. Gordon, Direct mail privacy-efficiency trade-offs within an implied social-contract framework, Journal of Public Policy & Marketing 12 (2) (1993) 206–215.
[34] C. Neustaedter, S. Greenberg, The design of a context-aware home media space for balancing privacy and awareness, Lecture Notes in Computer Science 2864 (2003) 297–314.
[35] P. Persiano, I. Visconti, An anonymous credential system and a privacy-aware PKI, Lecture Notes in Computer Science 2727 (2003) 27–38.
[36] B. Price, K. Adam, B. Nuseibeh, Keeping ubiquitous computing to yourself: a practical model for user control of privacy, International Journal of Human Computer Studies 63 (1–2) (2005) 228–253.
[37] X. Qu, X. Yang, Y. Tang, H. Zhou, A behavior characteristics-based reputation evaluation method for grid entities, Lecture Notes in Computer Science 3470 (2005) 567–577.
[38] D. Redell, Information technology and the privacy of the individual, Draft ACM Whitepaper on Computer and Privacy, September 1992.
[39] P. Resnick, R. Zeckhauser, E. Friedman, K. Kuwabara, Reputation systems, Communications of the ACM 43 (12) (2000) 45–48.
[40] D. Saha, A. Mukherjee, Pervasive computing: a paradigm for the 21st century, IEEE Computer 36 (3) (2003) 25–31.
[41] H. Simon, Bounded rationality and organizational learning, Organization Science 2 (1) (1991) 125–134.
[42] H.J. Smith, S.J. Milburg, S.J. Burke, Information privacy: measuring individuals' concerns about organizational practices, MIS Quarterly 20 (2) (1996) 167–196.
[43] L. Sweeney, Privacy-preserving surveillance using databases from daily life, IEEE Intelligent Systems 20 (5) (2005) 83.
[44] K. Tang, Y.L. Chen, H.W. Hu, Context-based market basket analysis in a multiple-store environment, Decision Support Systems 45 (1) (2008) 150–163.
[45] The Feedback Forum, eBay, http://pages.ebay.com/services/forum/feedback.html.
[46] K. Valck, G.H. van Bruggen, B. Wierenga, Virtual communities: a marketing perspective, Decision Support Systems 47 (3) (2009) 185–203.
[47] W3C, http://www.w3.org/TandS/QL/QL98/pp/APPEL-QLW.html, 1998.
[48] H. Wang, M.K.O. Lee, C. Wang, Consumer privacy concerns about Internet marketing, Communications of the ACM 41 (3) (1998) 63–70.
[49] M. Weiser, Some computer science issues in ubiquitous computing, Communications of the ACM 36 (7) (1993) 75–84.
[50] J. Yen, R. Popp, G. Cybenko, K.A. Taipale, L. Sweeney, P. Rosenzweig, Homeland security, IEEE Intelligent Systems 20 (5) (2005) 76–86.

Dr. Ohbyung Kwon is presently a professor at the School of Management, Kyung Hee University, South Korea, where he initially joined in 2004. In 2002, he worked at the Institute of Software Research International (ISRI) at Carnegie Mellon University. He received MS and PhD degrees at KAIST in 1990 and 1995, respectively. He is now an adjunct professor at San Diego State University (SDSU). His current research interests include context-aware services, case-based reasoning and DSS. He has presented various papers in leading information system journals including Decision Support Systems, Simulation, International Journal of Computer Integrated Manufacturing, and Behavior and Information Technology.