
CONCEPTUAL/THEORETICAL PAPER

Learning from the Dark Web: leveraging conversational agents in the era of hyper-privacy to enhance marketing

Felipe Thomaz 1 & Carolina Salge 2 & Elena Karahanna 3 & John Hulland 3

© The Author(s) 2019

Abstract
The Web is a constantly evolving, complex system, with important implications for both marketers and consumers. In this paper, we contend that over the next five to ten years society will see a shift in the nature of the Web, as consumers, firms, and regulators become increasingly concerned about privacy. In particular, we predict that, as a result of this privacy focus, various information sharing and protection practices currently found on the Dark Web will be increasingly adapted in the overall Web, and in the process, firms will lose much of their ability to fuel a modern marketing machinery that relies on abundant, rich, and timely consumer data. In this type of controlled information-sharing environment, we foresee the emergence of two distinct types of consumers: (1) those generally willing to share their information with marketers (Buffs), and (2) those who generally deny access to their personal information (Ghosts). We argue that one way marketers can navigate this new environment is by effectively designing and deploying conversational agents (CAs), often referred to as “chatbots.” In particular, we propose that CAs may be used to understand and engage both types of consumers, while providing personalization, and serving both as a form of differentiation and as an important strategic asset for the firm—one capable of eliciting self-disclosure of otherwise private consumer information.

Keywords Web . Dark Web . Consumer privacy . Marketing strategy . Chatbots . Conversational agents . Personalization . Anthropomorphism

Introduction

Individuals talking about the Web often refer to it as a static entity. In reality, however, the Web is a constantly evolving, complex system, with important implications for both firms and consumers. In this paper, we begin by reviewing the Web’s evolution over time, noting that these changes are driven by shifts in the relative market power dynamic that plays out between firms and consumers, based primarily on the issue of which party controls or owns information. We build on this review to suggest how the Web may potentially change in the next five to ten years. Our contention is that society will see a shift in the nature of the Web, as various stakeholders become increasingly concerned about privacy issues, away from a largely automatic “opt-in” culture (wherein consumers typically allow firms to collect, use, and share their personal information with other organizations) to one characterized as substantially “opt-out.” This shift will have profound implications for the practice of marketing.

In particular, we predict that various information sharing and protection practices currently found only on the Dark Web will be increasingly adapted in the overall Web, resulting in a hyper-private, more adversarial environment. In the process of this transformation, firms will lose the ability to create in-depth profiles of consumers, leading to eroded customer

Mark Houston served as accepting Editor for this article.

* Felipe Thomaz [email protected]

Carolina Salge [email protected]

Elena Karahanna [email protected]

John Hulland [email protected]

1 Saïd Business School, University of Oxford, Oxford OX1 1HP, UK
2 School of Business, Wake Forest University, Winston-Salem, NC 27106, USA
3 Terry College of Business, University of Georgia, Athens, GA 30601, USA

Journal of the Academy of Marketing Science (2020) 48:43–63
https://doi.org/10.1007/s11747-019-00704-3

Published online: 11 November 2019

knowledge and the potential end of existing micro-targeting practices. Marketers will need to be nimble in order to survive this coming change. From the consumer perspective, this shift is likely to increase the costs of information search, and require new, more expensive, and more complex technological investments.

In this type of controlled information-sharing environment, we foresee the emergence of two broad types of consumers, with very different digital data footprints: (1) those consumers who are willing to give permission to firms to track, record, use, and share consequential information (e.g., purchase and site visit histories), rendering their digital essence “naked” to all, and (2) those who deny access to such information and thereby become digital “ghosts.” The first group of consumers (Buffs) will be similar to today’s digital consumers, whereas the second group (Ghosts) will be quite different.1

Operating in such an environment will create problems for businesses familiar with current Web 2.0-based tools, designed to optimize marketing to the first group (Buffs). So how should firms respond? We argue that conversational agents (CAs)—often referred to as “chatbots”—will play an increasingly important role in helping firms market to both groups. We are already seeing signs of firms using artificial intelligence (AI) and machine learning techniques to provide value by replacing humans in social systems where the occupation or position is either “undesirable” or costly, and by augmenting and complementing humans in social systems where the task or job is either tedious or repetitive. For example, a growing number of chatbot service agents are replacing call center employees, but also freeing human claims processors to focus on resolving more complex cases (Juniper Research, “AI in Retail,” April 2019). Whereas all CAs will be “intelligent,” our prediction is that the underlying machine learning mechanisms required to engage the two consumer groups will be different, dictated by the data that they are willing to reveal. CAs interacting with Buff consumers will have supervised learning enabled, and greater personalization will be possible. In contrast, Ghost consumers will receive mass-personalized CA assistance, where aggregated data enabled by unsupervised learning algorithms will provide lower value for both the consumers and the firm. Furthermore, CAs may help firms elicit additional consumer information by nudging Ghost consumers to increase self-disclosure via the promise of more personalized and valuable CA interactions and through evoking social responses through anthropomorphic design.2
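The asymmetry between the two learning regimes can be sketched in a few lines of code. This is an illustrative toy, not a production CA: the function names, the use of a frequency count as the “supervised” signal for an identified Buff, and a hand-rolled k-means as the “unsupervised” segmenter for anonymous Ghosts are all our own simplifying assumptions.

```python
from collections import Counter
import math
import random

# --- Buffs: identified users with labeled, individual-level histories ---
def buff_recommendation(user_history):
    """Supervised-style personalization: the user's own consented history
    is the training signal. Here, trivially, recommend the category this
    specific user interacts with most."""
    if not user_history:
        return None
    return Counter(user_history).most_common(1)[0][0]

# --- Ghosts: anonymous sessions, so only unsupervised aggregation ---
def kmeans(points, k, iters=20, seed=0):
    """Tiny k-means: group anonymous session vectors into behavioral
    segments without any identity or labels."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

def ghost_recommendation(session_vector, centroids, segment_offers):
    """Mass personalization: the anonymous session is mapped to its nearest
    segment, and everyone in that segment gets the same offer."""
    nearest = min(range(len(centroids)),
                  key=lambda c: math.dist(session_vector, centroids[c]))
    return segment_offers[nearest]
```

The point of the contrast: the Buff path conditions on a specific person; the Ghost path can only condition on which anonymous cluster a session falls into, which is why the resulting personalization is coarser and less valuable to both sides.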

Evolution of the Web

Each step in the Web’s evolution (summarized in Table 1) has been accompanied by a significant initial shift in the balance of power between firms and consumers, and a similar level of concern by managers seeking to leverage the available technology to maximize firm value. These initial threats to firm success are identified in the third column of Table 1. For example, in the early days of Web 1.0, there was a fear that by disseminating information widely (particularly price information), firms would lose their ability to effectively price discriminate across markets, segments, and purchase occasions, essentially collapsing markets to a lowest common denominator (e.g., Burke 1997). There was also the fear of engaging in competitive warfare, forcing firms to quickly “race to the bottom,” given the perception that consumers would have full information about competitive offerings (e.g., Peterson, Balasubramanian, & Bronnenberg 1997).

In a similar vein, the shift to Web 2.0 resulted in an increased ability for consumers to coordinate their activities, exchange information directly with one another, and further organize themselves into dedicated and vocal specific-interest communities. Version 2.0, in conjunction with the rise of social media, marked a substantial reduction in managerial control over the firm’s own messaging, as users created and disseminated their own content, with value-relevance to the firm (e.g., Edvardsson, Tronvoll & Gruber 2011).

However, in each of these first two evolutionary stages, over time managers were able to identify important firm advantages (listed in column four of Table 1) that resulted in a relevelling of the balance of power between firms and consumers, in the form of increasing levels of data generated by consumers in these digital environments. Data-rich environments provide firms with greater amounts of consumer knowledge, and allow for less intrusive observation of actual behaviors and preferences. The resulting Digital, Social Media, and Mobile (DSMM) ecosystem has been considered as a source of intelligence (Lamberton & Stephen 2016) for observing, analyzing, and predicting behavior (Bucklin & Sismeiro 2003; Chatterjee, Hoffman, & Novak 2003; Montgomery et al. 2004). The advancements in this area have led to, among other developments, efficient behaviorally targeted ads (Lambrecht & Tucker 2013; Summers, Smith & Reczek 2016), intelligent product recommendation systems (Ghose

1 This segmentation into only two types of consumers is clearly somewhat simplistic. However, focusing on Buffs and Ghosts throughout the balance of this paper allows us to draw important distinctions between the two groups, and to discuss how marketers will need to adopt very different strategies for identifying and engaging each. In reality, of course, many consumers are likely to fall somewhere in between the two extremes that we describe in detail here.

2 In this paper, we use the term “nudge” to describe a process whereby bots have a perceived value that incentivizes consumers to share their data, rather than bots designed to extract data unwillingly. As we subsequently discuss, our perspective here is that marketers should not design CAs to “deliberately mislead users as to privacy features” (Leong & Selinger, 2019, p. 300).


et al. 2012), and morphing banner advertising (Urban et al., 2013), which are made possible by leveraging user profiling techniques (Trusov, Ma, & Jamal 2016).
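The user-profiling logic behind behaviorally targeted ads can be illustrated with a minimal sketch. The data shapes here (a clickstream of categorized page views, ads with a single target category) are hypothetical simplifications of what production systems do.

```python
from collections import Counter

def build_profile(clickstream):
    """Aggregate a user's categorized page views into interest weights
    that sum to 1 -- the behavioral profile behind targeted ads."""
    counts = Counter(page["category"] for page in clickstream)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

def pick_ad(profile, ads):
    """Behavioral targeting: serve the ad whose target category has the
    highest weight in this user's profile."""
    return max(ads, key=lambda ad: profile.get(ad["target"], 0.0))
```

Everything downstream of `build_profile` presumes the firm can tie the clickstream to a persistent identity; that presumption is exactly what the hyper-private Web threatens.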

However, in the current evolution to Web 3.0, marketers face the very real possibility of losing these advantages if they no longer have the ability to identify consumers and connect them to their previous behaviors (e.g., Deighton 1997). Customer privacy is central to this potential problem (Martin & Murphy 2017; Stewart 2017).

Over the past two decades, accompanying the growth of the market on personal identity and behavior data described above, consumers, firms, and governments have increasingly engaged in discussions of the nature of ownership of this data, as well as potential mechanisms and controls to protect the different needs of these parties. For example, consumers increasingly use AdBlock-like technologies to avoid advertising, cookies, and trackers online. Firms use this concern as additional points of differentiation (e.g., Apple refusing a government request to unlock a customer’s iPhone), and governments look for new ways to restrict access to customer data (e.g., the European Union’s General Data Protection Regulation (GDPR), California’s AB 375 bill).
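The core mechanism of AdBlock-style tracker avoidance is a blocklist match against the host of each outgoing request. A minimal sketch follows; the blocklist domains are made up, and real filter lists such as EasyList are far richer, also matching URL paths and resource types.

```python
from urllib.parse import urlparse

# Hypothetical tracker domains for illustration only.
BLOCKLIST = {"tracker.example", "ads.example"}

def is_blocked(url, blocklist=BLOCKLIST):
    """Block the request if its host is a listed domain or any subdomain
    of one -- the essential matching step in AdBlock-style filtering."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    return any(".".join(parts[i:]) in blocklist for i in range(len(parts)))
```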

Altogether, this shift to private-as-default behavior marks the emergence of a new Web 3.0 environment, wherein novel technologies and evolving consumer behaviors combine to create a new set of challenges and opportunities for marketers. Fortunately, there is a part of the Internet that we can examine to understand and learn more about this hyper-private future: the Dark Web.

What marketers can learn from the Dark Web

The Web is divided into three sub-components: the Surface Web (e.g., Clearnet), the Deep Web, and the Dark Web. The Surface Web is the component most people are immediately familiar with, as it incorporates all of the websites indexed by search engines, and represents all of the sites that a person can reasonably and easily navigate to. The Deep Web contains information that lies behind some sort of barrier (typically in the form of passwords) that inhibits easy, unapproved access. For example, individuals’ private bank account information resides in the Deep Web, secure behind a barrier, well away from any random access requests. Finally, the Dark Web (somewhat of a misnomer, in that it is not very web-like) is made up of non-indexed and disconnected websites that require specialized software (e.g., The Onion Router, usually referred to as TOR), as well as specific knowledge and authorization (i.e., a given URL or onion address) to gain access.
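The “specific knowledge” requirement follows from how onion addresses are constructed: a v3 address is derived from the service’s own public key, so there is nothing to index or look up. A sketch of the derivation, following our reading of Tor’s v3 rendezvous specification:

```python
import base64
import hashlib

def onion_v3_address(pubkey: bytes) -> str:
    """Derive a v3 .onion address from a 32-byte ed25519 public key.
    The address *is* the key (plus a checksum and a version byte), so a
    hidden service cannot be discovered by crawling or directory lookup:
    you must already know its address."""
    if len(pubkey) != 32:
        raise ValueError("expected a 32-byte ed25519 public key")
    version = b"\x03"
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    return base64.b32encode(pubkey + checksum + version).decode().lower() + ".onion"
```

The 35 encoded bytes always yield a 56-character base32 string, which is why v3 onion addresses all have the same length.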

The Dark Web has gained recent popularity (and notoriety) in the press because of revelations about hidden criminal activity and black markets (such as the original Silk Road marketplace), whistleblowing websites (like Wikileaks), and activist safe-havens (e.g., Arab Spring). However, the environment is not exactly new. For example, the TOR system has been around since 2002 (eleven years before the launch of Silk Road), and similar connecting environments even longer (e.g., Napster (started in 1998), USENET (1979)).

Table 1 Evolution of the Web, by stage

Web 1.0
Primary Focus: Content and information presentation; “Publication logic”
Threats to Firms: Consumers have full access to information and can look for the best deal
Advantages to Firms: Gain access to rich demographic, transactional, and search data
Representative Description: “Internet-related marketing can result in extreme price competition when products or services are incapable of significant differentiation. This can happen when they are perceived as commodities, partially because other factors that might moderate competition (e.g., store location) are absent and partially because of the relative efficiency of price searching engendered by the Internet.” Peterson, Balasubramanian & Bronnenberg (1997, p. 336)

Web 2.0
Primary Focus: Multiparty communication and information exchange; “Participation logic”
Threats to Firms: Consumers share content and opinions; firms have reduced control over messaging and branding
Advantages to Firms: Gain access to relational (network) data, can undertake social listening, ability to foster eWoM
Representative Description: “In the current Web 2.0 era, customers are not only passively using, but also actively creating and sharing, web content, and they thus not only co-create but also co-produce value and shape service as well as social systems. Organizations will have to adapt to—and ideally pro-actively influence—this new social reality if they want to be able to continue to understand and manage service exchanges and co-creation of value with their customers in the future.” Edvardsson, Tronvoll & Gruber (2011)

Web 3.0
Primary Focus: Privacy–personalization balance; “Privacy logic”
Threats to Firms: Consumers empowered to assert right to privacy; firms must find ways to recognize, understand, and engage consumers
Advantages to Firms: AI applications can provide customer value and incentivize data disclosure to trusted platforms
Representative Description: “Evidence available from the recent past [suggests] that a market in privacy will emerge, with anonymity available to those who value and can afford it…” (Deighton 1997, p. 250)

Research examining darknets and the Dark Web

Over the past quarter century, marketers have studied the impact of darknets—restricted and parallel or isolated networks (e.g., file transfer P2P, like Napster or BitTorrent)—on a number of categories and industries (e.g., music, film, software). This research has predominantly focused on the effects of P2P file sharing (and other piracy behaviors) on price sensitivity (Jain 2008; Sinha & Mandel 2008), substitution effects (Danaher et al. 2010), diffusion (Givon, Mahajan & Muller 1995), and control mechanisms/policy (Sinha, Machado, & Sellman 2010). Similarly, other fields have explored traditional marketing questions, like the profitability of vendors and residual value to consumers (Holt, Smirnova, & Chua 2016), while utilizing traditional marketing approaches to guide research in this environment (Li, Chen, & Nunamaker 2016; Benjamin, Valacich, & Chen 2019). (See Table 2 for a brief summary of these papers.) What is important to recognize is that these early (and often illegal) consumer practices have evolved over time from darknets to the Clearnet, and from services like Napster and Pirate Bay into iTunes, Netflix, and others.

In contrast, the Dark Web, as a massive World Wide Web-scaled “darknet” in terms of size and scope of activity, has not been featured in the marketing literature. In the Dark Web, one can observe privacy-seeking customer behaviors, as well as the genesis of encrypted and private marketplaces that contain mundane items (e.g., books, services), in line with the existing “darknet” research. However, the Dark Web also entails a much broader set of both licit (e.g., Facebook, The New York Times) and illicit (e.g., trade in drugs, weapons, and human trafficking) commercial activities, as well as consumer-consumer, consumer-firm, and consumer-technology interactions that extend beyond individual preferences or singular marketplaces and platforms. Researchers outside of marketing (e.g., in criminology, information systems, and public policy) have started to explore a variety of topics relating to the Dark Web. Table 3 provides a summary of some of these papers; they are included here because they touch on topics that are familiar to or that might normally be considered the domain of marketing scholars.

For instance, within criminology, researchers have examined the effect of specific news on illicit sales volumes, finding counterintuitive increases in Dark Web transactions (Ladegaard 2019). Other criminology scholars have examined the structure of illicit digital markets, and the associated efficiency and resiliency they exhibit (Bakken, Moeller & Sandberg 2018; Duxbury & Haynie 2019). In a related IS study, Yue, Wang, & Hui (2019) conducted a user-generated content analysis of Dark Web hacker communities, and found evidence connecting increases in user chatter to a lower frequency of cyber-attacks. Closer to home (for marketing researchers), policy scholars have focused on aspects of consumer well-being in terms of satisfaction and safety (Barratt,

Table 2 Research examining darknets with a marketing focus

Givon, Mahajan & Muller (1995)
DV/Focus: Sales and product diffusion
Finding: Models the diffusion of a product along an official channel and a shadow (illicit, darknet) path, showing that piracy can positively impact the official diffusion through word of mouth

Jain (2008)
DV/Focus: Price competition
Finding: Piracy can enable lower price competition as price-sensitive consumers move to the illicit (darknet) channel

Sinha & Mandel (2008)
DV/Focus: Willingness to pay/willingness to pirate
Finding: Improved functionality in legal channels significantly reduces willingness to pirate, while either increased likelihood of getting caught or severity of punishment can actually increase piracy in more risk-tolerant consumers

Sinha, Machado, & Sellman (2010)
DV/Focus: Digital Rights Management (DRM) as piracy deterrent
Finding: Removal of DRM has the potential to convert darknet consumers into paying consumers, while increasing the demand and willingness to pay for the product

Danaher et al. (2010)
DV/Focus: Sales and cannibalization
Finding: The temporary removal of a legal channel resulted in a disproportionate demand for pirated (darknet) content, in evidence of larger relative fixed/learning costs of piracy, but low marginal cost of additional consumption

Holt, Smirnova, & Chua (2016)
DV/Focus: Revenue of vendors/value to buyers
Finding: Data vendors profit from all transactions, whether the items being sold have value or not, earning an estimated $326,000 over the observation period. Buyers bear more risk, but profit from exploitation of data, earning as much as $2 million in return

Li, Chen, & Nunamaker (2016)
DV/Focus: Identification of key vendors
Finding: Utilization of social media analytics, sentiment scoring, and topic modeling to identify key vendors in data markets

Benjamin, Valacich, & Chen (2019)
DV/Focus: Darknet research framework
Finding: Development of a framework for research on darknets, including the identification of data sources, data collection strategies, data evaluation, and ethical concerns


Ferris, & Winstock 2016; Caudevilla et al. 2016; Van Buskirk et al. 2016). Despite these early efforts, however, little is known about consumer Dark Web behavior, and virtually no marketing implications have been explored.

The Dark Web as an unregulated, adversarial testbed

The Surface Internet has always lagged somewhat behind the Dark Web in terms of the available technology it utilizes. This is because the Dark Web functions as an unregulated testbed for new ideas and technologies, with its successes often later migrating to the Surface Web. In the Dark Web, unpolished user interfaces, unstable services, and higher levels of user involvement appear to be fairly standard.

Despite the illegal (and immoral) activities often associated with the Dark Web, it continues to embrace a libertarian, hacker ethos, especially in terms of its respect for experimentation and associated freedoms. In return, the beta-like environment of the Dark Web is often forgiving of errors; developments do not have to satisfy governmental (or other institutional) standards, and they are not beholden to other stakeholder concerns. As a result, trial-and-error is more rampant in the Dark Web. For example, whereas the Surface Web is currently grappling with the high volatility in the value of Bitcoin (the most familiar example of cryptocurrencies), the Dark Web trades in a variety of different cryptocurrencies, some of which are market-specific. Similarly, whereas the Surface Web is concerned with identifiable data leakage, misuse, and manipulation, the Dark Web operates with encrypted communications (PGP) and distributed signals to aid in anonymity (TOR).
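The marketing consequence of PGP-style encrypted communication is that an intermediary relaying messages sees only ciphertext. A deliberately insecure toy illustrates the principle: a Diffie-Hellman exchange over a small prime, with XOR standing in for a real cipher. Nothing here is safe for actual use; it only shows why the relay learns nothing the parties did not disclose.

```python
import secrets

# Toy parameters: a small prime modulus and XOR "encryption".
# Cryptographically worthless -- for illustration only.
P = 0xFFFFFFFB  # largest prime below 2**32
G = 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    # Both parties compute G**(a*b) mod P without ever transmitting it,
    # so a relay that sees only the public values cannot derive the key.
    return pow(their_pub, my_priv, P)

def xor_cipher(key: int, data: bytes) -> bytes:
    # XOR with a repeated key: encryption and decryption are the same op.
    stream = key.to_bytes(4, "big") * (len(data) // 4 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))
```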

Over time, we can expect that Surface Web consumers as well as digital/digitized consumption environments will become more similar to what is currently observed in the Dark Web, with commensurate impact on marketers and firms. What’s the basis for this claim? We already see some adoption of Dark Web technology and behaviors on the Surface. For example, in 2019 the Firefox web browser adopted an anti-fingerprinting measure to increase user privacy and circumvent advertising relying on tracking. This technology was originally developed for TOR, the Dark Web browser. Along similar lines, Google announced in 2019 anti-fingerprinting actions on their Chrome browser, as well as an entire set of open industry standards to safeguard user privacy for the entire web, dubbed the ‘Privacy Sandbox.’ Lastly, as perhaps the best-known example, the WhatsApp application allows for end-to-end encrypted communication for individuals and communities, using established cell and data networks to ensure user privacy. Thus, WhatsApp usage creates a Dark Web-like experience, wherein consumers employ hyper-private communications of content hidden even from the service provider. As a result, it is impossible for the provider to pass on information to third parties, the very type of information that has fueled many firms’ marketing successes in the Web 2.0 environment. Nor is WhatsApp alone in this space; encrypted messengers include Telegram and Signal, among others. New encrypted browsers that route all traffic through encrypted virtual private networks to mask user identity and location are also being introduced (e.g., Epic).
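Fingerprinting, and the anti-fingerprinting countermeasure, can both be sketched: a fingerprint is a hash over whatever attributes the browser exposes, and the defense is to perturb or generalize those attributes each session so the hash stops being a stable identifier. The attribute names and the noise mechanism below are hypothetical simplifications.

```python
import hashlib
import json
import random

def fingerprint(attrs: dict) -> str:
    """Hash the browser attributes a site can observe (user agent, screen
    size, installed fonts, ...). If the attributes are stable, the hash is
    a persistent cross-site identifier -- no cookie required."""
    canonical = json.dumps(attrs, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

def randomize(attrs: dict, rng: random.Random) -> dict:
    """Anti-fingerprinting: inject per-session noise (here a single
    hypothetical 'canvas_noise' attribute) so the hash changes every
    session and stops identifying the user."""
    noisy = dict(attrs)
    noisy["canvas_noise"] = rng.random()
    return noisy
```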

Table 3 Recent (non-marketing) Dark Web research

Criminology

Bakken, Moeller & Sandberg (2018)
DV/Focus: Market efficiency
Finding: Dark Web black markets are formally organized, with hierarchical structures, rules, and limitations. These markets reduce the costs and risks associated with traditional drug marketplaces and allow for more efficient transactions

Duxbury & Haynie (2019)
DV/Focus: Criminal network resilience
Finding: Brokers (diverse connectivity) within a criminal network represent better targets for removal in law enforcement interventions than high-connectivity individuals. Recovery time is conditional on market strategies (secrecy/efficiency)

Ladegaard (2019)
DV/Focus: Illicit trade volume
Finding: Media coverage of law enforcement activity on Dark Web marketplaces does not hinder trade, but rather increases the volume of transactions, including immediately following arrests and sentencing

Information Systems

Yue, Wang, & Hui (2019)
DV/Focus: Cybersecurity
Finding: Increasing the number of discussions in Dark Web hacker forums about denial-of-service (DDoS) attacks actually reduces the number of successful attacks executed

Public Policy

Barratt, Ferris, & Winstock (2016)
DV/Focus: Consumer safety
Finding: Dark Web marketplaces are associated with significantly fewer threats and less violence than alternative drug markets, even compared to relatively safe private/closed markets

Caudevilla et al. (2016)
DV/Focus: Drug purity
Finding: Hundreds of drug samples from the Dark Web were tested in a laboratory, finding that the samples contained the advertised substances, and the majority were of high purity

Van Buskirk et al. (2016)
DV/Focus: Consumer characteristics
Finding: Dark Web drug customers tended to be younger, involved in property crime, and engaging with a growing set of drugs


The primary aim of Dark Web innovation is to maintain complete user privacy (e.g., total anonymity). Users can utilize “anonymity-granting technologies” to protect their privacy from government agencies, political opponents, trolls, data-hungry organizations, and even Internet service providers (Jardine, 2018). In this adversarial environment, individuals view every other entity as a potential “enemy,” eager to acquire useful information and ready to deploy it against them. As a result, Dark Web participants use every possible measure, from technological to behavioral, to minimize (or eliminate) their digital footprints. The end result of this is that very little information is visible about any individuals operating in the Dark Web, unless they choose to disclose it. (Due to the high costs of exposure, particularly in illegal markets, such disclosures are quite rare.)

This focus on privacy enables Dark Web users to personally control (and thus limit) access to information about themselves (Altman 1975; Westin 1967). This view of privacy as selective control represents a common perspective on privacy that originated in Westin’s (1967) and Altman’s (1975) theories of general privacy. For example, Altman defined privacy as “the selective control of access to the self.” The control-based definition of privacy is broadly accepted and has been used as the foundation of most information privacy research (Smith et al. 2011). But the ability to completely limit access to the self in order to protect the self comes at a relatively high cost in terms of having to accept lower computer performance, slower internet browsing, and greater inconveniences. In fact, the desire to increase transaction efficiency while remaining anonymous drives much of the Dark Web innovation we described earlier.

The role of privacy in the emerging Dark Surface

The privacy-focused behaviors described above and enacted by Dark Web users represent a significant threat to marketing practices in the current Surface Web, which rely on easy capture of digitally-based consumer information from data-rich environments that facilitate precise targeting, re-targeting (abandoned baskets), behavioral advertising, lookalike modelling, etc. Furthermore, many firms currently benefit from sharing or selling this information to third parties. However, in a system where consumers have control over their information and act in ways to look like unknown new visitors (i.e., no part of their digital character is revealed, but instead protected), the value of firms’ existing data and models will be vastly diminished. For example, lacking the ability to connect users to previously collected information (e.g., click-stream data, which is fairly common today), firms will be forced to resort to “average consumer” profiles to predict consumer behavior (only updating when consumers are willing to disclose specific information about themselves). This data-impoverished environment will result in firms adopting more traditional mass-market approaches (with attendant lower profit potential, loss of efficiency, and eroded effectiveness).
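The fallback from individual profiles to “average consumer” profiles amounts to a consent gate in the prediction path. A minimal sketch, where the function name, fields, and data shapes are our own illustrative assumptions:

```python
def predict_spend(user_id, consent, profiles, population_mean):
    """Consent-gated prediction: an individual model applies only when the
    consumer has disclosed data (a Buff); otherwise fall back to the
    population average -- all a firm can say about a Ghost."""
    if consent.get(user_id) and user_id in profiles:
        return profiles[user_id]["expected_spend"]
    return population_mean
```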

If the Dark Web is indeed the unstable precursor of the future Surface Web, we can expect the Surface to go “dark,” and that browsing will become incredibly private in a user-friendly way with minimal to no cost. The incentive for greater privacy, allowing consumers to secure control over and limit access to their own personal information at all levels of the Web, will come from a combination of: (1) a continued increase in the value of an individual consumer’s “Identity Graph” (the aggregated total digital and analog data footprint of an individual), (2) improved software, and (3) growing government and quasi-government concerns with privacy violations of individuals.
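An identity graph is, operationally, the output of identity resolution: merging every identifier observed together (cookies, device ids, hashed emails) into one consumer profile. A union-find sketch, with illustrative identifier formats of our own choosing:

```python
class IdentityGraph:
    """Union-find over identifiers observed together (cookies, device ids,
    hashed emails). Each 'link' event merges identifiers into a single
    consumer profile -- the identity graph marketers assemble."""

    def __init__(self):
        self.parent = {}

    def _find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def link(self, a, b):
        """Record that identifiers a and b belong to the same person."""
        ra, rb = self._find(a), self._find(b)
        if ra != rb:
            self.parent[rb] = ra

    def same_person(self, a, b):
        return self._find(a) == self._find(b)
```

Each privacy measure a consumer adopts removes linking events, fragmenting this graph and, with it, the graph's market value.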

With respect to the last of these three drivers, there is increasing momentum towards the view of privacy as a fundamental "human right," recognized as such under Article 12 of the 1948 UN Universal Declaration of Human Rights³ as well as by the constitutions of many countries. In the past, a distinction has been made (Smith 2001, pp. 1000-1001) between countries that viewed privacy as a human right and passed "sweeping privacy bills that address all the instances of data collection, use, and sharing (Bennett & Raab 2006; Dholakia & Zwick 2001)" versus those that viewed privacy as a commodity, and enacted a "patchwork of sector-specific privacy laws that apply to certain forms of data or specific industry sectors (Bennett & Raab 2006; Dholakia & Zwick 2001)." It follows that countries seeing privacy as a right generally adopt practices where the default is privacy, whereas the privacy-as-commodity countries instead consider the process of requiring opt-ins to disclosure to be an "undue burden." However, the momentum towards a view of privacy as a right does shift the market towards an opt-in environment, where companies only have access to data about consumers who choose to make that specific information available (Smith 2001). Privacy that was once described as "the right to be let alone" (Warren & Brandeis, 1890) will in a few years be best described as "the right and ability to control information about the self."

So, what is inhibiting this shift to a Dark Surface Web? First, Dark Web privacy currently comes with a significant performance loss (e.g., slow page-loading speeds within TOR), and safe encryption requires managing long public keys. However, technological advances deployed in the Dark Web are increasingly making privacy protection technologically feasible at scale, and financially viable. Easy, single-click privacy and encryption software allowing consumers to minimize (or eliminate) their digital footprints will facilitate this shift in the longer term by minimizing the effort required to assure privacy. (For example, consider Google's activities described earlier, and note that the fingerprinting protection enables greater user privacy at no cost to the user, and with a potential advertising revenue loss to Google.)

³ https://en.wikipedia.org/wiki/Universal_Declaration_of_Human_Rights

J. of the Acad. Mark. Sci. (2020) 48:43–63

Second, many current consumers are not fully aware of the value of their personal data, and tend to make it available to firms at little or no cost. Towards this end, states are passing privacy acts that protect consumers (e.g., see California's AB 375 bill), raising awareness about the costs and risks of freely sharing information with firms and thus exacerbating the concerns consumers already have. However, even those consumers who are fully aware of the costs, and are concerned about their privacy, often choose to disclose identifiable personal data (Adjerid et al., 2018). This discrepancy between attitudes and behaviors is what scholars refer to as the "privacy paradox" (Acquisti et al., 2015). Thus, in addition to advances that make safeguarding privacy viable at the individual level, consumer-based behavioral changes on the Surface Web will be necessary in order to ensure that consumers do not "give away" information, rendering privacy software irrelevant.

In this new, Dark Surface Web environment, we believe that the default consumer behavior with respect to granting data usage permission to firms will be minimal, since customers are becoming more reluctant to opt in and less predisposed to share information unless given strong incentives to do so (e.g., the "privacy calculus" phenomenon; Dinev & Hart, 2006). In this emerging dark surface environment, the standard consumer will be a Ghost, with the firm having very little insight into the nature of the individual. In order to overcome this, firms will need to provide some value, or incentive, for users to engage in a mental privacy calculus that may lead them to opt in and provide the firm with additional, consumer-specific information. However, at the other end of the spectrum, it seems likely that a second general group of consumers (Buffs)—those who are readily willing to share their personal information with firms—may also exist.

How can firms, accustomed to having access to digital footprints of customers and other profile information to personalize offerings and interactions, operate in a hyper-private opt-in rather than a naked-to-all opt-out world? While Buff consumers will share personal information enabling firms to continue using their extant methods, firms will have little information on the digital footprint and preferences of Ghost consumers. One way to entice Ghost consumers to disclose personal information is to provide them with financial incentives. This strategy would result in an information market where firms could purchase personal information from consumers willing to sell it, but also where consumers could purchase this information 'back', or even sell it to other consumers. Studies on privacy calculus already show preliminary evidence of this dynamic, whereby consumers weigh privacy concerns and related risks against the benefits of information disclosure, and sometimes end up trading privacy for monetary rewards (e.g., Caudill & Murphy 2000; Hann et al. 2008; Phelps et al. 2000; Xu et al. 2010).
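The privacy-calculus trade described above can be sketched as a simple decision rule: a consumer "sells" a piece of personal data only when the offered reward exceeds the perceived privacy cost of disclosing it. The attribute names, costs, and offers below are hypothetical illustrations, not data from the studies cited.

```python
# Illustrative privacy-calculus decision rule. All values are
# hypothetical; the point is the benefit-versus-cost comparison.

def disclosure_decisions(perceived_costs, offers):
    """Return the set of attributes the consumer chooses to disclose.

    perceived_costs: dict mapping attribute -> perceived privacy cost
    offers:          dict mapping attribute -> monetary reward offered
    """
    return {
        attr
        for attr, cost in perceived_costs.items()
        if offers.get(attr, 0.0) > cost
    }

# A hypothetical Ghost consumer: high perceived costs, so only a
# sufficiently generous offer clears the bar for each attribute.
ghost_costs = {"email": 5.0, "location": 12.0, "purchase_history": 8.0}
offers = {"email": 6.0, "location": 3.0, "purchase_history": 10.0}

print(disclosure_decisions(ghost_costs, offers))
```

In this sketch the consumer discloses email and purchase history but keeps location private, since the location offer falls short of its perceived cost.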

Another possibility is to utilize technology to nudge consumers toward self-disclosure in exchange for hyper-personalization. Hyper-personalization is a significant benefit, separate from those provided by financial rewards, and consumers are frequently willing to exchange their own privacy for personalized offerings. This complex trade-off between personalization and privacy is known as the "personalization paradox" (Aguirre et al. 2016; Bleier & Eisenbeiss 2015). Technology can play a prominent role in this trade-off. For example, anthropomorphized technology has the potential to nudge consumers towards greater self-disclosure by transmitting social cues that activate social scripts and through conversations that invoke norms of reciprocity (Moon 2000). Such anthropomorphization can evoke social responses that encourage greater self-disclosure, even by Ghost consumers.

Ultimately, firms will need to create strategies to personalize interactions and provide value to both Buffs and Ghosts. Though there has been extensive research on personalization, the emerging Dark Surface environment that makes consumer profiling less accessible creates new challenges. Firms adapting to this new environment will need to understand the ways in which they are affected by the "personalization paradox" and which consumer-facing technologies will generate the greatest value for consumers in a way that tips the trade-off towards data sharing. Among the set of candidate technologies that can provide a lever in this trade-off, the increasing shift towards conversational commerce (a term coined by Uber's Chris Messina) provides one of the most compelling candidates. Conversational commerce refers to the use of natural language interfaces (such as chats and messaging) by consumers to interact in real time with organizations (humans and bots). Gartner predicts that by 2020, 85% of all consumer interactions with a firm will occur via conversational agents (CAs). As such, given the expected ubiquity of the technology in consumer interactions, we see CAs as one critical technological facilitator of firm-consumer exchange in the emerging Dark Surface Web 3.0 environment. In the balance of this paper, we explore in greater detail the role that can be played by CAs to nudge consumers towards greater self-disclosure through anthropomorphization, and to provide varying personalization value to each of the two consumer groups described above.

Conversational agents

Conversational agents (also called chatbots, conversational AI-bots, virtual assistants, and dialogue systems) are natural language computer programs designed to approximate human speech (written or oral) and interact with people via a digital interface. Although they have existed since the 1960s (e.g.,


ELIZA developed by Joseph Weizenbaum in 1966), conversational agents (CAs) have recently garnered substantial industry attention. They are becoming the new front-office face of many companies, representing a shift from "clicks to conversations" (Daugherty & Wilson, 2018) and from e-commerce to conversational commerce. CAs are also becoming critical components of the customer service infrastructure, by replacing or augmenting tasks traditionally performed by sales employees (Larivière et al., 2017; Verhagen, van Nes, Feldberg, & van Dolen, 2014) and by providing consumers with successful service encounters (Larivière et al., 2017; van Doorn et al., 2017). The recent availability of conversations-as-a-platform (CAAS) tools is making it easier for firms to develop and deploy such CAs.

Examples of CAs abound, and range from Alexa, which allows people to execute a variety of mundane tasks such as ordering food and tracking flight statuses, to Dressipi's Amiya, which helps customers find and purchase products they want based on style preferences. CAs can be entirely digital and exist online (e.g., Bank of America's Erica), or can have physical embodiments and exist offline in organizational settings, stores (e.g., LoweBot), or one's home (e.g., Alexa). Given the range of CAs, one way in which they have been classified is based on whether they are (a) general-purpose CAs, such as Siri and Alexa, or domain-specific CAs, such as IKEA's Anna, and (b) whether their primary mode of communication is text-based or speech-based (Gnewuch, Morana, & Maedche, 2017).⁴

The major aim of CAs is to enhance both the experience and the outcomes of consumer interactions with the organization across sales, marketing, and customer service (Daugherty & Wilson, 2018). For example, Hello Hipmunk is a CA that makes it easier and more convenient for people to search and book vacation trips. As Adam Goldstein, the CEO and co-founder of Hipmunk, noted:

The average traveller runs 20 searches when planning a trip. Hello Hipmunk shrinks that process to one simple conversation. It can process tons of information from flight pricing to room availability and synthesize it instantly (Staff, 2016, para. 3).

Conversational agents: Competitive assets in an increasingly dark surface web

CAs that interact with people as useful private assistants or effective customer service representatives are likely to be major assets for companies. For example, Juniper Research predicts that by 2022 CA use for customer service will save companies $8 billion. But beyond being just another customer interaction tool, CAs can also become a way firms differentiate themselves.

Because they make it more convenient for people to rapidly access data, evaluate information, and execute tasks (Sankar & Balakrishnan, 2016; Shum, He, & Li, 2018), in addition to providing more enjoyable experiences (Brandtzaeg & Følstad, 2018) and a sense of companionship (Turkle 2017; Brandtzaeg & Følstad, 2017), CAs can nudge consumers to voluntarily share personal data with companies. This sharing of data is clearly important for online interactions, where CAs can prod people to disclose identifiable, rather than de-identified, data (see section below on "Personalizing Interactions for Buff and Ghost Consumers" where we elaborate further on this). Furthermore, CAs provide a means for companies to collect offline consumer data, a task that has traditionally proven more challenging. For example, they can track what clothing items people bring into fitting rooms and answer questions related to size availability, color options, matching accessories, etc. (Daugherty & Wilson, 2018). In the process, they not only collect information on popular items in general, but can also incentivize consumers to share identifiable personal data in order to receive more personalized recommendations.

By directly collecting private data from consumers (both online and offline), CAs can enable the generation of more accurate identity graphs that allow companies to market products and services more efficiently and effectively by, for example, targeting people with the right content at the right time (McAfee & Brynjolfsson, 2012; Schumann, von Wangenheim, & Groene, 2014; Spangler, Hartzel, & Gal-Or, 2006). Collected personal data can also be leveraged by CAs in future interactions to further personalize, at great scale, conversations with people. Mattel's Hello Barbie, the world's first Barbie CA, represents an example. Not only can Hello Barbie engage in meaningful conversations with children, but she can also capitalize on the details of prior interactions, such as a child's favorite color and beloved pet, to quickly become a close friend (Vlahos, 2018).

Moreover, CAs can be a source of differentiation and competitive advantage when they become orchestrators of customer interactions, not just within the company, but across other companies as well—for example, people can use Amazon's Alexa to both order pizza from Domino's and get flight status updates from Delta (Daugherty & Wilson, 2018). As Daugherty and Wilson (2018, p. 95) point out: "In the past, companies like Domino's, Capital One, and Delta owned the entire customer experience, but now, with Alexa, Amazon owns part of the information exchange as well as the fundamental interface between the companies and the customer, and it can use the data to improve its own services." Consequently,

⁴ Many other classifications also exist. For example, Gartner describes CAs based on (a) engagement levels, ranging from the provision of explicit information at one end to interactive conversation at the other end; and (b) task complexity, ranging from informational at one end to transactional at the other end.


companies owning the most popular CA interfaces will be advantaged.

The ability to use CAs to collect identifiable data from people, both online and offline, and both within and across firms, is going to become especially important in a world of web platforms where browsing activity is increasingly private (Bursztein, 2017). As a result, companies that motivate and nudge consumers to self-disclose private data, in interactions with CAs or otherwise, will have an edge.

Furthermore, firms will have to rise to the challenge of meaningfully personalizing consumer interactions in such an information-impoverished environment. Personalization has been identified as one of the most successful relationship-building mechanisms used by firms (Claycomb & Martin, 2001), since it increases sales leads, customer acquisition and retention (Bojei et al., 2013; Sahni, Wheeler, & Chintagunta, 2018), firm profit, and customer satisfaction, and enables the discovery of novel consumer needs and preferences (Arora et al., 2008; Huang & Rust, 2017). While some CAs may provide an impersonal experience, the more successful CAs will be designed to engage and personalize the experience even for Ghost consumers.

Our discussion of CAs in the next sections focuses on these two issues: ethically nudging consumers towards voluntary self-disclosure of personal data, and designing CAs to engage consumers through personalized interactions.

Encouraging voluntary self-disclosure with conversational agents

Developing CAs to nudge people, especially Ghost consumers, to self-disclose private data requires research to inform which design features and personality traits may result in the creation of engaging, trustworthy, and ethical CAs.

Ethical anthropomorphism One way to foster trust, increase engagement, and encourage self-disclosure is to ethically anthropomorphize⁵ CAs, as individuals often feel less inhibited when interacting with anthropomorphic computers, sharing private information (Leong & Selinger, 2019; Turkle, 2017), and even developing personal relationships (Moon, 2000) (see Appendix 1, Table 7 for a review of prior studies). The process of anthropomorphising CAs can occur in a variety of ways, but some dimensions to consider include name, gender, embodiment, a physical (or virtual) appearance that may include age, ethnicity, and attractiveness, a personality, a voice with a certain tone or expression (if speech-based), and a conversational style that can range from open-ended to predefined

answers (see Leong & Selinger, 2019). Anthropomorphization is especially effective when the anthropomorphic features of the CA (such as ethnicity (Qiu & Benbasat, 2009) or personality (Al-Natour, Benbasat, & Cenfetelli, 2006, 2011)) are designed to be similar to those of the consumer, a phenomenon we term homophilous anthropomorphism.

Anthropomorphization influences behaviors through a number of mediating mechanisms. First, anthropomorphised agents provide nonverbal cues that often generate "mindless" responses from people, to the extent that people apply social scripts—scripts for human-to-human interaction—to CAs, "essentially ignoring the cues that reveal the essential asocial nature of a computer" (Nass & Moon, 2000). These social responses occur as a result of conscious attention to a subset of contextual cues that trigger various scripts from the past (Langer, 1992; Moon, 2000, 2003; Nass & Moon, 2000; Nass, Steuer, & Tauber, 1994) that are applied mindlessly even when such behaviors seem irrational, inappropriate, or unnecessary (Nass & Moon, 2000).

Second, in addition to evoking social scripts that encourage social interaction, social responses to anthropomorphised CAs can nudge consumers towards self-disclosure by evoking norms of reciprocity. For example, Moon (2000, p. 328) shows that people share intimate data with computers "when computers initiate the disclosure process by sharing information first" and then follow a "socially appropriate sequence of disclosure by escalating gradually from superficial to intimate disclosures." Thus, to nudge consumers to share information, anthropomorphic CAs can also incorporate design elements of reciprocity without violating patterns of escalation in disclosure (e.g., by lying, sharing too much information too fast, or asking people to disclose data too early).
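Moon's (2000) reciprocity-with-gradual-escalation pattern could be operationalized as an ordered dialogue script in which the CA always discloses first and only moves to a more intimate question once the consumer has answered the current one. The disclosure texts and question ordering below are invented purely for illustration.

```python
# Sketch of a reciprocal-disclosure dialogue: the CA shares first, then
# asks, escalating from superficial to intimate only while the consumer
# keeps answering. Script content is hypothetical.

SCRIPT = [  # ordered from superficial to intimate
    ("I'm a bot built by our support team.", "What brings you here today?"),
    ("I mostly help people find running shoes.", "What sports do you play?"),
    ("My designers gave me a terrible sense of humor.", "What's your name?"),
]

def run_dialogue(consumer_answers):
    """Walk the script; stop escalating as soon as the consumer declines.

    consumer_answers: list of answers, one per script step (None = declined).
    Returns the list of answers actually collected.
    """
    collected = []
    for (ca_disclosure, question), answer in zip(SCRIPT, consumer_answers):
        # Reciprocity: the CA discloses something about itself before asking.
        print(f"CA: {ca_disclosure} {question}")
        if answer is None:
            break  # never skip ahead in the escalation sequence
        collected.append(answer)
    return collected

answers = run_dialogue(["just browsing", "tennis", None])
```

Here the consumer declines the most intimate question, so the dialogue stops escalating rather than pressing on, consistent with the "socially appropriate sequence" the text describes.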

Third, anthropomorphism also increases social presence, defined as the degree to which a communication medium allows one to perceive the communicator as being psychologically present during an interaction (Short, Williams, & Christie, 1976). Social presence resulting from anthropomorphization has been associated with trust, engagement, and satisfaction (Kumar & Benbasat, 2006; Picard, 1997; Qiu & Benbasat, 2009; Turkle, 2017; Bleier, Harmeling, & Palmatier 2019). In examining specific anthropomorphic features, researchers have shown that certain personality characteristics such as friendliness and expertise (Verhagen et al., 2014) as well as embodiment and communication style (Qiu & Benbasat, 2009) influence perceptions of social presence. For example, recommendation agents with animated faces (rather than disembodied ones) and voice outputs providing rich social cues (rather than text) enhance social presence and generate higher trust, enjoyment, and perceived benefits (Qiu & Benbasat, 2009).

Since a CA's anthropomorphic features can affect both people's interactions and engagement with the CA, as well as their

⁵ Honest or ethical anthropomorphism is the idea that "robot designers should not use anthropomorphism to deliberately mislead users as to privacy features" (Leong & Selinger, 2019, p. 300). For simplicity, from now on we use the words anthropomorphism, anthropomorphization, or anthropomorphic features to refer to ethical anthropomorphism.


perceptions of its trustworthiness and usefulness, they are also likely to influence their privacy-calculus assessments. Consumers weigh privacy concerns and related risks against the benefits of information disclosure (e.g., Dinev & Hart, 2006). The extent to which CAs are perceived as more engaging, enjoyable and useful will magnify their perceived benefits, while the extent to which they are perceived as more trustworthy will reduce the perceived privacy risks. Both effects will shift the privacy calculus towards greater information disclosure. Furthermore, emotions impact the privacy calculus of consumers, with positive affect leading to lowered perceptions of risk (Li, Sarathy, & Xu, 2011), higher intentions to disclose information (Anderson & Agarwal, 2011), and more self-disclosure (Kehr, Kowatsch, Wentzel, & Fleisch, 2015; Yu, Hu, & Cheng, 2015).

However, there is a non-linear relationship between anthropomorphization and outcomes. While anthropomorphization can generate positive marketing results (Aggarwal & McGill, 2007), too much of it can lead to negative effects. For example, some attempts to provide overly humanized agents have created unrealistic consumer expectations that turned into frustration (Knijnenburg & Willemsen, 2016) and abuse (Neff & Nagy, 2016). Excessive anthropomorphism can also trigger consumer discomfort (also known as the "uncanny valley" concept—see Mori, MacDorman, & Kageki, 2012; van Doorn et al., 2017), leading to decreased favorability toward the CA (Mende et al., 2019).

Additional research is thus required to better understand the right level and type of anthropomorphic design features for each context and different consumer groups (i.e., for Buff consumers vs. Ghost consumers). For example, while homophilous anthropomorphism across a number of features (age, gender, personality, race, dialect, etc.) may be possible with Buffs (since identifiable data and personal characteristics are captured and used), there are limits to the level of homophilous anthropomorphism possible with Ghosts (given that only de-identified data at the aggregate level is used). More work is also needed to investigate which anthropomorphic cues are consequential to organizational outcomes and under what conditions, and which of these may encourage Ghost consumers towards greater information disclosure.

Fairness and transparency Alternatively, self-disclosure can be encouraged when firms provide assurances of algorithmic fairness and transparency (Garfinkel et al., 2017). In terms of fairness, audits and certifications provide assurances that the firm's CAs are fair and unlikely to generate biased interactions towards particular subgroups, such as customers of certain race, gender, or socioeconomic status. This concern is likely to be more prevalent for the Buffs who disclose such information. In terms of transparency, developing CAs with predictive models that provide explainability, such as logistic regression (rather than "black box" predictive options like neural networks), or using models such as LIME (Local Interpretable Model-Agnostic Explanations) that generate explanations to describe predictions made by machine learning algorithms, provides assurances that the firm is committed to accountability and transparency, and is willing to share how the information delivered by CAs has been derived. Empirical evidence (e.g., Wang & Benbasat 2008) suggests that such transparency, i.e., explaining the logic behind a particular thought, decision, or recommendation, engenders trust.
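The coefficient-level transparency that a logistic-regression scorer affords can be illustrated as follows. The feature names and weights are hypothetical; the "explanation" simply decomposes the log-odds into per-feature contributions, which is also the basic idea that LIME approximates locally for black-box models.

```python
import math

# Hypothetical logistic-regression scorer for a CA recommendation, with
# an explanation that decomposes the log-odds into per-feature
# contributions (the explainability property discussed in the text).

WEIGHTS = {"num_visits": 0.8, "cart_value": 0.05, "is_returning": 1.2}
BIAS = -2.0

def predict_with_explanation(features):
    contributions = {f: WEIGHTS[f] * x for f, x in features.items()}
    log_odds = BIAS + sum(contributions.values())
    probability = 1.0 / (1.0 + math.exp(-log_odds))
    # Rank features by their absolute influence on this prediction.
    explanation = sorted(contributions.items(),
                         key=lambda kv: abs(kv[1]), reverse=True)
    return probability, explanation

prob, why = predict_with_explanation(
    {"num_visits": 3, "cart_value": 40.0, "is_returning": 1}
)
print(f"p(accept offer) = {prob:.2f}")
for feature, contribution in why:
    print(f"  {feature}: {contribution:+.2f} to the log-odds")
```

Because each feature's contribution is additive in the log-odds, the CA can truthfully report which inputs drove a given recommendation, in line with the "explain the logic" transparency the cited evidence links to trust.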

Table 4 presents some research questions on how to encourage voluntary self-disclosure through conversational agents. Of great importance is identifying which of these are most effective for nudging Ghost consumers towards disclosing more personal information.

Personalizing interactions for Buff and Ghost consumers

Given that designing CAs to incentivize self-disclosure will nudge some consumers but not others, firms need to design CAs that personalize interactions for both types of consumers. While some CA design attributes are similar across the two groups, there are also differences and unique features for each that derive from the amount and type of data available for personalization.

To better understand which design elements impact the personalization of consumer interactions with CAs for Buff and Ghost consumers, we structure our discussion around two interdependent processes that are essential to understanding what the consumer needs and how to tailor the CA interaction to match the consumer's needs.⁶ The first process focuses on understanding consumers and involves the collection of available data and construction of consumer profiles. The second process focuses on generating responses and includes matching products, services, or information to consumers' needs and emotions, and communicating these to the consumers in conversations that are tailored to the individual.

While some dimensions of these processes are similar to current web personalization practices (e.g., session context modelling that uses click stream for data collection), others are unique to personalization with CAs (e.g., use of anthropomorphism for conversational presentation, emotion and sentiment tracking, etc.). One of the basic differences lies in the fact that CA interaction design needs to incorporate principles of both intellectual quotient (IQ) in being able to understand and respond with accuracy to consumer needs and emotional

⁶ These processes are based on the three-process framework for personalized recommendations by Adomavicius and Tuzhilin (2005). We excluded the last process of their framework, which focuses on impact, since our discussion is restricted to CA design and not downstream consequences. The processes are also adapted to the context of CAs based on specifics of the CA architecture and design guidelines for the chat module of the CA (e.g., see Shum et al. 2018).


quotient (EQ) in establishing an emotional connection and identifying the consumer's emotions through the conversation and generating responses that are emotionally appropriate, social, and engaging (Shum et al. 2018). Figure 1 shows the design implications of each process for Buff and for Ghost consumers. We organize our discussion around these processes and follow the structure in Fig. 1.

Understanding Buff and Ghost consumers

To personalize the consumer interaction, one must understand the consumer (Adomavicius & Tuzhilin, 2005; Johar, Mookerjee, & Sarkar, 2014; Shum et al. 2018; Tam & Ho, 2006). In order to understand the two different types of consumers we discuss here, CAs must first elicit and collect data from the consumers (data collection), use this data to estimate their preferences and build profiles (building profile) that they will consequently use to tailor the interaction (Mobasher, 2007; Mobasher, Cooley, & Srivastava, 2000). There are unique differences in these steps between Buffs and Ghosts, since the type of data available to the CA a priori as well as some of the methods used to elicit data from them will be different for each type of consumer. Table 5 shows the differences between Buff and Ghost consumers on the information sharing spectrum.

Data collection Two different mechanisms currently inform the collection of data: explicit and implicit methods (Adomavicius, Huang, & Tuzhilin, 2008; Li & Karahanna, 2015; Murthi & Sarkar, 2003). Explicit methods directly ask consumers for data, whereas implicit methods infer preferences by monitoring consumers' behaviors (e.g., product views on a website). Since Buff consumers self-disclose private data to firms, CAs in this group can rely on stored demographic data and on implicit methods of data collection by tracking consumers' online behaviors across devices and interactions. This implicit form of data collection is equivalent to how information is currently gathered and used for web, mobile, and other kinds of personalization services offered in the Surface Web (Chung, Wedel, & Rust, 2016). Such identifiable individual-level data can then be leveraged to build the profile of consumers.
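The two collection mechanisms might be combined in a single profile store along the following lines. The attribute names and events are hypothetical; the point is the separation between stated (explicit) and inferred (implicit) data.

```python
from collections import Counter

# Sketch of explicit vs. implicit data collection feeding one consumer
# profile. Attribute names and events are hypothetical.

class ConsumerProfile:
    def __init__(self):
        self.stated = {}            # explicit: answers the consumer gave
        self.inferred = Counter()   # implicit: behavior-based counts

    def record_answer(self, question_key, answer):
        """Explicit method: the CA asked, and the consumer answered."""
        self.stated[question_key] = answer

    def record_view(self, product_category):
        """Implicit method: infer interest from observed behavior."""
        self.inferred[product_category] += 1

    def top_interest(self):
        return self.inferred.most_common(1)[0][0] if self.inferred else None

profile = ConsumerProfile()
profile.record_answer("purpose", "gift for a runner")   # explicit
for category in ["shoes", "shoes", "watches"]:          # implicit
    profile.record_view(category)

print(profile.stated, profile.top_interest())
```

For a Buff consumer both channels populate the profile; for a Ghost consumer the stored implicit history is unavailable a priori, so the profile would be built mainly from explicit answers and in-session behavior.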

The data collection method for Ghost consumers, however, will have to rely more heavily on explicit methods because such consumers are not identifiable a priori. As a result, CAs in this group must elicit stated needs and preferences by asking questions and engaging in two-way conversations about, for example, the purpose of buying a product, features or attributes of a desired item, etc. (Qiu & Benbasat, 2009). Research suggests that rich contextual information in long conversations with CAs may enable CAs to recognize consumers' interests and intent even more accurately than having stored consumer profiles in which the data and information may be incomplete or ambiguous (Shum et al. 2018), making CAs' explicit methods of eliciting preferences potentially of value to Buff consumers as well. Further, since people's future actions are more

Table 4 Research questions on how to encourage voluntary self-disclosure through conversational agents

Ethical Anthropomorphism

Anthropomorphic Features
- What anthropomorphic features are most effective towards encouraging self-disclosure?
- How do these features vary depending on whether the consumer is a Ghost consumer or a Buff consumer?
- How do these features vary based on the task (e.g., customer service vs. purchasing) and on the context (e.g., retail vs. healthcare vs. financial)?
- What homophilous anthropomorphic features are most effective, and do the features change based on outcomes of interest (e.g., trust vs. satisfaction)?
- What form of homophilous anthropomorphism is possible for Ghost consumers?
- What combination/level of anthropomorphic features leads to the "uncanny valley," and does this vary by Buff and Ghost consumer?

Mediating Mechanisms
- What are the mediating mechanisms between anthropomorphic features and self-disclosure?
- Which anthropomorphic features operate through which mediating mechanisms?
- How do these mediating mechanisms vary by Buff and Ghost consumer?

Privacy Calculus and Trust
- What anthropomorphic or other CA design features shift the privacy calculus towards self-disclosure?
- What anthropomorphic features influence the assessment of benefits of the CAs, and what are these benefits (e.g., enjoyment, usefulness, etc.)?
- What anthropomorphic features influence the assessment of risks of the CAs (e.g., build trust that reduces assessment of risk)?

Transparency and Fairness
- What institutional and structural assurances (e.g., certifications, audit, etc.) are effective in providing assurances of algorithmic fairness?
- How does their effectiveness in encouraging interaction and self-disclosure vary by Buff and Ghost consumer?
- How does transparency influence self-disclosure? What type and level of transparency? How does this vary by Buff and Ghost consumer?
- What is the appropriate level and type of transparency for CAs in different contexts (e.g., customer service vs. sales)? Can too much CA transparency backfire in different contexts?


dependent on their past behaviors rather than their stated preferences (Hosanagar, 2019), CAs that also elicit data on prior behavioral activities (e.g., "what other products did you search for in the last week?") will be able to build more accurate consumer profiles. As we have already discussed, anthropomorphization, norms of reciprocity in the conversation, and other conversational strategies are important in explicit methods of data collection to encourage data sharing, especially for Ghost consumers. For CAs to garner trust and enhance the provision of personal data from Ghosts, they may need to be able to be transparent about why a certain question was asked, how they will use the data provided, and what will happen to the data once the conversation is over. Such ask-but-explain-why transparency can mitigate feelings of vulnerability (Martin & Murphy, 2017) and incentivize Ghost consumers to share additional data with the CA. The ability to use CAs to collect personal data from people in one-on-one conversations is unique and key to understanding Ghosts better.
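The ask-but-explain-why pattern could be implemented by attaching a purpose, use, and retention note to every explicit question the CA asks. The questions and data policies shown are invented purely for illustration.

```python
# Sketch of "ask-but-explain-why" transparency: each explicit question
# carries an explanation of why it is asked, how the answer is used, and
# what happens to it afterwards. Questions and policies are hypothetical.

QUESTIONS = [
    {
        "text": "What size do you usually wear?",
        "why": "so I can filter out items that won't fit",
        "use": "used only to rank this session's recommendations",
        "retention": "discarded when this conversation ends",
    },
    {
        "text": "What other products did you search for last week?",
        "why": "past behavior predicts preferences better than stated ones",
        "use": "used to refine your profile for this session",
        "retention": "kept only if you opt in at the end",
    },
]

def ask(question):
    """Render the question together with its transparency note."""
    return (f"{question['text']} (I ask {question['why']}; "
            f"your answer is {question['use']} and {question['retention']}.)")

for q in QUESTIONS:
    print(ask(q))
```

Bundling the explanation with the question, rather than burying it in a policy page, is one way a CA could make the "why asked, how used, what happens after" commitments the text describes visible at the moment of disclosure.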

In addition to explicit methods of data collection, session context modelling by CAs (e.g., Shum et al. 2018) allows them to gather and use interactive click stream data during the specific interaction between the CA and the consumer (i.e., implicit data; see Johar et al., 2014; Padmanabhan, Zheng, & Kimbrough, 2001). This modelling approach can dynamically inform the CA's understanding of the consumer and their intent, and is an especially useful source of data to personalize interactions with Ghost consumers. For example, if a Ghost consumer clicks on a specific product and stays on the product's page for some amount of time, CAs can utilize such behavior to gauge their interest in the product and provide helpful responses that can nudge the consumer towards purchase. The profile of Ghosts will, therefore, be based on their explicit answers to programmed questions, dynamic behavior during a specific session extracted through session context modelling, and also informed by anonymized aggregate data of other consumers behaving in similar ways.
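Session context modelling of the kind described, with dwell time on a product page treated as an implicit interest signal, could be sketched as follows. The event format and the 30-second interest threshold are assumptions made for illustration.

```python
# Sketch of session context modelling: score a Ghost consumer's interest
# in products from in-session clickstream events (dwell time per page).
# The event format and threshold are illustrative assumptions.

def interest_scores(session_events):
    """session_events: list of (product_id, seconds_on_page) tuples
    observed during the current CA session only (no stored profile)."""
    scores = {}
    for product_id, seconds in session_events:
        scores[product_id] = scores.get(product_id, 0) + seconds
    return scores

def products_to_nudge(session_events, threshold_seconds=30):
    """Products with enough accumulated dwell time to warrant a helpful,
    personalized CA response (e.g., offering size or color options)."""
    scores = interest_scores(session_events)
    return [p for p, s in scores.items() if s >= threshold_seconds]

events = [("sku-101", 12), ("sku-205", 45), ("sku-101", 25)]
print(products_to_nudge(events))
```

Because every signal comes from the current session, this works even when the visitor is a Ghost with no linkable history; repeated short visits to the same page ("sku-101" above) accumulate into a usable interest signal.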

Fig. 1 Process of personalizing the CA conversation. Notes: Stages are adapted from the three-stage recommendation process model developed by Adomavicius and Tuzhilin (2005). The processes are adapted to the context of CAs based on specifics of the CA architecture and design guidelines for the chat module of the CA (see Shum et al. 2018)

J. of the Acad. Mark. Sci. (2020) 48:43–63

Given that CAs rely on natural language understanding, two other activities (separate from consumer profiling and session context modelling) are also important to understanding the consumer (Shum et al. 2018): (1) understanding the message, and (2) emotion and sentiment tracking. Understanding the message involves semantic encoding and intent understanding, that is, understanding the purpose of the message (e.g., Tur & Deng 2011; Vinyals & Le, 2015). This task is easier when the CA has a consumer profile in place (as in the case of Buff consumers), where preferences and a history of interactions can facilitate intent inference. However, emotion and sentiment tracking are generally based on the current interaction (e.g., Yang et al. 2016; Chen et al. 2016) and can be used to personalize interaction with Buff and Ghost consumers alike.
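Production systems perform sentiment tracking with trained models such as the neural classifiers cited above; purely as a toy illustration of the idea, a single conversation turn can be scored against polarity lexicons (the word lists below are invented):

```python
import re

# Toy polarity lexicons; real CAs would use trained sentiment models
# (e.g., the neural approaches of Chen et al. 2016).
POSITIVE = {"great", "love", "thanks", "perfect", "helpful"}
NEGATIVE = {"annoying", "wrong", "hate", "useless", "slow"}

def turn_sentiment(message):
    """Classify one conversation turn as positive/negative/neutral."""
    words = set(re.findall(r"[a-z']+", message.lower()))
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

mood = turn_sentiment("This is great, thanks!")  # -> "positive"
```

Tracking this signal turn by turn lets the CA adjust tone for Buff and Ghost consumers alike, since it requires no stored profile.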

Building profile As consumer data are collected by the CA, generating personalized interactions requires integrating the data collected to iteratively build accurate and holistic consumer profiles (Adomavicius et al., 2008; Gao, Liu, & Wu, 2010). Personalization of CA interactions occurs at each conversation turn as part of the CA's response generation process, and is influenced by the consumer profile that has been developed up to that point in the conversation.

The more data collected by the CA, the smaller the group segmentations for Ghosts (with segments shrinking as more information is disclosed) and the more complete the individualization for Buffs (i.e., segments of one), resulting in more personalized CA interactions. Thus, the amount of data collected by CAs will determine the number of different profiles constructed, along with the level and value of CA personalization. Many systems utilize a collection of individual or aggregate consumer facts (e.g., demographic information, favorite product, amount spent in an online store) to represent factual profiles in relational databases (Adomavicius & Tuzhilin, 2001). But, as Adomavicius et al. (2008, p. 65) note, "factual profiles may not be sufficient in certain more advanced personalization applications." This observation is particularly true for Buff consumers, who will interact with CAs that utilize more advanced profiling techniques and leverage granular aspects of their behavior. These techniques may include descriptive models, such as rules, sequences, and signatures (Tuzhilin, 2008), or predictive models such as logistic regressions, neural networks, decision trees, support vector machines (SVM), and Bayesian networks (Adomavicius et al., 2008; Murthi & Sarkar, 2003). Though these models can be applied to both Buff and Ghost consumers, our discussion that follows highlights the differences in the types of data used for each group and the types of inferences made in terms of preferences.
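The shrinking-segment logic can be illustrated with a toy population (all attributes and values below are invented): each attribute a Ghost consumer discloses filters the anonymized segment the CA can match them against, moving from mass-personalization toward a segment of one.

```python
# Toy illustration: each disclosure narrows the matching segment.
POPULATION = [
    {"region": "EU", "category": "shoes", "budget": "high"},
    {"region": "EU", "category": "shoes", "budget": "low"},
    {"region": "EU", "category": "books", "budget": "high"},
    {"region": "US", "category": "shoes", "budget": "high"},
]

def segment(disclosed):
    """Return population members consistent with everything disclosed."""
    return [p for p in POPULATION
            if all(p.get(k) == v for k, v in disclosed.items())]

assert len(segment({})) == 4                                # nothing disclosed
assert len(segment({"region": "EU"})) == 3                  # one attribute
assert len(segment({"region": "EU", "category": "shoes"})) == 2
assert len(segment({"region": "EU", "category": "shoes",
                    "budget": "low"})) == 1                 # segment of one
```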

Descriptive rules rely on CAs to examine the attributes of consumers and their respective activities (identifiable for Buffs and anonymized for Ghosts) to derive preferences using a variety of data mining techniques, such as association rules and classification rule discovery (Adomavicius & Tuzhilin, 2005). The sequence approach uses processual browsing activities to infer consumer preferences (Mannila, Toivonen, & Verkamo, 1997; Niu, Yan, Zhang, & Zhang, 2002). With this technique, CAs can leverage frequent episodes and other methods to learn sequential patterns of behavior, constructing profiles for both Buff (unique individual path) and Ghost (typical journey) consumers. Signatures are the data structures used to capture the evolving behavior learned from large streams of simple transactions (Cortes et al., 2000). An example signature is "top 5 most frequently browsed product categories over the last 30 days." This signature can be stored in the profile of a specific person (Buff) or in the profile of a typical consumer (Ghost) (Adomavicius & Gupta, 2009). Finally, predictive models are based on various aspects of consumer behavior and can be built either for a specific person (Buffs) or a whole segment of similar individuals (Ghosts) (Adomavicius et al., 2008). While descriptive and predictive models represent advanced profiling techniques, more research is still needed to understand what models (descriptive

Table 5 Differences between Buff and Ghost consumers

| | Buffs | Ghosts |
| --- | --- | --- |
| Data Available Prior to CA Interaction | Demographic data, prior transaction data, and tracking of consumer online behaviour across devices and interactions | None |
| Data Collected During CA Interaction | Answers to explicit questions; session information (clickstream data); emotions and sentiments | Answers to explicit questions; session information (clickstream data); emotions and sentiments |
| Data Retention After CA Interaction | Full history of interaction and clickstream data | Anonymized data is retained and aggregated with other anonymized data to update aggregate profiles for Ghost consumers |
| Profile Available Prior to CA Interaction | Identifiable-individual of the Buff (rich) | Anonymized-aggregate of other Ghosts (poor) |
| Profile Available After CA Interaction | Identifiable-individual of the Buff (richer, since more data were collected) | Anonymized-aggregate of this Ghost and others (less poor, since more data were collected) |
| Personalized Interaction | Hyper-personalized | Mass-personalized |


vs predictive) are more effective for CAs interacting with different types of consumers under different conditions. Similarly, more research is also needed to develop new models (e.g., prescriptive) of consumer profile building in this new hyper-private environment.
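To make the signature idea concrete, the example signature quoted earlier ("top 5 most frequently browsed product categories over the last 30 days") can be computed from a simple timestamped event stream. This sketch is ours and far simpler than the Hancock structures of Cortes et al. (2000):

```python
# Sketch of a behavioral "signature": top-n browsed categories over a
# trailing window, computable for a Buff (identified) or a typical
# Ghost (anonymized aggregate) profile alike.
from collections import Counter
from datetime import datetime, timedelta

def top_categories(events, now, n=5, window_days=30):
    """events: iterable of (timestamp, category) browsing records."""
    cutoff = now - timedelta(days=window_days)
    counts = Counter(cat for ts, cat in events if ts >= cutoff)
    return [cat for cat, _ in counts.most_common(n)]

events = [
    (datetime(2020, 1, 30), "shoes"),
    (datetime(2020, 1, 29), "shoes"),
    (datetime(2020, 1, 28), "jackets"),
    (datetime(2019, 11, 1), "books"),  # falls outside the 30-day window
]
signature = top_categories(events, now=datetime(2020, 1, 31))
# -> ["shoes", "jackets"]
```

Because the signature is a small derived summary rather than the raw stream, it is one plausible retention format for the anonymized aggregate profiles described above.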

Generating responses for Buff and Ghost consumers

As the data are collected and the profiles of different consumers are iteratively built, CAs need to leverage the information they have in order to engage in personalized interactions with consumers. The first step here is matchmaking, which involves the identification of products, services, and information that accurately match the profile of consumers (Adomavicius & Tuzhilin, 2005; Johar et al., 2014; Mobasher, 2007). This matching would imply personalizing and guiding the conversation to align with what the CA has assessed as the consumer's behaviors, preferences, emotions, and needs, while minimizing the consumer's effort (e.g., if a Buff customer interacts with a CA every week to ask about the status of their investment portfolio, the CA can anticipate this, tailor the conversation, and provide this piece of information unprompted). After matching consumer preferences and selecting an appropriate response, CAs must engage in conversations with consumers to present the personalized information. (This dynamic two-way conversational channel is unique to CAs; web personalization, for instance, is not a dialogue but rather a one-way channel through which consumers receive personalized recommendations for related products or services.)

Matchmaking Existing approaches differ in that they use different sources of information to match consumers' preferences for products, services, or information (Adomavicius et al., 2008). These approaches include (a) content-based, which typically uses the consumer's stored profile of historical ratings, viewing behavior, and purchases to match preferences; (b) social networks, which leverages the social connections of consumers to match their preferences, assuming that people who are friends with one another tend to have similar characteristics and preferences; (c) collaborative filtering, which matches consumers' preferences based on the preferences of others who exhibit similar behaviors/preferences as the consumer; and (d) a hybrid approach combining the above methods (Adomavicius & Tuzhilin, 2005; Adomavicius & Gupta, 2009; Arazy, Kumar, & Shapira, 2010; Li & Karahanna, 2015). The type of approach leveraged by the CA will heavily depend on the type of data used to build consumer profiles (Li & Karahanna, 2015), and consequently on whether the consumer is a Buff or a Ghost. For example, while collaborative filtering can be used for both types of consumers, social network approaches would only be feasible for Buff consumers, and content-based approaches for Ghost consumers would be constrained to data extracted from session context modelling because of the lack of historical data on these consumers.
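A minimal user-based collaborative-filtering sketch (profiles, items, and weights below are invented, and production recommenders are far richer) shows how a Ghost's sparse, session-built preference vector can be matched against anonymized aggregate profiles:

```python
# Minimal user-based collaborative filtering over sparse preference
# dicts. All profiles, item names, and ratings are invented.
import math

def cosine(a, b):
    """Cosine similarity between two sparse preference dicts."""
    dot = sum(a[k] * b[k] for k in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(target, profiles, k=1):
    """Suggest items unseen by `target`, weighted by the similarity
    of the k most similar (aggregate) profiles."""
    ranked = sorted(profiles, key=lambda p: cosine(target, p), reverse=True)
    scores = {}
    for p in ranked[:k]:
        w = cosine(target, p)
        for item, rating in p.items():
            if item not in target:
                scores[item] = scores.get(item, 0.0) + w * rating
    return sorted(scores, key=scores.get, reverse=True)

ghost = {"trail-shoes": 1.0, "water-bottle": 0.5}  # built this session
aggregates = [
    {"trail-shoes": 0.9, "water-bottle": 0.6, "headlamp": 0.8},
    {"novel-a": 1.0, "novel-b": 0.7},
]
recs = recommend(ghost, aggregates)  # -> ["headlamp"]
```

Nothing identifying the Ghost is needed: only the session vector and the anonymized aggregates participate in the match.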

Presenting personalized response The natural language dialog interaction style of CAs offers the possibility not only to personalize the response to match the consumer's product or service preferences, but also to personalize the conversation in an anthropomorphic way (e.g., tone, style, accent, humor, sociability) to further match the consumer's personality and emotions. Therefore, in addition to identifying what to respond to the consumer through the various matchmaking approaches, it is important to identify how to respond to the consumer.7 According to Shum et al. (2018, p. 6), a CA "may generate responses in attractive styles (e.g., having a sense of humor) that improve user engagement. It needs to guide conversation topics and manage an amicable relationship in which the user feels he or she is well understood and is inspired to continue to converse with the bot," which is important for both types of consumers but more so for engaging Ghost consumers and understanding their needs. Generating responses that reflect a consistent CA personality makes the conversation easier and more predictable for the consumer and generates trust (Shum et al. 2018). As such, CA personality information (e.g., age, gender, etc.) is often incorporated into the process of generating responses (e.g., see Li et al. 2006; Mathews et al. 2015; Shum et al. 2018).

In addition, a conversation style that embodies both IQ, in terms of the accuracy of the responses provided, and EQ, in terms of emotion appropriateness, can facilitate the generation of trust that what is being presented in the conversation is accurate, fair, explainable, and made benevolently in the interest of the consumer. While the personalization of "what" is delivered to the user may be hampered by the limited profile data of Ghost consumers, the personalization of "how" the conversation is conducted is likely less hampered, since it relies heavily on session information and the personality settings of the CA.

Personalized interactions

As illustrated in Fig. 1, a personalized interaction is the resulting outcome of an iterative process of understanding consumers and generating responses that takes place as CAs chat with Buffs and Ghosts. Given that Buff consumers allow firms to implicitly collect their personal data and create identifiable profiles based on their historical interactions with the firm, the interactions they have with CAs will be hyper-personalized, more accurate, and generated with content-based or

7 Clearly, there are many different things impacting how CAs respond to consumers (e.g., the strategic framing of a message and its linguistic features). Due to space limitations, however, we focus our discussion of the "how" on anthropomorphism.


social network approaches, or a hybrid of the two. Such hyper-personalization makes it easier and faster for customers to interface with CAs, since their existing patterns can be used to anticipate future requests, identify information relevant to their needs, and recommend products or services that match their preferences, thus reducing search costs. In contrast, Ghost consumers will enjoy interactions that are tailored based on mass-personalization. Since the personal preferences of Ghost consumers are not stored in consumer profiles, CAs will have to engage in conversations and elicit their stated preferences, needs, and prior behaviors during each conversation in order to provide them with more personalized experiences. Personalizing the interaction will then be based on the data elicited through the conversation (the more data collected by the CA, the more personalized the interaction will be) and on collaborative filtering, where the consumer's profile (dynamically built during the interaction) will be matched with aggregate profiles of other similar consumers. The interaction will therefore be personalized based on patterns that emerge from these other aggregate profiles.

The discussion above is meant to be illustrative of differences and similarities in CA design across the two consumer groups and is meant to be neither exhaustive nor comprehensive. In general, more research is needed to inform which design elements result in the creation of engaging and personalized, but also ethical and unobtrusive, CAs. Table 6 presents some illustrative research questions along these lines, organized around our framework. Deriving the personalization benefits of CAs without being "creepy," while morally attending to the different needs of the two groups, is important because, despite successes, there have been many failures, and many challenges still remain in designing CAs that provide high-quality interactions (Ben Mimoun, Poncin, & Garnier, 2012; Chakrabarti & Luger, 2015; Gnewuch et al., 2017;

Table 6 Research agenda for personalizing consumer interactions through conversational agents in the era of hyper-privacy

Understanding Consumers

Data collection (explicit/implicit; click stream; time spent on page):
- What data collection techniques, or combinations, are most effective in collecting the type of data needed to build consumer profiles and personalize interactions for Ghost consumers?
- What is the right combination of explicit and implicit methods that minimizes effort for Ghost consumers without giving rise to privacy concerns (e.g., explicit methods require more effort but generate fewer privacy concerns, whereas session modelling (implicit) methods do not involve effort but can give rise to privacy concerns)?

Building profile (individual/aggregate; identifiable/anonymized; session context modelling; emotion and sentiment models):
- What models (descriptive vs. predictive) are more effective for CAs interacting with Buff and with Ghost consumers, and how do these vary depending on the task (e.g., whether it is transactional or informational) and context (e.g., healthcare vs. shopping vs. financial)?
- Development of new models (e.g., prescriptive) of consumer profile building that rely heavily on explicit methods of questioning, session context modelling, and emotion and sentiment tracking.

Generating Responses

Matchmaking (content-based; collaborative; social network; hybrid):
- What matching methods are most effective for Ghost consumers? And does this vary by the nature of the task and context?
- Under what conditions is it more effective to match the intent and preferences of Ghost consumers based on collaborative, content-based, and social networks rather than only using a typical hybrid approach (content-based and social networks)?

Presenting personalized response* (anthropomorphism; transparency; fairness; explainability; IQ and EQ):
- What linguistic and dialog design practices can CAs use to engage Buff and Ghost consumers?
- How does the effectiveness of these vary between the two types of consumers?
- What is the right level of EQ for Buff and Ghost consumers? Does too much lead to privacy concerns?

Personalized Interactions (mass-/hyper-personalization; accuracy; engagement; trust; self-disclosure):
- Which models and combinations of models (data collection, profile building, matching, response generation) provide more accurate and engaging interactions and a higher level of perceived personalization? Do these vary for Buff and Ghost consumers?
- To what extent is personalization for Buff consumers more accurate than for Ghost consumers? How does this vary by task and by context?
- How does the level of personalization (or perceived personalization) influence (a) trust, (b) the privacy calculus, and (c) self-disclosure? How does this vary by Buff and Ghost consumers?
- How does the level of personalization (or perceived personalization) influence consumer outcomes (e.g., customer satisfaction, loyalty) and firm outcomes (e.g., sales)? Does this vary by Buff and Ghost consumer?

Note: * See Table 4 for ethical anthropomorphism, transparency, and fairness research questions


Jenkins et al., 2007; McTear, Callejas, & Griol, 2016; Schuetzler et al., 2014; Shechtman & Horowitz, 2003).

Conclusion

The Web is an evolving, complex system that impacts both firms and consumers. We review the evolution of the Web and note that Web changes are driven by dynamics of information control embedded in market power disputes between firms and consumers. Based on this review, we suggest that the Web will change significantly in the next five to ten years. Stakeholder privacy violations will lead government agencies to enact new legislation, resulting in an information-sharing culture in which data sharing requires explicit opt-in: consumers will by default be Ghost consumers and will no longer allow firms to collect, use, and share their personal information with other organizations. As a result, firms will lose the ability to create in-depth profiles of consumers, and personalized practices like micro-targeting may be at risk.

To survive this coming change, marketers will have to incentivize consumers to remain "in the buff," as many are today, while also serving the needs of consumers who deny access and become Ghost consumers. We suggest that CAs will play an increasingly important role in helping firms market to both Ghost and Buff consumers. In particular, we argue that CAs can be used to understand and engage both Ghost and Buff consumers by developing personalized interactions for both groups (see Fig. 1), albeit in different ways. We also suggest that CA design may be instrumental in nudging consumers to self-disclose private information. CAs that do so well can become a source of differentiation and competitive advantage for the firm.

Appendix

Table 7 Existing studies on anthropomorphic features of conversational agents

Aggarwal & McGill (2007)
- Features: Smile
- Theory: Schema congruity
- Variables: IV: feature type; Mediator: anthropomorphic product perceptions; Moderator: affective tag; DV: product evaluation
- Results: The ability of consumers to anthropomorphize a product and their consequent evaluation of that product depend on the extent to which that product is endowed with characteristics congruent with the proposed human schema. Consumers' perception of the product as human mediates the influence of feature type on product evaluation. The affective tag attached to the specific human schema moderates the evaluation but not the successful anthropomorphizing of the product.
- Context: Marketing; Method: Experiment

Al-Natour et al. (2006)
- Features: Appearance; gender; personality; decision strategy; online
- Theory: Decisional guidance and speech act theory
- Variables: DVs: perceived similarity
- Results: Anthropomorphic features manifest desired personalities and behaviors.
- Context: E-commerce; Method: Experiment

Al-Natour et al. (2011)
- Features: Appearance; gender; personality; decision strategy; online
- Theory: Social psychology and HCI research
- Variables: IV: perceived similarity; DVs: usefulness, ease of use, trust, social presence, enjoyment
- Results: Perceived similarity influences enjoyment, social presence, trust, ease of use, and usefulness.
- Context: E-commerce; Method: Experiment

Araujo (2018)
- Features: Name; language style; framing; online
- Theory: Computers as social actors
- Variables: Mediator: social presence; DVs: satisfaction, mindless/mindful anthropomorphism, attitude, emotional connection
- Results: The usage of human-like language or name was sufficient to increase perception of the agent as being human-like. Adopting an intelligent frame does reduce perceptions of mindless anthropomorphism for machine-like agents, yet no such difference was found for the human-like agent. The usage of human-like cues had a significant positive influence on emotional connection.
- Context: E-commerce; Method: Experiment

Mimoun & Poncin (2015)
- Features: Gender; appearance; ethnicity; online
- Theory: Selling research
- Variables: IVs: playfulness, social presence; DVs: satisfaction, purchase intention, decision quality; Mediator: shopping value
- Results: Use of a CA increases shopping value; shopping value mediates the effects of playfulness and social presence on satisfaction and intentions (hedonic) and on decision quality (utilitarian).
- Context: E-commerce; Method: Survey

Ben Mimoun et al. (2017)
- Features: Gender; appearance; ethnicity; online
- Theory: —
- Variables: IVs: involvement, product familiarity, Internet skills, need for interaction; DV: e-consumer productivity
- Results: Objective e-consumer productivity depends on interaction with the CA. Individual characteristics affect perceived productivity, either independently from CA use (for involvement or product familiarity) or in interaction with CA use (for Internet skills and need for interaction).
- Context: E-commerce; Method: Experiment

Chattaraman, Kwon, & Gilbert (2012)
- Features: Appearance; online
- Theory: Computers as social actors and social response theory
- Variables: Mediators: trust, social support, perceived risk; DVs: trust, intention
- Results: Embedding a CA that serves search and navigational support functions leads to increased perceived social support, trust, and intention. The effect of CAs on intentions is mediated by perceived risks, and the effect on trust is mediated by perceived social support.
- Context: E-commerce; Method: Experiment

Ciechanowski et al. (2019)
- Features: Appearance; language style; online
- Theory: The idea of the uncanny valley
- Variables: DVs: affect, electromyography, respirometer, electrocardiography, and electrodermal activity
- Results: Participants experienced less uncanny effects and less negative affect in cooperation with a text chatbot than with an animated avatar chatbot. The text chatbot also induced less intense psychophysiological reactions.
- Context: Academy enrolment process and casual conversation; Method: Experiment

Elkins & Derrick (2013)
- Features: Tone; online
- Theory: Interpersonal adaptation theory
- Variables: DV: trust
- Results: Vocal pitch was inversely related to perceived trust, but temporally variant; vocal pitch early in the interview reflected trust. The CA was perceived as more trustworthy when smiling.
- Context: Mock-screening interview; Method: Experiment

Knijnenburg & Willemsen (2016)
- Features: Appearance; language style; online
- Theory: Human-computer interaction and agent-based interfaces
- Variables: Mediator: anthropomorphic beliefs; DVs: exploitation, use
- Results: Users of more humanlike agents try to exploit capabilities that were not signaled by the system. This severely reduces the usability of systems that look human but lack humanlike capabilities. Users of humanlike agents also form anthropomorphic beliefs about the system: they act humanlike towards the system and try to exploit typical humanlike capabilities they believe the system possesses.
- Context: Tourism; Method: Experiment

Li et al. (2019)
- Features: Voice; online
- Theory: Computers as social actors
- Variables: IV: space type; Mediator: self-awareness; DV: self-disclosure
- Results: Users were most willing to disclose information about their tastes and interests and least willing to disclose money information. Users in the living space were willing to disclose more information than those in the workspace, which was mediated by users' expectations for the reciprocal services of CAs rather than by the awareness of other persons or external factors.
- Context: Control and question-and-answer task; Method: Experiment

Louwerse et al. (2005)
- Features: Appearance; voice; gender; online
- Theory: Social agency theory and social cue hypothesis
- Variables: DVs: comprehension and impression scores
- Results: Participants preferred natural agents with natural voices. Although female agents with male voices formed an exception, this was explained by a stereotype effect.
- Context: Education; Method: Experiment

Table 7 (continued)

Mende et al. (2019)
- Features: Appearance; ethnicity; tone; name; gender; physical
- Theory: The idea of the uncanny valley
- Variables: Mediators: eeriness level, perceived threat to human identity, machinization of CA; Moderators: social belonging, food type; DVs: compensatory response, customer food consumption
- Results: Consumers display compensatory responses when they interact with CAs rather than a human. CAs elicit greater consumer discomfort (i.e., eeriness and a threat to human identity), which in turn results in the enhancement of compensatory consumption. Compensatory responses are mitigated when social belongingness is high, food is perceived as more healthful, and the CA is machinized (rather than anthropomorphized).
- Context: Services; Method: Experiment

Qiu & Benbasat (2009)
- Features: Appearance; tone; online
- Theory: Social relationship perspective
- Variables: Mediator: social presence; DVs: trust, enjoyment, intention
- Results: Anthropomorphism influences users' perceptions of social presence, which in turn enhances users' trusting beliefs, perceptions of enjoyment, and, ultimately, their intentions to use the agent as a decision aid.
- Context: E-commerce; Method: Experiment

van Doorn et al. (2017)
- Features: Tendency to anthropomorphize
- Theory: Social cognition and psychological ownership
- Variables: Mediators: social cognition, psychological ownership; Moderators: customer's relationship orientation, technology readiness; DVs: social presence
- Results: Proposes that social cognition and psychological ownership mediate the relationship between automated social presence (ASP) and service outcomes, and that key factors (relationship orientation, anthropomorphism, and technology readiness) moderate the effects of ASP.
- Context: Services; Method: Conceptual

Verhagen et al. (2014)
- Features: Appearance; personality; communication style; online
- Theory: Implicit personality, social response, emotional contagion, social interaction
- Variables: IVs: friendliness, expertise, smile; DVs: social presence, personalization, satisfaction
- Results: An empirical study confirms the cross-channel applicability of friendliness and expertise as determinants of social presence and personalization.
- Context: E-commerce; Method: Experiment

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

References

Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514.

Adjerid, I., Acquisti, A., & Loewenstein, G. (2018). Choice architecture, framing, and cascaded privacy choices. Management Science, 65(5), 2267–2290.

Adomavicius, G., & Tuzhilin, A. (2005). Personalization technologies: A process-oriented perspective. Communications of the ACM, 48(10), 83–90.

Adomavicius, G., & Tuzhilin, A. (2005). Toward the next generation of recommender systems: A survey of the state-of-the-art and possible extensions. IEEE Transactions on Knowledge & Data Engineering, 17(6), 734–749.

Adomavicius, G., & Gupta, A. (2009). Business computing. Emerald Group Publishing.

Adomavicius, G., Huang, Z., & Tuzhilin, A. (2008). Personalization and recommender systems. In Z.-L. Chen, S. Raghavan, P. Gray, & H. J. Greenberg (Eds.), State-of-the-art decision-making tools in the information-intensive age (pp. 55–107). https://doi.org/10.1287/educ.1080.0044

Adomavicius, G., & Tuzhilin, A. (2001). Using data mining methods to build customer profiles. Computer, 34(2), 74–82. https://doi.org/10.1109/2.901170

Aggarwal, P., & McGill, A. L. (2007). Is that car smiling at me? Schema congruity as a basis for evaluating anthropomorphized products. Journal of Consumer Research, 34(4), 468–479. https://doi.org/10.1086/518544

Aguirre, E., Roggeveen, A. L., Grewal, D., & Wetzels, M. (2016). The personalization-privacy paradox: Implications for new media. Journal of Consumer Marketing, 33(2), 98–110.

Al-Natour, S., Benbasat, I., & Cenfetelli, R. T. (2006). The role of design characteristics in shaping perceptions of similarity: The case of online shopping assistants. Journal of the Association for Information Systems, 7(12), 821–861.

Al-Natour, S., Benbasat, I., & Cenfetelli, R. T. (2011). The adoption of online shopping assistants: Perceived similarity as an antecedent to evaluative beliefs. Journal of the Association for Information Systems, 12(5), 347–374. https://doi.org/10.17705/1jais.00267

Altman, I. (1975). The environment and social behavior: Privacy, personal space, territory, and crowding.

Anderson, C. L., & Agarwal, R. (2011). The digitization of healthcare: Boundary risks, emotion, and consumer willingness to disclose


personal health information. Information Systems Research, 22(3),469-490.

Araujo, T. (2018). Living up to the chatbot hype: The influence of an-thropomorphic design cues and communicative agency framing onconversational agent and company perceptions. Computers inHuman Behavior, 85, 183-189.

Arazy, O., Kumar, N., & Shapira, B. (2010). A Theory-Driven DesignFramework for Social Recommender Systems. Journal of theAssociation for Information Systems, 11(9). Retrieved fromhttps://aisel.aisnet.org/jais/vol11/iss9/2

Arora, N., Dreze, X., Ghose, A., Hess, J. D., Iyengar, R., Jing, B., et al.(2008). Putting one-to-one marketing to work: Personalization, cus-tomization, and choice. Marketing Letters, 19(3–4), 305–321.https://doi.org/10.1007/s11002-008-9056-z.

Bakken, S. A., Moeller, K., & Sandberg, S. (2018). Coordination prob-lems in cryptomarkets: Changes in cooperation, competition andvaluation. European Journal of Criminology, 15(4), 442–460.

Barratt, M. J., Ferris, J. A., & Winstock, A. R. (2016). Safer scoring?Cryptomarkets, social supply and drug market violence.International Journal of Drug Policy, 35, 24–31.

Benjamin, V., Valacich, J. S., & Chen, H. (2019). DICE-E: A frameworkfor conducting Darknet identification, collection, evaluation withethics. MIS Quarterly, 43(1), 1–22.

Ben Mimoun, M. S., Poncin, I., & Garnier, M. (2012). Case study:Embodied virtual agents: An analysis on reasons for failure.Journal of Retailing and Consumer Services, 19, 605.

Mimoun, M. S. B., Poncin, I., & Garnier, M. (2017). Animated conver-sational agents and e-consumer productivity: The roles of agents andindividual characteristics. Information & Management, 54(5), 545–559.

Bennett, C. J., & Raab, C. (2006). The governance of privacy: Policyinstruments in global perspective. Cambridge: The MIT Press.

Bleier, A., & Eisenbeiss, M. (2015). The importance of trust for person-alized online advertising. Journal of Retailing, 91(3), 390–409.

Bleier, A., Harmeling, C. M., & Palmatier, R. W. (2019). Creating effec-tive online customer experiences. Journal of Marketing, 83(2), 98–119.

Bojei, J., Julian, C. C., Wel, C. A. B. C., & Ahmed, Z. U. (2013). Theempirical link between relationship marketing tools and consumerretention in retail marketing. Journal of Consumer Behaviour, 12(3),171–181. https://doi.org/10.1002/cb.1408.

Brandtzaeg, P. B., & Følstad, A. (2017). Why People Use Chatbots. In I.Kompatsiaris, J. Cave, A. Satsiou, G. Carle, A. Passani, E.Kontopoulos, … D. McMillan (Eds.), Internet Science (Vol.10673, pp. 377–392). https://doi.org/10.1007/978-3-319-70284-1_30

Brandtzaeg, P. B., & Følstad, A. (2018). Chatbots: Changing user needsandmotivations. Interactions, 25(5), 38–43. https://doi.org/10.1145/3236669.

Bucklin, R. E., & Sismeiro, C. (2003). A model of web site browsingbehavior estimated on clickstream data. Journal of MarketingResearch, 40(3), 249–267.

Burke, R. R. (1997). Do you see what I see? The future of virtual shop-ping. Journal of the Academy ofMarketing Science, 25(4), 352–360.

Bursztein, E. (2017). Understanding how people use private browsing. Retrieved February 6, 2019, from Elie Bursztein's site website: https://www.elie.net/blog/privacy/understanding-how-people-use-private-browsing

Caudevilla, F., Ventura, M., Fornís, I., Barratt, M. J., Vidal, C., Quintana, P., et al. (2016). Results of an international drug testing service for cryptomarket users. International Journal of Drug Policy, 35, 38–41.

Caudill, E. M., & Murphy, P. E. (2000). Consumer online privacy: Legal and ethical issues. Journal of Public Policy & Marketing, 19(1), 7–19.

Chakrabarti, C., & Luger, G. F. (2015). Artificial conversations for customer service chatter bots. Expert Systems with Applications, 42(20), 6878–6897. https://doi.org/10.1016/j.eswa.2015.04.067.

Chattaraman, V., Kwon, W. S., & Gilbert, J. E. (2012). Virtual agents in retail web sites: Benefits of simulated social interaction for older users. Computers in Human Behavior, 28(6), 2055–2066.

Chatterjee, P., Hoffman, D. L., & Novak, T. P. (2003). Modeling the clickstream: Implications for web-based advertising efforts. Marketing Science, 22(4), 520–541.

Chen, H., Sun, M., Tu, C., Lin, Y., & Liu, Z. (2016, November). Neural sentiment classification with user and product attention. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (pp. 1650–1659).

Chung, T. S., Wedel, M., & Rust, R. T. (2016). Adaptive personalization using social networks. Journal of the Academy of Marketing Science, 44(1), 66–87. https://doi.org/10.1007/s11747-015-0441-x.

Claycomb, C., & Martin, C. L. (2001). Building customer relationships: An inventory of service providers' objectives and practices. Marketing Intelligence & Planning, 19(6), 385–399. https://doi.org/10.1108/EUM0000000006109.

Cortes, C., Fisher, K., Pregibon, D., Rogers, A., & Smith, F. (2000). Hancock: A language for extracting signatures from data streams. In Proceedings of the 2000 ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 9–17.

Danaher, B., Dhanasobhon, S., Smith, M. D., & Telang, R. (2010). Converting pirates without cannibalizing purchasers: The impact of digital distribution on physical sales and internet piracy. Marketing Science, 29(6), 1138–1151.

Daugherty, P. R., & Wilson, H. J. (2018). Human + machine: Reimagining work in the age of AI. Retrieved from http://public.eblib.com/choice/publicfullrecord.aspx?p=5180063

Deighton, J. (1997). Commentary on "Exploring the implications of the internet for consumer marketing". Journal of the Academy of Marketing Science, 25(4), 347–351.

Dholakia, N., & Zwick, D. (2001). Privacy and consumer agency in the information age: Between prying profilers and preening webcams. Journal of Research for Consumers, 1(1).

Dinev, T., & Hart, P. (2006). An extended privacy calculus model for e-commerce transactions. Information Systems Research, 17(1), 61–80. https://doi.org/10.1287/isre.1060.0080.

Duxbury, S. W., & Haynie, D. L. (2019). Criminal network security: An agent-based approach to evaluating network resilience. Criminology, 57(2), 314–342.

Edvardsson, B., Tronvoll, B., & Gruber, T. (2011). Expanding understanding of service exchange and value co-creation: A social construction approach. Journal of the Academy of Marketing Science, 39(2), 327–339.

Elkins, A. C., & Derrick, D. C. (2013). The sound of trust: Voice as a measurement of trust during interactions with embodied conversational agents. Group Decision and Negotiation, 22(5), 897–913.

Gao, M., Liu, K., & Wu, Z. (2010). Personalisation in web computing and informatics: Theories, techniques, applications, and future research. Information Systems Frontiers, 12(5), 607–629. https://doi.org/10.1007/s10796-009-9199-3.

Garfinkel, S., Matthews, J., Shapiro, S. S., & Smith, J. M. (2017). Toward algorithmic transparency and accountability. Communications of the ACM, 60(9), 5. https://doi.org/10.1145/3125780.

Ghose, A., Ipeirotis, P. G., & Li, B. (2012). Designing ranking systems for hotels on travel search engines by mining user-generated and crowdsourced content. Marketing Science, 31(3), 493–520.

Givon, M., Mahajan, V., & Muller, E. (1995). Software piracy: Estimation of lost sales and the impact on software diffusion. Journal of Marketing, 59(1), 29–37.

Gnewuch, U., Morana, S., & Maedche, A. (2017). Towards designing cooperative and social conversational agents for customer service. In Proceedings of the International Conference on Information Systems (ICIS).


Hann, I. H., Hui, K. L., Lee, S. Y. T., & Png, I. P. (2008). Consumer privacy and marketing avoidance: A static model. Management Science, 54(6), 1094–1103.

Holt, T. J., Smirnova, O., & Chua, Y. T. (2016). Exploring and estimating the revenues and profits of participants in stolen data markets. Deviant Behavior, 37(4), 353–367.

Hosanagar, K. (2019). A human's guide to machine intelligence: How algorithms are shaping our lives and how we can stay in control. New York: Viking.

Huang, M.-H., & Rust, R. T. (2017). Technology-driven service strategy. Journal of the Academy of Marketing Science, 45(6), 906–924. https://doi.org/10.1007/s11747-017-0545-6.

Jain, S. (2008). Digital piracy: A competitive analysis. Marketing Science, 27(4), 610–626.

Jardine, E. (2018). Tor, what is it good for? Political repression and the use of online anonymity-granting technologies. New Media & Society, 20(2), 435–452.

Jenkins, M.-C., Churchill, R., Cox, S., & Smith, D. (2007). Analysis of user interaction with service oriented chatbot systems. Proceedings of the 12th International Conference on Human-Computer Interaction: Intelligent Multimodal Interaction Environments, 76–83. Retrieved from http://dl.acm.org/citation.cfm?id=1769590.1769600

Johar, M., Mookerjee, V., & Sarkar, S. (2014). Selling vs. profiling: Optimizing the offer set in web-based personalization. Information Systems Research, 25(2), 285–306. https://doi.org/10.1287/isre.2014.0518.

Kehr, F., Kowatsch, T., Wentzel, D., & Fleisch, E. (2015). Blissfully ignorant: The effects of general privacy concerns, general institutional trust, and affect in the privacy calculus. Information Systems Journal, 25(6), 607–635.

Knijnenburg, B. P., & Willemsen, M. C. (2016). Inferring capabilities of intelligent agents from their external traits. ACM Transactions on Interactive Intelligent Systems, 6(4), 28:1–28:25. https://doi.org/10.1145/2963106.

Kumar, N., & Benbasat, I. (2006). Research note: The influence of recommendations and consumer reviews on evaluations of websites. Information Systems Research, 17(4), 425–439. https://doi.org/10.1287/isre.1060.0107.

Ladegaard, I. (2019). Crime displacement in digital drug markets. International Journal of Drug Policy, 63, 113–121.

Lamberton, C., & Stephen, A. T. (2016). A thematic exploration of digital, social media, and mobile marketing: Research evolution from 2000 to 2015 and an agenda for future inquiry. Journal of Marketing, 80(6), 146–172.

Lambrecht, A., & Tucker, C. (2013). When does retargeting work? Information specificity in online advertising. Journal of Marketing Research, 50(5), 561–576.

Langer, E. J. (1992). Matters of mind: Mindfulness/mindlessness in perspective. Consciousness and Cognition, 1(3), 289–305. https://doi.org/10.1016/1053-8100(92)90066-J.

Larivière, B., Bowen, D., Andreassen, T. W., Kunz, W., Sirianni, N. J., Voss, C., et al. (2017). "Service encounter 2.0": An investigation into the roles of technology, employees and customers. Journal of Business Research, 79, 238–246. https://doi.org/10.1016/j.jbusres.2017.03.008.

Leong, B., & Selinger, E. (2019). Robot eyes wide shut: Understanding dishonest anthropomorphism. Proceedings of the Conference on Fairness, Accountability, and Transparency - FAT* '19, 299–308. https://doi.org/10.1145/3287560.3287591

Li, H., Sarathy, R., & Xu, H. (2011). The role of affect and cognition on online consumers' decision to disclose personal information to unfamiliar online vendors. Decision Support Systems, 51(3), 434–445.

Li, S., & Karahanna, E. (2015). Online recommendation systems in a B2C e-commerce context: A review and future directions. Journal of the Association for Information Systems, 16(2), 72–107. https://doi.org/10.17705/1jais.00389.

Li, W., Chen, H., & Nunamaker, J. F. (2016). Identifying and profiling key sellers in cyber carding community: AZSecure text mining system. Journal of Management Information Systems, 33(4), 1059–1086.

Louwerse, M. M., Graesser, A. C., Lu, S., & Mitchell, H. H. (2005). Social cues in animated conversational agents. Applied Cognitive Psychology, 19(6), 693–704.

Mannila, H., Toivonen, H., & Verkamo, A. I. (1997). Discovery of frequent episodes in event sequences. Data Mining and Knowledge Discovery, 1(3), 259–289.

Martin, K. D., & Murphy, P. E. (2017). The role of data privacy in marketing. Journal of the Academy of Marketing Science, 45(2), 135–155. https://doi.org/10.1007/s11747-016-0495-4.

Mathews, A., Xie, L., & He, X. (2015). SentiCap: Generating image descriptions with sentiments. Conference of the Association for the Advancement of Artificial Intelligence (AAAI).

McAfee, A., & Brynjolfsson, E. (2012). Big data: The management revolution. Harvard Business Review, October 2012. Retrieved from https://hbr.org/2012/10/big-data-the-management-revolution

McTear, M., Callejas, Z., & Griol, D. (2016). The conversational interface: Talking to smart devices (1st ed.). Springer Publishing Company, Incorporated.

Mende, M., Scott, M. L., van Doorn, J., Grewal, D., & Shanks, I. (2019). Service robots rising: How humanoid robots influence service experiences and elicit compensatory consumer responses. Journal of Marketing Research, 56(4), 535–556. https://doi.org/10.1177/0022243718822827.

Mimoun, M. S. B., & Poncin, I. (2015). A valued agent: How ECAs affect website customers' satisfaction and behaviors. Journal of Retailing and Consumer Services, 26, 70–82.

Mobasher, B. (2007). The adaptive web (P. Brusilovsky, A. Kobsa, & W. Nejdl, Eds.). Retrieved from http://dl.acm.org/citation.cfm?id=1768197.1768201

Mobasher, B., Cooley, R., & Srivastava, J. (2000). Automatic personalization based on web usage mining. Communications of the ACM, 43(8), 142–151. https://doi.org/10.1145/345124.345169.

Montgomery, A. L., Li, S., Srinivasan, K., & Liechty, J. C. (2004). Modeling online browsing and path analysis using clickstream data. Marketing Science, 23(4), 579–595.

Moon, Y. (2000). Intimate exchanges: Using computers to elicit self-disclosure from consumers. Journal of Consumer Research, 26(4), 323–339. https://doi.org/10.1086/209566.

Moon, Y. (2003). Don't blame the computer: When self-disclosure moderates the self-serving bias. Journal of Consumer Psychology, 13(1/2), 14.

Mori, M., MacDorman, K. F., & Kageki, N. (2012). The uncanny valley [from the field]. IEEE Robotics & Automation Magazine, 19(2), 98–100.

Murthi, B. P. S., & Sarkar, S. (2003). The role of the management sciences in research on personalization. Management Science, 49(10), 1344–1362.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103. https://doi.org/10.1111/0022-4537.00153.

Nass, C., Steuer, J., & Tauber, E. R. (1994). Computers are social actors. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 72–78. https://doi.org/10.1145/191666.191703

Neff, G., & Nagy, P. (2016). Talking to bots: Symbiotic agency and the case of Tay. International Journal of Communication, 10, 17.

Niu, L., Yan, X.-W., Zhang, C.-Q., & Zhang, S.-C. (2002). Product hierarchy-based customer profiles for electronic commerce recommendation. Proceedings of the International Conference on Machine Learning and Cybernetics, 2, 1075–1080. https://doi.org/10.1109/ICMLC.2002.1174549


Padmanabhan, B., Zheng, Z., & Kimbrough, S. O. (2001). Personalization from incomplete data: What you don't know can hurt. In Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD-2001), 154–163.

Peterson, R. A., Balasubramanian, S., & Bronnenberg, B. J. (1997).Exploring the implications of the internet for consumer marketing.Journal of the Academy of Marketing Science, 25(4), 329.

Phelps, J., Nowak, G., & Ferrell, E. (2000). Privacy concerns and consumer willingness to provide personal information. Journal of Public Policy & Marketing, 19(1), 27–41.

Picard, R. W. (1997). Affective computing. Cambridge, MA: The MIT Press.

Qiu, L., & Benbasat, I. (2009). Evaluating anthropomorphic product recommendation agents: A social relationship perspective to designing information systems. Journal of Management Information Systems, 25(4), 145–182. https://doi.org/10.2753/MIS0742-1222250405.

Sahni, N. S., Wheeler, S. C., & Chintagunta, P. (2018). Personalization in email marketing: The role of noninformative advertising content. Marketing Science, 37(2), 236–258. https://doi.org/10.1287/mksc.2017.1066.

Sankar, R., & Balakrishnan, K. (2016). Implementation of an inquisitive chatbot for database supported knowledge bases. Sadhana, 41(10), 6.

Schuetzler, R. M., Grimes, M., Giboney, J. S., & Buckman, J. (2014). Facilitating natural conversational agent interactions: Lessons from a deception experiment. Human Computer Interaction, 17.

Schumann, J. H., von Wangenheim, F., & Groene, N. (2014). Targeted online advertising: Using reciprocity appeals to increase acceptance among users of free web services. Journal of Marketing, 78(1), 59–75. https://doi.org/10.1509/jm.11.0316.

Shechtman, N., & Horowitz, L. M. (2003). Media inequality in conversation: How people behave differently when interacting with computers and people. Proceedings of the Conference on Human Factors in Computing Systems - CHI '03, 281. https://doi.org/10.1145/642611.642661

Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. London; New York: Wiley.

Shum, H., He, X., & Li, D. (2018). From Eliza to XiaoIce: Challenges and opportunities with social chatbots. Frontiers of Information Technology & Electronic Engineering, 19(1), 10–26. https://doi.org/10.1631/FITEE.1700826.

Sinha, R. K., Machado, F. S., & Sellman, C. (2010). Don't think twice, it's all right: Music piracy and pricing in a DRM-free environment. Journal of Marketing, 74(2), 40–54.

Sinha, R. K., & Mandel, N. (2008). Preventing digital music piracy: Thecarrot or the stick? Journal of Marketing, 72(1), 1–15.

Smith, H. J. (2001). Information privacy and marketing: What the US should (and shouldn't) learn from Europe. California Management Review, 43(2), 8–33.

Smith, H. J., Dinev, T., & Xu, H. (2011). Information privacy research: An interdisciplinary review. MIS Quarterly, 35(4), 989–1016.

Spangler, W. E., Hartzel, K. S., & Gal-Or, M. (2006). Exploring the privacy implications of addressable advertising and viewer profiling. Communications of the ACM, 49(5), 119–123. https://doi.org/10.1145/1125944.1125951.

Staff, F. M. (2016). Try Hello Hipmunk™ to chat before you pack. Retrieved February 6, 2019, from Tailwind by Hipmunk website: /tailwind/lets-talk-travel-try-hello-hipmunk-to-chat-before-you-pack/.

Stewart, D. W. (2017). A comment on privacy. Journal of the Academy of Marketing Science, 45(2), 156–159.

Summers, C. A., Smith, R. W., & Reczek, R. W. (2016). An audience of one: Behaviorally targeted ads as implied social labels. Journal of Consumer Research, 43(1), 156–178.

Tam, K. Y., & Ho, S. Y. (2006). Understanding the impact of web personalization on user information processing and decision outcomes. MIS Quarterly, 30(4), 865–890. https://doi.org/10.2307/25148757.

Trusov, M., Ma, L., & Jamal, Z. (2016). Crumbs of the cookie: User profiling in customer-base analysis and behavioral targeting. Marketing Science, 35(3), 405–426.

Tur, G., & Deng, L. (2011). Intent determination and spoken utterance classification. In Spoken language understanding: Systems for extracting semantic information from speech (pp. 93–118). Chichester: Wiley.

Turkle, S. (2017). Alone together: Why we expect more from technology and less from each other. New York: Basic Books.

Tuzhilin, A. (2008). Personalization: The state of the art and future directions. In Business Computing. Retrieved from https://www.researchgate.net/publication/302958247_Personalization_The_state_of_the_art_and_future_directions

Urban, G. L., Liberali, G., MacDonald, E., Bordley, R., & Hauser, J. R. (2013). Morphing banner advertising. Marketing Science, 33(1), 27–46.

Van Buskirk, J., Roxburgh, A., Bruno, R., Naicker, S., Lenton, S., Sutherland, R., et al. (2016). Characterising dark net marketplace purchasers in a sample of regular psychostimulant users. International Journal of Drug Policy, 35, 32–37.

van Doorn, J., Mende, M., Noble, S. M., Hulland, J., Ostrom, A. L., Grewal, D., & Petersen, J. A. (2017). Domo arigato Mr. Roboto: Emergence of automated social presence in organizational frontlines and customers' service experiences. Journal of Service Research, 20(1), 43–58. https://doi.org/10.1177/1094670516679272.

Verhagen, T., van Nes, J., Feldberg, F., & van Dolen, W. (2014). Virtual customer service agents: Using social presence and personalization to shape online service encounters. Journal of Computer-Mediated Communication, 19(3), 529–545. https://doi.org/10.1111/jcc4.12066.

Vinyals, O., & Le, Q. (2015). A neural conversational model. arXiv preprint arXiv:1506.05869.

Vlahos, J. (2018). Barbie wants to get to know your child. The New York Times. Retrieved from https://www.nytimes.com/2015/09/20/magazine/barbie-wants-to-get-to-know-your-child.html

Wang, W., & Benbasat, I. (2008). Attributions of trust in decision support technologies: A study of recommendation agents for e-commerce. Journal of Management Information Systems, 24(4), 249–273.

Warren, S. D., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193–220.

Westin, A. F. (1967). Privacy and freedom. New York: Atheneum.

Xu, H., Teo, H.-H., Tan, B. C. Y., & Agarwal, R. (2010). The role of push-pull technology in privacy calculus: The case of location-based services. Journal of Management Information Systems, 26(3), 135–174.

Yang, Z., Yang, D., Dyer, C., He, X., Smola, A., & Hovy, E. (2016). Hierarchical attention networks for document classification. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies.

Yu, J., Hu, P. J. H., & Cheng, T. H. (2015). Role of affect in self-disclosure on social network websites: A test of two competing models. Journal of Management Information Systems, 32(2), 239–277.

Yue, T. W., Wang, Q. H., & Hui, K. L. (2019). See no evil, hear no evil? Dissecting the impact of online hacker forums. MIS Quarterly, 43(1), 73.

Publisher's note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
