
Ethical Analysis Applied to User Experience

Introduction
    Setting the Stage: User Experience

Case Studies
    Case 1 | How “Smart” Should the Car be?: Technology & Consequence
    Case 2 | Return of the Pop-Ups: Advertising & Principles
    Case 3 | Ethics of UX Research: Facebook & Rights
    Case 4 | Shared & (almost) Stolen: Instagram & Justice
    Case 5 | Addictive UX: MMORPG & Virtues

Summary

References

MGMT705 | Assignment 1

Joe Jancsics


Ethical Analysis Applied to User Experience

The purpose of this essay is to evaluate ethical issues within the user experience (UX)

profession. The ethical code of conduct provided by the User Experience Professionals Association

(UXPA) will be referenced to set the basis for case analysis, along with an overview of what UX

represents as a discipline. Five unique ethical issues (cases) related to the UX field will be

summarized and analyzed within the context of the following five ethical foundations: consequences, principles, rights, justice, and virtues. Each of the five case discussions will be restricted to a single foundation; no case will be evaluated against more than one. Cases are aligned with the most suitable ethical foundation for

the specific issues in question. Psychological dimensions and their potential impact on the decision-

making process will be applied, along with a recommended solution to the ethical issue at hand for

the respective case.


Introduction

Setting the Stage: User Experience

The User Experience Professionals Association (UXPA) states, "User experience professionals

may do a broad range of work from interviews and observations to creating wireframes for a product

or service. Some have a design background, and some have a library science degree." ("UXPA," n.d.,

About UXPA Section.) Further exploration of their website leads to the UXPA Code of Professional

Conduct (n.d.), which states the following ethical principles:

Act in the best interest of everyone

Be honest with everyone

Do no harm and if possible provide benefits

Act with integrity

Avoid conflicts of interest

Respect privacy, confidentiality, and anonymity

Provide all resultant data (para. 8, UXPA Code of Professional Conduct)

The UXPA goes into further detail of the abovementioned principles by providing examples of each.

User experience (UX) designers are involved in a wide range of responsibilities related to the design

process for products and services. A UX designer working alone is often expected to act as a

researcher, usability tester, and designer. Recent demand-driven trends have further stretched the UX

role. This sometimes includes participating in coding and development and collaborating with

marketing teams to share knowledge about user (consumer) research insights for positioning

products and services.


The UX designer approaches problems from various angles and a scenario-driven focus, with

the ultimate objective being to craft the best experience possible for the user: essentially, one in which the technology helps the user accomplish a goal in a way they find enjoyable. This often includes, but is not limited to, the immediate user interface (UI) of a

technology, and any/all implications that go beyond the visible controls. A basic example of potential

complexity would be the design for a privacy setting control on a social media platform. A user may

change a setting, but the resulting change goes beyond the immediate UI as it dictates what all other

users on the social platform will have access to, some of which may not be in alignment with the

intended goal of the user. A responsible UX designer, or team of practitioners, will work to create an

experience that communicates the appropriate feedback to the user upon altering a critical setting,

protecting them from harm without coming off as unpleasant or annoying.

The UX designer role exists within a wide range of industries such as banking, consumer

electronics, healthcare, automobiles, transportation systems, and the list goes on. Chances are most

products, services, and software you engage with have been shaped in some way by a user experience

designer, especially within the last decade. The term 'User Experience Designer' was coined

sometime around 1995 by Don Norman when he worked for Apple (Norman, Miller, &

Henderson, 1995). Since then the term has been so frequently misused that the true meaning has

nearly been lost, and to resolve confusion it helps to mention a few of the things that it is not: Web

design, user-centered design, graphic design, usability testing, customer satisfaction, information

architecture, etc. (UX Design Defined, 2010). Part of the confusion around the term is that "user

experience design" is a broad discipline. Any attempt to pin it down to just aesthetics, research, or


prototyping would be ignoring the bigger picture and true definition. A summary definition

provided by Nielsen and Norman (n.d.) states, "'User experience' encompasses all aspects of the end-

user's interaction with the company, its services, and its products" (summary section, para. 1).

Most large organizations will create well-rounded UX teams with specialized members to

cover research, testing, prototype development, and the visual look and feel. Specializations aside, it's

expected for an individual UX designer to have basic competency in most, if not all, of these areas.

UX research methods are adapted from anthropology and sociology (Aretz, 2013). UX testing

methods rely on elements of cognitive psychology for determining the appropriate architecture and

load of information presented in a service, product, or system. Iterative rapid prototyping methods

are the UX designer’s tools to determine the best experience to allow users to complete tasks and

accomplish their respective goals.

UX is not a traditional line of work, but rather a relatively new field that is experiencing

tremendous growth in the climate of technology innovation. Employing organizations and designers

of all types frequently mislabel their own UX opportunities and experience, respectively. UI design,

research, testing - these things are components within the UX design discipline and process, but they

are not to be considered anything more than tasks toward a greater calling. One can easily argue

that a jewelry designer is a UX practitioner if their process and output incorporate the right

elements, and this idea will become more apparent with the emergence of wearable technology

devices. The fact that uxdesign.com had to publish a list of what it is not to segue into a 1,300-word

definition explaining what it is, with illustrations, underscores the continued existence of confusion

and challenges in providing a comprehensive overview (UX Design Defined, 2010). A field with


more history behind it, for example accounting or law, would probably require less introduction and

guided explanation to the reader. However, in an effort to adequately set the stage it is important to

understand the scope of the discipline, as it defines the context for the case selections and ethical

analysis hereinafter. For the purposes of this paper the ethical issues will be analyzed from a UX

team and/or practitioner perspective, focusing on research and design decisions and their implications for primary and secondary stakeholders.


Case Studies

Case 1 | How “Smart” Should the Car be?: Technology & Consequence

Everyone knows they shouldn't text and drive, but many do it anyway; oftentimes the stated intent of a user and their actual behavior do not match. Safety officials have called for solutions to

prevent distracted driving for some time now. The National Highway Traffic Safety Administration

(NHTSA) has published comprehensive voluntary guidelines for automakers to minimize in-vehicle

distractions. Scott Tibbitts, a former engineer who worked on technologies for NASA, was inspired

by a tragic event, "...he discovered that the executive with whom he was supposed to meet had been

killed that very morning in a car crash caused by a teenager, who, Mr. Tibbitts was told, was

texting" (Richtel, 2014, para. 5). Tibbitts went to work on a solution and launched a company

called Katasi. He developed a system that blocks incoming and outgoing texts and prevents phone calls while driving (Richtel, 2014). American Family Insurance invested in the technology, and

Sprint allowed Katasi to use their network for testing and development.

When cellphone carriers relied heavily on minutes-based revenues their target market was

drivers, and the more time people spent in the cars the more minutes they would use. In recent years

the model has shifted to smartphones with unlimited data plans, and carriers have been promoting

anti-texting campaigns. Various technological issues surfaced as Katasi evaluated the use cases further, such as how the technology would distinguish a driver from a passenger. Adding to Sprint's hesitation to participate was the possibility that the technology

may simply fail or have a random error, and with that there could be consequences. If you make a


promise to keep a driver safe by blocking the distractions, and a single text message slips through the

system, there could be major liability issues.

Automakers are important stakeholders in the equation as they are increasingly adding

'infotainment' systems that work to leverage features from smartphones. The most commonly

discussed problem is texting, but automakers are now putting smartphone-style features directly into the dashboard. Prior to

NHTSA publishing guidelines it seemed like automakers were chomping at the bit to arrive first

with a tablet-like experience in the dash, with social media apps included, and none of that is truly

the best approach from a UX practitioner perspective. "While these technologies deliver

unquestionable value and pleasure to the driver and passengers, they indisputably divide the

operator’s attention, distracting him or her from the stated purpose of driving, leading to life-

threatening situations (and that’s not even including texting while driving)" (Gribbons, 2013, The

Dangers of Distraction section). Anti-texting laws have been difficult to enforce because dialing the

phone or using a device for music is not considered an offense, and seeing a phone in hand is rarely

enough to determine the actual use. If these features are in the dashboard they don't necessarily

become less distracting; much of that outcome depends on responsible UX design.

Katasi serves as an example for an immediate solution to a potentially bigger problem, but

the proposed solution is perceived as a risk to carriers and a threat to technology advancement. Most

of the smart car and infotainment systems heavily leverage smartphone connectivity features; blocking the phone would essentially remove much more than just texting. This is a great example of how one decision within a technology system can affect the entire


landscape for advancement and innovation. If the phone is blocked, what features can the system no

longer perform? Could blocking the phone create significant problems in certain situations?

Ethical issue or decision:

Should data connections and non-driving features be blocked while driving?

Stakeholders:

Primary: Drivers, cell phone companies, Tibbitts (Katasi), automakers, OEM suppliers

Secondary: Family members, taxpayers, shareholders

Ethical Foundation - Consequences (Utilitarian):

Question: Should data connections and non-driving features be blocked while driving?

Stakeholder                     | Do Allow - Harms | Do Allow - Benefits | Do Not Allow - Harms | Do Not Allow - Benefits
Drivers - Primary               | 10               | 10                  | 15                   | 7
Cellphone Carriers - Primary    | 4                | 10                  | 20                   | 5
Katasi (Tibbitts) - Primary     | 2                | 0                   | 0                    | 3
Automakers - Primary            | 6                | 10                  | 8                    | 2
OEM Suppliers - Primary         | 3                | 8                   | 12                   | 2
Family Members - Secondary      | 3                | 8                   | 12                   | 4
Taxpayers - Secondary           | 5                | 0                   | 0                    | 3
Shareholders - Secondary        | 2                | 5                   | 8                    | 2
Totals                          | 35               | 51                  | 75                   | 28
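The column totals above can be reproduced with a short sketch; the scores are copied directly from the table, and the variable names are illustrative only rather than part of any formal utilitarian tool:

```python
# Minimal sketch of the utilitarian tally: sum each stakeholder's harm/benefit scores
# per column to compare allowing versus blocking the technology. Scores are taken
# from the table above; names are hypothetical.

# (stakeholder, allow_harm, allow_benefit, block_harm, block_benefit)
scores = [
    ("Drivers - Primary",            10, 10, 15, 7),
    ("Cellphone Carriers - Primary",  4, 10, 20, 5),
    ("Katasi (Tibbitts) - Primary",   2,  0,  0, 3),
    ("Automakers - Primary",          6, 10,  8, 2),
    ("OEM Suppliers - Primary",       3,  8, 12, 2),
    ("Family Members - Secondary",    3,  8, 12, 4),
    ("Taxpayers - Secondary",         5,  0,  0, 3),
    ("Shareholders - Secondary",      2,  5,  8, 2),
]

# Sum each of the four numeric columns across all stakeholders.
totals = [sum(row[i] for row in scores) for i in range(1, 5)]
print(totals)  # [35, 51, 75, 28] -> blocking carries the largest total harm (75)
```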

Referencing the table above allows us to measure consequences and results as they impact

stakeholders. Drivers are the most obvious primary stakeholders, and from a UX perspective they are

the user, and therefore they have the most weight in the table. The harm and benefit of allowing

data connections and non-driving features are measured as being equal because to some degree this

depends on individual behavior and random chance. Teens are the most likely to have accidents

while texting, but they have also been raised in a world with smartphones and have not been driving


very long. When age and experience are factored into the analysis, the harms and benefits appear balanced, or at least the analysis gains more context associated with allowing these technologies in

the vehicle. Not allowing data or other technologies could have harms related to the lack of ability to

call for help when in distress. Responsible drivers have a phone in their possession with the

expectation they can use it for an emergency. If a safety feature blocks the ability to call for help it

has potential to cause harm. The cellphone carriers face harm if phones are blocked while driving because the restriction will likely disappoint users. It is not hard to imagine the sales a carrier would lose if it decided to block data while driving while other carriers continue to allow

it. Responsible adults are most likely to choose the phone that works all the time.

Katasi (Tibbitts) was assigned less weight because the company only represents a small

number of people, mainly Tibbitts and a circle of partners, but he and his company are still

considered primary stakeholders. If technology is not blocked while driving it essentially puts him

out of business, or perhaps back to the drawing board. If a carrier will partner with him it will likely

benefit him more than the level of harm if his idea is rejected, but Katasi is assigned less weight

because it’s the smallest stakeholder. It should also be mentioned that the efforts they are making

help to raise awareness of the issues, which further promotes exploration of solutions.

Automakers and OEM suppliers would both be harmed if data connections were blocked

because it would restrict their ability to explore new experiences for users. The automakers are

impacted less because they can still offer vehicles. The OEM suppliers depend heavily on the

existence and refinement of technology within the vehicle because it is their primary business, and

therefore the harm consequences are highest if technology is blocked.


For secondary stakeholders, the families of drivers see significant harms if technology is blocked. Although there are benefits in the table for them, these are representative of

the less experienced drivers, or teens, where parents may desire to have the restriction in place. On

the other hand, if a teenager is being chased by someone in a road rage incident the potential harm

associated with not being able to contact help adds points to the overall largest harm of blocking

technologies. Taxpayers as secondary stakeholders pay for emergency response and NHTSA

guidelines, and if technology were blocked it would allow these funds to be used elsewhere. The final

secondary stakeholders are the shareholders of the companies involved (carriers, automakers, and

OEM suppliers). Potential innovation and growth would be slowed if restrictions were put in place,

but some level of risk exists if no action is taken, and this is reflected accordingly in the table.

Psychological Dimensions:

Moral awareness is evident given the magnitude of consequence and the potential to inflict serious

harm. The leading cause of death among teenagers is texting while driving, and studies show texting

drivers are 23 times more likely to have an accident ("Auto Safety," n.d.). Social consciousness is also

apparent in increased campaigns against distracted driving, and NHTSA guidelines have recently

caused many automotive-related stakeholders to take a more cautious approach rather than jamming

in new technologies. Euphemistic language played a role in the initial rush to implement technologies

quickly into automobiles. Terms like "Smart Car" and "Infotainment" minimized the ethical

concerns around things like driver distraction and overall safety.

Automakers seem to be slowing down their rush after being called out on some major

oversights in terms of UX. A rather harsh review of the Cadillac CUE touchscreen system stated,


"CUE would work much better if Cadillac was to bring back some key physical push buttons and

knobs for the climate and audio controls." (Gruener, 2013, IX Design and Distraction section). Don

Norman, who coined the term 'UX designer', was quoted on his impressions of BMW's iDrive system: "Look at BMW with iDrive. It was crazy, just crazy. Disaster. You could customize

everything. You could customize up to something like 700 variables." (Lombardi, 2007, para. 2).

The most obvious mistakes with the early infotainment systems seemed to involve a design approach

that lost sight of the environmental elements, primary user goals, and cognitive processing while

driving. Simply because you can present technologies to a user doesn't always mean you should. UX

and marketing can sometimes have conflicting interests, and usability suffers when the focus shifts

away from core user goals and towards adding more features.

For now the NHTSA guidelines are voluntary, but most automakers are aware that ignoring

them could be dangerous. Long-term it can be expected that if the guidelines eventually become

mandatory it could reflect poorly on those who strayed from the recommended path. This is a sign

of level two (conventional) cognitive moral development, where actions are based on conformity,

mutual expectations, social accord, and system maintenance (Trevino & Nelson, 2014, p. 85).

Cognitive barriers such as the confirmation trap can be recognized in the initial rush by

automakers to implement new technologies first. Along with this was the illusion of optimism and

illusion of control, displayed several years ago when Cadillac was offering classroom-style sessions on

how to use their CUE system. A well-designed user experience would ideally not require after-sales

classroom sessions to help a user operate the system.


Moral disengagement may be evident among automakers and cell providers because they have not

established a clear plan of action. While the automakers have NHTSA guidelines to follow, if they

choose, the cell carriers seem to promote awareness campaigns without any real design solutions.

Through the campaigns they may believe responsibility is diffused and blame is shifted onto users who continue to text while driving.

Proposed Solution:

The UXPA ethical principles contain significant language reinforcing a utilitarian

approach (UXPA Code of Professional Conduct, n.d.). When analyzing the issues within the

framework of the ethical principles of consequence it seems clear that blocking technologies entirely

has a high level of harm, but allowing restriction also offers a moderate level of benefit. This

indicates that the solution should not be focused on preventing a user from doing things entirely, but rather on taking a more holistic approach toward designing responsible solutions. With more hands-free features on the horizon, improved control systems, and speech recognition, the answer clearly lies in better design. Ideally, this would be a system with enough flexibility to enable parents to lock out texting while still allowing calls to certain numbers for emergencies while driving. As a UX practitioner it would not make sense to propose a

design within this paper, but it is easy to see reasons why increased awareness of the issues will lead

to better solutions in the future. Moving forward, the UX efforts should be focused on handling the

primary goal of driving safely, while minimizing the distraction caused by media or communications systems if/when the user needs them to accomplish a secondary goal.
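While no actual design is proposed here, a minimal sketch can illustrate the kind of flexibility described above (a parental texting lockout that still allows emergency calls); all names, numbers, and defaults are hypothetical placeholders, not any real carrier or automaker system:

```python
# Illustrative sketch only, not a proposed design: a hypothetical policy showing the
# flexibility described above, where texting is locked out while the vehicle moves
# but calls to designated emergency numbers remain possible.

from dataclasses import dataclass, field

@dataclass
class DrivingPolicy:
    block_texting: bool = True                                     # parent-controlled lockout
    allowed_numbers: set = field(default_factory=lambda: {"911"})  # hypothetical whitelist

    def text_permitted(self, vehicle_in_motion: bool) -> bool:
        # Texting is blocked only while the vehicle is in motion and the lockout is on.
        return not (vehicle_in_motion and self.block_texting)

    def call_permitted(self, number: str, vehicle_in_motion: bool) -> bool:
        # Whitelisted (emergency) numbers are always reachable, even while driving.
        return (not vehicle_in_motion) or (number in self.allowed_numbers)

# Example: a teen driver's phone while the car is moving.
policy = DrivingPolicy()
policy.allowed_numbers.add("555-0100")                        # hypothetical parent contact
print(policy.text_permitted(vehicle_in_motion=True))          # False
print(policy.call_permitted("911", vehicle_in_motion=True))   # True
```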


Case 2 | Return of the Pop-Ups: Advertising & Principles

“The Web is full of all kinds of annoying and distracting advertising tricks, but the

experience of having the close button fail to close something delivers a unique brand of torture.”

(Leonard, 2013, para. 3). This case focuses on the recent revival of pop-up advertisements and how

they mislead users into unintended actions. Sometimes buttons are created that appear to dismiss an advertisement, but when pressed they do not work. These are not glitches or accidents; they are intentionally designed to mislead users, tricking them into viewing pages not

aligned with their goals.

In addition to the common pop-up ads, some designers are manipulating users by the

design patterns they select and implement. Mitchell (2014) wrote about the psychology of waiting

and load times, illustrating how users can be manipulated into blaming a system or an application

based merely on the style of animation presented:

“It was from this perspective that I noted that custom loading animations can be valuable to

distract and entertain your users while content is retrieved. But I also noted a word of

warning. This warning pertained to a Facebook test indicating that when their users were

presented with a custom loading animation in the Facebook iOS app (left) they blamed the

app for the delay. But when users were shown the iOS system spinner (right), they were

more likely to blame the system itself.” (para. 4)

In a response posting on his own blog, UX designer Chris Kiess (2014) stated:

“We don’t often think too much about ethics as UX professionals. But, there are a variety of

reasons we should and areas of our profession where ethics can become pertinent. In the case


of the Facebook example above – assuming it is true – it seems this is only unethical once

you discover how to mislead the user and intentionally devise a means to do so.” (para. 4)

UX designers are forced to balance the pressures of business goals against the ethical principle

“Do no harm and if possible provide benefits” (UXPA Code of Professional Conduct, n.d., Ethical

Principles section). Further guidance is provided by UXPA in the example, “UX practitioners shall

not expose participants to any unreasonable physical, mental, or emotional stress.” (UXPA Code of

Professional Conduct, n.d., Examples of the Practice of the Principles section).

Ethical issue or decision:

Should UX designers intentionally mislead users?

Stakeholders:

Primary: Users, test participants, UX practitioner, organizations

Secondary: Marketing departments

Ethical Foundation - Principles (Deontological):

The primary components for a principles analysis are to do no harm, do unto others as you

would have them do unto you, and Kant’s categorical imperative: “Act as if the maxim of thy action

were to become by thy will a universal law of nature.” (Trevino & Nelson, 2014, p. 43, para. 5).

These deontological theories align with the third UXPA Ethical Principle “Do no harm and if

possible provide benefits” (“UXPA Code of Professional Conduct”, n.d., para. 8, Ethical Principles

section, 3rd list item). When thinking of the user, and the UX designer’s responsibility to create the

most suitable experience possible for them, it is clearly inappropriate to intentionally mislead users.


Pop-up ads and false buttons may seem harmless on a typical media website designed for information consumption, but what if everyone designed like this? What if everything was like this? Systems designed to help users accomplish goals exist in critical settings such as air-traffic control centers and emergency rooms; would false buttons or pop-up windows that mislead the user

ever be acceptable in those applications? A responsible UX practitioner would consider a wide

spectrum of users and their potential emotions. A designer does not know what mental state their

user will be in when they interact with a system, and even if it is a website focused on entertainment

the user might be one bad interaction away from complete frustration. The major moral principles

should guide a UX designer to produce an experience that will delight users, not frustrate or mislead

them.

Psychological Dimensions:

Depending on the goal of the user, a low magnitude of consequence may be perceived for experiences where little immediate or obvious potential harm is identified. The UX designer responsible for making a false button or a misleading

pop-up most likely justifies their decision based on a level 1 pre-conventional cognitive moral

development, driven by obedience and concerned with personal reward (Trevino & Nelson, 2014). If

the marketing team asks for it and the UX practitioner is operating at level 1 or 2 on the cognitive

moral development spectrum they are likely to set aside principles and follow orders. Level 2 focuses on

approval by others in a close group and fulfillment of agreed-upon duties (Trevino & Nelson, 2014).

This means that if the UX team has been established for some time and there is a cultural precedent

to follow orders, or if implementing pop-up features is something agreed upon in the past by the


team, then level 1 pre-conventional and level 2 conventional psychological factors will influence

decision-making behavior.

Moral disengagement and the distortion of consequence may contribute to a reduced feeling of

personal responsibility. Distortion of consequence and illusion of optimism may lead to further

reinforcement of the decision to mislead users. If an organization is measuring success by the number

of clicks rather than how a user feels about the experience, they are missing the entire purpose of

ethical design. Overconfidence and confirmation trap may influence choices of which facts to gather,

thus confirming preconceived preferred choices (Trevino & Nelson, 2014).

Proposed Solution:

In keeping with the UXPA ethical principles, it is recommended that the UX designer voice concerns over the practice of implementing pop-up ads, false buttons, or any other

misleading experiences. While the secondary stakeholder in a marketing department might not agree

immediately, it is the UX designer’s responsibility to inform them that these types of experiences

frustrate users, hurt the brand, and have much more downside risk than upside potential. The

UXPA Examples of Practice states:

“UX practitioners shall avoid all known conflicts of interest with their employers or clients

and shall promptly inform their employers or clients of any business association, interests, or

circumstances that could influence their judgments or the quality of their services.” (“UXPA

Code of Professional Conduct”, n.d., Examples of Practice section, Avoid conflicts of

interest, item 5.1).


The above-referenced example reinforces the recommendation that the ethical

UX practitioner should voice their concerns. Efforts should be made to present better solutions to

the user in an effort to preserve the integrity of the brand and service.


Case 3 | Ethics of UX Research: Facebook & Rights

Facebook is currently one of the most influential and innovative players in technology and

social media. Part of this success stems from their focus on understanding users, and making very

deliberate and calculated design decisions. When a user joins a social media service they agree to

terms, and typically almost no one reads them. The Facebook terms of service (TOS) include an item

regarding research that states they may use information “for internal operations, including

troubleshooting, data analysis, testing, research and service improvement.” (“Facebook,” n.d., How

we use the Information we Receive section, 6th list item).

Facebook conducted a research study on nearly 700,000 users; permission for this was granted under the TOS agreement that all users are bound to equally. The study was focused on

“emotional contagion through social networks” and it involved manipulating the information

presented to users within their newsfeeds (Waldman, 2014). The testing focused on how adjusting

the flow of positive and negative updates would affect a user. “Some people were fed primarily

neutral to happy information from their friends; others, primarily neutral to sad. Then everyone’s

subsequent posts were evaluated for affective meanings.” (Waldman, 2014, para. 3). Facebook was

essentially filtering out content from a user’s network that they otherwise would have seen in an un-

tampered experience, with the intent to shift their emotional state.

Ethical issue or decision:

Should Facebook deliberately try to influence the mood of users in their research efforts?

Stakeholders:

Primary: Users, Facebook


Secondary: Other users, family, coworkers

Ethical Foundation - Rights (Deontological):

Applying the ethical foundation of rights involves considering what an individual deserves simply by being human. The right to the pursuit of happiness seems to be infringed upon by

Facebook. They are providing a service where research efforts are intentionally and artificially

crafting a negative social experience for the user. Compounding this is the violation of the basic right

not to be harmed or interfered with by others. Facebook has no idea what a user is dealing with in

their un-shared (offline) lives, whether it be illness or depression, and to manipulate the experience

with the intent to deflate mood could violate the rights to health and safety.

Psychological Dimensions:

On the moral awareness element of psychological factors it is reasonable to assume the

magnitude of consequence could be one of serious potential harm. Users are humans, each with

unique situations in life and personalities, and when Facebook tries to manipulate their mood from a

distance they have very little control over the potential harm caused to the user or to secondary

stakeholders. Euphemistic language within the TOS agreement presents their research activities in a

way that makes it sound as if only good things can happen; little does a user know that the research efforts might be intentionally dragging their mood down.

It is likely the researchers at Facebook have some level of moral disengagement when they

deploy these algorithms to nearly 700,000 users. With numbers that large, and methods violating

basic rights, it is hard to imagine that they think of users as anything more than just a number or a


line of code. Moral justification is also likely a psychological factor in their decision, as stated by

Trevino and Nelson (2014): “unethical behavior is thought to be okay because it contributes to some

socially valued outcome.” (p. 85, para. 3). The UX team and researchers at Facebook probably view

themselves as pioneers by conducting this seemingly innovative research, but in the context of rights

and the number of users involved, the ethics of it are flawed.

Proposed Solution:

For guidance on this issue a UX practitioner can reference the UXPA Examples of Practice

section 'Do no Harm'; all three examples are applicable and stated as follows:

“3.1. UX practitioners shall not expose participants to any unreasonable physical, mental or

emotional stress.

3.2. UX practitioners shall take reasonable steps to avoid harming their clients or employers,

study participants, and others with whom they work, and to minimize harm where it is

foreseeable and avoidable.

3.3. UX practitioners shall review for special needs when working with the elderly, the

disabled, and children. Precautions taken to avoid risks associated with such groups shall be

clearly identified and reviewed by the client or employer.” (“UXPA Code of Professional

Conduct”, n.d., Examples of Practice section, Do no Harm and if Possible Provide Benefits,

items 3.1-3.3).

In researching this case there was no evidence that Facebook excluded groups that may be disabled, and it does not seem they knew what the result of the research would be. In other words, if there was a

wave of suicides because they implemented the negative mood filter on users with severe depression,

was there any way they would have been prepared for that? The example may seem a bit extreme,


but emotional responses from users within a system that delivers a highly personalized experience can

be serious. The proposed solution is that Facebook should recognize that the world is already full of

good days and bad; happy and sad status updates occur naturally, and therefore there is little reason

for them to meddle with the organic ecosystem of emotions that already exists within their social

network platform. Users have enough going on, and Facebook has plenty of naturally-occurring

drama to study; a responsible UX designer/researcher respects the rights of users and their

environment enough to know tampering only makes the research impure. A solution to continue

this type of research would be to assemble a specific treatment group that is aware of their

participation in a study, and then use the regular user base as the control group for results

comparison.
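A minimal sketch of that consent-based study design follows; the names and data model are hypothetical and do not represent Facebook's actual research tooling:

```python
# Minimal sketch of the consent-based study design suggested above. All names are
# hypothetical; this is not Facebook's actual research pipeline.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class User:
    user_id: int
    opted_into_research: bool  # explicit, informed consent to mood-related testing

def assign_groups(users: List[User]) -> Tuple[List[User], List[User]]:
    """Consenting users form the treatment group; everyone else is an untouched control."""
    treatment = [u for u in users if u.opted_into_research]
    control = [u for u in users if not u.opted_into_research]
    return treatment, control

# Example: a small hypothetical user base where every third user has opted in.
users = [User(i, opted_into_research=(i % 3 == 0)) for i in range(12)]
treatment, control = assign_groups(users)
print(len(treatment), len(control))  # 4 8
```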


Case 4 | Shared & (almost) Stolen: Instagram & Justice

Instagram is a social photo-sharing application for mobile platforms that delivers an easy and

fun way to share photos free of charge. Users can follow each other, leave comments, and ‘like’ photos,

which makes it sound almost like a photo-based version of Facebook. Not surprisingly, the team at

Facebook recognized the strategic alignment and in 2012 they acquired Instagram. Users on

Instagram are frequently treated to revised interface designs through software updates, and the

service offers simple tools to apply filters to make photos look more unique. Facebook and Instagram

present separate experiences to the user and each service has unique terms of service (TOS)

agreements.

A change to the TOS was announced in late 2012, and to say the least the community of

users did not react positively:

“The new Terms of Service suggested Instagram would be allowed to use pictures in

advertisements without notifying or compensating users, and to disclose user data to

Facebook and to advertisers. Instagram also proposed that the parents of minors implicitly

consent to the use of their childrens' images for advertising purposes.” (Mintz, 2013, para. 3)

Additional language in the agreement forced users to waive their rights to file a class action lawsuit

(Mintz, 2013).

Online campaigns started to organize protests around the new TOS, and a large number of

users deleted their Instagram accounts. In response, Instagram revised the agreement and removed

the most controversial parts about displaying users’ photos without providing respective

compensation. However, there were still some questionable terms in the revision. They presented


language stating that any user under 18 years old, by accepting the terms, implies that a parent or guardian has reviewed and agreed to them, which seems like a strange

expectation for an organization to have (Mintz, 2013). Users were able to bring legal action against

Instagram, and eventually the company reversed all the changes and rolled back the TOS to the version prior

to all the controversy.

Ethical issue or decision:

Should Instagram be able to alter terms of service and leverage content from users for their

own commercial benefit?

Stakeholders:

Primary: Instagram users, Instagram/Facebook

Secondary: Other users, people being photographed

Ethical Foundation - Justice (Deontological):

The ethical foundation of justice focuses on concepts of fairness. The users of Instagram felt

threatened by the proposed change because they were all being affected by a sweeping immediate

change of the rules within the experience. While there may have been a veil of ignorance when

applying the new terms of service to all users, instead of only a select few, it is reasonable to assume

they were not planning to use poorly shot photos for their commercial purposes. The actual terms

and rules were applied to everyone, but some users on Instagram are professional photographers, and

the high quality of their work would make them a more likely target for undesired use by Instagram.

In this sense the distributive justice would be tilted unfairly against those with the most impressive


portfolios, for their work would probably be used by Instagram for commercial purposes without

remuneration to the user. Essentially, the most talented users with the best photos would be the

most likely to have the worst outcome under the new terms.

Psychological Dimensions:

The decision to present terms of service that would transfer content ownership away from

users most likely involved cognitive barriers such as the illusion of control and scripts. The illusion of

control would have Instagram's team believing they were in complete control of the terms. For scripts,

Trevino and Nelson (2014) wrote, "Scripts are cognitive frameworks that guide human thought and

action. Although they are generally not written down, scripts contain information about the

appropriate sequence of events in routine situations." (p. 101, para. 2). It is likely the team at

Instagram was expecting users to take the action of accepting the new terms without reading them

carefully, as users often do. Somewhat ironically, Instagram and Facebook's very own social networks

may have been the primary reason word spread so quickly about the new policies and content

ownership changes. Like most legal documents, the amount of wording was significant and the most

salient changes to terms were buried within legalese, which may be thought of as a type of

euphemistic language disengagement. Burying the most important changes within a large amount of

less important material can be seen as a way to minimize the potential recognition of ethical

infringement. There may very well have been a confirmation trap creating an assumption that users

would not bother to read the terms carefully before accepting.

Several years ago it would have been more reasonable to assume a user would not read, or

even become informed about, the specifics within a TOS agreement. With more exposure to


technology, users have become increasingly sophisticated and more comfortable evaluating TOS

agreements. On a social media platform any troubling information regarding TOS can be shared

quickly among many users. In that sense it only takes a few users to actually read and understand the

terms and then they can spread information to many more via social media platforms. If ethically

questionable language is recognized by users, which pertains to the very same social media service

they are using, then they will be highly motivated to share it within the network to inform other

users and raise awareness around the issue.

Proposed Solution:

If a user captured a photo through the lens of their own device, then the underlying

ownership and ability to generate revenues from that photo should not be suddenly transferred to

the service provider who is storing a file copy on their server. Essentially, in terms of ethical design

analysis this would be considered stealing a user's work. With the level of customization a user can apply on Instagram, and the personal nature of photography as an artistic expression, it is safe

to assume the photos have some meaning or importance to the users. Just because the new service

agreement would make earning revenues from user content legal does not mean it is ethical.

A reasonable solution for Instagram would be to research and find other ways to accomplish

whatever goals they were seeking through the controversial TOS. Perhaps they were changing the

language so they could leverage user content to expand their own library of assets. If this is the case

they should explore solutions where users can choose within the interface which photos, if any at all, they will grant rights to, or perhaps a model that incentivizes the user to allow

use. UX designers are encouraged to empathize with their users, and as a result of this the


experiences created should not cause a user to feel unfairly disadvantaged. If they want to leverage

content from the best photographers, then they should design a way for those users to opt into such

an agreement.
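A minimal sketch of the per-photo choice described above follows; the class and field names are hypothetical and do not reflect Instagram's actual data model:

```python
# Minimal sketch of a per-photo, opt-in commercial-use model. All names are
# hypothetical; this is not Instagram's actual data model.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Photo:
    photo_id: str
    owner: str
    commercial_use_allowed: bool = False      # default: rights stay with the user
    incentive_offer: Optional[str] = None     # e.g., a revenue share (hypothetical)

    def opt_in_to_commercial_use(self, offer: str) -> None:
        # The user, not the service, flips this flag through the interface.
        self.commercial_use_allowed = True
        self.incentive_offer = offer

def photos_usable_in_ads(photos: List[Photo]) -> List[Photo]:
    """Only photos whose owners explicitly opted in may be used commercially."""
    return [p for p in photos if p.commercial_use_allowed]

# Example usage with a tiny hypothetical library.
library = [Photo("p1", "alice"), Photo("p2", "bob")]
library[0].opt_in_to_commercial_use(offer="10% revenue share")
print([p.photo_id for p in photos_usable_in_ads(library)])  # ['p1']
```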


Case 5 | Addictive UX: MMORPG & Virtues

Massively multiplayer online role playing games (MMORPG) have been considered

‘addictive’ by players since becoming popular in the early 2000s. A game by Sony called EverQuest is

commonly referred to by users as “EverCrack” due to the long hours they spend playing (Patrizio,

2002). The gaming addiction with MMORPG titles is such an issue that in 2009 the Chinese

government mandated that game developers implement anti-addiction systems in their games

(Toyad, 2014). Tests of the effectiveness of the anti-addiction systems in China showed that only

26 percent of games were accomplishing the goal correctly, emphasizing the challenges in controlling

addictive experiences.

In the UX Magazine article “Towards an Ethics of Persuasion” Anderson (2011) discusses

seductive interaction design, “Of course you can't discuss a topic like seduction or what motivates

people without some awareness that, no matter how playful or well-meaning your intentions are,

these things will certainly be abused.” (para. 1). In light of this one must ask: where is the line

drawn when addictive patterns emerge among users?

In 2001 a Wisconsin man named Shawn Woolley committed suicide, and family members

believe it was due to his addiction to the MMORPG EverQuest. Woolley was prone to epileptic

seizures and while playing for extended periods, as he often did, he would sometimes suffer from a

seizure (Patrizio, 2002). “Shawn got involved in the game in 2000 and by 2001 it had consumed his

life, Elizabeth said. He'd quit his job and played almost non-stop, eventually being evicted from his

apartment and moving in with his mother, before leaving her home and then being put in a group

home for addictive behavior.” (Patrizio, 2002, para. 11). Sony Online Entertainment (SOE), the


makers of EverQuest, was unwilling to share account usage activity for the week leading up to the

suicide with Woolley's mother. In response to the request they cited privacy concerns for other

players, but Woolley’s mother suspects he played for almost the entire week.

Ethical issue or decision:

Should intentionally addictive experiences have systems to regulate behavior patterns?

Stakeholders:

Primary: Shawn Woolley, his family, UX designers, Sony Online Entertainment

Secondary: Other EverQuest players

Ethical Foundation - Virtue (Character & Integrity):

According to Trevino and Nelson (2014), “The virtue ethics approach focuses more on the

integrity of the moral actor (the person) than on the moral act itself (the decision or behavior). The

goal here is to be a good person because that is the type of person you wish to be” (p.46, para. 3).

With intent playing a large part in virtue ethics, it is important to consider whether game companies

understand the role of the entertainment services they provide in the greater context of a user’s life. A

UX designer always has the aim of creating pleasurable experiences, and it is no accident that

MMORPG games are addictive. It is common knowledge that too much time spent on

entertainment, such as video games, will likely reduce the overall quality of life that a user has.

Virtue ethics involve character, which is defined by measurement against expectations of the

relevant moral community. For this we can reference the first item of the UXPA Code of

Professional Conduct (n.d.), “Act in the best interest of everyone” (para. 8, 1st list item). The


moment it becomes apparent an addictive experience may diminish the user’s overall quality of life,

or reduce their ability to contribute to society, the ethically responsible UX designer would consider

solutions to manage it.

A person with a focus on integrity will measure the consequences of secrecy by using a

disclosure rule: “If you’re inclined to keep it a secret, that should be a clue something isn’t right”

(Trevino & Nelson, 2014, p.55, para. 4). If interface patterns and reward structures in a game reveal

potential addictive responses then it makes sense to measure how this could affect users in the

context of their lives as students, professionals, parents, and other roles they may fill within society.

With limited hours in each day, one can presume that a student playing an MMORPG game 6+

hours each day will have less academic success than a student who spends the same amount of time

studying. The same can be evaluated for adult gamers or parents: the hours playing an MMORPG

take away from the daily allowance for social or family time.

The ethical foundation of virtue would motivate a UX designer with the simple intent to be

a good person and do the right thing. As a designer they would still strive to provide a greatly

enjoyable design, and with the foundation of virtue they would make efforts to mitigate collateral

damage caused by addiction patterns. This would involve delivering the most enjoyable experience

with a level of balance, showing some intent to not allow the user to neglect their other

responsibilities in life.

Psychological Dimensions:

Moral disengagement can be a psychological factor influencing the decision to release a game

with the knowledge it may lead to addiction. The team at Sony Online Entertainment is probably so


aggressively focused on intentionally catering to pleasure signals that they dehumanize the victims.

Allowing players to create avatars and exist for long periods of time within a virtual world is evidence

of the dehumanization theme. Attribution of blame is another way they are likely reducing

identification with the addictive behaviors exemplified by users. By not voluntarily implementing

measures to break the addiction patterns, they are displacing the responsibility and blame onto the user.

MMORPG games use euphemistic language when rewarding players or distributing

achievements. Killing monsters, finding gold, playing an extra hour to ‘level up’ – all of these things

are presented as great accomplishments, but suggesting that a user take a break was never a consideration.

The longer the user plays the game the more they are praised and awarded achievements, each new

one being more attractive than all the ones prior. The consequences of missing important

commitments in the real world are diminished by the constant rewards structure in the virtual game

world.

Proposed Solution:

A proposed solution to the problem could come in the form of a reminder system. A designer could create rules within the game such as the following: if playtime exceeds ‘X’ hours across ‘Y’ days, then present message ‘Z’. In the same way that ethics

training can change the way a person responds to situations, seeing a message in the game can work

to remind a user to change their behavior patterns if they are trending towards addiction. Virtue

ethics are based on intent and this would be a reasonable first step to show good intentions. As with

any of these solutions, there are many other ways to handle the problem. From a UX designer’s


perspective, the ideal solution would preserve the enjoyable aspects of the experience while

mitigating the risks of addiction or harm in an adaptable way.
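A minimal sketch of such a reminder rule follows; the thresholds ‘X’ and ‘Y’, the message ‘Z’, and all names are hypothetical placeholders rather than part of any actual game's systems:

```python
# Minimal sketch of the "if playtime exceeds X hours across Y days, present message Z"
# rule described above. Thresholds, names, and the message text are hypothetical.

from dataclasses import dataclass
from typing import List

@dataclass
class ReminderRule:
    max_hours: float    # 'X': total playtime threshold, in hours
    window_days: int    # 'Y': number of most recent days evaluated
    message: str        # 'Z': the reminder shown to the player

    def should_remind(self, daily_hours: List[float]) -> bool:
        # Trigger when playtime over the last 'window_days' days exceeds 'max_hours'.
        recent = daily_hours[-self.window_days:]
        return sum(recent) > self.max_hours

rule = ReminderRule(max_hours=20, window_days=3,
                    message="You've been playing a lot lately. Consider taking a break.")

playtime_log = [7.5, 8.0, 9.0]   # hours played on each of the last three days
if rule.should_remind(playtime_log):
    print(rule.message)
```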


Summary

As demonstrated in the cases presented, UX practitioners can create significant benefit or

harm to society based on the decisions they make. The variety of subject matter within the cases,

along with the comprehensive coverage of ethical foundations, demonstrates that problems can arise from a number of sources and that there is no “one size fits all” solution. By following the UXPA Code of

Professional Conduct a UX practitioner should expect to achieve great success in their field without

causing harm while balancing conflicts. Understanding and applying the material is a step in the right

direction, but true ethical expertise for a UX practitioner must be demonstrated through action.


References

Anderson, S. P. (2011, December 13). Towards an ethics of persuasion. UX Magazine. Retrieved September 16, 2014, from http://uxmag.com/articles/towards-an-ethics-of-persuasion

Aretz, A. (2013, July 17). The social sciences are alive and well in UX research. Retrieved September 9, 2014, from http://www.momentdesign.com/blog/social-sciences-are-alive-and-well-ux-research#.VBZccvldV8E

Auto Safety. (n.d.). Texting while driving now leading cause of US teen deaths. Retrieved September 15, 2014, from http://safety.trw.com/texting-while-driving-now-leading-cause-of-us-teen-deaths/0710/

Bryan, P. (2013, December 23). User experience versus users. Retrieved September 16, 2014, from http://www.uxmatters.com/mt/archives/2013/12/user-experience-versus-users.php

Facebook. (n.d.). Information we receive and how it is used. Retrieved September 14, 2014, from https://www.facebook.com/about/privacy/your-info

Gribbons, B. (2013, March 9). Emerging technologies are creating new ethical challenges for UX designers. Retrieved September 14, 2014, from https://gigaom.com/2013/03/09/emerging-technologies-are-creating-new-ethical-challenges-for-ux-designers/

Gruener, W. (2013, January 6). Cadillac CUE: Why a touch screen is not always a good idea. Retrieved September 15, 2014, from http://www.tomshardware.com/news/cadillac-cue-entertainment-ix-ux,20143.html

Kiess, C. (2014, March 24). Placebo buttons, misleading users & the ethics of UX design. Retrieved September 16, 2014, from http://chriskiess.net/placebo-buttons-misleading-users-the-ethics-of-ux-design/

Leonard, A. (2013, October 7). When the close button doesn't close. Retrieved September 16, 2014, from http://www.salon.com/2013/10/07/when_the_close_button_doesnt_close/

Lombardi, C. (2007, August 28). Are drivers ready for the high-tech onslaught? Retrieved September 14, 2014, from http://news.cnet.com/Are-drivers-ready-for-high-tech-onslaught---page-2/2100-11389_3-6204706-2.html

Mintz, S. (2013, March 18). Instagram's terms of service and privacy issues. Ethics Sage. Retrieved September 15, 2014, from http://www.ethicssage.com/2013/03/instagrams-terms-of-service-and-privacy-issues.html

Mitchell, R. (2014, February 6). The psychology of waiting, loading animations, and Facebook. Retrieved September 15, 2014, from http://mercury.io/blog/the-psychology-of-waiting-loading-animations-and-facebook

NHTSA. (2013, April 23). U.S. DOT releases guidelines to minimize in-vehicle distractions [Press release]. Retrieved September 14, 2014, from http://www.nhtsa.gov/About+NHTSA/Press+Releases/U.S.+DOT+Releases+Guidelines+to+Minimize+In-Vehicle+Distractions

Nielsen, J., & Norman, D. (n.d.). The definition of user experience. Retrieved September 16, 2014, from http://www.nngroup.com/articles/definition-user-experience/

Norman, D., Miller, J., & Henderson, A. (1995). What you see, some of what's in the future, and how we go about doing it. Retrieved September 9, 2014, from http://www.sigchi.org/chi95/proceedings/orgover/dan_bdy.htm

Patrizio, A. (2002, April 3). Did game play a role in suicide? Retrieved September 16, 2014, from http://archive.wired.com/gaming/gamingreviews/news/2002/04/51490

Richtel, M. (2014, September 13). Trying to hit the brake on texting while driving. Retrieved September 14, 2014, from http://www.nytimes.com/2014/09/14/business/trying-to-hit-the-brake-on-texting-while-driving.html?_r=2

Toyad, J. (2014, March 13). China's anti-gaming addiction measures not effective. Retrieved September 15, 2014, from http://e27.co/chinas-anti-gaming-addiction-measures-effective/

Trevino, L., & Nelson, K. (2014). Managing business ethics: Straight talk about how to do it right (6th ed.). Hoboken, NJ: Wiley & Sons.

UX Design Defined. (2010, August 16). Retrieved September 9, 2014, from http://uxdesign.com/ux-defined

UXPA. (n.d.). Retrieved September 9, 2014, from https://uxpa.org/

UXPA Code of Professional Conduct. (n.d.). Retrieved September 9, 2014, from https://uxpa.org/resources/uxpa-code-professional-conduct

Waldman, K. (2014, June 28). Facebook's unethical experiment. Retrieved September 16, 2014, from http://www.slate.com/articles/health_and_science/science/2014/06/facebook_unethical_experiment_it_made_news_feeds_happier_or_sadder_to_manipulate.html