Manipulation and abuse on social media

Emilio Ferrara
School of Informatics and Computing, and Indiana University Network Science Institute
Indiana University, Bloomington, IN, USA

The computer science research community has become increasingly interested in the study of social media due to their pervasiveness in the everyday life of millions of individuals. Methodological questions and technical challenges abound as more and more data from social platforms become available for analysis. This data deluge not only yields an unprecedented opportunity to unravel questions about online individuals' behavior at scale, but also allows us to explore the potential perils that the massive adoption of social media brings to our society. These communication channels provide plenty of incentives (both economic and social) and opportunities for abuse. As social media activity has become increasingly intertwined with events in the offline world, individuals and organizations have found ways to exploit these platforms to spread misinformation, to attack and smear others, or to deceive and manipulate. During crises, social media have been used effectively for emergency response, but fear-mongering actions have also triggered mass hysteria and panic. Criminal gangs and terrorist organizations like ISIS adopt social media for propaganda and recruitment. Synthetic activity and social bots have been used to coordinate orchestrated astroturf campaigns, to manipulate political elections, and to game the stock market. The lack of effective content verification systems on many of these platforms, including Twitter and Facebook, raises concerns when younger users are exposed to cyber-bullying, harassment, or hate speech, which induce risks like depression and suicide. This article illustrates some of the recent advances in facing these issues and discusses what remains to be done, including the challenges to address in the future to make social media a more useful and accessible, safer and healthier environment for all users.

Introduction

Social media play a central role in the social life of millions of people every day. They help us connect [Gilbert and Karahalios 2009; De Meo et al. 2014], access news and share information [Kwak et al. 2010; Ferrara et al. 2013], hold conversations and discuss our topics of interest [Ferrara et al. 2013; JafariAsbagh et al. 2014].

The wide adoption of social media has arguably brought several positive effects to our society: social media played a pivotal role during recent social mobilizations across the world [Conover et al. 2013; Conover et al. 2013], helping to democratize the discussion of social issues [Varol et al. 2014]; they have been used during emergencies to coordinate disaster responses [Sakaki et al. 2010]; and they have proved effective in enhancing social awareness about health issues such as obesity [Centola 2011] and in increasing voting participation during recent political elections [Bond et al. 2012].

SIGWEB Newsletter Spring 2015

arXiv:1503.03752v2 [cs.SI] 13 Mar 2015


Researchers in computing quickly realized the fresh opportunities brought by the rise of social media: blending computational frameworks, machine learning techniques, and questions from the social and behavioral sciences, computational social science nowadays studies the impact of socio-technical systems on our society [Lazer et al. 2009; Vespignani 2009].

Although a large body of literature has been devoted to the usage of social media [Java et al. 2007; Huberman et al. 2008; Asur and Huberman 2010], only recently has our community started realizing some of the potentially harmful effects that the abuse of these platforms might cause to our society. Due to a mixture of social and economic incentives, a lack of effective policies against misbehavior, and insufficient technical solutions to timely detect and hinder improper use, social media have recently been characterized by widespread abuse. In the following, I illustrate the perils that arise from some forms of social media abuse, provide examples of the effects of such behaviors on our society, and discuss possible solutions.

Misinformation, fear, manipulation and abuse

On Tuesday, April 23rd, 2013 at 1:07 p.m., the official Twitter account of the Associated Press (AP), one of the most influential American news agencies, posted a tweet reporting two explosions at the White House that allegedly left President Barack Obama injured. The tweet garnered several thousand retweets in a few minutes and generated countless variants that spread uncontrolled, reaching millions. In the short interval of time that it took other agencies to challenge the veracity of this news, and to realize that the AP Twitter account had been hacked, the panic of a terror attack started diffusing through the population; as a direct consequence, the Dow Jones plummeted 147 points in a matter of 3 minutes, one of the largest point drops in its history. Shortly after the confirmation of the hack the Dow recovered, but the crash had erased $136 billion in equity market value. This event, captured in Fig. 1, is the first reported case of planetary-scale misinformation spreading with tangible real-world effects and devastating economic losses. A later claim of responsibility by the Syrian Electronic Army raised concerns about possible new forms of cyber-terrorism that leverage social media to generate mass hysteria and trigger market crashes.1

The AP hack exposed the risks related to the security of social media from attacks aimed at impersonation or identity theft. Most importantly, it showed the potentially dangerous power that social media conversation has over the offline, physical world, especially over the fragile and increasingly interconnected financial markets. This was not an isolated event: shadows have been cast [Hwang et al. 2012] on the role of social media during the infamous 2010 flash crash of the stock market,2 after an inconclusive report by the Securities and Exchange Commission.3 More recently, a company named Cynk Technology Corp experienced an unfathomable gain of more than 36,000% when its penny stock surged from less than $0.10 to above $20 a share in a matter of a few weeks, reaching a market cap above $6 billion.4 Further investigation revealed what appears to be an orchestrated attempt at touting the stock's performance through artificial social media discussion, using social bots and spam accounts [Ferrara et al. 2014; Yang et al. 2014]. The abundance of synthetically-generated content has direct implications for business ventures as well: business intelligence and analytics companies that use social media signals for market analysis (for example, to predict how well a movie will do at the box office [Asur and Huberman 2010; Mestyan et al. 2013], or to determine how a new TV show is being received by its target audience) have their results affected by the noise produced by synthetic/spam accounts.5

Fig. 1. The fake tweet that on April 23rd, 2013 caused the Dow Jones to plunge, erasing $136 billion in equity market value in 3 minutes.

1 Syrian hackers claim AP hack that tipped stock market by $136 billion. Is it terrorism? — washingtonpost.com/blogs/worldviews/wp/2013/04/23/syrian-hackers-claim-ap-hack
2 Wikipedia: 2010 Flash Crash — en.wikipedia.org/wiki/2010_Flash_Crash
3 Findings regarding the market events on May 6, 2010 — http://www.sec.gov/news/studies/2010/marketevents-report.pdf
4 The Curious Case of Cynk, an Abandoned Tech Company Now Worth $5 Billion — mashable.com/2014/07/10/cynk

The presence of social bots on social media undermines the very roots of our information society: they can be employed to fake grassroots political support (a phenomenon called astroturf) [Ratkiewicz et al. 2011; Ratkiewicz et al. 2011], or to reach millions of individuals by using automated algorithms tuned for optimal interaction. One recent example is provided by ISIS, which is using social bots for propaganda and recruitment [Berger and Morgan 2015], adopting different manipulation strategies according to the targets of their campaigns.6 The ongoing efforts of our community to fight social bots and synthetic activity are summarized in a recent survey [Ferrara et al. 2014].
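The bot-detection efforts mentioned above typically cast the problem as supervised classification over account features. As a purely illustrative sketch (not the method of any cited work), the following toy scorer combines a few behavioral signals commonly discussed in the literature; the feature names, weights, and thresholds are all invented for exposition:

```python
# Illustrative bot-scoring heuristic. Real detectors train classifiers over
# hundreds of features; here arbitrary weights stand in for learned ones.

def bot_score(account):
    """Return a rough suspicion score in [0, 1] from basic profile features."""
    score = 0.0
    # Very high posting rates are a commonly cited red flag.
    if account["tweets_per_day"] > 100:
        score += 0.4
    # Following many users while having few followers often indicates automation.
    ratio = account["friends"] / max(account["followers"], 1)
    if ratio > 20:
        score += 0.3
    # Default profiles (no bio, no picture) correlate with throwaway accounts.
    if not account["has_bio"]:
        score += 0.15
    if not account["has_profile_image"]:
        score += 0.15
    return min(score, 1.0)

if __name__ == "__main__":
    likely_bot = {"tweets_per_day": 400, "friends": 5000, "followers": 20,
                  "has_bio": False, "has_profile_image": False}
    likely_human = {"tweets_per_day": 3, "friends": 200, "followers": 180,
                    "has_bio": True, "has_profile_image": True}
    print(bot_score(likely_bot), bot_score(likely_human))
```

A production system would replace these hand-set rules with a classifier trained on labeled bot and human accounts, but the feature-based framing is the same.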

The spreading of manipulation campaigns has overwhelming societal effects. In politics, for example, smearing attacks have been perpetrated to defame candidates and damage their public images during various elections, including the 2009 Massachusetts Senate special election [Metaxas and Mustafaraj 2012] and the 2010 U.S. Senate election [Ratkiewicz et al. 2011], while governments and other entities have attempted to manipulate the public perception of impending social issues.7,8 The viral dynamics of information spreading on social media have been the target of recent studies [Weng et al. 2013; 2014]; however, much remains to be done to understand how to contain or hinder the diffusion of dangerous campaigns, including deceptive and hijacked ones.9,10

Fig. 2. The diffusion of information need on Twitter (RTs and MTs).

5 Nielsen's New Twitter TV Ratings Are a Total Scam. Here's Why. — defamer.gawker.com/nielsens-new-twitter-tv-ratings-are-a-total-scam-here-1442221484
6 The Evolution of Terrorist Propaganda: The Paris Attack and Social Media — brookings.edu/research/testimony/2015/01/27-terrorist-propaganda-social-media-berger

7 Russian Twitter political protests 'swamped by spam' — bbc.com/news/technology-16108876
8 Fake Twitter accounts used to promote tar sands pipeline — theguardian.com/environment/2011/aug/05/fake-twitter-tar-sands-pipeline
9 McDonald's Twitter Campaign Goes Horribly Wrong #McDStories — businessinsider.com/mcdonalds-twitter-campaign-goes-horribly-wrong-mcdstories-2012-1
10 NYPD's Twitter campaign backfires — usatoday.com/story/news/nation-now/2014/04/23/nypd-twitter-mynypd-new-york/8042209/


Fig. 3. The spreading of fear-rich content across different Twitter communities (RTs and MTs).

Time                 Content
2014-10-17 13:33:57  @FoxNews WHY WON'T YOU REPORT THE CRUISE SHIP IS BEING DENIED ENTRY INTO BELIZE? CRUCIAL FOR CLOSING OUR BORDERS!
2014-10-17 13:39:44  @FoxNews Those hc workers are totally wacky. Y today do they decide to tell some1 they handled specimens from Ebola pt, esp on ship&not sick?
2014-10-17 13:40:38  @NBCNews Many countries are refusing entry from flights from Ebola nations. Why aren't you asking the WH about their crazy policy?
2014-10-17 13:56:20  @CNN Why two doctors treated for EBOLA & INFECTED no one else including CARE GIVERS but it's not the same case for TX? Who screwed up?
2014-10-17 13:59:20  @FoxNews Are they paying these potential Ebola victims money to get on ships and planes? I am beginning to really wonder
2014-10-17 14:09:03  OMG! He cld b infected "@NBCNews: Who is man wearing plain clothes during an #Ebola patient's transfer?" via @NBCDFW

Table I. Examples of concerned or fear-rich tweets spreading during the Ebola emergency of Oct. 17th, 2014.


Social media have been extensively adopted during crises and emergencies [Hughes and Palen 2009], in particular to coordinate disaster response [Yates and Paquette 2011], enhance situational awareness [Yin et al. 2012], and sense the health state of the population [Sakaki et al. 2010]. However, manipulation of information (e.g., the promotion of fake news) and misinformation spreading can cause panic and fear in the population, which can in turn escalate into mass hysteria. The effects of such types of social media abuse have been observed during Hurricane Sandy at the end of 2012, after the Boston Marathon bombings in April 2013, and increasingly ever since. During Sandy, a storm of fake news struck the Twitter-sphere:11 examples of such misinformation spreading include rumors, misleading or altered photos,12 untrue stories, and false alarms or unsubstantiated requests for help/support. After the Boston bombings, tweets reporting fake deaths or promoting fake donation campaigns spread uncontrolled during the first few hours after the events [Gupta et al. 2013]. Rumors and false claims about the capture of the individuals responsible for the bombing circulated throughout the four days after the event (the period during which the manhunt was carried out).13

The examples above show how frequent social media abuse is during crises and emergencies. My current research aims at illustrating some of the devastating societal consequences of such abuse: by extracting features of crisis-related content generated by different types of information producers (e.g., news agencies, Twitter influentials, or official organizations) along different dimensions (content sentiment, temporal patterns, network diffusion, etc.), we can study the characteristics of content produced on social media in reaction to external stimuli. Conversations can exhibit different classes of induced reactions (e.g., awareness vs. fear). We can highlight announcements and news that likely trigger positive reactions in the population (awareness), and pinpoint those that likely yield negative feedback (fear, panic, etc.). By identifying fear-rich content and information needs [Zhao and Mei 2013], we can characterize the relation between communication by media/organizations and emotional responses (awareness, concern, doubt, etc.) in the population. We can study how fear and concern are triggered or mitigated by media and official communications, and how this depends on language, timing, the demography of the target users, etc.
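One of the dimensions mentioned above, flagging fear-rich content, can be made concrete with a minimal lexicon-based scorer. The word list, threshold, and example texts below are hypothetical placeholders for the trained sentiment/emotion models a real analysis would use:

```python
# Toy fear-rich content scorer: fraction of tokens matching a hand-made
# fear lexicon. The lexicon and threshold are illustrative assumptions.
import re

FEAR_LEXICON = {"panic", "infected", "outbreak", "terrified", "crazy",
                "refusing", "denied", "screwed"}

def fear_score(text):
    """Return the fraction of word tokens found in the fear lexicon."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t in FEAR_LEXICON)
    return hits / len(tokens)

def label(text, threshold=0.1):
    """Binary label for a post: 'fear-rich' above the threshold, else 'neutral'."""
    return "fear-rich" if fear_score(text) >= threshold else "neutral"

if __name__ == "__main__":
    print(label("Total panic: he could be infected!"))
    print(label("Officials shared updated travel guidance today."))
```

In practice, such dictionary scores would be one feature among many (alongside temporal and network signals) fed to a classifier of induced reactions.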

Fig. 2 shows the spreading of information need [Zhao and Mei 2013] on Twitter: the data capture all tweets produced during a short interval of 1 hour on October 17th, 2014, in the context of the discussion about the 2014 Ebola emergency. Particularly prominent nodes in the discussion are labeled and positioned according to their centrality; different colors identify different topical communities [Ferrara et al. 2013; JafariAsbagh et al. 2014]. From a central core of users who start asking questions and seeking information about Ebola, the flow of retweets (RTs) and mentions (MTs) reaches thousands, affecting even communities several hops away from such topics, including those around influential accounts like news agencies.
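The hop-distance measurement behind this kind of figure can be sketched as a breadth-first search over the retweet/mention graph, starting from the core users who first asked about the topic. The edge list and account names below are invented for illustration:

```python
# Sketch: minimum hop distance of each user from a "core" of seed users,
# over a directed retweet/mention graph. Toy data, for illustration only.
from collections import deque

def hops_from_core(edges, core):
    """Map each reachable user to its minimum hop distance from the core set."""
    adj = {}
    for src, dst in edges:  # src's content was retweeted/mentioned by dst
        adj.setdefault(src, []).append(dst)
    dist = {u: 0 for u in core}
    queue = deque(core)
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

if __name__ == "__main__":
    edges = [("core1", "a"), ("a", "b"), ("b", "news_outlet"),
             ("news_outlet", "c"), ("core1", "d")]
    print(hops_from_core(edges, {"core1"}))
```

On real data, combining these distances with community labels shows how far beyond the originating community a cascade travels.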

11 Hurricane Sandy brings storm of fake news and photos to New York — theguardian.com/world/us-news-blog/2012/oct/30/hurricane-sandy-storm-new-york
12 Snopes.com: Hurricane Sandy Photographs — http://www.snopes.com/photos/natural/sandy.asp
13 Wikipedia: Boston Marathon bombings — en.wikipedia.org/wiki/Boston_Marathon_bombings


Fig. 3 illustrates another example of such analysis, showing the spread of concern and fear-rich content during the same interval of time (some example tweets are reported in Table I). Positioning, size, and colors again represent the prominence of the accounts involved in the discussion and the different Twitter topical communities. We can observe how panic and fear spread virally, reaching large audiences at great diffusion speed: exposure to content that leverages human fears clouds our judgment and therefore our ability to discriminate between factual truths and exaggerated news.14 In turn, this fosters the spreading of misinformation, rumors, and unverified news, potentially creating panic in the population and generating further, more negative content. This self-reinforcing mechanism can be interrupted by the implementation of ad-hoc intervention campaigns, designed to be effective on specific targets based on their characteristics and susceptibility.
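The self-reinforcing mechanism and its interruption can be caricatured with a toy contagion model: each round, fear-rich content converts some exposed users, who then produce more such content, while an intervention damps the conversion rate from some round on. All parameters and dynamics below are invented, purely for illustration:

```python
# Toy model of self-reinforcing fear spreading with an optional intervention.
# Parameters are arbitrary; this is a caricature, not a validated model.

def simulate(rounds, beta=0.5, intervention_round=None, damping=0.2):
    """Return the number of fearful users after each round (toy dynamics)."""
    fearful, susceptible = 10.0, 10000.0
    history = []
    for r in range(rounds):
        # An intervention campaign reduces the conversion rate once launched.
        active = intervention_round is not None and r >= intervention_round
        rate = beta * (damping if active else 1.0)
        # New conversions scale with contact between fearful and susceptible.
        newly = rate * fearful * susceptible / (fearful + susceptible)
        newly = min(newly, susceptible)
        fearful += newly
        susceptible -= newly
        history.append(fearful)
    return history

if __name__ == "__main__":
    unchecked = simulate(10)
    mitigated = simulate(10, intervention_round=3)
    print(round(unchecked[-1]), round(mitigated[-1]))
```

Even in this crude sketch, earlier interventions leave fewer fearful users at the end, which is the qualitative point about timely, targeted campaigns.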

We need a computational framework to deal with abuse on social media (especially during crises) that goes beyond simple data analytics: one capable of actively designing and implementing effective policies in order to support decision-making strategies and timely interventions.

Conclusions

In this article I illustrated some of the issues concerning abuse on social media. Phenomena such as misinformation spreading are greatly magnified by the massive reach and pervasiveness that social media have recently attained. I presented several examples of the economic and social risks that arise from social media abuse, and discussed such abuse in the context of political discussion, stock market manipulation, and propaganda and recruitment. I finally exposed the consequences of the spreading of fear and panic during emergencies, whether caused by improper communication on social media or by fear-mongering activity by specific parties.

I highlighted some of the recent advances in tackling these issues, including the detection of social bots, campaigns, and spam, while arguing that much remains to be done to make social media a more useful and accessible, safer and healthier environment for all users.

Notably, recent studies have illustrated how particular populations, such as younger social media users, are even more exposed to abuse, for example in the form of cyber-bullying and harassment, or by being targeted by stigmas and hate speech, increasing health risks like depression and suicide [Boyd 2008; 2014].

As a community, we need to face the methodological challenges and the technical issues of detecting social media abuse in order to secure online social platforms, and at the same time try to reduce the abundant social and economic incentives for abuse by creating frameworks of effective policies against it.

14 Fear, Misinformation, and Social Media Complicate Ebola Fight — time.com/3479254/ebola-social-media/

REFERENCES

ASUR, S. AND HUBERMAN, B. A. 2010. Predicting the future with social media. In 2010 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology. IEEE, 492–499.
BERGER, J. AND MORGAN, J. 2015. The ISIS Twitter Census: Defining and describing the population of ISIS supporters on Twitter. The Brookings Project on U.S. Relations with the Islamic World 3, 20.
BOND, R. M., FARISS, C. J., JONES, J. J., KRAMER, A. D., MARLOW, C., SETTLE, J. E., AND FOWLER, J. H. 2012. A 61-million-person experiment in social influence and political mobilization. Nature 489, 7415, 295–298.
BOYD, D. 2014. It's complicated: The social lives of networked teens. Yale University Press.
BOYD, D. M. 2008. Taken out of context: American teen sociality in networked publics. ProQuest.
CENTOLA, D. 2011. An experimental study of homophily in the adoption of health behavior. Science 334, 6060, 1269–1272.
CONOVER, M. D., DAVIS, C., FERRARA, E., MCKELVEY, K., MENCZER, F., AND FLAMMINI, A. 2013. The geospatial characteristics of a social movement communication network. PloS one 8, 3, e55957.
CONOVER, M. D., FERRARA, E., MENCZER, F., AND FLAMMINI, A. 2013. The digital evolution of Occupy Wall Street. PloS one 8, 5, e64679.
DE MEO, P., FERRARA, E., FIUMARA, G., AND PROVETTI, A. 2014. On Facebook, most ties are weak. Communications of the ACM 57, 11, 78–84.
FERRARA, E., JAFARIASBAGH, M., VAROL, O., QAZVINIAN, V., MENCZER, F., AND FLAMMINI, A. 2013. Clustering memes in social media. In 2013 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining. IEEE, 548–555.
FERRARA, E., VAROL, O., DAVIS, C., MENCZER, F., AND FLAMMINI, A. 2014. The rise of social bots. arXiv preprint arXiv:1407.5225.
FERRARA, E., VAROL, O., MENCZER, F., AND FLAMMINI, A. 2013. Traveling trends: social butterflies or frequent fliers? In First ACM Conference on Online Social Networks. ACM, 213–222.
GILBERT, E. AND KARAHALIOS, K. 2009. Predicting tie strength with social media. In 27th SIGCHI Conference on Human Factors in Computing Systems. ACM, 211–220.
GUPTA, A., LAMBA, H., AND KUMARAGURU, P. 2013. $1.00 per RT #BostonMarathon #PrayForBoston: Analyzing fake content on Twitter. In eCrime Researchers Summit. IEEE, 1–12.
HUBERMAN, B., ROMERO, D., AND WU, F. 2008. Social networks that matter: Twitter under the microscope. First Monday 14, 1.
HUGHES, A. L. AND PALEN, L. 2009. Twitter adoption and use in mass convergence and emergency events. International Journal of Emergency Management 6, 3, 248–260.
HWANG, T., PEARCE, I., AND NANIS, M. 2012. Socialbots: Voices from the fronts. Interactions 19, 2, 38–45.
JAFARIASBAGH, M., FERRARA, E., VAROL, O., MENCZER, F., AND FLAMMINI, A. 2014. Clustering memes in social media streams. Social Network Analysis and Mining 4, 1, 1–13.
JAVA, A., SONG, X., FININ, T., AND TSENG, B. 2007. Why we Twitter: understanding microblogging usage and communities. In 2007 Workshop on Web Mining and Social Network Analysis. ACM, 56–65.
KWAK, H., LEE, C., PARK, H., AND MOON, S. 2010. What is Twitter, a social network or a news media? In 19th International Conference on World Wide Web. ACM, 591–600.
LAZER, D., PENTLAND, A. S., ADAMIC, L., ARAL, S., BARABASI, A. L., BREWER, D., CHRISTAKIS, N., CONTRACTOR, N., FOWLER, J., GUTMANN, M., ET AL. 2009. Life in the network: the coming age of computational social science. Science 323, 5915, 721.
MESTYAN, M., YASSERI, T., AND KERTESZ, J. 2013. Early prediction of movie box office success based on Wikipedia activity big data. PloS one 8, 8, e71226.
METAXAS, P. T. AND MUSTAFARAJ, E. 2012. Social media and the elections. Science 338, 6106, 472–473.
RATKIEWICZ, J., CONOVER, M., MEISS, M., GONCALVES, B., FLAMMINI, A., AND MENCZER, F. 2011. Detecting and tracking political abuse in social media. In 5th International AAAI Conference on Weblogs and Social Media. 297–304.
RATKIEWICZ, J., CONOVER, M., MEISS, M., GONCALVES, B., PATIL, S., FLAMMINI, A., AND MENCZER, F. 2011. Truthy: mapping the spread of astroturf in microblog streams. In 20th International Conference Companion on World Wide Web. ACM, 249–252.
SAKAKI, T., OKAZAKI, M., AND MATSUO, Y. 2010. Earthquake shakes Twitter users: real-time event detection by social sensors. In 19th International Conference on World Wide Web. ACM, 851–860.
VAROL, O., FERRARA, E., OGAN, C. L., MENCZER, F., AND FLAMMINI, A. 2014. Evolution of online user behavior during a social upheaval. In 2014 ACM Conference on Web Science. ACM, 81–90.
VESPIGNANI, A. 2009. Predicting the behavior of techno-social systems. Science 325, 5939, 425.
WENG, L., MENCZER, F., AND AHN, Y.-Y. 2013. Virality prediction and community structure in social networks. Scientific Reports 3.
WENG, L., MENCZER, F., AND AHN, Y.-Y. 2014. Predicting successful memes using network and community structure. In 8th International AAAI Conference on Weblogs and Social Media.
YANG, Z., WILSON, C., WANG, X., GAO, T., ZHAO, B. Y., AND DAI, Y. 2014. Uncovering social network sybils in the wild. ACM Transactions on Knowledge Discovery from Data 8, 1, 2.
YATES, D. AND PAQUETTE, S. 2011. Emergency knowledge management and social media technologies: A case study of the 2010 Haitian earthquake. International Journal of Information Management 31, 1, 6–13.
YIN, J., LAMPERT, A., CAMERON, M., ROBINSON, B., AND POWER, R. 2012. Using social media to enhance emergency situation awareness. IEEE Intelligent Systems 27, 6, 52–59.
ZHAO, Z. AND MEI, Q. 2013. Questions about questions: An empirical analysis of information needs on Twitter. In 22nd International Conference on World Wide Web. International World Wide Web Conferences Steering Committee, 1545–1556.

Emilio Ferrara is a Research Assistant Professor at the School of Informatics and Computing of Indiana University, Bloomington (USA) and a Research Scientist at the Indiana University Network Science Institute. His research interests lie at the intersection of Network Science, Data Science, Machine Learning, and Computational Social Science. His work explores Social Networks and Social Media Analysis, Criminal Networks, and Knowledge Engineering, and appears in leading journals such as Communications of the ACM and Physical Review Letters, as well as in several ACM and IEEE Transactions and Conference Proceedings.
