Sentiment Analysis


Post on 25-Feb-2016


TRANSCRIPT

Information Extraction

Sentiment Analysis: What is Sentiment Analysis?
Dan Jurafsky

Positive or negative movie review?
- "unbelievably disappointing"
- "Full of zany characters and richly applied satire, and some great plot twists"
- "this is the greatest screwball comedy ever filmed"
- "It was pathetic. The worst part about it was the boxing scenes."


[Figure: Google Product Search aggregating review sentiment]

[Figure: Bing Shopping aggregating review sentiment]

Twitter sentiment versus Gallup Poll of Consumer Confidence

Brendan O'Connor, Ramnath Balasubramanyan, Bryan R. Routledge, and Noah A. Smith. 2010. From Tweets to Polls: Linking Text Sentiment to Public Opinion Time Series. ICWSM-2010.
(Thanks to Brendan O'Connor and Noah Smith, email 1/18/12, for permission to use this figure.)

Twitter sentiment:
Johan Bollen, Huina Mao, Xiaojun Zeng. 2011. Twitter mood predicts the stock market. Journal of Computational Science 2:1, 1-8. doi:10.1016/j.jocs.2010.12.007.


(Thanks to Johan Bollen, email 1/18/12, for permission to use this figure.)

Dow Jones
- The "CALM" mood dimension predicts the DJIA 3 days later
- At least one current hedge fund uses this algorithm
- Bollen et al. (2011)

Target Sentiment on Twitter
- Twitter Sentiment App
- Alec Go, Richa Bhayani, Lei Huang. 2009. Twitter Sentiment Classification using Distant Supervision.


Sentiment analysis has many other names
- Opinion extraction
- Opinion mining
- Sentiment mining
- Subjectivity analysis

Why sentiment analysis?
- Movie: is this review positive or negative?
- Products: what do people think about the new iPhone?
- Public sentiment: how is consumer confidence? Is despair increasing?
- Politics: what do people think about this candidate or issue?
- Prediction: predict election outcomes or market trends from sentiment

Scherer Typology of Affective States
- Emotion: brief, organically synchronized evaluation of a major event (angry, sad, joyful, fearful, ashamed, proud, elated)
- Mood: diffuse, non-caused, low-intensity, long-duration change in subjective feeling (cheerful, gloomy, irritable, listless, depressed, buoyant)
- Interpersonal stances: affective stance toward another person in a specific interaction (friendly, flirtatious, distant, cold, warm, supportive, contemptuous)
- Attitudes: enduring, affectively colored beliefs, dispositions towards objects or persons (liking, loving, hating, valuing, desiring)
- Personality traits: stable personality dispositions and typical behavior tendencies (nervous, anxious, reckless, morose, hostile, jealous)

Sentiment Analysis
Sentiment analysis is the detection of attitudes: enduring, affectively colored beliefs, dispositions towards objects or persons.
- Holder (source) of the attitude
- Target (aspect) of the attitude
- Type of attitude
  - From a set of types: like, love, hate, value, desire, etc.
  - Or (more commonly) simple weighted polarity: positive, negative, neutral, together with strength
- Text containing the attitude: sentence or entire document

Sentiment Analysis
- Simplest task: is the attitude of this text positive or negative?
- More complex: rank the attitude of this text from 1 to 5
- Advanced: detect the target, source, or complex attitude types


Sentiment Analysis: A Baseline Algorithm

Sentiment Classification in Movie Reviews
- Polarity detection: is an IMDB movie review positive or negative?
- Data: Polarity Data 2.0: http://www.cs.cornell.edu/people/pabo/movie-review-data
- Bo Pang, Lillian Lee, and Shivakumar Vaithyanathan. 2002. Thumbs up? Sentiment Classification using Machine Learning Techniques. EMNLP-2002, 79-86.
- Bo Pang and Lillian Lee. 2004. A Sentimental Education: Sentiment Analysis Using Subjectivity Summarization Based on Minimum Cuts. ACL, 271-278.

IMDB data in the Pang and Lee database
- "when _star wars_ came out some twenty years ago , the image of traveling throughout the stars has become a commonplace image . [...] when han solo goes light speed , the stars change to bright lines , going towards the viewer in lines that converge at an invisible point . cool . _october sky_ offers a much simpler image: that of a single white dot , traveling horizontally across the night sky . [...]"
- "snake eyes is the most aggravating kind of movie : the kind that shows so much potential then becomes unbelievably disappointing . it's not just because this is a brian depalma film , and since he's a great director and one who's films are always greeted with at least some fanfare . and it's not even because this was a film starring nicolas cage and since he gives a brauvara performance , this film is hardly worth his talents ."

Baseline Algorithm (adapted from Pang and Lee)
- Tokenization
- Feature extraction
- Classification using different classifiers: Naïve Bayes, MaxEnt, SVM

Sentiment Tokenization Issues
- Deal with HTML and XML markup
- Twitter mark-up (names, hash tags)
- Capitalization (preserve for words in all caps)
- Phone numbers, dates
- Emoticons
- Useful code: Christopher Potts sentiment tokenizer; Brendan O'Connor twitter tokenizer
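The emoticon pattern from the Potts tokenizer (shown next) can be written as a single runnable Python regular expression. This is a sketch: the hat/brow character class `[<>]?` is an assumption reconstructed from the comments, since those characters were lost in conversion.

```python
import re

# Potts-style emoticon pattern: optional brow, eyes, optional nose, mouth,
# plus the reverse ("left-handed") orientation of the same pieces.
EMOTICON_RE = re.compile(
    r"[<>]?[:;=8][\-o\*\']?[\)\]\(\[dDpP/\:\}\{@\|\\]"   # normal orientation
    r"|[\)\]\(\[dDpP/\:\}\{@\|\\][\-o\*\']?[:;=8][<>]?"  # reverse orientation
)

def find_emoticons(text):
    """Return all emoticon substrings found in text, left to right."""
    return EMOTICON_RE.findall(text)
```

For example, `find_emoticons("great movie :) but sad :-(")` picks out `:)` and `:-(`, and the reverse branch catches left-handed smileys like `(:`.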

Potts emoticons:

[<>]?                          # optional hat/brow
[:;=8]                         # eyes
[\-o\*\']?                     # optional nose
[\)\]\(\[dDpP/\:\}\{@\|\\]     # mouth
|                              #### reverse orientation
[\)\]\(\[dDpP/\:\}\{@\|\\]     # mouth
[\-o\*\']?                     # optional nose
[:;=8]                         # eyes
[<>]?                          # optional hat/brow

Extracting Features for Sentiment Classification
- How to handle negation: "I didn't like this movie" vs. "I really like this movie"
- Which words to use? Only adjectives, or all words?
- All words turns out to work better, at least on this data

Negation
Add NOT_ to every word between a negation and the following punctuation:

didn't like this movie , but I

didn't NOT_like NOT_this NOT_movie but I

Das, Sanjiv and Mike Chen. 2001. Yahoo! for Amazon: Extracting market sentiment from stock message boards. In Proceedings of the Asia Pacific Finance Association Annual Conference (APFA).
Bo Pang, Lillian Lee, and Shivakumar Vaithyanathan. 2002. Thumbs up? Sentiment Classification using Machine Learning Techniques. EMNLP-2002, 79-86.

Reminder: Naïve Bayes
  cNB = argmax_{cj in C} P(cj) * prod_i P(wi | cj)
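The NOT_ transformation in the negation example above can be sketched in a few lines of Python. This is an illustration of the rule, not Pang and Lee's exact code; the negation word list and punctuation set are assumptions, and punctuation tokens are kept (they end the negation scope).

```python
# Illustrative lists; a real system would use a fuller inventory.
NEGATIONS = {"not", "no", "never"}
PUNCTUATION = {".", ",", ";", "!", "?", ":"}

def mark_negation(tokens):
    """Prefix NOT_ to every token between a negation word and the
    next punctuation mark."""
    out, negating = [], False
    for tok in tokens:
        if tok in PUNCTUATION:
            negating = False          # punctuation closes the scope
            out.append(tok)
        elif negating:
            out.append("NOT_" + tok)
        else:
            out.append(tok)
            # n't contractions like "didn't" also trigger negation
            if tok in NEGATIONS or tok.endswith("n't"):
                negating = True
    return out
```

Running it on the example sentence reproduces the transformed tokens: `mark_negation("didn't like this movie , but I".split())` yields `["didn't", "NOT_like", "NOT_this", "NOT_movie", ",", "but", "I"]`.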

Binarized (Boolean feature) Multinomial Naïve Bayes
- Intuition: for sentiment (and probably for other text classification domains), word occurrence may matter more than word frequency.
- The occurrence of the word "fantastic" tells us a lot; the fact that it occurs 5 times may not tell us much more.
- Boolean Multinomial Naïve Bayes clips all the word counts in each document at 1.

Boolean Multinomial Naïve Bayes: Learning
- From the training corpus, extract the Vocabulary.
- Calculate the P(cj) terms: for each cj in C, docsj = all docs with class = cj; P(cj) = |docsj| / (total # of documents).
- Remove duplicates in each doc: for each word type w in docj, retain only a single instance of w.
- Calculate the P(wk | cj) terms: Textj = a single doc containing all of docsj; for each word wk in Vocabulary, nk = # of occurrences of wk in Textj; with add-1 smoothing, P(wk | cj) = (nk + 1) / (n + |Vocabulary|).

Boolean Multinomial Naïve Bayes on a test document d
- First remove all duplicate words from d.
- Then compute NB using the same equation: cNB = argmax_{cj in C} P(cj) * prod_{i in positions} P(wi | cj).

Normal vs. Boolean Multinomial NB

Normal:
            Doc   Words                                 Class
  Training  1     Chinese Beijing Chinese               c
            2     Chinese Chinese Shanghai              c
            3     Chinese Macao                         c
            4     Tokyo Japan Chinese                   j
  Test      5     Chinese Chinese Chinese Tokyo Japan   ?

Boolean:
            Doc   Words                  Class
  Training  1     Chinese Beijing        c
            2     Chinese Shanghai       c
            3     Chinese Macao          c
            4     Tokyo Japan Chinese    j
  Test      5     Chinese Tokyo Japan    ?

Binarized (Boolean feature) Multinomial Naïve Bayes
- Binary seems to work better than full word counts
- This is not the same as Multivariate Bernoulli Naïve Bayes; MBNB doesn't work well for sentiment or other text tasks
- Other possibility: log(freq(w))
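A minimal sketch of Boolean Multinomial Naïve Bayes on the toy corpus above, assuming add-1 smoothing as in the standard formulation (with binarized counts this toy test document actually comes out class "j"):

```python
import math
from collections import Counter

def train_boolean_nb(docs):
    """docs: list of (tokens, label). Clips per-document counts at 1."""
    labels = Counter(c for _, c in docs)
    priors = {c: n / len(docs) for c, n in labels.items()}
    vocab = {w for toks, _ in docs for w in toks}
    counts = {c: Counter() for c in labels}
    for toks, c in docs:
        counts[c].update(set(toks))            # binarize: one count per doc
    totals = {c: sum(counts[c].values()) for c in labels}

    def loglik(w, c):                          # add-1 smoothing
        return math.log((counts[c][w] + 1) / (totals[c] + len(vocab)))
    return priors, loglik, vocab

def classify(doc, priors, loglik, vocab):
    words = set(doc) & vocab                   # dedupe the test doc too
    return max(priors, key=lambda c:
               math.log(priors[c]) + sum(loglik(w, c) for w in words))

train = [("Chinese Beijing Chinese".split(), "c"),
         ("Chinese Chinese Shanghai".split(), "c"),
         ("Chinese Macao".split(), "c"),
         ("Tokyo Japan Chinese".split(), "j")]
priors, loglik, vocab = train_boolean_nb(train)
pred = classify("Chinese Chinese Chinese Tokyo Japan".split(),
                priors, loglik, vocab)
```

Worked through by hand: P(c) = 3/4, P(Chinese|c) = (3+1)/(6+6) = 1/3, P(Tokyo|c) = P(Japan|c) = 1/12, so class c scores 0.75 × 1/3 × 1/12 × 1/12 ≈ 0.0017, while class j scores 0.25 × (2/9)³ ≈ 0.0027.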

References:
- B. Pang, L. Lee, and S. Vaithyanathan. 2002. Thumbs up? Sentiment Classification using Machine Learning Techniques. EMNLP-2002, 79-86.
- V. Metsis, I. Androutsopoulos, G. Paliouras. 2006. Spam Filtering with Naive Bayes: Which Naive Bayes? CEAS 2006, Third Conference on Email and Anti-Spam.
- K.-M. Schneider. 2004. On word frequency information and negative evidence in Naive Bayes text classification. ICANLP, 474-485.
- J.D. Rennie, L. Shih, J. Teevan. 2003. Tackling the poor assumptions of naive Bayes text classifiers. ICML 2003.

Cross-Validation
- Break up the data into 10 folds (equal positive and negative inside each fold?)
- For each fold: choose the fold as a temporary test set; train on the other 9 folds, compute performance on the test fold
- Report the average performance of the 10 runs
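The 10-fold procedure can be sketched as follows. This is a generic illustration: `train_and_score` is a hypothetical stand-in for training a classifier on the training folds and returning its score (accuracy, F-score, ...) on the held-out fold; a stratified split, as the parenthetical above suggests, would additionally balance labels within each fold.

```python
def cross_validate(examples, train_and_score, k=10):
    """Split examples into k folds; train on k-1 folds, test on the
    held-out fold; return the average score over the k runs."""
    folds = [examples[i::k] for i in range(k)]   # simple round-robin split
    scores = []
    for i in range(k):
        test = folds[i]
        train = [ex for j, fold in enumerate(folds) if j != i
                 for ex in fold]
        scores.append(train_and_score(train, test))
    return sum(scores) / k
```

Every example lands in exactly one test fold, so each run trains on the remaining nine-tenths of the data.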

Other issues in classification
- MaxEnt and SVM tend to do better than Naïve Bayes

Problems: What makes reviews hard to classify?
- Subtlety. Perfume review in "Perfumes: the Guide": "If you are reading this because it is your darling fragrance, please wear it at home exclusively, and tape the windows shut."
- Dorothy Parker on Katharine Hepburn: "She runs the gamut of emotions from A to B."

Thwarted Expectations and Ordering Effects
- "This film should be brilliant. It sounds like a great plot, the actors are first grade, and the supporting cast is good as well, and Stallone is attempting to deliver a good performance. However, it can't hold up."
- "Well as usual Keanu Reeves is nothing special, but surprisingly, the very talented Laurence Fishbourne is not so good either, I was surprised."

Sentiment Analysis: Sentiment Lexicons

The General Inquirer
- Home page: http://www.wjh.harvard.edu/~inquirer
- List of categories: http://www.wjh.harvard.edu/~inquirer/homecat.htm
- Spreadsheet: http://www.wjh.harvard.edu/~inquirer/inquirerbasic.xls
- Categories: Positiv (1915 words) and Negativ (2291 words); Strong vs. Weak, Active vs. Passive, Overstated vs. Understated; Pleasure, Pain, Virtue, Vice, Motivation, Cognitive Orientation, etc.
- Free for research use
- Philip J. Stone, Dexter C. Dunphy, Marshall S. Smith, Daniel M. Ogilvie. 1966. The General Inquirer: A Computer Approach to Content Analysis. MIT Press.

LIWC (Linguistic Inquiry and Word Count)
- Pennebaker, J.W., Booth, R.J., & Francis, M.E. (2007). Linguistic Inquiry and Word Count: LIWC 2007. Austin, TX.
- Home page: http://www.liwc.net/
- 2300 words, >70 classes
- Affective processes: negative emotion (bad, weird, hate, problem, tough); positive emotion (love, nice, sweet)
- Cognitive processes: tentative (maybe, perhaps, guess), inhibition (block, constraint)
- Pronouns, negation (no, never), quantifiers (few, many)
- $30 or $90 fee

MPQA Subjectivity Cues Lexicon
- Home page: http://www.cs.pitt.edu/mpqa/subj_lexicon.html
- 6885 words from 8221 lemmas: 2718 positive, 4912 negative
- Each word annotated for intensity (strong, weak)
- GNU GPL
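Lexicons like these are often used directly: count the positive and negative words in a text and compare. A minimal sketch with a toy lexicon (the word lists below are made-up stand-ins for illustration, not actual MPQA or LIWC entries):

```python
# Toy stand-in lexicon; real lexicons have thousands of entries.
POSITIVE = {"good", "great", "love", "nice", "sweet"}
NEGATIVE = {"bad", "awful", "hate", "terrible", "weird"}

def lexicon_polarity(tokens):
    """Return (positive count, negative count, label) for a token list."""
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    if pos > neg:
        label = "positive"
    elif neg > pos:
        label = "negative"
    else:
        label = "neutral"
    return pos, neg, label
```

Note that on a mixed sentence like "the food was great but the service was awful" the counts tie, which previews why sentence- and aspect-level analysis (later in this lecture) matters.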

38Theresa Wilson, Janyce Wiebe, and Paul Hoffmann (2005). Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis. Proc. of HLT-EMNLP-2005.

Riloff and Wiebe (2003). Learning extraction patterns for subjective expressions. EMNLP-2003.

Bing Liu Opinion Lexicon
- Bing Liu's page on opinion mining: http://www.cs.uic.edu/~liub/FBS/opinion-lexicon-English.rar
- 6786 words: 2006 positive, 4783 negative
- Minqing Hu and Bing Liu. 2004. Mining and Summarizing Customer Reviews. ACM SIGKDD-2004.

SentiWordNet
- Stefano Baccianella, Andrea Esuli, and Fabrizio Sebastiani. 2010. SENTIWORDNET 3.0: An Enhanced Lexical Resource for Sentiment Analysis and Opinion Mining. LREC-2010.
- Home page: http://sentiwordnet.isti.cnr.it/
- All WordNet synsets automatically annotated for degrees of positivity, negativity, and neutrality/objectiveness
- [estimable(J,3)] "may be computed or estimated": Pos 0, Neg 0, Obj 1
- [estimable(J,1)] "deserving of respect or high regard": Pos .75, Neg 0, Obj .25

Disagreements between polarity lexicons (Christopher Potts, Sentiment Tutorial, 2011)

                    Opinion Lexicon   General Inquirer   SentiWordNet      LIWC
MPQA                33/5402 (0.6%)    49/2867 (2%)       1127/4214 (27%)   12/363 (3%)
Opinion Lexicon                       32/2411 (1%)       1004/3994 (25%)   9/403 (2%)
General Inquirer                                         520/2306 (23%)    1/204 (0.5%)
SentiWordNet                                                               174/694 (25%)

Analyzing the polarity of each word in IMDB
- How likely is each word to appear in each sentiment class?
- Count("bad") in 1-star, 2-star, 3-star, etc.
- But we can't use raw counts; instead, use the likelihood: P(w|c) = f(w,c) / sum over w in c of f(w,c)

- To make them comparable between words, use the scaled likelihood: P(w|c) / P(w)
- Potts, Christopher. 2011. On the negativity of negation. SALT 20, 636-659.
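A sketch of the likelihood and scaled-likelihood computation over toy counts (the frequency numbers below are made up for illustration; only two rating classes are shown):

```python
# f[c][w]: frequency of word w in reviews with rating class c (toy numbers)
f = {1: {"bad": 60, "good": 10, "the": 400},
     5: {"bad": 10, "good": 80, "the": 420}}

def likelihood(w, c):
    """P(w | c) = f(w, c) / total word frequency in class c."""
    return f[c][w] / sum(f[c].values())

def scaled_likelihood(w, c):
    """P(w | c) / P(w), where P(w) pools counts over all classes."""
    total = sum(sum(fc.values()) for fc in f.values())
    p_w = sum(fc.get(w, 0) for fc in f.values()) / total
    return likelihood(w, c) / p_w
```

The point of the scaling: a sentiment word like "bad" comes out well above 1 in the 1-star class and well below 1 in the 5-star class, while a function word like "the" stays near 1 everywhere, so words become comparable regardless of their overall frequency.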

Analyzing the polarity of each word in IMDB
[Figures: scaled likelihood P(w|c)/P(w) per rating class; thanks to Chris Potts for permission to use the figures]
Potts, Christopher. 2011. On the negativity of negation. SALT 20, 636-659.

Other sentiment feature: Logical negation
- Is logical negation (no, not) associated with negative sentiment?
- Potts experiment: count negation (not, n't, no, never) in online reviews; regress against the review rating

Potts 2011 results: more negation in negative sentiment
[Figure: scaled likelihood P(w|c)/P(w) of negation across rating classes; thanks to Chris Potts for permission to use the figures]

Sentiment Analysis: Learning Sentiment Lexicons

Semi-supervised learning of lexicons
- Use a small amount of information (a few labeled examples, a few hand-built patterns) to bootstrap a lexicon

Hatzivassiloglou and McKeown intuition for identifying word polarity
- Adjectives conjoined by "and" have the same polarity: fair and legitimate, corrupt and brutal; *fair and brutal, *corrupt and legitimate
- Adjectives conjoined by "but" do not: fair but brutal
- Vasileios Hatzivassiloglou and Kathleen R. McKeown. 1997. Predicting the Semantic Orientation of Adjectives. ACL, 174-181.

Hatzivassiloglou & McKeown 1997, Step 1
- Label a seed set of 1336 adjectives (all occurring >20 times in a 21-million-word WSJ corpus)
- 657 positive: adequate central clever famous intelligent remarkable reputed sensitive slender thriving ...
- 679 negative: contagious drunken ignorant lanky listless primitive strident troublesome unresolved unsuspecting ...

Hatzivassiloglou & McKeown 1997, Step 2
- Expand the seed set to conjoined adjectives (e.g., "nice, helpful"; "nice, classy")

Hatzivassiloglou & McKeown 1997, Step 3
- A supervised classifier assigns "polarity similarity" to each word pair, resulting in a graph over words such as classy, nice, helpful, fair, brutal, irrational, corrupt

Hatzivassiloglou & McKeown 1997, Step 4
- Clustering partitions the graph into two sets: + (classy, nice, helpful, fair) and - (brutal, irrational, corrupt)

Output polarity lexicon
- Positive: bold decisive disturbing generous good honest important large mature patient peaceful positive proud sound stimulating straightforward strange talented vigorous witty ...
- Negative: ambiguous cautious cynical evasive harmful hypocritical inefficient insecure irrational irresponsible minor outspoken pleasant reckless risky selfish tedious unsupported vulnerable wasteful ...

Turney Algorithm
1. Extract a phrasal lexicon from reviews
2. Learn the polarity of each phrase
3. Rate a review by the average polarity of its phrases
- Turney (2002): Thumbs Up or Thumbs Down? Semantic Orientation Applied to Unsupervised Classification of Reviews

Extract two-word phrases with adjectives

  First Word          Second Word              Third Word (not extracted)
  JJ                  NN or NNS                anything
  RB, RBR, or RBS     JJ                       not NN nor NNS
  JJ                  JJ                       not NN nor NNS
  NN or NNS           JJ                       not NN nor NNS
  RB, RBR, or RBS     VB, VBD, VBN, or VBG     anything

How to measure the polarity of a phrase?
- Positive phrases co-occur more with "excellent"
- Negative phrases co-occur more with "poor"
- But how to measure co-occurrence?

Pointwise Mutual Information
- Mutual information between two random variables X and Y:
    I(X; Y) = sum over x, y of P(x, y) log2 [ P(x, y) / (P(x) P(y)) ]
- Pointwise mutual information: how much more do events x and y co-occur than if they were independent?
    PMI(x, y) = log2 [ P(x, y) / (P(x) P(y)) ]
- PMI between two words: how much more do two words co-occur than if they were independent?
    PMI(word1, word2) = log2 [ P(word1, word2) / (P(word1) P(word2)) ]

How to estimate Pointwise Mutual Information
- Query a search engine (Altavista)
- P(word) estimated by hits(word)/N
- P(word1, word2) estimated by hits(word1 NEAR word2)/N
- (More correctly, the bigram denominator should be kN, because there are a total of N consecutive bigrams (word1, word2) but kN bigrams that are k words apart; we just use N on the rest of this slide and the next.)
    PMI(word1, word2) ~= log2 [ hits(word1 NEAR word2) * N / (hits(word1) * hits(word2)) ]

Does the phrase appear more with "poor" or "excellent"?
    Polarity(phrase) = PMI(phrase, "excellent") - PMI(phrase, "poor")
                     = log2 [ hits(phrase NEAR "excellent") * hits("poor")
                              / (hits(phrase NEAR "poor") * hits("excellent")) ]
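The hit-count estimate and the review-rating step can be written directly. This is a sketch: the hit counts are hypothetical inputs standing in for search-engine queries, and N is the (assumed) corpus size. Note that N and hits(phrase) cancel out of the polarity difference, as the formula above shows.

```python
import math

def pmi_from_hits(hits_xy, hits_x, hits_y, N):
    """PMI(x, y) = log2(P(x, y) / (P(x) P(y))), each probability
    estimated as hits/N."""
    return math.log2((hits_xy / N) / ((hits_x / N) * (hits_y / N)))

def semantic_orientation(near_excellent, near_poor,
                         hits_excellent, hits_poor):
    """SO(phrase) = PMI(phrase, 'excellent') - PMI(phrase, 'poor').
    N and hits(phrase) cancel in the difference."""
    return math.log2((near_excellent * hits_poor) /
                     (near_poor * hits_excellent))

def rate_review(phrase_orientations):
    """Turney step 3: average the SO of a review's phrases;
    a positive average means thumbs-up."""
    avg = sum(phrase_orientations) / len(phrase_orientations)
    return "thumbs-up" if avg > 0 else "thumbs-down"
```

For instance, a phrase seen 40 times near "excellent" and 10 times near "poor", with "excellent" and "poor" equally frequent overall, gets SO = log2(4) = 2.0.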

Phrases from a thumbs-up review

  Phrase                   POS tags   Polarity
  online service           JJ NN      2.8
  online experience        JJ NN      2.3
  direct deposit           JJ NN      1.3
  local branch             JJ NN      0.42
  low fees                 JJ NNS     0.33
  true service             JJ NN      -0.73
  other bank               JJ NN      -0.85
  inconveniently located   JJ NN      -1.5
  Average                             0.32

Phrases from a thumbs-down review

  Phrase                POS tags   Polarity
  direct deposits       JJ NNS     5.8
  online web            JJ NN      1.9
  very handy            RB JJ      1.4
  virtual monopoly      JJ NN      -2.0
  lesser evil           RBR JJ     -2.3
  other problems        JJ NNS     -2.8
  low funds             JJ NNS     -6.8
  unethical practices   JJ NNS     -8.5
  Average                          -1.2

Results of Turney algorithm
- 410 reviews from Epinions: 170 (41%) negative, 240 (59%) positive
- Majority class baseline: 59%
- Turney algorithm: 74%

- Uses phrases rather than words
- Learns domain-specific information

Using WordNet to learn polarity
- WordNet: an online thesaurus (covered in a later lecture)
- Create positive ("good") and negative ("terrible") seed words
- Find synonyms and antonyms:
  - Positive set: add synonyms of positive words (well) and antonyms of negative words
  - Negative set: add synonyms of negative words (awful) and antonyms of positive words (evil)
- Repeat, following chains of synonyms
- Filter
- S.M. Kim and E. Hovy. 2004. Determining the sentiment of opinions. COLING 2004.
- M. Hu and B. Liu. 2004. Mining and summarizing customer reviews. In Proceedings of KDD.

Summary on Learning Lexicons
- Advantages: can be domain-specific; can be more robust (more words)
- Intuition: start with a seed set of words ("good", "poor"); find other words that have similar polarity, using "and" and "but", using words that occur nearby in the same document, or using WordNet synonyms and antonyms

- Use seeds and semi-supervised learning to induce lexicons

Sentiment Analysis: Other Sentiment Tasks

Finding the sentiment of a sentence
- Important for finding aspects or attributes (the target of the sentiment), e.g.:

"The food was great but the service was awful"

Finding the aspect/attribute/target of sentiment
- Frequent phrases + rules: find all highly frequent phrases across reviews ("fish tacos"); filter by rules like "occurs right after a sentiment word" ("... great fish tacos" means fish tacos is a likely aspect)
- Example aspects by domain:
    Casino: casino, buffet, pool, resort, beds
    Children's Barber: haircut, job, experience, kids
    Greek Restaurant: food, wine, service, appetizer, lamb
    Department Store: selection, department, sales, shop, clothing
- M. Hu and B. Liu. 2004. Mining and summarizing customer reviews. In Proceedings of KDD.
- S. Blair-Goldensohn, K. Hannan, R. McDonald, T. Neylon, G. Reis, and J. Reynar. 2008. Building a Sentiment Summarizer for Local Service Reviews. WWW Workshop.

Finding the aspect/attribute/target of sentiment
- The aspect name may not be in the sentence
- For restaurants/hotels, aspects are well-understood
- Supervised classification: hand-label a small corpus of restaurant review sentences with their aspect (food, décor, service, value, NONE); train a classifier to assign an aspect to a sentence ("Given this sentence, is the aspect food, décor, service, value, or NONE?")

Putting it all together: finding sentiment for aspects
- Pipeline: Reviews -> Text Extractor -> (Sentences & Phrases) -> Sentiment Classifier -> Aspect Extractor -> Aggregator -> Final Summary
- S. Blair-Goldensohn, K. Hannan, R. McDonald, T. Neylon, G. Reis, and J. Reynar. 2008. Building a Sentiment Summarizer for Local Service Reviews. WWW Workshop.

Results of the Blair-Goldensohn et al. method
- Rooms (3/5 stars, 41 comments)
    (+) The room was clean and everything worked fine even the water pressure ...
    (+) We went because of the free room and was pleasantly pleased ...
    (-) ... the worst hotel I had ever stayed at ...
- Service (3/5 stars, 31 comments)
    (+) Upon checking out another couple was checking early due to a problem ...
    (+) Every single hotel staff member treated us great and answered every ...
    (-) The food is cold and the service gives new meaning to SLOW.
- Dining (3/5 stars, 18 comments)
    (+) our favorite place to stay in biloxi. the food is great also the service ...
    (+) Offer of free buffet for joining the Play

Baseline methods assume classes have equal frequencies!
- If the classes are not balanced (common in the real world), you can't use accuracy as an evaluation; you need to use F-scores
- Severe imbalance can also degrade classifier performance
- Two common solutions:
  1. Resampling in training: random undersampling
  2. Cost-sensitive learning: penalize the SVM more for misclassification of the rare class
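Random undersampling, the first solution above, can be sketched generically: drop majority-class examples at random until every class is as frequent as the rarest one.

```python
import random
from collections import defaultdict

def undersample(examples, seed=0):
    """examples: list of (features, label). Randomly drop majority-class
    examples so every class matches the size of the rarest class."""
    by_class = defaultdict(list)
    for x, y in examples:
        by_class[y].append((x, y))
    n = min(len(v) for v in by_class.values())   # size of rarest class
    rng = random.Random(seed)
    balanced = []
    for v in by_class.values():
        balanced.extend(rng.sample(v, n))        # keep n per class
    rng.shuffle(balanced)
    return balanced
```

The cost is that discarded majority-class examples carry information; the cost-sensitive alternative keeps all the data and reweights errors instead.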

How to deal with 7 stars?
- Map to binary
- Use linear or ordinal regression, or specialized models like metric labeling
- Bo Pang and Lillian Lee. 2005. Seeing stars: Exploiting class relationships for sentiment categorization with respect to rating scales. ACL, 115-124.

Summary on Sentiment
- Generally modeled as a classification or regression task: predict a binary or ordinal label
- Features:
  - Negation is important
  - Using all words (in Naïve Bayes) works well for some tasks
  - Finding subsets of words may help in other tasks
- Hand-built polarity lexicons
- Use seeds and semi-supervised learning to induce lexicons

Scherer Typology of Affective States (recap; full definitions earlier): emotion, mood, interpersonal stances, attitudes, personality traits.

Computational work on other affective states
- Emotion: detecting annoyed callers to a dialogue system; detecting confused/frustrated versus confident students
- Mood: finding traumatized or depressed writers
- Interpersonal stances: detection of flirtation or friendliness in conversations
- Personality traits: detection of extroverts

Detection of Friendliness
- Friendly speakers use a collaborative conversational style:
  - Laughter
  - Less use of negative emotional words
  - More sympathy: "That's too bad", "I'm sorry to hear that"
  - More agreement: "I think so too"
  - Fewer hedges: "kind of", "sort of", "a little ..."

Ranganath, Jurafsky, and McFarland.

Sentiment Analysis: Other Sentiment Tasks