PolarityTrust: Measuring Trust and Reputation in Social Networks
TRANSCRIPT
Departamento de Lenguajes y Sistemas Informáticos
Escuela Técnica Superior de Ingeniería Informática
PolarityTrust: Measuring Trust and Reputation in Social Networks
F. Javier Ortega, José A. Troyano, Fermín L. Cruz, Fernando Enríquez de Salamanca
Motivation
♦ Example: on-line marketplaces
Motivation
How can I choose the best seller?
● The one with the highest number of sales?
● The one with the most positive opinions?
● The cheapest one?
How can I make the most from these transactions?
● Selling more products, but cheaper?
● Selling rare (and maybe expensive) articles?
● Free shipping?
Motivation
♦ Δ Reputation => Δ Sales
♦ Gaining a high reputation:
● Obtain (false) positive opinions from other accounts (not necessarily other users).
● Sell some bargains to obtain a high reputation from the buyers.
● Give negative opinions to sellers that may be competitors.
Motivation
♦ Goals:
● Compute a ranking of users according to their trustworthiness.
● Process a network with positive and negative links (opinions) between the nodes (users).
● Avoid the effects of actions that malicious users perform to increase their own reputation.
Roadmap
♦ Introduction
♦ PolarityTrust
♦ Evaluation
♦ Conclusions
Introduction
♦ Trust and Reputation Systems (TRS) manage trustworthiness of users in social networks.
♦ Common mechanisms:
● Moderators (on-line forums)
● Votes from users to users (eBay)
● Karma (Slashdot, Meneame)
● Graph-based ranking algorithms (EigenTrust)
Introduction
♦ User feedback is needed!
♦ Problems:
● Positive bias
● Incentives for user feedback
● Cold-start problem
● Exit problem
● Duplicity of identities
Introduction
♦ Malicious users' strategies to gain high reputation:
● Orchestrated attacks: obtaining positive opinions from other accounts (not necessarily other users).
● Camouflage behind good behaviour: selling some bargains to obtain a high reputation from the buyers.
● Malicious spies: using an honest account to provide positive opinions to a malicious user.
● Camouflage behind judgments: giving negative opinions to sellers that may be competitors.
Introduction
♦ Malicious users' strategies to gain high reputation:
● Orchestrated attacks: obtaining positive opinions from other accounts (not necessarily other users).
[Figure: example network (nodes 0-9) illustrating an orchestrated attack]
Introduction
♦ Malicious users' strategies to gain high reputation:
● Camouflage behind good behaviour: selling some bargains to obtain a high reputation from the buyers.
[Figure: example network (nodes 0-9) illustrating camouflage behind good behaviour]
Introduction
♦ Malicious users' strategies to gain high reputation:
● Malicious spies: using an honest account to provide positive opinions to a malicious user.
[Figure: example network (nodes 0-9) illustrating the malicious spies strategy]
Introduction
♦ Malicious users' strategies to gain high reputation:
● Camouflage behind judgments: giving negative opinions to sellers that may be competitors.
[Figure: example network (nodes 0-9) illustrating camouflage behind judgments]
PolarityTrust
♦ Graph-based ranking algorithm
♦ Two scores for each node: PT⁺ and PT⁻
♦ Propagation of trust and distrust over the network
♦ PT⁺ and PT⁻ influence each other depending on the polarity of the links between a node and its neighbours.
PolarityTrust
♦ Propagation mechanism:
● Given a set of trustworthy users.
● Their PT⁺ and PT⁻ scores are propagated to their neighbours, and so on.
[Figure: PT⁺ and PT⁻ scores propagated from the trusted seed nodes through the example network (nodes 0-9), shown before and after propagation]
PolarityTrust
♦ Propagation rules:
● Positive opinions => direct relation between scores
● Negative opinions => cross relation between scores
[Figure: propagation rules illustrated on example nodes a, b, c]
♦ Non-negative Propagation extension:
● Avoid the propagation of negative opinions from negative users.
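The rules above (direct relation for positive links, cross relation for negative ones, plus the optional non-negative extension) can be sketched as follows. This is an illustrative approximation, not the paper's exact formulation: the damping factor, the out-degree normalization, and all names are assumptions.

```python
# Hypothetical sketch of a PolarityTrust-style propagation loop.
# Assumptions (not from the paper): PageRank-like damping, scores split
# equally over a node's outgoing opinions, fixed number of iterations.

def propagate(nodes, edges, seeds, iters=20, damping=0.85, non_negative=False):
    """nodes: node ids; edges: (src, dst, sign) with sign in {+1, -1};
    seeds: the a-priori trustworthy users."""
    pt_pos = {n: (1.0 if n in seeds else 0.0) for n in nodes}
    pt_neg = {n: 0.0 for n in nodes}
    out_deg = {}
    for u, _, _ in edges:
        out_deg[u] = out_deg.get(u, 0) + 1
    for _ in range(iters):
        # Seeds keep re-injecting trust, like a personalization vector.
        new_pos = {n: (1.0 - damping) if n in seeds else 0.0 for n in nodes}
        new_neg = {n: 0.0 for n in nodes}
        for u, v, sign in edges:
            if non_negative and pt_neg[u] > pt_pos[u]:
                continue  # NN extension: negative users do not propagate
            share_pos = damping * pt_pos[u] / out_deg[u]
            share_neg = damping * pt_neg[u] / out_deg[u]
            if sign > 0:   # positive opinion => direct relation
                new_pos[v] += share_pos
                new_neg[v] += share_neg
            else:          # negative opinion => cross relation
                new_neg[v] += share_pos
                new_pos[v] += share_neg
        pt_pos, pt_neg = new_pos, new_neg
    return pt_pos, pt_neg
```

For example, with a trusted seed 0, a positive link 0→1, and a negative link 1→2, node 1 accumulates PT⁺ while node 2 accumulates PT⁻ via the cross relation.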
Evaluation
♦ Baselines:
● EigenTrust
● Fans Minus Freaks
♦ Evaluation metrics:
● Number of inversions: bad users in good positions
● Incremental number of bad nodes
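Reading "number of inversions" as the number of (bad, good) pairs where the bad user is ranked above the good one, the metric can be sketched as below; the function name and exact pair-counting interpretation are illustrative assumptions.

```python
# Hypothetical implementation of the "number of inversions" metric:
# count pairs where a malicious user is ranked above a good user.

def count_inversions(ranking, bad):
    """ranking: list of user ids, best-ranked first; bad: set of malicious ids."""
    inversions = 0
    bad_seen = 0
    for user in ranking:
        if user in bad:
            bad_seen += 1
        else:
            # Every bad user already seen sits above this good user.
            inversions += bad_seen
    return inversions
```

A perfect ranking (all good users before all bad ones) scores 0; the worst case scores |bad| x |good|.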
♦ Dataset:
● Randomly generated graphs (Barabási-Albert model).
● Malicious users added in order to perform common attacks.
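The Barabási-Albert model grows a graph by preferential attachment: each new node links to m existing nodes chosen with probability proportional to their degree. A minimal stdlib-only sketch (in practice a library generator such as NetworkX's `barabasi_albert_graph` would be used; this code is illustrative, not the paper's generator):

```python
import random

def barabasi_albert(n, m, seed=None):
    """Return an undirected Barabási-Albert graph as a list of edges.
    Starts from m initial nodes; each new node attaches to m distinct
    existing nodes, sampled degree-proportionally."""
    rng = random.Random(seed)
    targets = list(range(m))  # nodes the next newcomer will attach to
    repeated = []             # node list with each id repeated per degree
    edges = []
    for new in range(m, n):
        edges.extend((new, t) for t in targets)
        # Update the degree-weighted pool with both endpoints.
        repeated.extend(targets)
        repeated.extend([new] * m)
        # Sample m distinct targets for the next node.
        targets = []
        while len(targets) < m:
            t = rng.choice(repeated)
            if t not in targets:
                targets.append(t)
    return edges
```

Sampling from the degree-weighted `repeated` list is what produces the hub-dominated (scale-free) structure typical of social networks, which is why this model suits the attack simulations.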
Evaluation
♦ Performance against common attacks:
Isolated attacks (per model):

Model     ET    FmF   PT    PT+NN
A         50    0     0     0
B         197   36    0     0
C         63    207   94    94
D         86    9     9     9
E         74    4     0     0

Combined attacks (cumulative):

Model     ET    FmF   PT    PT+NN
A         50    0     0     0
B         197   36    0     0
B+C       155   873   27    27
B+C+D     169   871   26    26
B+C+D+E   183   849   38    36
A: No attacks
B: Orchestrated attacks
C: Camouflage behind good behaviour
D: Malicious Spies
E: Camouflage behind judgments
Evaluation
♦ Performance against incremental number of malicious users:
Conclusions
♦ PolarityTrust computes a ranking of users from a network of positive and negative opinions by propagating trust and distrust from a set of trustworthy users.
♦ The Non-negative Propagation extension further limits the influence of malicious users.
♦ On randomly generated graphs, PolarityTrust resists common attacks better than EigenTrust and Fans Minus Freaks.