Reaching Agreements: Negotiation
Truthful voters vote for the candidate they think is best. Why would you vote for something you didn't want? (In a run-off election, you may want to pick the competition; with more than two candidates, you may figure your candidate doesn't have a chance.) We vote in awarding scholarships, choosing teacher of the year, deciding whom to hire.
Rank feasible social outcomes based on agents' individual rankings of those outcomes:
- A - set of n agents; O - set of m feasible outcomes
- Each agent has a preference relation <i ⊆ O x O that is asymmetric and transitive

Voting
Social choice rule (good for society)
- Input: the agent preference relations (<1, ..., <n)
- Output: the elements of O sorted according to the input - gives the social preference relation < of the agent group
- In other words, it creates an ordering for the group
Desirable properties of the social choice rule
- A social preference ordering < should exist for all possible inputs (individual preferences)
- < should be defined for every pair (o, o') in O
- < should be asymmetric and transitive over O
- The outcomes should be Pareto efficient: if for all i in A, o <i o', then o < o' (do not misorder if all agree)
- The scheme should be independent of irrelevant alternatives: if < and <' are rankings based on different sets of choices, and every agent's relative ranking of o and o' is the same in both (their relative rankings are unaffected by the other choices present), then the social ranking of o and o' should have the same relationship
- No agent should be a dictator in the sense that o <i o' implies o < o' for all preferences of the other agents
Arrow's impossibility theorem
No social choice rule satisfies all six conditions, so we must relax the desired attributes:
- We may not require < to always be defined
- We may not require that < is asymmetric and transitive
Use a plurality protocol: all votes are cast simultaneously and the highest vote count wins. Introducing an irrelevant alternative may split the majority, causing both the old majority choice and the new irrelevant alternative to drop out of favor (the Ross Perot effect).
A binary protocol involves voting pairwise - single elimination. The order of the pairing can totally change the results (the example below is fascinating; it is the reason for seeded rankings in a basketball tournament).
One voter ranks c > d > b > a, one voter ranks a > c > d > b, one voter ranks b > a > c > d (notice the rankings just rotate the preferences). The pairing order determines the outcome:
winner(c, winner(a, winner(b, d))) = a
winner(d, winner(b, winner(c, a))) = d
winner(d, winner(c, winner(a, b))) = c
winner(b, winner(d, winner(c, a))) = b
Surprisingly, the order of pairing yields different winners.
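The four bracket results above can be checked mechanically. A minimal sketch in Python, using the three rotated ballots from the slide (function names are my own):

```python
# Three voters whose rankings rotate (from the slide); index 0 = most preferred.
ballots = [list("cdba"), list("acdb"), list("bacd")]

def pairwise_winner(x, y):
    """Majority vote between two candidates."""
    votes_x = sum(1 for b in ballots if b.index(x) < b.index(y))
    return x if votes_x > len(ballots) / 2 else y

def run_bracket(order):
    """Single elimination: first two meet, the winner faces the next, etc."""
    winner = order[0]
    for challenger in order[1:]:
        winner = pairwise_winner(winner, challenger)
    return winner

# The slide's four pairing orders, innermost match listed first.
print(run_bracket(["b", "d", "a", "c"]))  # a
print(run_bracket(["c", "a", "b", "d"]))  # d
print(run_bracket(["a", "b", "c", "d"]))  # c
print(run_bracket(["c", "a", "d", "b"]))  # b
```

Every candidate can be made the winner by choosing the pairing order.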
Borda protocol (used if the binary protocol is too slow): assign an alternative |O| points for the highest preference, |O|-1 points for the second, and so on. The counts are summed across the voters and the alternative with the highest count becomes the social choice.
Winner turns loser and loser turns winner if the lowest-ranked alternative is removed (does this surprise you?). See the table on the next slide.
Borda Paradox - remove the loser and the winner changes (notice c is always ahead of the removed item). Seven voters:
- a > b > c > d
- b > c > d > a
- c > d > a > b
- a > b > c > d
- b > c > d > a
- c > d > a > b
- a > b > c > d
Borda counts: a = 18, b = 19, c = 20, d = 13, so c wins and d loses.
Now remove the loser d and recount the same ballots:
a > b > c, b > c > a, c > a > b, a > b > c, b > c > a, c > a > b, a > b > c
a = 15, b = 14, c = 13. When the loser is removed, the previous winner becomes the new loser.
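A short sketch reproducing the counts above (the seven-ballot profile is the one shown on the slide; the helper name is my own):

```python
def borda(profiles):
    """Score |O| points for a voter's top choice, |O|-1 for second, and so on."""
    n = len(profiles[0])
    scores = {}
    for ranking in profiles:
        for place, cand in enumerate(ranking):
            scores[cand] = scores.get(cand, 0) + (n - place)
    return scores

# Seven voters: three rank a>b>c>d, two rank b>c>d>a, two rank c>d>a>b.
profile = [list("abcd"), list("bcda"), list("cdab")] * 2 + [list("abcd")]
print(borda(profile))   # {'a': 18, 'b': 19, 'c': 20, 'd': 13} -> c wins, d loses

# Remove the loser d and recount: the ranking reverses completely.
reduced = [[c for c in r if c != "d"] for r in profile]
print(borda(reduced))   # {'a': 15, 'b': 14, 'c': 13} -> a wins, c loses
```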
Strategic (insincere) voters
- Suppose your choice will likely come in second place. If you rank the rest of the group's first choice very low, you may lower that choice enough that yours comes first.
- True story: a dean's selection. Each committee member was told they had 5 points to award and could spread them out any way among the candidates; the recipient of the most points wins. I put all my points on one candidate. Most members split their points. I swung the vote. What was my gamble?
- We want to get the results as if truthful voting were done.
Typical Competition Mechanisms
- Auction: allocate goods or tasks to agents through a market. We need a richer technique for reaching agreements.
- Negotiation: reach agreements through interaction.
- Argumentation: resolve conflicts through debate.
Negotiation
- May involve:
  - Exchange of information
  - Relaxation of initial goals
  - Mutual concession
Mechanisms, Protocols, Strategies
- Negotiation is governed by a mechanism or a protocol:
  - defines the "rules of encounter" between the agents
  - the public rules by which the agents will come to agreements
- Given a particular protocol, how can a particular strategy be designed that individual agents can use?
Negotiation Mechanism
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Issues in the negotiation process:
- Negotiation Space: all possible deals that agents can make, i.e., the set of candidate deals.
- Negotiation Protocol: a rule that determines the process of a negotiation - how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on.
- Negotiation Strategy: when and what proposals should be made.
Protocol
- Determines the kinds of deals that can be made
- Determines the sequence of offers and counter-offers
- The protocol is like the rules of a chess game, whereas the strategy is the way in which a player decides which move to make
Game Theory
- Computers make concrete the notion of strategy, which is central to game playing
Mechanism Design
- Mechanism design is the design of protocols for governing multi-agent interactions
- Desirable properties of mechanisms are:
  - Convergence/guaranteed success
  - Maximising global welfare: the sum of agent benefits is maximized
  - Pareto efficiency
  - Individual rationality
  - Stability: no agent should have an incentive to deviate from its strategy
  - Simplicity: low computational demands, little communication
  - Distribution: no central decision maker
  - Symmetry: we do not want agents to play different roles (all agents have the same choice of actions)
Attributes not universally accepted
- Can't always achieve every attribute, so look at the tradeoffs between choices; for example, efficiency and stability are sometimes in conflict with each other.
Negotiation Protocol
- Who begins?
- Take turns
- Build off previous offers
- Give feedback (or not)
- Tell what your utility is (or not)
- Obligations
- Privacy
- Allowed proposals you can make as a result of negotiation history
Thought Question
- Why not just compute a joint solution - using linear programming?
Negotiation Process 1
- Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round.
- Communication during negotiation (diagram: Agent_i and Agent_j exchange Proposal and Counter-Proposal until Agent_i concedes).
Negotiation Process 2
- Another way of looking at the negotiation process (one can talk about 50/50 or 90/10, depending on who "moves" the farthest): proposals by A_i and proposals by A_j converge toward a point of acceptance/agreement.
Many types of interactive concession-based methods
- Some use multiple-objective linear programming - requires that the players construct a crude linear approximation of their utility functions.
- Jointly Improving Direction method: start out with a neutral suggested value; continue until no joint improvements are possible.
  - Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002)
Jointly Improving Direction method
Iterate over:
- Mediator helps players criticize a tentative agreement (could be the status quo)
- Generates a compromise direction (where each of the k issues is a direction in k-space)
- Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
Typical Negotiation Problems
- Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
- State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
- Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
Complex Negotiations
- Some attributes that make the negotiation process complex are:
  - Multiple attributes:
    - Single attribute (price) - a symmetric scenario (both benefit in the same way from a cheaper price)
    - Multiple attributes - several inter-related attributes, e.g., buying a car
  - The number of agents and the way they interact:
    - One-to-one, e.g., a single buyer and a single seller
    - Many-to-one, e.g., multiple buyers and a single seller (auctions)
    - Many-to-many, e.g., multiple buyers and multiple sellers
Single-issue negotiation
- Like money
- Symmetric (if roles were reversed, I would benefit the same way you would)
  - If one task requires less travel, both would benefit equally from having less travel
  - Utility for a task is experienced the same way by whomever is assigned to that task
- Non-symmetric - we would benefit differently if roles were reversed
  - If you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
Multiple-issue negotiation
- Could be hundreds of issues (cost, delivery date, size, quality)
- Some may be inter-related (as size goes down, cost goes down, quality goes up)
- Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
- May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, a bigger office, working from home)
How many agents are involved?
- One to one
- One to many (an auction is an example: one seller and many buyers)
- Many to many (could be divided into buyers and sellers, or all could be identical in role)
  - n(n-1)/2 pairs
Negotiation Domains: Task-oriented
- "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
- An agent can carry out the tasks without interference (or help) from other agents - such as "who will deliver the mail"
- All resources are available to the agent
- Tasks are redistributed for the benefit of all agents
Task-oriented Domain: Definition
- How can an agent evaluate the utility of a specific deal?
  - Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation).
  - Since an agent can achieve its goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal.
  - If utility < 0, the agent is worse off than performing its tasks on its own.
- Conflict deal (stay with the status quo) if agents fail to reach an agreement:
  - no agent agrees to execute tasks other than its own
  - utility = 0
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple <T, Ag, c> where:
- T is a finite set of all possible tasks
- Ag = {A1, A2, ..., An} is a list of participant agents
- c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0.
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation).
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2).
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment, D is the assignment after the "deal".
- An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
- A pure deal on an encounter is a redistribution of tasks among the agents: (D1, D2) such that all tasks are reassigned, D1 ∪ D2 = T1 ∪ T2.
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
- For each deal δ = (D1, D2), the cost of the deal to agent k is Cost_k(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal).
Examples of TOD
- Parcel Delivery: several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the travel cost to each courier is minimal.
- Database Queries: several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, ...), e.g., "you are doing a join as part of another operation, so please save the results for me".
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) - the conflict deal (the original allocation (T1, T2))
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Figuring the deals, knowing the union must be ab:
- Choices for the first agent: ∅, a, b, ab
- The second agent must "pick up the slack":
  - a for agent 1 → b or ab for agent 2
  - b for agent 1 → a or ab
  - ab for agent 1 → ∅, a, b, or ab
  - ∅ for agent 1 → ab
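The enumeration above can be sketched as a comprehension over subsets, assuming the encounter ({a}, {a, b}) from the slide (helper names are my own):

```python
from itertools import combinations

def subsets(tasks):
    """All subsets of a task set, as frozensets."""
    return [frozenset(c) for r in range(len(tasks) + 1)
            for c in combinations(tasks, r)]

def pure_deals(t1, t2):
    """All (D1, D2) with D1 ∪ D2 = T1 ∪ T2 (tasks may be shared)."""
    union = frozenset(t1) | frozenset(t2)
    return [(d1, d2) for d1 in subsets(union) for d2 in subsets(union)
            if d1 | d2 == union]

deals = pure_deals({"a"}, {"a", "b"})
print(len(deals))   # 9 distinct pure deals, as on the slide
```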
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, and is defined as follows:
Utility_k(δ) = c(Tk) - Cost_k(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal:
- c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
- Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
Parcel Delivery Domain (assuming couriers do not have to return home - like U-Haul)
(Figure: distribution point with roads of length 1 to city a and to city b, and a road of length 2 between a and b.)
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
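The utilities above follow mechanically from Utility_k(δ) = c(T_k) - c(D_k). A small sketch, assuming the slide's cost function (the helper name is my own):

```python
# Cost function from the slide's parcel example.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}

def utility(agent_tasks, agent_share):
    """Utility_k(δ) = c(T_k) - c(D_k): stand-alone cost minus cost of k's role."""
    return cost[frozenset(agent_tasks)] - cost[frozenset(agent_share)]

# Agent 1 originally delivers to a; agent 2 delivers to both a and b.
print(utility("a", "a"), utility("ab", "b"))    # deal (a, b):  0 2
print(utility("a", ""), utility("ab", "ab"))    # deal (∅, ab): 1 0
print(utility("a", "ab"), utility("ab", ""))    # deal (ab, ∅): -2 3
```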
Dominant Deals
- Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
  - δ is at least as good for every agent as δ': for all k in {1, 2}, Utility_k(δ) ≥ Utility_k(δ')
  - δ is better for some agent than δ': for some k in {1, 2}, Utility_k(δ) > Utility_k(δ')
- Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'.
Negotiation Set: Space of Negotiation
- A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have).
- A deal δ is called Pareto optimal if there does not exist another deal δ' that dominates δ (the best deal for x without disadvantaging y).
- The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
Utility Function for Agents (example from the previous slide)
Agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(ab, ∅) = -2; Utility1(∅, ab) = 1; Utility1(a, ab) = 0; Utility1(b, ab) = 0; Utility1(ab, a) = -2; Utility1(ab, b) = -2; Utility1(ab, ab) = -2
Agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(ab, ∅) = 3; Utility2(∅, ab) = 0; Utility2(a, ab) = 0; Utility2(b, ab) = 0; Utility2(ab, a) = 2; Utility2(ab, b) = 2; Utility2(ab, ab) = 0
Individual Rational for Both (eliminate any choices that are negative for either)
Of the nine deals - (a, b), (b, a), (ab, ∅), (∅, ab), (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab) - the individually rational ones are:
(a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto Optimal Deals
Of the nine deals, the Pareto optimal ones are:
(a, b), (b, a), (ab, ∅), (∅, ab)
For example, (a, ab) is beaten by the (a, b) deal; (ab, ∅) is (-2, 3), but nothing beats 3 for agent 2, so it is not dominated.
Negotiation Set
Individual Rational Deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto Optimal Deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation Set (the intersection): (a, b), (b, a), (∅, ab)
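Deriving the negotiation set from the utility table can be sketched as a pair of filters (the utility pairs are hard-coded from the slides; helper names are my own):

```python
# Utility pairs (agent 1, agent 2) for the nine pure deals from the slides.
deals = {
    "(a, b)": (0, 2),   "(b, a)": (0, 2),   "(ab, ∅)": (-2, 3),
    "(∅, ab)": (1, 0),  "(a, ab)": (0, 0),  "(b, ab)": (0, 0),
    "(ab, a)": (-2, 2), "(ab, b)": (-2, 2), "(ab, ab)": (-2, 0),
}

def individual_rational(u):
    # Weakly dominates the conflict deal, whose utilities are (0, 0).
    return all(x >= 0 for x in u)

def pareto_optimal(u):
    # No other utility pair is at least as good for both and different.
    return not any(all(v[i] >= u[i] for i in range(2)) and v != u
                   for v in deals.values())

ns = [d for d, u in deals.items()
      if individual_rational(u) and pareto_optimal(u)]
print(ns)   # ['(a, b)', '(b, a)', '(∅, ab)']
```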
Negotiation Set illustrated
- Create a scatter plot of the utility for i over the utility for j.
- Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal).
- Which are Pareto optimal?
Negotiation Set in Task-oriented Domains
(Figure: deals A-E plotted against axes "utility for agent i" and "utility for agent j". The circle delimits the space of all possible deals; lines through the conflict deal mark each agent's conflict utility; the negotiation set is the arc that is both Pareto optimal and individual rational.)
Negotiation Protocol
π(δ) - the product of the two agents' utilities from δ.
- Product-maximizing negotiation protocol: a one-step protocol, or a concession protocol.
- At each step t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that:
  - both deals are from the negotiation set, and
  - for each agent i and t > 0, Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t-1)) - I propose something less desirable for me.
- Negotiation ending:
  - Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t-1)) for both agents (no one concedes)
  - Agreement: for some j ≠ i, Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t)):
    - only A → agree on δ(B, t): A accepts B's proposal
    - only B → agree on δ(A, t): B accepts A's proposal
    - both A and B → agree on the δ(k, t) such that π(δ(k, t)) = max(π(δ(A, t)), π(δ(B, t)))
    - both A and B, and π(δ(A, t)) = π(δ(B, t)) → flip a coin (the product is the same, but the deals may not be the same for each agent - flip a coin to decide which deal to use)
Deals may be pure deals or mixed deals.
The Monotonic Concession Protocol - offers move in one direction, toward the middle
The rules of this protocol are as follows:
- Negotiation proceeds in rounds.
- In round 1, the agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same deal later).
- Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
- If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals.
- An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values.)
- If neither agent makes a concession in some round, then negotiation terminates with the conflict deal.
- Metadata: an explanation or critique of a deal.
Condition to Consent to an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
The Monotonic Concession Protocol
- Advantages:
  - Symmetrically distributed (no agent plays a special role)
  - Ensures convergence
  - It will not go on indefinitely
- Disadvantages:
  - Agents can run into conflicts
  - Inefficient - no guarantee that an agreement will be reached quickly
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
- What should an agent's first proposal be?
- In any given round, who should concede?
- If an agent concedes, then how much should it concede?
The Zeuthen Strategy - a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (It is a way of telling the other agent what you value.)
(Diagram: agent 1's best deal at one end, agent 2's best deal at the other.)
The Zeuthen Strategy
Q: I make a proposal in every round, but it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(Diagram: between agent 1's best deal and agent 2's best deal, each agent asks "how much am I willing to risk a conflict?")
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
- You have lost much of your expected utility (it is closer to zero).
- In case conflict occurs, you are not much worse off.
- So you are more willing to risk conflict.
An agent's willingness to risk conflict is based on the difference between its loss from making a concession and its loss from reaching the conflict deal, with respect to its current offer.
- If both are equally willing to risk conflict, both concede.
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
- how much you will lose if you make a concession and accept your opponent's offer;
- how much you will lose if you stand still, which causes a conflict:
risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agents i and j, respectively.
risk_i is willingness to risk conflict (1 means perfectly willing to risk a conflict).
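As a quick sketch of the formula applied to the parcel example's first offers (the zero-utility convention for the degenerate case is my own assumption):

```python
def risk(u_own, u_their):
    """risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)."""
    if u_own == 0:
        # Own offer gains nothing, so conflict costs nothing extra.
        return 1.0
    return (u_own - u_their) / u_own

# First offers in the parcel example: agent 1 offers (∅, ab), worth 1 to itself
# and 0 under agent 2's offer; agent 2 offers (a, b), worth 2 to itself and 0
# under agent 1's offer.
print(risk(1, 0), risk(2, 0))   # 1.0 1.0 -> both maximally willing to risk conflict
```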
Risk Evaluation
- risk measures the fraction of your current gain you still stand to lose. If it is close to one, you have gained little (and are more willing to risk conflict).
- This assumes you know the other agent's utility.
- What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Diagram: between A_i's best deal and A_j's best deal, with the conflict deal below; each agent weighs the maximum it could gain from agreement against the maximum it can still hope to gain.)
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) - otherwise it will just be your turn to concede again at the next round - but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
About MCP and Zeuthen Strategies
- Advantages:
  - Simple, and reflects the way human negotiations work
  - Stability - in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
- Disadvantages:
  - Computationally expensive - players need to compute the entire negotiation set
  - Communication burden - the negotiation process may involve several steps
Parcel Delivery Domain: recall agent 1 delivers to a; agent 2 delivers to a and b.
Negotiation Set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b).
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent.
Can they reach an agreement? Who will concede?
Conflict Deal
(Diagram: agent 1's best deal and agent 2's best deal at opposite ends, with "he should concede" pointing at each.)
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground.
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(Figure: distribution point with roads of length 7 to city a and to city d; cities a, b, c, d in a row with roads of length 1 between a-b, b-c, and c-d.)
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices in the set are individually rational, as an agent can't do worse than the conflict deal; a split like (ac, bd) is dominated by (ab, cd).
Parcel Delivery Domain, Example 2 (Zeuthen works here; both concede on equal risk)

No.  Pure Deal    Agent 1's Utility  Agent 2's Utility
1    (abcd, ∅)    0                  10
2    (abc, d)     1                  3
3    (ab, cd)     2                  2
4    (a, bcd)     3                  1
5    (∅, abcd)    10                 0
-    conflict     0                  0

(Agent 1 starts at deal 5 and concedes toward deal 1; agent 2 starts at deal 1 and concedes toward deal 5.)
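The concession process on this example can be sketched as a small loop. This is an illustrative simplification of the Zeuthen strategy, assuming a concession means stepping to the adjacent deal in the table and that equal risk makes both agents concede:

```python
# Negotiation set for parcel example 2, as (deal, U1, U2) from the slide's table.
ns = [("(abcd, ∅)", 0, 10), ("(abc, d)", 1, 3), ("(ab, cd)", 2, 2),
      ("(a, bcd)", 3, 1), ("(∅, abcd)", 10, 0)]

def risk(own, their):
    return 1.0 if own == 0 else (own - their) / own

# Agents start from their own best deal and concede one step at a time.
i, j = len(ns) - 1, 0               # indices of agent 1's and agent 2's offers
while True:
    u1_own, u2_under_1 = ns[i][1], ns[i][2]
    u1_under_2, u2_own = ns[j][1], ns[j][2]
    if u1_under_2 >= u1_own or u2_under_1 >= u2_own:
        break                       # someone's offer is acceptable to the other
    r1 = risk(u1_own, u1_under_2)
    r2 = risk(u2_own, u2_under_1)
    if r1 <= r2:
        i -= 1                      # agent 1 concedes toward agent 2
    if r2 <= r1:
        j += 1                      # agent 2 concedes; equal risk -> both concede
print(ns[i][0], ns[j][0])           # both arrive at (ab, cd), utilities (2, 2)
```

Both rounds have equal risk (1 and 1, then 2/3 and 2/3), so both agents concede twice and meet at (ab, cd).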
What bothers you about the previous agreement?
- The agents decide to both get (2, 2) utility rather than gamble on a deal like (0, 10), whose expected utility under a fair lottery would be higher.
- Is there a solution?
- Fairness versus higher global utility.
- Restrictions of this method (no promises about the future, no sharing of utility).
Nash Equilibrium
- The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
- Generally, Nash equilibrium is not applicable in a negotiation setting because it requires both sides' utility functions.
- It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
- An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
State-Oriented Domain
- Goals are acceptable final states (a superset of TOD).
- Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
- Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
- Example: slotted blocks world - blocks cannot go just anywhere on the table, only in slots (a restricted resource).
- Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated.
- "Joint plan" is used to mean "what they both do", not "what they do together" - it is just the joining of plans. There is no joint goal.
- The actions taken by agent k in the joint plan J are called k's role, written J_k.
- C(J)_k is the cost of k's role in joint plan J.
- In TOD you cannot do another's task as a side effect of doing yours, or get in their way.
- In TOD coordinated plans are never worse, as you can just do your original task.
- With SOD you may get in each other's way.
- Don't accept partially completed plans.
A state-oriented domain is a bit more powerful than a TOD.
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Inter-agent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
Achievement of Final State
- The goal of each agent is represented as a set of states that it would be happy with.
- We are looking for a state in the intersection of the goals.
- Possibilities:
  - Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
  - The goals may contradict, so there is no mutually acceptable state (e.g., both need a car).
  - A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up another person).
  - There might be a reachable state which satisfies both, but it may be too expensive, and the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
What if choices don't benefit the agents fairly?
- Suppose there are two states that satisfy both agents.
- State 1 has a cost of 6 for one agent and 2 for the other.
- State 2 costs both agents 5.
- State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
Mixed deal
- Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
- Assign a probability that each agent would get a certain role in the plan.
- Called a mixed deal - a deal with probability. Compute the probability so that the expected utility is the same for both.
Cost
- If δ = (J, p) is a mixed deal, then cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent - i plays k's role with probability 1-p.
- Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
- Consider the postman (parcel delivery) example:
Parcel Delivery Domain (repeated; assuming couriers do not have to return home)
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a): Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(ab, ∅) = -2; Utility1(∅, ab) = 1; ...
Utility for agent 2 (originally ab): Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(ab, ∅) = 3; Utility2(∅, ab) = 0; ...
Consider deal 3 with probability
- (∅, ab)_p means agent 1 does ∅ with probability p and ab with probability 1-p.
- What should p be to be fair to both (equal utility)?
  p(1) + (1-p)(-2) = utility for agent 1
  p(0) + (1-p)(3) = utility for agent 2
  p(1) + (1-p)(-2) = p(0) + (1-p)(3)
  3p - 2 = 3 - 3p  =>  p = 5/6
- If agent 1 does no deliveries 5/6 of the time, it is fair.
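The same calculation as a tiny helper that solves the linear fairness equation (the function name and parameterization are my own):

```python
from fractions import Fraction

def fair_p(u_hi, u_lo, v_hi, v_lo):
    """Solve p*u_hi + (1-p)*u_lo == p*v_hi + (1-p)*v_lo for p."""
    return Fraction(v_lo - u_lo, (u_hi - u_lo) - (v_hi - v_lo))

# Agent 1: utility 1 if it delivers nothing, -2 if it delivers everything.
# Agent 2: utility 0 if agent 1 delivers nothing, 3 otherwise.
p = fair_p(1, -2, 0, 3)
print(p)   # 5/6 -> agent 1 does no deliveries 5/6 of the time
```

Note that when both agents' utilities are constant in p (as with the (a, b)_p lottery on the next slide), the denominator is zero and no fair p exists.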
Try again with the other choice in the negotiation set
- (a, b)_p means agent 1 does a with probability p and b with probability 1-p.
- What should p be to be fair to both (equal utility)?
  p(0) + (1-p)(0) = utility for agent 1
  p(2) + (1-p)(2) = utility for agent 2
  0 = 2: no solution.
- Can you see why we can't use a p to make this fair?
Mixed deal
- All-or-nothing deal (one agent does everything): the mixed deal m = [(T_A ∪ T_B, ∅); p] such that π(m) = max over deals d in NS of π(d), where π is the product of utilities.
- Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
- A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
- We may flip between choices even if both are the same, just to avoid possible bias - like switching goals in soccer.
Examples: Cooperative - each is helped by the joint plan
- Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
- Assume a pick-up costs 1 and a set-down costs 1.
- Mutually beneficial: each can pick up at the same time, costing each 2 - a win, as neither had to move the other block out of the way.
- If done by one agent, the cost would be four, so the utility to each is 2.
Examples: Compromise - both can succeed, but each does worse than if the other agent weren't there
- Slotted blocks world: initially the white block is at slot 1, the black block at slot 2, and two gray blocks at slot 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
- Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But that would undo the other's goal.
- Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
- Both can be happy, but the roles are unequal.
Choices
- Maybe each goal doesn't need to be achieved. The cost of achieving one goal is two; achieving both averages four per agent.
- If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
- What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
Compromise, continued
- Who should get to do the easier role?
- If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
- Look at worth. If A1 assigns a worth (utility) of 3 to the final goal and A2 assigns a worth of 6, we can use probability to make it "fair".
- Give agent 1 the easy role (cost 2 versus 6) with probability p:
  Utility for agent 1 = p(1) + (1-p)(-3)   (it loses utility if it spends 6 for a benefit of 3)
  Utility for agent 2 = p(0) + (1-p)(4)
  Setting the utilities equal: 4p - 3 = 4 - 4p  =>  p = 7/8
- Thus we can take an unfair division and make it fair.
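Checking the arithmetic, with the worths and role costs assumed on the slide:

```python
from fractions import Fraction

# Hypothetical figures from the slide: the goal is worth 3 to agent 1 and 6 to
# agent 2; the easy role costs 2 and the hard role costs 6.
p = Fraction(7, 8)                     # probability agent 1 gets the easy role
u1 = p * (3 - 2) + (1 - p) * (3 - 6)   # = 4p - 3
u2 = p * (6 - 6) + (1 - p) * (6 - 2)   # = 4 - 4p
print(u1, u2)   # 1/2 1/2 -> the lottery is fair
```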
Example: Conflict
- I want black on white (in slot 1); you want white on black (in slot 1).
- We can't both win. We could flip a coin to decide who wins; that is better than both losing. The weightings on the coin needn't be 50-50.
- It may make sense to have the agent with the highest worth get its way, as total utility is greater (it would accomplish its goal alone). Efficient, but not fair.
- What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as it helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(-4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their expected utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
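The bookkeeping above can be checked directly with the slide's numbers (the variable names are mine):

```python
solo_cost = 26                # cost of achieving one's own goal alone
swap_share = 4                # each agent's share of the cooperative swap
rest_cost = 10                # winner's remaining cost after the swap

win_utility = solo_cost - swap_share - rest_cost    # 26 - 4 - 10 = 12
lose_utility = -swap_share                          # helped swap, goal unmet
expected = 0.5 * win_utility + 0.5 * lose_utility   # 4.0

# If instead both goals are satisfied at cost 24 each:
both_utility = solo_cost - 24                       # 2

print(expected, both_utility)   # gambling doubles the expected utility
```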
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with highest value
• We assume the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple:
⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing each plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle) – worth isn't just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price*60% + quality*15% + support*25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price [5,100], quality [0,10], support [1,5]
– Try to find the Pareto optimum
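The weighted-attribute scheme above can be sketched directly. The offers and their scores below are hypothetical, with each attribute normalized to 0..1 so that higher is always better (e.g. a cheaper price gets a higher price score):

```python
def weighted_utility(offer, weights):
    """Linear additive utility over normalized attribute scores (0..1)."""
    return sum(weights[attr] * score for attr, score in offer.items())

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}

# Hypothetical offers with pre-normalized attribute scores.
offer_a = {"price": 0.9, "quality": 0.5, "support": 0.4}
offer_b = {"price": 0.6, "quality": 0.9, "support": 0.8}

ua = weighted_utility(offer_a, weights)   # 0.54 + 0.075 + 0.10 = 0.715
ub = weighted_utility(offer_b, weights)   # 0.36 + 0.135 + 0.20 = 0.695
print(ua > ub)   # True: the heavy price weight favors offer_a
```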
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
  • Hiding letters
  • Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive (strict <):
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment
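A brute-force check of the subadditivity condition over a toy cost function can make the definition concrete. Both cost functions below are illustrative assumptions: the first mimics tree-shaped deliveries (a shared trunk plus one leg per task), the second mimics threshold-style costs that break subadditivity:

```python
from itertools import combinations

def powerset(tasks):
    s = list(tasks)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

def is_subadditive(tasks, cost):
    """Check c(X | Y) <= c(X) + c(Y) for all subsets X, Y of tasks."""
    subsets = powerset(tasks)
    return all(cost(x | y) <= cost(x) + cost(y)
               for x in subsets for y in subsets)

# Tree-like delivery: 1 shared trunk + 1 per leg (hypothetical numbers).
tree_cost = lambda s: 0 if not s else 1 + len(s)
print(is_subadditive({"a", "b", "c"}, tree_cost))          # True

# Superlinear "threshold" cost: doing both exceeds the sum of the parts.
print(is_subadditive({"a", "b"}, lambda s: len(s) ** 2))   # False
```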
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know a given fact (diagonal arrows are between domains). A rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which is in turn a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice: modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats – is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
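As an illustrative sketch of why the fax domain is modular: if the cost of a task set is one connection fee per distinct destination (the fee of 1 per node is my assumption, not from the slides), the inclusion–exclusion identity above holds exactly:

```python
# Fax-domain toy cost: one connection fee per distinct destination node.
def cost(tasks):
    return len(set(tasks))   # tasks = set of destination nodes

X = {"a", "b", "c"}
Y = {"b", "c", "d"}

# Modularity: c(X u Y) = c(X) + c(Y) - c(X n Y)
lhs = cost(X | Y)                       # 4
rhs = cost(X) + cost(Y) - cost(X & Y)   # 3 + 3 - 2 = 4
print(lhs == rhs)   # True
```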
101
3-dimensional table of Characterization of Relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: if A1 hides the letter to c, his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected utility of ⟨(abc), ∅⟩ with p = 1/2 is 1.5
• Under the lie, p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1) of ⟨(abc), ∅⟩ with p = 1/2 is 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as if the other person delivers the phantom, you are found out)
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below (the apparent benefit is no different from the real benefit).
Under truth the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 gets | U(1) | Agent 2 gets | U(2) (seems) | U(2) (actual)
a            | 2    | bc           | 4            | 4
b            | 4    | ac           | 2            | 2
bc           | 2    | a            | 4            | 2
ab           | 0    | c            | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is ⟨(ab), ∅; 1/2⟩ = 1/2(4) + 1/2(2) = 3
• Lie: ⟨(b), (ca)⟩ is the logical division, as there is no percentage. Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, in any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5), so 18p = 11
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
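The two equilibrium probabilities and the resulting utilities can be verified with exact arithmetic (the equations are the slide's; the Python form is mine):

```python
from fractions import Fraction

# Truth: p(-1) + (1-p)(6) = p(4) + (1-p)(-3)  ->  6 - 7p = 7p - 3
p_truth = Fraction(9, 14)
assert p_truth * -1 + (1 - p_truth) * 6 == p_truth * 4 + (1 - p_truth) * -3

# Decoy lie: p(-3) + (1-p)(6) = p(4) + (1-p)(-5)  ->  6 - 9p = 9p - 5
p_lie = Fraction(11, 18)
assert p_lie * -3 + (1 - p_lie) * 6 == p_lie * 4 + (1 - p_lie) * -5

# Agent 1's REAL stakes stay (-1, 6): the decoy only shifts p in his favor.
u_truth = p_truth * -1 + (1 - p_truth) * 6   # 21/14 = 1.5
u_lie = p_lie * -1 + (1 - p_lie) * 6         # 31/18, about 1.72

print(float(u_truth), float(u_lie))   # lying raises 1.5 to ~1.72
```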
107
Postmen – return to post office
[Figure: postmen-domain examples – a concave case, a subadditive case (h is the decoy), and a phantom case]
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: ⟨(fae), (bcd); 1/2⟩ gives utility 4 (save going to two)
• Under the lie, dividing as ⟨(efd), (cab); p⟩: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie with ⟨(abc), (def); p⟩:
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility in that case
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case-Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
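The two compromise properties can be sketched as a selection rule over candidate proposals. The proposal names and payoff numbers below are hypothetical; the rule maximizes joint payoff first and breaks ties by minimizing the payoff gap:

```python
# Hypothetical labor-negotiation proposals: name -> (company payoff, union payoff).
proposals = {
    "wage+2%": (6, 5),
    "wage+5%": (9, 1),
    "wage+3%,benefits": (7, 6),
}

def score(payoffs):
    u1, u2 = payoffs
    # Tuples compare lexicographically: joint payoff first, then fairness.
    return (u1 + u2, -abs(u1 - u2))

best = max(proposals, key=lambda name: score(proposals[name]))
print(best)   # wage+3%,benefits: joint payoff 13 with a gap of only 1
```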
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory behavior
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent, rising from 0 at 9:00 toward 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility versus number of negotiation rounds – Agent i's and Agent j's offer curves converge to a point of acceptance]
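The convergence picture can be sketched as a toy simulation; the linear concession model and all numbers are my assumptions, not from the slides:

```python
def negotiate(ideal_i, ideal_j, concession=1.0, max_rounds=100):
    """Both agents concede each round; agreement once the offers cross.
    Agent i concedes downward from its ideal, agent j upward (toy model)."""
    offer_i, offer_j = ideal_i, ideal_j
    for rounds in range(1, max_rounds + 1):
        if offer_i <= offer_j:              # offers have crossed: accept
            return rounds, (offer_i + offer_j) / 2
        offer_i -= concession               # i concedes
        offer_j += concession               # j concedes
    return None                             # no agreement within the limit

print(negotiate(10.0, 2.0))   # (5, 6.0): they meet in the middle
```

With a zero concession rate the loop exhausts `max_rounds` and returns `None`, which corresponds to the no-agreement graph on the next slide.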
123
Utility Graphs – no agreement
• No agreement
[Figure: utility versus number of negotiation rounds – the offer curves never cross; Agent j finds the offer unacceptable]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
where:
  Database is a (possibly inconsistent) set of logical formulae
  Sentence is a logical formula known as the conclusion
  Grounds is a set of logical formulae such that:
  1. Grounds ⊆ Database
  2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
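The two conditions on an argument can be sketched with a toy Horn-clause prover standing in for full logical proof (the fact names, rule encoding, and prover are all illustrative assumptions; they echo the milk/cheese example on the next slide):

```python
# Toy forward-chaining prover: rules are (frozenset_of_premises, conclusion).
def proves(grounds, rules, sentence):
    known = set(grounds)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return sentence in known

database = {"milk_is_good", "cheese_from_milk"}
rules = [(frozenset({"milk_is_good", "cheese_from_milk"}), "cheese_is_good")]

grounds = {"milk_is_good", "cheese_from_milk"}
# (Sentence, Grounds) is an argument iff the grounds come from the
# database AND the sentence is provable from the grounds.
ok = grounds <= database and proves(grounds, rules, "cheese_is_good")
print(ok)   # True
```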
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A undercuts B, or A rebuts B
– A defeats B = A undercuts B, or (A rebuts B and B does not undercut A)
– A strongly attacks B = A attacks B and B does not undercut A
– A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments – mutually defensible
1. An argument x is attacked by a set S if some member y of S attacks x (y → x)
2. An argument x is acceptable with respect to S if every attacker of x is attacked by S
3. An argument set is conflict-free if no two of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: four arguments a, b, c, d with attack arrows among them]
Which sets of arguments can be true? c is always attacked;
d is always acceptable
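The admissibility definitions above can be sketched as a brute-force check. The attack graph below is hypothetical, chosen in the spirit of the slide's figure (a and b attack each other, both attack c, d is unattacked):

```python
# Hypothetical abstract argument system over four arguments.
args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}

def conflict_free(S):
    """No member of S attacks another member of S."""
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    """Every attacker of x is attacked by some member of S."""
    return all(any((z, y) in attacks for z in S)
               for y in args if (y, x) in attacks)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

print(admissible({"a", "d"}))   # True: a defends itself, d is unattacked
print(admissible({"c"}))        # False: c cannot answer a or b
```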
133
An Example Abstract Argument System
May not require gt to always be defined We may not require that gt is asymmetic and transitiveUse plurality protocol all votes are cast simultaneously and
highest vote count wins Introducing an irrelevant alternative may split the
majority causing the old majority and the new irrelevant to drop out of favor (The Ross Perot effect)
A binary protocol involves voting pairwise ndash single eliminationThe order of the pairing can totally change the results
(Figure below is fascinating) Reason for rankings in basketball tournament
6
One voter ranks c gt d gt b gt aOne voter ranks a gt c gt d gt bOne voter ranks b gt a gt c gt dNotice just rotates preferences
winner (c (winner (a winner(bd)))=awinner (d (winner (b winner(ca)))=d
winner (d (winner (c winner(ab)))=c
winner (b (winner (d winner(ca)))=b
surprisingly order of pairing yields different winner
7
Borda protocol (used if binary protocol is too slow) = assigns an alternative |O| points for the highest preference |O|-1 points for the second and so on
The counts are summed across the voters and the alternative with the highest count becomes the social choice
Winner turns loser and loser turns winner if the lowest ranked alternative is removed (does this surprise you) See Table on next slide
7
8
Borda Paradox ndash remove loser winner changes(notice c is always ahead of removed item)bull a gt b gt c gtd bull b gt c gt d gtabull c gt d gt a gt bbull a gt b gt c gt dbull b gt c gt dgt abull c gtd gt a gtbbull a ltb ltc lt da=18 b=19 c=20
d=13
a gt b gt c b gt c gta c gt a gt b a gt b gt c b gt c gt a c gt a gtb a ltb ltc
a=15b=14 c=13
When loser is removed next loser becomes winner
9
Strategic (insincere) votersbull Suppose your choice will likely come in second
place If you rank the first choice of rest of group very low you may lower that choice enough so yours is first
bull True story Deanrsquos selection Each committee member told they had 5 points to award and could spread out any way among the candidates The recipient of the most points wins I put all my points on one candidate Most split their points I swung the vote What was my gamble
bull Want to get the results as if truthful voting were done
10
Typical Competition Mechanisms
bull Auction allocate goods or tasks to agents through market Need a richer technique for reaching agreements
bull Negotiation reach agreements through interaction
bull Argumentation resolve confliction through debates
11
Negotiation
bull May involve
ndash Exchange of information
ndash Relaxation of initial goals
ndash Mutual concession
12
Mechanisms Protocols Strategies
bull Negotiation is governed by a mechanism or a
protocol
ndash defines the rdquorules of encounterrdquo between the agents
ndash the public rules by which the agents will come to
agreements
bull Given a particular protocol how can a particular
strategy be designed that individual agents can use
13
Negotiation is the process of reaching agreements on matters of common interest It usually proceeds in a series of rounds with every agent making a proposal at every round
Negotiation Mechanism
Issues in negotiation processbull Negotiation Space All possible deals that agents can make ie t
he set of candidate deals bull Negotiation Protocol ndash A rule that determines the process of a ne
gotiation how and when a proposal can be made when a deal has been struck when the negotiation should be terminated and so
bull Negotiation Strategy When and what proposals should be made
14
Protocol
bull Means kinds of deals that can be made
bull Means sequence of offers and counter-offers
bull Protocol is like rules of chess game whereas strategy is way in which player decides which move to make
15
Game Theory
bull Computers make concrete the notion of strategy which is central to game playing
16
Mechanisms Design
bull Mechanism design is the design of protocols for governing multi-
agent interactions
bull Desirable properties of mechanisms are
ndash Convergenceguaranteed success
ndash Maximising global welfare sum of agent benefits are maximized
ndash Pareto efficiency
ndash Individual rationality
ndash Stability no agent should have incentive to deviate from strategy
ndash Simplicity low computational demands little communication
ndash Distribution no central decision maker
ndash Symmetry not want agents to play different roles (all agents have same
choice of actions)
17
Attributes not universally accepted
bull Canrsquot always achieve every attribute so look at tradeoffs of choices (for example) efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
bull Who beginsbull Take turnsbull Build off previous offersbull Give feed back (or not)bull Tell what utility is (or not)bull Obligations bull Privacybull Allowed proposals you can make as a result of
negotiation history
19
Thought Question
bull Why not just compute a joint solution ndash using linear programming
20
Negotiation Process 1
bull Negotiation usually proceeds in a series of rounds
with every agent making a proposal at every round
bull Communication during negotiation
Proposal
Counter Proposal
Agenti concedes
Agenti Agentj
21
Negotiation Process 2
bull Another way of looking at the negotiation
process is (can talk about 5050 or 9010
depending on who rdquomovesrdquo the farthest)
Proposals by AjProposals by AiPoint of
Acceptanceaggreement
22
Many types of interactive concession based methods
bull Some use multiple objective linear programming ndash ndash requires that the players construct a crude linear
approximation of t heir utility functions
bull Jointly Improving Direction method Start out with a neutral suggestive value continue until no joint improvements are possible ndash Used in Camp Daivd peace negotiations (EgyptIsrael
ndash Jimmy Carter Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate overbull Mediator helps players criticize a tentative
agreement (could be status quo)bull Generates a compromise direction (where each
of the k issues is a direction in k-space)bull Mediator helps players to find a jointly preferred
outcome along the compromise direction and then proposes a new tentative agreement
24
Typical Negotiation ProblemsTask-Oriented Domains(TOD) an agents activity can be defined in terms of a set of tasks that it has to achieve The target of a negotiation is to minimize the cost of completing the tasks
State Oriented Domains(SOD) each agent is concerned with moving the world from an initial state into one of a set of goal states The target of a negotiation is to achieve a common goal Main attribute actions have side effects (positivenegative)
Worth Oriented Domains(WOD) agents assign a worth to each potential state which captures its desirability for the agent The target of a negotiation is to maximize mutual worth (rather than worth to individual)
25
Complex Negotiations
bull Some attributes that make the negotiation process
complex are
ndash Multiple attributes
bull Single attribute (price) ndash symmetric scenario (both benefit in the
same way by a cheaper price)
bull Multiple attributes ndash several inter-related attributes eg buying a
car
ndash The number of agents and the way they interact
bull One-to-one eg single buyer and single seller
bull Many-to-one eg multiple buyers and a single seller auctions
bull Many-to-many eg multiple buyers and multiple sellers
26
Single issue negotiation
bull Like moneybull Symmetric (If roles were reversed I would
benefit the same way you would) ndash If one task requires less travel both would benefit
equally by having less travelndash utility for a task is experienced the same way by
whomever is assigned to that taskbull Non-symmetric ndash we would benefit differently if
roles were reversedndash if you delivered the picnic table you could just throw it
in the back of your van If I delivered it I would have to rent a U-haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down and quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One-to-one
• One-to-many (an auction is an example of one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
– n(n−1)/2 possible pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on the change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
• utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
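The triple above can be sketched in code. This is a minimal illustration, not from the slides: the cost table is the two-city parcel example used later in the deck, and the function names are mine.

```python
from itertools import chain, combinations

# Hypothetical cost function for the two-city parcel example used later:
# c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3
def c(tasks):
    costs = {frozenset(): 0,
             frozenset({"a"}): 1,
             frozenset({"b"}): 1,
             frozenset({"a", "b"}): 3}
    return costs[frozenset(tasks)]

def powerset(tasks):
    s = list(tasks)
    return [frozenset(x) for x in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def is_monotonic(c, all_tasks):
    """Check assumption 3: c(T1) <= c(T2) whenever T1 is a subset of T2."""
    subsets = powerset(all_tasks)
    return all(c(t1) <= c(t2)
               for t1 in subsets for t2 in subsets if t1 <= t2)

print(c({"a", "b"}))                # 3
print(is_monotonic(c, {"a", "b"}))  # True
```

Assumption 1 holds because the table maps the empty frozenset to 0, and assumption 2 is built in because the cost depends only on the task set, not the agent.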
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: Tk is the original assignment, Dk is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries
Several agents have access to a common database and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab) – the conflict deal
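The nine deals can be enumerated mechanically: since D1 ∪ D2 must cover both tasks, each task goes to agent 1, agent 2, or both, giving 3^n deals for n tasks. A sketch (the helper name is mine):

```python
from itertools import product

def pure_deals(t1, t2):
    """All (D1, D2) with D1 | D2 == t1 | t2: each task is assigned to
    agent 1, agent 2, or both -- 3^n deals for n tasks."""
    tasks = sorted(t1 | t2)
    deals = []
    for assign in product(("1", "2", "both"), repeat=len(tasks)):
        d1 = {t for t, a in zip(tasks, assign) if a in ("1", "both")}
        d2 = {t for t, a in zip(tasks, assign) if a in ("2", "both")}
        deals.append((frozenset(d1), frozenset(d2)))
    return deals

deals = pure_deals({"a"}, {"a", "b"})
print(len(deals))  # 9
```

The conflict deal (ab, ab) appears as the assignment where both agents keep every task.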
35
Figure the deals knowing the union must be ab:
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
– a for agent 1 → b | ab for agent 2
– b for agent 1 → a | ab
– ab for agent 1 → ∅ | a | b | ab
– ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
[Figure: a distribution point with city a and city b each at distance 1; a and b are distance 2 apart]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
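The table above follows directly from Utilityk(δ) = c(Tk) − c(Dk). A small check, using the slide's cost function (helper names are mine):

```python
def cost(tasks):
    # parcel-delivery cost function from the slide (no return home)
    table = {(): 0, ("a",): 1, ("b",): 1, ("a", "b"): 3}
    return table[tuple(sorted(tasks))]

def utility(standalone, deal_share):
    """Utility_k(delta) = c(T_k) - c(D_k)."""
    return cost(standalone) - cost(deal_share)

# agent 1 originally has {a}; agent 2 originally has {a, b}
print(utility({"a"}, set()))             # deal (∅, ab), agent 1: 1
print(utility({"a"}, {"a", "b"}))        # deal (ab, ·), agent 1: -2
print(utility({"a", "b"}, set()))        # deal (ab, ∅), agent 2: 3
print(utility({"a", "b"}, {"a", "b"}))   # deal (∅, ab), agent 2: 0
```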
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1, 2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (it is no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS)
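These two definitions translate directly into a small filter. The utilities below are those of the nine parcel deals from the earlier slide, with "-" standing in for the empty set; the function names are illustrative.

```python
def dominates(u, v):
    """Deal with utilities u=(u1,u2) dominates v if it is at least as good
    for both agents and strictly better for at least one."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

# utilities (agent 1, agent 2) of the nine deals in the parcel example
deals = {("a", "b"): (0, 2), ("b", "a"): (0, 2), ("ab", "-"): (-2, 3),
         ("-", "ab"): (1, 0), ("a", "ab"): (0, 0), ("b", "ab"): (0, 0),
         ("ab", "a"): (-2, 2), ("ab", "b"): (-2, 2), ("ab", "ab"): (-2, 0)}

conflict = (0, 0)  # the utility of the conflict deal is always 0

def negotiation_set(deals):
    ns = []
    for d, u in deals.items():
        rational = all(a >= b for a, b in zip(u, conflict))          # IR
        optimal = not any(dominates(v, u) for v in deals.values())   # Pareto
        if rational and optimal:
            ns.append(d)
    return ns

print(sorted(negotiation_set(deals)))  # the three deals of the coming slides
```

Running this reproduces the negotiation set {(a, b), (b, a), (∅, ab)} derived on the following slides.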
40
Utility Function for Agents (example from the previous slide)
1. (a, b):   Utility1 = 0,  Utility2 = 2
2. (b, a):   Utility1 = 0,  Utility2 = 2
3. (ab, ∅):  Utility1 = −2, Utility2 = 3
4. (∅, ab):  Utility1 = 1,  Utility2 = 0
5. (a, ab):  Utility1 = 0,  Utility2 = 0
6. (b, ab):  Utility1 = 0,  Utility2 = 0
7. (ab, a):  Utility1 = −2, Utility2 = 2
8. (ab, b):  Utility1 = −2, Utility2 = 2
9. (ab, ab): Utility1 = −2, Utility2 = 0
41
Individually Rational for Both (eliminate any choices that are negative for either agent)
All nine deals: (a, b), (b, a), (ab, ∅), (∅, ab), (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab)
Individually rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
All nine deals: (a, b), (b, a), (ab, ∅), (∅, ab), (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
Deals such as (a, ab) are beaten by (a, b). The deal (ab, ∅) has utilities (−2, 3), but nothing beats 3 for agent 2, so it is Pareto optimal.
43
Negotiation Set
Individually rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational (for both); the origin is the conflict deal
• Which are Pareto optimal?
Utility for i
Utility for j
45
Negotiation Set in Task-oriented Domains
[Figure: deals plotted by utility for agent i vs. utility for agent j. The circle delimits the space of all possible deals; the conflict deal sits at each agent's conflict utility; the negotiation set (Pareto optimal + individually rational) is the boundary arc that dominates the conflict deal, between the labeled points A–E]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
– Concession protocol
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t) such that:
– both deals are from the negotiation set
– ∀i and t > 0: Utilityi(δ(i, t)) ≤ Utilityi(δ(i, t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i, t)) = Utilityi(δ(i, t−1)) for both agents
– Agreement: ∃j ≠ i, Utilityj(δ(i, t)) ≥ Utilityj(δ(j, t))
• Only A ⇒ agree on δ(B, t): A accepts B's proposal
• Only B ⇒ agree on δ(A, t): B accepts A's proposal
• Both A and B ⇒ agree on the δ(k, t) such that π(δ(k)) = max{π(δ(A)), π(δ(B))}
• Both A and B with π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Applies to pure deals and mixed deals.
47
The Monotonic Concession Protocol – one direction: move towards the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: explanation or critique of the deal
48
Condition to Consent an Agreement
Both agents find that the deal proposed by the other is at least as good as or better than the proposal they made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
[Figure: agent 1's best deal at one end of the negotiation set, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: each agent, standing at its best deal, asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict depends on the difference between what it loses by making a concession and what it loses by taking the conflict deal, relative to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
riski = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)
where δi and δj are the current offers of agents i and j, respectively.
risk is willingness to risk conflict (1 is perfectly willing to risk).
55
Risk Evaluation
• risk measures the fraction you still have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
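The risk formula can be sketched as follows. The zero-utility convention (risk = 1 when an agent's own offer is worth nothing, since it has nothing left to lose) is my reading of the slides, and the numbers are from the two-city parcel example.

```python
def risk(u_own_offer, u_other_offer):
    """risk_i = (U_i(own offer) - U_i(other's offer)) / U_i(own offer).
    Convention: risk is 1 when the agent's own offer already has utility 0."""
    if u_own_offer <= 0:
        return 1.0
    return (u_own_offer - u_other_offer) / u_own_offer

# Parcel example: agent 1 proposes (∅, ab) -> utilities (1, 0);
# agent 2 proposes (a, b) -> utilities (0, 2)
risk1 = risk(1, 0)   # agent 1: (1 - 0) / 1 = 1.0
risk2 = risk(2, 0)   # agent 2: (2 - 0) / 2 = 1.0
print(risk1, risk2)  # both are fully willing to risk conflict
```

Under the Zeuthen strategy the agent with the lower risk concedes; here both risks are 1, which is exactly the deadlock discussed a few slides later.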
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: the axis runs from Ai's best deal to Aj's best deal, with the conflict deal below; each agent weighs "how much am I willing to risk a conflict?" against the maximum it could gain from agreement and the maximum it still hopes to gain]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose), otherwise it will just be your turn to concede again in the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
bull Advantages
ndash Simple and reflects the way human negotiations work
– Stability – it is in Nash equilibrium: if one agent is using the strategy,
then the other can do no better than use it him/herself
bull Disadvantages
ndash Computationally expensive ndash players need to compute the entire
negotiation set
ndash Communication burden ndash negotiation process may involve
several steps
59
Parcel Delivery Domain: recall that agent 1 delivers to a, and agent 2 delivers to a and b
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
Each agent, standing at its best deal, thinks the other should concede.
Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: a distribution point with cities a and d each at distance 7, and cities b and c between them at distances 1]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than acting alone. (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No.  Pure deal      Agent 1's utility  Agent 2's utility
1    (abcd, ∅)      0                  10
2    (abc, d)       1                  3
3    (ab, cd)       2                  2
4    (a, bcd)       3                  1
5    (∅, abcd)      10                 0
     Conflict deal  0                  0

Agent 1 concedes from deal 5 toward deal 1; agent 2 concedes from deal 1 toward deal 5; they meet at deal 3.
63
What bothers you about the previous agreement?
• The agents settle on (2, 2) utility rather than a choice with higher total utility such as (0, 10)
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers can get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of the plans. There is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can always just do your original tasks
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g. both need a car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive – we are unwilling to expend the effort (i.e. we could save a bit if we carpooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent would get a certain role in the plan
• This is called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a mixed deal, then
Costi(δ) = p·c(J)i + (1−p)·c(J)k
where k is i's opponent – the role i plays with probability 1−p
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: a distribution point with city a and city b each at distance 1; a and b are distance 2 apart]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability:
• ((ab, ∅); p) means agent 1 does ∅ with probability p and ab with probability 1−p
• What should p be to be fair to both (equal utility)?
• p(1) + (1−p)(−2) = expected utility for agent 1
• p(0) + (1−p)(3) = expected utility for agent 2
• p(1) + (1−p)(−2) = p(0) + (1−p)(3)
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
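The arithmetic above can be checked with exact fractions (a sketch; the variable names are mine):

```python
from fractions import Fraction

# Deal ((ab, ∅); p): with probability p agent 1 delivers nothing,
# with probability 1-p it delivers both a and b.
u1_p, u1_q = 1, -2   # agent 1's utility when idle / when delivering ab
u2_p, u2_q = 0, 3    # agent 2's utility in the same two outcomes

# fairness: p*u1_p + (1-p)*u1_q == p*u2_p + (1-p)*u2_q
p = Fraction(u2_q - u1_q, u1_p - u1_q - u2_p + u2_q)
print(p)                          # 5/6
print(p * u1_p + (1 - p) * u1_q)  # each agent's expected utility: 1/2
```

At p = 5/6 both agents expect utility 1/2, which confirms the slide's answer.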
74
Try again with another choice in the negotiation set
• ((a, b); p) means agent 1 does a with probability p and b with probability 1−p
• What should p be to be fair to both (equal utility)?
• p(0) + (1−p)(0) = utility for agent 1
• p(2) + (1−p)(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair? (Agent 2 gains 2 no matter how the tasks are split.)
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(T1 ∪ T2, ∅); p] whose product of utilities is maximal over the negotiation set
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost of achieving one goal is two; achieving both averages four per agent
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair"
• Give A1 the easier role p of the time:
• Utility for agent 1 = p(1) + (1−p)(−3)  (it loses utility if it spends 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins: better than both losing. The weightings on the coin needn't be 50–50
• It may make sense to have the agent with the highest worth get its way, as its utility is greater (it would accomplish its goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal by itself costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, numbers out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as it helped with the swap)
• So with probability ½ each: ½(12) + ½(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: the set of possible environment states
• Ag: the set of possible agents
• J: the set of possible joint plans
• c: the cost of executing each plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation covers a joint plan, schedules, and goal relaxation. It may reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– price: 1, quality: 2, support: 3
• Using constraints on an attribute
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
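The weighting scheme above is just an additive scoring function over normalized attribute values. A sketch with made-up offer values (the weights are the slide's 60/15/25 split):

```python
def weighted_utility(offer, weights):
    """Additive scoring: attribute values normalized to [0, 1],
    weights summing to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * offer[k] for k in weights)

# weights from the slide: price 60%, quality 15%, support 25%
weights = {"price": 0.60, "quality": 0.15, "support": 0.25}

# hypothetical normalized scores for one software offer
offer = {"price": 0.5, "quality": 0.8, "support": 1.0}
print(round(weighted_utility(offer, weights), 2))  # 0.67
```

Higher weight on price means a cheap offer dominates the score even when quality is mediocre, which is why the constraint-based formulation below it is sometimes preferred.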
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union of two sets of tasks is at most the sum of the costs of the separate sets:
for finite X, Y ⊆ T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive: delivering to one city saves distance toward the other (in a tree arrangement)
• Example of subadditive with equality (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment
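The condition is easy to check exhaustively for a small task set. The cost tables below are hypothetical: a tree-shaped delivery network (subadditive) and the earlier no-return-home costs, which actually violate subadditivity.

```python
from itertools import chain, combinations

def subsets(tasks):
    s = sorted(tasks)
    return [frozenset(x) for x in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def is_subadditive(cost, all_tasks):
    """c(X ∪ Y) <= c(X) + c(Y) for all X, Y drawn from the task set."""
    ss = subsets(all_tasks)
    return all(cost[x | y] <= cost[x] + cost[y] for x in ss for y in ss)

f = frozenset
# hypothetical tree-shaped delivery costs: the two cities share a trunk road
tree = {f(""): 0, f("a"): 2, f("b"): 2, f("ab"): 3}
print(is_subadditive(tree, "ab"))       # True

# the no-return-home costs c(a)=c(b)=1, c(ab)=3 are NOT subadditive
no_return = {f(""): 0, f("a"): 1, f("b"): 1, f("ab"): 3}
print(is_subadditive(no_return, "ab"))  # False
```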
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (you can think of something for the other agent to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of the arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• Take two task sets X and Y where X ⊆ Y, and introduce another set of tasks Z. Then:
c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); a rule beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X) for X ⊆ Y
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• We expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterizations: relationships implied between cells, and relationships implied within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (postmen return home)
FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides its letter to c, its utility doesn't increase
• If it tells the truth: p = 1/2, and its expected utility under ((abc, ∅); 1/2) is 5
• If it lies: p = 1/2 (as the apparent utility is the same), but its real expected utility under ((abc, ∅); 1/2) is ½(0) + ½(2) = 1 (as it still has to deliver the hidden letter)
[Figure: delivery network with edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom, you are found out)
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
We assume there is some way of deciding who gets the better deal that is fair over time.

Agent 1  U(1)  Agent 2  U(2) seems  U(2) actual
a        2     bc       4           4
b        4     ac       2           2
bc       2     a        4           2
ab       0     c        6           6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example from the next slide: A1 creates a phantom letter at node c; its utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is Utility1((ab, ∅); 1/2) = ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage split; the utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as the decoy changes the probability: if you deliver, I make you deliver to h too)
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); its utility rises from 1.5 to 31/18 ≈ 1.72 (if A1 delivers, it doesn't actually deliver to h)
• If truthful, p (the probability of agent 1 delivering everything) = 9/14, from
p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If it invents task h, p = 11/18, from
p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
107
Postmen example – agents return to the post office
[Figure: two delivery graphs – one concave (the phantom-letter example) and one subadditive (h is the decoy node)]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have fewer tasks, so an increased load appears to cost more than it really does)
• Example 3 (from the next slide): A1 hides its letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals, "Hide" lies can be beneficial.
• Ex4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: the utility of (fae, bcd) with p = 1/2 is 4 (each saves going to two places).
• Under the lie, divide as (ef, dcab):p - you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with (ab, cdef):p.
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6, so p = 3/4.
• The utility is actually 3/4(6) + 1/4(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) - zero utility for that.
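The mixed-deal probability and the liar's real utility can be checked with exact fractions. A small sketch using the payoff numbers from the slide:

```python
# Verify the hide-lie computation for the mixed deal (ab, cdef):p.
from fractions import Fraction as F

# fair split of the apparent payoffs after A1 hides the letter to a:
# p*4 + (1-p)*0 = p*2 + (1-p)*6  =>  8p = 6
p = F(6, 8)                         # 3/4

# A1's real payoffs: 6 when assigned ab (the hidden letter is on the way),
# 0 when assigned cdef (he must still visit node a afterwards)
real_utility = p * 6 + (1 - p) * 0  # 9/2 = 4.5, beating the truthful 4
```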
110
[Figure: Modular TOD example network]
111
Conclusion
• In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  - Company
  - Union
  - Mediator
• Tasks:
  - Generation of proposal
  - Generation of counter-proposal based on feedback from the dissenting party
  - Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law - cite previous decisions)
• Process:
  - Retrieve appropriate precedent cases from memory
  - Select the most appropriate case
  - Construct an appropriate solution
  - Evaluate the solution for applicability to the current case
  - Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  - Minimizes need for information exchange
  - Avoids problems by reasoning from past failures (intentional reminding)
  - Repair for past failure is used; reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  - Change the importance attached to an issue
  - Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory - 30 more proposals
• Without argumentation - fewer proposals and better solutions
• No failure avoidance - more proposals with objections
• No preference analysis - oscillatory condition
• No feedback - communication overhead increased by 23
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
[Figure: worth function for the first agent - worth rises from 0 at 0900 through 1200 to 100 at 1600]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility of Agent i and Agent j plotted against the number of negotiation rounds; the curves meet at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Figure: utility curves of Agent i and Agent j over the number of negotiation rounds; the curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  - Positions cannot be justified - why did the agent pay so much for the car?
  - Positions cannot be changed - initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
  1. Logical - "If you accept A, and accept that A implies B, then you must accept B"
  2. Emotional - "How would you feel if it happened to you?"
  3. Visceral - the participant stamps their feet and shows the strength of their feelings
  4. Kisceral - appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  - Database is a (possibly inconsistent) set of logical formulae
  - Sentence is a logical formula known as the conclusion
  - Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  - A attacks B = A u B or A r B
  - A defeats B = A u B or (A r B and not B u A)
  - A strongly attacks B = A a B and not B u A
  - A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is in unless proven otherwise
131
Admissible Arguments - mutually defensible
1. An argument x is attacked by a set of arguments if some member y of the set attacks x (y → x).
2. An argument x is acceptable with respect to a set if every attacker of x is attacked by some member of the set.
3. A set of arguments is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable with respect to it (any attackers are attacked).
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
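These definitions run mechanically. A minimal sketch, assuming a hypothetical attack graph (a and b attack each other, and both b and d attack c) chosen to be consistent with the observations above - c is always attacked, d is always acceptable:

```python
# Admissibility checker for abstract argumentation (hypothetical graph).
from itertools import combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("b", "c"), ("d", "c")}

def attackers(x):
    return {y for (y, z) in attacks if z == x}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # every attacker of x is itself attacked by some member of s
    return all(any((z, y) in attacks for z in s) for y in attackers(x))

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

admissible_sets = [set(c) for r in range(len(args) + 1)
                   for c in combinations(sorted(args), r)
                   if admissible(set(c))]
# c appears in no admissible set; d (unattacked) appears freely
```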
133
An Example Abstract Argument System
Arrow's impossibility theorem: No social choice rule satisfies all of the six conditions. We must relax the desired attributes:
• We may not require > to always be defined.
• We may not require that > is asymmetric and transitive.
• Use the plurality protocol: all votes are cast simultaneously and the highest vote count wins.
  - Introducing an irrelevant alternative may split the majority, causing both the old majority and the new irrelevant alternative to drop out of favor (the Ross Perot effect).
• A binary protocol involves voting pairwise - single elimination.
  - The order of the pairing can totally change the results (the figure below is fascinating). This is the reason for rankings in a basketball tournament.
6
One voter ranks c > d > b > a, one voter ranks a > c > d > b, and one voter ranks b > a > c > d. Notice the rankings just rotate preferences.
winner(c, (winner(a, winner(b, d)))) = a
winner(d, (winner(b, winner(c, a)))) = d
winner(d, (winner(c, winner(a, b)))) = c
winner(b, (winner(d, winner(c, a)))) = b
Surprisingly, the order of pairing yields different winners.
7
Borda protocol (used if the binary protocol is too slow): assigns an alternative |O| points for the highest preference, |O|-1 points for the second, and so on.
The counts are summed across the voters, and the alternative with the highest count becomes the social choice.
Winner turns loser and loser turns winner if the lowest-ranked alternative is removed (does this surprise you?). See the table on the next slide.
7
8
Borda Paradox - remove the loser and the winner changes (notice c is always ahead of the removed item)
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
Totals: a = 18, b = 19, c = 20, d = 13 (c wins)
With the loser d removed:
• a > b > c
• b > c > a
• c > a > b
• a > b > c
• b > c > a
• c > a > b
• a > b > c
Totals: a = 15, b = 14, c = 13 (a wins)
When the loser is removed, the remaining order reverses: the former winner c comes last.
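The Borda scores can be recomputed directly. A sketch; the `borda` helper (our own) awards |O| points down to 1 for each ranking:

```python
# Borda counts before and after removing the loser d (the Borda paradox).
def borda(profiles):
    scores = {}
    for ranking in profiles:
        n = len(ranking)
        for pts, cand in enumerate(ranking):
            scores[cand] = scores.get(cand, 0) + (n - pts)
    return scores

# the seven voter rankings from the slide
profiles = [list("abcd"), list("bcda"), list("cdab")] * 2 + [list("abcd")]

full = borda(profiles)
reduced = borda([[c for c in p if c != "d"] for p in profiles])
# the winner flips from c to a once the loser d is removed
```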
9
Strategic (insincere) voters
• Suppose your choice will likely come in second place. If you rank the first choice of the rest of the group very low, you may lower that choice enough so that yours comes first.
• True story: Dean's selection. Each committee member was told they had 5 points to award and could spread them out any way among the candidates; the recipient of the most points wins. I put all my points on one candidate; most split their points. I swung the vote. What was my gamble?
• We want to get the results as if truthful voting were done.
10
Typical Competition Mechanisms
• Auction: allocate goods or tasks to agents through a market. We need a richer technique for reaching agreements.
• Negotiation: reach agreements through interaction.
• Argumentation: resolve conflicts through debate.
11
Negotiation
• May involve:
  - Exchange of information
  - Relaxation of initial goals
  - Mutual concession
12
Mechanisms, Protocols, Strategies
• Negotiation is governed by a mechanism or a protocol:
  - it defines the "rules of encounter" between the agents
  - the public rules by which the agents will come to agreements
• Given a particular protocol, how can a particular strategy be designed that individual agents can use?
13
Negotiation Mechanism
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Issues in the negotiation process:
• Negotiation Space: all possible deals that agents can make, i.e. the set of candidate deals.
• Negotiation Protocol: a rule that determines the process of a negotiation - how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on.
• Negotiation Strategy: when and what proposals should be made.
14
Protocol
• Defines the kinds of deals that can be made
• Defines the sequence of offers and counter-offers
• A protocol is like the rules of a chess game, whereas a strategy is the way in which a player decides which move to make
15
Game Theory
bull Computers make concrete the notion of strategy which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions.
• Desirable properties of mechanisms are:
  - Convergence/guaranteed success
  - Maximising global welfare: the sum of agent benefits is maximized
  - Pareto efficiency
  - Individual rationality
  - Stability: no agent should have an incentive to deviate from its strategy
  - Simplicity: low computational demands, little communication
  - Distribution: no central decision maker
  - Symmetry: we do not want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs of choices; for example, efficiency and stability are sometimes in conflict with each other.
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution - using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round.
• Communication during negotiation:
[Figure: Agent i sends a proposal to Agent j, receives a counter-proposal, and eventually Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (one can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Figure: proposals by Ai and proposals by Aj converge to a point of acceptance/agreement]
22
Many types of interactive concession-based methods
• Some use multiple-objective linear programming:
  - requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value and continue until no joint improvements are possible
  - Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• The mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• The mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to the individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  - Multiple attributes:
    • Single attribute (price) - symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes - several inter-related attributes, e.g. buying a car
  - The number of agents and the way they interact:
    • One-to-one, e.g. a single buyer and a single seller
    • Many-to-one, e.g. multiple buyers and a single seller (auctions)
    • Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would):
  - If one task requires less travel, both would benefit equally by having less travel
  - The utility of a task is experienced the same way by whomever is assigned to that task
• Non-symmetric - we would benefit differently if roles were reversed:
  - If you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down and quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job... ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One-to-one
• One-to-many (an auction is an example of one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
  - n(n-1)/2 pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents - such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  - Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation).
  - Since an agent can achieve its goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal.
• If utility < 0, the agent is worse off than performing its tasks on its own.
• Conflict deal (stay with the status quo) if agents fail to reach an agreement:
  - no agent agrees to execute tasks other than its own
  - utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple <T, Ag, c> where:
  - T is a finite set of all possible tasks
  - Ag = {A1, A2, ..., An} is a list of participant agents
  - c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
  1. c(∅) = 0
  2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation).
  3. The cost function is monotonic: more tasks mean more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2).
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents (D1, D2) such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal).
33
Examples of TOD
• Parcel Delivery:
  Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries:
  Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) - the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure the deals knowing the union must be ab:
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack"
• a for agent 1 → b | ab for agent 2
• b for agent 1 → a | ab
• ab for agent 1 → ∅ | a | b | ab
• ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utilityk(δ) = c(Tk) - Costk(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal:
  - c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
  - Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming agents do not have to return home - like U-Haul)
[Figure: distribution point at distance 1 from city a and from city b; a and b are 2 apart]
Cost function: c(∅) = 0; c(a) = 1; c(b) = 1; c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
  - δ is at least as good for every agent as δ': for k = 1, 2, Utilityk(δ) ≥ Utilityk(δ')
  - δ is better for some agent than δ': for some k, Utilityk(δ) > Utilityk(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'.
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have).
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from previous slide)
Agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = -2
8. Utility1(ab, b) = -2
9. Utility1(ab, ab) = -2
Agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
Of the nine deals, the individually rational ones are:
(a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
Of the nine deals, the Pareto optimal ones are:
(a, b), (b, a), (ab, ∅), (∅, ab)
The rest are each beaten by some deal; (ab, ∅) gives (-2, 3) and survives because nothing beats 3 for agent 2.
43
Negotiation Set
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
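The whole pipeline - enumerate the deals, keep the individually rational ones, keep the Pareto optimal ones - fits in a short sketch for this two-city example (the helper names are our own):

```python
# Compute the negotiation set for the two-agent parcel example.
from itertools import combinations

cost = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}
T1, T2 = frozenset("a"), frozenset("ab")   # original allocations

def subsets(s):
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

# all pure deals (D1, D2) with D1 ∪ D2 = T1 ∪ T2
deals = [(d1, d2) for d1 in subsets("ab") for d2 in subsets("ab")
         if d1 | d2 == T1 | T2]

def utility(deal):
    d1, d2 = deal
    return (cost[T1] - cost[d1], cost[T2] - cost[d2])

def dominates(x, y):
    ux, uy = utility(x), utility(y)
    return (all(a >= b for a, b in zip(ux, uy))
            and any(a > b for a, b in zip(ux, uy)))

rational = [d for d in deals if all(u >= 0 for u in utility(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
negotiation_set = [d for d in rational if d in pareto]
```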
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i (one axis) against the utility for j (the other axis)
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?
45
Negotiation Set in Task-oriented Domains
[Figure: the circle delimits the space of all possible deals, plotted by utility for agent i against utility for agent j; dashed lines mark the utility of the conflict deal for each agent; the boundary arc of deals that are Pareto optimal and individually rational is the negotiation set]
46
Negotiation Protocol
π(δ) - the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol.
• Concession protocol:
  - At t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that both deals are from the negotiation set, and for each i and t > 0, Utilityi(δ(i, t)) ≤ Utilityi(δ(i, t-1)) - I propose something less desirable for me.
• Negotiation ending:
  - Conflict: Utilityi(δ(i, t)) = Utilityi(δ(i, t-1)) for both agents
  - Agreement: for some j ≠ i, Utilityj(δ(i, t)) ≥ Utilityj(δ(j, t))
    • Only A → agree on δ(B, t): A agrees with B's proposal
    • Only B → agree on δ(A, t): B agrees with A's proposal
    • Both A and B → agree on the δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
    • Both A and B with π(δ(A)) = π(δ(B)) → flip a coin (the product is the same, but the split may not be the same for each agent - flip a coin to decide which deal to use)
Pure deals
Mixed deal
47
The Monotonic Concession Protocol - one direction, move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (they may re-propose the same one).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal.
• Meta data: explanation or critique of the deal.
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal each made:
  Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  - Symmetrically distributed (no agent plays a special role)
  - Ensures convergence
  - It will not go on indefinitely
• Disadvantages:
  - Agents can run into conflicts
  - Inefficient - no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy - a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
[Figure: agent 1's best deal and agent 2's best deal at opposite ends of the negotiation set]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal; each asks "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
  - You have lost most of your expected utility (it is closer to zero)
  - In case conflict occurs, you are not much worse off
  - So you are more willing to risk conflict
An agent's willingness to risk conflict compares what it loses by conceding (accepting the other's offer) with what it loses by causing a conflict, relative to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
  riski = (Utilityi(δi) - Utilityi(δj)) / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j, respectively.
risk is willingness to risk conflict (1 is perfectly willing to risk).
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Diagram: the deal line from Ai's best deal to Aj's best deal, marking the conflict deal, "How much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum each still hopes to gain.)
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
 – Simple and reflects the way human negotiations work
 – Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
 – Computationally expensive – players need to compute the entire negotiation set
 – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)

Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0

Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
(Diagram: agent 1's best deal and agent 2's best deal, each agent insisting "He should concede.")
Zeuthen does not reach a settlement: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(Map: a–b–c–d form a chain with unit-length edges; the distribution point connects to a and to d, each at distance 7.)
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal. (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal     | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)     | 0                 | 10
2   | (abc, d)      | 1                 | 3
3   | (ab, cd)      | 2                 | 2
4   | (a, bcd)      | 3                 | 1
5   | (∅, abcd)     | 10                | 0
    | Conflict deal | 0                 | 0

(Diagram: agent 1 concedes from deal 5 toward deal 3; agent 2 concedes from deal 1 toward deal 3.)
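The whole protocol can be sketched in code. This is a simplified reading of the Zeuthen strategy (names are illustrative, not from the source): a concession here is the conceder's best deal that strictly raises the opponent's utility while staying individually rational, rather than the minimal concession that flips the risk balance.

```python
def zeuthen_negotiate(deals, u1, u2):
    """Monotonic Concession Protocol with Zeuthen-style concessions over a
    finite negotiation set. u1/u2 map deals to utilities; the conflict deal
    is assumed worth 0 to both. Returns the agreed deal, or None for conflict."""
    risk = lambda own, other: 1.0 if own == 0 else (own - other) / own
    o1 = max(deals, key=lambda d: u1[d])   # each opens with its own best deal
    o2 = max(deals, key=lambda d: u2[d])
    while True:
        if u1[o2] >= u1[o1]:
            return o2                      # agent 1 accepts agent 2's offer
        if u2[o1] >= u2[o2]:
            return o1                      # agent 2 accepts agent 1's offer
        r1 = risk(u1[o1], u1[o2])
        r2 = risk(u2[o2], u2[o1])
        moved = False
        if r1 <= r2:                       # agent 1 concedes (both on a tie)
            c = [d for d in deals if u2[d] > u2[o1] and u1[d] > 0]
            if c:
                o1, moved = max(c, key=lambda d: u1[d]), True
        if r2 <= r1:                       # agent 2 concedes (both on a tie)
            c = [d for d in deals if u1[d] > u1[o2] and u2[d] > 0]
            if c:
                o2, moved = max(c, key=lambda d: u2[d]), True
        if not moved:
            return None                    # neither can concede: conflict

# Example 2 from the table above: both concede twice and meet at (ab, cd).
u1 = {"(abcd,-)": 0, "(abc,d)": 1, "(ab,cd)": 2, "(a,bcd)": 3, "(-,abcd)": 10}
u2 = {"(abcd,-)": 10, "(abc,d)": 3, "(ab,cd)": 2, "(a,bcd)": 1, "(-,abcd)": 0}
print(zeuthen_negotiate(list(u1), u1, u2))  # -> (ab,cd), the (2, 2) deal
```

On the earlier no-middle-ground example the same sketch returns None, matching the slide's claim that Zeuthen does not reach a settlement there.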
63
What bothers you about the previous agreement?
• They decide to both get (2, 2) utility rather than the expected utility of a (0, 10) outcome for another choice
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, so they help and do not hinder each other
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of the plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In TOD you cannot do another agent's task as a side effect of doing yours, or get in their way.
• In TOD coordinated plans are never worse, as you can always just do your original task.
• With SOD you may get in each other's way.
• Don't accept partially completed plans.
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Inter-agent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as the set of states that it would be happy with
• We look for a state in the intersection of the goals
• Possibilities:
 – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
 – The goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
 – A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up another)
 – A reachable state satisfies both, but is too expensive – they are unwilling to expend the effort (i.e., we could save a bit by car-pooling, but it is too complicated for so little gain)
69
What if choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation? (Why should one agent agree to do more?)
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent gets a certain plan
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability 1−p
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
(Map: the distribution point connects to city a and city b at distance 1 each; the distance between a and b is 2.)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability p
• (∅, ab):p means agent 1 does ∅ with probability p and ab with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• Setting them equal: (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
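Since each agent's expected utility is linear in p, the fair p can be solved in general; `fair_probability` is an illustrative helper, not from the source.

```python
from fractions import Fraction

def fair_probability(u1_p0, u1_p1, u2_p0, u2_p1):
    """Solve (1-p)*u1_p0 + p*u1_p1 = (1-p)*u2_p0 + p*u2_p1 for p.
    The arguments are each agent's utility at p = 0 and at p = 1.
    Raises ZeroDivisionError when no equalizing p exists (e.g. 0 = 2)."""
    num = Fraction(u2_p0 - u1_p0)
    return num / (num + (u1_p1 - u2_p1))

# Deal 3 above: agent 1 gets -2 at p=0 and 1 at p=1; agent 2 gets 3 and 0.
print(fair_probability(-2, 1, 3, 0))  # -> 5/6, matching the slide
```

The same helper reproduces p = 7/8 in the later worth-based compromise and p = 3/4 in the FP8 example.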
74
Try again with the other choice in the negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• An all-or-nothing deal (one agent does everything): the mixed deal m = [(T_A ∪ T_B, ∅); p] such that NS(m) = max_d NS(d)
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of utilities (a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume pick up costs 1 and set down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent the cost would be four, so the utility to each is 2.
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other agent's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair".
• Assign the (2, 6) division (agent 1 takes the cost-2 role) with probability p
• Utility for agent 1 = p(1) + (1−p)(−3): it loses utility if it pays 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way on the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle) – worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
 – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
 – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
 – Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
 – Try to find the Pareto optimum
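A minimal sketch of the weighted-sum scoring above; attribute scores are assumed to be normalized to [0, 1], and the offer's numbers are illustrative.

```python
def weighted_utility(offer, weights):
    """Score a multi-attribute offer by a weighted sum. `offer` maps each
    attribute to a normalized score in [0, 1]; the weights sum to 1."""
    return sum(weights[a] * offer[a] for a in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}  # from the slide
offer = {"price": 0.8, "quality": 0.6, "support": 0.4}       # assumed scores
print(weighted_utility(offer, weights))  # 0.48 + 0.09 + 0.10 = 0.67
```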
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
 – Exchange the missing information
 – Penalty for lying
• Possible lies:
 – False information
  • Hiding letters
  • Phantom letters
 – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
 – Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with equality (= rather than <):
 – Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
 – Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
 – Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
 – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know these facts (diagonal arrows go between domains); the rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is in turn a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth-telling more than the others
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No – unless restricted to trees.)
• Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
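The three attributes (subadditive, concave, modular) can be checked by brute force on small task sets. The cost functions below are illustrative stand-ins, not from the slides: a fax-like cost that charges per destination, and a shared-trip cost with a discount.

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(sorted(tasks), r)]

def is_subadditive(tasks, c):
    """c(X ∪ Y) <= c(X) + c(Y) for all X, Y."""
    S = all_subsets(tasks)
    return all(c(x | y) <= c(x) + c(y) + 1e-9 for x in S for y in S)

def is_concave(tasks, c):
    """c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X) whenever X ⊆ Y."""
    S = all_subsets(tasks)
    return all(c(y | z) - c(y) <= c(x | z) - c(x) + 1e-9
               for x in S for y in S if x <= y for z in S)

def is_modular(tasks, c):
    """c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y) for all X, Y."""
    S = all_subsets(tasks)
    return all(abs(c(x | y) - (c(x) + c(y) - c(x & y))) < 1e-9
               for x in S for y in S)

# Fax-domain-like cost: one independent charge per destination -> modular.
per_line = {"a": 3, "b": 5}
fax = lambda X: sum(per_line[t] for t in X)

# A shared-trip discount: any single errand costs 3, the combined trip 4.
trip = lambda X: (0, 3, 4)[len(X)]

print(is_modular({"a", "b"}, fax))       # True: the costs are independent
print(is_modular({"a", "b"}, trip))      # False: 4 != 3 + 3 - 0
print(is_concave({"a", "b"}, trip))      # True
print(is_subadditive({"a", "b"}, trip))  # True: 4 <= 3 + 3
```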
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents send faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• Agents can exchange the messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as the costs are independent)
• Modularity implies concavity
101
3-dimensional table of the characterization: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth, p = 1/2: expected utility of (abc):1/2 is 5
• If he lies, p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1) of (abc):1/2 under the lie is 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
(Diagram: delivery graph with edge costs 1, 4, 4, 1.)
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out)
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits shown are the apparent ones – the apparent benefit is no different from the real benefit.
Under truth the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just as in this case. The lie makes no difference.
(We assume some way of deciding who gets the better deal that is fair over time.)

Agent 1's tasks | U(1) | Agent 2's tasks | Seeming U(2) | Actual U(2)
a               | 2    | bc              | 4            | 4
b               | 4    | ac              | 2            | 2
bc              | 2    | a               | 4            | 2
ab              | 0    | c               | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 of (ab):1/2 is 1/2(4) + 1/2(2) = 3
• Lie: (b, ca) is the logical division, as pure deals have no percentage split
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as the lie changes the probability: if you deliver, I make you deliver to h too)
• Example (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
107
Postmen – return to the post office
(Figures: a concave example; a subadditive example where h is the decoy; a phantom example.)
108
Non incentive compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load appears to cost more than it really does)
• Example (from the next slide): A1 hides his letter to node b
• (e, b) division: utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR under the lie
• (b, e) division: utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial
• Example: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((fae, bcd):1/2) = 4 (each saves going to two nodes)
• Under the lie, dividing as (efd, cab):p, you always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with (ab, cdef):p:
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = 6 − 4p ⇒ p = 3/4
• A1's utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I still have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places – which is what I was assigned originally. Zero utility for that.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
 – Representing and maintaining belief models
 – Reasoning about other agents' beliefs
 – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain
• Agents:
 – Company
 – Union
 – Mediator
• Tasks:
 – Generation of proposals
 – Generation of counter-proposals based on feedback from the dissenting party
 – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
 – Retrieve appropriate precedent cases from memory
 – Select the most appropriate case
 – Construct an appropriate solution
 – Evaluate the solution for applicability to the current case
 – Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
 – Minimizes the need for information exchange
 – Avoids problems by reasoning from past failures (intentional reminding)
 – Repairs for past failures are reused, which reduces computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the trade-offs an agent is willing to make
• Properties of the proposed compromise:
 – Maximizes joint payoff
 – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
 – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
 – Change the importance attached to an issue
 – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
 – Objectionable issues
 – Reason for rejection
 – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she also assigns progressively smaller worths to a meeting at 15:00, 14:00, …
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
(Graph: worth function for the first agent, rising from 0 to 100 across the day, with ticks at 9, 12, and 16.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Graph: utility versus number of negotiation rounds; agent i's and agent j's curves converge at the point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable
(Graph: utility versus number of negotiation rounds; agent i's and agent j's curves never meet.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
 – Positions cannot be justified. Why did the agent pay so much for the car?
 – Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
 1. Logical – "If you accept A, and accept that A implies B, then you must accept that B"
 2. Emotional – "How would you feel if it happened to you?"
 3. Visceral – the participant stamps their feet and shows the strength of their feelings
 4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: a pair (Sentence, Grounds), where:
 – Database is a (possibly inconsistent) set of logical formulae
 – Sentence is a logical formula, known as the conclusion
 – Grounds is a set of logical formulae such that Grounds ⊆ Database, and Sentence can be proved from Grounds
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
 – A attacks B = A →u B or A →r B
 – A defeats B = A →u B or (A →r B and not B →u A)
 – A strongly attacks B = A →a B and not B →u A
 – A strongly undercuts B = A →u B and not B →u A
129
Proposition: hierarchy of attacks
 Undercuts = →u
 Strongly undercuts = →su = →u − →u⁻¹
 Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
 Defeats = →d = →u ∪ (→r − →u⁻¹)
 Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
 – "argument x attacks argument y"
 – "x is a counterexample of y"
 – "x is an attacker of y"
 where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments ndash mutually defensible
1. A set S attacks argument x if some member y of S has y → x
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S
3. A set of arguments is conflict-free if none of them attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
(Figure: an attack graph over arguments a, b, c, d.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
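The definitions above can be checked mechanically. Since the slide's figure survives only as a diagram, the attack relation below is an assumed illustration, not the original graph: a and b attack each other, b attacks c, and d is unattacked.

```python
def attackers(x, attacks):
    return {y for (y, t) in attacks if t == x}

def conflict_free(S, attacks):
    """No member of S attacks another member of S."""
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S, attacks):
    """Every attacker of x is itself attacked by some member of S."""
    return all(any((z, y) in attacks for z in S) for y in attackers(x, attacks))

def admissible(S, attacks):
    return conflict_free(S, attacks) and all(acceptable(x, S, attacks) for x in S)

# Assumed attack relation: a <-> b, b -> c, d unattacked.
attacks = {("a", "b"), ("b", "a"), ("b", "c")}
print(admissible({"a", "d"}, attacks))  # True: d is unattacked, a defends itself
print(admissible({"c"}, attacks))       # False: c's attacker b is not attacked
print(admissible(set(), attacks))       # True: the empty set is always admissible
```

Note how d, being unattacked, is acceptable with respect to any set – matching the slide's remark that d is always acceptable.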
133
An Example Abstract Argument System
4
Desirable properties of the social choice rule
• A social preference ordering <* should exist for all possible inputs (individual preferences)
• <* should be defined for every pair (o, o') ∈ O × O
• <* should be asymmetric and transitive over O
• The outcomes should be Pareto efficient: if ∀i ∈ A, o <i o', then o <* o' (do not misorder when all agree)
• The scheme should be independent of irrelevant alternatives (if all agree on the relative ranking of two outcomes, the social choice should retain that ranking): if ∀i ∈ A, <i and <i' are rankings based on different sets of choices and satisfy o <i o' and o <i' o' (their relative rankings are unaffected by the other choices present), then the social rankings of o and o' should have the same relationship
• No agent should be a dictator in the sense that o <i o' implies o <* o' for all preferences of the other agents
5
Arrow's impossibility theorem: no social choice rule satisfies all six of these conditions, so we must relax the desired attributes:
• We may not require <* to always be defined
• We may not require that <* be asymmetric and transitive
• Use the plurality protocol: all votes are cast simultaneously and the highest vote count wins. Introducing an irrelevant alternative may split the majority, causing both the old majority and the new irrelevant alternative to drop out of favor (the Ross Perot effect)
• A binary protocol involves voting pairwise – single elimination. The order of the pairing can totally change the results (the example below is fascinating); this is the reason for seedings in a basketball tournament
6
One voter ranks c > d > b > a
One voter ranks a > c > d > b
One voter ranks b > a > c > d
Notice the ballots just rotate the preferences
winner(c, winner(a, winner(b, d))) = a
winner(d, winner(b, winner(c, a))) = d
winner(d, winner(c, winner(a, b))) = c
winner(b, winner(d, winner(c, a))) = b
Surprisingly, the order of pairing yields a different winner each time
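The four pairing orders above can be checked mechanically with a pairwise majority vote over the three ballots. An illustrative Python sketch (the `winner` helper and the ballot encoding are mine, not from the slides):

```python
# three voters' rankings, best first (from the slide)
ballots = [["c", "d", "b", "a"], ["a", "c", "d", "b"], ["b", "a", "c", "d"]]

def winner(x, y):
    # pairwise majority vote between two alternatives
    votes_x = sum(1 for b in ballots if b.index(x) < b.index(y))
    return x if votes_x > len(ballots) / 2 else y

print(winner("c", winner("a", winner("b", "d"))))  # a
print(winner("d", winner("b", winner("c", "a"))))  # d
print(winner("d", winner("c", winner("a", "b"))))  # c
print(winner("b", winner("d", winner("c", "a"))))  # b
```

Each single-elimination order produces a different champion, even though the ballots never change.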
7
Borda protocol (used if the binary protocol is too slow): assigns an alternative |O| points when it is the highest preference, |O|-1 points for the second, and so on
The counts are summed across the voters, and the alternative with the highest count becomes the social choice
Winner turns loser and loser turns winner if the lowest-ranked alternative is removed (does this surprise you?) See the table on the next slide
8
Borda Paradox – remove the loser and the winner changes (notice c is always ahead of the removed item)
Seven voters rank four alternatives:
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
Borda counts: a = 18, b = 19, c = 20, d = 13, so c wins and d is the loser
Remove d and re-count the same ballots:
• a > b > c, b > c > a, c > a > b, a > b > c, b > c > a, c > a > b, a > b > c
Borda counts: a = 15, b = 14, c = 13
When the loser is removed, the next loser becomes the winner
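The counts above can be verified mechanically. A small Python sketch (the `borda_counts` helper is mine, assuming |O| points for a top-ranked alternative):

```python
def borda_counts(ballots):
    # |O| points for the top choice, |O|-1 for second, and so on
    n = len(ballots[0])
    scores = {}
    for b in ballots:
        for rank, cand in enumerate(b):
            scores[cand] = scores.get(cand, 0) + (n - rank)
    return scores

# the seven ballots from the slide
ballots = [list("abcd"), list("bcda"), list("cdab")] * 2 + [list("abcd")]
print(borda_counts(ballots))   # {'a': 18, 'b': 19, 'c': 20, 'd': 13}

# drop the loser d and re-count the same voters
reduced = [[c for c in b if c != "d"] for b in ballots]
print(borda_counts(reduced))   # {'a': 15, 'b': 14, 'c': 13}
```

The previous winner c falls to last place once d is removed.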
9
Strategic (insincere) voters
• Suppose your choice will likely come in second place. If you rank the first choice of the rest of the group very low, you may lower that choice enough so that yours comes first
• True story: a Dean's selection. Each committee member was told they had 5 points to award, to spread in any way among the candidates; the recipient of the most points wins. I put all my points on one candidate. Most split their points. I swung the vote. What was my gamble?
• We want a mechanism that gets the results as if truthful voting were done
10
Typical Competition Mechanisms
• Auction: allocate goods or tasks to agents through a market (we need a richer technique for reaching agreements)
• Negotiation: reach agreements through interaction
• Argumentation: resolve conflicts through debate
11
Negotiation
• May involve:
  – Exchange of information
  – Relaxation of initial goals
  – Mutual concession
Mechanisms, Protocols, Strategies
• Negotiation is governed by a mechanism or a protocol
  – defines the "rules of encounter" between the agents
  – the public rules by which the agents will come to agreements
• Given a particular protocol, how can a particular strategy be designed that individual agents can use?
Negotiation Mechanism
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Issues in the negotiation process:
• Negotiation space: all possible deals that agents can make, i.e. the set of candidate deals
• Negotiation protocol: a rule that determines the process of a negotiation – how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on
• Negotiation strategy: when and what proposals should be made
14
Protocol
• Determines the kinds of deals that can be made
• Determines the sequence of offers and counter-offers
• The protocol is like the rules of a chess game, whereas the strategy is the way in which a player decides which move to make
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
  – Convergence/guaranteed success
  – Maximizing global welfare: the sum of agent benefits is maximized
  – Pareto efficiency
  – Individual rationality
  – Stability: no agent should have an incentive to deviate from its strategy
  – Simplicity: low computational demands, little communication
  – Distribution: no central decision maker
  – Symmetry: we do not want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• We can't always achieve every attribute, so look at the tradeoffs among choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Figure: Agenti sends a proposal to Agentj, Agentj returns a counter-proposal, Agenti concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (one can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Figure: proposals by Ai and proposals by Aj converge on a point of acceptance/agreement]
22
Many types of interactive concession-based methods
• Some use multiple-objective linear programming
  – requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggested value and continue until no joint improvements are possible
  – Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• The mediator helps the players criticize a tentative agreement (could be the status quo)
• Generate a compromise direction (where each of the k issues is a direction in k-space)
• The mediator helps the players find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of negotiation is to minimize the cost of completing the tasks
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative)
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of negotiation is to maximize mutual worth (rather than worth to the individual)
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  – Multiple attributes:
    • Single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes – several inter-related attributes, e.g. buying a car
  – The number of agents and the way they interact:
    • One-to-one, e.g. a single buyer and a single seller
    • Many-to-one, e.g. multiple buyers and a single seller (auctions)
    • Many-to-many, e.g. multiple buyers and multiple sellers
26
Single-issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
  – If one task requires less travel, both would benefit equally by having less travel
  – The utility of a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
  – If you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple-issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down and quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, a bigger office, working from home)
28
How many agents are involved?
• One-to-one
• One-to-many (an auction is an example: one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
  – n(n-1)/2 pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  – Utility represents how much an agent has to gain from the deal (it is always based on the change from the original allocation)
  – Since an agent can achieve its goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
    • If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if the agents fail to reach an agreement
  – no agent agrees to execute tasks other than its own
  – utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple <T, Ag, c> where
  – T is a finite set of all possible tasks
  – Ag = {A1, A2, …, An} is a list of participant agents
  – c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment, D is the assignment after the "deal"
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery: several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal
• Database Queries: several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …): "You are doing a join as part of another operation, so please save the results for me"
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figuring the deals, knowing the union must be ab:
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
  – a for agent 1 → b | ab for agent 2
  – b for agent 1 → a | ab
  – ab for agent 1 → ∅ | a | ab | b
  – ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utilityk(δ) = c(Tk) - Costk(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal
  – c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
  – Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0
37
Parcel Delivery Domain (assuming couriers do not have to return home – like a U-Haul)
[Figure: a distribution point with cities a and b, each at distance 1 from the distribution point and 2 from each other]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
  δ is at least as good for every agent as δ': ∀k ∈ {1, 2}, Utilityk(δ) ≥ Utilityk(δ')
  δ is better for some agent than δ': ∃k ∈ {1, 2}, Utilityk(δ) > Utilityk(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'
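The two conditions translate directly into code. A minimal sketch, with each deal represented by its tuple of agent utilities (the helper names are mine):

```python
def dominates(d, d_prime):
    # d dominates d': at least as good for every agent, strictly better for one
    at_least = all(u >= v for u, v in zip(d, d_prime))
    strictly = any(u > v for u, v in zip(d, d_prime))
    return at_least and strictly

def weakly_dominates(d, d_prime):
    # only the first condition: not worse for anyone
    return all(u >= v for u, v in zip(d, d_prime))

print(dominates((0, 2), (0, 0)))         # True
print(dominates((0, 2), (0, 2)))         # False (no strict improvement)
print(weakly_dominates((0, 2), (0, 2)))  # True
```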
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from the previous slide)
1. Utility1(a, b) = 0      Utility2(a, b) = 2
2. Utility1(b, a) = 0      Utility2(b, a) = 2
3. Utility1(ab, ∅) = -2    Utility2(ab, ∅) = 3
4. Utility1(∅, ab) = 1     Utility2(∅, ab) = 0
5. Utility1(a, ab) = 0     Utility2(a, ab) = 0
6. Utility1(b, ab) = 0     Utility2(b, ab) = 0
7. Utility1(ab, a) = -2    Utility2(ab, a) = 2
8. Utility1(ab, b) = -2    Utility2(ab, b) = 2
9. Utility1(ab, ab) = -2   Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
Of the nine deals (a, b), (b, a), (ab, ∅), (∅, ab), (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab), the individual rational ones are:
(a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
Of the nine deals, the Pareto optimal ones are:
(a, b), (b, a), (ab, ∅), (∅, ab)
(a, ab) is beaten by the (∅, ab) deal; (ab, ∅) is (-2, 3), but nothing beats 3 for agent 2
43
Negotiation Set
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
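The whole pipeline – enumerate the pure deals, filter for individual rationality and for Pareto optimality, intersect – can be sketched in a few lines of Python for this encounter. The encoding (frozensets for task sets, tuples for deals) is mine:

```python
from itertools import combinations

# cost function of the parcel example: c(∅)=0, c(a)=c(b)=1, c(ab)=3
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
T1, T2 = frozenset("a"), frozenset("ab")   # original allocations

def subsets(s):
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

# pure deals: all (D1, D2) with D1 ∪ D2 = T1 ∪ T2
tasks = T1 | T2
deals = [(d1, d2) for d1 in subsets(tasks) for d2 in subsets(tasks)
         if d1 | d2 == tasks]

def utils(deal):
    # Utility_k = c(T_k) - c(D_k)
    return (cost[T1] - cost[deal[0]], cost[T2] - cost[deal[1]])

def dominates(x, y):
    return all(a >= b for a, b in zip(x, y)) and any(a > b for a, b in zip(x, y))

rational = [d for d in deals if all(u >= 0 for u in utils(d))]
pareto = [d for d in deals if not any(dominates(utils(e), utils(d)) for e in deals)]
ns = [d for d in rational if d in pareto]
print([(sorted(d1), sorted(d2)) for d1, d2 in ns])
# [([], ['a', 'b']), (['a'], ['b']), (['b'], ['a'])]
```

This reproduces the negotiation set {(a, b), (b, a), (∅, ab)} from the slides.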
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?
45
Negotiation Set in Task-oriented Domains
[Figure: deals plotted as utility for agent i vs. utility for agent j; the circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict utilities; the negotiation set (Pareto optimal + individual rational) is the boundary arc through the labeled points A–E]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol, or a concession protocol
• At each step t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that:
  – both deals are from the negotiation set
  – ∀i and t > 0, Utilityi(δ(i, t)) ≤ Utilityi(δ(i, t-1)) – I propose something less desirable for me than before
• Negotiation ending:
  – Conflict: Utilityi(δ(i, t)) = Utilityi(δ(i, t-1)) for both agents
  – Agreement: ∃j ≠ i, Utilityj(δ(i, t)) ≥ Utilityj(δ(j, t)):
    • only A → agree on δ(B, t) (A agrees with B's proposal)
    • only B → agree on δ(A, t) (B agrees with A's proposal)
    • both A and B → agree on the δ(k, t) such that π(δ(k)) = max{π(δ(A)), π(δ(B))}
    • both A and B with π(δ(A)) = π(δ(B)) → flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
• Applies to pure deals and mixed deals
47
The Monotonic Concession Protocol – each agent moves in one direction, toward the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they may re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta-data: an explanation or critique of the deal
48
Condition to Consent to an Agreement
Both agents find that the deal proposed by the other is at least as good as or better than their own proposal:
  Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence – it will not go on indefinitely
• Disadvantages:
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic concession protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (it is a way of telling the other agent what you value)
[Figure: agent 1's best deal at one end, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession
[Figure: between agent 1's best deal and agent 2's best deal, each agent asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
  – You have lost much of your expected utility (it is closer to zero)
  – In case conflict occurs, you are not much worse off
  – So you are more willing to risk conflict
An agent's willingness to risk conflict is measured by the difference in utility between its loss in making a concession and its loss in taking the conflict deal, with respect to its current offer
• If both are equally willing to risk conflict, both concede
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• how much you will lose if you make a concession and accept your opponent's offer
• how much you will lose if you stand still, which causes a conflict:
  riski = (Utilityi(δi) - Utilityi(δj)) / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j, respectively
riski is the willingness to risk conflict (1 means perfectly willing to risk it)
55
Risk Evaluation
• riski measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point
[Figure: between Ai's best deal and Aj's best deal, relative to the conflict deal: "How much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum still hoped for]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to
Q: What if both have equal risk?
A: Both concede
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a, agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
Utility of agent 1: Utility1(a, b) = 0, Utility1(b, a) = 0, Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2, Utility2(b, a) = 2, Utility2(∅, ab) = 0
First offers: agent 1 proposes (∅, ab), agent 2 proposes (a, b)
Risk of conflict: 1 for agent 1, 1 for agent 2
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each labeled "he should concede"]
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground
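Plugging the two offers into the risk formula confirms the deadlock: both agents are maximally willing to risk conflict. A small Python sketch (the encoding is mine; conflict utility is 0, so the denominator is just the agent's current utility):

```python
def risk(my_offer_u, their_offer_u):
    # fraction of my current gain I would lose by accepting the other's offer
    return 1.0 if my_offer_u == 0 else (my_offer_u - their_offer_u) / my_offer_u

# offers from the slide: agent 1 proposes (∅, ab), agent 2 proposes (a, b)
u1_own, u1_other = 1, 0   # agent 1's utility for its own offer / for agent 2's
u2_own, u2_other = 2, 0   # agent 2's utility for its own offer / for agent 1's
print(risk(u1_own, u1_other), risk(u2_own, u2_other))  # 1.0 1.0
```

With equal risk both should concede, but any concession here means accepting utility 0, so no settlement improves on the stand-off.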
61
Parcel Delivery Domain Example 2 (don't return to the distribution point)
[Figure: cities a, b, c, d spaced 1 apart in a row; the distribution point is 7 from a and 7 from d]
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices in the set are individual rational, as neither agent can do worse than the conflict deal; e.g. (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain Example 2 (Zeuthen works here: both concede on equal risk)
No. | Pure deal     | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)     | 0  | 10
2   | (abc, d)      | 1  | 3
3   | (ab, cd)      | 2  | 2
4   | (a, bcd)      | 3  | 1
5   | (∅, abcd)     | 10 | 0
    | Conflict deal | 0  | 0
Agent 1 concedes through deals 5, 4, 3; agent 2 through deals 1, 2, 3
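The round-by-round dynamics can be simulated. The sketch below uses a simplified concession rule of my own (one step along each agent's preference order, both conceding on equal risk); it illustrates the equal-risk case, not the full Zeuthen minimal-sufficient-concession computation:

```python
# the five pure deals as (agent 1 utility, agent 2 utility)
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(my_offer, their_offer, me):
    # fraction of current gain still at stake (conflict utility is 0)
    return 1.0 if my_offer[me] == 0 else (my_offer[me] - their_offer[me]) / my_offer[me]

pref1 = sorted(deals, key=lambda d: -d[0])   # agent 1: best-for-1 first
pref2 = sorted(deals, key=lambda d: -d[1])   # agent 2: best-for-2 first
o1, o2 = pref1[0], pref2[0]                  # opening offers: own best deals
while True:
    # agreement: an agent likes the other's offer at least as much as its own
    if o2[0] >= o1[0] or o1[1] >= o2[1]:
        agreement = o2 if o2[0] >= o1[0] else o1
        break
    r1, r2 = risk(o1, o2, 0), risk(o2, o1, 1)
    if r1 <= r2:                             # lower (or equal) risk concedes
        o1 = pref1[pref1.index(o1) + 1]
    if r2 <= r1:
        o2 = pref2[pref2.index(o2) + 1]
print(agreement)  # (2, 2)
```

Risks are equal at every round, so both agents step inward together and meet at (ab, cd) with utilities (2, 2), as the slide claims.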
63
What bothers you about the previous agreement?
• The agents decide to both get (2, 2) utility rather than, say, the (0, 10) utility of another choice (a higher total)
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan J are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can just do your original task
• With SOD, you may get in each other's way
• Don't accept partially completed plans
A state oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as the set of states it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
  – The goals may contradict, so there is no mutually acceptable state (e.g. both need the car)
  – A common state can be found, but perhaps it cannot be reached with the primitive operations of the domain (they could both travel together, but one may need to know how to pick up the other)
  – There might be a reachable state that satisfies both, but it may be too expensive – the agents are unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that an agent gets a certain plan
• This is called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a mixed deal, then
  costi(δ) = p·c(J)i + (1-p)·c(J)k
  where k is i's opponent – with probability (1-p), i plays the role of k
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the joint plan
• For the parcel delivery example:
72
Parcel Delivery Domain (assuming couriers do not have to return home)
[Figure: a distribution point with cities a and b, each at distance 1 from the distribution point and 2 from each other]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a): Utility1(a, b) = 0, Utility1(b, a) = 0, Utility1(ab, ∅) = -2, Utility1(∅, ab) = 1, …
Utility for agent 2 (originally assigned ab): Utility2(a, b) = 2, Utility2(b, a) = 2, Utility2(ab, ∅) = 3, Utility2(∅, ab) = 0, …
73
Consider deal 3 with probability
• (ab, ∅):p means agent 1 does ∅ with probability p and ab with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• Setting them equal: (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
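The algebra above is easy to double-check with exact fractions. A sketch (the function names are mine):

```python
from fractions import Fraction

# mixed deal (ab, ∅):p — agent 1 plays ∅ with probability p, ab with 1-p
def u1(p): return (1 - p) * (-2) + p * 1   # agent 1's expected utility
def u2(p): return (1 - p) * 3 + p * 0      # agent 2's expected utility

p = Fraction(5, 6)          # solves 3p - 2 = 3 - 3p
print(p, u1(p), u2(p))      # 5/6 1/2 1/2
```

At p = 5/6 both agents expect utility 1/2, so the lottery is fair.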
74
Try again with the other choice in the negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2 has no solution
• Can you see why we can't use a p to make this deal fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal δm = [(TA ∪ TB, ∅) : p] such that π(δm) = max over the negotiation set of π(δ)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of the utilities (as this is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (both goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four, so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair"
• Assign (2, 6) – A1 the cost-2 role and A2 the cost-6 role – p of the time
• Utility for agent 1 = p(1) + (1-p)(-3): it loses utility if it takes the cost-6 role for a benefit of 3
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal: 4p - 3 = 4 - 4p ⇒ p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as it helped with the swap)
• So with 1/2 probability each: 1/2(12) + 1/2(-4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
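A quick check of the coin-flip arithmetic (a sketch using the slide's numbers: stand-alone cost 26, joint swap share 4):

```python
from fractions import Fraction

half = Fraction(1, 2)
solo_cost = 26            # 8 per swap + 10 for the rest (slide's numbers)
swap_share = 4            # each agent's cost share of the cooperative swap

win_utility = solo_cost - swap_share - 10   # winner still pays 10 for the rest
lose_utility = -swap_share                  # loser only helped with the swap

expected = half * win_utility + half * lose_utility
assert expected == 4

# If both goals could be satisfied at cost 24 each, utility is 26 - 24 = 2,
# so gambling on the coin flip doubles the expected utility
assert expected == 2 * (solo_cost - 24)
```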
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tile World (like an airport shuttle): it isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
– Try to find the Pareto optimum
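A minimal sketch of the weighted-attribute approach (the 60/15/25 weights are from the slide; the offer's normalized scores are hypothetical):

```python
# Weights over the attributes (must sum to 1)
weights = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    # offer maps each attribute to a normalized score in [0, 1]
    return sum(weights[a] * offer[a] for a in weights)

# Hypothetical offer: middling price, high quality, decent support
offer = {"price": 0.5, "quality": 1.0, "support": 0.8}
u = utility(offer)   # 0.30 + 0.15 + 0.20 = 0.65
assert abs(u - 0.65) < 1e-9
```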
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment
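The subadditivity condition can be tested exhaustively for a small task set. A sketch with hypothetical cost tables: a tree-shaped delivery route (subadditive) and a threshold cost (not subadditive):

```python
from itertools import chain, combinations

def powerset(tasks):
    return chain.from_iterable(combinations(tasks, r) for r in range(len(tasks) + 1))

def is_subadditive(tasks, cost):
    # c(X ∪ Y) <= c(X) + c(Y) for all finite X, Y ⊆ T
    subsets = [frozenset(s) for s in powerset(tasks)]
    return all(cost[x | y] <= cost[x] + cost[y] for x in subsets for y in subsets)

fs = frozenset
# Tree route: b lies beyond a, so delivering both shares the leg to a
tree = {fs(): 0, fs("a"): 1, fs("b"): 2, fs("ab"): 2}
assert is_subadditive("ab", tree)

# Threshold cost: doing both pushes past a threshold (new equipment needed)
threshold = {fs(): 0, fs("a"): 1, fs("b"): 1, fs("ab"): 3}
assert not is_subadditive("ab", threshold)
```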
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have 2 sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
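Concavity can likewise be checked exhaustively. A sketch with hypothetical costs showing a domain that is subadditive but not concave (in the spirit of the postmen loop example later in the slides), and a size-based cost that is concave:

```python
from itertools import chain, combinations

def powerset(tasks):
    return chain.from_iterable(combinations(tasks, r) for r in range(len(tasks) + 1))

def is_concave(tasks, cost):
    # for all X ⊆ Y and any Z: c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
    subsets = [frozenset(s) for s in powerset(tasks)]
    return all(cost[y | z] - cost[y] <= cost[x | z] - cost[x]
               for x in subsets for y in subsets if x <= y for z in subsets)

fs = frozenset
# z is "on the way" when doing a alone, but adds 1 once b is included
c = {fs(): 0, fs("a"): 2, fs("b"): 2, fs("z"): 2,
     fs("ab"): 3, fs("az"): 2, fs("bz"): 3, fs("abz"): 4}
assert not is_concave("abz", c)   # X={a}, Y={a,b}, Z={z}: 4-3 > 2-2

# A cost that just counts tasks (c(S) = |S|) is concave
size = {fs(s): len(fs(s)) for s in powerset("abz")}
assert is_concave("abz", size)
```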
95
Tentative Explanation of Previous Chart
• The arrows show reasons we know a fact (diagonal arrows are between domains); the rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which is in turn a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
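Modularity is the strongest of the three conditions and can be checked the same way. A sketch: a fax-style cost (one independent unit per destination) is modular, while a tree-shaped delivery route is not:

```python
from itertools import chain, combinations

def powerset(tasks):
    return chain.from_iterable(combinations(tasks, r) for r in range(len(tasks) + 1))

def is_modular(tasks, cost):
    # c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y) for all X, Y
    subsets = [frozenset(s) for s in powerset(tasks)]
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in subsets for y in subsets)

fs = frozenset
# Fax-style: one unit per destination, costs are independent
fax = {fs(s): len(fs(s)) for s in powerset("ab")}
assert is_modular("ab", fax)

# Tree-shaped delivery route: legs are shared, so costs are not independent
tree = {fs(): 0, fs("a"): 1, fs("b"): 2, fs("ab"): 2}
assert not is_modular("ab", tree)
```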
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost tasks Z add to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At seats – is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected utility: ⟨(abc), ∅⟩:1/2 = 5
• Lie: p = 1/2 (as the utility is the same)
• Expected utility (for 1): ⟨(abc), ∅⟩:1/2 = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie)
103
• FP2: in a Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other person delivers the phantom, you are found out)
• FP3: in a Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different than the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 gets   U(1)   Agent 2 gets   U(2) seems   U(2) actual
a              2      bc             4            4
b              4      ac             2            2
bc             2      a              4            2
ab             0      c              6            6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example from the next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is ⟨(ab), ∅⟩:1/2 = 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, as there is no probability; the utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
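The FP5 arithmetic, checked (a sketch; all payoff numbers are the slide's):

```python
from fractions import Fraction

half = Fraction(1, 2)

# Truth: the mixed deal is taken with p = 1/2; agent 1's utility is 4 in
# one outcome and 2 in the other (slide's numbers)
truth_utility = half * 4 + half * 2
assert truth_utility == 3

# Phantom letter at c: the pure deal splits the work with no probability;
# agent 1's utility = 6 (stand-alone cost) - 2 (cost of its part of the deal)
lie_utility = 6 - 2
assert lie_utility == 4 and lie_utility > truth_utility  # lying pays here
```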
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 31/18 ≈ 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
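The FP6 probabilities fall out of equalizing the two agents' apparent utilities, which are linear in p. A sketch (payoffs from the slide):

```python
from fractions import Fraction

def equalizing_p(a, b, c, d):
    # solve p*a + (1-p)*b == p*c + (1-p)*d for p
    return Fraction(d - b, (a - b) - (c - d))

# Truth: p(-1) + (1-p)(6) = p(4) + (1-p)(-3)
p_truth = equalizing_p(-1, 6, 4, -3)
assert p_truth == Fraction(9, 14)

# With the decoy letter to h: p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
p_lie = equalizing_p(-3, 6, 4, -5)
assert p_lie == Fraction(11, 18)

# Agent 1's real utility is p(-1) + (1-p)(6) in both cases
u = lambda p: p * (-1) + (1 - p) * 6
assert u(p_truth) == Fraction(21, 14)   # 1.5
assert u(p_lie) == Fraction(31, 18)     # ~1.72: the decoy lie helped
assert u(p_lie) > u(p_truth)
```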
107
Postmen – return to post office
(Figure: three route maps – concave; subadditive, where h is the decoy; phantom)
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util⟨(fa), (ebcd)⟩:1/2 = 4 (saves going to two)
• Under the lie, divide as ⟨(ef), (dcab)⟩:p (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie: ⟨(ab), (cdef)⟩:p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• Utility is actually
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef, 1/4 of the time I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that
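The FP8 numbers, checked the same way (a sketch; apparent payoffs from the slide, and the hidden letter to a makes agent 1's real winning payoff 6 rather than the apparent 4):

```python
from fractions import Fraction

def equalizing_p(a, b, c, d):
    # solve p*a + (1-p)*b == p*c + (1-p)*d for p
    return Fraction(d - b, (a - b) - (c - d))

# Apparent utilities under the lie: p(4) + (1-p)(0) = p(2) + (1-p)(6)
p = equalizing_p(4, 0, 2, 6)
assert p == Fraction(3, 4)

# Real utility: winning is worth 6 (the hidden delivery to a comes free)
real = p * 6 + (1 - p) * 0
assert real == Fraction(9, 2)   # 4.5 > 4, the utility of telling the truth
```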
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused; reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Chart: worth function for the first agent – worth ranges from 0 to 100 over meeting times 9:00, 12:00, 16:00)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's offers converge over time to a point of acceptance)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
(Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's curves never meet)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B ≡ A u B or A r B
– A defeats B ≡ A u B, or (A r B and not B u A)
– A strongly attacks B ≡ A a B and not B u A
– A strongly undercuts B ≡ A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked (with respect to a set) if some y attacks x (y → x) and no member of the set attacks y
2. Argument x is acceptable if every attacker of x is attacked
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
(Figure: attack graph over arguments a, b, c, d)
Which sets of arguments can be true? c is always attacked;
d is always acceptable
133
An Example Abstract Argument System
5
Arrow's impossibility theorem: no social choice rule satisfies all six conditions; we must relax the desired attributes.
We may not require > to always be defined. We may not require that > is asymmetric and transitive.
Use a plurality protocol: all votes are cast simultaneously and the highest vote count wins.
Introducing an irrelevant alternative may split the majority, causing both the old majority and the new irrelevant alternative to drop out of favor (the Ross Perot effect).
A binary protocol involves voting pairwise – single elimination. The order of the pairing can totally change the results (the figure below is fascinating). This is the reason for seedings in a basketball tournament.
6
One voter ranks c > d > b > a
One voter ranks a > c > d > b
One voter ranks b > a > c > d
Notice this just rotates the preferences
winner(c, winner(a, winner(b, d))) = a
winner(d, winner(b, winner(c, a))) = d
winner(d, winner(c, winner(a, b))) = c
winner(b, winner(d, winner(c, a))) = b
Surprisingly, the order of pairing yields different winners
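The four orderings can be verified mechanically; a sketch using the three rotated preference lists and simple majority for each pairwise round:

```python
# The three voters' rankings (best first), as on the slide
prefs = [["c", "d", "b", "a"],
         ["a", "c", "d", "b"],
         ["b", "a", "c", "d"]]

def winner(x, y):
    # x beats y if a majority of voters rank x above y
    votes = sum(1 for p in prefs if p.index(x) < p.index(y))
    return x if votes * 2 > len(prefs) else y

# Different single-elimination pairings, different winners
assert winner("c", winner("a", winner("b", "d"))) == "a"
assert winner("d", winner("b", winner("c", "a"))) == "d"
assert winner("d", winner("c", winner("a", "b"))) == "c"
assert winner("b", winner("d", winner("c", "a"))) == "b"
```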
7
Borda protocol (used if the binary protocol is too slow): assign an alternative |O| points for the highest preference, |O|-1 points for the second, and so on
The counts are summed across the voters, and the alternative with the highest count becomes the social choice
Winner turns loser and loser turns winner if the lowest-ranked alternative is removed (does this surprise you?) – see the table on the next slide
8
Borda Paradox – remove the loser and the winner changes (notice c is always ahead of the removed item)
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a < b < c < d
Totals: a = 18, b = 19, c = 20, d = 13

With d removed:
• a > b > c
• b > c > a
• c > a > b
• a > b > c
• b > c > a
• c > a > b
• a < b < c
Totals: a = 15, b = 14, c = 13

When the loser is removed, the next loser becomes the winner
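The remove-the-loser paradox can be reproduced with a generic Borda counter. A sketch (the profile below is a small illustrative one I verified, not necessarily the slide's exact tally): each top choice earns |O| points, and dropping the Borda loser flips the winner:

```python
def borda(profile):
    # |O| points for the top choice, |O|-1 for the next, and so on
    n = len(profile[0])
    scores = {}
    for ranking in profile:
        for pos, cand in enumerate(ranking):
            scores[cand] = scores.get(cand, 0) + (n - pos)
    return scores

def without(profile, cand):
    return [[c for c in r if c != cand] for r in profile]

profile = (2 * [["a", "b", "c"]] + 2 * [["b", "c", "a"]]
           + 2 * [["c", "a", "b"]] + [["c", "b", "a"]])

s = borda(profile)                      # {'a': 13, 'b': 14, 'c': 15}: c wins
loser = min(s, key=s.get)               # 'a'
s2 = borda(without(profile, loser))     # {'b': 11, 'c': 10}: now b wins
assert max(s, key=s.get) == "c" and max(s2, key=s2.get) == "b"
```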
9
Strategic (insincere) voters
• Suppose your choice will likely come in second place. If you rank the first choice of the rest of the group very low, you may lower that choice enough so that yours comes in first
• True story: a dean's selection. Each committee member was told they had 5 points to award and could spread them out any way among the candidates; the recipient of the most points wins. I put all my points on one candidate; most members split their points. I swung the vote. What was my gamble?
• We want to get the results as if truthful voting were done
10
Typical Competition Mechanisms
• Auction: allocate goods or tasks to agents through a market. Need a richer technique for reaching agreements
• Negotiation: reach agreements through interaction
• Argumentation: resolve conflicts through debate
11
Negotiation
• May involve:
– Exchange of information
– Relaxation of initial goals
– Mutual concession
12
Mechanisms, Protocols, Strategies
• Negotiation is governed by a mechanism or a protocol
– defines the "rules of encounter" between the agents
– the public rules by which the agents will come to agreements
• Given a particular protocol, how can a particular strategy be designed that individual agents can use?
13
Negotiation Mechanism
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Issues in the negotiation process:
• Negotiation space: all possible deals that agents can make, i.e., the set of candidate deals
• Negotiation protocol – a rule that determines the process of a negotiation: how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on
• Negotiation strategy: when and what proposals should be made
14
Protocol
• Determines the kinds of deals that can be made
• Determines the sequence of offers and counter-offers
• A protocol is like the rules of a chess game, whereas a strategy is the way in which a player decides which move to make
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
– Convergence/guaranteed success
– Maximizing global welfare: the sum of agent benefits is maximized
– Pareto efficiency
– Individual rationality
– Stability: no agent should have an incentive to deviate from its strategy
– Simplicity: low computational demands, little communication
– Distribution: no central decision maker
– Symmetry: we don't want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• We can't always achieve every attribute, so look at the tradeoffs among the choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
(Diagram: Agent i and Agent j exchange a proposal and a counter-proposal; Agent i concedes)
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about 50/50 or 90/10 depending on who "moves" the farthest):
(Diagram: proposals by Ai and proposals by Aj converge to a point of acceptance/agreement)
22
Many types of interactive concession-based methods
• Some use multiple-objective linear programming –
– requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value and continue until no joint improvements are possible
– Used in the Camp David peace negotiations (Egypt/Israel – Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• The mediator helps the players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• The mediator helps the players find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes:
• Single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
• Multiple attributes – several inter-related attributes, e.g., buying a car
– The number of agents and the way they interact:
• One-to-one, e.g., a single buyer and a single seller
• Many-to-one, e.g., multiple buyers and a single seller: auctions
• Many-to-many, e.g., multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
– If one task requires less travel, both would benefit equally by having less travel
– Utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
– If you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job... ask for stock options, a bigger office, work from home)
28
How many agents are involved?
• One-to-one
• One-to-many (an auction is an example of one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
– n(n-1)/2 number of pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo): if the agents fail to reach an agreement
– no agent agrees to execute tasks other than its own
– utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, ..., An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: (T1, T2) is the original assignment, (D1, D2) is the assignment after the "deal"
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned: D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery:
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate the deliveries so that the travel cost of each courier is minimal.
• Database Queries:
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (join, projection, union, intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b}) – the conflict deal (D1, D2) = (T1, T2)
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
35
Figure the deals knowing the union must be {a,b}:
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack"
• a for agent 1 ⇒ b | ab for agent 2
• b for agent 1 ⇒ a | ab
• ab for agent 1 ⇒ ∅ | a | b | ab
• ∅ for agent 1 ⇒ ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utilityk(δ) = c(Tk) - Costk(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0
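As a sketch of this bookkeeping (cost table and encounter taken from the parcel example on these slides; the helper names are mine, not from the text), the utilities can be computed directly:

```python
# Assumed cost table for the two-city parcel example from the slides:
# c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3
COST = {frozenset(): 0, frozenset({"a"}): 1,
        frozenset({"b"}): 1, frozenset({"a", "b"}): 3}

def utility(stand_alone_tasks, deal_part):
    # Utility_k(delta) = c(T_k) - c(D_k)
    return COST[frozenset(stand_alone_tasks)] - COST[frozenset(deal_part)]

# Encounter: T1 = {a}, T2 = {a, b}
print(utility({"a"}, set()))            # agent 1 delivers nothing: 1 - 0 = 1
print(utility({"a", "b"}, {"a", "b"}))  # agent 2 delivers both: 3 - 3 = 0
```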
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
(figure: distribution point with city a and city b each at distance 1)
Cost function:
c(∅) = 0
c({a}) = 1
c({b}) = 1
c({a,b}) = 3
Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
δ is at least as good for every agent as δ′:
∀k ∈ {1,2}: Utilityk(δ) ≥ Utilityk(δ′)
δ is better for some agent than δ′:
∃k ∈ {1,2}: Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′
39
Negotiation Set: Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (best deal for x without disadvantaging y)
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS)
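These definitions can be checked mechanically. Below is a sketch that enumerates the pure deals for the two-agent parcel encounter and filters for individual rationality and Pareto optimality (cost table as assumed earlier; all function names are illustrative):

```python
from itertools import combinations

COST = {frozenset(): 0, frozenset({"a"}): 1,
        frozenset({"b"}): 1, frozenset({"a", "b"}): 3}

def powerset(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

T1, T2 = frozenset({"a"}), frozenset({"a", "b"})
union = T1 | T2

# All pure deals: (D1, D2) with D1 ∪ D2 = T1 ∪ T2
deals = [(d1, d2) for d1 in powerset(union) for d2 in powerset(union)
         if d1 | d2 == union]

def utils(deal):
    d1, d2 = deal
    return (COST[T1] - COST[d1], COST[T2] - COST[d2])

# Individually rational: weakly dominates the conflict deal (utility >= 0 for both)
ir = [d for d in deals if all(u >= 0 for u in utils(d))]

def dominates(d, e):
    ud, ue = utils(d), utils(e)
    return all(x >= y for x, y in zip(ud, ue)) and any(x > y for x, y in zip(ud, ue))

pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in ir if d in pareto]
print(sorted(set(utils(d) for d in ns)))  # utility pairs in the negotiation set
```

Running this reproduces the slide's counts: nine pure deals, five individually rational, and a negotiation set whose utility pairs are (0, 2) and (1, 0).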
40
Utility Function for Agents (example from the previous slide)
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
5. Utility1({a}, {a,b}) = 0
6. Utility1({b}, {a,b}) = 0
7. Utility1({a,b}, {a}) = -2
8. Utility1({a,b}, {b}) = -2
9. Utility1({a,b}, {a,b}) = -2
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
5. Utility2({a}, {a,b}) = 0
6. Utility2({b}, {a,b}) = 0
7. Utility2({a,b}, {a}) = 2
8. Utility2({a,b}, {b}) = 2
9. Utility2({a,b}, {a,b}) = 0
41
Individually Rational for Both (eliminate any choices that are negative for either agent)
All nine deals:
({a}, {b}), ({b}, {a}), ({a,b}, ∅), (∅, {a,b}), ({a}, {a,b}), ({b}, {a,b}), ({a,b}, {a}), ({a,b}, {b}), ({a,b}, {a,b})
Individually rational:
({a}, {b}), ({b}, {a}), (∅, {a,b}), ({a}, {a,b}), ({b}, {a,b})
42
Pareto Optimal Deals
All nine deals:
({a}, {b}), ({b}, {a}), ({a,b}, ∅), (∅, {a,b}), ({a}, {a,b}), ({b}, {a,b}), ({a,b}, {a}), ({a,b}, {b}), ({a,b}, {a,b})
Pareto optimal:
({a}, {b}), ({b}, {a}), ({a,b}, ∅), (∅, {a,b})
({a}, {a,b}) is beaten by the (∅, {a,b}) deal; ({a,b}, ∅) is (-2, 3), but nothing beats 3 for agent 2
43
Negotiation Set
Negotiation set:
({a}, {b}), ({b}, {a}), (∅, {a,b})
Individually rational deals:
({a}, {b}), ({b}, {a}), (∅, {a,b}), ({a}, {a,b}), ({b}, {a,b})
Pareto optimal deals:
({a}, {b}), ({b}, {a}), ({a,b}, ∅), (∅, {a,b})
44
Negotiation Set Illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
(axes: utility for i vs. utility for j)
45
Negotiation Set in Task-Oriented Domains
(figure: deals A–E plotted with utility for agent i on one axis and utility for agent j on the other; dashed lines mark the utility of the conflict deal for each agent, with the conflict deal at their intersection; the circle delimits the space of all possible deals; the negotiation set – Pareto optimal + individually rational – is the boundary arc above both dashed lines)
46
Negotiation Protocol
Π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: one-step protocol
– Concession protocol
• At t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that:
– Both deals are from the negotiation set
– ∀i and t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t-1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t-1)) (no one concedes)
– Agreement: ∃j ≠ i, Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t))
• Only A ⇒ agree on δ(B,t) (A agrees with B's proposal)
• Only B ⇒ agree on δ(A,t) (B agrees with A's proposal)
• Both A and B ⇒ agree on δ(k,t) such that Π(δ(k)) = max{Π(δ(A)), Π(δ(B))}
• Both A and B, and Π(δ(A)) = Π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
(applies to pure deals and to mixed deals)
47
The Monotonic Concession Protocol – one direction, move towards the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Metadata: explanation or critique of the deal
48
Condition to Consent to an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal they made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
(figure: agent 1's best deal at one end of the deal space, agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(figure: each agent asks "How much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– You are more willing to risk conflict
An agent's willingness to risk conflict is measured by the difference between its loss in making a concession and its loss in taking the conflict deal, with respect to its current offer
• If both are equally willing to risk, both concede
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

riski = [Utilityi(δi) - Utilityi(δj)] / Utilityi(δi)

where δi and δj are the current offers of agent i and agent j, respectively
risk is willingness to risk conflict (1 is perfectly willing to risk)
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk)
• This assumes you know what the other agent's utility is
• What one sets as the initial goal affects risk. If I set an impossible goal, my willingness to risk is always higher
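A minimal sketch of the risk calculation (the utility numbers below are made up for illustration, not taken from a slide):

```python
# Zeuthen risk: risk_i = (U_i(own offer) - U_i(other's offer)) / U_i(own offer)
def risk(u_own_offer, u_other_offer):
    if u_own_offer == 0:      # nothing left to lose: fully willing to risk conflict
        return 1.0
    return (u_own_offer - u_other_offer) / u_own_offer

# Assumed numbers: agent 1's current offer is worth 4 to itself, and
# agent 2's offer is worth 1 to agent 1; symmetrically 3 and 2 for agent 2.
r1 = risk(4, 1)   # 0.75
r2 = risk(3, 2)   # ~0.33
# The agent with the smaller risk has more to lose from conflict, so it concedes
conceder = 1 if r1 < r2 else 2
print(r1, r2, conceder)
```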
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point
(figure: a line from Ai's best deal to Aj's best deal, with the conflict deal below; arrows mark "How much am I willing to risk a conflict?", "Maximum to gain from agreement", and "Maximum still hoped to gain")
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it himself/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: ({a}, {b}), ({b}, {a}), (∅, {a,b})
First offers: agent 1 proposes (∅, {a,b}); agent 2 proposes ({a}, {b})
Utility of agent 1:
Utility1({a}, {b}) = 0
Utility1({b}, {a}) = 0
Utility1(∅, {a,b}) = 1
Utility of agent 2:
Utility2({a}, {b}) = 2
Utility2({b}, {a}) = 2
Utility2(∅, {a,b}) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
(figure: agent 1's best deal and agent 2's best deal, each marked "He should concede" toward the other)
Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(figure: distribution point with cities a and d at distance 7, and b and c at distance-1 steps along the way)
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: ({a,b,c,d}, ∅), ({a,b,c}, {d}), ({a,b}, {c,d}), ({a}, {b,c,d}), (∅, {a,b,c,d})
Conflict deal: ({a,b,c,d}, {a,b,c,d})
All choices are individually rational, as the agents can't do worse than the conflict deal; ({a,c}, {b,d}) is dominated by ({a,b}, {c,d})
62
Parcel Delivery Domain, Example 2 (Zeuthen works here; both concede on equal risk)

No.  Pure deal           Agent 1's utility   Agent 2's utility
1    ({a,b,c,d}, ∅)      0                   10
2    ({a,b,c}, {d})      1                   3
3    ({a,b}, {c,d})      2                   2
4    ({a}, {b,c,d})      3                   1
5    (∅, {a,b,c,d})      10                  0
     Conflict deal       0                   0

(agent 1 concedes from deal 5 toward deal 3; agent 2 concedes from deal 1 toward deal 3)
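The convergence on the (2, 2) deal can be sketched as a small simulation of the monotonic concession protocol with Zeuthen risk (deal table as above; the tie-breaking and minimal-concession rules here are my simplifications of the slides' strategy):

```python
# deal number -> (utility to agent 0, utility to agent 1), from the table above
deals = {1: (0, 10), 2: (1, 3), 3: (2, 2), 4: (3, 1), 5: (10, 0)}

def risk(me, my_offer, their_offer):
    mine = deals[my_offer][me]
    theirs = deals[their_offer][me]
    return 1.0 if mine == 0 else (mine - theirs) / mine

def concede(me, current):
    # minimal concession: the next deal in decreasing order of own utility
    order = sorted(deals, key=lambda d: deals[d][me], reverse=True)
    i = order.index(current)
    return order[min(i + 1, len(order) - 1)]

offer = {0: 5, 1: 1}          # each agent opens with its best deal
while True:
    # agreement once either agent likes the other's offer at least as much
    if deals[offer[1]][0] >= deals[offer[0]][0] or \
       deals[offer[0]][1] >= deals[offer[1]][1]:
        break
    r0 = risk(0, offer[0], offer[1])
    r1 = risk(1, offer[1], offer[0])
    if r0 <= r1:              # smaller (or equal) risk concedes; equal => both
        offer[0] = concede(0, offer[0])
    if r1 <= r0:
        offer[1] = concede(1, offer[1])
print(offer)  # both converge on deal 3, the (2, 2) split
```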
63
What bothers you about the previous agreement?
• The agents decide to both get (2, 2) utility, rather than the (0, 10) outcome of another choice with a higher total
• Is there a better solution?
• Fair versus higher global utility
• Restrictions of this method (no promises about the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, in the sense that if one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD, coordinated plans are never worse, as you can just do your original task
• With an SOD, you may get in each other's way
• Don't accept partially completed plans
A state oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (they will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of a Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up another person)
– There might be a reachable state which satisfies both, but it may be too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents
• State 1: a cost of 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed Deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a mixed deal, then
costi(δ) = p·c(J)i + (1-p)·c(J)k
where k is i's opponent – the role i plays with probability (1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
(figure: distribution point with city a and city b each at distance 1)
Cost function:
c(∅) = 0
c({a}) = 1
c({b}) = 1
c({a,b}) = 3
Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with a probability
• ({a,b}, ∅):p means agent 1 does ∅ with probability p, and ab with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
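The algebra above generalizes to any two-outcome lottery. A sketch (solver name is mine; the payoffs are the slide's utilities for deal 3):

```python
from fractions import Fraction

def fair_p(u1_if_p, u1_if_not, u2_if_p, u2_if_not):
    # Solve p*u1_if_p + (1-p)*u1_if_not == p*u2_if_p + (1-p)*u2_if_not for p
    num = Fraction(u2_if_not - u1_if_not)
    den = Fraction((u1_if_p - u1_if_not) - (u2_if_p - u2_if_not))
    return num / den

# Agent 1 gets 1 with prob p (else -2); agent 2 gets 0 with prob p (else 3)
p = fair_p(1, -2, 0, 3)
print(p)  # 5/6
```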
74
Try again with another choice in the negotiation set
• ({a}, {b}):p means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this deal fair?
75
Mixed Deal
• All-or-nothing deal (one agent does everything): there is a mixed deal m = [(TA ∪ TB, ∅):p] such that Π(m) = max over deals d of Π(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
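The suggested exercise is quick to check in code: over all integer splits of a total utility of 10, the product peaks at the even division.

```python
# Dividing a fixed total utility of 10, the product u * (10 - u)
# is maximized at the even 5/5 split.
best = max(range(11), key=lambda u: u * (10 - u))
print(best, best * (10 - best))  # 5 50
```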
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2 (the goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2. A win – neither had to move the other block out of the way
• If done by one agent, the cost would be 4, so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black is at 2, and two gray blocks are at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one goal is 2; the cost for both averages 4
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve it? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the roles costing (2, 6) – A1 takes the cheap role – p of the time
• Utility for agent 1 = p(1) + (1-p)(-3) (it loses utility if it takes the cost-6 role for a benefit of 3)
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: Conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weighting on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone anyway). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: Semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example: Semi-cooperative, continued
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as it helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(-4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain: Definition
• Can be defined as a tuple <E, Ag, J, c>
• E: the set of possible environment states
• Ag: the set of possible agents
• J: the set of possible joint plans
• c: the cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation covers a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute:
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute:
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– Price: [5, 100], quality: [0, 10], support: [1, 5]
– Try to find the Pareto optimum
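The weighted-attribute scheme can be sketched as follows (the weights come from the slide; the normalized 0–1 scores for the two offers are invented for illustration, with higher meaning better, so "price 0.9" means very cheap):

```python
# Weights from the slide: price 60%, quality 15%, support 25%
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def weighted_utility(scores):
    # linear-additive utility over normalized attribute scores in [0, 1]
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

offer_a = {"price": 0.9, "quality": 0.5, "support": 0.4}  # cheap, mediocre
offer_b = {"price": 0.5, "quality": 0.9, "support": 0.8}  # pricier, better
print(weighted_utility(offer_a), weighted_utility(offer_b))
```

With price weighted at 60%, the cheap offer wins here; a different weighting would reverse the preference.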
89
Incomplete Information
• Agents don't know the tasks of others in a TOD
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information:
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost:
for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive (<):
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive with equality (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I go above a threshold and have to buy new equipment
91
Decoy Task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• "Need to pick something up at the store" (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• "Need to deliver an empty letter" (no good, but the deliverer won't discover the lie)
92
Incentive Compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of the Arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of the Previous Chart
• The arrows show the reasons we know each fact (diagonal arrows go between domains); each rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No – unless restricted to trees)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies sub-additivity
• Modularity implies concavity
99
Examples of task systems
Database Queries:
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain:
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of the characterization of relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (postmen return home)
FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Example: if A1 hides his letter to c, his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected utility: ({a,b,c}, ∅):1/2 = 5
• Under the lie, p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1): ({a,b,c}, ∅):1/2 = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
(figure: delivery graph with edge costs 1, 4, 4, 1)
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out)
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(We assume there is some way of deciding who gets the better deal that is fair over time.)

Agent 1  U(1)   Agent 2  U(2) apparent  U(2) actual
a        2      bc       4              4
b        4      ac       2              2
bc       2      a        4              2
ab       0      c        6              6
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 of ({a,b}, ∅):1/2 is 1/2(4) + 1/2(2) = 3
• Lie: ({b,c}, {a}) is the logical division, as there is no probability. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as the lie changes the probability: if you deliver, I make you deliver to h)
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
p(-1) + (1-p)(6) = p(4) + (1-p)(-3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as:
p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
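The two indifference probabilities above follow from the same linear equation; here is a sketch (payoff values as read off the slide, helper names mine):

```python
from fractions import Fraction

def indifference_p(u1, v1, u2, v2):
    # Solve p*u1 + (1-p)*v1 == p*u2 + (1-p)*v2 for p
    return Fraction(v1 - v2, (u2 - u1) + (v1 - v2))

p_truth = indifference_p(-1, 6, 4, -3)   # 9/14
p_lie = indifference_p(-3, 6, 4, -5)     # 11/18 (apparent payoffs under the decoy)

def true_utility(p):
    # agent 1's real payoffs are -1 (deliver all) and 6 (deliver nothing)
    return p * Fraction(-1) + (1 - p) * 6

print(p_truth, p_lie)                              # 9/14 11/18
print(true_utility(p_truth), true_utility(p_lie))  # 3/2 31/18
```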
107
Postmen – return to the post office
(figures: the concave-domain example graph with the phantom letter, and the subadditive-domain example graph where h is the decoy)
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, "hide" lies can be beneficial (as you think I have fewer tasks, so an increased load seems to cost more than it really does)
• Example 3 (from the next slide): A1 hides his letter to node b
• ({e}, {b}): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie)
• ({b}, {e}): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial
• Example 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under the truth: Util of ({f,a,e}, {b,c,d}):1/2 = 4 (each saves going to two nodes)
• Under the lie, dividing as ({e,f}, {d,c,a,b}):p means you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie, with ({a,b,c}, {d,e,f}):p:
p(4) + (1-p)(0) = p(2) + (1-p)(6)
4p = -4p + 6
p = 3/4
• The utility is actually:
3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned {c,d,e,f} 1/4 of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that
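The FP8 arithmetic in exact fractions (apparent utilities as read off the slide: agent 1 gets 4 with probability p, else 0; agent 2 gets 2 with probability p, else 6):

```python
from fractions import Fraction

# Fairness under the lie: 4p + 0(1-p) = 2p + 6(1-p)  =>  8p = 6
p = Fraction(6 - 0, (4 - 2) + (6 - 0))

# Agent 1's true expected utility: 6 when it wins the lottery, 0 otherwise
actual = p * 6 + (1 - p) * 0
print(p, actual)  # 3/4 9/2  (i.e., 4.5 > 4, so hiding helped)
```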
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: a negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or to create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused: reduces computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive Argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases payoff of the rejecting party by a greater amount than it reduces payoff for the agreed parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to
meet later in the day, while the second wishes to meet earlier in the
day. Both prefer today to tomorrow. While the first agent assigns the
highest worth to a meeting at 1600 hrs, she also assigns
progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent
can accept a lower worth, which may have other payoffs (e.g.
reduced travel costs).
Worth function for first agent:
[Figure: worth rises from 0 at 900 hrs to 100 at 1600 hrs, with 1200 hrs in between]
Ref: Rosenschein & Zlotkin, 1994
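The sketched worth function can be written out directly. This is a minimal sketch that assumes the curve is linear between the least preferred time (900 hrs, worth 0) and the preferred time (1600 hrs, worth 100); the slide's figure only fixes the endpoints, so the linear shape is an assumption.

```python
def worth(hour, best=16, earliest=9, max_worth=100):
    """Worth the first agent assigns to a meeting at `hour`.

    Assumed linear: worth 0 at 9:00 rising to 100 at the preferred
    time of 16:00, matching the endpoints sketched on the slide.
    """
    if not earliest <= hour <= best:
        return 0
    return max_worth * (hour - earliest) / (best - earliest)

print(worth(16))    # 100.0 at the preferred time
print(worth(12.5))  # 50.0 halfway through the window
print(worth(9))     # 0.0 at the least preferred time
```

Conceding from 1600 to 1500 costs the agent `worth(16) - worth(15)` units of worth, which it may trade against other payoffs such as reduced travel costs.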
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utilities of Agent i and Agent j plotted against the number of negotiation rounds; the curves converge to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement
• Agent j finds the offer unacceptable
[Figure: utilities of Agent i and Agent j plotted against the number of negotiation rounds; the curves never meet, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified. Why did the agent pay so much for the car?
– Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed preference during the buying process.
125
• 4 modes of argument (Gilbert 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive – "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts):
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A attacks B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
129
Proposition: Hierarchy of attacks
Undercuts = →u
Strongly undercuts = su = →u − →u⁻¹
Strongly attacks = sa = (→u ∪ →r) − →u⁻¹
Defeats = d = →u ∪ (→r − →u⁻¹)
Attacks = a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned as to what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption: an argument holds unless proven otherwise
131
Admissible Arguments – mutually defensible
1. An argument x is attacked by a set of arguments if some member y of the set attacks x (y → x)
2. An argument x is acceptable with respect to a set if every attacker of x is attacked by some member of the set
3. An argument set is conflict free if none of its members attack each other
4. A set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked;
d is always acceptable.
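The definitions above can be checked mechanically. The slide's attack graph was lost in extraction, so the graph below is an assumed reconstruction consistent with its claims: a and b attack each other, both attack c, and d is unattacked. Under that assumption c appears in no admissible set while d appears freely.

```python
from itertools import combinations

# Assumed attack graph (the slide's figure did not survive extraction):
# a <-> b, a -> c, b -> c, d unattacked.
arguments = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}

def conflict_free(s):
    # no member of s attacks another member of s
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(arg, s):
    # every attacker of arg is itself attacked by some member of s
    attackers = {x for (x, y) in attacks if y == arg}
    return all(any((z, x) in attacks for z in s) for x in attackers)

def admissible(s):
    return conflict_free(s) and all(acceptable(arg, s) for arg in s)

subsets = [set(c) for r in range(len(arguments) + 1)
           for c in combinations(sorted(arguments), r)]
adm = [s for s in subsets if admissible(s)]
print([sorted(s) for s in adm])
# c is in no admissible set; d (having no attackers) is always acceptable
```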
133
An Example Abstract Argument System
[Figure: example abstract argument system]
6
One voter ranks c > d > b > a
One voter ranks a > c > d > b
One voter ranks b > a > c > d
(notice it just rotates the preferences)
winner(c, (winner(a, winner(b, d)))) = a
winner(d, (winner(b, winner(c, a)))) = d
winner(d, (winner(c, winner(a, b)))) = c
winner(b, (winner(d, winner(c, a)))) = b
Surprisingly, the order of pairing yields different winners.
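The four pairings above can be replayed directly. A minimal sketch of the binary (pairwise majority) protocol for these three voters:

```python
# Three voters whose rankings rotate; in the binary protocol the
# winner of each pairwise majority vote advances to the next pairing.
rankings = [
    ["c", "d", "b", "a"],
    ["a", "c", "d", "b"],
    ["b", "a", "c", "d"],
]

def winner(x, y):
    """Majority vote between two candidates; earlier in list = preferred."""
    votes_x = sum(1 for r in rankings if r.index(x) < r.index(y))
    return x if votes_x * 2 > len(rankings) else y

print(winner("c", winner("a", winner("b", "d"))))  # a
print(winner("d", winner("b", winner("c", "a"))))  # d
print(winner("d", winner("c", winner("a", "b"))))  # c
print(winner("b", winner("d", winner("c", "a"))))  # b
```

Every candidate can be made the social choice by picking the right agenda, which is exactly the slide's point.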
7
Borda protocol (used if the binary protocol is too slow) = assigns an alternative |O| points for the highest preference, |O|−1 points for the second, and so on
The counts are summed across the voters, and the alternative with the highest count becomes the social choice
Winner turns loser and loser turns winner if the lowest ranked alternative is removed (does this surprise you?) See the table on the next slide
8
Borda Paradox – remove loser, winner changes (notice c is always ahead of the removed item)
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
Borda counts: a=18, b=19, c=20, d=13, so c wins and d is last
With loser d removed:
• a > b > c, b > c > a, c > a > b, a > b > c, b > c > a, c > a > b, a > b > c
Borda counts: a=15, b=14, c=13
When the loser is removed, the next loser becomes the winner: the whole ranking reverses.
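The tallies on this slide can be reproduced with a short Borda counter (using the |O|-down-to-1 point scheme the previous slide defines):

```python
from collections import defaultdict

def borda(rankings):
    """|O| points to the top choice, |O|-1 to the next, and so on."""
    scores = defaultdict(int)
    n = len(rankings[0])
    for ranking in rankings:
        for pos, cand in enumerate(ranking):
            scores[cand] += n - pos
    return dict(scores)

# The seven voters from the slide.
rankings = [list("abcd"), list("bcda"), list("cdab")] * 2 + [list("abcd")]
print(borda(rankings))
# {'a': 18, 'b': 19, 'c': 20, 'd': 13} -- c wins, d comes last

# Remove the loser d and re-run: the entire order flips.
without_d = [[c for c in r if c != "d"] for r in rankings]
print(borda(without_d))
# {'a': 15, 'b': 14, 'c': 13} -- now a wins and c comes last
```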
9
Strategic (insincere) voters
• Suppose your choice will likely come in second place. If you rank the first choice of the rest of the group very low, you may lower that choice enough so that yours comes first.
• True story: Dean's selection. Each committee member was told they had 5 points to award and could spread them out any way among the candidates. The recipient of the most points wins. I put all my points on one candidate; most members split their points. I swung the vote. What was my gamble?
• We want to get the results as if truthful voting were done.
10
Typical Competition Mechanisms
• Auction: allocate goods or tasks to agents through a market. Need a richer technique for reaching agreements.
• Negotiation: reach agreements through interaction.
• Argumentation: resolve conflicts through debate.
11
Negotiation
• May involve:
– Exchange of information
– Relaxation of initial goals
– Mutual concession
12
Mechanisms, Protocols, Strategies
• Negotiation is governed by a mechanism or a protocol
– defines the "rules of encounter" between the agents
– the public rules by which the agents will come to agreements
• Given a particular protocol, how can a particular strategy be designed that individual agents can use?
13
Negotiation Mechanism
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Issues in the negotiation process:
• Negotiation Space: all possible deals that agents can make, i.e. the set of candidate deals
• Negotiation Protocol: a rule that determines the process of a negotiation – how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on
• Negotiation Strategy: when and what proposals should be made
14
Protocol
• Means the kinds of deals that can be made
• Means the sequence of offers and counter-offers
• A protocol is like the rules of a chess game, whereas a strategy is the way in which a player decides which move to make
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
– Convergence/guaranteed success
– Maximising global welfare: the sum of agent benefits is maximized
– Pareto efficiency
– Individual rationality
– Stability: no agent should have an incentive to deviate from its strategy
– Simplicity: low computational demands, little communication
– Distribution: no central decision maker
– Symmetry: agents should not play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs of choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Figure: Agent i and Agent j exchange a proposal and a counter-proposal; Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Figure: proposals by Ai and proposals by Aj converge toward a point of acceptance/agreement]
22
Many types of interactive concession-based methods
• Some use multiple objective linear programming –
– requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value; continue until no joint improvements are possible
– Used in Camp David peace negotiations (Egypt/Israel – Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes:
• Single attribute (price) – symmetric scenario (both benefit in the same way by a cheaper price)
• Multiple attributes – several inter-related attributes, e.g. buying a car
– The number of agents and the way they interact:
• One-to-one, e.g. a single buyer and a single seller
• Many-to-one, e.g. multiple buyers and a single seller (auctions)
• Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
– If one task requires less travel, both would benefit equally by having less travel
– utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
– if you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, bigger office, work from home)
28
How many agents are involved?
• One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
– n(n−1)/2 number of pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on the change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, the agent is worse off than performing its tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
– utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple <T, Ag, c> where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, ..., An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery:
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries:
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b}) — the conflict deal (the original assignment (T1, T2))
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
35
Figure out the deals knowing the union must be {a, b}
• Choices for the first agent: ∅, {a}, {b}, {a,b}
• The second agent must "pick up the slack":
• {a} for agent 1 → {b} | {a,b} for agent 2
• {b} for agent 1 → {a} | {a,b}
• {a,b} for agent 1 → ∅ | {a} | {b} | {a,b}
• ∅ for agent 1 → {a,b}
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
[Figure: distribution point one unit from city a and one unit from city b; a and b are two units apart]
Cost function:
c(∅) = 0
c(a) = 1
c(b) = 1
c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
...
Utility for agent 2 (originally assigned a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
...
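The utility entries above follow mechanically from the cost function and the formula Utility_k(δ) = c(T_k) − c(D_k). A minimal sketch:

```python
# Cost function from the slide; T1 = {a}, T2 = {a, b} are the
# agents' original assignments.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 3}
T1, T2 = frozenset("a"), frozenset("ab")

def utility(agent_tasks, agent_share):
    """Utility_k(delta) = c(T_k) - c(D_k)."""
    return cost[agent_tasks] - cost[agent_share]

# Deal 4 on the slide: (∅, {a,b}) -- agent 2 delivers everything.
print(utility(T1, frozenset()))      # Utility1 = 1
print(utility(T2, frozenset("ab")))  # Utility2 = 0

# Deal 3: ({a,b}, ∅) -- agent 1 delivers everything.
print(utility(T1, frozenset("ab")))  # Utility1 = -2
print(utility(T2, frozenset()))      # Utility2 = 3
```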
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1,2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from the previous slide)
Utility for agent 1:
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
5. Utility1({a}, {a,b}) = 0
6. Utility1({b}, {a,b}) = 0
7. Utility1({a,b}, {a}) = −2
8. Utility1({a,b}, {b}) = −2
9. Utility1({a,b}, {a,b}) = −2
Utility for agent 2:
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
5. Utility2({a}, {a,b}) = 0
6. Utility2({b}, {a,b}) = 0
7. Utility2({a,b}, {a}) = 2
8. Utility2({a,b}, {b}) = 2
9. Utility2({a,b}, {a,b}) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
1. ({a}, {b}) — individual rational
2. ({b}, {a}) — individual rational
3. ({a,b}, ∅)
4. (∅, {a,b}) — individual rational
5. ({a}, {a,b}) — individual rational
6. ({b}, {a,b}) — individual rational
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
42
Pareto Optimal Deals
1. ({a}, {b}) — Pareto optimal
2. ({b}, {a}) — Pareto optimal
3. ({a,b}, ∅) — Pareto optimal
4. (∅, {a,b}) — Pareto optimal
5. ({a}, {a,b})
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
Deal 3 gives utilities (−2, 3): bad for agent 1, but nothing beats 3 for agent 2, so no deal dominates it.
43
Negotiation Set
Individual rational deals: ({a},{b}), ({b},{a}), (∅,{a,b}), ({a},{a,b}), ({b},{a,b})
Pareto optimal deals: ({a},{b}), ({b},{a}), ({a,b},∅), (∅,{a,b})
Negotiation set (intersection of the two): ({a},{b}), ({b},{a}), (∅,{a,b})
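The filtering shown on the last three slides can be reproduced programmatically: enumerate all pure deals, then keep those that are both individually rational and Pareto optimal.

```python
from itertools import combinations

cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 3}
T1, T2 = frozenset("a"), frozenset("ab")
tasks = T1 | T2

subs = [frozenset(c) for r in range(3) for c in combinations("ab", r)]
deals = [(d1, d2) for d1 in subs for d2 in subs if d1 | d2 == tasks]

def utils(deal):
    d1, d2 = deal
    return (cost[T1] - cost[d1], cost[T2] - cost[d2])

def dominates(x, y):
    # at least as good for both, strictly better for someone
    ux, uy = utils(x), utils(y)
    return all(a >= b for a, b in zip(ux, uy)) and ux != uy

individual_rational = [d for d in deals if all(u >= 0 for u in utils(d))]
pareto_optimal = [d for d in deals
                  if not any(dominates(e, d) for e in deals)]
negotiation_set = [d for d in individual_rational if d in pareto_optimal]
print([(sorted(d1), sorted(d2)) for d1, d2 in negotiation_set])
# three deals survive: ({a},{b}), ({b},{a}) and (∅,{a,b})
```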
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j
• Only the deals where both utilities are non-negative are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?
[Figure: scatter plot with axes "utility for i" and "utility for j"]
45
Negotiation Set in Task-oriented Domains
[Figure: space of all possible deals plotted as utility for agent i vs utility for agent j; a circle delimits the space of all possible deals; the conflict deal sits at the point given by each agent's utility of the conflict deal; the negotiation set (Pareto optimal + individual rational) is the boundary arc above and to the right of the conflict deal]
46
Negotiation Protocol
Let π(δ) be the product of the two agents' utilities from δ.
• Product-maximizing negotiation protocol: one-step protocol
• Concession protocol:
– At each step t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that both deals are from the negotiation set, and for each agent i and t > 0, Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: for both agents, Utilityi(δ(i,t)) = Utilityi(δ(i,t−1)) (neither concedes)
– Agreement: for j ≠ i, Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t)) (j likes i's offer at least as much as its own)
• Only A agrees => agree on δ(B,t) (A accepts B's proposal)
• Only B agrees => agree on δ(A,t) (B accepts A's proposal)
• Both A and B agree => agree on the δ(k,t) such that π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
• Both agree and π(δ(A,t)) = π(δ(B,t)) => flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
(stated here for pure deals; the protocol can likewise be run over mixed deals)
47
The Monotonic Concession Protocol – one direction, move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one later)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (Assumes we know what the other agent values.)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: explanation or critique of a deal
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal each made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (It is a way of telling others what you value.)
[Figure: agent 1's best deal at one end of the spectrum, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (though it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: spectrum between agent 1's best deal and agent 2's best deal; each agent asks "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict is measured by comparing its loss from making a concession against its loss from causing a conflict, relative to its current offer.
• If both agents are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)

where δi and δj are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk it).
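The risk formula is a one-liner; the only subtlety is the convention for an offer worth 0 to its proposer (such an agent has nothing to lose and is taken as fully willing to risk conflict):

```python
def risk(u_own_offer, u_their_offer):
    """Zeuthen risk: risk_i = (U_i(d_i) - U_i(d_j)) / U_i(d_i).

    An agent whose current offer is worth 0 to itself has nothing
    to lose, so it is treated as fully willing to risk conflict.
    """
    if u_own_offer == 0:
        return 1.0
    return (u_own_offer - u_their_offer) / u_own_offer

# Opening offers from the earlier parcel example: agent 1 offers a deal
# worth 1 to itself and 0 to agent 2; agent 2 offers a deal worth 2 to
# itself and 0 to agent 1.
print(risk(1, 0))  # agent 1's risk = 1.0
print(risk(2, 0))  # agent 2's risk = 1.0 -- both fully willing to risk
```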
55
Risk Evaluation
• risk measures the fraction of your current utility you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: spectrum from Ai's best deal to Aj's best deal, with the conflict deal below; marks "maximum to gain from agreement", "maximum still hope to gain", and "how much am I willing to risk a conflict?"]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: ({a},{b}), ({b},{a}), (∅,{a,b})
Utility of agent 1:
Utility1({a}, {b}) = 0
Utility1({b}, {a}) = 0
Utility1(∅, {a,b}) = 1
Utility of agent 2:
Utility2({a}, {b}) = 2
Utility2({b}, {a}) = 2
Utility2(∅, {a,b}) = 0
First offers: agent 1 proposes (∅, {a,b}); agent 2 proposes ({a}, {b})
Risk of conflict: 1 for agent 1, 1 for agent 2
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal at opposite ends; each agent thinks "he should concede"]
Zeuthen does not reach a settlement: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain Example 2 (don't return to the distribution point)
[Figure: distribution point with cities a, b, c, d; a and d lie at distance 7, with b and c at unit distances between them]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: ({a,b,c,d}, ∅), ({a,b,c}, {d}), ({a,b}, {c,d}), ({a}, {b,c,d}), (∅, {a,b,c,d})
Conflict deal: ({a,b,c,d}, {a,b,c,d})
All choices are individually rational, as neither agent can do worse than the conflict deal. ({a,c}, {b,d}) is dominated by ({a,b}, {c,d}).
62
Parcel Delivery Domain Example 2 (Zeuthen works here: both concede on equal risk)

No. | Pure Deal          | Agent 1's Utility | Agent 2's Utility
1   | ({a,b,c,d}, ∅)     | 0                 | 10
2   | ({a,b,c}, {d})     | 1                 | 3
3   | ({a,b}, {c,d})     | 2                 | 2
4   | ({a}, {b,c,d})     | 3                 | 1
5   | (∅, {a,b,c,d})     | 10                | 0
    | Conflict deal      | 0                 | 0

Agent 1 starts at deal 5 and agent 2 at deal 1; at equal risk both concede (5 → 4 → 3 and 1 → 2 → 3).
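The run sketched above can be simulated. The concession rule below (among deals strictly better for the opponent, keep as much of your own utility as possible) is one simple way to make "a sufficient concession" concrete; the slides do not pin down the exact rule, so treat it as an assumption.

```python
# Utilities from the table above, as (agent 1, agent 2) pairs.
deals = {"abcd|-": (0, 10), "abc|d": (1, 3), "ab|cd": (2, 2),
         "a|bcd": (3, 1), "-|abcd": (10, 0)}

def risk(agent, own, other):
    """Zeuthen risk given the agent's own offer and the opponent's."""
    u_own, u_other = deals[own][agent], deals[other][agent]
    return 1.0 if u_own == 0 else (u_own - u_other) / u_own

def concede(agent, own):
    # Minimal sufficient concession (an assumed heuristic): among deals
    # strictly better for the opponent, keep maximal own utility.
    opp = 1 - agent
    better = [d for d in deals if deals[d][opp] > deals[own][opp]]
    return max(better, key=lambda d: deals[d][agent])

offers = ["-|abcd", "abcd|-"]  # each agent opens with its own best deal
while True:
    if deals[offers[1]][0] >= deals[offers[0]][0]:  # agent 1 accepts
        agreement = offers[1]
        break
    if deals[offers[0]][1] >= deals[offers[1]][1]:  # agent 2 accepts
        agreement = offers[0]
        break
    r1 = risk(0, offers[0], offers[1])
    r2 = risk(1, offers[1], offers[0])
    if r1 <= r2:                                    # equal risk: both move
        offers[0] = concede(0, offers[0])
    if r2 <= r1:
        offers[1] = concede(1, offers[1])

print(agreement, deals[agreement])  # ab|cd (2, 2)
```

At every round the two risks are equal, so both agents concede, meeting at the middle deal ({a,b}, {c,d}) with utilities (2, 2), exactly as the slide describes.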
63
What bothers you about the previous agreement?
• They both settle for (2, 2) utility rather than a deal like (0, 10), whose total utility is higher
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future or sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example: slotted blocks world – blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk
• C(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (they will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states it would be happy with.
• We look for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car).
– A common state can be found, but perhaps it cannot be reached with the primitive operations of the domain (both could travel together, but one may need to know how to pick up the other).
– There might be a reachable state that satisfies both, but it is too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit by car-pooling, but it is too complicated for so little gain).
69
What if choices don't benefit the others fairly?
• Suppose there are two states that satisfy both agents.
• State 1: costs 6 for one agent and 2 for the other.
• State 2: costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that one agent will get a certain plan.
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability 1−p.
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
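The cost and utility definitions above can be sketched in a few lines of Python (a minimal illustration; the function names are mine, not from the slides):

```python
def expected_cost(cost_role_i, cost_role_k, p):
    """cost_i(delta) = p*c(J)_i + (1-p)*c(J)_k for a mixed deal delta = (J, p):
    agent i plays its own role with probability p and the opponent's role
    with probability 1 - p."""
    return p * cost_role_i + (1 - p) * cost_role_k

def utility(stand_alone_cost, cost_role_i, cost_role_k, p):
    """Utility = cost of achieving the goal alone minus expected cost under the deal."""
    return stand_alone_cost - expected_cost(cost_role_i, cost_role_k, p)
```

For instance, with a stand-alone cost of 5 and roles costing 6 and 2, a fifty-fifty lottery gives an expected cost of 4 and hence a utility of 1.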
72
Parcel Delivery Domain (assuming do not have to return home)
[Figure: a distribution point connected to city a and city b, each at distance 1; agent 1 must deliver to a, agent 2 must deliver to a and b]

Cost function: c(∅) = 0, c({a}) = 1, c({b}) = 1, c({a,b}) = 3

Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (original tasks: a, b):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (∅, ab):p means agent 1 does nothing with probability p and delivers both with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⟹ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair.
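The calculation above is just a linear equation in p; a small sketch solves it in exact arithmetic (the function name and encoding are mine):

```python
from fractions import Fraction

def fair_p(u1_hi, u1_lo, u2_hi, u2_lo):
    """Probability p equalizing expected utilities:
    solves p*u1_hi + (1-p)*u1_lo == p*u2_hi + (1-p)*u2_lo."""
    # Rearranged: p * ((u1_hi - u1_lo) - (u2_hi - u2_lo)) = u2_lo - u1_lo
    return Fraction(u2_lo - u1_lo, (u1_hi - u1_lo) - (u2_hi - u2_lo))

# Slide's numbers: agent 1 gets 1 with probability p and -2 otherwise;
# agent 2 gets 0 with probability p and 3 otherwise.
p = fair_p(1, -2, 0, 3)
```

When both lottery outcomes give each agent the same utility (the case on the next slide), the denominator is zero and no equalizing p exists.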
74
Try again with other choice in negotiation set
• (a, b):p means agent 1 delivers a with probability p and b with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this deal fair?
75
Mixed deal
• An all-or-nothing deal (one agent does everything): a mixed deal of the form m = [(T_A ∪ T_B, ∅) : p], chosen so that NS(m) = max NS(d).
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
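A quick way to see why the product rule favors the fairer division (a toy sketch, not from the slides): split a fixed total of 10 utility units and compare products.

```python
def best_split(total=10):
    """Return the integer split (x, total - x) maximizing the product x * (total - x)."""
    return max(((x, total - x) for x in range(total + 1)),
               key=lambda s: s[0] * s[1])
```

The 5/5 split gives product 25, beating 6×4 = 24 and every other division, which is why maximizing the product of utilities picks the more equal outcome.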
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at slot 1, the black block at slot 2, and two gray blocks at slot 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one agent alone is two; achieving both averages four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other agent's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make the deal "fair."
• With probability p, agent 1 takes the cheap role (cost 2) and agent 2 the expensive role (cost 6).
• Utility for agent 1 = p(1) + (1−p)(−3)  (it loses utility if it spends 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone anyway). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 − 4 − 10 = 12.
• Losing agent utility: −4 (as he helped with the swap).
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is only 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing," but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation over: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle): success isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate utility?
• Weighting each attribute:
– Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute:
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– Price: [5, 100], quality: [0, 10], support: [1, 5]
– Try to find the Pareto optimum.
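The weighted-attribute scheme above can be sketched directly; the two offers and their normalized scores below are made up for illustration (scores are assumed scaled so that higher is better, e.g. a low price maps to a high price score):

```python
def weighted_utility(scores, weights):
    """Linear weighted utility: sum of attribute score times attribute weight."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(scores[attr] * w for attr, w in weights.items())

# Weights from the slide: price 60%, quality 15%, support 25%.
weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer_a = {"price": 0.5, "quality": 0.8, "support": 0.6}
offer_b = {"price": 0.7, "quality": 0.4, "support": 0.4}
```

Offer B wins on overall utility (0.58 vs. 0.57) even though A dominates on quality and support, because price carries 60% of the weight.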
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
– Exchange the missing information.
– Penalize lies.
• Possible lies:
– False information:
  • Hiding letters
  • Phantom letters
– Not carrying out a commitment.
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of (strictly) subadditive: delivering to one destination saves distance toward the other (in a tree arrangement).
• Example of subadditive TOD with equality (= rather than <): deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
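Subadditivity is easy to check exhaustively for a small task set. A sketch (the two toy cost functions are mine: the first charges a fixed trip plus one unit per city, the second blows up past a threshold, like the electricity example):

```python
from itertools import combinations

def is_subadditive(tasks, cost):
    """Check c(X | Y) <= c(X) + c(Y) for every pair of subsets X, Y."""
    subsets = [frozenset(s) for r in range(len(tasks) + 1)
               for s in combinations(tasks, r)]
    return all(cost(x | y) <= cost(x) + cost(y)
               for x in subsets for y in subsets)

trip = lambda s: 0 if not s else 1 + len(s)   # fixed trip cost + per-city cost
quad = lambda s: len(s) ** 2                  # threshold-style blow-up
```

The shared fixed trip cost makes `trip` subadditive, while `quad` fails already on two singletons: the union costs 4 but the parts cost 1 + 1.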
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• "I need to pick something up at the store." (You can think of something for them to pick up; but if you are the one assigned, you won't bother to make the trip.)
• "I need to deliver an empty letter." (No good to anyone, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• Take two task sets X and Y, where X is a subset of Y, and introduce another task set Z:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the beginning of a rule is a fixed point.
• For example: what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on whether it concerns truth (never helps) or a lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is in turn a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others.
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No – unless restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database queries:
• Agents have access to a common database, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The fax domain:
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the fax domain is modular (as connection costs are independent).
• Modularity implies concavity.
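Modularity can be checked the same exhaustive way as subadditivity. In a fax-domain-style cost function the cost is just the number of distinct connections, which is modular; a parcel-style cost with a shared fixed trip is not (both toy cost functions are my own assumptions):

```python
from itertools import combinations

def is_modular(tasks, cost):
    """Check c(X | Y) == c(X) + c(Y) - c(X & Y) for every pair of subsets X, Y."""
    subsets = [frozenset(s) for r in range(len(tasks) + 1)
               for s in combinations(tasks, r)]
    return all(cost(x | y) == cost(x) + cost(y) - cost(x & y)
               for x in subsets for y in subsets)

fax = lambda s: len(s)                        # one independent connection per destination
trip = lambda s: 0 if not s else 1 + len(s)   # shared fixed trip cost breaks modularity
```

The shared fixed cost is why `trip` fails: c({a} ∪ {b}) = 3, but c({a}) + c({b}) − c(∅) = 4.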
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth, p = 1/2.
• Expected utility of (abc):1/2 = 5.
• If he lies, p = 1/2 (as the apparent utility is the same).
• Expected utility (for agent 1) of (abc):1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter).
[Figure: delivery graph for the FP1 example, with edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out).
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular domains tend to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the apparent benefits are shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(We assume we have some way of deciding who gets the better deal that is fair over time.)

Agent 1's role | U(1) | Agent 2's role | U(2) apparent | U(2) actual
a              | 2    | bc             | 4             | 4
b              | 4    | ac             | 2             | 2
bc             | 2    | a              | 4             | 2
ab             | 0    | c              | 6             | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is (a, b):1/2 = ½(4) + ½(2) = 3.
• Lie: (bc, a) is the logical division, so no probability is needed.
• Utility for agent 1 is 6 (original cost) − 2 (cost under the deal) = 4.
106
• FP6: in a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability. (If you deliver, I make you "deliver" to h.)
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72. (If I deliver, I don't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as:
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⟹ 14p = 9
• If he invents task h, p = 11/18, as:
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
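The probabilities above just equalize two linear expected-utility expressions; a small sketch reproduces the slide's arithmetic exactly (the solver function is mine):

```python
from fractions import Fraction

def equalizing_p(a1, b1, a2, b2):
    """Solve p*a1 + (1-p)*b1 == p*a2 + (1-p)*b2 for p."""
    return Fraction(b2 - b1, (a1 - b1) - (a2 - b2))

p_truth = equalizing_p(-1, 6, 4, -3)   # truthful encounter
p_lie = equalizing_p(-3, 6, 4, -5)     # with the decoy letter to h

# Agent 1's TRUE expected utility (payoffs -1 and 6) at each probability:
u_truth = p_truth * -1 + (1 - p_truth) * 6
u_lie = p_lie * -1 + (1 - p_lie) * 6
```

The decoy shifts p from 9/14 to 11/18, and agent 1's true expected utility rises from 21/14 = 1.5 to 31/18 ≈ 1.72 – the benefit of the lie.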
107
Postmen – return to post office
[Figures: delivery graphs for the concave case, the subadditive case (h is the decoy), and the phantom case]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (as you think I have fewer tasks, an increased load appears to cost me more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under the truth: Util(fae, bcd):1/2 = 4 (each saves going to two nodes).
• Under the lie, dividing as (efd, cab):p won't work: one agent always wins and the other always loses, and since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie: (abc, def):p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually:
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned the other role (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places, which is what I was assigned originally: zero utility for that.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides for the present negotiation (as in a court of law – citing previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory.
– Select the most appropriate case.
– Construct an appropriate solution.
– Evaluate the solution for applicability to the current case.
– Modify the solution appropriately.
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange.
– Avoids problems by reasoning from past failures (intentional reminding).
– Repairs used for past failures are reused, reducing computation.
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Derives an overall utility curve from the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff.
– Minimizes payoff difference.
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument.
• Increasing payoff:
– Change the importance attached to an issue.
– Change the utility value of an issue.
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory: 30% more proposals.
• Without argumentation: fewer proposals and better solutions.
• No failure avoidance: more proposals with objections.
• No preference analysis: oscillatory behavior.
• No feedback: communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).

[Figure: worth function for the first agent – worth rises from 0 at 9:00 through 12:00 to 100 at 16:00]

Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.

[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's utility curves converge over time to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.

[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's utility curves never cross]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept that B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings.
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where
– Database is a (possibly inconsistent) set of logical formulae;
– Sentence is a logical formula known as the conclusion;
– Grounds is a set of logical formulae such that:
  1. Grounds ⊆ Database, and
  2. Sentence can be proved from Grounds.
(We give reasons for our conclusions.)
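A minimal sketch of this definition (my own encoding, not from the slides: the database holds atomic facts plus Horn rules, and "proved from" is implemented as forward chaining):

```python
def closure(facts, rules):
    """All atoms derivable from `facts` using Horn rules (premises, conclusion)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

def is_argument(database, rules, sentence, grounds):
    """(sentence, grounds) is an argument iff the grounds are drawn from the
    database and the sentence is provable from the grounds alone."""
    return grounds <= database and sentence in closure(grounds, rules)

# The cheese example from the next slide, as atoms and one rule:
db = {"milk_good", "cheese_from_milk"}
rules = [({"milk_good", "cheese_from_milk"}, "cheese_good")]
```

Here `is_argument(db, rules, "cheese_good", {"milk_good", "cheese_from_milk"})` holds, but dropping either premise from the grounds breaks the proof.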
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B: A undercuts B, or A rebuts B.
– A defeats B: A undercuts B, or (A rebuts B and B does not undercut A).
– A strongly attacks B: A attacks B and B does not undercut A.
– A strongly undercuts B: A undercuts B and B does not undercut A.
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption – an argument stands unless proven otherwise.
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x).
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
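These definitions can be checked by brute force on a small attack graph. A sketch over a hypothetical chain a → b → c (my own example, not the figure from the slides):

```python
from itertools import combinations

def admissible_sets(arguments, attacks):
    """All admissible sets: conflict-free sets whose members are all acceptable,
    i.e. every attacker of a member is itself attacked by the set."""
    result = []
    for r in range(len(arguments) + 1):
        for combo in combinations(arguments, r):
            s = set(combo)
            conflict_free = not any((x, y) in attacks for x in s for y in s)
            defended = all(any((z, y) in attacks for z in s)
                           for (y, x) in attacks if x in s)
            if conflict_free and defended:
                result.append(frozenset(s))
    return result

sets_ = admissible_sets(["a", "b", "c"], {("a", "b"), ("b", "c")})
```

On this chain, a is unattacked so it is always acceptable; c gets in only alongside its defender a; b can never be defended, matching the in/out intuition above.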
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be accepted? In this graph c is always attacked, and d is always acceptable.
133
An Example Abstract Argument System
7
Borda protocol (used if the binary protocol is too slow): assigns an alternative |O| points for the highest preference, |O|−1 points for the second, and so on.
The counts are summed across the voters, and the alternative with the highest count becomes the social choice.
The winner can turn loser, and a loser turn winner, if the lowest-ranked alternative is removed (does this surprise you?). See the table on the next slide.
7
8
Borda Paradox – remove the loser and the winner changes (notice c is always ahead of the removed item)
With all four alternatives (7 voters):
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
Counts: a = 18, b = 19, c = 20, d = 13 ⟹ c wins
With d (the loser) removed:
• a > b > c
• b > c > a
• c > a > b
• a > b > c
• b > c > a
• c > a > b
• a > b > c
Counts: a = 15, b = 14, c = 13 ⟹ a wins
When the loser is removed, the social ranking reverses: the former winner becomes the loser.
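The reversal is easy to reproduce. A sketch using the seven-voter profile above, with the top choice worth |O| points down to 1:

```python
def borda(profile):
    """profile: list of rankings, best first. Returns total Borda scores."""
    m = len(profile[0])
    scores = {c: 0 for c in profile[0]}
    for ranking in profile:
        for pos, cand in enumerate(ranking):
            scores[cand] += m - pos   # |O| points for 1st, |O|-1 for 2nd, ...
    return scores

profile = (3 * [["a", "b", "c", "d"]]
           + 2 * [["b", "c", "d", "a"]]
           + 2 * [["c", "d", "a", "b"]])
full = borda(profile)
reduced = borda([[c for c in r if c != "d"] for r in profile])
```

With all four alternatives c wins (20 points); dropping last-place d flips the outcome completely, and a wins the three-way count.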
9
Strategic (insincere) voters
• Suppose your choice will likely come in second place. If you rank the likely first choice of the rest of the group very low, you may lower that choice enough that yours comes first.
• True story: a dean's selection. Each committee member was told they had 5 points to award, to spread out any way among the candidates; the recipient of the most points wins. I put all my points on one candidate; most members split their points. I swung the vote. What was my gamble?
• We want mechanisms whose results match those of truthful voting.
10
Typical Competition Mechanisms
• Auction: allocate goods or tasks to agents through a market. We need a richer technique for reaching agreements.
• Negotiation: reach agreements through interaction.
• Argumentation: resolve conflicts through debate.
11
Negotiation
• May involve:
– Exchange of information
– Relaxation of initial goals
– Mutual concession
12
Mechanisms, Protocols, Strategies
• Negotiation is governed by a mechanism or protocol:
– it defines the "rules of encounter" between the agents;
– the public rules by which the agents will come to agreements.
• Given a particular protocol, how can a strategy be designed that individual agents can use?
13
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Negotiation Mechanism
Issues in the negotiation process:
• Negotiation space: all possible deals that agents can make, i.e., the set of candidate deals.
• Negotiation protocol: a rule that determines the process of a negotiation – how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on.
• Negotiation strategy: when and what proposals should be made.
14
Protocol
• Defines the kinds of deals that can be made.
• Defines the sequence of offers and counter-offers.
• A protocol is like the rules of a chess game, whereas a strategy is the way a player decides which move to make.
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing.
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions.
• Desirable properties of mechanisms are:
– Convergence/guaranteed success
– Maximizing global welfare: the sum of agent benefits is maximized
– Pareto efficiency
– Individual rationality
– Stability: no agent should have an incentive to deviate from its strategy
– Simplicity: low computational demands, little communication
– Distribution: no central decision maker
– Symmetry: we may not want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• We can't always achieve every attribute, so look at the tradeoffs between choices; for example, efficiency and stability are sometimes in conflict with each other.
18
Negotiation Protocol
• Who begins?
• Do we take turns?
• Do we build off previous offers?
• Do we give feedback (or not)? Tell what our utility is (or not)?
• What are the obligations?
• Privacy?
• Which proposals are allowed, as a result of the negotiation history?
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round.
• Communication during negotiation:

[Figure: Agent i and Agent j exchange a proposal and a counter-proposal; Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (we can talk about 50/50 or 90/10, depending on who "moves" the farthest):

[Figure: proposals by Ai and proposals by Aj converge to a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple-objective linear programming –
– requires that the players construct a crude linear approximation of their utility functions.
• Jointly Improving Direction method: start out with a neutral suggested value and continue until no joint improvements are possible.
– Used in the Camp David peace negotiations (Egypt/Israel – Jimmy Carter, Nobel Peace Prize 2002).
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
• Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
• State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
• Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes
• Single attribute (price) – a symmetric scenario (both benefit in the same way by a cheaper price)
• Multiple attributes – several inter-related attributes, e.g. buying a car
– The number of agents and the way they interact
• One-to-one, e.g. a single buyer and a single seller
• Many-to-one, e.g. multiple buyers and a single seller (auctions)
• Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if the roles were reversed, I would benefit the same way you would)
– If one task requires less travel, both would benefit equally by having less travel
– The utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if the roles were reversed
– If you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job… ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One-to-one
• One-to-many (an auction is an example of one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role) – n(n−1)/2 pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
– Since an agent can achieve its goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if the agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
• utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic: more tasks means more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents: (D1, D2) such that all tasks are reassigned, D1 ∪ D2 = T1 ∪ T2.
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of such a deal to agent k is Cost_k(δ) = c(D_k) (i.e., the cost to k of the deal is the cost of D_k, k's part of the deal).
33
Examples of TOD
• Parcel Delivery: several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries: several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure the deals knowing the union must be ab:
• Choices for first agent: ∅, a, b, ab
• Second agent must "pick up the slack"
• a for agent 1 → b | ab (for agent 2)
• b for agent 1 → a | ab
• ab for agent 1 → ∅ | a | b | ab
• ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utility_k(δ) = c(T_k) − Cost_k(δ) = c(T_k) − c(D_k)
where δ = (D1, D2) is a deal
– c(T_k) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
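A minimal Python sketch of these definitions, assuming the two-city cost figures used on the neighboring parcel-delivery slides (c(a)=c(b)=1, c(ab)=3, no return trip); function names are illustrative:

```python
from itertools import chain, combinations

def powerset(s):
    """All subsets of s, as frozensets."""
    s = list(s)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

# Cost function from the parcel-delivery slides (no return trip).
COST = {frozenset(): 0, frozenset('a'): 1, frozenset('b'): 1, frozenset('ab'): 3}

def utility(k, deal, encounter):
    """Utility_k(delta) = c(T_k) - c(D_k): stand-alone cost minus deal cost."""
    return COST[frozenset(encounter[k])] - COST[frozenset(deal[k])]

# Encounter: agent 1 must deliver to a; agent 2 to both a and b.
encounter = (frozenset('a'), frozenset('ab'))
all_tasks = encounter[0] | encounter[1]

# Pure deals: every (D1, D2) with D1 ∪ D2 = T1 ∪ T2.
deals = [(d1, d2) for d1 in powerset(all_tasks)
                  for d2 in powerset(all_tasks) if d1 | d2 == all_tasks]
print(len(deals))  # 9 distinct pure deals, as on the slide
for d in deals:
    print(sorted(d[0]), sorted(d[1]),
          utility(0, d, encounter), utility(1, d, encounter))
```

As expected, the conflict deal (T1, T2) itself yields utility 0 for both agents.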
37
Parcel Delivery Domain (assuming couriers do not have to return home – like a U-Haul)
[Diagram: distribution point with edges of cost 1 to city a and to city b, and an edge of cost 2 between a and b]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.
– δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}, Utility_k(δ) ≥ Utility_k(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1, 2}, Utility_k(δ) > Utility_k(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
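These definitions can be checked mechanically. A small sketch over the nine-deal parcel example (utilities taken from the neighboring slides; a deal is named by the two agents' task sets, with '' for the empty set):

```python
# Deal -> (utility to agent 1, utility to agent 2), from the slides.
UTILS = {
    ('a', 'b'): (0, 2),   ('b', 'a'): (0, 2),   ('ab', ''): (-2, 3),
    ('', 'ab'): (1, 0),   ('a', 'ab'): (0, 0),  ('b', 'ab'): (0, 0),
    ('ab', 'a'): (-2, 2), ('ab', 'b'): (-2, 2), ('ab', 'ab'): (-2, 0),
}
CONFLICT = ('a', 'ab')  # the conflict deal, utilities (0, 0)

def dominates(d, e):
    """d is at least as good for both agents and strictly better for one."""
    return (all(x >= y for x, y in zip(UTILS[d], UTILS[e])) and
            any(x > y for x, y in zip(UTILS[d], UTILS[e])))

def weakly_dominates(d, e):
    return all(x >= y for x, y in zip(UTILS[d], UTILS[e]))

individual_rational = [d for d in UTILS if weakly_dominates(d, CONFLICT)]
pareto_optimal = [d for d in UTILS if not any(dominates(e, d) for e in UTILS)]
negotiation_set = [d for d in individual_rational if d in pareto_optimal]
print(sorted(negotiation_set))  # [('', 'ab'), ('a', 'b'), ('b', 'a')]
```

The output reproduces the negotiation set derived on the following slides.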
40
Utility Function for Agents (example from the previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
All deals: 1. (a, b); 2. (b, a); 3. (ab, ∅); 4. (∅, ab); 5. (a, ab); 6. (b, ab); 7. (ab, a); 8. (ab, b); 9. (ab, ab)
Individual rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
All deals: 1. (a, b); 2. (b, a); 3. (ab, ∅); 4. (∅, ab); 5. (a, ab); 6. (b, ab); 7. (ab, a); 8. (ab, b); 9. (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
Note: deal 3, (ab, ∅), gives (−2, 3); although agent 1 does badly, nothing beats 3 for agent 2, so it is Pareto optimal.
43
Negotiation Set
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only deals where both utilities are positive are individually rational (for both); the origin is the conflict deal
• Which are Pareto optimal?
[Axes: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Diagram: axes are the utility for agent i and the utility for agent j; the conflict deal sits at (utility of the conflict deal for i, utility of the conflict deal for j); a circle delimits the space of all possible deals; the negotiation set (Pareto optimal + individual rational) is the boundary arc above and to the right of the conflict deal]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
– Concession protocol
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that:
– Both deals are from the negotiation set
– ∀i and t > 0: Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t−1)) (neither agent concedes)
– Agreement: ∃j ≠ i, Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
• Only A agrees ⇒ accept δ(B, t) (A agrees with B's proposal)
• Only B agrees ⇒ accept δ(A, t) (B agrees with A's proposal)
• Both A and B ⇒ accept the δ(k, t) such that π(δ(k)) = max{π(δ(A)), π(δ(B))}
• Both A and B, with π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
(Stated here for pure deals; mixed deals come later.)
47
The Monotonic Concession Protocol – agents move in one direction, towards the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession. This assumes we know what the other agent values
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: explanation or critique of a deal
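The round structure above can be sketched as a loop; this is a rough illustration with concession strategies supplied from outside (function and variable names are my own, and the demo uses the parcel example, where stand-still strategies end in conflict):

```python
def monotonic_concession(ns, u, next_offer):
    """ns: deals in the negotiation set; u[k](deal): utility to agent k;
    next_offer[k](own, others): agent k's next proposal (may stand still,
    but must never offer the other agent less than before)."""
    # Round 1: each agent opens with its own best deal in the set.
    offer = [max(ns, key=u[0]), max(ns, key=u[1])]
    while True:
        # An agent agrees if the opponent's offer is at least as good
        # for it as its own proposal.
        agrees = [u[k](offer[1 - k]) >= u[k](offer[k]) for k in (0, 1)]
        if agrees[0] and agrees[1]:
            return offer[0]          # both agree: either deal will do
        if agrees[0]:
            return offer[1]
        if agrees[1]:
            return offer[0]
        new = [next_offer[k](offer[k], offer[1 - k]) for k in (0, 1)]
        if new == offer:             # neither agent conceded
            return None              # negotiation ends with the conflict deal
        offer = new

# Parcel example: neither agent ever concedes, so conflict results.
U1 = {('a', 'b'): 0, ('b', 'a'): 0, ('', 'ab'): 1}
U2 = {('a', 'b'): 2, ('b', 'a'): 2, ('', 'ab'): 0}
stand_still = [lambda own, other: own, lambda own, other: own]
result = monotonic_concession(list(U1), [U1.get, U2.get], stand_still)
print(result)  # None
```

If agent 1 instead concedes fully to agent 2's offer, the same loop terminates with agreement on (a, b).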
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than their own proposal:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
The Zeuthen Strategy – a refinement of the monotonic concession protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
[Diagram: agent 1's best deal at one end, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (though it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Diagram: between agent 1's best deal and agent 2's best deal, each agent asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict compares the utility it would lose by making a concession against the utility it would lose by taking the conflict deal, relative to its current offer.
• If both agents are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict:
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk).
55
Risk Evaluation
• risk measures the fraction of your proposed gain that you would give up by conceding; if it is close to one, the opponent's offer gains you little (and you are more willing to risk conflict)
• This assumes you know what the other agent's utility is
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
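The risk formula is a one-liner; a small sketch, reproducing the first round of the parcel example (the convention for zero-utility offers is my own assumption for the degenerate case):

```python
def risk(u_own_offer, u_other_offer):
    """Zeuthen risk: (U_i(own) - U_i(other's)) / U_i(own).
    1 means perfectly willing to risk conflict."""
    if u_own_offer == 0:
        return 1.0   # assumed convention: nothing to lose, fully willing to risk
    return (u_own_offer - u_other_offer) / u_own_offer

# First round of the parcel example (slide 59): agent 1 offers (∅, ab),
# agent 2 offers (a, b); each values the other's offer at 0.
r1 = risk(1, 0)   # agent 1: (1 - 0) / 1 = 1.0
r2 = risk(2, 0)   # agent 2: (2 - 0) / 2 = 1.0
print(r1, r2)     # equal risk of 1 on both sides
```

With equal risk, both should concede, but in that encounter there is no deal left to concede to, so conflict results.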
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Diagram: a line from Ai's best deal to Aj's best deal with the conflict deal marked; each agent asks "How much am I willing to risk a conflict?", weighing the maximum to gain from agreement against the maximum it can still hope to gain]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain: recall agent 1 delivers to a; agent 2 delivers to a and b
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: agent 1's best deal and agent 2's best deal, each marked "He should concede"]
Zeuthen does not reach a settlement: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Diagram: distribution point with edges of cost 7 to city a and to city d; cities a–b–c–d form a chain with edges of cost 1]
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; a split such as (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)
No. | Pure deal | Agent 1's utility | Agent 2's utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
Conflict deal | (abcd, abcd) | 0 | 0
[Diagram: deals ordered 5, 4, 3, 2, 1 from agent 1's best to agent 2's best]
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the (0, 10) or (10, 0) of another choice
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, so they help and do not hinder each other
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD, coordinated plans are never worse, as you can just do your original task
• With an SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Inter-agent comparison of utility: common utility units
4. Symmetric abilities (all agents can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g. both need the car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (e.g. both could travel together, but the domain may lack an operation for one agent to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent gets a certain plan
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a mixed deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the deal
• For the postman example:
Parcel Delivery Domain (assuming couriers do not have to return home)
[Diagram: distribution point with edges of cost 1 to city a and to city b, and an edge of cost 2 between a and b]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with a probability
• ⟨(ab, ∅); p⟩ means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
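The algebra generalizes; a small helper (my own formulation) that solves p·u1_free + (1−p)·u1_all = (1−p)·u2_free + p·u2_all for p, checked against the slide's numbers:

```python
from fractions import Fraction

def fair_p(u1_free, u1_all, u2_free, u2_all):
    """Solve p*u1_free + (1-p)*u1_all == (1-p)*u2_free + p*u2_all for p.
    u*_free: an agent's utility when it delivers nothing; u*_all: when it
    delivers everything. Returns None if no single p equalizes them."""
    denom = u1_free - u1_all + u2_free - u2_all
    if denom == 0:
        return None
    return Fraction(u2_free - u1_all, denom)

p = fair_p(1, -2, 3, 0)
print(p)  # 5/6, matching the slide
```

Applying the same helper to the utilities on the next slide, where both roles give each agent identical utility, returns None: no p can make that deal fair.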
74
Try again with other choice in negotiation set
• ⟨(a, b); p⟩ means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal δ_m = ⟨(T_A ∪ T_B, ∅); p⟩ whose product of utilities π(δ_m) equals the maximum of π(δ) over the negotiation set
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose one which maximizes the product of the utilities (as that is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
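The "try dividing 10" exercise, sketched directly:

```python
# Split a fixed total utility of 10 between two agents and see where the
# product of utilities peaks: at the even split, which is why the
# product-maximizing choice reads as the fairer one.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)  # (5, 5), product 25
```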
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign agent 1 the cost-2 role p of the time (and the cost-6 role otherwise)
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p ⇒ p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, continued
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their expected utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: the set of possible environment states
• Ag: the set of possible agents
• J: the set of possible joint plans
• c: the cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation covers: a joint plan, schedules, and goal relaxation. It may reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute
– Price 1, quality 2, support 3
• Using constraints on an attribute
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
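A sketch of the weighted-attribute scoring; I assume here that each attribute is first normalized to a [0, 1] score where higher is better (e.g. a cheaper price scores higher), and the example offers are made up:

```python
def utility(price_score, quality_score, support_score):
    """Weighted sum using the slide's 60/15/25 split."""
    return 0.60 * price_score + 0.15 * quality_score + 0.25 * support_score

offer_a = utility(0.8, 0.5, 0.4)   # cheap, middling quality and support
offer_b = utility(0.4, 0.9, 0.9)   # pricier, but high quality and support
print(offer_a, offer_b)            # 0.655 vs 0.6: these weights favor price
```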
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost:
for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one city saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (with = rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
• "I need to pick something up at the store" (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• "I need to deliver an empty letter" (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y
• When another set of tasks Z is introduced:
c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as you were going that way anyway) but adds 2 to its superset Y (as you were going around the loop).
• Concavity implies sub-additivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries

The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as the costs are independent)
• Modularity implies concavity
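The three properties can be checked by brute force over a small task set. The cost tables below are illustrative: a fax-like additive cost (modular), and the earlier two-city parcel cost, which is not even subadditive since c(ab) = 3 > c(a) + c(b) = 2:

```python
from itertools import chain, combinations

def subsets(tasks):
    ts = list(tasks)
    return [frozenset(s) for s in
            chain.from_iterable(combinations(ts, r) for r in range(len(ts) + 1))]

def is_subadditive(c, T):
    return all(c[x | y] <= c[x] + c[y]
               for x in subsets(T) for y in subsets(T))

def is_concave(c, T):
    # Z adds no more to a superset Y than to a subset X of Y.
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in subsets(T) for y in subsets(T) if x <= y
               for z in subsets(T))

def is_modular(c, T):
    return all(c[x | y] == c[x] + c[y] - c[x & y]
               for x in subsets(T) for y in subsets(T))

T = frozenset('ab')
fax = {frozenset(): 0, frozenset('a'): 1, frozenset('b'): 1, frozenset('ab'): 2}
parcel = {frozenset(): 0, frozenset('a'): 1, frozenset('b'): 1, frozenset('ab'): 3}
print(is_modular(fax, T), is_concave(fax, T), is_subadditive(fax, T))  # all True
print(is_subadditive(parcel, T))                                       # False
```

The implications run one way only: every modular cost function passes the concavity check, and every concave one passes subadditivity, but not vice versa.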
101
3-dimensional table of characterization: relationships implied between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered (penalty)
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over A-or-N deals: "hiding" lies are not beneficial
• Ex: A1 hides the letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util ⟨(a b c); 1/2⟩ = 1.5
• Lie: p = 1/2 (as the apparent utility is the same)
• Expected util (for 1) ⟨(a b c); 1/2⟩ = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery network with edge costs 1, 4, 4, 1]
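The arithmetic behind these expected utilities is just a probability-weighted average over the two branches of the mixed deal. A small helper, with the slide's hiding-lie branch utilities (0 and 2) plugged in:

```python
def expected_utility(p, util_if_branch_1, util_if_branch_2):
    # Utility of a mixed deal <(D1, D2); p>: with probability p the agent
    # performs D1, with probability 1-p it performs D2.
    return p * util_if_branch_1 + (1 - p) * util_if_branch_2

# Hiding-lie arithmetic from the bullet above: p stays 1/2, but half the
# time the liar must still deliver the hidden letter afterwards.
lie_util = expected_utility(0.5, 0.0, 2.0)
print(lie_util)  # 1.0
```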
103
• FP2: in Subadditive TOD, any ONM over Mixed deals: every "phantom" lie has a positive probability of being discovered (as if the other agent delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals: no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals: no "decoy" lie is beneficial (modular tends to add the exact cost, so it is hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1  U(1)  Agent 2  U(2) seems  U(2) actual
a        2     bc       4           4
b        4     ac       2           2
bc       2     a        4           2
ab       0     c        6           6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals: "Phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so utility for agent 1 is ⟨(a b); 1/2⟩ = 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, as no probability split is needed. Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals: "Decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h too)
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as p(−1) + (1−p)(6) = p(4) + (1−p)(−3), i.e. 14p = 9
• If he invents task h, p = 11/18, as p(−3) + (1−p)(6) = p(4) + (1−p)(−5), i.e. 18p = 11
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So: lying helped
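The probabilities above come from equating the two agents' expected utilities. A sketch using exact fractions to reproduce the p = 9/14 and p = 11/18 values (the helper name `balance_p` is mine):

```python
from fractions import Fraction as F

def balance_p(a, b, c, d):
    # Solve p*a + (1-p)*b = p*c + (1-p)*d for the probability p.
    return F(d - b, (a - b) - (c - d))

# Truth: p(-1) + (1-p)(6) = p(4) + (1-p)(-3)  ->  p = 9/14
p_truth = balance_p(-1, 6, 4, -3)
u_truth = p_truth * (-1) + (1 - p_truth) * 6        # 21/14 = 1.5

# Decoy letter to h: p(-3) + (1-p)(6) = p(4) + (1-p)(-5)  ->  p = 11/18
p_lie = balance_p(-3, 6, 4, -5)
u_lie = p_lie * (-1) + (1 - p_lie) * 6              # 31/18, about 1.72

print(p_truth, u_truth, p_lie, u_lie)
```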
107
Postmen – return to post office
[Figures: concave example; subadditive example (h is the decoy); phantom example]
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals: "Hide" lies can be beneficial (as you think I have less, so the increased load will cost me more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals: "Hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util ⟨(f a e), (b c d); 1/2⟩ = 4 (each saves going to two nodes)
• Under the lie, dividing as ⟨(e f d), (c a b); p⟩: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie with ⟨(a b), (c d e f); p⟩: p(4) + (1−p)(0) = p(2) + (1−p)(6), so 4p = −4p + 6 and p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned (c d e f), 1/4 of the time I still have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally): zero utility for that branch
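The p = 3/4 and 4.5 figures can be checked the same way, by equating the apparent utilities of the two assignments (exact fractions avoid rounding):

```python
from fractions import Fraction as F

# Apparent utilities must balance: p*4 + (1-p)*0 = p*2 + (1-p)*6
# -> 4p = -4p + 6 -> p = 3/4
p = F(6 - 0, (4 - 0) - (2 - 6))

# A1's real payoff: utility 6 when assigned (a b), 0 when assigned (c d e f)
real_util = p * 6 + (1 - p) * 0
print(p, real_util)   # 3/4 9/2
```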
110
Modular
111
Conclusion
• In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter proposal based on feedback from dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law: cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate solution for applicability to current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repair for a past failure is reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases payoff of the rejecting party by a greater amount than it reduces payoff for agreed parties
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day. Both prefer today to tomorrow. While the first agent assigns
highest worth to a meeting at 16:00, she also assigns progressively
smaller worths to a meeting at 15:00, 14:00, …
By showing flexibility and accepting a sub-optimal time, an agent
can accept a lower worth, which may have other payoffs (e.g.
reduced travel costs).
Worth function for first agent: [figure; worth rises from 0 to 100 over the times 9:00, 12:00, 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's curves cross at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. Why did the agent pay so much for the car?
• Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed preference during the buying process.
125
• 4 modes of argument (Gilbert 1994):
1. Logical: "If you accept A and accept A implies B, then you must accept that B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive. Doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A →a B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
129
Proposition: Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is in (true) unless proven otherwise
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set S if some member y of S attacks x (y → x)
2. argument x is acceptable (w.r.t. a set S) if every attacker of x is attacked by S
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: four arguments a, b, c, d with attack arrows]
Which sets of arguments can be true? c is always attacked;
d is always acceptable.
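A brute-force check of these claims, using an assumed attack relation reconstructed from the picture (a and b attack each other and both attack c; d is unattacked; the arrows themselves are not given in the text):

```python
from itertools import combinations

# Assumed attack graph for the four-argument example (my reconstruction)
args = ['a', 'b', 'c', 'd']
attacks = {('a', 'b'), ('b', 'a'), ('a', 'c'), ('b', 'c')}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # every attacker of x is itself attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [set(c) for r in range(len(args) + 1) for c in combinations(args, r)]
adm = [s for s in subsets if admissible(s)]
# c appears in no admissible set (it cannot be defended);
# d, being unattacked, belongs to the largest admissible sets
```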
133
An Example Abstract Argument System
8
Borda Paradox – remove the loser and the winner changes (notice c is always ahead of the removed item)
Seven voters:
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
• b > c > d > a
• c > d > a > b
• a > b > c > d
Borda counts (4 points for 1st, 3 for 2nd, 2 for 3rd, 1 for 4th): a=18, b=19, c=20, d=13, so c wins and d loses.
Remove d:
a > b > c; b > c > a; c > a > b; a > b > c; b > c > a; c > a > b; a > b > c
a=15, b=14, c=13
When the loser is removed, the ranking reverses: the previous winner c now comes last and a wins.
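The counts can be reproduced with a straightforward Borda tally; the profile below groups the seven voters (three a>b>c>d, two b>c>d>a, two c>d>a>b):

```python
def borda(profile, candidates):
    # n points for 1st place down to 1 point for last (n = number of candidates)
    scores = {cand: 0 for cand in candidates}
    n = len(candidates)
    for ranking in profile:
        for pos, cand in enumerate(ranking):
            scores[cand] += n - pos
    return scores

profile = [list('abcd')] * 3 + [list('bcda')] * 2 + [list('cdab')] * 2
print(borda(profile, 'abcd'))   # {'a': 18, 'b': 19, 'c': 20, 'd': 13} -> c wins

# Remove the loser d and recount: the ranking reverses and a wins
reduced = [[x for x in r if x != 'd'] for r in profile]
print(borda(reduced, 'abc'))    # {'a': 15, 'b': 14, 'c': 13}
```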
9
Strategic (insincere) voters
• Suppose your choice will likely come in second place. If you rank the first choice of the rest of the group very low, you may lower that choice enough so yours comes first.
• True story: a Dean's selection. Each committee member was told they had 5 points to award and could spread them out any way among the candidates; the recipient of the most points wins. I put all my points on one candidate. Most split their points. I swung the vote. What was my gamble?
• Want to get the results as if truthful voting were done
10
Typical Competition Mechanisms
• Auction: allocate goods or tasks to agents through a market. Need a richer technique for reaching agreements
• Negotiation: reach agreements through interaction
• Argumentation: resolve conflicts through debate
11
Negotiation
bull May involve
ndash Exchange of information
ndash Relaxation of initial goals
ndash Mutual concession
12
Mechanisms Protocols Strategies
• Negotiation is governed by a mechanism or a protocol:
– defines the "rules of encounter" between the agents
– the public rules by which the agents will come to agreements
• Given a particular protocol, how can a particular strategy be designed that individual agents can use?
13
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Negotiation Mechanism
Issues in the negotiation process:
• Negotiation Space: all possible deals that agents can make, i.e. the set of candidate deals
• Negotiation Protocol: a rule that determines the process of a negotiation: how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on
• Negotiation Strategy: when and what proposals should be made
14
Protocol
• Determines the kinds of deals that can be made
• Determines the sequence of offers and counter-offers
• A protocol is like the rules of a chess game, whereas strategy is the way in which a player decides which move to make
15
Game Theory
bull Computers make concrete the notion of strategy which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
– Convergence/guaranteed success
– Maximizing global welfare: the sum of agent benefits is maximized
– Pareto efficiency
– Individual rationality
– Stability: no agent should have an incentive to deviate from its strategy
– Simplicity: low computational demands, little communication
– Distribution: no central decision maker
– Symmetry: agents should not need to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs between choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution, e.g. using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Figure: Agent i and Agent j exchange Proposal and Counter Proposal until Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Figure: proposals by Aj and proposals by Ai converge to a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple objective linear programming:
– requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggested value and continue until no joint improvements are possible
– Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes:
• Single attribute (price): a symmetric scenario (both benefit in the same way from a cheaper price)
• Multiple attributes: several inter-related attributes, e.g. buying a car
– The number of agents and the way they interact:
• One-to-one, e.g. a single buyer and a single seller
• Many-to-one, e.g. multiple buyers and a single seller: auctions
• Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would):
– If one task requires less travel, both would benefit equally by having less travel
– utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric: we would benefit differently if roles were reversed
– if you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, bigger office, work from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example: one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
– n(n−1)/2 number of pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents, such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
• utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of such a deal to agent k is Cost_k(δ) = c(D_k) (i.e. the cost to k of the deal is the cost of D_k, k's part of the deal).
33
Examples of TOD
bull Parcel Delivery
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab): the conflict deal
35
Figure deals knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
• a for agent 1: b or ab for agent 2
• b for agent 1: a or ab for agent 2
• ab for agent 1: ∅, a, b, or ab for agent 2
• ∅ for agent 1: ab for agent 2
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utility_k(δ) = c(T_k) − Cost_k(δ) = c(T_k) − c(D_k), where δ = (D1, D2) is a deal
– c(T_k) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home, like U-Haul)
[Figure: distribution point connected to city a and city b at cost 1 each; travelling between a and b costs 2]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}, Utility_k(δ) ≥ Utility_k(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1, 2}, Utility_k(δ) > Utility_k(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0      Utility2(a, b) = 2
2. Utility1(b, a) = 0      Utility2(b, a) = 2
3. Utility1(ab, ∅) = −2    Utility2(ab, ∅) = 3
4. Utility1(∅, ab) = 1     Utility2(∅, ab) = 0
5. Utility1(a, ab) = 0     Utility2(a, ab) = 0
6. Utility1(b, ab) = 0     Utility2(b, ab) = 0
7. Utility1(ab, a) = −2    Utility2(ab, a) = 2
8. Utility1(ab, b) = −2    Utility2(ab, b) = 2
9. Utility1(ab, ab) = −2   Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
Of the nine deals (a, b), (b, a), (ab, ∅), (∅, ab), (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab), the individually rational ones are:
(a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
Of the nine deals, the Pareto optimal ones are:
(a, b), (b, a), (ab, ∅), (∅, ab)
The deals (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab) are each beaten by some other deal. Deal (ab, ∅) is (−2, 3), but nothing beats 3 for agent 2, so it stays.
43
Negotiation Set
Negotiation set = individually rational ∩ Pareto optimal:
(a, b), (b, a), (∅, ab)
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
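The whole pipeline of this example (enumerate the nine pure deals, compute utilities, filter to individually rational and Pareto optimal) fits in a short sketch:

```python
from itertools import combinations

# The encounter from the slides: agent 1 must deliver to a, agent 2 to a and b
cost = {frozenset(): 0, frozenset('a'): 1, frozenset('b'): 1, frozenset('ab'): 3}
T1, T2 = frozenset('a'), frozenset('ab')
tasks = T1 | T2

def subsets(s):
    ts = sorted(s)
    return [frozenset(c) for r in range(len(ts) + 1) for c in combinations(ts, r)]

# pure deals: (D1, D2) with D1 u D2 covering every task (overlap allowed)
deals = [(d1, d2) for d1 in subsets(tasks) for d2 in subsets(tasks)
         if d1 | d2 == tasks]

def utils(deal):
    # Utility_k(deal) = c(T_k) - c(D_k)
    return (cost[T1] - cost[deal[0]], cost[T2] - cost[deal[1]])

def dominates(d, e):
    ud, ue = utils(d), utils(e)
    return all(p >= q for p, q in zip(ud, ue)) and any(p > q for p, q in zip(ud, ue))

rational = [d for d in deals if all(u >= 0 for u in utils(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
negotiation_set = [d for d in rational if d in pareto]
print(len(deals), len(rational), len(pareto), len(negotiation_set))  # 9 5 4 3
```

The counts match the slides: nine pure deals, five individually rational, four Pareto optimal, and a three-deal negotiation set.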
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only deals where both utilities are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
[Axes: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: deals A, B, C, D, E plotted by utility for agent i vs. utility for agent j; the circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict utilities; the negotiation set (Pareto optimal + individual rational) is the arc dominating the conflict deal]
46
Negotiation Protocol: π(δ) is the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol. One-step protocol.
– Concession protocol:
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t) such that:
– both deals are from the negotiation set
– ∀i and t > 0: Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t−1)): I propose something less desirable for me than before
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t−1)) for both agents
– Agreement: ∃j ≠ i, Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
• Only A ⇒ agree on δ(B, t): A agrees with B's proposal
• Only B ⇒ agree on δ(A, t): B agrees with A's proposal
• Both A and B ⇒ agree on the δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A and B with π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the split may not be the same for each agent; flip a coin to decide which deal to use)
Applies to pure deals and mixed deals.
47
The Monotonic Concession Protocol – one direction, move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they may re-propose the same one later)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: explanation or critique of a deal
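The round loop above can be sketched as follows; the deal names, offer sequences, and utility tables in the toy run are assumptions for illustration, not from the slides:

```python
def monotonic_concession(offers_i, offers_j, util_i, util_j):
    # offers_*: each agent's planned sequence of proposals (strategy output).
    # util_*: each agent's utility function over deals.
    last = None
    for offer_i, offer_j in zip(offers_i, offers_j):
        if (offer_i, offer_j) == last:
            return 'conflict'                      # neither agent conceded
        # agreement: some agent likes the opponent's offer at least as much
        if util_i(offer_j) >= util_i(offer_i):
            return offer_j
        if util_j(offer_i) >= util_j(offer_j):
            return offer_i
        last = (offer_i, offer_j)
    return 'conflict'

# toy run with assumed deals X, Y, Z and utilities
u_i = {'X': 2, 'Y': 1, 'Z': 0}.get
u_j = {'X': 0, 'Y': 1, 'Z': 2}.get
print(monotonic_concession(['X', 'Y'], ['Z', 'Y'], u_i, u_j))  # Y
```

In the toy run both agents concede to the middle deal Y in round 2 and agree; if both stood still, the loop would return the conflict outcome.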
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal they made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient: no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (it is a way of telling the other what you value)
[Figure: agent 1's best deal vs. agent 2's best deal]
52
The Zeuthen Strategy
Q: I make a proposal in every round, but it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal; each asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict depends on the difference in utility between its loss in making a concession and its loss in taking the conflict deal, relative to its current offer.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict:
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agents i and j, respectively.
risk is willingness to risk conflict (1 is perfectly willing to risk).
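A minimal sketch of the risk computation, applied to the parcel encounter from the later slide (the convention of returning 1 when the agent's own offer is worth nothing is the usual one for Zeuthen's strategy):

```python
def zeuthen_risk(util_own_offer, util_their_offer):
    # Fraction of current gain lost by accepting the opponent's offer.
    # By convention risk = 1 when the agent's own offer is worth nothing:
    # it then has nothing to lose in a conflict.
    if util_own_offer == 0:
        return 1.0
    return (util_own_offer - util_their_offer) / util_own_offer

# Parcel encounter: agent 1 offers (empty, ab), worth 1 to itself and 0 to
# agent 2; agent 2 offers (a, b), worth 2 to itself and 0 to agent 1.
risk_1 = zeuthen_risk(1, 0)   # 1.0
risk_2 = zeuthen_risk(2, 0)   # 1.0 -> equal risk, so both must concede
```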
55
Risk Evaluation
• risk measures the fraction you still had left to gain; if it is close to one, you have gained little so far (and are more willing to risk conflict)
• This assumes you know what the other's utility is
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher
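The risk formula above can be written directly as a small function (a sketch; the function name and the sample utilities below are mine, not from the slides):

```python
def risk(my_util_my_offer, my_util_their_offer):
    # Zeuthen risk for agent i: (Utility_i(d_i) - Utility_i(d_j)) / Utility_i(d_i)
    # 1.0 means my current offer is worth nothing to me anyway,
    # so I am perfectly willing to risk conflict.
    if my_util_my_offer == 0:
        return 1.0
    return (my_util_my_offer - my_util_their_offer) / my_util_my_offer

# Hypothetical utilities: my offer is worth 10 to me, yours only 4
print(risk(10, 4))   # 0.6
print(risk(8, 6))    # 0.25 -> lower risk, so this agent should concede
```

The agent with the lower risk value concedes next; on a tie, both concede.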
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Diagram: axis from Ai's best deal to Aj's best deal, with the conflict deal marked; labels: "How much am I willing to risk a conflict?", "Maximum to gain from agreement", "Maximum still hope to gain".)
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
– Simple, and reflects the way human negotiations work
– Stability: in Nash equilibrium, if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages
– Computationally expensive: players need to compute the entire negotiation set
– Communication burden: the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation Set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
(Diagram: Agent 1's best deal and agent 2's best deal, each labeled "he should concede".)
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(Diagram: distribution point connected to a and to d by edges of cost 7; a–b, b–c, and c–d edges of cost 1.)
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal. (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here; both concede on equal risk)

No. | Pure Deal | Agent 1's Utility | Agent 2's Utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
Conflict deal | 0 | 0

(Agent 1 starts at deal 5 and agent 2 at deal 1; they concede through deals 4 and 2 and meet at deal 3.)
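Using the utilities in this table, a small simulation reproduces the run. This is a sketch: the concession rule, "move to your next-preferred deal that improves the opponent's utility", is my simplification of "concede just enough to change the balance of risk".

```python
# Pure deals from the slide, as (utility to agent 1, utility to agent 2)
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(my_offer_util, their_offer_util):
    # Zeuthen risk: fraction of my offer's utility I would lose by conceding
    if my_offer_util == 0:
        return 1.0
    return (my_offer_util - their_offer_util) / my_offer_util

def zeuthen(deals):
    o1 = max(deals, key=lambda d: d[0])   # agent 1 opens with its best deal
    o2 = max(deals, key=lambda d: d[1])   # agent 2 opens with its best deal
    while True:
        if o1[1] >= o2[1]:                # agent 2 accepts agent 1's offer
            return o1
        if o2[0] >= o1[0]:                # agent 1 accepts agent 2's offer
            return o2
        r1 = risk(o1[0], o2[0])           # agent 1's willingness to risk conflict
        r2 = risk(o2[1], o1[1])           # agent 2's willingness
        if r1 <= r2:                      # lower risk concedes; both concede on a tie
            o1 = min((d for d in deals if d[1] > o1[1]), key=lambda d: d[1])
        if r2 <= r1:
            o2 = min((d for d in deals if d[0] > o2[0]), key=lambda d: d[0])

print(zeuthen(deals))  # -> (2, 2): equal risks each round, so both concede until deal 3
```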
63
What bothers you about the previous agreement?
• They decide to both get (2, 2) utility rather than, say, an expected (0, 10) from another choice
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future, and no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State-Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example: slotted blocks world. Blocks cannot go just anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together"; it is just the joining of the individual plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD coordinated plans are never worse, as you can just do your original task
• With an SOD you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• Looking for a state in the intersection of goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive, and the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit the others fairly?
• Suppose there are two states that satisfy both agents
• State 1: a cost of 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one would get a certain plan
• Called a mixed deal: a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent (the role i plays with probability 1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
(Diagram: distribution point with edges of cost 1 to city a and city b; going from a to b via the distribution point costs 2.)
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (original tasks: a and b):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• ⟨(ab, ∅) : p⟩ means agent 1 does ∅ with probability p and ab with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
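Setting the two expected utilities equal and solving for p is one line of algebra, which generalizes (a sketch; the function name and parameter layout are mine):

```python
from fractions import Fraction

def fair_probability(u1_at_p, u1_at_1mp, u2_at_p, u2_at_1mp):
    """Solve p*u1_at_p + (1-p)*u1_at_1mp == p*u2_at_p + (1-p)*u2_at_1mp for p.
    Returns None when the two expected-utility lines are parallel
    (then no fair p exists, as in the (a, b) deal on the next slide)."""
    denom = (u1_at_p - u1_at_1mp) - (u2_at_p - u2_at_1mp)
    if denom == 0:
        return None
    return Fraction(u2_at_1mp - u1_at_1mp, denom)

# Deal 3: agent 1 does nothing with probability p, everything with 1-p
print(fair_probability(1, -2, 0, 3))  # -> 5/6, matching the slide
```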
74
Try again with the other choice in the negotiation set
• ⟨(a, b) : p⟩ means agent 1 does a with probability p and b with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅) : p] such that NS(m) = max over deals d of NS(d)
• The mixed deal makes the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use the one which maximizes the product of utilities (as that is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias, like switching goals in soccer.
77
Examples, Cooperative: Each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial: each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent the cost would be four, so the utility to each is 2
78
Examples, Compromise: Both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) division of costs p of the time
• Utility for agent 1 = p(1) + (1−p)(−3): it loses utility if it takes the cost-6 role for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins; that is better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the person with the highest worth get his way, as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• The cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, continued
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 26 − 24 = 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents
• Note: not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain: Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example: multi-agent Tileworld (like the airport shuttle); worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support: a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
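The weighted-attribute scheme above can be sketched in a few lines (the attribute names match the slide, but the offer's scores are hypothetical; scores are assumed already normalized to [0, 1] with higher being better):

```python
def weighted_utility(offer, weights):
    # Linear weighted score over attributes, as in the slide's
    # Utility = price*60% + quality*15% + support*25% example.
    return sum(weights[a] * offer[a] for a in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer = {"price": 0.5, "quality": 0.8, "support": 1.0}   # made-up scores
print(weighted_utility(offer, weights))  # about 0.67 (0.30 + 0.12 + 0.25)
```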
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
  • Hiding letters
  • Phantom letters
– Not carrying out a commitment
90
Subadditive Task-Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with equality (= rather than <):
– Deliveries in opposite directions: doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces, say electrical power costs, where I get above a threshold and have to buy new equipment
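The subadditivity condition can be checked mechanically over all pairs of subsets. This is a sketch: the two-task cost table below is a made-up tree-like example (serving both a and b shares the trunk), not from the slides.

```python
from itertools import combinations

def is_subadditive(tasks, cost):
    # Check c(X u Y) <= c(X) + c(Y) for every pair of subsets of tasks
    subsets = [frozenset(c) for r in range(len(tasks) + 1)
               for c in combinations(tasks, r)]
    return all(cost(x | y) <= cost(x) + cost(y)
               for x in subsets for y in subsets)

# Hypothetical tree-delivery costs: doing both a and b shares distance
cost = lambda s: {frozenset(): 0, frozenset("a"): 3,
                  frozenset("b"): 3, frozenset("ab"): 4}[s]
print(is_subadditive("ab", cost))  # True: 4 <= 3 + 3
```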
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
• "I need to pick something up at the store." (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
• "I need to deliver an empty letter." (No good to anyone, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task-Oriented Domain
• We have two task sets X and Y, where X is a subset of Y, and another set of tasks Z is introduced; then
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows go between domains); each rule begins at a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help).
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth-telling more than the others
97
For subadditive domain
98
Attributes of task systems: Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), where X ⊆ Y
• The cost that a set of tasks Z adds to the set Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes: Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as the costs are independent)
• Modularity implies concavity
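The modularity condition can be checked the same brute-force way. A fax-domain-style cost function, where each destination has an independent connection cost, passes it (the per-destination costs below are made up):

```python
from itertools import combinations

def is_modular(tasks, cost):
    # Check c(X u Y) == c(X) + c(Y) - c(X n Y) for every pair of subsets
    subsets = [frozenset(c) for r in range(len(tasks) + 1)
               for c in combinations(tasks, r)]
    return all(cost(x | y) == cost(x) + cost(y) - cost(x & y)
               for x in subsets for y in subsets)

# Fax-domain style: each destination has an independent cost
per_fax = {"a": 2, "b": 5}
cost = lambda s: sum(per_fax[t] for t in s)
print(is_modular("ab", cost))  # True: independent costs satisfy the equality
```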
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth: p = 1/2
• Expected utility: ⟨abc : 1/2⟩ = 5
• Under the lie: p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1): ⟨abc : 1/2⟩ = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
(Diagram: delivery graph with edge costs 1, 4, 4, 1.)
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out)
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost: hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just as in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

1's tasks | U(1) | 2's tasks | U(2) (seems) | U(2) (actual)
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is ⟨(ab, ∅) : 1/2⟩ = 1/2(4) + 1/2(2) = 3
• Lie: (b, ca) is the logical division, as no probability is needed; the utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
107
Postmen: return to the post office
(Diagrams: a concave example, a subadditive example in which h is the decoy, and a phantom example.)
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will seem to cost more than it really does)
• Ex3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(⟨(fae, bcd) : 1/2⟩) = 4 (each saves going to two nodes)
• Under the lie, divide as ⟨(efd, cab) : p⟩: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with ⟨(abcdef, ∅) : p⟩:
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = 6 − 4p
• p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places, which is what I was assigned originally: zero utility for that.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law: cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs used for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she also assigns progressively smaller worths to a meeting at 15:00, 14:00, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Chart: worth function for the first agent, ranging from 0 to 100 over the times 9:00, 12:00, and 16:00.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Graph: utility versus number of negotiation rounds; Agent i's and Agent j's utility curves converge to a point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
(Graph: utility versus number of negotiation rounds; Agent i's and Agent j's utility curves never meet.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. Why did the agent pay so much for the car?
• Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept that B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive. Doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(We give reasons for our conclusions.)
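The "Sentence can be proved from Grounds" relation can be illustrated with a toy forward-chaining prover over propositional Horn rules. This is a simplification (the slide's formulae are general logical formulae); the proposition names are mine and echo the milk/cheese argument on the next slide.

```python
def provable(goal, grounds):
    # Forward-chain over Horn rules (premises, conclusion);
    # facts are rules with no premises.
    known = set()
    changed = True
    while changed:
        changed = False
        for premises, conclusion in grounds:
            if conclusion not in known and all(p in known for p in premises):
                known.add(conclusion)
                changed = True
    return goal in known

grounds = [((), "milk_is_good"),
           ((), "cheese_from_milk"),
           (("milk_is_good", "cheese_from_milk"), "cheese_is_good")]
print(provable("cheese_is_good", grounds))  # True
```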
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption: an argument is true unless proven false
131
Admissible Arguments (mutually defensible)
1. An argument x is attacked by a set if some attacker y of x (y → x) is not itself attacked by a member of the set
2. An argument x is acceptable if every attacker of x is attacked
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
(Diagram: an attack graph over arguments a, b, c, d.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
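Dung-style definitions like these are easy to check by brute force. The attack relation below is hypothetical (the slide's actual graph did not survive extraction), but it is chosen to match the remarks above: c is always attacked and d is always acceptable.

```python
from itertools import combinations

args = {"a", "b", "c", "d"}
# Hypothetical attacks: a and b attack each other, both attack c, nobody attacks d
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # Every attacker of x is attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for y in args if (y, x) in attacks)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

sets = [set(c) for r in range(len(args) + 1) for c in combinations(sorted(args), r)]
print([sorted(s) for s in sets if admissible(s)])
# -> [[], ['a'], ['b'], ['d'], ['a', 'd'], ['b', 'd']]: c never appears
```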
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments - mutually defensible
- Slide 132
- An Example Abstract Argument System
9
Strategic (insincere) voters
• Suppose your choice will likely come in second place. If you rank the first choice of the rest of the group very low, you may lower that choice enough so that yours comes in first.
• True story: Dean's selection. Each committee member was told they had 5 points to award and could spread them out any way among the candidates. The recipient of the most points wins. I put all my points on one candidate; most split their points. I swung the vote. What was my gamble?
• Want to get the results as if truthful voting were done.
10
Typical Competition Mechanisms
• Auction: allocate goods or tasks to agents through a market. We need a richer technique for reaching agreements:
• Negotiation: reach agreements through interaction
• Argumentation: resolve conflicts through debates
11
Negotiation
• May involve:
– Exchange of information
– Relaxation of initial goals
– Mutual concession
12
Mechanisms Protocols Strategies
• Negotiation is governed by a mechanism or a protocol
– defines the "rules of encounter" between the agents
– the public rules by which the agents will come to agreements
• Given a particular protocol, how can a particular strategy be designed that individual agents can use?
13
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Negotiation Mechanism
Issues in the negotiation process:
• Negotiation Space: all possible deals that agents can make, i.e. the set of candidate deals
• Negotiation Protocol: a rule that determines the process of a negotiation - how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on
• Negotiation Strategy: when and what proposals should be made
14
Protocol
• Means the kinds of deals that can be made
• Means the sequence of offers and counter-offers
• A protocol is like the rules of a chess game, whereas a strategy is the way in which a player decides which move to make
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanisms Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
– Convergence/guaranteed success
– Maximising global welfare: the sum of agent benefits is maximized
– Pareto efficiency
– Individual rationality
– Stability: no agent should have an incentive to deviate from its strategy
– Simplicity: low computational demands, little communication
– Distribution: no central decision maker
– Symmetry: we don't want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at tradeoffs of choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution - using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Diagram: Agent i and Agent j exchange a proposal and a counter-proposal until Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Diagram: proposals by Ai and proposals by Aj converge on a point of acceptance/agreement]
22
Many types of interactive concession-based methods
• Some use multiple-objective linear programming
– requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value; continue until no joint improvements are possible
– Used in the Camp David peace negotiations (Egypt/Israel - Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes
• Single attribute (price) - symmetric scenario (both benefit in the same way from a cheaper price)
• Multiple attributes - several inter-related attributes, e.g. buying a car
– The number of agents and the way they interact
• One-to-one, e.g. a single buyer and a single seller
• Many-to-one, e.g. multiple buyers and a single seller (auctions)
• Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
– If one task requires less travel, both would benefit equally by having less travel
– Utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric - we would benefit differently if roles were reversed
– If you delivered the picnic table, you could just throw it in the back of your van. If I delivered it, I would have to rent a U-Haul to transport it (as my car is small).
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store or spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, a bigger office, work from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
– n(n-1)/2 pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents - such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
– utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple <T, Ag, c> where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, ..., An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
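These assumptions can be checked mechanically. Below is a minimal sketch, using the two-city parcel delivery cost function that appears later in these slides; the dictionary encoding and helper names are mine.

```python
from itertools import combinations

# Cost function of the two-agent parcel delivery TOD used in the
# coming slides (dictionary encoding is an assumption of this sketch).
COST = {frozenset(): 0,          # assumption 1: c(empty set) = 0
        frozenset("a"): 1,
        frozenset("b"): 1,
        frozenset("ab"): 3}

def c(tasks):
    return COST[frozenset(tasks)]

def is_monotonic(tasks="ab"):
    """Assumption 3: T1 subset of T2 implies c(T1) <= c(T2)."""
    subsets = [frozenset(s) for r in range(len(tasks) + 1)
               for s in combinations(tasks, r)]
    return all(c(t1) <= c(t2)
               for t1 in subsets for t2 in subsets if t1 <= t2)

print(c(""), c("ab"))    # 0 3
print(is_monotonic())    # True
```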
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Cost_k(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal).
33
Examples of TOD
• Parcel Delivery:
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries:
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) - the conflict deal (each agent keeps its own tasks)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure the deals knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
• a for agent 1 → b or ab for agent 2
• b for agent 1 → a or ab for agent 2
• ab for agent 1 → ∅, a, b, or ab for agent 2
• ∅ for agent 1 → ab for agent 2
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utility_k(δ) = c(Tk) - Cost_k(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal:
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
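As a quick sanity check of this definition, here is a small sketch (function names are mine) computing deal utilities for the encounter (T1, T2) = ({a}, {a, b}) described above:

```python
# Utility of a deal per the definition above:
# Utility_k(delta) = c(T_k) - c(D_k), using the two-city cost table.
COST = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}

def c(tasks):
    return COST[frozenset(tasks)]

def utility(k, encounter, deal):
    """Gain to agent k (0 or 1) from doing deal[k] instead of encounter[k]."""
    return c(encounter[k]) - c(deal[k])

encounter = ("a", "ab")
print(utility(0, encounter, ("", "ab")))   # 1: agent 1 stays home
print(utility(1, encounter, ("a", "b")))   # 2: agent 2 only delivers to b
print(utility(0, encounter, encounter))    # 0: conflict deal is always 0
```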
37
Parcel Delivery Domain (assuming they do not have to return home - like a U-Haul)
[Diagram: distribution point linked to city a and city b (distance 1 each); delivering to both in one trip costs 3]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
δ is at least as good for every agent as δ':
∀k ∈ {1, 2}: Utility_k(δ) ≥ Utility_k(δ')
δ is better for some agent than δ':
∃k ∈ {1, 2}: Utility_k(δ) > Utility_k(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'.
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you have already).
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (best deal for x without disadvantaging y).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0      Utility2(a, b) = 2
2. Utility1(b, a) = 0      Utility2(b, a) = 2
3. Utility1(ab, ∅) = -2    Utility2(ab, ∅) = 3
4. Utility1(∅, ab) = 1     Utility2(∅, ab) = 0
5. Utility1(a, ab) = 0     Utility2(a, ab) = 0
6. Utility1(b, ab) = 0     Utility2(b, ab) = 0
7. Utility1(ab, a) = -2    Utility2(ab, a) = 2
8. Utility1(ab, b) = -2    Utility2(ab, b) = 2
9. Utility1(ab, ab) = -2   Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
All nine deals: (a, b), (b, a), (ab, ∅), (∅, ab), (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab)
Individually rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
All nine deals: (a, b), (b, a), (ab, ∅), (∅, ab), (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
Note: (ab, ∅) has utilities (-2, 3); agent 1 does badly, but nothing beats 3 for agent 2, so no deal dominates it.
43
Negotiation Set
Individually rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
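The whole pipeline of the last few slides - enumerate the pure deals, keep the individually rational ones, keep the Pareto optimal ones, intersect - can be reproduced in a few lines of Python. This is a sketch (function names are mine) for the encounter (T1, T2) = ({a}, {a, b}):

```python
from itertools import combinations

# Enumerate the nine pure deals and recover the negotiation set.
COST = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}

def c(s):
    return COST[frozenset(s)]

def powerset(s):
    s = list(s)
    return [frozenset(x) for r in range(len(s) + 1)
            for x in combinations(s, r)]

T1, T2 = frozenset("a"), frozenset("ab")
all_tasks = T1 | T2

# Pure deals: (D1, D2) with D1 union D2 = T1 union T2 (all nine).
deals = [(d1, d2) for d1 in powerset(all_tasks)
         for d2 in powerset(all_tasks) if d1 | d2 == all_tasks]

def utilities(d):
    return (c(T1) - c(d[0]), c(T2) - c(d[1]))

# Individually rational: weakly dominates the conflict deal,
# i.e. both utilities are >= 0.
rational = [d for d in deals if min(utilities(d)) >= 0]

def dominated(d):
    u = utilities(d)
    return any(all(v >= x for v, x in zip(utilities(e), u)) and
               any(v > x for v, x in zip(utilities(e), u))
               for e in deals)

pareto = [d for d in deals if not dominated(d)]
ns = [d for d in rational if d in pareto]     # the negotiation set

for d1, d2 in ns:
    print("".join(sorted(d1)) or "-", "".join(sorted(d2)) or "-")
```

Running this recovers exactly the three deals listed above: (∅, ab), (a, b), and (b, a).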
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i (one axis) over the utility for j (the other)
• Only deals where both utilities are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
45
Negotiation Set in Task-oriented Domains
[Diagram: space of all possible deals plotted by utility for agent i vs. utility for agent j. The circle delimits the space of all possible deals; the utilities of the conflict deal for each agent mark the origin of the individually rational region, and the negotiation set (Pareto optimal + individually rational) lies on the upper-right boundary.]
46
Negotiation Protocol
π(δ) - the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: one-step protocol
– Concession protocol
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that:
– both deals are from the negotiation set
– ∀i and t > 0: Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t-1)) - I propose something less desirable for me
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t-1)) (no one concedes)
– Agreement: ∃j ≠ i: Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
• Only A => agree on δ(B, t): A agrees with B's proposal
• Only B => agree on δ(A, t): B agrees with A's proposal
• Both A and B => agree on δ(k, t) such that π(δ(k)) = max{π(δ(A)), π(δ(B))}
• Both A and B with π(δ(A)) = π(δ(B)) => flip a coin (the product is the same, but the deals may not be the same for each agent - flip a coin to decide which deal to use)
Applies to pure deals and mixed deals.
47
The Monotonic Concession Protocol - one direction: move toward the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal.
• Metadata: explanation or critique of the deal.
48
Condition to Consent an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient - no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy - a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (It is a way of telling others what you value.)
[Diagram: agent 1's best deal at one end, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
How much am I willing to risk a conflict?
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict compares what it would lose by conceding (accepting the other's offer) with what it would lose by causing a conflict, relative to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk is the willingness to risk conflict (1 is perfectly willing to risk).
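The risk formula above is straightforward to encode. A sketch (helper names are mine), including the convention that an agent whose own offer is worth 0 to it has nothing to lose:

```python
# Zeuthen risk per the formula above.
def risk(u_own, u_other_offer):
    """risk_i = (Utility_i(own offer) - Utility_i(opponent's offer))
               / Utility_i(own offer).

    An agent whose own offer is worth 0 to it loses nothing in a
    conflict, so it is perfectly willing to risk one (risk = 1).
    """
    if u_own == 0:
        return 1.0
    return (u_own - u_other_offer) / u_own

def who_concedes(r1, r2):
    """The agent with less to lose from conflict (lower risk) concedes."""
    if r1 == r2:
        return "both"
    return "agent 1" if r1 < r2 else "agent 2"

# Each agent's best deal in the parcel negotiation set gives the
# other agent 0, so both risks are 1 and both must concede.
print(risk(1, 0), risk(2, 0))                  # 1.0 1.0
print(who_concedes(risk(1, 0), risk(2, 0)))    # both
```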
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know what the other agent's utility is.
• What one sets as the initial goal affects risk. If I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Diagram: axis from Ai's best deal to Aj's best deal, with the conflict deal below; arrows mark the "maximum to gain from agreement" and the "maximum still hoped to gain" - how much am I willing to risk a conflict?]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) Not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple and reflects the way human negotiations work
– Stability - in Nash equilibrium: if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive - players need to compute the entire negotiation set
– Communication burden - the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent.
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: agent 1's best deal and agent 2's best deal, each agent pointing at the other: "he should concede".]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain Example 2 (don't return to distribution point)
[Diagram: distribution point with cities a and d at distance 7 on either side, and cities b and c one step (distance 1) beyond them]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; an uneven split like (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here: both concede on equal risk)
No. | Pure deal   | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)   | 0  | 10
2   | (abc, d)    | 1  | 3
3   | (ab, cd)    | 2  | 2
4   | (a, bcd)    | 3  | 1
5   | (∅, abcd)   | 10 | 0
    | Conflict deal | 0 | 0
Agent 1 concedes through deals 5, 4, 3; agent 2 concedes through deals 1, 2, 3.
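Under the simplifying assumption that a conceding agent moves one step along its ordered list of offers (rather than computing the exact minimal risk-shifting concession), the table above can be run through a small MCP/Zeuthen simulation:

```python
# Monotonic concession with the Zeuthen strategy on example 2.
# Deals and utilities are copied from the table above; the one-step
# concession rule is a simplification (my assumption).
deals = [("abcd", ""), ("abc", "d"), ("ab", "cd"), ("a", "bcd"), ("", "abcd")]
u1 = [0, 1, 2, 3, 10]    # agent 1's utility for deals 1..5
u2 = [10, 3, 2, 1, 0]    # agent 2's utility for deals 1..5

def risk(u_own, u_other_offer):
    # willingness to risk conflict; 1 when the agent has nothing to lose
    return 1.0 if u_own == 0 else (u_own - u_other_offer) / u_own

o1, o2 = 4, 0            # each agent opens with its own best deal
while True:
    # accept if the opponent's offer is at least as good as your own
    if u1[o2] >= u1[o1] or u2[o1] >= u2[o2]:
        agreed = o2 if u1[o2] >= u1[o1] else o1
        break
    r1 = risk(u1[o1], u1[o2])
    r2 = risk(u2[o2], u2[o1])
    if r1 <= r2:
        o1 -= 1          # agent 1 concedes one step
    if r2 <= r1:
        o2 += 1          # agent 2 concedes one step

print(deals[agreed])     # ('ab', 'cd'): both concede twice on equal risk
```

Both agents have risk 1 in round 1 and equal risk again in round 2, so both concede twice and meet at (ab, cd) with utilities (2, 2), as the slide claims.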
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than an unequal deal such as (0, 10), which has a higher total.
• Is there a better solution?
• Fairness versus higher global utility.
• Restrictions of this method (no promises for the future or sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects - an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents.
• Example - slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" - just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written as J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD, coordinated plans are never worse, as you can just do your original task.
• With an SOD you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 over a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
– Both can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g. both need a car)
– We can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but we may need to know how to pick up the other person)
– There might be a reachable state which satisfies both, but it may be too expensive - we are unwilling to expend the effort (i.e. we could save a bit if we carpooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1: a cost of 6 for one agent and 2 for the other.
• State 2: costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that one agent would get a certain plan.
• Called a mixed deal - a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k
where k is i's opponent - the role i plays with probability (1-p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Diagram: distribution point linked to city a and city b (distance 1 each); delivering to both in one trip costs 3]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
73
Consider deal 3 with probability
• The mixed deal on the partition (ab, ∅) means agent 1 does ∅ with probability p and ab with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p·1 = expected utility for agent 1
• (1-p)(3) + p·0 = expected utility for agent 2
• (1-p)(-2) + p·1 = (1-p)(3) + p·0
• -2 + 2p + p = 3 - 3p  =>  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
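The arithmetic above can be checked with exact rational arithmetic (a sketch; the function names are mine):

```python
from fractions import Fraction

# Exact check for the mixed deal on the partition (ab, empty).
def eu1(p):   # agent 1: utility 1 doing nothing, -2 doing ab
    return p * 1 + (1 - p) * (-2)

def eu2(p):   # agent 2: utility 0 doing ab, 3 doing nothing
    return p * 0 + (1 - p) * 3

p = Fraction(5, 6)        # from 3p - 2 = 3 - 3p
print(eu1(p), eu2(p))     # 1/2 1/2: equal expected utility
```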
74
Try again with other choice in negotiation set
• The mixed deal on (a, b) means agent 1 does a with probability p and b with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p·0 = utility for agent 1
• (1-p)(2) + p·2 = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T1 ∪ T2, ∅) : p], with p chosen so that the product of utilities is maximized over the negotiation set.
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias - like switching goals in soccer.
77
Examples: Cooperative - each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick up costs 1 and a set down costs 1.
• Mutually beneficial - each can pick up at the same time, costing each 2. A win - neither had to move the other block out of the way.
• If done by one agent, the cost would be four - so the utility to each is 2.
78
Examples: Compromise - both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) cost split p of the time.
• Utility for agent 1 = p(1) + (1-p)(-3) - it loses utility if it takes cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
4p - 3 = 4 - 4p  =>  p = 7/8
• Thus we can take an unfair division and make it fair.
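As with the earlier lottery, the value of p can be checked exactly (a sketch; the function names are mine):

```python
from fractions import Fraction

# Exact check of the worth-based lottery above: agent 1 values the
# goal at 3, agent 2 at 6; costs split (2, 6) with probability p.
def u1(p):    # agent 1: gains 1 (worth 3, cost 2) or loses 3 (worth 3, cost 6)
    return p * 1 + (1 - p) * (-3)

def u2(p):    # agent 2: gains 0 (worth 6, cost 6) or 4 (worth 6, cost 2)
    return p * 0 + (1 - p) * 4

p = Fraction(7, 8)        # from 4p - 3 = 4 - 4p
print(u1(p), u2(p))       # 1/2 1/2: equal in expectation
```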
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins - better than both losing. The weighting on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Examplesemi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example semi-cooperative cont
• Winning agent utility: 26 - 4 - 10 = 12.
• Losing agent utility: -4 (as it helped with the swap).
• So with probability 1/2 each: (1/2)(12) + (1/2)(-4) = 4.
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans - a joint plan is executed by several different agents.
• Note - not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple:
<E, Ag, J, c>
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective.
• Example - multi-agent tile world (like an airport shuttle): worth isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support - a multiple set of attributes.
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute:
– Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute:
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
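The weighted-attribute scheme can be sketched as follows. The 60/15/25 weights come from the slide; the normalization of each attribute score to [0, 1] (higher is better) and the sample offers are my assumptions:

```python
# Weighted-attribute utility for comparing offers.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    """scores: attribute -> value normalized to [0, 1], higher is better."""
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

# Two hypothetical offers: a cheap basic package vs. a pricey good one.
cheap_basic = {"price": 0.9, "quality": 0.4, "support": 0.2}
pricey_good = {"price": 0.3, "quality": 0.9, "support": 0.9}
print(round(utility(cheap_basic), 2))   # 0.65
print(round(utility(pricey_good), 2))   # 0.54
```

With price weighted so heavily, the cheap offer wins; a different weight vector can reverse the ranking.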
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost
• For finite X, Y ⊆ T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  – Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment
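The condition can be checked directly over a cost table. The two tables below are made-up illustrations of the delivery examples, not the slides' figures:

```python
# Check c(X u Y) <= c(X) + c(Y) for every pair of task sets in the table.
from itertools import combinations

def is_subadditive(cost):
    """cost maps frozensets of tasks to a cost; test all pairs of non-empty sets."""
    sets = [s for s in cost if s]
    return all(cost[x | y] <= cost[x] + cost[y]
               for x, y in combinations(sets, 2)
               if (x | y) in cost)

f = frozenset
# Tree arrangement: routes to a and b share a trunk, so doing both is cheap.
tree = {f(): 0, f("a"): 1, f("b"): 1, f("ab"): 1.5}
# Threshold effect: doing both pushes past capacity and costs extra equipment.
threshold = {f(): 0, f("a"): 1, f("b"): 1, f("ab"): 3}

print(is_subadditive(tree), is_subadditive(threshold))  # True False
```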
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
  • Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  • Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks, Z, is introduced:
  – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know these facts (diagonal arrows are between domains). The rule at the beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which is in turn a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies sub-additivity
• Modularity implies concavity
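Concavity can also be checked mechanically over all X ⊆ Y and all Z. The cost tables below are small illustrations, not the slide's postman graph:

```python
# Brute-force check of concavity: c(Y u Z) - c(Y) <= c(X u Z) - c(X)
# whenever X is a subset of Y, for every Z drawn from the same task set.
from itertools import combinations

def subsets(tasks):
    return [frozenset(c) for r in range(len(tasks) + 1)
            for c in combinations(tasks, r)]

def is_concave(cost, tasks):
    sets = subsets(tasks)
    return all(cost[y | z] - cost[y] <= cost[x | z] - cost[x]
               for x in sets for y in sets if x <= y
               for z in sets)

f = frozenset
# Cost = number of distinct tasks (modular-style): concave.
flat = {s: len(s) for s in subsets("ab")}
# c(ab) exceeds c(a) + c(b): adding b hurts more on top of a than alone.
spiky = {f(): 0, f("a"): 1, f("b"): 1, f("ab"): 3}

print(is_concave(flat, "ab"), is_concave(spiky, "ab"))  # True False
```

The second table is the same threshold-style cost that fails subadditivity, consistent with "concavity implies sub-additivity".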
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Ex: if A1 hides his letter to c, his utility doesn't increase
• If he tells the truth: p = 1/2; expected utility ⟨(abc) : 1/2⟩ = 5
• If he lies: p = 1/2 (as the declared utility is the same); expected utility (for 1) of ⟨(abc) : 1/2⟩ = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the lie)
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other person delivers the phantom, you are found out)
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 gets | U(1) | Agent 2 gets | U(2) seems | U(2) actual
a            | 2    | bc           | 4          | 4
b            | 4    | ac           | 2          | 2
bc           | 2    | a            | 4          | 2
ab           | 0    | c            | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so utility for agent 1 is ⟨(ab) : 1/2⟩ = 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage split. Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h as well)
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
  p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as:
  p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
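The two probabilities on this slide come from equating the other agent's expected utilities for the two all-or-nothing assignments; they can be re-derived with exact arithmetic:

```python
# Re-deriving the FP6 indifference probabilities with exact fractions.
from fractions import Fraction as F

def indifference_p(u_take, u_pass, v_take, v_pass):
    """Solve p*u_take + (1-p)*u_pass == p*v_take + (1-p)*v_pass for p."""
    return F(v_pass - u_pass, (u_take - v_take) + (v_pass - u_pass))

p_truth = indifference_p(-1, 6, 4, -3)   # truth case: 14p = 9
p_lie = indifference_p(-3, 6, 4, -5)     # decoy case: 18p = 11

agent1_util = lambda p: p * F(-1) + (1 - p) * F(6)
print(p_truth, agent1_util(p_truth))  # 9/14 3/2
print(p_lie, agent1_util(p_lie))      # 11/18 31/18  (~1.72, so the lie helped)
```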
107
Postmen – return to post office
[Figures: the concave case; the subadditive case (h is the decoy); the phantom case]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load appears to cost more than it really does)
• Ex 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util⟨(fa, ebcd) : 1/2⟩ = 4 (saves going to two)
• Under the lie, divide as ⟨(ef, dcab) : p⟩ (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie: ⟨(ab, cdef) : p⟩
  p(4) + (1−p)(0) = p(2) + (1−p)(6)
  4p = −4p + 6
  p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally): zero utility in that case
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
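The retrieve/select steps can be sketched as a nearest-case lookup. The case base, issue encoding, and overlap similarity below are all invented for illustration (PERSUADER's actual case representation is richer):

```python
# Toy retrieve/select step of case-based negotiation: pick the stored
# precedent whose issues overlap most with the current dispute.

def retrieve(cases, dispute):
    """Return the precedent with the largest issue overlap with the dispute."""
    return max(cases, key=lambda case: len(case["issues"] & dispute))

cases = [
    {"name": "wage-dispute-1994", "issues": {"wages", "overtime"},
     "solution": "3% raise, capped overtime"},
    {"name": "benefits-dispute-1991", "issues": {"healthcare", "pension"},
     "solution": "shared premium increase"},
]

best = retrieve(cases, {"wages", "overtime", "parking"})
print(best["solution"])  # a starting point to evaluate and then modify
```

The retrieved solution is only a seed: the evaluate/modify steps in the bullet list adapt it to the current case.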
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures: intentional reminding
  – The repair for a past failure is reused: reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent, rising from 0 to 100 across the day, with ticks at 9, 12, and 16]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's utility curves converge over time to the point of acceptance]
123
Utility Graphs - no agreement
• No agreement
• Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's utility curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  • Positions cannot be justified – why did the agent pay so much for the car?
  • Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
  1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
  2. Emotional – "How would you feel if it happened to you?"
  3. Visceral – a participant stamps their feet and shows the strength of their feelings
  4. Kisceral – appeals to the intuitive – "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B = A undercuts B, or A rebuts B
  – A defeats B = A undercuts B, or (A rebuts B and not B undercuts A)
  – A strongly attacks B = A attacks B and not B undercuts A
  – A strongly undercuts B = A undercuts B and not B undercuts A
129
Proposition: hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments ndash mutually defensible
1. A set of arguments attacks x if some member y of the set attacks x (y → x)
2. Argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true? c is always attacked;
d is always acceptable
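These definitions can be checked by brute force. The attack relation below is a stand-in (the slide's graph is not reproduced), chosen so that c is always attacked and d is always acceptable:

```python
# Enumerate all admissible sets of a small abstract argument system.
from itertools import chain, combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("d", "c")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # every attacker of x must itself be attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

all_subsets = chain.from_iterable(combinations(sorted(args), r)
                                  for r in range(len(args) + 1))
adm = [set(s) for s in all_subsets if admissible(set(s))]
print(adm)  # c never appears; d appears alone and alongside a or b
```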
133
An Example Abstract Argument System
10
Typical Competition Mechanisms
• Auction: allocate goods or tasks to agents through a market. Need a richer technique for reaching agreements
• Negotiation: reach agreements through interaction
• Argumentation: resolve conflicts through debate
11
Negotiation
• May involve:
  – Exchange of information
  – Relaxation of initial goals
  – Mutual concession
12
Mechanisms Protocols Strategies
• Negotiation is governed by a mechanism or a protocol
  – defines the "rules of encounter" between the agents
  – the public rules by which the agents will come to agreements
• Given a particular protocol, how can a particular strategy be designed that individual agents can use?
13
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Negotiation Mechanism
Issues in the negotiation process:
• Negotiation space: all possible deals that agents can make, i.e., the set of candidate deals
• Negotiation protocol – a rule that determines the process of a negotiation: how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on
• Negotiation strategy: when and what proposals should be made
14
Protocol
• Determines the kinds of deals that can be made
• Determines the sequence of offers and counter-offers
• A protocol is like the rules of a chess game, whereas a strategy is the way in which a player decides which move to make
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
  – Convergence/guaranteed success
  – Maximizing global welfare: the sum of agent benefits is maximized
  – Pareto efficiency
  – Individual rationality
  – Stability: no agent should have an incentive to deviate from its strategy
  – Simplicity: low computational demands, little communication
  – Distribution: no central decision maker
  – Symmetry: we do not want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs between choices: for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Figure: Agent i and Agent j exchange a proposal and a counter-proposal until Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (we can talk about 50/50 or 90/10, depending on who "moves" the farthest):
[Figure: proposals by Ai and proposals by Aj converging to a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple-objective linear programming
  – requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggested value; continue until no joint improvements are possible
  – Used in the Camp David peace negotiations (Egypt/Israel – Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• The mediator helps the players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where the k issues define a direction in k-space)
• The mediator helps the players find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  – Multiple attributes
    • Single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes – several inter-related attributes, e.g., buying a car
  – The number of agents and the way they interact
    • One-to-one, e.g., a single buyer and a single seller
    • Many-to-one, e.g., multiple buyers and a single seller: auctions
    • Many-to-many, e.g., multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
  – If one task requires less travel, both would benefit equally by having less travel
  – The utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
  – If you delivered the picnic table, you could just throw it in the back of your van. If I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job... ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One-to-one
• One-to-many (an auction is an example of one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
  – n(n−1)/2 number of pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain: Definition
• How can an agent evaluate the utility of a specific deal?
  – Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
  – Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo): if agents fail to reach an agreement
  – where no agent agrees to execute tasks other than its own
  – utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
  – T is a finite set of all possible tasks
  – Ag = {A1, A2, ..., An} is a list of participant agents
  – c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal
• For each deal δ = (D1, D2), the cost of such a deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery:
  Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries:
  Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure deals knowing union must be ab
• Choices for the first agent: a, b, ab, ∅
• The second agent must "pick up the slack":
  • a for agent 1 → b | ab (for agent 2)
  • b for agent 1 → a | ab
  • ab for agent 1 → ∅ | a | ab | b
  • ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal,
  – c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
  – Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
[Figure: a distribution point at distance 1 from city a and distance 1 from city b; cities a and b are distance 2 apart]
Cost function:
  c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
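The utilities on this slide follow directly from Utilityk(δ) = c(Tk) − c(Dk). Re-deriving a few of them with the slide's cost function:

```python
# Worked check of the parcel-delivery utilities: stand-alone cost minus
# the cost of the agent's part of the deal.
f = frozenset
cost = {f(): 0, f("a"): 1, f("b"): 1, f("ab"): 3}   # slide's cost function
original = (f("a"), f("ab"))                        # T1 = {a}, T2 = {a, b}

def utility(k, deal):
    return cost[original[k]] - cost[deal[k]]

print(utility(0, (f("a"), f("b"))))   # 0   Utility1(a, b)
print(utility(1, (f("a"), f("b"))))   # 2   Utility2(a, b)
print(utility(0, (f("ab"), f())))     # -2  Utility1(ab, empty)
print(utility(1, (f("ab"), f())))     # 3   Utility2(ab, empty)
```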
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
  – δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}: Utilityk(δ) ≥ Utilityk(δ′)
  – δ is better for some agent than δ′: ∃k ∈ {1, 2}: Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
Agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
Agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individual rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
Deal 3, (ab, ∅), gives (−2, 3): agent 1 does badly, but nothing beats 3 for agent 2, so no deal dominates it.
43
Negotiation Set
Negotiation set:
  (a, b)
  (b, a)
  (∅, ab)
Individual rational deals:
  (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals:
  (a, b), (b, a), (ab, ∅), (∅, ab)
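The chain above (utilities → individually rational → Pareto optimal → negotiation set) can be recomputed in a few lines from the cost function:

```python
# Enumerate the pure deals of the parcel encounter, then intersect the
# individually rational deals with the Pareto optimal ones.
from itertools import product

f = frozenset
cost = {f(): 0, f("a"): 1, f("b"): 1, f("ab"): 3}
T = (f("a"), f("ab"))                  # the encounter (T1, T2)
parts = [f(), f("a"), f("b"), f("ab")]
deals = [d for d in product(parts, parts) if d[0] | d[1] == T[0] | T[1]]

def utils(deal):
    return tuple(cost[T[k]] - cost[deal[k]] for k in (0, 1))

def dominates(u, v):
    return u != v and all(a >= b for a, b in zip(u, v))

rational = [d for d in deals if all(u >= 0 for u in utils(d))]
pareto = [d for d in deals
          if not any(dominates(utils(e), utils(d)) for e in deals)]
negotiation_set = [d for d in rational if d in pareto]

for d in negotiation_set:              # (a, b), (b, a), (empty, ab)
    print(sorted(d[0]), sorted(d[1]), utils(d))
```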
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only deals where both utilities are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
[Figure: scatter plot with axes "Utility for i" and "Utility for j"]
45
Negotiation Set in Task-oriented Domains
[Figure: the deal space plotted as utility for agent i vs. utility for agent j. The circle delimits the space of all possible deals; the conflict deal sits at the pair of conflict-deal utilities; points A–E mark example deals. The negotiation set (Pareto optimal + individual rational) is the boundary arc that dominates the conflict deal]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
• Concession protocol:
  – At t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that both deals are from the negotiation set, and for each agent i and t > 0: Utilityi(δ(i, t)) ≤ Utilityi(δ(i, t−1)) – each round I propose something less desirable for me
  – Negotiation ending:
    – Conflict: Utilityi(δ(i, t)) = Utilityi(δ(i, t−1)) for both agents
    – Agreement: for some j ≠ i, Utilityj(δ(i, t)) ≥ Utilityj(δ(j, t))
      • Only A ⇒ agree on δ(B, t): A agrees with B's proposal
      • Only B ⇒ agree on δ(A, t): B agrees with A's proposal
      • Both A and B ⇒ agree on the δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
      • Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same but the split may not be the same for each agent – flip a coin to decide which deal to use)
Pure deals
Mixed deal
47
The Monotonic Concession Protocol – moves in one direction, towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same one later).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal.
• Metadata: an explanation or critique may accompany a deal.
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages:
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
(Figure: agent 1's best deal at one end of the negotiation set, agent 2's best deal at the other.)
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
How much am I willing to risk a conflict?
(Figure: agent 1's best deal vs. agent 2's best deal, each asking "how much am I willing to risk a conflict?".)
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero).
– In case conflict occurs, you are not much worse off.
– So you are more willing to risk conflict.
An agent's willingness to risk conflict is the ratio between what it would lose by conceding (accepting the other's offer) and what it would lose by causing a conflict, measured against its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer.
• How much you will lose if you stand still, which causes a conflict.
risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agents i and j, respectively.
risk_i is the willingness to risk conflict (1 is perfectly willing to risk).
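The risk formula and the concession rule can be turned into a small helper (a sketch; the numeric offers used in the examples are hypothetical):

```python
def risk(u_own, u_other):
    # Zeuthen risk: the fraction of its current utility an agent would
    # lose by conceding; 1 means perfectly willing to risk conflict.
    if u_own <= 0:
        return 1.0
    return (u_own - u_other) / u_own

def who_concedes(u1_own, u1_other, u2_own, u2_other):
    # The agent with the smaller risk has more to lose in a conflict
    # and should concede; on a tie, both concede.
    r1 = risk(u1_own, u1_other)
    r2 = risk(u2_own, u2_other)
    if r1 == r2:
        return "both"
    return "agent 1" if r1 < r2 else "agent 2"
```

For example, an agent whose own offer is worth 5 and whose opponent's offer is worth 2 to it carries risk (5 - 2)/5 = 0.6.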
55
Risk Evaluation
• risk_i measures the fraction you have left to gain. If it is close to one, you have gained little so far (and are more willing to risk conflict).
• This assumes you know the other agent's utility.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Figure: negotiation axis from Ai's best deal to Aj's best deal, with the conflict deal below; "how much am I willing to risk a conflict", "maximum to gain from agreement", and "maximum still hoped to gain" marked as intervals.)
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple, and reflects the way human negotiations work.
  – Stability – in Nash equilibrium – if one agent is using the strategy, then the other can do no better than use it too.
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set.
  – Communication burden – the negotiation process may involve several steps.
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation Set:
(a, b)
(b, a)
(∅, ab)
First offers:
Agent 1: (∅, ab)
Agent 2: (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for both agents.
Can they reach an agreement? Who will concede?
60
Conflict Deal
(Figure: agent 1's best deal and agent 2's best deal, each labelled "he should concede".)
Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(Figure: distribution point with nodes a and d at distance 7, and nodes b and c below at distances 1, 1, 1.)
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal. A deal such as (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No. | Pure Deal | Agent 1's Utility | Agent 2's Utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
– | Conflict deal | 0 | 0

(Figure: deals 5, 4, 3, 2, 1 arranged on a line between agent 1 and agent 2.)
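The utility table can be checked mechanically by recomputing it from the cost function (a sketch using the slide's costs; the empty string stands for the empty role):

```python
# Cost function from the example (no return to the distribution point);
# keys are sets of delivery nodes, written as sorted strings.
c = {"": 0, "a": 7, "d": 7, "b": 8, "c": 8, "ab": 8, "cd": 8,
     "bc": 9, "abc": 9, "bcd": 9,
     "ad": 10, "abd": 10, "acd": 10, "abcd": 10}

def utility(role):
    # Each agent's stand-alone cost is c("abcd") = 10; the utility of a
    # deal is that stand-alone cost minus the cost of the agent's role.
    return c["abcd"] - c[role]

# Pure deals from the negotiation set: (agent 1's role, agent 2's role).
deals = [("abcd", ""), ("abc", "d"), ("ab", "cd"), ("a", "bcd"), ("", "abcd")]
table = [(utility(r1), utility(r2)) for r1, r2 in deals]
```

The computed table matches the one above: (0, 10), (1, 3), (2, 2), (3, 1), (10, 0).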
63
What bothers you about the previous agreement?
• The agents decide to both get (2, 2) utility rather than the (0, 10) utility of another choice.
• Is there a solution?
• Fairness versus higher global utility.
• Restrictions of this method (no promises for the future, no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, in the sense that if one agent is using the strategy, the other can do no better than use it too.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c_k(J) is the cost of k's role in joint plan J.
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD coordinated plans are never worse, as you can just do your original task.
• With SOD you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all can perform tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
  – Goals may contradict, so there is no mutually acceptable state (e.g., both need a car).
  – A common state exists but perhaps cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up another person).
  – There might be a reachable state which satisfies both, but it is too expensive and the agents are unwilling to expend the effort (i.e., we could save a bit by carpooling, but it is too complicated for so little gain).
69
What if choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain role in the plan.
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
  cost_i(δ) = p·c_i(J) + (1-p)·c_k(J), where k is i's opponent – the role i plays with probability (1-p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• Postman example follows.
72
Parcel Delivery Domain (assuming they do not have to return home)
(Figure: distribution point with city a and city b, each at distance 1 from the point and distance 2 from each other.)
Cost function:
c(∅) = 0
c(a) = 1
c(b) = 1
c(ab) = 3
Utility for agent 1 (originally responsible for a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally responsible for a and b):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
73
Consider deal 3 with probability
• [(∅, ab) : p] means agent 1 does ∅ with probability p and ab with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p  ⇒  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
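The algebra above generalises: for a mixed deal over two role assignments, the fair p has a closed form (a sketch; `fair_probability` is a made-up helper name):

```python
from fractions import Fraction

def fair_probability(u1_A, u1_B, u2_A, u2_B):
    # Solve p*u1_A + (1-p)*u1_B = p*u2_A + (1-p)*u2_B for p, where
    # A and B are the two possible role assignments of the mixed deal.
    return Fraction(u2_B - u1_B, (u1_A - u1_B) - (u2_A - u2_B))

# Assignment A: agent 1 does nothing (utilities 1 and 0);
# assignment B: agent 1 does both deliveries (utilities -2 and 3).
p = fair_probability(1, -2, 0, 3)
```

With the example's utilities this yields p = 5/6, and both agents' expected utilities come out equal.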
74
Try again with another choice in the negotiation set
• [(a, b) : p] means agent 1 does a with probability p and b with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair? (Each agent's utility is the same under either assignment, so the probability has no effect.)
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅) : p] chosen so that the product of the agents' utilities, π(m), is maximal over all deals.
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one agent alone is two; the cost for both together averages four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other agent's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we can use probability to make it "fair".
• Assign the (2, 6) cost split (agent 1 takes the easy role) p of the time.
• Utility for agent 1 = p(1) + (1-p)(-3) – it loses utility if it pays cost 6 for a benefit of 3.
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
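The same equalising computation reproduces the 7/8 result (a sketch; the worths and role costs come from the example above):

```python
from fractions import Fraction

# Worths: 3 for agent 1, 6 for agent 2; role costs 2 (easy) and 6 (hard).
# With probability p, agent 1 gets the easy role.
u1_easy, u1_hard = 3 - 2, 3 - 6        # agent 1's utility: 1 or -3
u2_easy, u2_hard = 6 - 2, 6 - 6        # agent 2's utility: 4 or 0

# Equalise p*u1_easy + (1-p)*u1_hard = p*u2_hard + (1-p)*u2_easy
# (agent 2 gets the hard role exactly when agent 1 gets the easy one).
p = Fraction(u2_easy - u1_hard,
             (u1_easy - u1_hard) - (u2_hard - u2_easy))
```

Solving gives p = 7/8, at which point both agents expect utility 1/2.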
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of the two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, pulled out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, continued
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as it helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(-4) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing" – but how close you got to the goal.
85
Worth-oriented Domain: Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle) – worth isn't just a specific state but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimality: find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
  – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  – Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum.
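A minimal sketch of the weighted-attribute approach (the offers and the normalisation to [0, 1] are invented for illustration; only the 60/15/25 weights come from the slide):

```python
# Hypothetical software offers: (price, quality 0-10, support 1-5).
offers = {"A": (40, 8, 3), "B": (80, 9, 5), "C": (90, 4, 2)}

def weighted_utility(price, quality, support):
    # 60/15/25 weights; each attribute is normalised to [0, 1].
    # Price is a cost, so cheaper is better.
    cheapness = (100 - price) / 100
    return 0.60 * cheapness + 0.15 * quality / 10 + 0.25 * support / 5

best = max(offers, key=lambda name: weighted_utility(*offers[name]))
```

Under these made-up numbers, offer A's moderate price outweighs B's better quality and support.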
89
Incomplete Information
• Agents don't know the tasks of others in TOD.
• Solution:
  – Exchange missing information
  – Penalty for lying
• Possible lies:
  – False information:
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union of two task sets is at most the sum of the costs of the separate sets:
  for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – Delivering to one location saves distance to the other (in a tree arrangement).
• Example of subadditive TOD with equality (= rather than <):
  – Deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
91
Decoy tasks
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  • Need to pick something up at the store (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip).
  • Need to deliver an empty letter (no good, but the deliverer won't discover the lie).
92
Incentive Compatible Mechanisms
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• Take two task sets X and Y where X is a subset of Y, and introduce another task set Z. Then:
  c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of the Previous Chart
• Arrows show the reasons we know each fact (diagonal arrows go between domains). The start of each rule is a fixed point.
• For example, what is true of a phantom task may be true of a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true of a mixed deal may be true of an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others.
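The three domain properties can be checked mechanically for a small cost function (a sketch; the toy costs are invented — two deliveries that share part of a route):

```python
from itertools import combinations

def properties(tasks, c):
    # Check subadditivity, concavity, and modularity of cost function c,
    # given as a dict from frozensets of tasks to costs.
    subs = [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]
    sub = all(c[x | y] <= c[x] + c[y] for x in subs for y in subs)
    conc = all(c[x | z] - c[x] >= c[y | z] - c[y]
               for x in subs for y in subs if x <= y for z in subs)
    mod = all(c[x | y] == c[x] + c[y] - c[x & y] for x in subs for y in subs)
    return sub, conc, mod

# Two deliveries sharing the first leg of the route: each alone costs 2,
# together they cost 3 - subadditive and concave, but not modular.
c = {frozenset(): 0, frozenset("a"): 2, frozenset("b"): 2,
     frozenset("ab"): 3}
```

On this toy cost function the checks confirm the containment: it is subadditive and concave but not modular.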
97
For the subadditive domain:
(Figure: the truth/lie table for the subadditive domain.)
98
Attributes of a task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No – unless restricted to trees.)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the fax domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterizations: relationships implied between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (letters; return home)
FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2; expected utility of [(abc, ∅) : 1/2] = .5
• Lie: p = 1/2 (as the apparent utility is the same); expected utility (for agent 1) of [(abc, ∅) : 1/2] = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter).
(Figure: delivery network with labelled edge costs.)
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(We assume some way of deciding who gets the better deal that is fair over time.)

Agent 1's role | U(1) | Agent 2's role | U(2) (apparent) | U(2) (actual)
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of [(a, b) : 1/2] is 1/2(4) + 1/2(2) = 3.
• Lie: (bc, a) is the logical division, with no probability needed.
• Utility for agent 1 is 6 (original cost) - 2 (cost under the deal) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) – the lie changes the probability ("if you deliver, I make you deliver to h as well").
• Example 2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72. (If A1 is the one who delivers, he doesn't actually deliver h.)
• If he tells the truth: p (the probability of agent 1 delivering everything) = 9/14, as
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3)  ⇒  14p = 9
• If he invents task h: p = 11/18, as
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
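The truth/lie comparison can be reproduced numerically (a sketch; `equalising_p` is a made-up helper solving the equalising equation above):

```python
from fractions import Fraction

def equalising_p(u1_deliver, u1_free, u2_free, u2_deliver):
    # Solve p*u1_deliver + (1-p)*u1_free = p*u2_free + (1-p)*u2_deliver:
    # with probability p agent 1 delivers everything, else agent 2 does.
    return Fraction(u2_deliver - u1_free,
                    (u1_deliver - u1_free) - (u2_free - u2_deliver))

p_truth = equalising_p(-1, 6, 4, -3)    # telling the truth
p_lie = equalising_p(-3, 6, 4, -5)      # with the decoy letter to h

def real_utility(p):
    # Agent 1's real utility always uses the true payoffs (-1 and 6).
    return p * (-1) + (1 - p) * 6

u_truth, u_lie = real_utility(p_truth), real_utility(p_lie)
```

The lie shifts the probability in agent 1's favour: 31/18 ≈ 1.72 versus 3/2 = 1.5.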
107
Postmen examples – return to the post office
(Figures: a concave example, a subadditive example where h is the decoy, and a phantom example.)
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (you think I have less, so an increased load appears to cost more than it really does).
• Example 3 (next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util([(fae, bcd) : 1/2]) = 4 (each is saved from going to two nodes).
• Under the lie, divide as [(efd, cab) : p]: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with [(ab, cdef) : p]:
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6
  p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
110
(Figure: modular postmen example.)
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions).
• Process:
  – Retrieve appropriate precedent cases from memory.
  – Select the most appropriate case.
  – Construct an appropriate solution.
  – Evaluate the solution for applicability to the current case.
  – Modify the solution appropriately.
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  – Minimizes the need for information exchange.
  – Avoids problems by reasoning from past failures (intentional reminding).
  – Repairs for past failures are reused, reducing computation.
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Derives an overall utility curve from the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
  – Maximizes the joint payoff
  – Minimizes the payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument.
• Increasing payoff:
  – Change the importance attached to an issue.
  – Change the utility value of an issue.
119
Narrowing differences
• Get feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increase the payoff of the rejecting party by a greater amount than the reduction in payoff for the agreeing parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory condition.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
(Figure: worth function for the first agent, rising from 0 to 100 across the hours 9, 12, 16.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
(Figure: utility vs. number of negotiation rounds; agent i's and agent j's curves converge at the point of acceptance.)
123
Utility Graphs – no agreement
• No agreement: agent j finds the offer unacceptable.
(Figure: utility vs. number of negotiation rounds; agent i's and agent j's curves never meet.)
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings.
4. Kisceral – appeals to the intuitive – "doesn't this seem reasonable?"
126
Logic-Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
where:
  Database is a (possibly inconsistent) set of logical formulae;
  Sentence is a logical formula known as the conclusion;
  Grounds is a set of logical formulae such that:
    Grounds ⊆ Database, and
    Sentence can be proved from Grounds.
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B ≡ A undercuts B, or A rebuts B
  – A defeats B ≡ A undercuts B, or (A rebuts B and B does not undercut A)
  – A strongly attacks B ≡ A attacks B and B does not undercut A
  – A strongly undercuts B ≡ A undercuts B and B does not undercut A
129
Proposition: Hierarchy of attacks (u = undercuts, r = rebuts, ⁻¹ = inverse)
  Undercuts = u
  Strongly undercuts = su = u - u⁻¹
  Strongly attacks = sa = (u ∪ r) - u⁻¹
  Defeats = d = u ∪ (r - u⁻¹)
  Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of an argument system (rather than the internals of individual arguments).
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: an argument is true unless proven false.
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x).
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
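These definitions are easy to prototype (a sketch; the attack graph is hypothetical, chosen so that one argument can never be defended while another, unattacked one is always acceptable):

```python
# Hypothetical attack graph over four arguments: a -> b, b -> c, d -> c.
attacks = {("a", "b"), ("b", "c"), ("d", "c")}

def attackers(x):
    return {y for (y, z) in attacks if z == x}

def conflict_free(S):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    # Every attacker of x must itself be attacked by some member of S.
    return all(attackers(y) & S for y in attackers(x))

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)
```

Here {a, d} is admissible, but no admissible set can contain c: its attacker d has no attackers, so c can never be defended.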
132
(Figure: an attack graph over arguments a, b, c, d.)
Which sets of arguments can be true? Here c is always attacked, while d is always acceptable.
133
An Example Abstract Argument System
11
Negotiation
• May involve
– Exchange of information
– Relaxation of initial goals
– Mutual concession
12
Mechanisms Protocols Strategies
• Negotiation is governed by a mechanism or a protocol
– defines the "rules of encounter" between the agents
– the public rules by which the agents will come to agreements
• Given a particular protocol, how can a particular strategy be designed that individual agents can use?
13
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Negotiation Mechanism
Issues in the negotiation process:
• Negotiation Space: all possible deals that agents can make, i.e. the set of candidate deals
• Negotiation Protocol: a rule that determines the process of a negotiation – how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on
• Negotiation Strategy: when and what proposals should be made
14
Protocol
• Determines the kinds of deals that can be made
• Determines the sequence of offers and counter-offers
• The protocol is like the rules of a chess game, whereas the strategy is the way in which a player decides which move to make
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
– Convergence/guaranteed success
– Maximizing global welfare: the sum of agent benefits is maximized
– Pareto efficiency
– Individual rationality
– Stability: no agent should have an incentive to deviate from its strategy
– Simplicity: low computational demands, little communication
– Distribution: no central decision maker
– Symmetry: we do not want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs of choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation (between Agent i and Agent j): proposal, counter-proposal, … until one agent concedes
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about 50/50 or 90/10, depending on who "moves" the farthest): proposals by Ai and proposals by Aj converge toward a point of acceptance/agreement
22
Many types of interactive concession-based methods
• Some use multiple-objective linear programming
– requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggested value; continue until no joint improvements are possible
– Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes
• Single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
• Multiple attributes – several inter-related attributes, e.g. buying a car
– The number of agents and the way they interact
• One-to-one, e.g. a single buyer and a single seller
• Many-to-one, e.g. multiple buyers and a single seller: auctions
• Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
– If one task requires less travel, both would benefit equally by having less travel
– utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
– if you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down and quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation ("I didn't realize not having any test was an option") (on the job: ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
– n(n-1)/2 pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
– utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple <T, Ag, c> where
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
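As a minimal sketch of this formalization (using the parcel-delivery cost values that appear later in the deck), a TOD cost function can be a table over subsets of tasks, and assumptions 1 and 3 can be checked directly:

```python
# Parcel delivery TOD: tasks are deliveries to cities 'a' and 'b'
# (cost values taken from the later parcel-delivery slide)
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}

def is_monotonic(cost):
    # assumption 3: c(T1) <= c(T2) whenever T1 is a subset of T2
    return all(cost[s1] <= cost[s2]
               for s1 in cost for s2 in cost if s1 <= s2)

print(cost[frozenset()], is_monotonic(cost))  # 0 True
```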
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment, and D is the assignment after the "deal"
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents: (D1, D2) such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery:
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries:
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b})
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b}) – the conflict deal
35
Figure out the deals, knowing the union must be {a,b}
• Choices for the first agent: ∅, {a}, {b}, {a,b}
• The second agent must "pick up the slack"
• {a} for agent 1 → {b} or {a,b} for agent 2
• {b} for agent 1 → {a} or {a,b}
• {a,b} for agent 1 → ∅, {a}, {b}, or {a,b}
• ∅ for agent 1 → {a,b}
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, and is defined as follows:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
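A quick sketch of this utility definition, using the encounter from the slides (T1 = {a}, T2 = {a,b}, with cost function c(∅)=0, c(a)=c(b)=1, c(ab)=3):

```python
# Parcel delivery encounter from the slides: agent 1 must deliver to {a},
# agent 2 to {a,b}; c(∅)=0, c(a)=c(b)=1, c(ab)=3.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
T1, T2 = frozenset("a"), frozenset("ab")

def utility(deal):
    # Utility_k(deal) = c(T_k) - c(D_k): stand-alone cost minus deal cost
    D1, D2 = deal
    return (cost[T1] - cost[D1], cost[T2] - cost[D2])

print(utility((frozenset(), frozenset("ab"))))  # (1, 0)
print(utility((T1, T2)))                        # conflict deal -> (0, 0)
```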
37
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
(Figure: distribution point at distance 1 from city a and 1 from city b; a and b are distance 2 apart)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally {a}):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally {a,b}):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
δ is at least as good for every agent as δ':
∀k∈{1,2}: Utilityk(δ) ≥ Utilityk(δ')
δ is better for some agent than δ':
∃k∈{1,2}: Utilityk(δ) > Utilityk(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'
39
Negotiation Set Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS)
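The negotiation set for the nine-deal encounter can be computed by filtering: keep deals that weakly dominate the conflict deal and that no other deal dominates. A sketch using the slides' numbers:

```python
from itertools import product

# Encounter from the slides: T1={a}, T2={a,b}; c(∅)=0, c(a)=c(b)=1, c(ab)=3
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
T1, T2 = frozenset("a"), frozenset("ab")
parts = [frozenset(), frozenset("a"), frozenset("b"), frozenset("ab")]
# the nine pure deals: every (D1, D2) whose union covers all tasks
deals = [(d1, d2) for d1, d2 in product(parts, parts) if d1 | d2 == T1 | T2]

def util(d):
    return (cost[T1] - cost[d[0]], cost[T2] - cost[d[1]])

def dominates(d, e):
    u, v = util(d), util(e)
    return u[0] >= v[0] and u[1] >= v[1] and (u[0] > v[0] or u[1] > v[1])

rational = [d for d in deals if util(d)[0] >= 0 and util(d)[1] >= 0]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in rational if d in pareto]
for d in ns:
    print(sorted(d[0]), sorted(d[1]), util(d))
```

This reproduces the slides' result: the negotiation set is ({a}, {b}), ({b}, {a}), and (∅, {a,b}).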
40
Utility Function for Agents (example from the previous slide)
1. Utility1({a}, {b}) = 0        Utility2({a}, {b}) = 2
2. Utility1({b}, {a}) = 0        Utility2({b}, {a}) = 2
3. Utility1({a,b}, ∅) = -2       Utility2({a,b}, ∅) = 3
4. Utility1(∅, {a,b}) = 1        Utility2(∅, {a,b}) = 0
5. Utility1({a}, {a,b}) = 0      Utility2({a}, {a,b}) = 0
6. Utility1({b}, {a,b}) = 0      Utility2({b}, {a,b}) = 0
7. Utility1({a,b}, {a}) = -2     Utility2({a,b}, {a}) = 2
8. Utility1({a,b}, {b}) = -2     Utility2({a,b}, {b}) = 2
9. Utility1({a,b}, {a,b}) = -2   Utility2({a,b}, {a,b}) = 0
41
Individually Rational for Both (eliminate any choices that are negative for either)
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b})
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
Individually rational: ({a}, {b}), ({b}, {a}), (∅, {a,b}), ({a}, {a,b}), ({b}, {a,b})
42
Pareto Optimal Deals
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b})
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
Pareto optimal: ({a}, {b}), ({b}, {a}), ({a,b}, ∅), (∅, {a,b})
Deals 5 and 6 are beaten by the (∅, {a,b}) deal; deal 3 is (-2, 3), but nothing beats 3 for agent 2
43
Negotiation Set
Individually rational deals: ({a}, {b}), ({b}, {a}), (∅, {a,b}), ({a}, {a,b}), ({b}, {a,b})
Pareto optimal deals: ({a}, {b}), ({b}, {a}), ({a,b}, ∅), (∅, {a,b})
Negotiation set (the intersection): ({a}, {b}), ({b}, {a}), (∅, {a,b})
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
(Axes: utility for i vs. utility for j)
45
Negotiation Set in Task-oriented Domains
(Figure: axes are utility for agent i and utility for agent j; the circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict-deal utilities; the negotiation set is the region that is both Pareto optimal and individually rational)
46
Negotiation Protocol: π(δ) – the product of the two agents' utilities from deal δ
• Product-maximizing negotiation protocol: one-step protocol
– Concession protocol
• At t ≥ 0, A offers δ(A,t) and B offers δ(B,t) such that
– both deals are from the negotiation set
– ∀i, t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t-1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t-1)) (no one concedes)
– Agreement: ∃j ≠ i: Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t))
• Only A ⇒ agree to δ(B,t): A agrees with B's proposal
• Only B ⇒ agree to δ(A,t): B agrees with A's proposal
• Both A and B ⇒ agree to the δ(k,t) such that π(δ(k)) = max{π(δ(A)), π(δ(B))}
• Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but it may not be the same for each agent – flip a coin to decide which deal to use)
(Applies to pure deals and mixed deals)
47
The Monotonic Concession Protocol – one direction: move towards the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (Assumes we know what the other agent values.)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Metadata: explanation or critique of a deal
48
Condition to Consent to an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (it is a way of telling others what you value)
(Figure: agent 1's best deal at one end, agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round (though it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession
How much am I willing to risk a conflict?
(Figure: agent 1's best deal at one end, agent 2's best deal at the other)
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost your expected utility (closer to zero)
– In case conflict occurs, you are not much worse off
– You are more willing to risk conflict
An agent's willingness to risk conflict compares its loss in making a concession against its loss in taking the conflict deal, with respect to its current offer
• If both are equally willing to risk, both concede
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) /
        (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
riski = [Utilityi(δi) − Utilityi(δj)] / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j, respectively
riski is the willingness to risk conflict (1 is perfectly willing to risk)
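The risk formula can be computed directly. In this sketch, `util_i` and the offer labels `offer1`/`offer2` are illustrative stand-ins; the numbers reproduce the parcel-delivery example later in the deck, where both agents' initial risks are 1:

```python
# Hedged sketch of the Zeuthen risk computation: delta_i, delta_j are the
# agents' current offers, util_i is agent i's utility function.
def risk(util_i, delta_i, delta_j):
    # fraction of its current gain that agent i would sacrifice by accepting
    # agent j's offer; by convention risk = 1 when util_i(delta_i) == 0
    if util_i(delta_i) == 0:
        return 1.0
    return (util_i(delta_i) - util_i(delta_j)) / util_i(delta_i)

# Agent 1 offers "offer1" worth 1 to itself and 0 to agent 2;
# agent 2 offers "offer2" worth 0 to agent 1 and 2 to itself.
u1 = {"offer1": 1, "offer2": 0}   # agent 1's utility for each offer
u2 = {"offer1": 0, "offer2": 2}   # agent 2's utility for each offer
print(risk(u1.get, "offer1", "offer2"))  # 1.0
print(risk(u2.get, "offer2", "offer1"))  # 1.0
```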
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk)
• This assumes you know what the other's utility is
• What one sets as the initial goal affects risk. If I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point
(Figure: a line from Ai's best deal to Aj's best deal, showing the conflict deal, the maximum to gain from agreement, and the maximum each still hopes to gain)
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) Not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
– Simple and reflects the way human negotiations work
– Stability – in Nash equilibrium – if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain: recall agent 1 delivers to a; agent 2 delivers to a and b
Negotiation set: ({a}, {b}), ({b}, {a}), (∅, {a,b})
First offers: agent 1 proposes (∅, {a,b}); agent 2 proposes ({a}, {b})
Utility of agent 1: Utility1({a}, {b}) = 0; Utility1({b}, {a}) = 0; Utility1(∅, {a,b}) = 1
Utility of agent 2: Utility2({a}, {b}) = 2; Utility2({b}, {a}) = 2; Utility2(∅, {a,b}) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
(Figure: at agent 1's best deal, agent 2 should concede; at agent 2's best deal, agent 1 should concede)
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(Figure: distribution point at distance 7 from a and 7 from d; b and c lie between them at distance-1 steps)
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: ({a,b,c,d}, ∅), ({a,b,c}, {d}), ({a,b}, {c,d}), ({a}, {b,c,d}), (∅, {a,b,c,d})
Conflict deal: ({a,b,c,d}, {a,b,c,d})
All choices are individually rational, as neither agent can do worse than the conflict deal; ({a,c}, {b,d}) is dominated by ({a,b}, {c,d})
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)
No. | Pure deal          | Agent 1's utility | Agent 2's utility
1   | ({a,b,c,d}, ∅)     | 0                 | 10
2   | ({a,b,c}, {d})     | 1                 | 3
3   | ({a,b}, {c,d})     | 2                 | 2
4   | ({a}, {b,c,d})     | 3                 | 1
5   | (∅, {a,b,c,d})     | 10                | 0
    | Conflict deal      | 0                 | 0
Agent 1 concedes 5 → 4 → 3; agent 2 concedes 1 → 2 → 3
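A sketch of the Zeuthen strategy run on this five-deal table (utilities taken from the table; on equal risk both agents concede, and they meet at deal 3):

```python
# Deal table from the slide: deal number -> (agent 1's utility, agent 2's utility)
deals = {1: (0, 10), 2: (1, 3), 3: (2, 2), 4: (3, 1), 5: (10, 0)}

def risk(u_own, u_other):
    # willingness to risk conflict; 1 by convention when current gain is 0
    return 1.0 if u_own == 0 else (u_own - u_other) / u_own

offer = {1: 5, 2: 1}     # each agent starts at its own best deal
step = {1: -1, 2: +1}    # conceding moves agent 1 down the table, agent 2 up
while True:
    u1_own, u1_opp = deals[offer[1]][0], deals[offer[2]][0]
    u2_own, u2_opp = deals[offer[2]][1], deals[offer[1]][1]
    # agreement: an agent likes the other's offer at least as much as its own
    if u1_opp >= u1_own or u2_opp >= u2_own:
        break
    r1, r2 = risk(u1_own, u1_opp), risk(u2_own, u2_opp)
    if r1 <= r2:             # agent 1 at no greater risk -> concedes
        offer[1] += step[1]
    if r2 <= r1:             # agent 2 at no greater risk -> concedes
        offer[2] += step[2]
print(offer)  # {1: 3, 2: 3} -- both converge on deal 3
```

In the first round both risks are 1 (5 vs. 1), in the second both are 2/3 (4 vs. 2), so the agents concede together twice and agree on ({a,b}, {c,d}).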
63
What bothers you about the previous agreement
• They both decide to get (2, 2) utility, rather than the expected utility of a (0, 10) choice
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future or sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• Looking for a state in the intersection of goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g. both need a car)
– Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1: a cost of 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent gets a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p·c(J)i + (1-p)·c(J)k
where k is i's opponent – the role i plays with probability (1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
(Figure: distribution point at distance 1 from city a and 1 from city b; a and b are distance 2 apart)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally {a}): Utility1({a}, {b}) = 0; Utility1({b}, {a}) = 0; Utility1({a,b}, ∅) = -2; Utility1(∅, {a,b}) = 1; …
Utility for agent 2 (originally {a,b}): Utility2({a}, {b}) = 2; Utility2({b}, {a}) = 2; Utility2({a,b}, ∅) = 3; Utility2(∅, {a,b}) = 0; …
73
Consider deal 3 with probability
• ({a,b}, ∅):p means agent 1 does ∅ with probability p and {a,b} with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p  ⇒  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
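The algebra above can be checked with exact fractions (the utilities follow the slide: agent 1 gets 1 or -2, agent 2 gets 0 or 3, depending on the lottery outcome):

```python
from fractions import Fraction

# Mixed deal over roles (∅, {a,b}) vs ({a,b}, ∅): agent 1 does nothing with
# probability p, everything with probability 1-p.
def expected(p, u_if_win, u_if_lose):
    return p * u_if_win + (1 - p) * u_if_lose

p = Fraction(5, 6)   # solution of 3p - 2 = 3 - 3p
u1 = expected(p, Fraction(1), Fraction(-2))
u2 = expected(p, Fraction(0), Fraction(3))
print(p, u1, u2)  # 5/6 1/2 1/2
```

Both agents end up with an expected utility of 1/2, confirming the split is fair.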
74
Try again with other choice in negotiation set
• ({a}, {b}):p means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(TA ∪ TB, ∅); p] chosen so that it maximizes over all deals d
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero-sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
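A tiny check of the product-maximizing choice for the suggested exercise, dividing 10 units of utility between two agents:

```python
# Divide a total utility of 10 between two agents; the product u * (10 - u)
# is maximized at the even split.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)  # (5, 5)
```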
77
Examples – Cooperative: each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four, so the utility to each is 2
78
Examples – Compromise: both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1 but not on the table. Agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the (2, 6) roles p of the time
• Utility for agent 1 = p(1) + (1-p)(-3): it loses utility if it takes cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus I can take an unfair division and make it fair
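The same lottery calculation generalizes: given each agent's utility under the two outcomes, one linear equation determines p. A sketch (`fair_p` is an illustrative helper, not from the slides), applied to this slide's numbers:

```python
from fractions import Fraction

# Fair lottery: find p with p*a1 + (1-p)*b1 = p*a2 + (1-p)*b2,
# i.e. equal expected utility for both agents.
def fair_p(a1, b1, a2, b2):
    # rearranged: p * [(a1 - b1) - (a2 - b2)] = b2 - b1
    return Fraction(b2 - b1, (a1 - b1) - (a2 - b2))

# Agent 1 gets 1 or -3; agent 2 gets 0 or 4 (this slide's worths).
print(fair_p(1, -3, 0, 4))  # 7/8
```

The same helper reproduces the earlier parcel-delivery result, fair_p(1, -2, 0, 3) = 5/6.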
81
Example – conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. Could flip a coin to decide who wins; better than both losing. The weighting of the coin needn't be 50-50.
• May make sense to have the agent with the highest worth get his way – as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example – semi-cooperative
• Both agents want the contents of the two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as it helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(-4) = 4
• If they could have both been satisfied, assume the cost for each is 24. Then utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents
• Note: not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example: multi-agent Tile World (like an airport shuttle); it isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support: a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
  - Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
  - Price 1, quality 2, support 3
• Using constraints on an attribute:
  - Price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
  - Try to find the Pareto optimum
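The weighted-attribute option can be sketched in a few lines (the attribute scores and weights below are invented for illustration; scores are assumed already normalized to [0, 1]):

```python
def weighted_utility(offer, weights):
    """Linear additive utility over normalized attribute scores.
    The weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * offer[a] for a in weights)

# Hypothetical normalized scores for one software offer.
offer = {"price": 0.5, "quality": 0.8, "support": 0.6}
weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
print(weighted_utility(offer, weights))  # ~0.57
```

A linear weighting like this assumes the attributes are independent; inter-related attributes (as on the next slides) need a richer model.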
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
  - Exchange missing information
  - Penalty for a lie
• Possible lies:
  - False information
    • Hiding letters
    • Phantom letters
  - Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
• for finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of subadditive:
  - Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  - Deliveries in opposite directions: doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment.
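The subadditivity condition is easy to test exhaustively for a small task set. A sketch (cost tables are toy numbers of my own, echoing the two examples above):

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(sorted(tasks), r)]

def is_subadditive(tasks, cost):
    """Check c(X U Y) <= c(X) + c(Y) for all subsets X, Y."""
    subs = all_subsets(tasks)
    return all(cost[x | y] <= cost[x] + cost[y] for x in subs for y in subs)

# Tree-like delivery: sharing the trunk makes the union cheaper.
tree = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 2}
# Threshold cost: doing both pushes past a capacity limit.
threshold = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
             frozenset("ab"): 3}
print(is_subadditive("ab", tree))       # True
print(is_subadditive("ab", threshold))  # False
```
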
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at a store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
  - c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
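Concavity can likewise be checked by brute force over a small task set. A sketch (both example cost tables are mine; the second mirrors the parcel table used later, which fails the condition):

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(sorted(tasks), r)]

def is_concave(tasks, cost):
    """c(Y U Z) - c(Y) <= c(X U Z) - c(X) whenever X is a subset of Y."""
    subs = all_subsets(tasks)
    return all(cost[y | z] - cost[y] <= cost[x | z] - cost[x]
               for x in subs for y in subs if x <= y for z in subs)

# Per-parcel costs with no shared travel: concave.
flat = {s: len(s) for s in all_subsets("ab")}
# A no-return parcel table with c(ab) = 3: not concave
# (e.g. X = {}, Y = {a}, Z = {b} gives 2 added vs. 1 added).
parcel = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
          frozenset("ab"): 3}
print(is_concave("ab", flat))    # True
print(is_concave("ab", parcel))  # False
```
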
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
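The modularity identity is just inclusion-exclusion on costs, so any cost function that sums independent per-task fees satisfies it (as in the Fax Domain described later). A sketch with invented fees:

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(sorted(tasks), r)]

def is_modular(tasks, cost):
    """c(X U Y) == c(X) + c(Y) - c(X ∩ Y) for all subsets X, Y."""
    subs = all_subsets(tasks)
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in subs for y in subs)

# Fax-like domain: each destination has an independent connection fee,
# so the cost of a set of tasks is the sum over its members.
fees = {"a": 2, "b": 5}
fax = {s: sum(fees[t] for t in s) for s in all_subsets("ab")}
print(is_modular("ab", fax))  # True
```
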
97
For subadditive domain
98
Attributes of task system - Concavity
• c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes - Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of Characterization of Relationship: implied relationship between cells; implied relationship with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered (and penalized)
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides the letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2
• Expected utility: ⟨(abc), 1/2⟩ = 5
• Lie: p = 1/2 (as the utility appears the same)
• Expected utility (for 1) of ⟨(abc), 1/2⟩ = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie)
[Figure: delivery network with edge costs 1, 4, 4, 1.]
103
• FP2: in a Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as if the other person delivers the phantom, you are found out).
• FP3: in a Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4,2 and someone has to get the better deal (under a pure deal), JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

D1 | U(1) | D2 | U(2) seems | U(2) actual
a  | 2    | bc | 4          | 4
b  | 4    | ac | 2          | 2
bc | 2    | a  | 4          | 2
ab | 0    | c  | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example from the next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is ⟨(ab), 1/2⟩ = 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, as no probability is needed.
• Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h).
• Ex. 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72. (If I deliver, I don't deliver h.)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO: lying helped
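The two probabilities and payoffs above follow from one linear equation each; a sketch checking the slide's arithmetic with exact fractions (helper names are mine):

```python
from fractions import Fraction as F

def equalizing_p(a, b, c, d):
    """Solve p*a + (1-p)*b == p*c + (1-p)*d for p."""
    return F(d - b, (a - b) - (c - d))

def expected(p, win, lose):
    return p * win + (1 - p) * lose

p_truth = equalizing_p(-1, 6, 4, -3)   # payoffs as stated on the slide
p_decoy = equalizing_p(-3, 6, 4, -5)   # after inventing the letter to h
# Agent 1's true payoffs stay -1 (it delivers) and 6 (agent 2 delivers):
print(p_truth, expected(p_truth, -1, 6))   # 9/14 3/2
print(p_decoy, expected(p_decoy, -1, 6))   # 11/18 31/18
```

The decoy shifts the agreed probability in agent 1's favor (11/18 vs. 9/14), lifting its true expected utility from 3/2 to 31/18.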
107
Postmen - return to post office
Concave
Subadditive (h is the decoy)
Phantom
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost more than it really does).
• Ex. 3 (from the next slide): A1 hides his letter to node b.
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Ex. 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util(⟨(faebcd), 1/2⟩) = 4 (saves going to two).
• Under the lie, divide as ⟨(efdcab), p⟩: you always win and I always lose. Since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with ⟨(ab, cdef), p⟩:
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef, 1/4 of the time I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally): zero utility for that.
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER - case study
• Program to resolve problems in the labor relations domain
• Agents:
  - Company
  - Union
  - Mediator
• Tasks:
  - Generation of proposal
  - Generation of counter-proposal based on feedback from the dissenting party
  - Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law: cite previous decisions)
• Process:
  - Retrieve appropriate precedent cases from memory
  - Select the most appropriate case
  - Construct an appropriate solution
  - Evaluate the solution for applicability to the current case
  - Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  - Minimizes the need for information exchange
  - Avoids problems by reasoning from past failures: intentional reminding
  - Repairs for past failures are reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  - Change the importance attached to an issue
  - Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• Without failure avoidance: more proposals with objections
• Without preference analysis: oscillatory condition
• Without feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent, rising from 0 to 100 over meeting times 0900 to 1600.]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's utility curves converge to the point of acceptance.]
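The convergence picture can be simulated with a toy concession loop (the issue space, step size, and utility functions below are illustrative choices of mine, not from the slides):

```python
def negotiate(u_i, u_j, lo=0, hi=10, max_rounds=100):
    """Toy monotonic concession on a single integer-valued issue:
    each round an agent accepts if the opponent's current offer is at
    least as good for it as its own; otherwise both concede one step."""
    offer_i, offer_j = lo, hi
    for t in range(max_rounds):
        if u_i(offer_j) >= u_i(offer_i):
            return offer_j, t          # point of acceptance
        if u_j(offer_i) >= u_j(offer_j):
            return offer_i, t
        offer_i, offer_j = offer_i + 1, offer_j - 1
    return None, max_rounds            # no agreement: conflict deal

# Agent i prefers small values, agent j large ones.
print(negotiate(lambda x: 10 - x, lambda x: x))  # (5, 5)
```

With opposed linear utilities the offers meet in the middle after five rounds; if an agent's acceptance condition can never hold, the loop ends in the conflict outcome, matching the "no agreement" graph on the next slide.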
123
Utility Graphs - no agreement
• No agreement
• Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's utility curves never meet.]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. Why did the agent pay so much for the car?
• Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive; doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation: ⟨Database, (Sentence, Grounds)⟩, where:
  - Database is a (possibly inconsistent) set of logical formulae
  - Sentence is a logical formula known as the conclusion
  - Grounds is a set of logical formulae such that:
    • Grounds ⊆ Database, and
    • Sentence can be proved from Grounds
(we give reasons for our conclusions)
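As a minimal propositional sketch (the representation, with facts as strings and rules as (premises, conclusion) pairs, is my own simplification): an argument (Sentence, Grounds) is valid if the grounds come from the database and the sentence follows from the grounds alone.

```python
def closure(formulas):
    """Forward-chain simple Horn rules to all derivable facts."""
    facts = {f for f in formulas if isinstance(f, str)}
    rules = [f for f in formulas if not isinstance(f, str)]
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def is_argument(database, sentence, grounds):
    return grounds <= database and sentence in closure(grounds)

db = {"milk_good", (("milk_good",), "cheese_good")}
print(is_argument(db, "cheese_good", db))  # True
```

This previews the cheese example on the next slide: the conclusion "cheese_good" is only as strong as the premises and rule backing it, which is exactly what attacks target.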
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  - A attacks B = A u B or A r B
  - A defeats B = A u B or (A r B and not B u A)
  - A strongly attacks B = A a B and not B u A
  - A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: true unless proven false
131
Admissible Arguments - mutually defensible
1. argument x is attacked by a set if some y with y → x is not itself attacked by a member of the set
2. argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
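The definitions above can be checked by enumeration for a small attack graph. A sketch (the three-argument graph at the end is hypothetical, chosen so that a and b attack each other and b also attacks c):

```python
from itertools import chain, combinations

def conflict_free(s, attacks):
    """No member of s attacks another member of s."""
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s, attacks):
    """Every attacker of x is attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for (y, target) in attacks if target == x)

def admissible_sets(args, attacks):
    subsets = chain.from_iterable(combinations(args, r)
                                  for r in range(len(args) + 1))
    return [set(s) for s in subsets
            if conflict_free(s, attacks)
            and all(acceptable(x, s, attacks) for x in s)]

result = admissible_sets(["a", "b", "c"],
                         {("a", "b"), ("b", "a"), ("b", "c")})
print(result)
```

Here the admissible sets are {}, {a}, {b}, and {a, c}: {c} alone is not admissible because nothing in it counter-attacks b, but adding a defends c.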
132
[Figure: an attack graph over arguments a, b, c, d.]
Which sets of arguments can be true? c is always attacked.
d is always acceptable.
133
An Example Abstract Argument System
12
Mechanisms Protocols Strategies
bull Negotiation is governed by a mechanism or a
protocol
ndash defines the rdquorules of encounterrdquo between the agents
ndash the public rules by which the agents will come to
agreements
bull Given a particular protocol how can a particular
strategy be designed that individual agents can use
13
Negotiation Mechanism
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Issues in the negotiation process:
• Negotiation Space: all possible deals that agents can make, i.e., the set of candidate deals.
• Negotiation Protocol: a rule that determines the process of a negotiation: how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on.
• Negotiation Strategy: when and what proposals should be made.
14
Protocol
• Specifies the kinds of deals that can be made
• Specifies the sequence of offers and counter-offers
• A protocol is like the rules of a chess game, whereas strategy is the way in which a player decides which move to make
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions.
• Desirable properties of mechanisms are:
  - Convergence/guaranteed success
  - Maximizing global welfare: the sum of agent benefits is maximized
  - Pareto efficiency
  - Individual rationality
  - Stability: no agent should have an incentive to deviate from its strategy
  - Simplicity: low computational demands, little communication
  - Distribution: no central decision maker
  - Symmetry: we don't want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs of choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution, using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Diagram: Agent i and Agent j exchange proposal and counter-proposal; Agent i concedes.]
21
Negotiation Process 2
• Another way of looking at the negotiation process (one can talk about 50/50 or 90/10, depending on who "moves" the farthest):
[Diagram: proposals by Ai and proposals by Aj converge to a point of acceptance/agreement.]
22
Many types of interactive concession-based methods
• Some use multiple-objective linear programming:
  - requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggested value and continue until no joint improvements are possible
  - Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  - Multiple attributes:
    • Single attribute (price): symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes: several inter-related attributes, e.g., buying a car
  - The number of agents and the way they interact:
    • One-to-one, e.g., a single buyer and a single seller
    • Many-to-one, e.g., multiple buyers and a single seller: auctions
    • Many-to-many, e.g., multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would):
  - If one task requires less travel, both would benefit equally by having less travel
  - The utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric: we would benefit differently if roles were reversed:
  - If you delivered the picnic table, you could just throw it in the back of your van. If I delivered it, I would have to rent a U-Haul to transport it (as my car is small).
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job... ask for stock options, bigger office, work from home)
28
How many agents are involved?
• One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
  - n(n-1)/2 number of pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents, such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  - Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation).
  - Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal.
• If utility < 0, it is worse off than performing the tasks on its own.
• Conflict deal (stay with the status quo): if agents fail to reach an agreement
  - where no agent agrees to execute tasks other than its own
  - utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
  - T is a finite set of all possible tasks
  - Ag = {A1, A2, ..., An} is a list of participant agents
  - c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
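The two checkable assumptions can be verified exhaustively for a small task set. A sketch (the function name is mine; the cost table is the parcel example used on the following slides):

```python
from itertools import combinations

def valid_cost_function(tasks, cost):
    """Check the TOD assumptions: c(empty set) = 0 and monotonicity
    (T1 subset of T2 implies c(T1) <= c(T2))."""
    subs = [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(sorted(tasks), r)]
    if cost[frozenset()] != 0:
        return False
    return all(cost[s] <= cost[t] for s in subs for t in subs if s <= t)

# The parcel-delivery cost table from the example slides.
c = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
     frozenset("ab"): 3}
print(valid_cost_function("ab", c))  # True
```
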
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal).
33
Examples of TOD
• Parcel Delivery: several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the travel cost to each courier is minimal.
• Database Queries: several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (join, projection, union, intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) - the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure the deals knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
• a for agent 1: b | ab (for agent 2)
• b for agent 1: a | ab
• ab for agent 1: ∅ | a | b | ab
• ∅ for agent 1: ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, and is defined as follows:
  Utilityk(δ) = c(Tk) - Costk(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal:
  - c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
  - Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
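A direct transcription of this definition, using the parcel cost table from the next slide (agent indices 0 and 1 stand for A1 and A2):

```python
# Utility of a deal: stand-alone cost minus the cost of the agent's
# share, per Utility_k(delta) = c(T_k) - c(D_k).
c = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
     frozenset("ab"): 3}
enc = (frozenset("a"), frozenset("ab"))   # T1 = {a}, T2 = {a, b}

def utility(k, deal):
    return c[enc[k]] - c[deal[k]]

conflict = enc                            # the conflict deal (T1, T2)
print(utility(0, (frozenset(), frozenset("ab"))))    # 1
print(utility(1, (frozenset("a"), frozenset("b"))))  # 2
print(utility(0, conflict), utility(1, conflict))    # 0 0
```

The last line confirms the note above: the conflict deal always has utility 0 for both agents.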
37
Parcel Delivery Domain (assuming they do not have to return home, like U-Haul)
[Diagram: distribution point with city a and city b each at distance 1; distance 2 between the cities.]
Cost function:
c(∅) = 0
c(a) = 1
c(b) = 1
c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
  - δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utilityk(δ) ≥ Utilityk(δ′)
  - δ is better for some agent than δ′: ∃k ∈ {1,2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you already have).
• A deal δ is called Pareto optimal if there does not exist another deal δ′ that dominates δ (the best deal for x without disadvantaging y).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
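Both filters can be computed by enumerating all pure deals of the parcel encounter; a sketch reproducing the next few slides' results (variable names are mine):

```python
from itertools import combinations

c = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
     frozenset("ab"): 3}
enc = (frozenset("a"), frozenset("ab"))        # the slides' encounter

def utilities(deal):
    return tuple(c[enc[k]] - c[deal[k]] for k in (0, 1))

union = enc[0] | enc[1]
subsets = [frozenset(s) for r in range(len(union) + 1)
           for s in combinations(sorted(union), r)]
deals = [(d1, d2) for d1 in subsets for d2 in subsets
         if d1 | d2 == union]                  # all nine pure deals

def dominates(d, e):
    ud, ue = utilities(d), utilities(e)
    return all(x >= y for x, y in zip(ud, ue)) and ud != ue

rational = [d for d in deals if all(u >= 0 for u in utilities(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in rational if d in pareto]
print(len(deals), len(rational), len(ns))      # 9 5 3
```

The three deals left in `ns` are (a, b), (b, a), and (∅, ab), matching the "Negotiation Set" slide below.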
40
Utility Functions for Agents (example from the previous slide)

Deal           Utility1   Utility2
1. (a, b)         0          2
2. (b, a)         0          2
3. (ab, ∅)       −2          3
4. (∅, ab)        1          0
5. (a, ab)        0          0
6. (b, ab)        0          0
7. (ab, a)       −2          2
8. (ab, b)       −2          2
9. (ab, ab)      −2          0
41
Individual Rational for Both
(eliminate any choices that are negative for either agent)

Of the nine deals, the individually rational ones are:
(a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals

Of the nine deals, the Pareto optimal ones are:
(a, b), (b, a), (ab, ∅), (∅, ab)

Deals 5 and 6 are beaten by the (∅, ab) deal. (ab, ∅) is (−2, 3), but nothing beats 3 for agent 2, so it stays Pareto optimal.
43
Negotiation Set

Individual Rational Deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto Optimal Deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation Set (the intersection): (a, b), (b, a), (∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j.
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal).
• Which are Pareto optimal?

(Axes: utility for i vs. utility for j.)
45
Negotiation Set in Task-oriented Domains

(Figure: utility for agent i vs. utility for agent j. The circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict utilities. The negotiation set – Pareto optimal + individually rational – is the boundary arc above and to the right of the conflict deal.)
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol (over mixed deals).
• Concession protocol (over pure deals):
  – At each step t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that both deals are from the negotiation set, and for each i and t > 0, Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – I propose something less desirable for me.
• Negotiation ending:
  – Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t−1)) – neither agent concedes.
  – Agreement: ∃ j ≠ i such that Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t)) – some agent likes the other's offer at least as much as its own.
• If only A agrees ⇒ accept δ(B,t); if only B agrees ⇒ accept δ(A,t).
• If both A and B agree ⇒ accept the δ(k,t) such that π(δ(k)) = max(π(δ(A)), π(δ(B))).
• If both agree and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same but the deals may not be the same for each agent – flip a coin to decide which deal to use).
47
The Monotonic Concession Protocol – both agents move toward the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal.
• Meta data: explanation or critique of a deal.
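The round structure above can be sketched in a few lines of Python. The concession policy used here (both agents step to their next-best deal when neither accepts) is an illustrative assumption – the protocol itself only forbids offers that improve for the proposer:

```python
def mcp(ns, u1, u2):
    """Monotonic Concession Protocol over a finite negotiation set.
    u1, u2 map a deal to each agent's utility.  The both-concede-to-
    next-best policy is a simplifying stand-in for a real strategy."""
    o1, o2 = max(ns, key=u1), max(ns, key=u2)     # round-1 proposals
    while True:
        if u1(o2) >= u1(o1):      # agent 1 finds agent 2's offer acceptable
            return o2
        if u2(o1) >= u2(o2):
            return o1
        worse1 = [d for d in ns if u1(d) < u1(o1)]
        worse2 = [d for d in ns if u2(d) < u2(o2)]
        if not worse1 and not worse2:
            return "conflict"     # neither agent can concede this round
        if worse1:
            o1 = max(worse1, key=u1)
        if worse2:
            o2 = max(worse2, key=u2)
```

On the five-deal example of slide 62 (deals represented by their utility pairs), this loop converges to the (2, 2) deal.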
48
Condition to Consent an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:

Utility1(δ2) ≥ Utility1(δ1)  and  Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages:
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (It is a way of telling others what you value.)

(Figure: agent 1's best deal and agent 2's best deal at opposite ends of the negotiation set.)
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.

(Each agent asks: how much am I willing to risk a conflict?)
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero).
– In case conflict occurs, you are not much worse off.
– So you are more willing to risk conflict.
An agent is more willing to risk conflict when the utility it would lose by conceding is large compared with the utility it would lose by causing a conflict, measured with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation

riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer.
• How much you will lose if you stand still, which causes a conflict.

riski = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)

where δi and δj are the current offers of agent i and agent j, respectively. (Since the conflict deal is worth 0, the loss from conflict is just Utilityi(δi).)

riski is the willingness to risk conflict (1 means perfectly willing to risk).
55
Risk Evaluation
• riski measures the fraction you have left to gain: if it is close to one, you have gained little from the opponent's offer (and so are more willing to risk conflict).
• This assumes you know the other agent's utility function.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.

(Figure: the segment from Ai's best deal to Aj's best deal, with the conflict deal below it; each agent weighs the maximum it could gain from agreement against the maximum it can still hope to gain when asking how much conflict risk to accept.)
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple, and reflects the way human negotiations work.
  – Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself.
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set.
  – Communication burden – the negotiation process may involve several steps.
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)

Negotiation Set: (a, b), (b, a), (∅, ab)

First offers: agent 1 offers (∅, ab); agent 2 offers (a, b).

Utility of agent 1:          Utility of agent 2:
Utility1(a, b) = 0           Utility2(a, b) = 2
Utility1(b, a) = 0           Utility2(b, a) = 2
Utility1(∅, ab) = 1          Utility2(∅, ab) = 0

Risk of conflict: 1 for each agent.
Can they reach an agreement? Who will concede?
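Plugging the slide's numbers into the risk formula of slide 54 confirms that both agents start maximally willing to risk conflict (the helper below is a hypothetical encoding; the conflict deal is worth 0 here):

```python
def risk(u_mine, u_theirs):
    """Willingness to risk conflict: utility lost by conceding divided by
    utility lost by reaching the conflict deal (which is worth 0)."""
    return 1.0 if u_mine == 0 else (u_mine - u_theirs) / u_mine

r1 = risk(1, 0)   # agent 1: its offer (∅, ab) is worth 1 to it, (a, b) worth 0
r2 = risk(2, 0)   # agent 2: its offer (a, b) is worth 2 to it, (∅, ab) worth 0
```

Both risks come out to 1, so by the Zeuthen rule both would have to concede – but as the next slide notes, the only available "concession" is total capitulation.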
60
Conflict Deal

(Figure: agent 1's best deal and agent 2's best deal at opposite ends, each labeled "he should concede.")

Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)

(Figure: distribution point connected to a and d at distance 7 each; a–b, b–c, and c–d are each distance 1.)

Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10

Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; splits such as (ac, bd) are dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No.  Pure Deal      Agent 1's Utility   Agent 2's Utility
1    (abcd, ∅)             0                  10
2    (abc, d)              1                   3
3    (ab, cd)              2                   2
4    (a, bcd)              3                   1
5    (∅, abcd)            10                   0
     Conflict deal         0                   0

Agent 1 concedes through deals 5 → 4 → 3; agent 2 through 1 → 2 → 3.
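A sketch of Zeuthen-style concession over this table (Python; "concede to your next-best deal" is a simplifying stand-in for "concede just enough to flip the risk balance"). On these five deals the agents' risks stay tied, both concede twice, and they meet at (ab, cd) with utilities (2, 2):

```python
def risk(offer_mine, offer_theirs, u):
    # willingness to risk conflict for the agent whose utility function is u
    um = u(offer_mine)
    return 1.0 if um <= 0 else (um - u(offer_theirs)) / um

def zeuthen(ns, u1, u2):
    """Zeuthen-style MCP: the agent with less to lose concedes; on a tie
    both concede, each to its next-best deal."""
    o1, o2 = max(ns, key=u1), max(ns, key=u2)
    while u1(o2) < u1(o1) and u2(o1) < u2(o2):    # no agreement yet
        r1, r2 = risk(o1, o2, u1), risk(o2, o1, u2)
        if r1 <= r2:                               # agent 1 concedes
            worse = [d for d in ns if u1(d) < u1(o1)]
            if not worse:
                return "conflict"
            o1 = max(worse, key=u1)
        if r2 <= r1:                               # agent 2 concedes
            worse = [d for d in ns if u2(d) < u2(o2)]
            if not worse:
                return "conflict"
            o2 = max(worse, key=u2)
    return o2 if u1(o2) >= u1(o1) else o1
```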
63
What bothers you about the previous agreement?
• The agents decide to both get (2, 2) utility rather than the (0, 10) utility of another choice, which has higher total utility.
• Is there a solution?
• Fairness versus higher global utility.
• Restrictions of this method (no promises for the future, and no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, in the sense that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do," not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk.
• c(J)k is the cost of k's role in joint plan J.
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD coordinated plans are never worse, as you can just do your original task.
• With SOD you may get in each other's way.
• Don't accept partially completed plans.
The state oriented domain is a bit more powerful than TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs a task).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We look for a state in the intersection of the goals.
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
  – The goals may contradict, so there is no mutually acceptable state (e.g., both need the car).
  – A common state exists, but it cannot be reached with the primitive operations in the domain (they could both travel together, but one may need to know how to pick up the other).
  – There is a reachable state that satisfies both, but it is too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall (8 vs. 10), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain role in the plan.
• This is called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a mixed deal, then
  costi(δ) = p·c(J)i + (1−p)·c(J)k
  where k is i's opponent – the role i plays with probability (1−p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• Consider the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)

(Figure: distribution point with city a and city b at distance 1 each; a and b are distance 2 apart.)

Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3

Utility for agent 1 (tasks: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…

Utility for agent 2 (tasks: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (ab, ∅):p means agent 1 does ∅ with probability p and ab with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p·1 = expected utility for agent 1
• (1−p)(3) + p·0 = expected utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p  ⇒  6p = 5  ⇒  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
74
Try again with the other choice in the negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p·0 = utility for agent 1
• (1−p)(2) + p·2 = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair? Agent 1's utility is 0 and agent 2's is 2 under either assignment, so no lottery between them changes the split.
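Since both expected utilities are linear in p, the fair probability (when it exists) has a closed form. A small helper, applied to the two cases above (names are mine):

```python
def fair_p(u1_lose, u1_win, u2_lose, u2_win):
    """Probability p equalizing expected utilities, where each agent gets
    its 'win' value with probability p and its 'lose' value otherwise
    (agent 1 wins exactly when agent 2 loses).  None if no p in [0, 1]."""
    # (1-p)*u1_lose + p*u1_win == (1-p)*u2_lose + p*u2_win, solved for p
    den = (u1_win - u1_lose) - (u2_win - u2_lose)
    num = u2_lose - u1_lose
    if den == 0:
        return 0.0 if num == 0 else None   # parallel lines
    p = num / den
    return p if 0 <= p <= 1 else None

p_deal3 = fair_p(-2, 1, 3, 0)   # deal (ab, ∅): p = 5/6
p_deal1 = fair_p(0, 0, 2, 2)    # deal (a, b): utilities constant, no fair p
```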
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal of the form δm = [(TA∪TB, ∅) : p]. There is always an all-or-nothing mixed deal that is optimal over the negotiation set.
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both products are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick up costs 1 and a set down costs 1.
• Mutually beneficial – each can pick up a block at the same time, costing each agent 2. A win, as neither had to move the other block out of the way.
• If done by one agent the cost would be 4 – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe both goals don't need to be achieved. The cost for an agent achieving its goal alone is 2; achieving both goals costs 8, an average of 4 per agent.
• If both value the goal the same way, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair."
• Give A1 the easier role (cost 2) p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3)   (it loses utility if it pays cost 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
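Checking the arithmetic: with p = 7/8 both agents end up with the same expected utility of 1/2.

```python
p = 7 / 8                       # solve 4p - 3 = 4 - 4p  =>  8p = 7
u1 = p * 1 + (1 - p) * (-3)     # agent 1: easier role with probability p
u2 = p * 0 + (1 - p) * 4        # agent 2: easier role with probability 1-p
```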
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility gained is greater (he would accomplish his goal alone anyway). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's entire goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, pulled out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, continued
• Winning agent utility: 26 − 4 − 10 = 12.
• Losing agent utility: −4 (as it helped with the swap but achieves nothing else).
• So with probability 1/2 each: 1/2(12) + 1/2(−4) = 4 expected utility apiece.
• If they could both have been satisfied, assume the cost for each is 24; then each agent's utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing," but how close you got to the goal.
85
Worth-oriented Domain: Definition
Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation: a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a multiple set of attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimality – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
  – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
  – price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  – price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum.
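The weighting approach is just a linear combination. A toy sketch – the 60/15/25 weights come from the slide; the normalized scores and the two offers are invented for illustration:

```python
# Hypothetical normalized scores in [0, 1], higher is better
# (a high price score means "cheap").
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    # linear weighted utility over the attribute scores
    return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS)

offer_a = {"price": 0.9, "quality": 0.5, "support": 0.4}   # cheap, mediocre
offer_b = {"price": 0.4, "quality": 0.9, "support": 0.9}   # pricey, polished
```

With price weighted at 60%, the cheap offer wins here even though the other dominates it on quality and support.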
89
Incomplete Information
• Agents don't know the tasks of others in TOD.
• Solution:
  – Exchange missing information.
  – Penalty for a lie.
• Possible lies:
  – False information:
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment.
90
Subadditive Task Oriented Domain
• The cost of the union is less than or equal to the sum of the costs of the separate sets:
  for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – Delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive TOD with equality (= rather than <):
  – Deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment.
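The property is easy to test exhaustively on a small cost table. The sketch below encodes two illustrative tables: a tree-shaped postmen domain with return to the office (subadditive), and the no-return parcel example of slide 38, where c(ab) = 3 > c(a) + c(b) = 2 fails the check:

```python
def is_subadditive(cost):
    """cost maps frozensets of tasks to costs; keys closed under union."""
    keys = list(cost)
    return all(cost[x | y] <= cost[x] + cost[y] for x in keys for y in keys)

# Tree postmen domain with return: PO -1- a -1- b, so c(a) = 2 (out and
# back), c(b) = 4, and a single trip to b covers a on the way.
tree = {frozenset(): 0, frozenset("a"): 2,
        frozenset("b"): 4, frozenset("ab"): 4}

# The no-return parcel example of slide 38: c(a) = c(b) = 1 but c(ab) = 3.
no_return = {frozenset(): 0, frozenset("a"): 1,
             frozenset("b"): 1, frozenset("ab"): 3}
```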
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  – "I need to pick something up at the store." (You can invent something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
  – "I need to deliver an empty letter." (No good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• Take two task sets X and Y where X is a subset of Y, and introduce another task set Z. Then:
  c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of the Previous Chart
• The arrows show reasons we know each fact (diagonal arrows are between domains); a rule beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others.
97
(Table: incentive-compatibility results for the subadditive domain.)
98
Attributes of task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X a subset of Y.
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• We expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No – unless restricted to trees.)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems

Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.

The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Of the examples, only the Fax Domain is modular (as connection costs are independent).
• Modularity implies concavity.
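A matching check for modularity. A fax-style cost (one unit per destination, independent of the rest) satisfies the equality, while the tree-shaped postmen cost used in the subadditivity example does not – it is subadditive but not modular. Both tables are invented for illustration:

```python
def is_modular(cost):
    """Check c(X ∪ Y) == c(X) + c(Y) - c(X ∩ Y); keys must be closed
    under union and intersection."""
    keys = list(cost)
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in keys for y in keys)

# Fax-style cost: one unit per destination, independent of other tasks.
fax = {frozenset(s): len(frozenset(s))
       for s in ["", "a", "b", "c", "ab", "ac", "bc", "abc"]}

# Tree postmen cost (return to office, PO -1- a -1- b): one trip covering
# both cities breaks the modular equality (4 != 2 + 4 - 0).
tree = {frozenset(): 0, frozenset("a"): 2,
        frozenset("b"): 4, frozenset("ab"): 4}
```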
101
3-dimensional table of characterization: relationships implied between cells, and relationships implied within the same domain attribute.
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (postmen return home)

FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides a letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2, and his expected utility for (abc):1/2 is 5.
• Under the lie: p = 1/2 (as the apparent utilities are the same), but A1's expected utility from the hidden letter is 1/2(0) + 1/2(2) = 1, as he still has to deliver the lie himself.

(Figure: delivery graph with edge costs.)
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.

Agent 1's part   U(1)   Agent 2's part   Seeming U(2)   Actual U(2)
a                 2      bc               4              4
b                 4      ac               2              2
bc                2      a                4              2
ab                0      c                6              6

Under the lie, the apparent benefits are as shown – the apparent benefit is no different than the real benefit. Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference, assuming we have some way of deciding who gets the better deal that is fair over time.
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the expected utility for agent 1 of (ab):1/2 is 1/2(4) + 1/2(2) = 3.
• Lie: (bc, a) is the logical division, with no probability involved. Agent 1's utility is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h too").
• Example 2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72. (If A1 ends up delivering, he simply doesn't deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as:
  p(−1) + (1−p)(6) = p(4) + (1−p)(−3)  ⇒  14p = 9
• If he invents task h, p = 11/18, as:
  p(−3) + (1−p)(6) = p(4) + (1−p)(−5)  ⇒  18p = 11
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped.
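The two indifference equations above are linear in p, so the slide's numbers can be verified directly (the helper and payoff values restate the slide):

```python
def indifference_p(a_win, a_lose, b_win, b_lose):
    # solve p*a_win + (1-p)*a_lose == p*b_win + (1-p)*b_lose for p
    return (b_lose - a_lose) / ((a_win - a_lose) - (b_win - b_lose))

p_truth = indifference_p(-1, 6, 4, -3)            # 9/14
p_decoy = indifference_p(-3, 6, 4, -5)            # 11/18
# Agent 1's real payoffs stay -1 (deliver) and 6 (don't); the decoy only
# shifts the lottery probability in his favour.
u_truth = p_truth * (-1) + (1 - p_truth) * 6      # 1.5
u_decoy = p_decoy * (-1) + (1 - p_decoy) * 6      # 31/18
```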
107
(Figures: postmen domain examples with return to the post office – a concave example for the phantom lie, and a subadditive example where h is the decoy.)
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (you think I have less, so an increased load appears to cost more than it really does).
• Example 3 (next slide): A1 hides his letter to node b.
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR under the lie.
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility becomes 4.5 > 4 (the utility of telling the truth).
• Under the truth: (fae, bcd):1/2 gives each agent utility 4 (each saves going to two nodes).
• Under the lie, dividing as (efd, cab):p cannot help: one side always wins and the other always loses, and since the apparent work is the same, swapping cannot help – in a mixed deal the two choices must be unbalanced.
• Try again under the lie with (abcdef, ∅):p:
  p(4) + (1−p)(0) = p(2) + (1−p)(6)
  4p = −4p + 6
  p = 3/4
• A1's utility is actually 3/4(6) + 1/4(0) = 4.5.
• Note: when A1 is assigned cdef (1/4 of the time), he STILL has to deliver to node a (after completing his agreed-upon deliveries), so he ends up going to 5 places – which is what he was assigned originally: zero utility for that branch.
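Verifying the arithmetic for the (abcdef, ∅):p division:

```python
# Apparent utilities under the hide: agent 1 gets 4 (win) or 0 (lose);
# agent 2 gets 2 (when agent 1 wins) or 6 (when agent 1 loses).
p = (6 - 0) / ((4 - 0) - (2 - 6))     # 4p = 2p + 6(1-p)  =>  p = 3/4
true_u = p * 6 + (1 - p) * 0          # agent 1's real winning payoff is 6
```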
110
(Figure: modular postmen domain example for the hide lies.)
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of a proposal
  – Generation of a counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions).
• Process:
  – Retrieve appropriate precedent cases from memory.
  – Select the most appropriate case.
  – Construct an appropriate solution.
  – Evaluate the solution for applicability to the current case.
  – Modify the solution appropriately.
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  – Minimizes the need for information exchange.
  – Avoids problems by reasoning from past failures: intentional reminding.
  – Repairs for past failures are reused, reducing computation.
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument.
• Increasing payoff:
  – Change the importance attached to an issue.
  – Change the utility value of an issue.
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreed parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory condition.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).

(Figure: worth function for the first agent – worth rises from 0 toward 100 across the hours 0900, 1200, 1600.)

Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.

(Figure: utility vs. number of negotiation rounds; agent i's and agent j's utility curves converge to a point of acceptance.)
123
Utility Graphs – no agreement
• No agreement: agent j finds the offer unacceptable.

(Figure: utility vs. number of negotiation rounds; the two agents' curves never cross.)
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
  1. Logical – "If you accept A and accept A implies B, then you must accept that B."
  2. Emotional – "How would you feel if it happened to you?"
  3. Visceral – a participant stamps their feet and shows the strength of their feelings.
  4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
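A minimal sketch of this form, using Horn-clause forward chaining as a stand-in for full logical proof (the formulae and the "a & b -> c" rule syntax are my own illustration):

```python
# Check whether (Sentence, Grounds) is an argument over a database of
# facts and Horn rules written as "a & b -> c".

def closure(formulae):
    """Forward-chain the rules to find every provable atom."""
    facts = {f for f in formulae if "->" not in f}
    rules = [f for f in formulae if "->" in f]
    changed = True
    while changed:
        changed = False
        for rule in rules:
            body, head = rule.split("->")
            if all(p.strip() in facts for p in body.split("&")) and head.strip() not in facts:
                facts.add(head.strip())
                changed = True
    return facts

def is_argument(database, sentence, grounds):
    """(Sentence, Grounds) is an argument iff Grounds is a subset of the
    database and Sentence can be proved from Grounds alone."""
    return set(grounds) <= set(database) and sentence in closure(grounds)

database = {"milk_good", "cheese_from_milk",
            "milk_good & cheese_from_milk -> cheese_good"}
print(is_argument(database, "cheese_good", database))  # True
print(is_argument(database, "cheese_good", {"milk_good"}))  # False: grounds too weak
```

The milk/cheese argument on the next slide is exactly this pattern: two premises plus a rule, with the conclusion provable from the grounds.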
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you

Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B = A undercuts B or A rebuts B
  – A defeats B = A undercuts B, or (A rebuts B and B does not undercut A)
  – A strongly attacks B = A attacks B and B does not undercut A
  – A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition: hierarchy of attacks
  Undercuts          = u
  Strongly undercuts = su = u − u⁻¹
  Strongly attacks   = sa = (u ∪ r) − u⁻¹
  Defeats            = d  = u ∪ (r − u⁻¹)
  Attacks            = a  = u ∪ r
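These derived relations can be written as set operations on ordered pairs. A small sketch (the arguments A, B, C and the contents of the u and r relations are made up for illustration):

```python
# Derived attack relations over a toy undercut relation u and rebut relation r.
u = {("A", "B")}                      # A undercuts B
r = {("B", "A"), ("A", "C")}          # B rebuts A, A rebuts C

inv_u = {(y, x) for (x, y) in u}      # u^-1, the inverse of undercut

attacks            = u | r            # a  = u ∪ r
defeats            = u | (r - inv_u)  # d  = u ∪ (r − u⁻¹)
strongly_attacks   = (u | r) - inv_u  # sa = (u ∪ r) − u⁻¹
strongly_undercuts = u - inv_u        # su = u − u⁻¹

# The hierarchy su ⊆ sa ⊆ d ⊆ a holds for any choice of u and r.
print(strongly_undercuts <= strongly_attacks <= defeats <= attacks)  # True
```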
130
Abstract Argumentation
• Concerned with the overall structure of the argument system (rather than the internals of individual arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is in (true) unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set of arguments if some member y of the set attacks x (y → x)
2. argument x is acceptable with respect to a set if every attacker of x is attacked by the set
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: four arguments a, b, c, d connected by attack arrows]

Which sets of arguments can be true? c is always attacked; d is always acceptable
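The example can be checked mechanically. A minimal sketch, assuming a particular attack relation (the slide's arrows are not recoverable, so the graph below is a guess that reproduces the stated facts: a and b attack each other, both attack c, and c attacks d):

```python
# Enumerate the admissible sets of a small abstract argument system.
from itertools import combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    """No member of s attacks another member of s."""
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """x is acceptable w.r.t. s if s attacks every attacker of x."""
    return all(any((z, y) in attacks for z in s)
               for (y, w) in attacks if w == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

sets = [set(c) for n in range(len(args) + 1)
        for c in combinations(sorted(args), n)]
print([sorted(s) for s in sets if admissible(s)])
# [[], ['a'], ['b'], ['a', 'd'], ['b', 'd']]
```

Under this assumed graph the admissible sets never contain c (it is always attacked), and d can always be defended as long as a or b is present, matching the slide's remarks.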
133
An Example Abstract Argument System
13
Negotiation is the process of reaching agreements on matters of common interest. It usually proceeds in a series of rounds, with every agent making a proposal at every round.
Negotiation Mechanism
Issues in the negotiation process:
• Negotiation Space: all possible deals that agents can make, i.e. the set of candidate deals
• Negotiation Protocol – a rule that determines the process of a negotiation: how and when a proposal can be made, when a deal has been struck, when the negotiation should be terminated, and so on
• Negotiation Strategy: when and what proposals should be made
14
Protocol
• Defines the kinds of deals that can be made
• Defines the sequence of offers and counter-offers
• A protocol is like the rules of a chess game, whereas a strategy is the way in which a player decides which move to make
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
  – Convergence/guaranteed success
  – Maximising global welfare: the sum of agent benefits is maximized
  – Pareto efficiency
  – Individual rationality
  – Stability: no agent should have an incentive to deviate from its strategy
  – Simplicity: low computational demands, little communication
  – Distribution: no central decision maker
  – Symmetry: agents should not need to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs among choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:

[Figure: Agent i and Agent j exchange proposal and counter-proposal until Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about a 50/50 or 90/10 split depending on who "moves" the farthest):

[Figure: proposals by Ai and proposals by Aj approach a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple objective linear programming
  – requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value, continue until no joint improvements are possible
  – Used in the Camp David peace negotiations (Egypt/Israel – Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  – Multiple attributes:
    • Single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes – several inter-related attributes, e.g. buying a car
  – The number of agents and the way they interact:
    • One-to-one, e.g. a single buyer and a single seller
    • Many-to-one, e.g. multiple buyers and a single seller: auctions
    • Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
  – If one task requires less travel, both would benefit equally by having less travel
  – utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
  – if you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation ("I didn't realize not having any test was an option") (on the job: ask for stock options, a bigger office, work from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example: one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
  – n(n−1)/2 pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  – Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
  – Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
    • If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
  – where no agent agrees to execute tasks other than its own
  – utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple <T, Ag, c> where:
  – T is a finite set of all possible tasks
  – Ag = {A1, A2, …, An} is a list of participant agents
  – c : 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment, D is the assignment after the "deal"
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate
• A pure deal on an encounter is a redistribution of tasks among agents: (D1, D2) such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery:
  Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries:
  Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)   ← the conflict deal (the original allocation)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure the deals knowing the union must be ab
• Choices for the first agent: a, b, ab, ∅
• The second agent must "pick up the slack":
  – a for agent 1 → b | ab for agent 2
  – b for agent 1 → a | ab
  – ab for agent 1 → ∅ | a | b | ab
  – ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
  – c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
  – Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)

[Figure: a distribution point connected to city a and city b at cost 1 each; traveling between a and b costs 2]

Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…

Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
  – δ is at least as good for every agent as δ′:
    ∀k ∈ {1, 2}: Utilityk(δ) ≥ Utilityk(δ′)
  – δ is better for some agent than δ′:
    ∃k ∈ {1, 2}: Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
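These definitions can be checked against the parcel-delivery encounter, using the utilities listed on the next slide (a sketch; ∅ is written "-" in the deal names):

```python
# Compute the negotiation set for the nine pure deals of the
# parcel-delivery encounter: intersection of the individually rational
# deals and the Pareto-optimal deals.

deals = {  # deal name: (utility to agent 1, utility to agent 2)
    "(a,b)": (0, 2),   "(b,a)": (0, 2),   "(ab,-)": (-2, 3),
    "(-,ab)": (1, 0),  "(a,ab)": (0, 0),  "(b,ab)": (0, 0),
    "(ab,a)": (-2, 2), "(ab,b)": (-2, 2), "(ab,ab)": (-2, 0),
}

def dominates(d1, d2):
    """d1 is at least as good for both agents and strictly better for one."""
    return (all(x >= y for x, y in zip(d1, d2))
            and any(x > y for x, y in zip(d1, d2)))

individual_rational = {d for d, u in deals.items() if u[0] >= 0 and u[1] >= 0}
pareto_optimal = {d for d, u in deals.items()
                  if not any(dominates(v, u) for v in deals.values())}

negotiation_set = individual_rational & pareto_optimal
print(sorted(negotiation_set))  # ['(-,ab)', '(a,b)', '(b,a)']
```

The output reproduces the three slides that follow: five individually rational deals, four Pareto-optimal deals, and a three-deal negotiation set.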
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2

1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)

Individual rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)

Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)

Deal 3, (ab, ∅), is (−2, 3): it survives because nothing beats 3 for agent 2
43
Negotiation Set

Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?

[Figure: scatter plot with axes "utility for i" and "utility for j"]
45
Negotiation Set in Task-oriented Domains
[Figure: deals A–E plotted by utility for agent i vs. utility for agent j; the circle delimits the space of all possible deals, the conflict deal sits at the agents' conflict utilities, and the negotiation set (Pareto optimal + individual rational) lies on the boundary beyond it]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from deal δ
• Product-maximizing negotiation protocol: a one-step protocol
• Concession protocol:
  – At each step t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that:
    • both deals are from the negotiation set
    • ∀i and t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – I propose something less desirable for me
  – Negotiation ends with:
    • Conflict – Utilityi(δ(i,t)) = Utilityi(δ(i,t−1)) for both agents (neither concedes)
    • Agreement – ∃j ≠ i with Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t)):
      – only A agrees with the other's proposal ⇒ adopt δ(B,t); only B agrees ⇒ adopt δ(A,t)
      – both agree ⇒ adopt the δ(k,t) with π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
      – both agree and π(δ(A,t)) = π(δ(B,t)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Applies to both pure deals and mixed deals
47
The Monotonic Concession Protocol – one direction: move towards the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they may re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (Assumes we know what the other agent values.)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: explanation or critique of a deal
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal each made:
  Utility1(δ2) ≥ Utility1(δ1)  and  Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages:
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic concession protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling the other agent what you value)

[Figure: Agent 1's best deal and Agent 2's best deal at opposite ends of the negotiation set]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession

[Figure: as the offers approach each other, each agent asks "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
  – You have lost much of your expected utility (it is closer to zero)
  – In case conflict occurs, you are not much worse off
  – So you are more willing to risk conflict
An agent's willingness to risk conflict compares the utility it would lose by conceding with the utility it would lose by causing a conflict, with respect to its current offer
• If both are equally willing to risk conflict, both concede
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) /
        (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

riski = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)

where δi and δj are the current offers of agents i and j, respectively

risk is the willingness to risk conflict (1 is perfectly willing to risk conflict)
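A sketch of this computation, assuming conflict utility 0 as in a TOD; the sample offers (worth 1 and 0 to one agent, 2 and 0 to the other) match the parcel-delivery recap later in the deck:

```python
# Zeuthen risk: the fraction of current utility an agent would lose by
# conceding, relative to what it would lose by causing a conflict.

def risk(u_own_offer, u_their_offer):
    """1.0 means perfectly willing to risk conflict (nothing extra to lose)."""
    if u_own_offer <= 0:          # conflict costs this agent nothing
        return 1.0
    return (u_own_offer - u_their_offer) / u_own_offer

risk1 = risk(u_own_offer=1, u_their_offer=0)   # agent 1's view of the offers
risk2 = risk(u_own_offer=2, u_their_offer=0)   # agent 2's view
print(risk1, risk2)  # 1.0 1.0 -> equally willing to risk, so both must concede
```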
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as an initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point

[Figure: Ai's best deal and Aj's best deal, with the conflict deal below; each agent weighs the maximum it could gain from agreement against the maximum it still hopes to gain, asking "how much am I willing to risk a conflict?"]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose), otherwise it will just be your turn to concede again at the next round; but not so much that you give up more than you needed to
Q: What if both have equal risk?
A: Both concede
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain: recall agent 1 must deliver to a, agent 2 to a and b

Negotiation set: (a, b), (b, a), (∅, ab)

First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)

Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1

Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0

Risk of conflict: 1 for both agents

Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: Agent 1's best deal and Agent 2's best deal, each labeled "he should concede"]

Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)

[Figure: cities a, b, c, d lie on a line, 1 apart; the distribution point is 7 from a and 7 from d]

Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10

Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than going alone; e.g. (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No. | Pure deal     | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)     | 0                 | 10
2   | (abc, d)      | 1                 | 3
3   | (ab, cd)      | 2                 | 2
4   | (a, bcd)      | 3                 | 1
5   | (∅, abcd)     | 10                | 0
    | Conflict deal | 0                 | 0

Agent 1 concedes from deal 5 toward deal 3; agent 2 concedes from deal 1 toward deal 3
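The table can be fed to a small simulation of the MCP with a Zeuthen-style strategy. This is an illustrative sketch with one simplification of my own: a conceding agent makes the minimal concession (the deal it likes best among those strictly better for the opponent), and the conflict-termination branch is omitted since it never triggers here:

```python
# Monotonic concession protocol with a Zeuthen-style strategy, run on the
# five pure deals of Example 2. Utilities are (agent 1, agent 2).

deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(me, my_offer, their_offer):
    """Fraction of current utility lost by accepting the other's offer."""
    mine, theirs = deals[my_offer][me], deals[their_offer][me]
    return 1.0 if mine <= 0 else (mine - theirs) / mine

def concede(me, my_offer):
    """Minimal concession: my best deal strictly better for the opponent."""
    better = [i for i, d in enumerate(deals)
              if d[1 - me] > deals[my_offer][1 - me]]
    return max(better, key=lambda i: deals[i][me])

offer1, offer2 = 4, 0   # each agent opens with its own best deal
# Loop while neither agent finds the other's offer acceptable.
while deals[offer2][0] < deals[offer1][0] and deals[offer1][1] < deals[offer2][1]:
    r1, r2 = risk(0, offer1, offer2), risk(1, offer2, offer1)
    old1, old2 = offer1, offer2
    if r1 <= r2:                      # lower (or equal) risk -> concede
        offer1 = concede(0, old1)
    if r2 <= r1:
        offer2 = concede(1, old2)

agreement = offer2 if deals[offer2][0] >= deals[offer1][0] else offer1
print(deals[agreement])  # (2, 2) -> the (ab, cd) split, as on the slide
```

On this example both risks are equal in every round, so both agents concede each time (5 → 4 → 3 against 1 → 2 → 3), meeting at deal 3.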
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the (0, 10) expected utility of another choice
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only into slots (a restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• C(J)k is the cost of k's role in joint plan J
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD, coordinated plans are never worse, as you can just do your original task
• With an SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
  – Goals may contradict, so there is no mutually acceptable state (e.g. both need the car)
  – A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up another person)
  – There might be a reachable state which satisfies both, but it may be too expensive, and the agents are unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent gets a certain plan
• Called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a mixed deal, then:
  costi(δ) = p·c(J)i + (1−p)·c(J)k
  where k is i's opponent – i plays k's role with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the parcel delivery example:
Parcel Delivery Domain (assuming agents do not have to return home)

[Figure: a distribution point connected to city a and city b at cost 1 each; traveling between a and b costs 2]

Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…

Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• The mixed deal on (ab, ∅) with probability p means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p·1 = utility for agent 1
• (1−p)(3) + p·0 = utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p  ⇒  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
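The algebra above can be checked with exact rational arithmetic (the helper name fair_p is my own):

```python
# Solve for the probability p that equalizes the two agents' expected
# utilities over a two-role lottery:
#   (1-p)*u1_role_a + p*u1_role_b == (1-p)*u2_role_a + p*u2_role_b
from fractions import Fraction

def fair_p(u1_role_a, u1_role_b, u2_role_a, u2_role_b):
    # Rearranged: p * ((u1_b - u1_a) - (u2_b - u2_a)) == u2_a - u1_a
    denom = (u1_role_b - u1_role_a) - (u2_role_b - u2_role_a)
    return Fraction(u2_role_a - u1_role_a, denom)

p = fair_p(-2, 1, 3, 0)   # utilities from the (ab, -) lottery above
print(p)  # 5/6
```

For the (a, b) lottery on the next slide the denominator is zero, which is exactly why no p can make that deal fair.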
74
Try again with the other choice in the negotiation set
• The mixed deal on (a, b) with probability p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p·0 = utility for agent 1
• (1−p)(2) + p·2 = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal δm = [(T1 ∪ T2, ∅) : p] such that π(δm) = max over deals δ in NS of π(δ)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
77
Examples: Cooperative
Each agent is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (the goals are compatible)
• Assume a pick up costs 1 and a set down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four, so the utility to each is 2
78
Examples: Compromise
Both can succeed, but each is worse off than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one agent alone is two; the cost for both under the joint plan averages four
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign (2,6) p of the time
• Utility for agent 1 = p(1) + (1-p)(-3) (loses utility if it pays a cost of 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus I can take an unfair division and make it fair
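The arithmetic above generalizes: for any pair of win/lose utilities, the fair p solves a linear equation. A quick sketch (the helper name `fair_p` and the use of exact fractions are my own, not from the slides):

```python
from fractions import Fraction

def fair_p(u1_win, u1_lose, u2_win, u2_lose):
    # Solve p*u1_win + (1-p)*u1_lose == p*u2_win + (1-p)*u2_lose for p.
    # Rearranged: p * ((u1_win - u1_lose) - (u2_win - u2_lose)) = u2_lose - u1_lose
    return Fraction(u2_lose - u1_lose,
                    (u1_win - u1_lose) - (u2_win - u2_lose))

# Agent 1: wins -> utility 1, loses -> -3.  Agent 2: wins -> 0, loses -> 4.
p = fair_p(1, -3, 0, 4)
print(p)  # 7/8, matching the slide
```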
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way – as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as it helped with the swap)
• So with probability ½ each: ½(12) + ½(-4) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other).
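The expected-utility claim above can be checked directly (a sketch using the slide's made-up costs; the variable names are mine):

```python
solo_cost = 26      # cost for one agent to achieve its whole goal alone
swap_cost = 4       # shared cost of the cooperative swap
rest_cost = 10      # cost of the rest, paid only by the coin-flip winner

win_util = solo_cost - swap_cost - rest_cost    # 12
lose_util = -swap_cost                          # -4: helped swap, goal unmet
expected = 0.5 * win_util + 0.5 * lose_util     # 4.0

both_cost = 24                                  # cost if both goals satisfied
both_util = solo_cost - both_cost               # 2
print(expected, both_util)  # 4.0 2 -- gambling doubles the expected utility
```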
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple:
⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tile world (like an airport shuttle) – it isn't just a specific state that counts, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a multiple set of attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price×0.60 + quality×0.15 + support×0.25
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
– Try to find the Pareto optimum
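The weighting scheme above is just a weighted sum. A minimal sketch (the attribute scores here are invented and assumed normalized to [0, 1]):

```python
def weighted_utility(scores, weights):
    # Linear multi-attribute utility: sum of weight * score per attribute.
    return sum(scores[a] * weights[a] for a in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer = {"price": 0.8, "quality": 0.5, "support": 0.4}  # hypothetical offer
print(round(weighted_utility(offer, weights), 3))  # 0.655
```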
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union of two task sets is at most the sum of the costs of the separate sets
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one city saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
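The subadditivity condition can be checked by brute force over all pairs of subsets. A sketch with invented costs (note the threshold-style cost, where doing both costs more than the parts, fails the check):

```python
from itertools import combinations

def powerset(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_subadditive(tasks, cost):
    # c(X u Y) <= c(X) + c(Y) for all finite X, Y
    ps = powerset(tasks)
    return all(cost[x | y] <= cost[x] + cost[y] for x in ps for y in ps)

tasks = ["a", "b"]
# One trip covers both deliveries more cheaply than two separate trips
cost_tree = {frozenset(): 0, frozenset("a"): 1,
             frozenset("b"): 1, frozenset("ab"): 2}
# Threshold case: taking on both forces buying new equipment
cost_threshold = {**cost_tree, frozenset("ab"): 3}
print(is_subadditive(tasks, cost_tree))       # True
print(is_subadditive(tasks, cost_threshold))  # False
```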
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at a store. (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip.)
• Need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters).
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) – c(Y) ≤ c(X ∪ Z) – c(X)
• The cost the task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats – is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies sub-additivity
• Modularity implies concavity
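Both implications can be spot-checked with brute-force predicates. A sketch (the numbers are my own, not from the slides): a fax-like cost with one independent connection per destination passes both checks, while a cost with a shared saving is concave but not modular.

```python
from itertools import combinations

def powerset(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_concave(tasks, cost):
    # For X subset of Y: c(Y u Z) - c(Y) <= c(X u Z) - c(X)
    ps = powerset(tasks)
    return all(cost[y | z] - cost[y] <= cost[x | z] - cost[x]
               for x in ps for y in ps if x <= y for z in ps)

def is_modular(tasks, cost):
    # c(X u Y) = c(X) + c(Y) - c(X n Y)
    ps = powerset(tasks)
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in ps for y in ps)

tasks = ["a", "b"]
fax_cost = {s: len(s) for s in powerset(tasks)}     # independent connections
shared = {frozenset(): 0, frozenset("a"): 1,
          frozenset("b"): 1, frozenset("ab"): 1.5}  # shared saving
print(is_modular(tasks, fax_cost), is_concave(tasks, fax_cost))  # True True
print(is_modular(tasks, shared), is_concave(tasks, shared))      # False True
```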
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) – c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization: relationships implied between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: if A1 hides the letter to c, his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util: ⟨(abc, ∅); 1/2⟩ = 1.5
• Lie: p = 1/2 (as the apparent utility is the same)
• Expected util (for A1): ⟨(abc, ∅); 1/2⟩ = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph for the example; edge costs 1, 4, 4, 1]
103
• FP2: in Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out)
• FP3: in Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

A1 gets   U(1)   A2 gets   U(2) seems   U(2) actual
a         2      bc        4            4
b         4      ac        2            2
bc        2      a         4            2
ab        0      c         6            6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example from the next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so the utility for agent 1 is ⟨(a, b); ½⟩ = ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage
• Util for agent 1 is 6 (original cost) – 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 31/18 ≈ 1.72 (if I deliver, I don't actually deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
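The probabilities above come from setting the two agents' expected utilities equal. A sketch with exact fractions (`indifference_p` is my own helper name):

```python
from fractions import Fraction

def indifference_p(u1_all, u1_none, u2_all, u2_none):
    # Solve p*u1_all + (1-p)*u1_none == p*u2_all + (1-p)*u2_none for p,
    # where p is the probability that agent 1 delivers everything.
    return Fraction(u2_none - u1_none,
                    (u1_all - u1_none) - (u2_all - u2_none))

p_truth = indifference_p(-1, 6, 4, -3)           # 9/14
u_truth = p_truth * (-1) + (1 - p_truth) * 6     # 21/14 = 1.5

# With the decoy letter to h, the declared payoffs change ...
p_lie = indifference_p(-3, 6, 4, -5)             # 11/18
# ... but agent 1's true utility still uses its real payoffs (-1 and 6)
u_lie = p_lie * (-1) + (1 - p_lie) * 6           # 31/18, about 1.72
print(p_truth, u_truth, p_lie, u_lie)
```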
107
Postmen – return to post office
[Figure: three example graphs – Concave; Subadditive (h is the decoy); Phantom]
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util⟨(faebcd); 1/2⟩ = 4 (saves going to two)
• Under the lie, dividing as ⟨(efdcab); p⟩: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with ⟨(ab, cdef); p⟩:
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually:
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Chart: worth function for the first agent, rising from 0 to 100 over meeting times 9:00, 12:00, 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility vs. number of negotiation rounds; Agent i's and Agent j's offers converge over time to a point of acceptance]
123
Utility Graphs – no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds for Agent i and Agent j; no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
– Grounds ⊆ Database, and
– Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B ≡ A u B or A r B
– A defeats B ≡ A u B, or (A r B and not(B u A))
– A strongly attacks B ≡ A a B and not(B u A)
– A strongly undercuts B ≡ A u B and not(B u A)
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked (w.r.t. a set S) if some attacker y of x (y → x) is itself attacked by no member of S
2. argument x is acceptable if every attacker of x is attacked by some member of S
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are counterattacked)
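These definitions are small enough to implement directly. A sketch over a hypothetical attack graph (not the one in the course figure): a and b attack each other, and both b and d attack c.

```python
from itertools import combinations

def conflict_free(S, attacks):
    return not any((x, y) in attacks for x in S for y in S)

def admissible(S, args, attacks):
    # Conflict free, and S counterattacks every attacker of its members
    def attacked_by_S(z):
        return any((x, z) in attacks for x in S)
    return conflict_free(S, attacks) and all(
        attacked_by_S(z) for y in S for z in args if (z, y) in attacks)

args = ["a", "b", "c", "d"]
attacks = {("a", "b"), ("b", "a"), ("b", "c"), ("d", "c")}  # hypothetical
admissible_sets = [set(S) for r in range(len(args) + 1)
                   for S in combinations(args, r)
                   if admissible(S, args, attacks)]
# c's attacker d is itself unattacked, so no admissible set contains c;
# d has no attackers, so {d} is always admissible.
print(admissible_sets)
```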
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked;
d is always acceptable.
133
An Example Abstract Argument System
14
Protocol
• Defines the kinds of deals that can be made
• Defines the sequence of offers and counter-offers
• A protocol is like the rules of a chess game, whereas a strategy is the way in which a player decides which move to make
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanism Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
– Convergence/guaranteed success
– Maximizing global welfare: the sum of agent benefits is maximized
– Pareto efficiency
– Individual rationality
– Stability: no agent should have an incentive to deviate from its strategy
– Simplicity: low computational demands, little communication
– Distribution: no central decision maker
– Symmetry: we do not want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• We can't always achieve every attribute, so look at the tradeoffs of choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Diagram: Agent i and Agent j exchange a proposal and a counter-proposal until Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (one can talk about 50/50 or 90/10, depending on who "moves" the farthest):
[Diagram: proposals by Ai and proposals by Aj converge to a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple objective linear programming –
– requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value and continue until no joint improvements are possible
– Used in the Camp David peace negotiations (Egypt/Israel – Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• The mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• The mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes:
• Single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
• Multiple attributes – several inter-related attributes, e.g., buying a car
– The number of agents and the way they interact:
• One-to-one, e.g., a single buyer and a single seller
• Many-to-one, e.g., multiple buyers and a single seller: auctions
• Many-to-many, e.g., multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
– If one task requires less travel, both would benefit equally by having less travel
– The utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
– If you delivered the picnic table, you could just throw it in the back of your van. If I delivered it, I would have to rent a U-Haul to transport it (as my car is small).
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any tests was an option) (on the job... ask for stock options, a bigger office, working from home)
28
How many agents are involved?
• One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
– n(n-1)/2 number of pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on the change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo): if agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
• utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, ..., An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment, and D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among agents: (D1, D2) such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Cost_k(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery:
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries:
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure deals knowing the union must be {a, b}
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack"
• a for agent 1 → b | ab (for agent 2)
• b for agent 1 → a | ab
• ab for agent 1 → ∅ | a | b | ab
• ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utility_k(δ) = c(Tk) - Cost_k(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming do not have to return home ndash like
Uhaul)Distribution Point
city a city b
1 1
Cost functionc()=0c(a)=1c(b)=1c(ab)=3
Utility for agent 1 (original tasks: a)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (original tasks: ab)
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
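The listed utilities can be checked mechanically. A small sketch (Python is an arbitrary choice here, not the deck's; the cost table is the one in the figure):

```python
# Costs from the figure: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3.
COST = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}

def c(tasks):
    return COST[frozenset(tasks)]

def utility(T_k, D_k):
    """Utility_k(delta) = c(T_k) - c(D_k): stand-alone cost minus deal cost."""
    return c(T_k) - c(D_k)

T1, T2 = {"a"}, {"a", "b"}      # agent 1 delivers to a; agent 2 to a and b

print(utility(T1, {"a"}))       # Utility1(a, b)  = 0
print(utility(T1, set()))       # Utility1(∅, ab) = 1
print(utility(T2, set()))       # Utility2(ab, ∅) = 3
```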
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k∈{1,2} Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k∈{1,2} Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it leaves no agent worse off than what it already has).
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (no deal is better for one agent without disadvantaging the other).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both
(eliminate any choice that has negative utility for either agent)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
Deals 5–9 are each beaten by one of the deals above; (ab, ∅) is (−2, 3), but nothing beats 3 for agent 2, so it stays.
43
Negotiation Set
Negotiation set: (a, b), (b, a), (∅, ab)
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
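These sets can be computed mechanically from the definitions. A minimal Python sketch, taking the conflict deal to be (T1, T2) = (a, ab), each agent keeping its own deliveries (note this deal has utility 0 for both agents, as the conflict deal must):

```python
from itertools import product

COST = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
c = lambda s: COST[frozenset(s)]
T = (frozenset("a"), frozenset("ab"))       # the agents' original task sets

parts = [frozenset(), frozenset("a"), frozenset("b"), frozenset("ab")]
deals = [(d1, d2) for d1, d2 in product(parts, repeat=2)
         if d1 | d2 == frozenset("ab")]      # the nine pure deals

def util(k, deal):
    return c(T[k]) - c(deal[k])              # Utility_k = c(T_k) - c(D_k)

def dominates(d, e):
    return (all(util(k, d) >= util(k, e) for k in (0, 1)) and
            any(util(k, d) >  util(k, e) for k in (0, 1)))

conflict = T                                 # conflict deal: (a, ab)
ir     = [d for d in deals if all(util(k, d) >= util(k, conflict) for k in (0, 1))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns     = [d for d in ir if d in pareto]

print(len(ir), len(pareto), len(ns))         # 5 4 3, matching the slides
```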
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which deals are Pareto optimal?
Utility for i
Utility for j
45
Negotiation Set in Task-oriented Domains
[Figure: deals plotted by utility for agent i (x-axis) vs. utility for agent j (y-axis); the circle delimits the space of all possible deals; the conflict deal sits at the utility of conflict for each agent; points A–E mark sample deals. The negotiation set (Pareto optimal + individual rational) is the boundary arc above and to the right of the conflict deal.]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
– Concession protocol
• At each step t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that:
– both deals are from the negotiation set, and
– ∀i and t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – each round I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t−1)) for both agents (neither concedes)
– Agreement: ∃ j ≠ i such that Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t))
• Only A agrees ⇒ accept δ(B,t); only B agrees ⇒ accept δ(A,t)
• Both A and B agree ⇒ accept the δ(k,t) such that π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
• Both agree and π(δ(A,t)) = π(δ(B,t)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
(The protocol applies to pure deals and to mixed deals.)
47
The Monotonic Concession Protocol – agents move in one direction, toward the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values.)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal.
• Meta data: explanation or critique of the deal
48
Condition to Consent to an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the Monotonic Concession Protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is also a way of telling the other agent what you value.)
[Figure: a spectrum of deals, with agent 1's best deal at one end and agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: the same spectrum of deals; each agent asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict is measured by comparing what it would lose by making a concession with what it would lose by taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk it).
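A minimal sketch of this computation (Python, not from the deck); the convention that an agent whose own offer is worth 0 to itself has risk 1 is the usual one:

```python
def risk(u_own, u_cross):
    """Zeuthen risk: (U_i(own offer) - U_i(other's offer)) / U_i(own offer).

    u_own   - agent i's utility for its own current offer
    u_cross - agent i's utility for the opponent's current offer
    Returns 1.0 when the agent has nothing to lose by conflict (u_own == 0).
    """
    if u_own == 0:
        return 1.0
    return (u_own - u_cross) / u_own

# Slide-59 encounter: agent 1 offers (∅, ab), worth 1 to itself and 0 to agent 2;
# agent 2 offers (a, b), worth 2 to itself and 0 to agent 1.
print(risk(1, 0))  # agent 1's risk: 1.0
print(risk(2, 0))  # agent 2's risk: 1.0 -> equally willing to risk conflict
```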
55
Risk Evaluation
• risk measures the fraction you still stand to gain: if it is close to one, conceding gains you little (and you are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as an initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: spectrum from Ai's best deal to Aj's best deal, with the conflict deal below; labels: "How much am I willing to risk a conflict?", "Maximum to gain from agreement", "Maximum still hoped to gain"]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
bull Advantages
ndash Simple and reflects the way human negotiations work
– Stability – it is in Nash equilibrium: if one agent is using the strategy,
then the other can do no better than using it him/herself
bull Disadvantages
ndash Computationally expensive ndash players need to compute the entire
negotiation set
ndash Communication burden ndash negotiation process may involve
several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation Set:
(a, b)
(b, a)
(∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal at opposite ends of the spectrum, each labeled "He should concede"]
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: distribution point connected to city a and city d, each at distance 7; cities a–b–c–d lie in a line with distance 1 between neighbors]
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as no agent can do worse than the conflict deal; other splits, e.g. (ac, bd), are dominated by those listed, e.g. (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)
No. | Pure Deal | Agent 1's Utility | Agent 2's Utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
Conflict deal | 0 | 0
[Figure: agent 1 concedes along deals 5 → 4 → 3; agent 2 along deals 1 → 2 → 3]
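The concession sequence can be simulated. The sketch below encodes deals by their utility pairs and uses one simple reading of "concede just enough" – jump to your best remaining deal that strictly improves the opponent's utility; this is an assumption, not the only possible concession rule. With the table above it reproduces the (2, 2) agreement:

```python
# A sketch of the Monotonic Concession Protocol with the Zeuthen strategy
# over a finite negotiation set. Deals are identified with their (u1, u2)
# utility pairs.
def zeuthen(ns):
    # each agent opens with its own best deal
    offer = [max(ns, key=lambda d: d[0]), max(ns, key=lambda d: d[1])]
    while True:
        own   = [offer[0][0], offer[1][1]]   # own offer's utility to self
        cross = [offer[1][0], offer[0][1]]   # other's offer's utility to self
        if cross[0] >= own[0]:               # agent 1 accepts agent 2's offer
            return offer[1]
        if cross[1] >= own[1]:               # agent 2 accepts agent 1's offer
            return offer[0]
        # risks computed once per round: concessions are simultaneous
        risk = [1.0 if own[i] == 0 else (own[i] - cross[i]) / own[i]
                for i in (0, 1)]
        for i in (0, 1):
            if risk[i] <= risk[1 - i]:       # lower (or equal) risk concedes
                better = [d for d in ns if d[1 - i] > offer[i][1 - i]]
                if not better:
                    return None              # no concession possible: conflict
                offer[i] = max(better, key=lambda d: d[i])

# Example 2's negotiation set, as utility pairs:
ns = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]
print(zeuthen(ns))  # (2, 2) -- both concede on equal risk, meeting in the middle
```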
63
What bothers you about the previous agreement?
• Both agents settle for (2, 2) utility rather than, say, the (0, 10) of another choice with higher total utility
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, and no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, in the sense that if one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal.
• The actions taken by agent k in the joint plan J are called k's role, written Jk.
• c(J)k is the cost of k's role in joint plan J.
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD, coordinated plans are never worse, as you can always just do your original task.
• With SOD, you may get in each other's way.
• Don't accept partially completed plans.
The state oriented domain is a bit more powerful than TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in future negotiations
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs it)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We look for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
– A common state exists, but perhaps it cannot be reached with the primitive operations of the domain (we could both travel together, but we may need to know how to pick up another person)
– There might be a reachable state that satisfies both, but it may be too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that an agent gets a certain role in the plan
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
Costi(δ) = p·c(J)i + (1−p)·c(J)k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the deal
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: a distribution point at distance 1 from city a and city b; a and b are distance 2 apart]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (original tasks: a):
Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(ab, ∅) = −2; Utility1(∅, ab) = 1; …
Utility for agent 2 (original tasks: ab):
Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(ab, ∅) = 3; Utility2(∅, ab) = 0; …
73
Consider deal 3 with a probability:
• (ab, ∅):p means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair.
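Solving for the equalizing probability is a one-line linear equation. A small helper (exact arithmetic via `fractions`; the function name is my own) reproduces the p = 5/6 above and returns None when no p works:

```python
from fractions import Fraction

def fair_p(a1, b1, a2, b2):
    """Solve (1-p)*a1 + p*b1 == (1-p)*a2 + p*b2 for p in [0, 1].

    a_k: agent k's utility under one assignment of roles,
    b_k: agent k's utility under the swapped assignment.
    Returns None when the utilities can never be equalized.
    """
    denom = (a2 - a1) + (b1 - b2)
    if denom == 0:
        return None
    p = Fraction(a2 - a1, denom)
    return p if 0 <= p <= 1 else None

# Deal 3, (ab, ∅): agent 1 gets -2 doing ab and +1 doing ∅; agent 2 gets 3 and 0.
print(fair_p(-2, 1, 3, 0))   # 5/6
# Deal (a, b): both assignments give (0, 2) -- no p can make this fair.
print(fair_p(0, 0, 2, 2))    # None
```

The same solver reproduces the later worth-weighted compromise (p = 7/8) and the FP8 calculation (p = 3/4).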
74
Try again with the other choice in the negotiation set:
• (a, b):p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything, with some probability): a mixed deal δm = [(TA ∪ TB, ∅) : p] chosen so that δm is in the negotiation set and maximizes the product of utilities over NS
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both products are the same, just to avoid possible bias – like switching goals in soccer.
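The slide's exercise, done by brute force – dividing a fixed total of 10 in all integer ways and taking the product-maximizing split:

```python
# Splitting a fixed total of 10 units of utility: the product u1*u2 is
# maximized by the even (5, 5) split, which is why product maximization
# reads as a fairness criterion.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)               # (5, 5)
print(best[0] * best[1])  # 25
```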
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved: the cost of achieving one is two, while the cost of achieving both averages four per agent.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair".
• Assign the (2, 6) cost split (agent 1 takes the easier role) p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3) – agent 1 loses utility if it pays cost 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: Conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins; that is better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get its way, as its utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: Semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, pulled out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So, with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation covers a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a multiple set of attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute:
– Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute:
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
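A sketch of the weighted-attribute form; the weights and constraint ranges are the slide's, while the 0–1 scoring of each attribute is an assumption of mine:

```python
# Weighted-attribute utility: each attribute is assumed pre-normalized to a
# 0-1 score so the weighted sum is comparable across offers.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

def feasible(price, quality, support):
    # constraint form from the slide: price in [5, 100], quality in [0, 10],
    # support in [1, 5]
    return 5 <= price <= 100 and 0 <= quality <= 10 and 1 <= support <= 5

offer = {"price": 0.8, "quality": 0.9, "support": 0.5}
print(round(utility(offer), 3))   # 0.74
print(feasible(50, 8, 3))         # True
```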
89
Incomplete Information
• Agents don't know the tasks of others in TOD
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information:
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one city saves distance to the other (in a tree arrangement)
• Example of a subadditive TOD with equality (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where combined usage puts me above a threshold and I have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at the store (you can invent something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• Take two task sets X and Y where X is a subset of Y, and introduce another task set Z. Then:
c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the arrow's start is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is just a mixed deal where one agent's share is empty. The direction of the relationship may depend on whether truth (never helps) or lying (sometimes helps) is involved.
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others.
97
For subadditive domains:
[Table: incentive-compatibility results for subadditive TODs]
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as the agent was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization: relationships implied between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because, with penalties, they may always be discovered
102
Incentive Compatible Fixed Points (FP) (postmen return home)
FP1: in a subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Ex: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth, p = 1/2: expected utility of (abc, ∅):1/2 = 5
• If he lies, p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1) of (abc, ∅):1/2 = (1/2)(0) + (1/2)(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph for the example, with edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent is assigned the phantom delivery, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)
Agent 1's part | U(1) | Agent 2's part | U(2) (seems) | U(2) (actual)
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of (ab):1/2 = (1/2)(4) + (1/2)(2) = 3
• Lie: (bc, a) is the logical division, with no probability needed
• Utility for agent 1 = 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h too).
• Ex2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up bc is worse for agent 1 than it is); his utility rises from 1.5 to 1.72. (If I deliver, I don't actually deliver to h.)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as:
p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
107
Postmen – return to post office
[Figure: delivery graphs for the FP5 example (concave domain, phantom letter) and the FP6 example (subadditive domain, h is the decoy)]
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load appears to cost more than it really does).
• Ex3 (next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Ex4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Utility of (fae, bcd):1/2 = 4 (each saves going to two nodes).
• Under the lie, divide as (ef, dcab):p? Then you always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie: (ab, cdef):p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually (3/4)(6) + (1/4)(0) = 4.5
• Note: when A1 is assigned cdef (1/4 of the time), he STILL has to deliver to node a (after completing his agreed-upon deliveries). So he ends up going to 5 places (which is what he was assigned originally) – zero utility for that.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs used for past failures can be reused, reducing computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Changing the importance attached to an issue
– Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she also assigns progressively smaller worths to a meeting at 15:00, 14:00, …
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rises from 0 at 9:00, through 12:00, to 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's utility curves converge to the point of acceptance]
123
Utility Graphs - no agreement
• No agreement – Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offer curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified – Why did the agent pay so much for the car?
– Positions cannot be changed – Initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B ≡ A u B or A r B
– A defeats B ≡ A u B, or (A r B and not B u A)
– A strongly attacks B ≡ A a B and not B u A
– A strongly undercuts B ≡ A u B and not B u A
129
Proposition: hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments – mutually defensible
1. A set S attacks argument x if some member y of S has y → x
2. Argument x is acceptable (w.r.t. S) if every attacker of x is attacked by S
3. An argument set is conflict-free if no members attack each other
4. A set is admissible if it is conflict-free and each argument in it is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be "in"? c is always attacked;
d is always acceptable
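The admissibility definitions above can be made concrete with a small checker. This is a minimal sketch, not from the slides; the attack relation below is a hypothetical one, chosen so that (as in the figure) c is always attacked and the unattacked argument d is always acceptable.

```python
from itertools import combinations

arguments = {"a", "b", "c", "d"}
# Hypothetical attack relation: (x, y) means "x attacks y".
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("d", "c")}

def set_attacks(s, x):
    """True if some member of set s attacks argument x."""
    return any((y, x) in attacks for y in s)

def conflict_free(s):
    """No member of s attacks another member of s."""
    return not any((y, x) in attacks for y in s for x in s)

def acceptable(x, s):
    """Every attacker of x is itself attacked by s."""
    return all(set_attacks(s, y) for (y, z) in attacks if z == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

# Enumerate all admissible sets of this (tiny) argument system.
admissible_sets = [set(c) for r in range(len(arguments) + 1)
                   for c in combinations(sorted(arguments), r)
                   if admissible(set(c))]
```

With this relation, {d} is admissible (d has no attackers), while no set containing c is: c's attacker d is never attacked.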
133
An Example Abstract Argument System
15
Game Theory
• Computers make concrete the notion of strategy, which is central to game playing
16
Mechanism Design
bull Mechanism design is the design of protocols for governing multi-
agent interactions
bull Desirable properties of mechanisms are
– Convergence/guaranteed success
– Maximizing global welfare: the sum of agent benefits is maximized
– Pareto efficiency
– Individual rationality
– Stability: no agent should have an incentive to deviate from its strategy
– Simplicity: low computational demands, little communication
– Distribution: no central decision maker
– Symmetry: we don't want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs among choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of the negotiation history
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Figure: Agent i and Agent j exchange proposal and counter-proposal; Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Figure: proposals by Ai and proposals by Aj converge to a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple objective linear programming –
– requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value, continue until no joint improvements are possible
– Used in the Camp David peace negotiations (Egypt/Israel – Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to the individual).
25
Complex Negotiations
bull Some attributes that make the negotiation process
complex are
ndash Multiple attributes
bull Single attribute (price) ndash symmetric scenario (both benefit in the
same way by a cheaper price)
bull Multiple attributes ndash several inter-related attributes eg buying a
car
ndash The number of agents and the way they interact
bull One-to-one eg single buyer and single seller
bull Many-to-one eg multiple buyers and a single seller auctions
bull Many-to-many eg multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
– If one task requires less travel, both would benefit equally by having less travel
– Utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
– If you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job… ask for stock options, bigger office, work from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
– n(n−1)/2 pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on the change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
• utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
– T is the finite set of all possible tasks
– Ag = {A1, A2, …, An} is the list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T_k is the original assignment, D_k is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, T_k ⊆ T. This is an original allocation of tasks that the agents might want to reallocate
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal
• For each deal δ = (D1, D2), the cost of the deal to agent k is Cost_k(δ) = c(D_k) (i.e. the cost to k of the deal is the cost of D_k, k's part of the deal)
33
Examples of TOD
• Parcel Delivery
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure deals knowing union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
• a for agent 1 → b | ab for agent 2
• b for agent 1 → a | ab
• ab for agent 1 → ∅ | a | b | ab
• ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utility_k(δ) = c(T_k) − Cost_k(δ) = c(T_k) − c(D_k)
where δ = (D1, D2) is a deal
– c(T_k) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
[Figure: distribution point connected to city a and city b, distance 1 to each; distance 2 between a and b]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
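These utilities follow mechanically from the cost function. A minimal sketch (the helper name `utility` is ours, not the slides'):

```python
# Cost of delivering to each subset of cities from the distribution point.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}

# The encounter: agent 1 originally delivers to a, agent 2 to a and b.
T1, T2 = frozenset("a"), frozenset("ab")

def utility(original_tasks, deal_part):
    """Utility_k(deal) = c(T_k) - c(D_k): stand-alone cost minus cost of k's part."""
    return cost[original_tasks] - cost[deal_part]

# Deal (empty, ab): agent 1 delivers nothing, agent 2 delivers to both cities.
u1 = utility(T1, frozenset())      # 1 - 0 = 1
u2 = utility(T2, frozenset("ab"))  # 3 - 3 = 0
```

As the slides note, evaluating the conflict deal (T1, T2) this way gives utility 0 to both agents.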
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
δ is at least as good for every agent as δ′:
∀k ∈ {1, 2}: Utility_k(δ) ≥ Utility_k(δ′)
δ is better for some agent than δ′:
∃k ∈ {1, 2}: Utility_k(δ) > Utility_k(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individually Rational for Both (eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
Deal 3, (ab, ∅), is (−2, 3): it looks bad, but nothing beats 3 for agent 2, so no deal dominates it. Deals 5–9 are each beaten by another deal.
43
Negotiation Set
Negotiation set:
(a, b)
(b, a)
(∅, ab)
Individually rational deals:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
Pareto optimal deals:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
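The last three slides can be reproduced mechanically from the utility table. A sketch (using `""` for ∅; the helper names are ours):

```python
deals = {  # deal -> (utility to agent 1, utility to agent 2), from the table
    ("a", "b"): (0, 2),   ("b", "a"): (0, 2),   ("ab", ""): (-2, 3),
    ("", "ab"): (1, 0),   ("a", "ab"): (0, 0),  ("b", "ab"): (0, 0),
    ("ab", "a"): (-2, 2), ("ab", "b"): (-2, 2), ("ab", "ab"): (-2, 0),
}

def dominates(u, v):
    """u dominates v: at least as good for both agents, strictly better for one."""
    return u[0] >= v[0] and u[1] >= v[1] and u != v

# The conflict deal has utility (0, 0), so individually rational = both >= 0.
individually_rational = {d for d, u in deals.items() if u[0] >= 0 and u[1] >= 0}
pareto_optimal = {d for d, u in deals.items()
                  if not any(dominates(v, u) for v in deals.values())}
negotiation_set = individually_rational & pareto_optimal
```

The intersection comes out to the three deals on the slide: (a, b), (b, a), and (∅, ab); (ab, ∅) is Pareto optimal but not individually rational for agent 1.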
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
[Figure: scatter plot of utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: utility space for agents i and j, with deals A–E plotted; the circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict utilities; the negotiation set (Pareto optimal + individually rational) is the arc of deals dominating the conflict deal]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: one-step protocol
– Concession protocol
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that:
– both deals are from the negotiation set
– ∀i and t > 0: Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t−1))
– Agreement: ∃j ≠ i: Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
• Only A ⇒ agree on δ(B, t): A agrees with B's proposal
• Only B ⇒ agree on δ(A, t): B agrees with A's proposal
• Both A and B ⇒ agree on the δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Pure deals
Mixed deal
47
The Monotonic Concession Protocol ndash One direction move towards middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (and can re-propose the same one later)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (Assumes we know what the other agent values)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta-data: explanation or critique of the deal
48
Condition to Consent an Agreement
If each of the agents finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
[Diagram: agent 1's best deal at one end, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Diagram: agent 1's best deal at one end, agent 2's best deal at the other; each agent asks "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict reflects the difference between its loss from making a concession and its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk is willingness to risk conflict (1 means perfectly willing to risk).
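The formula turns into a one-line helper. This is a sketch; the convention that an agent with nothing to gain (own utility 0) has risk 1 is an assumption, chosen to be consistent with the examples that follow.

```python
def risk(u_own, u_other_offer):
    """Zeuthen risk: (Utility_i(own offer) - Utility_i(opponent's offer)) / Utility_i(own offer).

    u_own: agent i's utility of its own current offer
    u_other_offer: agent i's utility of the opponent's current offer
    """
    if u_own == 0:
        return 1.0  # nothing to gain, so fully willing to risk conflict
    return (u_own - u_other_offer) / u_own

r_first = risk(1, 0)   # opening round of the earlier parcel example: 1.0
r_later = risk(3, 1)   # an agent with utilities 3 (own) vs. 1 (other): 2/3
```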
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk)
• This assumes you know the other agent's utility
• What one sets as an initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: a line from Ai's best deal to Aj's best deal, with the conflict deal below; arrows mark "maximum to gain from agreement", "maximum still hoped to gain", and "how much am I willing to risk a conflict?"]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) But not so much that you give up more than you need to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each marked "he should concede"]
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: cities a, b, c, d in a row, distance 1 between neighbors; the distribution point is distance 7 from a and 7 from d]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are IR, as neither agent can do worse than the conflict deal; e.g. (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal | Agent 1's utility | Agent 2's utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
Conflict deal: 0 | 0
Agent 1 starts at deal 5 and agent 2 at deal 1; with equal risk both concede (5 → 4 → 3 and 1 → 2 → 3).
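The table above can be fed to a small simulation of the monotonic concession protocol under the Zeuthen strategy. A sketch: the concession step (move to your next-best remaining deal) and the tie rule (both concede on equal risk) are assumptions consistent with the slides, and we assume an agreement is always reachable.

```python
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]  # (U1, U2) for deals 1..5

def risk(u_own, u_other):
    return 1.0 if u_own == 0 else (u_own - u_other) / u_own

# Each agent opens with its best deal.
offer1 = max(deals, key=lambda d: d[0])   # agent 1 proposes (10, 0)
offer2 = max(deals, key=lambda d: d[1])   # agent 2 proposes (0, 10)

while True:
    # Agreement: an agent likes the opponent's offer at least as much as its own.
    if offer2[0] >= offer1[0]:
        agreement = offer2
        break
    if offer1[1] >= offer2[1]:
        agreement = offer1
        break
    r1 = risk(offer1[0], offer2[0])
    r2 = risk(offer2[1], offer1[1])
    if r1 <= r2:  # agent 1 concedes: next-best deal for itself
        offer1 = max((d for d in deals if d[0] < offer1[0]), key=lambda d: d[0])
    if r2 <= r1:  # agent 2 concedes (on equal risk, both concede)
        offer2 = max((d for d in deals if d[1] < offer2[1]), key=lambda d: d[1])
```

Tracing it: round 1 both have risk 1 and concede to deals 4 and 2; round 2 both have risk 2/3 and concede again; they meet at deal 3 with utilities (2, 2).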
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility, rather than the expected utility of the (0, 10) choice
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future or sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State-Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table – only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans. There is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can always just do your original task
• With SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 over a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• Looking for a state in the intersection of the goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g. both need a car)
– Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent would get a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a mixed deal, then
cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent (the role i plays with probability 1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the parcel delivery example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Figure: distribution point connected to city a and city b, distance 1 to each; distance 2 between a and b]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (∅, ab) : p means agent 1 does ∅ with probability p and ab with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p·1 = utility for agent 1
• (1−p)(3) + p·0 = utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
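The same linear equation can be solved exactly with fractions. A sketch, using the two branch utilities from the slide (agent 1 does nothing with probability p, everything with probability 1−p):

```python
from fractions import Fraction

a1, b1 = 1, -2   # agent 1's utility: with prob p / with prob 1-p
a2, b2 = 0, 3    # agent 2's utility: with prob p / with prob 1-p

# Solve  p*a1 + (1-p)*b1 = p*a2 + (1-p)*b2  for p:
#   p*(a1 - b1) + b1 = p*(a2 - b2) + b2
p = Fraction(b2 - b1, (a1 - b1) - (a2 - b2))   # = 5/6

expected_u = p * a1 + (1 - p) * b1             # equal for both agents: 1/2
```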
74
Try again with other choice in negotiation set
• (a, b) : p means agent 1 does a with probability p and b with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p·0 = utility for agent 1
• (1−p)(2) + p·2 = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything), such that the mixed deal m = [(T_A ∪ T_B, ∅) : p] satisfies π(m) = max over deals d in NS of π(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
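The slide's exercise is easy to check: over all integer splits of a total utility of 10, the product peaks at the even division. A minimal sketch:

```python
# All integer splits of a total utility of 10 between two agents.
splits = [(u, 10 - u) for u in range(11)]

# The product-maximizing (Nash product) split is the fair 5/5 division.
best = max(splits, key=lambda s: s[0] * s[1])
```

For instance, 9·1 = 9 and 6·4 = 24, but 5·5 = 25, so the even split wins.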
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be 4 – so the utility to each is 2
78
Examples: Compromise – both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. Cost for one is 2; cost for both averages 4
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? Can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we could use probability to make it "fair"
• Assign (2, 6) p of the time
• Utility for agent 1 = p(1) + (1-p)(-3) - it loses utility if it takes on cost 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal: 4p - 3 = 4 - 4p, so p = 7/8
• Thus I can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win. Could flip a coin to decide who wins - better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 2 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap is 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as he helped with the swap)
• So with probability 1/2 each: 12(1/2) + (-4)(1/2) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with highest value
• We assume that the collection of agents has available a set of joint plans - a joint plan is executed by several different agents
• Note - not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation covers a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example - multi-agent Tile world (like an airport shuttle): the payoff isn't just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support - a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal - need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
  - Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
  - Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  - Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  - Try to find the Pareto optimum
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution
  - Exchange missing information
  - Penalty for a lie
• Possible lies
  - False information
    • Hiding letters
    • Phantom letters
  - Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets - the union adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  - Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  - Deliveries in opposite directions - doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment
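The subadditivity condition translates directly into a check over a cost table; the tree-shaped delivery costs below are made-up numbers, not from the slides:

```python
from itertools import combinations

def is_subadditive(cost):
    """Check c(X | Y) <= c(X) + c(Y) for every pair of subsets in the table."""
    for X, Y in combinations(cost, 2):
        if X | Y in cost and cost[X | Y] > cost[X] + cost[Y]:
            return False
    return True

# Made-up tree-shaped delivery costs: both stops share the trunk of the
# tree, so serving the union is cheaper than two separate trips.
tree = {
    frozenset(): 0,
    frozenset({"a"}): 3,
    frozenset({"b"}): 3,
    frozenset({"a", "b"}): 4,
}
print(is_subadditive(tree))  # True
```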
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
  - c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• Arrows show the reasons we know a fact (diagonal arrows are between domains). The rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one agent's share is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
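The modularity equation can be checked over a cost table; the per-destination fax costs below are made-up numbers illustrating the "independent costs" remark made later:

```python
def is_modular(cost):
    """c(X | Y) == c(X) + c(Y) - c(X & Y) for every pair in the table."""
    return all(cost[X | Y] == cost[X] + cost[Y] - cost[X & Y]
               for X in cost for Y in cost
               if X | Y in cost and X & Y in cost)

# Made-up fax-domain costs: each destination has an independent connection
# cost, so every subset costs exactly the sum of its members.
per_dest = {"a": 2, "b": 1, "c": 3}
fax = {frozenset(s): sum(per_dest[d] for d in s)
       for s in ["", "a", "b", "c", "ab", "ac", "bc", "abc"]}
print(is_modular(fax))  # True
```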
97
For subadditive domain
98
Attributes of task system - Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that the task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes - Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of Characterization of Relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered (penalty)
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: if A1 hides a letter to c, his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected utility: (abc) : 1/2 = 5
• Lie: p = 1/2 (as the apparent utility is the same)
• Expected utility (for 1): (abc) : 1/2 = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie)
103
• FP2: in a Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out)
• FP3: in a Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost - hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit)
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) - just like in this case. The lie makes no difference
I'm assuming we have some way of deciding who gets the better deal that is fair over time
Agent 1 gets | U(1) | Agent 2 gets | U(2) seems | U(2) actual
a            | 2    | bc           | 4          | 4
b            | 4    | ac           | 2          | 2
bc           | 2    | a            | 4          | 2
ab           | 0    | c            | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is (ab) : 1/2 = 1/2(4) + 1/2(2) = 3
• Lie: (b, ca) is the logical division, as there is no probability split. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability: "if you deliver, I make you deliver to h"
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So - lying helped
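The probabilities above can be re-derived mechanically with exact fractions (the payoff numbers are the ones from the slide):

```python
from fractions import Fraction as F

def equalize(w1, l1, w2, l2):
    """p such that p*w1 + (1-p)*l1 == p*w2 + (1-p)*l2."""
    return F(l2 - l1, w1 - l1 - w2 + l2)

p_truth = equalize(-1, 6, 4, -3)        # truthful encounter
p_lie = equalize(-3, 6, 4, -5)          # with the decoy letter to h
true_util = lambda p: p * F(-1) + (1 - p) * 6   # agent 1's real payoffs
print(p_truth, true_util(p_truth))      # 9/14 3/2   (i.e. 1.5)
print(p_lie, true_util(p_lie))          # 11/18 31/18 (about 1.72: the lie helps)
```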
107
Postmen - return to post office
[Graphs: concave example; subadditive example (h is the decoy); phantom example]
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 - UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((faebcd) : 1/2) = 4 (saves going to two)
• Under the lie, dividing as ((efdcab) : p), you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie with ((abcdef) : p): p(4) + (1-p)(0) = p(2) + (1-p)(6), so 4p = -4p + 6 and p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef 1/4 of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going 5 places (which is what I was assigned originally) - zero utility in that case
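A quick check of the arithmetic with exact fractions (payoffs as given above):

```python
from fractions import Fraction as F

p = F(6, 8)                       # from 4p + 0(1-p) = 2p + 6(1-p)  =>  8p = 6
apparent = p * 4 + (1 - p) * 0    # what the split looks like under the lie
actual = p * 6 + (1 - p) * 0      # the hidden letter is worth 2 more when A1 wins
print(p, apparent, actual)        # 3/4 3 9/2  (9/2 = 4.5 > 4, so the lie pays)
```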
110
[Graph: modular domain example]
111
Conclusion
- In order to use negotiation protocols, it is necessary to know when protocols are appropriate
- TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents
  - Company
  - Union
  - Mediator
• Tasks
  - Generation of proposal
  - Generation of counter proposal based on feedback from the dissenting party
  - Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (like in a court of law - cite previous decisions)
• Process
  - Retrieve appropriate precedent cases from memory
  - Select the most appropriate case
  - Construct an appropriate solution
  - Evaluate the solution for applicability to the current case
  - Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages
  - Minimizes the need for information exchange
  - Avoids problems by reasoning from past failures: intentional reminding
  - Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals
  - Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff
  - Change the importance attached to an issue
  - Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory - 30 more proposals
• Without argumentation - fewer proposals and better solutions
• No failure avoidance - more proposals with objections
• No preference analysis - oscillatory condition
• No feedback - communication overhead increased by 23
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs)
[Graph: worth function for the first agent - worth rises from 0 to 100 as the meeting time moves from 0900 through 1200 to 1600]
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
[Graph: utility vs. number of negotiation rounds - Agent_i's and Agent_j's curves converge over time to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement
[Graph: utility vs. number of negotiation rounds - Agent_i's and Agent_j's curves never cross; Agent_j finds the offer unacceptable]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations
• Positions cannot be justified - why did the agent pay so much for the car?
• Positions cannot be changed - initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical - "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional - "How would you feel if it happened to you?"
3. Visceral - a participant stamps their feet and shows the strength of their feelings
4. Kisceral - appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where
  - Database is a (possibly inconsistent) set of logical formulae
  - Sentence is a logical formula known as the conclusion
  - Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature:
  - A attacks B = A undercuts B or A rebuts B
  - A defeats B = A undercuts B, or (A rebuts B and B does not undercut A)
  - A strongly attacks B = A attacks B and B does not undercut A
  - A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption - an argument is in unless proven otherwise
131
Admissible Arguments - mutually defensible
1. argument x is attacked by a set if some member y of the set attacks x (y → x)
2. argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
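These four definitions can be sketched directly; the attack graph below is a made-up example (not the graph on the following slide):

```python
from itertools import chain, combinations

# Made-up attack graph: a and b attack each other, and b attacks c.
attacks = {("a", "b"), ("b", "a"), ("b", "c")}
arguments = {"a", "b", "c"}

def conflict_free(S):
    """No member of S attacks another member of S."""
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    """Every attacker of x is itself attacked by some member of S."""
    return all(any((z, y) in attacks for z in S)
               for (y, target) in attacks if target == x)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

every_subset = chain.from_iterable(
    combinations(sorted(arguments), r) for r in range(len(arguments) + 1))
print([set(S) for S in every_subset if admissible(set(S))])
# the empty set, {a}, {b}, and {a, c} defend themselves; {c} alone cannot
```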
132
[Argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked
d is always acceptable
133
An Example Abstract Argument System
16
Mechanisms Design
• Mechanism design is the design of protocols for governing multi-agent interactions
• Desirable properties of mechanisms are:
  - Convergence/guaranteed success
  - Maximising global welfare: the sum of agent benefits is maximized
  - Pareto efficiency
  - Individual rationality
  - Stability: no agent should have an incentive to deviate from its strategy
  - Simplicity: low computational demands, little communication
  - Distribution: no central decision maker
  - Symmetry: we do not want agents to play different roles (all agents have the same choice of actions)
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs of choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of the negotiation history
19
Thought Question
• Why not just compute a joint solution - using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Diagram: Agent_i and Agent_j exchange proposal and counter-proposal; eventually Agent_i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (we can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Diagram: proposals by Ai and proposals by Aj approaching a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple-objective linear programming
  - requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value; continue until no joint improvements are possible
  - Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps the players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps the players find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems:
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative)
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual)
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  - Multiple attributes
    • Single attribute (price) - a symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes - several inter-related attributes, e.g., buying a car
  - The number of agents and the way they interact
    • One-to-one, e.g., a single buyer and a single seller
    • Many-to-one, e.g., multiple buyers and a single seller: auctions
    • Many-to-many, e.g., multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
  - If one task requires less travel, both would benefit equally from having less travel
  - The utility of a task is experienced the same way by whomever is assigned to that task
• Non-symmetric - we would benefit differently if roles were reversed
  - If you delivered the picnic table, you could just throw it in the back of your van. If I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down and quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• It may not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job... ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One-to-one
• One-to-many (an auction is an example of one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
  - n(n-1)/2 pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents - such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  - Utility represents how much an agent has to gain from the deal (it is always based on the change from the original allocation)
  - Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, the agent is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if the agents fail to reach an agreement
  - where no agent agrees to execute tasks other than its own
  - utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where
  - T is a finite set of all possible tasks
  - Ag = {A1, A2, ..., An} is a list of participant agents
  - c: ℘(T) → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
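A minimal sketch of a TOD cost function as a table over task subsets, with the assumptions checked explicitly (the toy numbers match the later parcel example: c(a) = c(b) = 1, c(ab) = 3):

```python
# A single shared table also captures assumption 2: the cost does not
# depend on which agent executes the tasks.
cost = {
    frozenset(): 0,
    frozenset({"a"}): 1,
    frozenset({"b"}): 1,
    frozenset({"a", "b"}): 3,
}

assert cost[frozenset()] == 0          # assumption 1: c(empty set) = 0
for T1 in cost:                        # assumption 3: monotonicity
    for T2 in cost:
        if T1 <= T2:                   # subset comparison on frozensets
            assert cost[T1] <= cost[T2]
print("cost table satisfies the TOD assumptions")
```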
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment, D is the assignment after the "deal"
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned: D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal
• For each deal δ = (D1, D2), the cost of the deal to agent k is Cost_k(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery:
  Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal
• Database Queries:
  Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, ...). "You are doing a join as part of another operation, so please save the results for me"
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab) - the conflict deal
35
Figure the deals knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack"
• a for agent 1: b | ab for agent 2
• b for agent 1: a | ab
• ab for agent 1: ∅ | a | ab | b
• ∅ for agent 1: ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utility_k(δ) = c(Tk) - Cost_k(δ) = c(Tk) - c(Dk), where δ = (D1, D2) is a deal
  - c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
  - Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0
37
Parcel Delivery Domain (assuming they do not have to return home - like U-Haul)
[Diagram: a distribution point connected to city a and city b, each at distance 1; cities a and b are distance 2 apart]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1, 2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
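The two conditions translate directly into code over utility vectors (u1, u2); a minimal illustrative sketch:

```python
# Dominance between deals, expressed over their utility vectors.
def dominates(u, v):
    """u dominates v: at least as good for every agent, better for some agent."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

def weakly_dominates(u, v):
    """Only the first condition: u is not worse than v for anyone."""
    return all(a >= b for a, b in zip(u, v))

print(dominates((2, 2), (0, 2)))        # True: better for agent 1, equal for agent 2
print(dominates((2, 2), (3, 0)))        # False: worse for agent 1
print(weakly_dominates((2, 2), (2, 2))) # True: ties weakly dominate each other
```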
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you have already).
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (you cannot make the deal better for one agent without disadvantaging the other).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
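Both filters are easy to run over the nine pure deals of the running example; the utility pairs below are the ones tabulated on the next slide ("0" stands for Ø), and the code is an illustrative sketch:

```python
# Negotiation set = individually rational AND Pareto optimal deals.
utils = {"(a,b)": (0, 2), "(b,a)": (0, 2), "(ab,0)": (-2, 3),
         "(0,ab)": (1, 0), "(a,ab)": (0, 0), "(b,ab)": (0, 0),
         "(ab,a)": (-2, 2), "(ab,b)": (-2, 2), "(ab,ab)": (-2, 0)}
conflict = (0, 0)  # utility of the conflict deal

def dominates(u, v):
    # weakly better everywhere and not identical => strictly better somewhere
    return all(a >= b for a, b in zip(u, v)) and u != v

individually_rational = {d for d, u in utils.items()
                         if all(a >= b for a, b in zip(u, conflict))}
pareto_optimal = {d for d, u in utils.items()
                  if not any(dominates(v, u) for v in utils.values())}

print(sorted(individually_rational & pareto_optimal))  # ['(0,ab)', '(a,b)', '(b,a)']
```

This reproduces the slides that follow: five individual rational deals, four Pareto-optimal deals, and a three-deal negotiation set.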
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, Ø) = −2
4. Utility1(Ø, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, Ø) = 3
4. Utility2(Ø, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either agent)
1. (a, b)
2. (b, a)
3. (ab, Ø)
4. (Ø, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individual rational:
(a, b)
(b, a)
(Ø, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, Ø)
4. (Ø, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, Ø)
(Ø, ab)
Deals 5–9 are each beaten by another deal. (ab, Ø) gives (−2, 3): agent 1 does badly, but nothing beats 3 for agent 2, so it is still Pareto optimal.
43
Negotiation Set
Negotiation set:
(a, b)
(b, a)
(Ø, ab)
Individual rational deals:
(a, b)
(b, a)
(Ø, ab)
(a, ab)
(b, ab)
Pareto optimal deals:
(a, b)
(b, a)
(ab, Ø)
(Ø, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j.
• Only deals where both utilities are positive are individually rational for both agents (the origin is the conflict deal).
• Which are Pareto optimal?
(axes: utility for i vs. utility for j)
45
Negotiation Set in Task-oriented Domains
(figure: deals A, B, C, D, E plotted by utility for agent i against utility for agent j; the conflict deal sits at the conflict utilities for i and j; the circle delimits the space of all possible deals; the negotiation set is the Pareto-optimal + individual rational boundary)
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
– Concession protocol
• At step t ≥ 0, A offers δ(A,t) and B offers δ(B,t) such that:
– Both deals are from the negotiation set
– ∀i and t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t−1)) for both agents
– Agreement: ∃j ≠ i such that Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t))
• Only A agrees ⇒ agree on δ(B,t) (A accepts B's proposal)
• Only B agrees ⇒ agree on δ(A,t) (B accepts A's proposal)
• Both agree ⇒ agree on the δ(k,t) such that π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
• Both agree and π(δ(A,t)) = π(δ(B,t)) ⇒ flip a coin (the products are the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
These rules apply both to pure deals and to mixed deals.
47
The Monotonic Concession Protocol – movement in one direction, towards the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same deal later).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal.
• Meta data: an explanation or critique of the deal.
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic concession protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
(figure: agent 1's best deal at one end, agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(figure: between agent 1's best deal and agent 2's best deal, each agent asks "how much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when the difference between its loss from making a concession and its loss from taking the conflict deal (relative to its current offer) is small.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
riski = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j, respectively.
riski is the willingness to risk conflict (1 means perfectly willing to risk).
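On the running parcel example, where agent 1 opens with (Ø, ab) (worth 1 to itself, 0 to it under the opponent's offer) and agent 2 opens with (a, b) (worth 2 to itself, 0 under the opponent's offer), the risks come out equal. A minimal sketch, with the usual convention that risk is 1 when the own offer is worth nothing:

```python
# Zeuthen risk from the two utilities an agent assigns to the current offers.
def risk(u_own_offer, u_other_offer):
    """Fraction of the current gain lost by accepting the opponent's offer.
    Convention: risk = 1 when the agent's own offer is worth nothing to it."""
    return 1.0 if u_own_offer == 0 else (u_own_offer - u_other_offer) / u_own_offer

print(risk(1, 0))  # agent 1: (1 - 0) / 1 = 1.0
print(risk(2, 0))  # agent 2: (2 - 0) / 2 = 1.0 -- equally willing to risk conflict
```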
55
Risk Evaluation
• riski measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility.
• What you set as your initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(figure: the span between Ai's best deal and Aj's best deal, with the conflict deal below; annotations: "How much am I willing to risk a conflict?", "Maximum to gain from agreement", "Maximum still hoped to gain")
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again in the next round. But not so much that you give up more than you need to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – it is in Nash equilibrium: if one agent is using the strategy, the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain: recall that agent 1 delivers to a, and agent 2 delivers to a and b
Negotiation set:
(a, b)
(b, a)
(Ø, ab)
First offers: agent 1 offers (Ø, ab); agent 2 offers (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(Ø, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(Ø, ab) = 0
Risk of conflict: 1 for each agent.
Can they reach an agreement? Who will concede?
60
Conflict Deal
(figure: agent 1's best deal and agent 2's best deal, each annotated "he should concede")
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(figure: cities a, b, c, d in a line with unit-length edges between neighbors; the distribution point is 7 from a and 7 from d)
Cost function: c(Ø) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, Ø), (abc, d), (ab, cd), (a, bcd), (Ø, abcd)
Conflict deal: (abcd, abcd)
All of these choices are individual rational, as neither agent can do worse than the conflict deal; a split such as (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)
No. | Pure deal | Agent 1's utility | Agent 2's utility
1 | (abcd, Ø) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (Ø, abcd) | 10 | 0
Conflict deal | | 0 | 0
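The claim that Zeuthen settles this encounter can be simulated. The deal labels and utilities are the table's ("0" stands for Ø); the minimal-concession rule used here (concede to your best own deal that strictly raises the opponent's utility) is an assumption for illustration, not the source's exact rule:

```python
# Sketch of the Zeuthen strategy over the five negotiation-set deals above.
deals = {"(abcd,0)": (0, 10), "(abc,d)": (1, 3), "(ab,cd)": (2, 2),
         "(a,bcd)": (3, 1), "(0,abcd)": (10, 0)}

def risk(my_offer, their_offer, me):
    """Zeuthen risk: fraction of current gain lost by accepting the other offer."""
    mine, theirs = deals[my_offer][me], deals[their_offer][me]
    return 1.0 if mine == 0 else (mine - theirs) / mine

def concede(offer, me, other):
    """Move to the best own deal that strictly improves the opponent's utility."""
    better = [d for d in deals if deals[d][other] > deals[offer][other]]
    return max(better, key=lambda d: deals[d][me])

def zeuthen():
    o1 = max(deals, key=lambda d: deals[d][0])   # each opens with its best deal
    o2 = max(deals, key=lambda d: deals[d][1])
    while True:
        if deals[o2][0] >= deals[o1][0]:         # agent 1 accepts agent 2's offer
            return o2
        if deals[o1][1] >= deals[o2][1]:         # agent 2 accepts agent 1's offer
            return o1
        r1, r2 = risk(o1, o2, 0), risk(o2, o1, 1)
        if r1 <= r2:                             # equal risk => both concede
            o1 = concede(o1, 0, 1)
        if r2 <= r1:
            o2 = concede(o2, 1, 0)

print(zeuthen())  # "(ab,cd)": both concede twice on equal risk and meet in the middle
```

Tracing it: offers start at (10, 0) vs. (0, 10) with risk 1 each, move to (3, 1) vs. (1, 3) with risk 2/3 each, and meet at (ab, cd) with utilities (2, 2).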
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the (0, 10) of another choice.
• Is there a solution?
• Fairness versus higher global utility.
• Restrictions of this method: no promises about the future, and no sharing of utility.
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, so they help and do not hinder each other.
• Example: slotted blocks world – blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) means two workers can get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk.
• c(J)k is the cost of k's role in joint plan J.
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD coordinated plans are never worse, as you can always just do your original task.
• With SOD you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs it).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car).
– A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but one may need to know how to pick up the other).
– There might be a reachable state which satisfies both, but it may be too expensive – they are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall (8 vs. 10), but state 2 is more equal. How can we get cooperation – why should one agent agree to do more?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain role in the plan.
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p · c(J)i + (1 − p) · c(J)k
where k is i's opponent – the role i plays with probability (1 − p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the parcel delivery example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
(figure: distribution point with roads of length 1 to city a and to city b, and length 2 between a and b)
Cost function: c(Ø) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, Ø) = −2
4. Utility1(Ø, ab) = 1
…
Utility for agent 2 (original task: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, Ø) = 3
4. Utility2(Ø, ab) = 0
…
73
Consider deal 3 with a probability
• (Ø, ab) : p means agent 1 does Ø with probability p and ab with probability (1 − p).
• What should p be to be fair to both (equal utility)?
• (1 − p)(−2) + p(1) = utility for agent 1
• (1 − p)(3) + p(0) = utility for agent 2
• (1 − p)(−2) + p(1) = (1 − p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
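The balance equation above is linear in p, so it can be solved in closed form; a small sketch using exact rationals (the four per-outcome utilities are the example's, the helper is illustrative):

```python
from fractions import Fraction

def fair_p(u1_at_p1, u1_at_p0, u2_at_p1, u2_at_p0):
    """Solve p*u1_at_p1 + (1-p)*u1_at_p0 = p*u2_at_p1 + (1-p)*u2_at_p0 for p."""
    num = u2_at_p0 - u1_at_p0
    den = (u1_at_p1 - u1_at_p0) - (u2_at_p1 - u2_at_p0)
    return Fraction(num, den)

# Agent 1: utility 1 when it does nothing (prob p), -2 when it does everything.
# Agent 2: utility 0 when it does everything (prob p), 3 otherwise.
p = fair_p(1, -2, 0, 3)
print(p)  # 5/6
```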
74
Try again with the other choice in the negotiation set
• (a, b) : p means agent 1 does a with probability p and b with probability (1 − p).
• What should p be to be fair to both (equal utility)?
• (1 − p)(0) + p(0) = utility for agent 1
• (1 − p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• An all-or-nothing deal (one agent does everything) is a mixed deal of the form m = [(TA ∪ TB, Ø) : p]; an optimal one satisfies π(m) = maxδ∈NS π(δ).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it gives a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both products are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up at the same time, costing each agent 2 – a win, as neither had to move the other block out of the way.
• If done by one agent the cost would be 4, so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is at slot 1, black at slot 2, and two gray blocks at slot 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is 2; the cost for both averages 4.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth of 6 to the final goal, we can use probability to make it "fair".
• Let p be the probability that A1 gets the easier role (cost 2) and A2 the harder role (cost 6).
• Utility for agent 1 = p(1) + (1 − p)(−3) – it loses utility if it pays 6 for a benefit of 3.
• Utility for agent 2 = p(0) + (1 − p)(4)
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
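The same closed-form solution as before applies here; a sketch with the slide's worths and role costs (the variable names are mine):

```python
from fractions import Fraction

# Worths and role costs from the slide: agent 1 values the goal at 3,
# agent 2 at 6; the easy role costs 2 and the hard role costs 6.
w1, w2, easy, hard = 3, 6, 2, 6

# p = probability agent 1 takes the easy role.
# Agent 1's expected utility: p*(w1-easy) + (1-p)*(w1-hard)
# Agent 2's expected utility: p*(w2-hard) + (1-p)*(w2-easy)
# Setting them equal and solving for p:
p = Fraction((w2 - easy) - (w1 - hard),
             ((w1 - easy) - (w1 - hard)) - ((w2 - hard) - (w2 - easy)))
print(p)  # 7/8
```

With p = 7/8, both agents end up with an expected utility of 1/2.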
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone anyway). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers pulled out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, continued
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability ½ each: ½(12) + ½(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note: not "all or nothing" – what matters is how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation over a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective.
• Example: multi-agent tile world (like an airport shuttle) – worth isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimality: find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute:
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute:
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
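The weighted sum is straightforward to sketch. The weights are the slide's; the attribute scores and the normalization to [0, 1] are assumptions for illustration:

```python
# Linear additive utility over attributes, with the slide's weights.
# Assumes each attribute score has been normalized to [0, 1] beforehand.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    """Weighted sum of normalized attribute scores."""
    return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS)

# Hypothetical offers: a cheap basic package vs. a pricey high-end one.
cheap_basic = {"price": 0.9, "quality": 0.4, "support": 0.2}
pricey_good = {"price": 0.3, "quality": 0.9, "support": 0.9}
print(round(utility(cheap_basic), 2))  # 0.65
print(round(utility(pricey_good), 2))  # 0.54
```

With a 60% weight on price, this agent prefers the cheap package; a different weighting would reverse the ranking.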
89
Incomplete Information
• Agents don't know the tasks of others in a TOD.
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one city saves distance to the other (in a tree arrangement).
• Example of subadditive TOD with equality (= rather than <): deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
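The property can be checked exhaustively on a small cost table. The numeric tables below are invented to illustrate the two subadditive cases just mentioned (tree arrangement, and opposite directions where the inequality is tight):

```python
def is_subadditive(cost):
    """Check c(X | Y) <= c(X) + c(Y) for every pair of task sets in the table."""
    sets = list(cost)
    return all(cost[x | y] <= cost[x] + cost[y] for x in sets for y in sets)

# Tree-shaped delivery costs (sharing the trunk saves distance): strict saving.
tree = {frozenset(): 0, frozenset("a"): 2, frozenset("b"): 2, frozenset("ab"): 3}
# Opposite directions: doing both saves nothing, so equality holds.
opposite = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 2}

print(is_subadditive(tree), is_subadditive(opposite))  # True True
```

The check requires the table to be closed under union, which holds here since all subsets of {a, b} are listed.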
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• "I need to pick something up at the store." (You can invent something for them to pick up – but if you are the one assigned, you won't bother to make the trip.)
• "I need to deliver an empty letter." (No good to anyone, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth telling is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• Suppose we have two task sets X and Y, where X is a subset of Y, and another set of tasks Z is introduced. Then:
c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• The arrows show the reasons we know each fact (diagonal arrows are between domains); the start of each rule is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others.
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No – unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as the costs are independent).
• Modularity implies concavity.
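Both definitions can be verified exhaustively on a small task system. The per-task connection costs below are invented, but they mimic the fax domain's independent-cost structure, which is exactly what makes a cost function modular:

```python
from itertools import chain, combinations

def powerset(tasks):
    s = list(tasks)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def is_modular(cost):
    """c(X | Y) == c(X) + c(Y) - c(X & Y) for all pairs."""
    sets = list(cost)
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in sets for y in sets)

def is_concave(cost):
    """For X subset of Y and any Z: c(X | Z) - c(X) >= c(Y | Z) - c(Y)."""
    sets = list(cost)
    return all(cost[x | z] - cost[x] >= cost[y | z] - cost[y]
               for x in sets for y in sets if x <= y for z in sets)

# Fax-domain-style costs: each task has an independent connection cost,
# so the cost of a set is just the sum over its members.
per_task = {"a": 1, "b": 2, "c": 4}
fax = {s: sum(per_task[t] for t in s) for s in powerset("abc")}

print(is_modular(fax), is_concave(fax))  # True True: modular implies concave
```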
101
3-dimensional table of characterization of relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered (given a stiff enough penalty).
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: In a subadditive TOD, in any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2
• Expected utility: (Ø, abc) : 1/2 = 5
• If he lies: p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1): (Ø, abc) : 1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter himself)
(figure: delivery graph with edge weights 1, 4, 4, 1)
103
• FP2: In a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out).
• FP3: In a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: In a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just as in this case. The lie makes no difference.
(We assume some way, fair over time, of deciding who gets the better deal.)
Agent 1's part | U(1) | Agent 2's part | U(2) | Seeming U(2)
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6
105
Non-incentive compatible fixed points
• FP5: In a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is (Ø, ab) : 1/2 = ½(4) + ½(2) = 3.
• Lie: (b, ca) is the logical division, as a pure deal has no percentage split.
• Utility for agent 1 is 6 (original cost) − 2 (cost of his part of the deal) = 4.
106
• FP6: In a subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability: "if you deliver, I make you deliver to h as well."
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to ≈1.72. (If A1 ends up delivering, he doesn't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, since
p(−1) + (1 − p)(6) = p(4) + (1 − p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, since
p(−3) + (1 − p)(6) = p(4) + (1 − p)(−5)
• Utility(p = 9/14) is p(−1) + (1 − p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1 − p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
107
Postmen domain – agents return to the post office
(figures: the concave example, the subadditive example where h is the decoy, and the phantom example)
108
Non incentive compatible fixed points
• FP7: In a modular TOD, in any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load appears to cost more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
• FP8: In a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under the truth: Util((fae, bcd) : 1/2) = 4 (each saves going to two nodes).
• Under the lie, dividing as (ef, dcab) : p means you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with (ab, cdef) : p:
p(4) + (1 − p)(0) = p(2) + (1 − p)(6)
4p = −4p + 6
p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5.
• Note: when I get assigned cdef (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places – which is what I was assigned originally: zero utility for that part.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused, which reduces computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Derives an overall utility curve from the individual ones.
• Expresses the trade-offs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Get feedback from the rejecting party:
– Objectionable issues
– Reasons for rejection
– Importance attached to issues
• Increase the payoff of the rejecting party by a greater amount than the reduction in payoff for the agreeing parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory condition.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(figure: worth function for the first agent, rising from 0 to 100 over the hours 9, 12, 16)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
(figure: utility vs. number of negotiation rounds – agent i's and agent j's utility curves converge to the point of acceptance)
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable.
(figure: utility vs. number of negotiation rounds – the two agents' utility curves never meet)
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B = A undercuts B, or A rebuts B
  – A defeats B = A undercuts B, or (A rebuts B and B does not undercut A)
  – A strongly attacks B = A attacks B and B does not undercut A
  – A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition: hierarchy of attacks
  Undercuts = u
  Strongly undercuts = su = u − u⁻¹
  Strongly attacks = sa = (u ∪ r) − u⁻¹
  Defeats = d = u ∪ (r − u⁻¹)
  Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments – mutually defensible
1. An argument x is attacked by a set of arguments S if some member y of S attacks x (y → x)
2. An argument x is acceptable with respect to a set S if every attacker of x is attacked by S
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable with respect to it (any attackers are attacked)
132
[Figure: an attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
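These definitions can be checked mechanically. The sketch below uses a hypothetical attack relation (the slide's figure did not survive extraction, so the specific edges are an assumption) and enumerates the admissible sets of a small abstract argument system:

```python
from itertools import combinations

# Hypothetical attack relation (stand-in for the slide's lost figure):
# a attacks b, b attacks c, d attacks c.
args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "c"), ("d", "c")}

def attacks_arg(S, x):
    """True if some member of set S attacks argument x."""
    return any((y, x) in attacks for y in S)

def conflict_free(S):
    """No member of S attacks another member of S."""
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    """x is acceptable w.r.t. S if S attacks every attacker of x."""
    return all(attacks_arg(S, y) for (y, z) in attacks if z == x)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

admissible_sets = [set(S) for r in range(len(args) + 1)
                   for S in combinations(sorted(args), r)
                   if admissible(set(S))]
print(admissible_sets)
```

In this made-up graph, d has no attackers, so (consistent with the slide's remark) it appears in every maximal admissible set.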
133
An Example Abstract Argument System
17
Attributes not universally accepted
• Can't always achieve every attribute, so look at the tradeoffs between choices; for example, efficiency and stability are sometimes in conflict with each other
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Figure: Agenti and Agentj exchange a proposal and a counter-proposal until Agenti concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Figure: proposals by Ai and proposals by Aj converge toward a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple objective linear programming
  – requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value; continue until no joint improvements are possible
  – Used in the Camp David peace negotiations (Egypt/Israel – Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to the individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  – Multiple attributes:
    • Single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes – several inter-related attributes, e.g. buying a car
  – The number of agents and the way they interact:
    • One-to-one, e.g. a single buyer and a single seller
    • Many-to-one, e.g. multiple buyers and a single seller (auctions)
    • Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
  – If one task requires less travel, both would benefit equally by having less travel
  – The utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
  – If you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation ("I didn't realize not having any test was an option") (on the job… ask for stock options, bigger office, work from home)
28
How many agents are involved
• One-to-one
• One-to-many (an auction is an example: one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
  – n(n−1)/2 negotiating pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  – Utility represents how much an agent has to gain from the deal (it is always based on the change from the original allocation)
  – Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
    • If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if the agents fail to reach an agreement
  – where no agent agrees to execute tasks other than its own
  – utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple <T, Ag, c> where:
  – T is a finite set of all possible tasks
  – Ag = {A1, A2, …, An} is a list of participant agents
  – c: ℘(T) → R⁺ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks mean more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>; T is the original assignment and D is the assignment after the "deal":
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of a deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery: several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries: several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal (the original allocation (T1, T2))
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure deals knowing union must be ab
bull Choices for first agent a b ab
bull Second agent must ldquopick up the slackrdquo
bull a for agent 1 b|ab (for agent 2)
bull b for agent 1a|ab
bull ab for agent 1 a|ab|b|
bull for agent 1 ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal:
  – c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
  – Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
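A minimal sketch of this utility calculation, using the cost function of the parcel example that follows (the string encoding of task sets is an implementation choice, not from the slides):

```python
# TOD utility: Utility_k(deal) = c(T_k) - c(D_k), i.e. stand-alone cost
# minus the cost of the agent's part of the deal.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 3}

def utility(agent_k, encounter, deal):
    T_k = frozenset(encounter[agent_k])   # agent k's original tasks
    D_k = frozenset(deal[agent_k])        # agent k's tasks under the deal
    return cost[T_k] - cost[D_k]

encounter = ("a", "ab")        # agent 1 delivers to a; agent 2 to a and b
conflict_deal = encounter      # conflict deal: each keeps its own tasks

# The utility of the conflict deal is always 0 for both agents:
print(utility(0, encounter, conflict_deal))   # 0
print(utility(1, encounter, conflict_deal))   # 0
# Deal (∅, ab): agent 2 does everything
print(utility(0, encounter, ("", "ab")))      # 1
print(utility(1, encounter, ("", "ab")))      # 0
```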
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
[Figure: a distribution point connected to city a and city b, each at distance 1; cities a and b are at distance 2 from each other]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
  – δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}, Utilityk(δ) ≥ Utilityk(δ′)
  – δ is better for some agent than δ′: ∃k ∈ {1, 2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set (Space of Negotiation)
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (you cannot improve the deal for one agent without disadvantaging the other)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
Utility for agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
Utility for agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
All nine deals: 1. (a, b)  2. (b, a)  3. (ab, ∅)  4. (∅, ab)  5. (a, ab)  6. (b, ab)  7. (ab, a)  8. (ab, b)  9. (ab, ab)
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
All nine deals: 1. (a, b)  2. (b, a)  3. (ab, ∅)  4. (∅, ab)  5. (a, ab)  6. (b, ab)  7. (ab, a)  8. (ab, b)  9. (ab, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Deal 5, (a, ab), is beaten by the (∅, ab) deal. Deal 3 gives utilities (−2, 3), but nothing beats 3 for agent 2.
43
Negotiation Set
Negotiation set: (a, b), (b, a), (∅, ab)
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
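The whole pipeline – enumerate the pure deals, keep the individually rational ones, keep the Pareto optimal ones, and intersect – can be sketched as follows (this reproduces the example's negotiation set; the data encoding is an implementation choice):

```python
from itertools import product

# Encounter from the slides: T1 = {a}, T2 = {a, b},
# with costs c(∅)=0, c(a)=1, c(b)=1, c(ab)=3.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 3}
T = (frozenset("a"), frozenset("ab"))
tasks = T[0] | T[1]

subsets = [frozenset(s) for s in ["", "a", "b", "ab"]]
# Pure deals: every (D1, D2) whose union covers all tasks.
deals = [(d1, d2) for d1, d2 in product(subsets, subsets)
         if d1 | d2 == tasks]

def util(deal):
    """(Utility1, Utility2) for a deal: c(T_k) - c(D_k) for each agent."""
    return tuple(cost[T[k]] - cost[deal[k]] for k in (0, 1))

def dominates(d, e):
    u, v = util(d), util(e)
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

rational = [d for d in deals if all(u >= 0 for u in util(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in rational if d in pareto]
print([(sorted(d1), sorted(d2)) for d1, d2 in ns])   # the negotiation set
```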
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only deals where both utilities are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
[Figure: scatter plot of utility for i against utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: the space of all possible deals plotted by utility for agent i against utility for agent j. A circle delimits the space of all possible deals; the conflict deal is marked at the utility of the conflict deal for each agent, and the negotiation set (Pareto optimal + individual rational deals, labeled A–E) lies on the boundary above and to the right of the conflict deal]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from deal δ
• Product-maximizing negotiation protocol: one-step protocol
  – Concession protocol:
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that:
  – Both deals are from the negotiation set
  – ∀i and t > 0: Utilityi(δ(i, t)) ≤ Utilityi(δ(i, t−1)) – I propose something less desirable for me
• Negotiation ending:
  – Conflict: Utilityi(δ(i, t)) = Utilityi(δ(i, t−1)) (no one concedes)
  – Agreement: ∃j ≠ i, Utilityj(δ(i, t)) ≥ Utilityj(δ(j, t)) (an agent likes the other's offer at least as much as its own)
    • Only A ⇒ agree on δ(B, t) (A agrees with B's proposal)
    • Only B ⇒ agree on δ(A, t) (B agrees with A's proposal)
    • Both A and B ⇒ agree on the δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
    • Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Applies to both pure deals and mixed deals.
47
The Monotonic Concession Protocol – one direction, move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession (this assumes we know what the other agent values)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: an explanation or critique of the deal
48
Condition to Consent an Agreement
If each agent finds that the deal proposed by the other is at least as good as or better than the proposal it made:
  Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages:
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value).
[Figure: agent 1's best deal and agent 2's best deal at opposite ends of the deal spectrum]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal at opposite ends; each agent asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot Thenndash You have lost your expected utility (closer to zero)ndash In case conflict occurs you are not much worse offndash You are more willing to risk conflictAn agent will be more willing to risk conflict if the
difference in utility between your loss in making an concession and your loss in taking a conflict deal with respect to your current offer
bull If both are equally willing to risk both concede
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) ÷ (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

  riski = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)

where δi and δj are the current offers of agent i and agent j, respectively.
riski is the willingness to risk conflict (1 means perfectly willing to risk conflict).
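A sketch of the risk computation (the convention that an agent with zero utility for its own offer has risk 1 is the usual one for the Zeuthen strategy, stated here as an assumption):

```python
# Zeuthen risk: the fraction of its current utility agent i would lose
# by conceding and accepting agent j's offer.
def risk(util_own_offer: float, util_their_offer: float) -> float:
    """risk_i = (U_i(own offer) - U_i(their offer)) / U_i(own offer).

    An agent whose own offer is worth nothing to it has nothing to lose,
    so by convention its risk is 1.0 (perfectly willing to risk conflict).
    """
    if util_own_offer <= 0:
        return 1.0
    return (util_own_offer - util_their_offer) / util_own_offer

# Agent i values its own offer at 4 and j's offer at 1:
print(risk(4, 1))   # 0.75 -> much to lose, so less willing to risk conflict
print(risk(0, 0))   # 1.0  -> nothing to lose
```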
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know what the other agent's utility is
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: the deal spectrum from Ai's best deal to Aj's best deal, with the conflict deal below it; each agent weighs "How much am I willing to risk a conflict?" against the maximum to gain from agreement and the maximum it still hopes to gain]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose), otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple and reflects the way human negotiations work
  – Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivered to a; agent 2 delivered to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal at opposite ends, each labeled "he should concede"]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: a distribution point connected to cities a and d, each at distance 7; cities a–b, b–c, and c–d are connected by edges of distance 1]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as no agent can do worse; e.g. (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No. | Pure Deal | Agent 1's Utility | Agent 2's Utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
– | Conflict deal | 0 | 0

Agent 1 starts from deal 5 and agent 2 from deal 1; with equal risk at every round, both concede (5 → 4 → 3 and 1 → 2 → 3) until they meet at deal 3.
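A rough simulation of this example under the Zeuthen strategy (the concession rule – the lower-risk agent concedes, and a tie means both concede – follows the slides; stepping through adjacent deals in the table is a simplifying assumption):

```python
# Pure deals from the table above, as (U1, U2); agent 1's best deal is
# index 4 (utility 10) and agent 2's best deal is index 0.
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(u_own, u_other):
    # Willingness to risk conflict; nothing to lose -> risk 1.0.
    return 1.0 if u_own <= 0 else (u_own - u_other) / u_own

i, j = 4, 0   # agent 1's offer index, agent 2's offer index
# Negotiate until one agent finds the other's offer at least as good:
while deals[j][0] < deals[i][0] and deals[i][1] < deals[j][1]:
    r1 = risk(deals[i][0], deals[j][0])   # agent 1's risk
    r2 = risk(deals[j][1], deals[i][1])   # agent 2's risk
    if r1 <= r2:
        i -= 1        # agent 1 concedes one step toward agent 2's end
    if r2 <= r1:
        j += 1        # agent 2 concedes one step toward agent 1's end

print(deals[i], deals[j])   # prints "(2, 2) (2, 2)": the equal-risk deal
```

In both rounds the risks are equal (1 and 1, then 2/3 and 2/3), so both agents concede, converging on deal 3.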
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of (0, 10) of another choice
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future or sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, so they help and do not hinder each other
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can just do your original task
• With SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
  – Goals may contradict, so there is no mutually acceptable state (e.g. both need the car)
  – We can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (both could travel together, but may need to know how to pick up another person)
  – There might be a reachable state which satisfies both, but it may be too expensive and the agents are unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a mixed deal, then
  costi(δ) = p·c(J)i + (1−p)·c(J)k
  where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postmen example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: a distribution point connected to city a and city b, each at distance 1; cities a and b are at distance 2 from each other]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3, with probability
• [(∅, ab) : p] means agent 1 does ∅ with probability p, and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
  – (1−p)(−2) + p(1) = expected utility for agent 1
  – (1−p)(3) + p(0) = expected utility for agent 2
  – (1−p)(−2) + p(1) = (1−p)(3) + p(0)
  – −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
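The arithmetic can be checked with exact fractions (a verification sketch, not part of the slides):

```python
from fractions import Fraction

# Expected utility of a lottery: with probability p agent 1 delivers
# nothing, with probability 1-p it delivers everything.
def expected(p, u_if_nothing, u_if_everything):
    return p * u_if_nothing + (1 - p) * u_if_everything

# Solve p(1) + (1-p)(-2) = p(0) + (1-p)(3):
# 3p - 2 = 3 - 3p  ->  6p = 5  ->  p = 5/6
p = Fraction(5, 6)
print(expected(p, 1, -2))   # agent 1's expected utility: 1/2
print(expected(p, 0, 3))    # agent 2's expected utility: 1/2
```

Both agents end up with the same expected utility of 1/2, confirming p = 5/6 is the fair probability.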
74
Try again with the other choice in the negotiation set
• [(a, b) : p] means agent 1 does a with probability p, and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
  – (1−p)(0) + p(0) = expected utility for agent 1
  – (1−p)(2) + p(2) = expected utility for agent 2
  – 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): there is a mixed deal δm = [(TA ∪ TB, ∅) : p] whose product of utilities is maximal over the negotiation set, π(δm) = max over δ ∈ NS of π(δ)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples – Cooperative: each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples – Compromise: both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we could use probability to make it "fair"
• Assign the (2, 6) cost split p of the time:
  – Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes cost 6 for benefit 3
  – Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
  4p − 3 = 4 − 4p
  p = 7/8
• Thus we can take an unfair division and make it fair
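The same kind of exact check works for this lottery (a verification sketch; the role costs 2 and 6 and the worths 3 and 6 come from the slide):

```python
from fractions import Fraction

# With probability p, A1 gets the cheap role (worth 3, cost 2 -> utility 1)
# and A2 the expensive role (worth 6, cost 6 -> utility 0); with
# probability 1-p the roles are swapped.
def u1(p):
    return p * (3 - 2) + (1 - p) * (3 - 6)   # p(1) + (1-p)(-3)

def u2(p):
    return p * (6 - 6) + (1 - p) * (6 - 2)   # p(0) + (1-p)(4)

p = Fraction(7, 8)
print(u1(p), u2(p))   # prints "1/2 1/2": equal expected utilities
```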
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins - better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as his utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 2 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, pulled out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way on the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12.
• Losing agent utility: -4 (as he helped with the swap).
• So with probability 1/2 each: 1/2(12) + 1/2(-4) = 4.
• If both could have been satisfied, assume the cost for each is 24. Then the utility is only 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (removing the need to satisfy the other).
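As a check on the arithmetic above, a small sketch with the slide's made-up numbers (solo cost 26, cooperative swap 4, remaining solo work 10, joint-satisfaction cost 24); the function name is hypothetical:

```python
def coin_flip_expected_utility(solo_cost, swap_cost, rest_cost, p_win=0.5):
    """Expected utility of cooperating on the swap and then flipping a
    coin for the rest of the plan."""
    win = solo_cost - swap_cost - rest_cost   # winner: 26 - 4 - 10 = 12
    lose = -swap_cost                         # loser helped with the swap: -4
    return p_win * win + (1 - p_win) * lose

gamble = coin_flip_expected_utility(26, 4, 10)   # 0.5*12 + 0.5*(-4) = 4.0
both_satisfied = 26 - 24                         # utility 2 if both goals are met
print(gamble, both_satisfied)                    # gambling doubles the expected utility
```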
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume the collection of agents has available a set of joint plans - a joint plan is executed by several different agents.
• Note: not "all or nothing" - what matters is how close you got to the goal.
85
Worth-oriented Domain: Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective.
• Example: multi-agent Tile world (like an airport shuttle) - the value isn't just a specific state but the worth of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support - a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
  - Utility = price*60% + quality*15% + support*25%
• Rating/ranking each attribute:
  - price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  - price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  - Try to find the Pareto optimum.
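The weighted and constraint-based approaches above can be sketched directly; the weights and bounds are the slide's example numbers, the offer values are hypothetical, and the weighted form assumes each attribute has already been normalized so that larger is better:

```python
def weighted_utility(price_score, quality, support):
    # Weights from the slide: price 60%, quality 15%, support 25%.
    # price_score is assumed normalized (e.g. 1.0 = cheapest acceptable).
    return 0.60 * price_score + 0.15 * quality + 0.25 * support

def satisfies_constraints(price, quality, support):
    # Constraint form from the slide: price in [5,100], quality in [0,10],
    # support in [1,5]; offers outside the box are rejected outright.
    return 5 <= price <= 100 and 0 <= quality <= 10 and 1 <= support <= 5

print(weighted_utility(0.8, 0.5, 0.9))   # 0.78
print(satisfies_constraints(50, 7, 3))   # True
print(satisfies_constraints(3, 7, 3))    # False: price below the minimum
```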
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
  - Exchange missing information
  - Penalty for lying
• Possible lies:
  - False information
    • Hiding letters
    • Phantom letters
  - Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets.
• For finite X, Y ⊆ T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive (<): delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive TOD with = rather than <: deliveries in opposite directions - doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
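Subadditivity can be checked by brute force over all pairs of task subsets. A sketch; the two cost tables are hypothetical stand-ins for the examples above (opposite-direction deliveries give the = case, the threshold case violates the inequality):

```python
from itertools import combinations

def powerset(tasks):
    tasks = list(tasks)
    return [frozenset(c) for r in range(len(tasks) + 1)
            for c in combinations(tasks, r)]

def is_subadditive(tasks, cost):
    """Check c(X ∪ Y) <= c(X) + c(Y) for all subsets X, Y of tasks."""
    subsets = powerset(tasks)
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in subsets for y in subsets)

# Opposite-direction deliveries: doing both saves nothing (equality holds).
opposite = {frozenset(): 0, frozenset('a'): 1,
            frozenset('b'): 1, frozenset('ab'): 2}
# Threshold case: doing both costs MORE than the sum (new equipment needed).
threshold = {frozenset(): 0, frozenset('a'): 1,
             frozenset('b'): 1, frozenset('ab'): 3}

print(is_subadditive('ab', opposite))    # True
print(is_subadditive('ab', threshold))   # False
```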
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  • Need to pick something up at a store. (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip.)
  • Need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks, Z, is introduced:
  - c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know these facts (diagonal arrows are between domains); the rule at the beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice: modular encourages truth telling more than the others.
97
For subadditive domain
98
Attributes of task system - Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes - Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
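The chain "modular implies concave implies subadditive" can be verified by brute force on a small cost function. A sketch on a fax-like cost table (per-task costs are hypothetical; since each connection has an independent cost, the cost of a set is just the sum over its tasks):

```python
from itertools import combinations

def powerset(tasks):
    tasks = list(tasks)
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_modular(subsets, c):
    # c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y) for all X, Y
    return all(c[x | y] == c[x] + c[y] - c[x & y]
               for x in subsets for y in subsets)

def is_concave(subsets, c):
    # For X ⊆ Y and any Z: c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in subsets for y in subsets if x <= y
               for z in subsets)

def is_subadditive(subsets, c):
    return all(c[x | y] <= c[x] + c[y] for x in subsets for y in subsets)

per_task = {'a': 2, 'b': 3, 'c': 1}              # independent per-fax costs
subsets = powerset(per_task)
c = {s: sum(per_task[t] for t in s) for s in subsets}

print(is_modular(subsets, c), is_concave(subsets, c), is_subadditive(subsets, c))
# additive costs satisfy all three properties
```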
101
3-dimensional table of Characterization of Relationship: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Ex: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2.
• Expected util: ⟨(abc), ∅⟩ : 1/2 = 5
• Lie: p = 1/2 (as the apparent utility is the same).
• Expected util (for 1): ⟨(abc), ∅⟩ : 1/2 = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie).
[Figure: postmen graph for the example, with edge costs 1, 4, 4, 1]
103
• FP2: in a Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out).
• FP3: in a Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost - hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) - just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 gets | U(1) | Agent 2 gets | U(2) (seems) | U(2) (actual)
a            | 2    | bc           | 4            | 4
b            | 4    | ac           | 2            | 2
bc           | 2    | a            | 4            | 2
ab           | 0    | c            | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example from next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is ⟨(a), (b)⟩ : 1/2 = 1/2(4) + 1/2(2) = 3.
• Lie: (b, ca) is the logical division, so no probability is needed.
• Util for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a Subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h as well).
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 31/18 ≈ 1.72. (If A1 delivers, he doesn't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as:
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9.
• If he invents task h, p = 11/18, as:
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5).
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
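Both FP6 probabilities come from solving p·a + (1-p)·b = p·c + (1-p)·d for p. A sketch using exact fractions, with the slide's payoffs:

```python
from fractions import Fraction

def equalizing_p(a, b, c, d):
    """Solve p*a + (1-p)*b = p*c + (1-p)*d for p (the probability that
    agent 1 delivers everything)."""
    return Fraction(d - b, a - b - c + d)

# Truth: agent 1's payoffs are (-1, 6); agent 2's are (4, -3).
p_truth = equalizing_p(-1, 6, 4, -3)             # 9/14
# Decoy lie at h: apparent payoffs (-3, 6) and (4, -5).
p_lie = equalizing_p(-3, 6, 4, -5)               # 11/18

def real_utility_agent1(p):
    # Evaluated with the TRUE payoffs (-1, 6), not the apparent ones.
    return p * Fraction(-1) + (1 - p) * 6

print(p_truth, real_utility_agent1(p_truth))     # 9/14 3/2
print(p_lie, real_utility_agent1(p_lie))         # 11/18 31/18 -- lying helped
```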
107
Postmen - return to post office
[Figure: example graphs labeled Concave, Subadditive (h is the decoy), and Phantom]
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does).
• Ex3 (from next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Ex4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util ⟨(fa), (ebcd)⟩ : 1/2 = 4 (each saves going to two nodes).
• Under the lie, dividing as ⟨(ef), (dcab)⟩ : p means you always win and I always lose. Since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with ⟨(ab), (cdef)⟩ : p:
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6
  p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time), I still have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) - zero utility for that.
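The FP8 probability falls out of the same "equalize apparent utilities" equation; a quick check with the slide's numbers for the ⟨(ab), (cdef)⟩ division:

```python
from fractions import Fraction

# Apparent payoffs after the hide lie: agent 1 gets (4, 0) and agent 2
# gets (2, 6), depending on who wins the mixed-deal lottery.
p = Fraction(6 - 0, 4 - 0 - 2 + 6)   # solve p*4 + (1-p)*0 = p*2 + (1-p)*6
print(p)                              # 3/4

actual = p * 6 + (1 - p) * 0          # agent 1's REAL payoff when winning is 6
print(actual)                         # 9/2 = 4.5 > 4, so hiding paid off
```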
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  - Company
  - Union
  - Mediator
• Tasks:
  - Generation of proposal
  - Generation of counter-proposal based on feedback from the dissenting party
  - Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law - citing previous decisions)
• Process:
  - Retrieve appropriate precedent cases from memory
  - Select the most appropriate case
  - Construct an appropriate solution
  - Evaluate the solution for applicability to the current case
  - Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  - Minimizes the need for information exchange
  - Avoids problems by reasoning from past failures (intentional reminding)
  - Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  - Change the importance attached to an issue
  - Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory behavior
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
Worth function for first agent:
[Graph: worth rises from 0 at 0900 through 1200 to 100 at 1600]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Graph: utility vs. number of negotiation rounds for Agent_i and Agent_j; the two curves converge at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement
[Graph: utility vs. number of negotiation rounds; Agent_j finds Agent_i's offer unacceptable and the curves never meet]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  - Positions cannot be justified. Why did the agent pay so much for the car?
  - Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet and shows the strength of their feelings.
4. Kisceral: appeals to the intuitive - "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  - Database is a (possibly inconsistent) set of logical formulae;
  - Sentence is a logical formula known as the conclusion;
  - Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
  - A attacks B = A undercuts B or A rebuts B
  - A defeats B = A undercuts B, or (A rebuts B and B does not undercut A)
  - A strongly attacks B = A attacks B and B does not undercut A
  - A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Writing x → y indicates:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: an argument stands unless proven false.
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked by a set if some member y of the set attacks it (y → x)
2. argument x is acceptable with respect to a set if every attacker of x is attacked by the set
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be held together? c is always attacked; d is always acceptable.
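The definitions on the previous slide execute directly. A sketch over a hypothetical attack graph (the slide's actual figure isn't reproduced here): a and b attack each other, both attack c, and d is unattacked:

```python
def attackers_of(x, attacks):
    return {a for (a, b) in attacks if b == x}

def conflict_free(S, attacks):
    # No member of S attacks another member of S.
    return not any((a, b) in attacks for a in S for b in S)

def acceptable(x, S, attacks):
    # Every attacker of x is itself attacked by some member of S.
    return all(any((s, a) in attacks for s in S)
               for a in attackers_of(x, attacks))

def admissible(S, attacks):
    return conflict_free(S, attacks) and all(acceptable(x, S, attacks) for x in S)

attacks = {('a', 'b'), ('b', 'a'), ('a', 'c'), ('b', 'c')}
print(admissible({'d'}, attacks))        # True: d has no attackers
print(admissible({'a', 'd'}, attacks))   # True: a defends itself against b
print(admissible({'c'}, attacks))        # False: c cannot defend itself
```

In this graph no admissible set can contain c: its defenders would have to include both a and b, which conflict.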
133
An Example Abstract Argument System
18
Negotiation Protocol
• Who begins?
• Take turns
• Build off previous offers
• Give feedback (or not)
• Tell what your utility is (or not)
• Obligations
• Privacy
• Allowed proposals you can make as a result of negotiation history
19
Thought Question
• Why not just compute a joint solution - using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round.
• Communication during negotiation:
[Diagram: Agent_i and Agent_j exchange proposal and counter-proposal until Agent_i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (one can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Diagram: proposals by A_i and A_j converge to a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple-objective linear programming:
  - requires that the players construct a crude linear approximation of their utility functions.
• Jointly Improving Direction method: start out with a neutral suggested value and continue until no joint improvements are possible.
  - Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• The mediator helps the players criticize a tentative agreement (could be the status quo).
• Generates a compromise direction (where each of the k issues is a direction in k-space).
• The mediator helps the players find a jointly preferred outcome along the compromise direction, then proposes a new tentative agreement.
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  - Multiple attributes:
    • Single attribute (price) - a symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes - several inter-related attributes, e.g., buying a car
  - The number of agents and the way they interact:
    • One-to-one, e.g., a single buyer and a single seller
    • Many-to-one, e.g., multiple buyers and a single seller (auctions)
    • Many-to-many, e.g., multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would):
  - If one task requires less travel, both would benefit equally from having less travel.
  - Utility for a task is experienced the same way by whomever is assigned to that task.
• Non-symmetric - we would benefit differently if roles were reversed:
  - If you delivered the picnic table, you could just throw it in the back of your van. If I delivered it, I would have to rent a U-Haul to transport it (as my car is small).
27
Multiple Issue negotiation
• There could be hundreds of issues (cost, delivery date, size, quality).
• Some may be inter-related (as size goes down, cost goes down, quality goes up).
• It is not clear what a true concession is (larger may be cheaper but harder to store, or it spoils before it can be used).
• It may not even be clear what is up for negotiation. (I didn't realize not having any tests was an option.) (On the job: ask for stock options, a bigger office, working from home.)
28
How many agents are involved
• One-to-one
• One-to-many (an auction is an example of one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
  - n(n-1)/2 pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents - such as "who will deliver the mail?"
• All resources are available to the agent.
• Tasks are redistributed for the benefit of all agents.
30
Task-oriented Domain: Definition
• How can an agent evaluate the utility of a specific deal?
  - Utility represents how much an agent has to gain from the deal (it is always measured against the original allocation).
  - Since an agent can achieve its goal on its own, it can compare the cost of achieving the goal alone to the cost of its part of the deal.
• If utility < 0, the agent is worse off than performing its tasks on its own.
• Conflict deal (stay with the status quo) if the agents fail to reach an agreement:
  - no agent agrees to execute tasks other than its own
  - utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
  - T is a finite set of all possible tasks;
  - Ag = {A1, A2, ..., An} is a list of participating agents;
  - c: 2^T → R+ defines the cost of executing each subset of tasks.
Assumptions on the cost function:
1. c(∅) = 0.
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation).
3. The cost function is monotonic: more tasks mean more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2).
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩, T is the original assignment and D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Cost_k(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal).
33
Examples of TOD
• Parcel Delivery:
  Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the travel cost to each courier is minimal.
• Database Queries:
  Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (join, projection, union, intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab) - the conflict deal
35
Figure deals knowing union must be ab
• Choices for the first agent: ∅, a, b, ab.
• The second agent must "pick up the slack":
• a for agent 1 → b | ab for agent 2
• b for agent 1 → a | ab
• ab for agent 1 → ∅ | a | b | ab
• ∅ for agent 1 → ab
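The enumeration above is a short comprehension over pairs of subsets. A sketch for the encounter whose task union is {a, b}:

```python
from itertools import combinations

def powerset(tasks):
    tasks = list(tasks)
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def pure_deals(all_tasks):
    """All (D1, D2) whose union covers the full task set."""
    T = frozenset(all_tasks)
    return [(d1, d2) for d1 in powerset(T) for d2 in powerset(T)
            if d1 | d2 == T]

deals = pure_deals('ab')
print(len(deals))   # 9, matching the slide's enumeration
```

Each of the two tasks independently lands in D1, D2, or both, giving 3 x 3 = 9 deals.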
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utility_k(δ) = c(Tk) - Cost_k(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal:
  - c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help);
  - Cost_k(δ) is the cost of its part of the deal.
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home - like U-Haul)
[Figure: distribution point with city a and city b each at distance 1; a and b are 2 apart]
Cost function:
c(∅) = 0
c(a) = 1
c(b) = 1
c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
  - δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utility_k(δ) ≥ Utility_k(δ′)
  - δ is better for some agent than δ′: ∃k ∈ {1,2}, Utility_k(δ) > Utility_k(δ′)
• Deal δ weakly dominates δ′ if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (no worse than what you already have).
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y).
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = -2
8. Utility1(ab, b) = -2
9. Utility1(ab, ab) = -2

1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individually Rational for Both
(eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
The remaining deals are beaten by one of these; deal 3 is (-2, 3), but nothing beats 3 for agent 2.
43
Negotiation Set
Negotiation Set:
(a, b)
(b, a)
(∅, ab)

Individually Rational Deals:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)

Pareto Optimal Deals:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
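The three lists above can be computed from the parcel-delivery cost function (c(∅)=0, c(a)=1, c(b)=1, c(ab)=3, with T1 = {a} and T2 = {a, b}). A sketch:

```python
cost = {frozenset(): 0, frozenset('a'): 1,
        frozenset('b'): 1, frozenset('ab'): 3}
T1, T2 = frozenset('a'), frozenset('ab')   # agent 1 had a; agent 2 had a and b

# All pure deals: pairs of subsets whose union covers T1 ∪ T2.
deals = [(d1, d2) for d1 in cost for d2 in cost if d1 | d2 == T1 | T2]

def util(deal):
    d1, d2 = deal
    return (cost[T1] - cost[d1], cost[T2] - cost[d2])   # stand-alone minus share

def dominates(d, e):
    ud, ue = util(d), util(e)
    return ud[0] >= ue[0] and ud[1] >= ue[1] and ud != ue

rational = [d for d in deals if min(util(d)) >= 0]                       # IR
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]   # PO
negotiation_set = [d for d in rational if d in pareto]                   # NS

print(len(rational), len(pareto), len(negotiation_set))   # 5 4 3, as on the slides
print((frozenset(), frozenset('ab')) in negotiation_set)  # True: the (∅, ab) deal
```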
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j.
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal).
• Which are Pareto optimal?
[Axes: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
(Figure: deals A-E plotted as utility for agent i vs. utility for agent j. The conflict deal sits at the utility of conflict for each agent; the circle delimits the space of all possible deals; the negotiation set is the part of the boundary that is Pareto optimal and individually rational.)
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
• Concession protocol:
– At each step t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that both deals are from the negotiation set and, for each agent i and t > 0, Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t-1)) – "I propose something less desirable for me"
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t-1)) for both agents (no one concedes)
– Agreement: for some j ≠ i, Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
• Only A agrees ⇒ deal is δ(B, t): A agrees with B's proposal
• Only B agrees ⇒ deal is δ(A, t): B agrees with A's proposal
• Both agree ⇒ deal is the δ(k, t) with π(δ(k, t)) = max(π(δ(A, t)), π(δ(B, t)))
• Both agree and π(δ(A, t)) = π(δ(B, t)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
The deals offered may be pure deals or mixed deals.
47
The Monotonic Concession Protocol – agents move in one direction, towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession (this assumes we know what the other agent values)
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal
• Meta-data: an explanation or critique of the deal may accompany a proposal
48
Condition to Consent an Agreement
Both agents find that the deal proposed by the other is at least as good as (or better than) the proposal they made themselves:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetric (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic concession protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is also a way of telling the other agent what you value)
(Figure: agent 1's best deal at one end of the negotiation set, agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(Figure: agent 1's best deal vs. agent 2's best deal; each agent asks "how much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict compares its loss from making a concession with its loss from reaching the conflict deal, relative to its current offer.
• If both agents are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk conflict).
55
Risk Evaluation
• risk_i measures the fraction you still have left to gain: if it is close to one, you have gained little so far (and are more willing to risk conflict)
• This assumes you know the other agent's utility function
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Figure: A_i's best deal and A_j's best deal plotted against the conflict deal, showing the "maximum to gain from agreement" and the "maximum still hoped to gain")
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose), otherwise it will just be your turn to concede again in the next round; but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – a Nash equilibrium: if one agent is using the strategy, the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
(Figure: agent 1's best deal vs. agent 2's best deal; each points at the other: "he should concede")
Zeuthen does not reach a settlement here: neither agent will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(Figure: distribution point 7 away from both a and d; a, b, c, d connected in a line with unit-length edges)
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All of these choices are individually rational, as neither agent can do worse than the conflict deal; a deal such as (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No.  Pure deal      Agent 1's utility   Agent 2's utility
1    (abcd, ∅)      0                   10
2    (abc, d)       1                   3
3    (ab, cd)       2                   2
4    (a, bcd)       3                   1
5    (∅, abcd)      10                  0
-    conflict deal  0                   0

Agent 1 concedes 5 → 4 → 3; agent 2 concedes 1 → 2 → 3
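The concession sequence on this slide can be traced with a small simulation. This is a simplified sketch rather than the full protocol: on equal risk both agents concede one step, an agent whose current offer is worth nothing to it is treated as having risk 1, and the deals are assumed to be ordered from agent 2's favorite to agent 1's favorite.

```python
# Pure deals from slide 62 as (agent 1 utility, agent 2 utility), ordered so
# that agent 2 prefers the front of the list and agent 1 prefers the end.
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(agent, own, other):
    """Zeuthen risk (slide 54): utility lost by conceding to the other's offer,
    divided by utility lost by causing a conflict (i.e. the whole current offer)."""
    u_own = deals[own][agent]
    u_other = deals[other][agent]
    return 1.0 if u_own <= 0 else (u_own - u_other) / u_own

def zeuthen(deals):
    i, j = len(deals) - 1, 0          # each agent opens with its best deal
    while True:
        # Agreement: an agent likes the opponent's offer at least as much as its own.
        if deals[j][0] >= deals[i][0]:
            return deals[j]
        if deals[i][1] >= deals[j][1]:
            return deals[i]
        r1, r2 = risk(0, i, j), risk(1, j, i)
        if r1 <= r2:                  # lower-risk agent concedes one step;
            i -= 1                    # on a tie, both concede (as in the strategy)
        if r2 <= r1:
            j += 1

print(zeuthen(deals))
```

On this data both agents have risk 1, then 2/3, conceding twice each and settling on the middle deal (2, 2), matching the trace above.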
63
What bothers you about the previous agreement?
• The agents settle for (2, 2) utility rather than a choice like (0, 10) with a much higher total utility
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method: no promises about future encounters, and no sharing of utility
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy; in fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD coordinated plans are never worse, as you can always just do your original task
• With an SOD you may get in each other's way
• Don't accept partially completed plans
• A state oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs it)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We look for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
– A common state exists, but it cannot be reached with the primitive operations of the domain (e.g., we could travel together, but the domain may have no operation for picking the other person up)
– A reachable state satisfies both, but is too expensive – the agents are unwilling to expend the effort (e.g., we could save a bit by car-pooling, but it is too complicated for so little gain)
69
What if the choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation? (Why should one agent agree to do more?)
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent gets a certain role in the plan
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both agents
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent – the role i plays with probability (1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
(Figure: distribution point with city a and city b, each at distance 1; going on to the second city costs 2 more)
Cost function: c(∅) = 0; c(a) = 1; c(b) = 1; c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (ab, ∅):p means agent 1 does ∅ with probability p and ab with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair
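The arithmetic above generalizes: for any two-outcome lottery, the fair p comes from equating two linear functions of p. A small helper (my own generalization; the function name is invented) that reproduces this slide, the next one, and the worth example of slide 80:

```python
from fractions import Fraction

def fair_p(u1_good, u1_bad, u2_good, u2_bad):
    """Solve p*u1_good + (1-p)*u1_bad == p*u2_good + (1-p)*u2_bad for p.

    "good"/"bad" are from agent 1's point of view: with probability p the
    outcome agent 1 prefers occurs. Returns None if the two expected-utility
    lines are parallel, i.e. no single p makes the deal fair.
    """
    a = (u1_good - u1_bad) - (u2_good - u2_bad)
    b = u2_bad - u1_bad
    if a == 0:
        return None
    return Fraction(b, a)

# Slide 73: agent 1 does nothing with probability p, everything with (1-p).
print(fair_p(1, -2, 0, 3))    # 5/6
# Slide 74: both roles are worth the same to each agent -> no solution.
print(fair_p(0, 0, 2, 2))     # None
# Slide 80: easier role worth 1 to A1, harder role -3; A2 gets 0 or 4.
print(fair_p(1, -3, 0, 4))    # 7/8
```

Using exact `Fraction` arithmetic keeps the probabilities as the slides state them (5/6, 7/8) instead of floating-point approximations.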
74
Try again with the other choice in the negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = 0 = utility for agent 1
• (1-p)(2) + p(2) = 2 = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair? (Swapping the roles changes nothing: both roles are worth the same to each agent.)
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅) : p] such that π(m) = max over δ ∈ NS of π(δ)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of the utilities, as this is a fairer division. Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized
• We may flip between choices even when both have the same product, just to avoid possible bias – like switching goals in soccer
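The "divide a total utility of 10" exercise can be brute-forced: among all splits of a fixed total, the product of the two utilities peaks at the even split, which is why the product-maximizing mechanism behaves like a fairness criterion. A quick sketch:

```python
# Split a fixed total utility of 10 between two agents and compare products.
splits = [(k, 10 - k) for k in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])

for a, b in splits:
    print(f"{a:2d} + {b:2d} -> product {a * b}")
print("maximum product at", best)   # the even split (5, 5)
```

The product 0, 9, 16, 21, 24, 25, 24, ... rises monotonically toward the middle, so any move away from an even division lowers it.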
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is in slot 1 and the black block in slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume each pick-up costs 1 and each set-down costs 1
• Mutually beneficial: each can pick up at the same time, costing each agent 2 – a win, as neither had to move the other block out of the way
• If done by one agent the cost would be 4, so the utility to each is 2
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is in slot 1, black in slot 2, and two gray blocks in slot 3. Agent 1 wants black in 1, but not on the table; agent 2 wants white in 2, but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one agent working alone is 2; achieving both goals averages a cost of 4 per agent
• If both value the goal the same way, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth of 6 to the final goal, we can use probability to make the division "fair"
• Assign the (cost 2, cost 6) split to A1 with probability p
• Utility for agent 1 = p(1) + (1-p)(-3) – he loses utility if he takes the cost-6 role for a benefit of 3
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: Conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weighting of the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: Semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's whole goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, numbers out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example: Semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as he helped with the swap)
• So with probability 1/2 each way: (1/2)(12) + (1/2)(-4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 26 - 24 = 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing": what matters is how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over: a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal: find the price for acceptable quality and support, without compromising on the other attributes
88
How can we calculate Utility
• Weighting each attribute
– Utility = price×0.60 + quality×0.15 + support×0.25
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price: [5, 100]; quality: [0, 10]; support: [1, 5]
– Try to find the Pareto optimum
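The weighted-attribute rule is a one-line scoring function. A sketch (the offers and their per-attribute scores are invented for illustration; each attribute is assumed to be pre-normalized to [0, 1] with 1 most desirable, so a low price scores high on "price"):

```python
# Weights from the slide: Utility = price*0.60 + quality*0.15 + support*0.25.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    """Weighted sum of attribute scores, each already normalized to [0, 1],
    where 1 is most desirable (so a cheap price scores high on "price")."""
    return sum(WEIGHTS[attr] * score for attr, score in offer.items())

# Hypothetical offers, scored per attribute.
offers = {
    "A": {"price": 0.9, "quality": 0.4, "support": 0.5},
    "B": {"price": 0.6, "quality": 0.9, "support": 0.8},
}
for name, offer in offers.items():
    print(name, round(utility(offer), 3))

best = max(offers, key=lambda n: utility(offers[n]))
print("best:", best)
```

With these weights the cheaper offer A wins (0.725 vs. 0.695) even though B is better on quality and support, which is exactly the tradeoff the 0.60 price weight encodes.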
89
Incomplete Information
• Agents don't know the tasks of others in a TOD
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information
  • Hiding letters
  • Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union of two task sets is at most the sum of the costs of the separate sets:
  for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive: delivering to one location saves distance to the other (in a tree arrangement)
• Example of subadditive with equality (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment
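Whether a concrete cost function is subadditive can be verified by brute force. The sketch below checks the cost function of Example 2 (slide 61), restricting the check to pairs whose union also has a listed cost so that nothing has to be invented; the helper name is my own:

```python
# Cost function from slide 61 (don't return to the distribution point).
cost = {
    frozenset(): 0,
    frozenset("a"): 7, frozenset("d"): 7,
    frozenset("b"): 8, frozenset("c"): 8,
    frozenset("ab"): 8, frozenset("cd"): 8,
    frozenset("bc"): 9, frozenset("abc"): 9, frozenset("bcd"): 9,
    frozenset("ad"): 10, frozenset("abd"): 10, frozenset("acd"): 10,
    frozenset("abcd"): 10,
}

def is_subadditive(cost):
    """Check c(X u Y) <= c(X) + c(Y) for every pair whose union is listed."""
    for x in cost:
        for y in cost:
            union = x | y
            if union in cost and cost[union] > cost[x] + cost[y]:
                return False
    return True

print(is_subadditive(cost))        # the slide's domain passes

# A broken variant: raise c(ad) above c(a) + c(d) = 14 and the check fails.
bad = dict(cost)
bad[frozenset("ad")] = 15
print(is_subadditive(bad))
```

The same enumeration idea extends to checking concavity or modularity, just with the corresponding inequality substituted.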
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• "I need to pick something up at the store" (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• "I need to deliver an empty letter" (no good to anyone, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y where X is a subset of Y; another task set Z is introduced:
– c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows go between domains); each rule begins from a fixed point
• For example, what is true of a phantom task may be true of a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No – unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as you were going that way anyway), but adds 2 to its superset Y (as you were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common database, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs, minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization:
– relationships implied between cells
– relationships implied within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Example: A1 hides his letter to c; his utility does not increase
• If he tells the truth: p = 1/2, and his expected utility under [(abc, ∅) : 1/2] is 1.5
• If he lies: p = 1/2 (as the apparent utility is the same), but his expected utility is only 1/2(0) + 1/2(2) = 1, as he still has to deliver the hidden letter himself
(Figure: delivery graph with labeled edge costs)
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out)
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (the extra cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular costs tend to add exactly – it is hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under truth the utilities are (4, 2): someone has to get the better deal (under a pure deal), just as in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)

Agent 1's role   U(1)   Agent 2's role   U(2) as it seems   U(2) actual
a                2      bc               4                  4
b                4      ac               2                  2
bc               2      a                4                  2
ab               0      c                6                  6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 under (a, b):1/2 is 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, with no probability needed
• Utility for agent 1 is 6 (original cost) - 2 (cost under the deal) = 4
106
• FP6: in a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability: "if you deliver, I make you deliver to h too"
• Example 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to about 1.72 (if A1 delivers, he doesn't actually deliver to h)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5) ⇒ 18p = 11
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
107
Postmen – return to post office
(Figure: two delivery graphs – a concave example with the phantom letter, and a subadditive example where h is the decoy)
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (you think I have fewer tasks, so an increased load appears to cost me more than it really does)
• Example 3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR under the lie
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial
• Example 4: A1 hides his letter to node a
• A1's utility becomes 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((fa, ebcd) : 1/2) = 4 (each saves going to two nodes)
• Under the lie, dividing as (ef, dcab) : p, one agent always wins and the other always loses; since the work is the same, swapping cannot help – in a mixed deal the choices must be unbalanced
• Try again under the lie with (ab, cdef) : p:
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6
  p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I am assigned cdef (1/4 of the time), I STILL have to deliver to node a after completing my agreed-upon deliveries, so I end up going to 5 places – which is what I was assigned originally: zero utility from that part
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions, or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused; reduces computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Figure: worth function for the first agent, rising from 0 at 0900 through 1200 to 100 at 1600)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Figure: utility vs. number of negotiation rounds; agent i's and agent j's offer curves converge to a point of acceptance)
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable
(Figure: utility vs. number of negotiation rounds; agent i's and agent j's offer curves never meet)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that Grounds ⊆ Database, and Sentence can be proved from Grounds
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B ≡ A undercuts B or A rebuts B
– A defeats B ≡ A undercuts B, or (A rebuts B and B does not undercut A)
– A strongly attacks B ≡ A attacks B and B does not undercut A
– A strongly undercuts B ≡ A undercuts B and B does not undercut A
129
Proposition Hierarchy of attacks
Undercuts: u
Strongly undercuts: su = u − u⁻¹
Strongly attacks: sa = (u ∪ r) − u⁻¹
Defeats: d = u ∪ (r − u⁻¹)
Attacks: a = u ∪ r
(where u = undercuts, r = rebuts, and u⁻¹ is the inverse of u)
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set of arguments if some member y of the set attacks x (y → x)
2. argument x is acceptable with respect to a set if every attacker of x is attacked by some member of the set
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
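The four definitions above can be checked mechanically. A brute-force sketch (exponential in the number of arguments, fine for slide-sized examples); the argument names and attack relation in the usage note are illustrative:

```python
from itertools import combinations

def conflict_free(S, attacks):
    """No two arguments in S attack each other."""
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S, attacks):
    """Every attacker of x is attacked by some member of S."""
    attackers = {y for (y, z) in attacks if z == x}
    return all(any((s, y) in attacks for s in S) for y in attackers)

def admissible_sets(arguments, attacks):
    """All subsets that are conflict free and whose members are all acceptable."""
    found = []
    for k in range(len(arguments) + 1):
        for S in combinations(arguments, k):
            if conflict_free(S, attacks) and all(acceptable(x, S, attacks) for x in S):
                found.append(set(S))
    return found
```

For example, with arguments a and b and the single attack a → b, the admissible sets are the empty set and {a}; {b} is not admissible because it cannot defend itself against a.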
132
[Figure: four arguments a, b, c, d connected by attack arrows]
Which sets of arguments can be true? c is always attacked;
d is always acceptable
133
An Example Abstract Argument System
19
Thought Question
• Why not just compute a joint solution – using linear programming?
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round
• Communication during negotiation:
[Figure: Agent i and Agent j exchange proposal and counter-proposal until Agent i concedes]
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about 50/50 or 90/10 depending on who "moves" the farthest):
[Figure: proposals by Ai and proposals by Aj converge on a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple objective linear programming
– requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggestive value; continue until no joint improvements are possible
– Used in the Camp David peace negotiations (Egypt/Israel – Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to the individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes:
• single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
• multiple attributes – several inter-related attributes, e.g. buying a car
– The number of agents and the way they interact:
• one-to-one, e.g. a single buyer and a single seller
• many-to-one, e.g. multiple buyers and a single seller (auctions)
• many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would):
– if one task requires less travel, both would benefit equally by having less travel
– utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed:
– if you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job… ask for stock options, a bigger office, work from home)
28
How many agents are involved
• One-to-one
• One-to-many (an auction is an example of one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
– n(n−1)/2 number of pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
– utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (idealized situation)
3. The cost function is monotonic: more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery:
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries:
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) ← the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure the deals knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
– a for agent 1 ⇒ b | ab for agent 2
– b for agent 1 ⇒ a | ab
– ab for agent 1 ⇒ ∅ | a | b | ab
– ∅ for agent 1 ⇒ ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal:
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
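As a sketch, this utility definition can be coded directly over subsets of tasks. The cost table is the one from the parcel-delivery example on the next slide; the function names are illustrative:

```python
# Cost function from the parcel-delivery example:
# c({}) = 0, c({a}) = 1, c({b}) = 1, c({a,b}) = 3.
COST = {frozenset(): 0,
        frozenset("a"): 1,
        frozenset("b"): 1,
        frozenset("ab"): 3}

def c(tasks):
    """Cost of a set of tasks, given as a string such as "ab" or ""."""
    return COST[frozenset(tasks)]

def utility(k, encounter, deal):
    """Utility_k(delta) = c(T_k) - c(D_k): the stand-alone cost of agent k's
    original tasks minus the cost of its part of the deal."""
    return c(encounter[k]) - c(deal[k])
```

With encounter (a, ab), the deal (∅, ab) gives agent 1 utility 1 and agent 2 utility 0, and the conflict deal gives 0 to both, matching the slides.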
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
[Figure: distribution point 1 unit from city a and 1 unit from city b; cities a and b are 2 units apart]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1,2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
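These definitions can be checked by brute force over the nine deals of the running example. A sketch, assuming the cost function c(∅)=0, c(a)=c(b)=1, c(ab)=3 and the encounter (a, ab) from the earlier slides; deals are written as pairs of task strings:

```python
COST = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}

def utilities(deal, encounter):
    """Per-agent utility: stand-alone cost minus cost of own part of the deal."""
    return tuple(COST[frozenset(t)] - COST[frozenset(d)]
                 for t, d in zip(encounter, deal))

def dominates(u, v):
    """u dominates v: at least as good for all agents, better for some."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

def negotiation_set(deals, encounter):
    """Individually rational (both utilities >= 0) and Pareto optimal deals."""
    ir = [d for d in deals if all(x >= 0 for x in utilities(d, encounter))]
    return [d for d in ir
            if not any(dominates(utilities(e, encounter), utilities(d, encounter))
                       for e in deals)]
```

Over the nine deals this returns (a, b), (b, a) and (∅, ab), matching the Negotiation Set slide.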
40
Utility Function for Agents (example from previous slide)
Utility for agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
Utility for agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
All deals:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individual rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
All deals:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
(a, ab) is beaten by the (∅, ab) deal; (ab, ∅) is (−2, 3), but nothing beats 3 for agent 2
43
Negotiation Set
Negotiation set:
(a, b)
(b, a)
(∅, ab)
Individual rational deals:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
Pareto optimal deals:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only the deals where both utilities are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
[Figure: scatter plot with utility for i on one axis and utility for j on the other]
45
Negotiation Set in Task-oriented Domains
[Figure: deals A–E plotted by utility for agent i vs. utility for agent j; a circle delimits the space of all possible deals; lines mark each agent's utility for the conflict deal; the negotiation set (Pareto optimal + individual rational) is the arc beyond the conflict deal]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from deal δ
• Product-maximizing negotiation protocol: a one-step protocol
– Concession protocol:
• At t ≥ 0, A offers δ(A,t) and B offers δ(B,t) such that:
– both deals are from the negotiation set
– ∀i and t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t−1)) – neither agent concedes
– Agreement: ∃j ≠ i, Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t)) – some agent likes the other's offer at least as much as its own
• Only A accepts ⇒ agree on δ(B,t)
• Only B accepts ⇒ agree on δ(A,t)
• Both A and B ⇒ agree on the δ(k,t) such that π(δ(k)) = max{π(δ(A)), π(δ(B))}
• Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
(applies to pure deals and mixed deals)
47
The Monotonic Concession Protocol ndash One direction move towards middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta-data: explanation or critique of a deal
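A minimal sketch of the protocol loop, assuming each agent is driven by a strategy function that returns its next proposal given both previous offers (the strategies and utility functions in the usage example are illustrative, not from the slides):

```python
def monotonic_concession(strategy_a, strategy_b, utility_a, utility_b):
    """Run the Monotonic Concession Protocol. Each strategy maps
    (my_last_offer, their_last_offer) -> my_next_offer; an agent
    'stands still' by returning its previous offer unchanged."""
    prop_a = prop_b = None
    while True:
        new_a = strategy_a(prop_a, prop_b)
        new_b = strategy_b(prop_b, prop_a)
        # Agreement: one agent finds the other's offer at least as good.
        if utility_a(new_b) >= utility_a(new_a):
            return new_b
        if utility_b(new_a) >= utility_b(new_b):
            return new_a
        # Conflict: neither agent conceded this round.
        if new_a == prop_a and new_b == prop_b:
            return "conflict"
        prop_a, prop_b = new_a, new_b
```

For example, if deals are prices, A starts at 5 and concedes by 1 per round while B stands still at 3, the run ends in agreement at 3.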
48
Condition to Consent an Agreement
If each agent finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
[Figure: agent 1's best deal at one end, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (though it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal, each agent asking "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict weighs its loss from making a concession against its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is willingness to risk conflict (1 is perfectly willing to risk).
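A sketch of the risk calculation and the resulting concession rule. The convention that risk is 1 when an agent's own offer is worth 0 to it, and all names and the example utilities, are illustrative assumptions:

```python
def risk(i, j, offers, utility):
    """risk_i = (U_i(offer_i) - U_i(offer_j)) / U_i(offer_i): the fraction
    of its current gain that agent i gives up by accepting j's offer."""
    own = utility[i](offers[i])
    if own == 0:
        return 1.0          # nothing to lose: perfectly willing to risk
    return (own - utility[i](offers[j])) / own

def who_concedes(offers, utility):
    """The agent with the smaller risk (more to lose) concedes."""
    r0 = risk(0, 1, offers, utility)
    r1 = risk(1, 0, offers, utility)
    if r0 == r1:
        return "both"
    return 0 if r0 < r1 else 1
```

In the later slide-62 example, each opening offer gives its proposer 10 and the other agent 0, so both risks are 1 and both concede.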
55
Risk Evaluation
• risk measures the fraction of its gain an agent would give up by accepting the other's offer. If it is close to one, the agent has gained little so far (and is more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: axis from Ai's best deal to Aj's best deal, with the conflict deal below; arrows mark "how much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum still hoped to gain]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium – if one agent is using the strategy, the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain: recall agent 1 delivers to a; agent 2 delivers to a and b
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each labelled "he should concede"]
Zeuthen does not reach a settlement: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain: Example 2 (don't return to distribution point)
[Figure: distribution point 7 units from city a and 7 units from city d; cities a, b, c, d lie in a line with 1 unit between neighbors]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal | Agent 1's utility | Agent 2's utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
– | conflict deal | 0 | 0
Agent 1 concedes 5 → 4 → 3; agent 2 concedes 1 → 2 → 3
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of (0, 10) from another choice
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We look for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g. both need the car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
bull Suppose there are two states that satisfy both agents
bull State 1 one has a cost of 6 for one agent and 2 for the other
bull State 2 costs both agents 5bull State 1 is cheaper (overall) but state 2 is
more equal How can we get cooperation (as why should one agent agree to do more)
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent would get a certain plan
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the deal
• For the parcel delivery example:
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: distribution point 1 unit from city a and 1 unit from city b; cities a and b are 2 units apart]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• [(∅, ab); p] means agent 1 does ∅ with probability p, and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = expected utility for agent 1
• (1−p)(3) + p(0) = expected utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
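Solving for p here is just equating two linear functions of p. A small sketch (the function name and argument convention are illustrative): each agent's utility is u_low with probability (1−p) and u_high with probability p.

```python
from fractions import Fraction

def fair_probability(u1_low, u1_high, u2_low, u2_high):
    """Solve (1-p)*u1_low + p*u1_high == (1-p)*u2_low + p*u2_high for p.
    Returns p in [0, 1] as an exact fraction, or None if the expected
    utilities can never be equal."""
    # Rearranged: (u1_low - u2_low) + p*((u1_high - u1_low) - (u2_high - u2_low)) = 0
    const = u1_low - u2_low
    slope = (u1_high - u1_low) - (u2_high - u2_low)
    if slope == 0:
        return Fraction(0) if const == 0 else None
    p = Fraction(-const, slope)
    return p if 0 <= p <= 1 else None
```

fair_probability(-2, 1, 3, 0) returns 5/6, matching this slide; the same helper gives 7/8 for the later worth-based compromise example, and None for the (a, b)/(b, a) case on the next slide, where no p works.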
74
Try again with other choice in negotiation set
• [(a, b); p] means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = expected utility for agent 1
• (1−p)(2) + p(2) = expected utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• There is always an all-or-nothing deal (one agent does everything) in the negotiation set: a mixed deal δ_m = [(T1 ∪ T2, ∅); p] whose product of utilities is maximal over all deals
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the deal which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both products are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one alone is two; the cost for doing both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we can use probability to make it "fair"
• Assign the (cost 2, cost 6) roles so that A1 gets the easy role p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it pays cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Examplesemi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as it helped with the swap)
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing each plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example: multi-agent Tileworld (like an airport shuttle). Worth isn't just a specific state but the value of work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of it, such as price, quality, and support: a multiple set of attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
  - Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
  - price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  - price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  - Try to find the Pareto optimum
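The weighted-attribute scheme can be sketched in a few lines of Python. The weights are the ones on the slide; the offer's attribute scores and their normalization to [0, 1] are assumptions for illustration:

```python
def weighted_utility(offer, weights):
    """Linear additive utility: each attribute is scored in [0, 1] and
    combined with fixed weights that sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[attr] * score for attr, score in offer.items())

# Weights from the slide: price 60%, quality 15%, support 25%.
weights = {"price": 0.60, "quality": 0.15, "support": 0.25}

# Hypothetical offer, with each attribute already scored in [0, 1].
offer = {"price": 0.5, "quality": 0.8, "support": 0.4}
u = weighted_utility(offer, weights)   # 0.30 + 0.12 + 0.10 = 0.52
```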
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
  - Exchange the missing information.
  - Penalty for lying.
• Possible lies:
  - False information:
    - hiding letters
    - phantom letters
  - Not carrying out a commitment.
90
Subadditive Task Oriented Domain
• The cost of the union of two task sets is at most the sum of the costs of the separate sets: for finite X, Y in T, c(X ∪ Y) ≤ c(X) + c(Y).
• Example of strictly subadditive: delivering to one city saves distance to the other (in a tree arrangement).
• Example of subadditive TOD with equality (= rather than <): deliveries in opposite directions; doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
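A quick brute-force check of the subadditivity condition over all subset pairs; the two cost tables are made-up illustrations of the tree and threshold cases above:

```python
from itertools import combinations

def is_subadditive(tasks, cost):
    """Check c(X ∪ Y) <= c(X) + c(Y) for all subsets X, Y of the task set."""
    subs = [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]
    return all(cost[x | y] <= cost[x] + cost[y] for x in subs for y in subs)

# Tree-shaped delivery (assumed costs): the two cities share a trunk road,
# so one trip covering both is cheaper than two separate trips.
tree = {frozenset(): 0, frozenset("a"): 2, frozenset("b"): 2,
        frozenset("ab"): 3}

# Threshold case (assumed costs): doing both pushes you over a limit and
# costs more than the sum of the parts.
threshold = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
             frozenset("ab"): 3}

is_subadditive("ab", tree)        # True
is_subadditive("ab", threshold)   # False: c({a,b}) = 3 > 1 + 1
```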
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  - Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip).
  - Need to deliver an empty letter (no good, but the deliverer won't discover the lie).
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another task set Z is introduced:
  c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• The arrows show the reasons we know each fact (diagonal arrows are between domains); the arrow's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth-telling more than the others.
97
For subadditive domain
(Table: truth/lie characterization for the subadditive domain; content not recoverable.)
98
Attributes of task system: Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y.
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
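Concavity can be machine-checked the same brute-force way; note that taking X = ∅ in the condition gives c(Y ∪ Z) - c(Y) ≤ c(Z), which is exactly subadditivity. The tree-shaped costs are a made-up illustration:

```python
from itertools import combinations

def powerset(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_concave(tasks, cost):
    """For every X ⊆ Y and every Z: c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)."""
    subs = powerset(tasks)
    return all(cost[y | z] - cost[y] <= cost[x | z] - cost[x]
               for x in subs for y in subs if x <= y for z in subs)

def is_subadditive(tasks, cost):
    subs = powerset(tasks)
    return all(cost[x | y] <= cost[x] + cost[y] for x in subs for y in subs)

# Assumed tree-shaped costs: concave, hence also subadditive.
tree = {frozenset(): 0, frozenset("a"): 2, frozenset("b"): 2,
        frozenset("ab"): 3}
is_concave("ab", tree)       # True
is_subadditive("ab", tree)   # True, as concavity implies
```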
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• Agents can exchange messages to be faxed.
100
Attributes: Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of Characterization of Relationship: implied relationship between cells; implied relationship with the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
• FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth, p = 1/2: his expected utility for the deal ⟨{a,b,c}, ∅⟩ at 1/2 is 5.
• If he lies, p = 1/2 (as the apparent utility is the same): his expected utility for ⟨{a,b,c}, ∅⟩ at 1/2 is (1/2)(0) + (1/2)(2) = 1 (as he still has to deliver the hidden letter himself).
(Figure: delivery network with edge costs.)
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out).
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just as in this case. The lie makes no difference.
(We assume there is some way of deciding who gets the better deal that is fair over time.)

  Agent 1 gets | U(1) | Agent 2 gets | U(2) apparent | U(2) actual
  a            |  2   | b, c         |       4       |      4
  b            |  4   | a, c         |       2       |      2
  b, c         |  2   | a            |       4       |      2
  a, b         |  0   | c            |       6       |      6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of ⟨{a}, {b}⟩ at 1/2 is (1/2)(4) + (1/2)(2) = 3.
• Lie: ⟨{b}, {c,a}⟩ is the logical division, as there is no probability split. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h as well").
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to about 1.72. (If I deliver, I don't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, since p(-1) + (1-p)(6) = p(4) + (1-p)(-3), i.e., 14p = 9.
• If he invents task h, p = 11/18, since p(-3) + (1-p)(6) = p(4) + (1-p)(-5).
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
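The two equalizing probabilities above can be verified with exact arithmetic. A Python sketch: the utility numbers are the ones from the slide, and `fair_mixed_p` is a helper named here for illustration:

```python
from fractions import Fraction

def fair_mixed_p(a, b, c, d):
    """Solve p*a + (1-p)*b = p*c + (1-p)*d for p: the probability (of
    agent 1 delivering everything) that equalizes expected utilities.
    (a, b) are agent 1's utilities in the two outcomes, (c, d) agent 2's."""
    return Fraction(d - b, a - b - c + d)

def expected(p, u_if_deliver, u_if_free):
    """Agent 1's expected utility at probability p of delivering."""
    return p * u_if_deliver + (1 - p) * u_if_free

# Truthful encounter (slide numbers): agent 1 gets -1 if it delivers
# everything and 6 if agent 2 does; agent 2 gets 4 and -3 respectively.
p_true = fair_mixed_p(-1, 6, 4, -3)    # 9/14
u_true = expected(p_true, -1, 6)       # 21/14 = 1.5

# With the decoy letter to h, the APPARENT utilities are (-3, 6) and
# (4, -5), shifting p in agent 1's favour; its real payoffs stay (-1, 6).
p_lie = fair_mixed_p(-3, 6, 4, -5)     # 11/18
u_lie = expected(p_lie, -1, 6)         # 31/18, about 1.72
```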
107
Postmen - return to post office
(Figures: a concave example, a subadditive example where h is the decoy, and a phantom example.)
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, "hide" lies can be beneficial (you think I have less, so an increased load appears to cost me more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• ({e}, {b}): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie).
• ({b}, {e}): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: util(⟨{f,a,e}, {b,c,d}⟩ at p = 1/2) = 4 (each saves going to two nodes).
• Under the lie, dividing as ⟨{e,f,d,c}, {a,b}⟩ at p: you always win and I always lose; since the workloads are the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with ⟨{a,b,c}, {d,e,f}⟩ at p:
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6, so p = 3/4.
• The utility is actually (3/4)(6) + (1/4)(0) = 4.5.
• Note: when I am assigned c, d, e, f (1/4 of the time), I still have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places, which is what I was assigned originally: zero utility for that outcome.
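The FP8 arithmetic checks out with exact fractions (numbers from the slide):

```python
from fractions import Fraction

# Apparent utilities after agent 1 hides its letter to node a:
#   p*4 + (1-p)*0 = p*2 + (1-p)*6  =>  8p = 6  =>  p = 3/4
p = Fraction(3, 4)

# Real payoff to agent 1: when it wins (probability 3/4) it actually
# gains 6, not the apparent 4; when it loses it gains nothing, since it
# still has to deliver the hidden letter itself.
u_lie = p * 6 + (1 - p) * 0    # 9/2 = 4.5
u_truth = Fraction(4)          # utility of the truthful 50/50 mixed deal
```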
110
Modular
(Figure: modular delivery-network example.)
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER - case study
• Program to resolve problems in the labor relations domain.
• Agents:
  - Company
  - Union
  - Mediator
• Tasks:
  - Generation of proposals
  - Generation of counter-proposals based on feedback from the dissenting party
  - Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law, citing previous decisions).
• Process:
  - Retrieve appropriate precedent cases from memory.
  - Select the most appropriate case.
  - Construct an appropriate solution.
  - Evaluate the solution for applicability to the current case.
  - Modify the solution appropriately.
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  - Minimizes the need for information exchange.
  - Avoids problems by reasoning from past failures: intentional reminding.
  - Repairs for past failures are reused, reducing computation.
117
Negotiation Methods: Preference Analysis
• From-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument.
• Increasing payoff:
  - Change the importance attached to an issue.
  - Change the utility value of an issue.
119
Narrowing differences
• Gets feedback from the rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory: 30% more proposals.
• Without argumentation: fewer proposals and better solutions.
• No failure avoidance: more proposals with objections.
• No preference analysis: oscillatory condition.
• No feedback: communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she also assigns progressively smaller worths to a meeting at 15:00, 14:00, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Figure: worth function for the first agent, rising from 0 at 9:00, through 12:00, to 100 at 16:00.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
(Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's utility curves converge to a point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.
(Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's curves never meet.)
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  - Positions cannot be justified. Why did the agent pay so much for the car?
  - Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet and shows the strength of their feelings.
4. Kisceral: appeals to the intuitive; "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  - Database is a (possibly inconsistent) set of logical formulae;
  - Sentence is a logical formula known as the conclusion;
  - Grounds is a set of logical formulae such that Grounds ⊆ Database, and Sentence can be proved from Grounds.
  (We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts):
  - A attacks B ≡ A →u B or A →r B
  - A defeats B ≡ A →u B, or (A →r B and not B →u A)
  - A strongly attacks B ≡ A attacks B and not B →u A
  - A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition: Hierarchy of attacks
• Undercuts = →u
• Strongly undercuts = →su = →u - →u⁻¹
• Strongly attacks = →sa = (→u ∪ →r) - →u⁻¹
• Defeats = →d = →u ∪ (→r - →u⁻¹)
• Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments).
• Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated.
• Assumption: an argument holds unless proven false.
131
Admissible Arguments - mutually defensible
1. An argument x is attacked by a set S if some member y of S attacks x (y → x).
2. An argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
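Definitions 1-4 translate directly into code. A Python sketch; the attack graph here is a small hypothetical example, since the slide's own graph over a, b, c, d is not recoverable:

```python
def conflict_free(S, attacks):
    """No two members of S attack each other."""
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S, attacks):
    """Every attacker of x is itself attacked by some member of S."""
    return all(any((z, y) in attacks for z in S)
               for (y, t) in attacks if t == x)

def admissible(S, attacks):
    return conflict_free(S, attacks) and all(
        acceptable(x, S, attacks) for x in S)

# Hypothetical attack graph: a attacks b, b attacks c.
attacks = {("a", "b"), ("b", "c")}
admissible({"a", "c"}, attacks)   # True: a defends c against b
admissible({"b"}, attacks)        # False: nothing counters b's attacker a
```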
132
(Figure: attack graph over arguments a, b, c, d.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
20
Negotiation Process 1
• Negotiation usually proceeds in a series of rounds, with every agent making a proposal at every round.
• Communication during negotiation:
(Diagram: Agent i and Agent j exchange a proposal and a counter-proposal; Agent i concedes.)
21
Negotiation Process 2
• Another way of looking at the negotiation process (one can talk about 50/50 or 90/10 depending on who "moves" the farthest):
(Diagram: proposals by Ai and proposals by Aj converge to a point of acceptance/agreement.)
22
Many types of interactive concession-based methods
• Some use multiple-objective linear programming, which requires that the players construct a crude linear approximation of their utility functions.
• Jointly Improving Direction method: start out with a neutral suggested value and continue until no joint improvements are possible.
  - Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002).
23
Jointly Improving Direction method
Iterate over:
• The mediator helps the players criticize a tentative agreement (could be the status quo).
• Generate a compromise direction (where each of the k issues is a direction in k-space).
• The mediator helps the players find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement.
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  - Multiple attributes:
    - Single attribute (price): symmetric scenario (both benefit in the same way from a cheaper price).
    - Multiple attributes: several inter-related attributes, e.g., buying a car.
  - The number of agents and the way they interact:
    - One-to-one, e.g., a single buyer and a single seller.
    - Many-to-one, e.g., multiple buyers and a single seller (auctions).
    - Many-to-many, e.g., multiple buyers and multiple sellers.
26
Single issue negotiation
• Like money.
• Symmetric (if roles were reversed, I would benefit the same way you would):
  - If one task requires less travel, both would benefit equally from having less travel.
  - The utility of a task is experienced the same way by whomever is assigned to it.
• Non-symmetric: we would benefit differently if roles were reversed:
  - If you delivered the picnic table, you could just throw it in the back of your van. If I delivered it, I would have to rent a U-Haul to transport it (as my car is small).
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality).
• Some may be inter-related (as size goes down, cost goes down and quality goes up).
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used).
• It may not even be clear what is up for negotiation ("I didn't realize not having any tests was an option"). (On the job: ask for stock options, a bigger office, working from home.)
28
How many agents are involved?
• One-to-one.
• One-to-many (an auction is an example: one seller and many buyers).
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role).
  - n(n-1)/2 pairs.
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994).
• An agent can carry out the tasks without interference (or help) from other agents, such as "who will deliver the mail?"
• All resources are available to the agent.
• Tasks are redistributed for the benefit of all agents.
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  - Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation).
  - Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal.
  - If utility < 0, the agent is worse off than performing the tasks on its own.
• Conflict deal (stay with the status quo): if agents fail to reach an agreement, no agent agrees to execute tasks other than its own.
  - Utility = 0.
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
• T is a finite set of all possible tasks;
• Ag = {A1, A2, ..., An} is a list of participating agents;
• c: 2^T → R+ defines the cost of executing each subset of tasks.
Assumptions on the cost function:
1. c(∅) = 0.
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation).
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2).
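The two cost-function assumptions are easy to machine-check for a small task set. A Python sketch; the cost table is the parcel-delivery example used a few slides later:

```python
from itertools import combinations

def valid_tod_cost(tasks, cost):
    """Check the TOD assumptions: c(empty) = 0 and monotonicity
    (X subset of Y implies c(X) <= c(Y))."""
    subs = [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]
    return cost[frozenset()] == 0 and all(
        cost[x] <= cost[y] for x in subs for y in subs if x <= y)

# Parcel-delivery costs from the example below.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 3}
valid_tod_cost("ab", cost)   # True
```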
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal).
33
Examples of TOD
• Parcel Delivery: several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the travel cost to each courier is minimal.
• Database Queries: several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (join, projection, union, intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b}) - the conflict deal (the original allocation)
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
35
Figure the deals knowing the union must be {a,b}
• Choices for the first agent: ∅, {a}, {b}, {a,b}.
• The second agent must "pick up the slack":
  - {a} for agent 1 → {b} or {a,b} for agent 2
  - {b} for agent 1 → {a} or {a,b}
  - {a,b} for agent 1 → ∅, {a}, {b}, or {a,b}
  - ∅ for agent 1 → {a,b}
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utilityk(δ) = c(Tk) - Costk(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal;
• c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help);
• Costk(δ) is the cost of its part of the deal.
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home - like U-Haul)
(Figure: distribution point with roads of cost 1 to city a and to city b, and a road of cost 2 between a and b.)
Cost function: c(∅) = 0, c({a}) = 1, c({b}) = 1, c({a,b}) = 3.
Utility for agent 1 (originally {a}):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
...
Utility for agent 2 (originally {a,b}):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
...
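The nine deals and the utility values on this slide can be reproduced mechanically from the definitions; a Python sketch using the cost table above:

```python
from itertools import combinations

# Cost table from the slide: c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3.
COST = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 3}

def deals(t1, t2):
    """All pure deals (D1, D2) with D1 ∪ D2 = T1 ∪ T2."""
    union = t1 | t2
    subs = [frozenset(s) for r in range(len(union) + 1)
            for s in combinations(sorted(union), r)]
    return [(d1, d2) for d1 in subs for d2 in subs if d1 | d2 == union]

def utility(deal, encounter):
    """Utility_k(deal) = c(T_k) - c(D_k) for k = 1, 2."""
    return tuple(COST[encounter[k]] - COST[deal[k]] for k in (0, 1))

T1, T2 = frozenset("a"), frozenset("ab")   # agent 1: a; agent 2: a and b
nine = deals(T1, T2)                       # the nine pure deals
```

For instance, `utility((frozenset("a"), frozenset("b")), (T1, T2))` gives (0, 2), matching the slide, and the conflict deal (T1, T2) gives (0, 0).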
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
  - δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utilityk(δ) ≥ Utilityk(δ′)
  - δ is better for some agent than δ′: ∃k ∈ {1,2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates δ′ if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have).
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from the previous slide)
Agent 1:
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
5. Utility1({a}, {a,b}) = 0
6. Utility1({b}, {a,b}) = 0
7. Utility1({a,b}, {a}) = -2
8. Utility1({a,b}, {b}) = -2
9. Utility1({a,b}, {a,b}) = -2
Agent 2:
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
5. Utility2({a}, {a,b}) = 0
6. Utility2({b}, {a,b}) = 0
7. Utility2({a,b}, {a}) = 2
8. Utility2({a,b}, {b}) = 2
9. Utility2({a,b}, {a,b}) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
Of the nine deals, the individually rational ones (utility ≥ 0 for both agents) are:
• ({a}, {b})
• ({b}, {a})
• (∅, {a,b})
• ({a}, {a,b})
• ({b}, {a,b})
The rest, ({a,b}, ∅), ({a,b}, {a}), ({a,b}, {b}), and ({a,b}, {a,b}), are negative for agent 1.
42
Pareto Optimal Deals
Of the nine deals, the Pareto optimal ones are:
• ({a}, {b})
• ({b}, {a})
• ({a,b}, ∅)
• (∅, {a,b})
Deal 5, ({a}, {a,b}), is beaten by deal 4, (∅, {a,b}). Deal 3, ({a,b}, ∅), has utilities (-2, 3), and nothing beats 3 for agent 2, so it stays Pareto optimal.
43
Negotiation Set
Individually rational deals: ({a}, {b}), ({b}, {a}), (∅, {a,b}), ({a}, {a,b}), ({b}, {a,b})
Pareto optimal deals: ({a}, {b}), ({b}, {a}), ({a,b}, ∅), (∅, {a,b})
Negotiation set (their intersection): ({a}, {b}), ({b}, {a}), (∅, {a,b})
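The negotiation set can be computed directly from the definitions, reproducing the three deals above; a Python sketch over the same parcel-delivery encounter:

```python
from itertools import combinations

COST = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 3}
T1, T2 = frozenset("a"), frozenset("ab")   # original assignments

subs = [frozenset(s) for r in range(3) for s in combinations("ab", r)]
deals = [(d1, d2) for d1 in subs for d2 in subs if d1 | d2 == T1 | T2]

def util(d):
    """(Utility1, Utility2) of a deal: c(T_k) - c(D_k)."""
    return (COST[T1] - COST[d[0]], COST[T2] - COST[d[1]])

def dominates(d, e):
    """d is at least as good for both agents and strictly better for one."""
    (a1, a2), (b1, b2) = util(d), util(e)
    return a1 >= b1 and a2 >= b2 and (a1, a2) != (b1, b2)

rational = [d for d in deals if all(u >= 0 for u in util(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in rational if d in pareto]
# ns holds ({a},{b}), ({b},{a}) and (∅,{a,b}), matching the slide.
```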
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j.
• Only those deals where both utilities are positive are individually rational for both (the origin is the conflict deal).
• Which are Pareto optimal?
(Figure: scatter plot with axes "utility for i" and "utility for j".)
45
Negotiation Set in Task-oriented Domains
(Figure: deal space for agents i and j, with points A-E marked. The circle delimits the space of all possible deals; lines mark the utility of the conflict deal for each agent; the negotiation set is the Pareto optimal and individually rational portion of the boundary, above and to the right of the conflict deal.)
46
Negotiation Protocol
π(δ) = the product of the two agents' utilities from δ.
• Product-maximizing negotiation protocol: either a one-step protocol or a concession protocol.
• At each step t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that:
  - both deals are from the negotiation set, and
  - for each agent i and t > 0: Utility_i(δ(i,t)) ≤ Utility_i(δ(i,t-1)), i.e., I propose something less desirable for me.
• Negotiation ends with:
  - Conflict, if neither agent concedes: Utility_i(δ(i,t)) = Utility_i(δ(i,t-1)) for both agents.
  - Agreement, if for some j ≠ i: Utility_j(δ(i,t)) ≥ Utility_j(δ(j,t)):
    - only A accepts ⇒ agree on δ(B,t);
    - only B accepts ⇒ agree on δ(A,t);
    - both accept ⇒ agree on the δ(k,t) with π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)));
    - both accept and π(δ(A,t)) = π(δ(B,t)) ⇒ flip a coin (the products are the same, but the deals may not be the same for each agent; flip a coin to decide which deal to use).
The protocol can be run over pure deals or mixed deals.
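The product-maximizing step can be sketched as follows; the three deals and their utility pairs are hypothetical, since in the small parcel-delivery example every product in the negotiation set happens to be zero:

```python
def nash_product_deal(utilities):
    """One-step product-maximizing protocol: pick the deal whose utility
    product u1 * u2 is largest (the full protocol flips a coin on ties)."""
    return max(utilities, key=lambda d: utilities[d][0] * utilities[d][1])

# Hypothetical utility pairs (u1, u2) for three deals in a negotiation set.
offers = {"deal1": (3, 1), "deal2": (2, 2), "deal3": (1, 3)}
best = nash_product_deal(offers)   # "deal2": product 4 beats 3 and 3
```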
47
The Monotonic Concession Protocol - one direction: move towards the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• In round 1, the agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same deal later).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values.)
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal.
• Metadata: an explanation or critique of a deal.
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal they made:
  Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  - Symmetrically distributed (no agent plays a special role).
  - Ensures convergence.
  - It will not go on indefinitely.
• Disadvantages:
  - Agents can run into conflicts.
  - Inefficient: no guarantee that an agreement will be reached quickly.
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
[diagram: the negotiation set spans from agent 1's best deal to agent 2's best deal]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[diagram: each agent, starting from its own best deal, asks "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– If conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when the difference between its loss from making a concession and its loss from taking the conflict deal (relative to its current offer) is small.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agent i and agent j respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk it).
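The risk formula translates directly into a small helper (a sketch; the conflict-deal utility is taken to be zero, as in these slides):

```python
def risk(util_own_offer, util_their_offer):
    """Zeuthen risk for agent i: the fraction of its current gain it would
    give up by conceding. 1 means perfectly willing to risk conflict.
    The conflict deal is assumed to have utility 0."""
    if util_own_offer <= 0:
        return 1.0  # already at (or below) conflict level: nothing to lose
    return (util_own_offer - util_their_offer) / util_own_offer

# Parcel example (slide 60): each agent's own offer gives it 1 or 2,
# while the other's offer gives it 0 -- so both risks are 1.
print(risk(1, 0), risk(2, 0))  # -> 1.0 1.0
```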
55
Risk Evaluation
• risk_i measures the fraction of its demand the agent still stands to gain. If it is close to one, the agent has gained little so far (and is more willing to risk conflict).
• This assumes you know the other agent's utility function.
• The initial goal you set affects risk: if I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[diagram: the span from agent i's best deal to agent j's best deal, with the conflict deal outside it; each agent weighs "how much am I willing to risk a conflict?" against the maximum it could gain from agreement and the maximum it can still hope to gain]
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again in the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, the other can do no better than to use it too
• Disadvantages
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve many steps
59
Parcel Delivery Domain (recall: agent 1 must deliver to a; agent 2 must deliver to a and b)

Negotiation set: ({a}, {b}), ({b}, {a}), (∅, {a,b})

Deal (agent 1, agent 2)    Utility1    Utility2
({a}, {b})                 0           2
({b}, {a})                 0           2
(∅, {a,b})                 1           0

First offers: agent 1 proposes (∅, {a,b}); agent 2 proposes ({a}, {b}).
Risk of conflict: 1 for both agents.
Can they reach an agreement? Who will concede?
60
Conflict Deal
[diagram: each agent sits at its own best deal, each thinking "he should concede"]
Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[diagram: distribution point linked to a and to d at distance 7 each; a–b, b–c, c–d at distance 1 each]
Cost function:
c(∅)=0
c(a)=c(d)=7
c(b)=c(c)=c(ab)=c(cd)=8
c(bc)=c(abc)=c(bcd)=9
c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: ({abcd}, ∅), ({abc}, {d}), ({ab}, {cd}), ({a}, {bcd}), (∅, {abcd})
Conflict deal: ({abcd}, {abcd})
All these choices are individually rational, as neither agent can do worse than going alone; e.g. ({ac}, {bd}) is dominated by ({ab}, {cd}).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No.  Pure deal         Agent 1's utility   Agent 2's utility
1    ({abcd}, ∅)       0                   10
2    ({abc}, {d})      1                   3
3    ({ab}, {cd})      2                   2
4    ({a}, {bcd})      3                   1
5    (∅, {abcd})       10                  0
     Conflict deal     0                   0

[diagram: agent 1 concedes from deal 5 towards deal 1, agent 2 from deal 1 towards deal 5]
63
What bothers you about the previous agreement?
• They decide to both get (2, 2) utility rather than the (0, 10) utility of another choice.
• Is there a better solution?
• Fairness versus higher global utility.
• Restrictions of this method (no promises about the future, no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals an agent's true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" means "what they both do", not "what they do together" – it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• C(J)_k is the cost of k's role in joint plan J.
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD coordinated plans are never worse, as you can always just do your original task.
• With SOD you may get in each other's way.
• Don't accept partially completed plans.
State-oriented domains are somewhat more powerful than TODs.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs it).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We look for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– The goals contradict, so there is no mutually acceptable state (e.g., both need the car).
– A common state exists but cannot be reached with the primitive operations of the domain (we could both travel together, but we may need to know how to pick up another person).
– A reachable state satisfies both but is too expensive – we are unwilling to expend the effort (i.e., we could save a bit by car-pooling, but it is too complicated for so little gain).
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents.
• State 1 costs one agent 6 and the other agent 2.
• State 2 costs both agents 5.
• State 1 is cheaper overall (8 vs. 10), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain role.
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·C(J)_i + (1−p)·C(J)_k, where k is i's opponent – the role i plays with probability (1−p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• Postman example follows.
72
Parcel Delivery Domain (assuming they do not have to return home)
[diagram: distribution point at distance 1 from city a and from city b; a and b at distance 2 from each other]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3

Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with probability
• ⟨({a,b}, ∅) : p⟩ means agent 1 does nothing with probability p and delivers to a and b with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p·1 = utility for agent 1
• (1−p)(3) + p·0 = utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p  ⇒  6p = 5  ⇒  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
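Solving for the fair probability is just equating two linear functions of p. A small helper (a sketch; the argument names are illustrative) reproduces the p = 5/6 above, and fair_p(1, -3, 0, 4) likewise gives the p = 7/8 of the later worth example:

```python
from fractions import Fraction

def fair_p(u1_if_p, u1_if_not, u2_if_p, u2_if_not):
    """Probability p that equalizes the two agents' expected utilities in a
    mixed deal; '_if_p' is an agent's utility when the p-branch occurs.
    Returns None when no p in [0, 1] equalizes them."""
    # Solve p*u1_if_p + (1-p)*u1_if_not == p*u2_if_p + (1-p)*u2_if_not
    num = Fraction(u2_if_not - u1_if_not)
    den = Fraction((u1_if_p - u1_if_not) - (u2_if_p - u2_if_not))
    if den == 0:
        return Fraction(1, 2) if num == 0 else None  # parallel utility lines
    p = num / den
    return p if 0 <= p <= 1 else None

# Deal 3: agent 1 idle w.p. p (utility 1) or does both w.p. 1-p (utility -2);
# agent 2's utilities are 0 and 3 in those branches.
print(fair_p(1, -2, 0, 3))  # -> 5/6
```

The same helper shows why deal ⟨({a}, {b}) : p⟩ on the next slide cannot be made fair: fair_p(0, 0, 2, 2) returns None, since the lottery leaves the utilities at 0 and 2 in both branches.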
74
Try again with other choice in negotiation set
• ⟨({a}, {b}) : p⟩ means agent 1 delivers to a with probability p and to b with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p·0 = utility for agent 1
• (1−p)(2) + p·2 = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair? (Each agent's utility is the same in both branches, so the lottery changes nothing.)
75
Mixed deal
• All-or-nothing deal (one agent does everything with some probability): there is a mixed deal m = ⟨(T_A ∪ T_B, ∅) : p⟩ whose value is maximal over all deals in the negotiation set.
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the mechanism that maximizes the product of the utilities, as this gives a fairer division. (Try dividing a total utility of 10 in various ways to see when the product is maximized.)
• We may flip between choices even when both have the same product, just to avoid possible bias – like switching goals at half-time in soccer.
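The suggested exercise of splitting 10 units of utility can be run directly (a sketch; deals here are just (u1, u2) pairs):

```python
def product_maximizing(deals, util1, util2):
    """Return all deals tied for the maximum utility product
    (the mechanism can flip a coin among them)."""
    best = max(util1(d) * util2(d) for d in deals)
    return [d for d in deals if util1(d) * util2(d) == best]

# Splitting 10 units between the agents: the even split maximizes the product.
splits = [(u, 10 - u) for u in range(11)]
print(product_maximizing(splits, lambda d: d[0], lambda d: d[1]))
# -> [(5, 5)]
```

The product (0·10 = 0, 1·9 = 9, …, 5·5 = 25) peaks at the even split, which is why the product criterion tends to favor fair divisions.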
77
Examples – Cooperative: each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up at the same time, costing each agent 2 – a win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be 4 – so the utility to each is 2.
78
Examples – Compromise: both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is at slot 1, black at slot 2, and two gray blocks at slot 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one agent's goal alone is 2; achieving both averages 4 per agent.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve it? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 a worth of 6 to the final goal, we can use probability to make the unequal (2, 6) division "fair":
• Assign the (2, 6) division p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it pays cost 6 for a benefit of 3.
• Utility for agent 2 = p(0) + (1−p)(4).
• Setting the utilities equal: 4p − 3 = 4 − 4p ⇒ 8p = 7 ⇒ p = 7/8.
• Thus we can take an unfair division and make it fair.
81
Example – conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weighting of the coin needn't be 50–50.
• It may make sense to let the agent with the highest worth get its way, as the utility gained is greater (it would accomplish its goal alone anyway). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate on the swap).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way on the rest.
83
Example: semi-cooperative (cont.)
• Winning agent utility: 26 − 4 − 10 = 12.
• Losing agent utility: −4 (as it helped with the swap).
• So with probability ½ each: ½(12) + ½(−4) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is only 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation over joint plans, schedules, and goal relaxation: the agents may reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate utility?
• Weighting each attribute:
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute:
– price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
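The weighting scheme can be sketched as follows, assuming (this is an assumption, not stated on the slide) that each attribute is first normalized to [0, 1] over its constraint range, with price inverted so that cheaper is better:

```python
def normalize(value, lo, hi, invert=False):
    """Map value from [lo, hi] onto [0, 1]; invert for cost-like attributes."""
    x = (value - lo) / (hi - lo)
    return 1 - x if invert else x

def utility(price, quality, support):
    # Weights 60/15/25 and ranges [5,100], [0,10], [1,5] from the slide.
    return (0.60 * normalize(price, 5, 100, invert=True)
            + 0.15 * normalize(quality, 0, 10)
            + 0.25 * normalize(support, 1, 5))

# The cheapest, best-quality, best-supported package scores the maximum 1.0.
print(round(utility(price=5, quality=10, support=5), 2))  # -> 1.0
```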
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
– Exchange the missing information
– Penalize lies
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive: delivering to one destination saves distance to the other (in a tree arrangement).
• Example of subadditive with equality (= rather than <): deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• "I need to pick something up at the store." (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip.)
• "I need to deliver a letter" – an empty one (no good, but the deliverer won't discover the lie).
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• Take two task sets X and Y where X is a subset of Y, and introduce another task set Z. Then:
c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know these facts (diagonal arrows go between domains); the rule at the beginning of an arrow is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth-telling more than the others.
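The three properties (subadditive, concave, modular) can be checked by brute force on a small task set. A sketch, using an illustrative fax-domain-like cost table where each destination has an independent cost:

```python
from itertools import chain, combinations

def subsets(tasks):
    """All subsets of the task set, as frozensets."""
    return [frozenset(s) for s in chain.from_iterable(
        combinations(sorted(tasks), r) for r in range(len(tasks) + 1))]

def is_subadditive(c, tasks):
    # c(X u Y) <= c(X) + c(Y) for all X, Y
    return all(c[x | y] <= c[x] + c[y]
               for x in subsets(tasks) for y in subsets(tasks))

def is_concave(c, tasks):
    # c(Y u Z) - c(Y) <= c(X u Z) - c(X) whenever X is a subset of Y
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for y in subsets(tasks) for x in subsets(tasks) if x <= y
               for z in subsets(tasks))

def is_modular(c, tasks):
    # c(X u Y) == c(X) + c(Y) - c(X n Y) for all X, Y
    return all(c[x | y] == c[x] + c[y] - c[x & y]
               for x in subsets(tasks) for y in subsets(tasks))

tasks = {'a', 'b'}
c = {frozenset(): 0, frozenset('a'): 1, frozenset('b'): 2, frozenset('ab'): 3}
print(is_modular(c, tasks), is_concave(c, tasks), is_subadditive(c, tasks))
# -> True True True
```

Changing c(ab) to 4 breaks all three properties at once here, consistent with the inclusions modular ⟹ concave ⟹ subadditive.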
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Of our examples, only the fax domain is modular (as the costs are independent).
• Modularity implies concavity.
101
3-dimensional table of the characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered (given a stiff enough penalty).
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth, p = 1/2.
• Expected utility (truth) of ⟨(abc, ∅) : 1/2⟩ = 5.
• Under the lie, p = 1/2 as well (the apparent utilities are the same), but A1's real expected utility of ⟨(abc, ∅) : 1/2⟩ is 1/2(0) + 1/2(2) = 1, as he still has to deliver the hidden letter himself.
[diagram: postmen delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out).
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like here. The lie makes no difference.
(We assume there is some way of deciding who gets the better deal that is fair over time.)

Agent 1's tasks   U(1)   Agent 2's tasks   Apparent U(2)   Actual U(2)
a                 2      bc                4               4
b                 4      ac                2               2
bc                2      a                 4               2
ab                0      c                 6               6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial.
• Example (next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of ⟨(ab, ∅) : 1/2⟩ is 1/2(4) + 1/2(2) = 3.
• Lie: (bc, a) is the logical division, as there is no probability involved; the utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h too").
• Example 2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 31/18 ≈ 1.72. (If A1 is the one who delivers, he doesn't actually deliver to h.)
• Truth: p (probability of agent 1 delivering all) = 9/14, since p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9.
• With the invented task h: p = 11/18, since p(−3) + (1−p)(6) = p(4) + (1−p)(−5).
• Utility at p = 9/14: p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5.
• Utility at p = 11/18: p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
107
Postmen – return to post office
[diagrams: the concave example; the subadditive example (h is the decoy); the phantom example]
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (you think I have less, so an increased load appears to cost more than it really does).
• Example 3 (next slide): A1 hides his letter to node b.
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – unfair under the lie.
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is then 4.5 > 4 (the utility of telling the truth).
• Under truth: Util(⟨(fae, bcd) : 1/2⟩) = 4 for each agent (each saves going to two nodes).
• Under the lie, dividing as ⟨(ef, dcab) : p⟩ cannot work: one agent always wins and the other always loses, and since the workloads are the same, swapping cannot help. In a mixed deal the two choices must be unbalanced.
• Try again under the lie with ⟨(ab, cdef) : p⟩:
p(4) + (1−p)(0) = p(2) + (1−p)(6)
4p = −4p + 6, so p = 3/4.
• A1's actual utility is 3/4(6) + 1/4(0) = 4.5.
• Note: when A1 is assigned cdef (1/4 of the time), he still has to deliver to node a after completing his agreed-upon deliveries, so he ends up visiting 5 places – exactly what he was assigned originally – for zero utility in that branch.
110
Modular
[diagram: the modular example graph]
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when each protocol is appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: a negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or to create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Reuses repairs for past failures, which reduces computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Derives an overall utility curve out of the individual ones.
• Expresses the trade-offs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reasons for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory behavior.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[graph: worth function for the first agent – worth rises from 0 to 100 as the meeting time moves from 9:00 through 12:00 to 16:00]
(Ref: Rosenschein & Zlotkin, 1994)
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[graph: utility vs. number of negotiation rounds – Agent i's utility falls, Agent j's rises, meeting at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.
[graph: utility vs. number of negotiation rounds – the two agents' utility curves never cross]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet to show the strength of their feelings.
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
• Grounds ⊆ Database, and
• Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts):
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A attacks B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• We write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated.
• Assumption: an argument is taken as true unless proven false.
131
Admissible Arguments – mutually defensible
1. A set of arguments S attacks argument x if some member y of S attacks x (y → x).
2. An argument x is acceptable (with respect to S) if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
[diagram: argument graph over arguments a, b, c, d]
Which sets of arguments can be accepted as true? c is always attacked; d is always acceptable.
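The admissibility conditions above can be checked mechanically. A sketch on a small illustrative graph (the attack relation below is an assumption, not necessarily the one drawn on the slide):

```python
def conflict_free(s, attacks):
    """No member of s attacks another member of s."""
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s, attacks, arguments):
    """Every attacker of x is itself attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for y in arguments if (y, x) in attacks)

def admissible(s, attacks, arguments):
    return conflict_free(s, attacks) and all(
        acceptable(x, s, attacks, arguments) for x in s)

arguments = {'a', 'b', 'c', 'd'}
attacks = {('a', 'b'), ('b', 'c'), ('d', 'c')}  # illustrative attack relation
print(admissible({'a', 'd'}, attacks, arguments))  # -> True
print(admissible({'b'}, attacks, arguments))       # -> False: 'a' attacks 'b' unanswered
```

Unattacked arguments (like 'a' and 'd' here) are always acceptable, which matches the slide's observation that d is always acceptable.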
133
An Example Abstract Argument System
21
Negotiation Process 2
• Another way of looking at the negotiation process (can talk about 50/50 or 90/10, depending on who "moves" the farthest):
[Figure: proposals by Ai and proposals by Aj converging to a point of acceptance/agreement]
22
Many types of interactive concession based methods
• Some use multiple-objective linear programming
  – requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggested value; continue until no joint improvements are possible
  – Used in the Camp David peace negotiations (Egypt/Israel; Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  – Multiple attributes
    • Single attribute (price) – symmetric scenario (both benefit in the same way by a cheaper price)
    • Multiple attributes – several inter-related attributes, e.g. buying a car
  – The number of agents and the way they interact
    • One-to-one, e.g. single buyer and single seller
    • Many-to-one, e.g. multiple buyers and a single seller (auctions)
    • Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
  – If one task requires less travel, both would benefit equally by having less travel
  – Utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
  – If you delivered the picnic table, you could just throw it in the back of your van. If I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, a bigger office, work from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
  – n(n-1)/2 pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  – Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
  – Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
    • If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
  – where no agent agrees to execute tasks other than its own
    • utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
  – T is a finite set of all possible tasks
  – Ag = {A1, A2, …, An} is the list of participant agents
  – c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩; Tk is the original assignment, Dk is the assignment after the "deal"
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
bull Parcel Delivery
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal, (D1, D2) = (T1, T2)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure deals knowing union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack"
• a for agent 1: b | ab (for agent 2)
• b for agent 1: a | ab
• ab for agent 1: ∅ | a | b | ab
• ∅ for agent 1: ab
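The case analysis above can also be done mechanically. A small sketch enumerating every pure deal whose union covers the two tasks:

```python
from itertools import product

# Enumerate all pure deals (D1, D2) with D1 ∪ D2 = {a, b}.
tasks = frozenset({"a", "b"})
subsets = [frozenset(s) for s in ([], ["a"], ["b"], ["a", "b"])]
deals = [(d1, d2) for d1, d2 in product(subsets, repeat=2)
         if d1 | d2 == tasks]
print(len(deals))  # 9 distinct pure deals, as listed on the previous slide
```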
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal:
  – c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
  – Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
[Figure: distribution point with 2 parcels; city a and city b each at distance 1]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
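The utilities above follow directly from the cost table. A short sketch (the dictionary encoding is mine):

```python
# Cost of delivering to each subset of cities, from the slide.
cost = {frozenset(): 0, frozenset({"a"}): 1,
        frozenset({"b"}): 1, frozenset({"a", "b"}): 3}
T = {1: frozenset({"a"}), 2: frozenset({"a", "b"})}  # original assignments

def utility(deal, agent):
    # Utility_k(deal) = c(T_k) - c(D_k): stand-alone cost minus deal cost
    return cost[T[agent]] - cost[deal[agent - 1]]

deal4 = (frozenset(), frozenset({"a", "b"}))  # deal 4: agent 1 does nothing
print(utility(deal4, 1), utility(deal4, 2))   # 1 0, matching the slide
```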
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
  – δ is at least as good for every agent as δ′: ∀k∈{1,2}, Utilityk(δ) ≥ Utilityk(δ′)
  – δ is better for some agent than δ′: ∃k∈{1,2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = -2
8. Utility1(ab, b) = -2
9. Utility1(ab, ab) = -2
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both(eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
Deals 5–9 are each beaten by one of the above; deal 3 is (-2, 3), but nothing beats 3 for agent 2
43
Negotiation Set
Negotiation Set:
(a, b)
(b, a)
(∅, ab)
Individually Rational Deals:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
Pareto Optimal Deals:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
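The intersection above can be computed directly from the cost table. A sketch (helper names are mine) that enumerates the nine deals and filters for individual rationality and Pareto optimality:

```python
from itertools import product

cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
T = (frozenset("a"), frozenset("ab"))  # original assignments
subsets = [frozenset(s) for s in ("", "a", "b", "ab")]
deals = [d for d in product(subsets, repeat=2) if d[0] | d[1] == frozenset("ab")]

def utilities(d):
    # (Utility1, Utility2): stand-alone cost minus own share of the deal
    return tuple(cost[T[k]] - cost[d[k]] for k in (0, 1))

def dominates(d1, d2):
    u1, u2 = utilities(d1), utilities(d2)
    return all(a >= b for a, b in zip(u1, u2)) and any(a > b for a, b in zip(u1, u2))

rational = [d for d in deals if all(u >= 0 for u in utilities(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in rational if d in pareto]
print(len(rational), len(pareto), len(ns))  # 5 4 3, matching the slide
```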
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
[Axes: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: deals A, B, C, D, E plotted by utility for agent i vs. utility for agent j. The circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict-deal utilities. The negotiation set (Pareto optimal + individually rational) lies on the boundary above and to the right of the conflict deal.]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
  – Concession protocol
• At t ≥ 0, A offers δ(A,t) and B offers δ(B,t) such that:
  – both deals are from the negotiation set
  – ∀i and t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t-1)) – I propose something less desirable for me
• Negotiation ending:
  – Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t-1))
  – Agreement: ∃j ≠ i, Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t))
    • Only A ⇒ agree on δ(B,t): A agrees with B's proposal
    • Only B ⇒ agree on δ(A,t): B agrees with A's proposal
    • Both A and B ⇒ agree on the δ(k,t) such that π(δ(k)) = max{π(δ(A)), π(δ(B))}
    • Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the split may not be the same for each agent – flip a coin to decide which deal to use)
Applies to pure deals and mixed deals.
47
The Monotonic Concession Protocol ndash One direction move towards middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. Assumes we know what the other agent values
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta-data: explanation or critique of the deal
48
Condition to Consent an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
  Utility1(δ2) ≥ Utility1(δ1) and
  Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages:
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: Your best deal among all possible deals in the negotiation set (it is a way of telling others what you value)
[Figure: agent 1's best deal and agent 2's best deal at opposite ends of the negotiation set]
52
The Zeuthen Strategy
Q: I make a proposal in every round (though it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal, each asking "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
  – You have lost much of your expected utility (it is closer to zero)
  – In case conflict occurs, you are not much worse off
  – So you are more willing to risk conflict
An agent's willingness to risk conflict compares what it loses by making a concession with what it loses by causing a conflict, with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict:

  riski = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)

where δi and δj are the current offers of agents i and j, respectively.
risk is the willingness to risk conflict (1 means perfectly willing to risk conflict).
55
Risk Evaluation
• risk measures the fraction you still have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
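The formula above transcribes directly into code. A minimal sketch (the convention that a worthless current offer gives risk 1 is a common assumption):

```python
def risk(u_own, u_other):
    """risk_i = (Utility_i(own offer) - Utility_i(opponent's offer))
               / Utility_i(own offer).
    1.0 means perfectly willing to risk conflict."""
    if u_own == 0:
        return 1.0  # current offer already worth nothing: nothing to lose
    return (u_own - u_other) / u_own

# An agent whose own offer is worth 2 to it, while the opponent's offer
# is worth 1, risks half of its remaining gain by standing firm:
print(risk(2, 1), risk(1, 0))  # 0.5 1.0
```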
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: Ai's best deal and Aj's best deal at opposite ends, with the conflict deal below; "How much am I willing to risk a conflict?" compares the maximum to gain from agreement with the maximum still hoped for.]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose), otherwise it will just be your turn to concede again in the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1:
  Utility1(a, b) = 0
  Utility1(b, a) = 0
  Utility1(∅, ab) = 1
Utility of agent 2:
  Utility2(a, b) = 2
  Utility2(b, a) = 2
  Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each labeled "He should concede"]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: cities a, b, c, d in a row, unit distance between neighbors; the distribution point is 7 away from both a and d]
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No.  Pure deal      Agent 1's utility   Agent 2's utility
1    (abcd, ∅)      0                   10
2    (abc, d)       1                   3
3    (ab, cd)       2                   2
4    (a, bcd)       3                   1
5    (∅, abcd)      10                  0
     Conflict deal  0                   0
Agent 1 concedes down from deal 5 and agent 2 up from deal 1; they meet at deal 3.
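A runnable sketch of the Zeuthen strategy on this table (the deal labels, helper names, and the "smallest concession that improves the opponent's utility" rule are my rendering of the strategy):

```python
# Deal utilities (agent 1, agent 2) transcribed from the table above;
# "-" stands for the empty task set.
deals = {
    "(abcd, -)": (0, 10),
    "(abc, d)": (1, 3),
    "(ab, cd)": (2, 2),
    "(a, bcd)": (3, 1),
    "(-, abcd)": (10, 0),
}

def risk(my_offer, their_offer, me):
    """Fraction of current utility lost by accepting the opponent's offer."""
    mine = deals[my_offer][me]
    if mine == 0:
        return 1.0  # nothing to lose: maximally willing to risk conflict
    return (mine - deals[their_offer][me]) / mine

def concede(current, me):
    """Minimal concession: the deal best for me among those strictly
    better for the opponent than my current offer."""
    them = 1 - me
    better = [d for d in deals if deals[d][them] > deals[current][them]]
    return max(better, key=lambda d: deals[d][me])

def zeuthen():
    offers = ["(-, abcd)", "(abcd, -)"]  # each opens with its own best deal
    while True:
        for me in (0, 1):
            other = offers[1 - me]
            if deals[other][me] >= deals[offers[me]][me]:
                return other  # opponent's offer is at least as good: agree
        risks = [risk(offers[0], offers[1], 0), risk(offers[1], offers[0], 1)]
        new = list(offers)
        if risks[0] <= risks[1]:
            new[0] = concede(offers[0], 0)
        if risks[1] <= risks[0]:
            new[1] = concede(offers[1], 1)
        offers = new

print(zeuthen())  # (ab, cd): both concede twice on equal risk
```

Tracing it by hand: both open at risk 1, concede to (a, bcd) and (abc, d), again tie at risk 2/3, and meet at (ab, cd), as on the slide.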
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of (0, 10) from another choice
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) means two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can just do your original task
• With SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
  – Both can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
  – Goals may contradict, so there is no mutually acceptable state (e.g. both need the car)
  – A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
  – There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1: a cost of 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent gets a certain role in the plan
• Called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
  costi(δ) = p·c(J)i + (1−p)·c(J)k, where k is i's opponent – the role i plays with probability 1−p
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming do not have to return home)
[Figure: distribution point with 2 parcels; city a and city b each at distance 1]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• [(ab, ∅); p] means agent 1 does ∅ with probability p and ab with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
74
Try again with other choice in negotiation set
• [(a, b); p] means agent 1 does a with probability p and b with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
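Both calculations are instances of one linear equation in p. A small sketch (the function name and argument order are mine) that reproduces the 5/6 answer and detects the unsolvable case:

```python
from fractions import Fraction

def fair_p(u1_a, u1_b, u2_a, u2_b):
    """p for outcome 'a' so that p*u1_a + (1-p)*u1_b == p*u2_a + (1-p)*u2_b."""
    denom = (u1_a - u1_b) - (u2_a - u2_b)
    if denom == 0:
        raise ValueError("no probability equalizes the expected utilities")
    return Fraction(u2_b - u1_b, denom)

print(fair_p(1, -2, 0, 3))  # 5/6, as derived on the previous slide
# fair_p(0, 0, 2, 2) raises ValueError: the (a, b) vs (b, a) lottery
# cannot be made fair, since neither agent's utility depends on p.
```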
75
Mixed deal
• All-or-nothing deal (one agent does everything, with probability p): a mixed deal m = [(TA ∪ TB, ∅); p] such that π(m) = max over d in NS of π(d)
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
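The suggested exercise is quick to run: with a fixed total of 10, the product u·(10−u) peaks at the even split, which is why a product-maximizing rule reads as "fair".

```python
# Try every integer split of a total utility of 10 between two agents.
products = {u: u * (10 - u) for u in range(11)}
best = max(products, key=products.get)
print(best, products[best])  # 5 25: the even split maximizes the product
```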
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (the goals are compatible)
• Assume each pick-up costs 1 and each set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the (2, 6) cost split (A1 takes the easier role) p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
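A quick check of the arithmetic above (utilities as defined on the slide: worth minus role cost):

```python
from fractions import Fraction

p = Fraction(7, 8)
u1 = p * 1 + (1 - p) * (-3)  # agent 1: easy role 3-2=1, hard role 3-6=-3
u2 = p * 0 + (1 - p) * 4     # agent 2: hard role 6-6=0, easy role 6-2=4
print(u1, u2)  # both 1/2: the lottery is fair
```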
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. Could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Examplesemi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle) – it isn't just a specific state, but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
  – Utility = price·0.60 + quality·0.15 + support·0.25
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum
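A sketch of the weighted-attribute utility above. The weights follow the slide (60/15/25), but normalizing each attribute onto a common 0-10 scale before weighting is my own illustrative assumption:

```python
def utility(price, quality, support):
    # Lower price is better, so invert it onto a 0-10 scale
    price_score = 10 * (100 - price) / (100 - 5)  # price in [5, 100]
    quality_score = quality                        # quality already in [0, 10]
    support_score = 10 * (support - 1) / (5 - 1)   # support in [1, 5]
    return 0.60 * price_score + 0.15 * quality_score + 0.25 * support_score

print(round(utility(5, 10, 5), 2))   # best possible offer scores 10.0
print(round(utility(100, 0, 1), 2))  # worst possible offer scores 0.0
```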
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
  – Exchange missing information
  – Penalty for lying
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task-Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
  for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  – Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment
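The condition is easy to test on a cost table. A sketch with two made-up tables: a tree-shaped one (delivering to b passes a, so doing both costs no more than b alone) and the earlier two-city table, which fails because c(ab)=3 > c(a)+c(b)=2:

```python
def is_subadditive(cost):
    """Check c(X ∪ Y) <= c(X) + c(Y) for all pairs of task sets."""
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in cost for y in cost if (x | y) in cost)

tree = {frozenset(): 0, frozenset({"a"}): 1,
        frozenset({"b"}): 2, frozenset({"a", "b"}): 2}
flat = {frozenset(): 0, frozenset({"a"}): 1,
        frozenset({"b"}): 1, frozenset({"a", "b"}): 3}
print(is_subadditive(tree), is_subadditive(flat))  # True False
```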
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
  – Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  – Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
  – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know these facts (diagonal arrows are between domains); the rule at the arrow's origin is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal in which one of the two parts is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which is in turn a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop)
• Concavity implies sub-additivity
• Modularity implies concavity
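Both implications can be exercised numerically over a small cost table. The tree costs (depot → a → b, unit edges) and the independent-cost "fax-like" table below are assumed examples, not from the slides:

```python
from itertools import chain, combinations

def powerset(tasks):
    s = list(tasks)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

def is_concave(cost, tasks):
    """c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X) whenever X ⊆ Y."""
    subs = powerset(tasks)
    return all(cost[y | z] - cost[y] <= cost[x | z] - cost[x]
               for x in subs for y in subs if x <= y for z in subs)

def is_modular(cost, tasks):
    """c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y) for all X, Y."""
    subs = powerset(tasks)
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in subs for y in subs)

# Tree-shaped delivery costs: concave but not modular
tree = {frozenset(): 0, frozenset({"a"}): 1,
        frozenset({"b"}): 2, frozenset({"a", "b"}): 2}
print(is_concave(tree, {"a", "b"}), is_modular(tree, {"a", "b"}))  # True False

# Independent per-task costs (fax-like): modular, hence also concave
fax = {frozenset(): 0, frozenset({"a"}): 1,
       frozenset({"b"}): 2, frozenset({"a", "b"}): 3}
print(is_concave(fax, {"a", "b"}), is_modular(fax, {"a", "b"}))  # True True
```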
99
Examples of task systems
Database Queries
• Agents have access to a common DB and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization: implied relationships between cells; implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth: p = 1/2
• Expected util: ⟨(abc), ∅⟩:1/2 = 5
• Lie: p = 1/2 (as the utility is the same)
• Expected util (for 1): ⟨(abc), ∅⟩:1/2 = ½(0) + ½(2) = 1 (as he has to deliver the lie)
103
• FP2: in a Subadditive TOD, under any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other person is assigned the phantom delivery, you are found out)
• FP3: in a Concave TOD, under any ONM over Mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, under any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under the lie, the benefits are shown (the apparent benefit is no different than the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 | U(1) | Agent 2 | U(2) seems | U(2) actual
a       | 2    | bc      | 4          | 4
b       | 4    | ac      | 2          | 2
bc      | 2    | a       | 4          | 2
ab      | 0    | c       | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, under any ONM over Pure deals, "phantom" lies can be beneficial
• Example from next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so the utility for agent 1 is ⟨(ab), ∅⟩:½ = ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as there is no probability split; the utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, under any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
  p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as
  p(−3) + (1−p)(6) = p(4) + (1−p)(−5), so 18p = 11
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
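The probabilities above come from making the two sides of the all-or-nothing deal indifferent in expectation; they can be verified with exact arithmetic (a sketch using the payoff numbers from this slide):

```python
from fractions import Fraction

def indifference_p(a, b, c, d):
    """Solve p*a + (1-p)*b == p*c + (1-p)*d for p."""
    return Fraction(d - b, (a - b) - (c - d))

truth_p = indifference_p(-1, 6, 4, -3)  # truthful encounter
lie_p = indifference_p(-3, 6, 4, -5)    # with the decoy letter to h
print(truth_p, lie_p)  # 9/14 11/18

def expected(p):
    """Agent 1's expected utility for the offer paying -1 or 6."""
    return p * Fraction(-1) + (1 - p) * 6

print(expected(truth_p), float(expected(truth_p)))  # 3/2 1.5
print(expected(lie_p))                              # 31/18, about 1.72
```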
107
Postmen – return to post office
Concave
Subadditive (h is the decoy)
Phantom
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, under any ONM over Pure deals, "hide" lies can be beneficial (as you think I have fewer tasks, so an increased load will seem to cost more than it really does)
• Ex3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, under any ONM over Mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: ⟨(fae), (bcd)⟩:½ = 4 (each saves going to two)
• Under the lie, divide as ⟨(efd), (cab)⟩:p – you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie: ⟨(ab), (cdef)⟩:p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6, so p = 3/4
• Utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned cdef (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that
110
Modular
111
Conclusion
– In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents
  – Company
  – Union
  – Mediator
• Tasks
  – Generation of proposal
  – Generation of counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages
  – Minimizes need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff
  – Changing the importance attached to an issue
  – Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
Worth function for first agent
(figure: worth rises from 0 to 100 between 9:00 and 16:00)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(figure: utility of Agent i rises and utility of Agent j falls over the number of negotiation rounds, meeting at the point of acceptance)
123
Utility Graphs - no agreement
• No agreement
• Agent j finds the offer unacceptable
(figure: the two utility curves never cross within the allotted number of negotiation rounds)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  • Positions cannot be justified – Why did the agent pay so much for the car?
  • Positions cannot be changed – Initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept that B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature
  – A attacks B = A undercuts B, or A rebuts B
  – A defeats B = A undercuts B, or (A rebuts B and not(B undercuts A))
  – A strongly attacks B = A attacks B and not(B undercuts A)
  – A strongly undercuts B = A undercuts B and not(B undercuts A)
129
Proposition: hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned as to what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked (left undefended) by a set if some y attacks x (y → x) and no member of the set attacks y
2. argument x is acceptable with respect to a set if every attacker of x is attacked by the set
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
(figure: attack graph over arguments a, b, c, d)
Which sets of arguments can be true? c is always attacked.
d is always acceptable.
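One way to make the a, b, c, d example concrete: suppose a and b attack each other, both attack c, and c attacks d. This attack relation is an assumption reconstructed from the notes (the original diagram is not preserved), but under it c is always attacked and d is always defensible, matching the remarks above. A sketch of the admissibility check:

```python
# Assumed attack relation: a <-> b, both attack c, c attacks d
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    """No member of s attacks another member of s."""
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """x is acceptable w.r.t. s if every attacker of x is attacked by s."""
    attackers = [y for (y, z) in attacks if z == x]
    return all(any((z, y) in attacks for z in s) for y in attackers)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

print(admissible({"a", "d"}))  # True: a defends d by attacking c
print(admissible({"b", "d"}))  # True: d is acceptable either way
print(admissible({"c"}))       # False: c is attacked by a and b, undefended
```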
133
An Example Abstract Argument System
22
Many types of interactive concession based methods
• Some use multiple-objective linear programming
  – requires that the players construct a crude linear approximation of their utility functions
• Jointly Improving Direction method: start out with a neutral suggested value; continue until no joint improvements are possible
  – Used in the Camp David peace negotiations (Egypt–Israel; Jimmy Carter, Nobel Peace Prize 2002)
23
Jointly Improving Direction method
Iterate over:
• Mediator helps players criticize a tentative agreement (could be the status quo)
• Generates a compromise direction (where each of the k issues is a direction in k-space)
• Mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to the individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  – Multiple attributes
    • Single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes – several inter-related attributes, e.g. buying a car
  – The number of agents and the way they interact
    • One-to-one, e.g. a single buyer and a single seller
    • Many-to-one, e.g. multiple buyers and a single seller: auctions
    • Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
  – If one task requires less travel, both would benefit equally by having less travel
  – utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
  – if you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job… ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
  – n(n−1)/2 number of pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  – Utility represents how much an agent has to gain from the deal (it is always based on the change from the original allocation)
  – Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
  – where no agent agrees to execute tasks other than its own
  – utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
  – T is a finite set of all possible tasks
  – Ag = {A1, A2, …, An} is a list of participant agents
  – c: ℘(T) → ℝ⁺ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (idealized situation)
3. The cost function is monotonic – more tasks mean more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment; D is the assignment after the "deal"
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries
Several agents have access to a common database and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). You are doing a join as part of another operation, so please save the results for me.
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab) – the conflict deal
35
Figure deals knowing union must be ab
• Choices for the first agent: a, b, ab, ∅
• The second agent must "pick up the slack"
• a for agent 1 → b | ab for agent 2
• b for agent 1 → a | ab
• ab for agent 1 → ∅ | a | ab | b
• ∅ for agent 1 → ab
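The enumeration above can be generated mechanically: take every pair of subsets of {a, b} whose union covers both tasks. A sketch:

```python
from itertools import chain, combinations

def subsets(tasks):
    """All subsets of a task set, as frozensets."""
    s = list(tasks)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

def pure_deals(tasks):
    """All (D1, D2) whose union equals the full task set."""
    full = frozenset(tasks)
    return [(d1, d2) for d1 in subsets(full) for d2 in subsets(full)
            if d1 | d2 == full]

deals = pure_deals({"a", "b"})
print(len(deals))  # 9, matching the slide's count
```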
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, and is defined as follows:
  Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
  – c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
  – Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0
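A direct transcription of this definition, instantiated with the cost table from the parcel example on the next slide (c(∅) = 0, c(a) = c(b) = 1, c(ab) = 3):

```python
cost = {frozenset(): 0, frozenset({"a"}): 1,
        frozenset({"b"}): 1, frozenset({"a", "b"}): 3}

def utility(stand_alone, part_of_deal):
    """Utility_k(deal) = c(T_k) - c(D_k)."""
    return cost[frozenset(stand_alone)] - cost[frozenset(part_of_deal)]

# Encounter: agent 1 originally delivers to a; agent 2 to a and b
print(utility("a", ""))      # 1: agent 1 hands its delivery to agent 2
print(utility("ab", "ab"))   # 0: the conflict deal always has utility 0
print(utility("a", "ab"))    # -2: agent 1 takes on both deliveries
```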
37
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
(figure: distribution point 1 unit from city a and 1 unit from city b; a and b are 2 units apart)
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.
  – δ is at least as good for every agent as δ′:
    ∀k ∈ {1, 2}: Utilityk(δ) ≥ Utilityk(δ′)
  – δ is better for some agent than δ′:
    ∃k ∈ {1, 2}: Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′
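The two conditions translate directly into code over utility vectors (one entry per agent); the sample vectors are from the parcel example:

```python
def dominates(u, v):
    """u dominates v: at least as good for every agent, better for some."""
    return (all(a >= b for a, b in zip(u, v))
            and any(a > b for a, b in zip(u, v)))

def weakly_dominates(u, v):
    """u weakly dominates v: at least as good for every agent."""
    return all(a >= b for a, b in zip(u, v))

# (a, b) yields utilities (0, 2); the conflict deal yields (0, 0)
print(dominates((0, 2), (0, 0)))         # True
print(weakly_dominates((0, 0), (0, 0)))  # True: a deal weakly dominates itself
print(dominates((0, 2), (-2, 3)))        # False: worse for agent 2
```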
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0      Utility2(a, b) = 2
2. Utility1(b, a) = 0      Utility2(b, a) = 2
3. Utility1(ab, ∅) = −2    Utility2(ab, ∅) = 3
4. Utility1(∅, ab) = 1     Utility2(∅, ab) = 0
5. Utility1(a, ab) = 0     Utility2(a, ab) = 0
6. Utility1(b, ab) = 0     Utility2(b, ab) = 0
7. Utility1(ab, a) = −2    Utility2(ab, a) = 2
8. Utility1(ab, b) = −2    Utility2(ab, b) = 2
9. Utility1(ab, ab) = −2   Utility2(ab, ab) = 0
41
Individually Rational for Both (eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a))
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
Deal 5 is beaten by (∅, ab); deal 3, (ab, ∅), gives (−2, 3), but nothing beats 3 for agent 2, so it stays Pareto optimal.
43
Negotiation Set
Negotiation Set:
(a, b)
(b, a)
(∅, ab)
Individually Rational Deals:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
Pareto Optimal Deals:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
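The three lists above can be recomputed by brute force from the cost table alone. A sketch (the encounter and costs are the two-city example from the earlier slides):

```python
from itertools import chain, combinations

cost = {frozenset(): 0, frozenset({"a"}): 1,
        frozenset({"b"}): 1, frozenset({"a", "b"}): 3}
encounter = (frozenset({"a"}), frozenset({"a", "b"}))  # (T1, T2)
tasks = encounter[0] | encounter[1]

def subsets(s):
    s = list(s)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

deals = [(d1, d2) for d1 in subsets(tasks) for d2 in subsets(tasks)
         if d1 | d2 == tasks]

def utilities(deal):
    """Utility_k = c(T_k) - c(D_k) for each agent."""
    return tuple(cost[t] - cost[d] for t, d in zip(encounter, deal))

def dominates(u, v):
    return (all(a >= b for a, b in zip(u, v))
            and any(a > b for a, b in zip(u, v)))

rational = [d for d in deals if all(u >= 0 for u in utilities(d))]
pareto = [d for d in deals
          if not any(dominates(utilities(e), utilities(d)) for e in deals)]
ns = [d for d in rational if d in pareto]

for d1, d2 in ns:
    print(sorted(d1), sorted(d2), utilities((d1, d2)))
# The negotiation set comes out as (a, b), (b, a), and (∅, ab), as on this slide
```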
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
(axes: utility for i vs. utility for j)
45
Negotiation Set in Task-oriented Domains
(figure: deals A–E plotted by utility for agent i against utility for agent j; the circle delimits the space of all possible deals; the conflict deal marks each agent's conflict utility; the negotiation set is the Pareto-optimal, individually rational arc)
46
Negotiation Protocol: π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: one-step protocol
  – Concession protocol
• At t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that
  – both deals are from the negotiation set
  – ∀i and t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – I propose something less desirable for me
• Negotiation ending
  – Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t−1))
  – Agreement: ∃j ≠ i: Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t))
    • Only A ⇒ agree δ(B,t): A agrees with B's proposal
    • Only B ⇒ agree δ(A,t): B agrees with A's proposal
    • Both A and B ⇒ agree on the δ(k,t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
    • Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same but the split may not be the same for each agent – flip a coin to decide which deal to use)
(applies to pure deals and to mixed deals)
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as, or better than, its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta-data: explanation or critique of the deal
48
Condition to Consent to an Agreement
If both agents find that the deal proposed by the other is at least as good as, or better than, the proposal they made:
  Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
(figure: Agent 1's best deal at one end; Agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(figure: each agent asks "How much am I willing to risk a conflict?", positioned between Agent 1's best deal and Agent 2's best deal)
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
  – You have lost much of your expected utility (it is closer to zero)
  – In case conflict occurs, you are not much worse off
  – So you are more willing to risk conflict
An agent's willingness to risk conflict compares the loss from making a concession against the loss from causing a conflict, with respect to its current offer.
• If both are equally willing to risk, both concede
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) ÷ (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
  risk_i = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j, respectively
risk is willingness to risk conflict (1 means perfectly willing to risk)
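The formula is one line of code. A sketch, applied to the parcel example that follows (the convention of returning 1 when an agent's own offer is already worth 0 follows the "perfectly willing to risk" note above):

```python
def risk(u_own_offer, u_other_offer):
    """risk_i = (U_i(delta_i) - U_i(delta_j)) / U_i(delta_i).
    Returns 1 when the agent's own offer is worth 0: with nothing
    left to gain, it is perfectly willing to risk conflict."""
    if u_own_offer == 0:
        return 1.0
    return (u_own_offer - u_other_offer) / u_own_offer

# Example from the following slides: agent 1 opens with (∅, ab), worth 1
# to itself, while agent 2's opening offer (a, b) is worth 0 to agent 1;
# agent 2's own offer is worth 2 to itself, agent 1's offer is worth 0 to it
print(risk(1, 0))  # 1.0
print(risk(2, 0))  # 1.0 -- both are fully willing to risk conflict
```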
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk)
• This assumes you know the other agent's utility
• What one sets as an initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(figure: between Ai's best deal and Aj's best deal, against the conflict deal: "How much am I willing to risk a conflict?", "Maximum to gain from agreement", "Maximum still hoped to gain")
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium – if one agent is using the strategy, the other can do no better than using it him/herself
• Disadvantages
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation Set: (a, b), (b, a), (∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)
Utility of agent 1:
  Utility1(a, b) = 0
  Utility1(b, a) = 0
  Utility1(∅, ab) = 1
Utility of agent 2:
  Utility2(a, b) = 2
  Utility2(b, a) = 2
  Utility2(∅, ab) = 0
Risk of conflict: 1 for agent 1, 1 for agent 2
Can they reach an agreement? Who will concede?
60
Conflict Deal
(Diagram: agent 1's best deal and agent 2's best deal, each labeled "he should concede", with the conflict deal between them.)
Zeuthen does not reach a settlement: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to distribution point)
(Diagram: distribution point with a and d at distance 7, and b, c reached via unit-length edges.)
Cost function: c(∅)=0, c(a)=c(d)=7, c(b)=c(c)=c(ab)=c(cd)=8, c(bc)=c(abc)=c(bcd)=9, c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here; both concede on equal risk)

No. | Pure Deal  | Agent 1's Utility | Agent 2's Utility
1   | (abcd, ∅)  | 0  | 10
2   | (abc, d)   | 1  | 3
3   | (ab, cd)   | 2  | 2
4   | (a, bcd)   | 3  | 1
5   | (∅, abcd)  | 10 | 0
Conflict deal    | 0  | 0
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of (0, 10) for another choice.
• Is there a solution?
• Fair versus higher global utility.
• Restrictions of this method (no promises for the future, no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD coordinated plans are never worse, as you can just do your original task.
• With SOD you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
  – Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
  – A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but we may need to know how to pick up the other)
  – There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that one agent would get a certain plan.
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
(Diagram: distribution point at distance 1 from city a and distance 1 from city b; a and b are distance 2 apart.)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
73
Consider deal 3 with probability
• The deal (ab, ∅); p means agent 1 does ∅ with probability p and ab with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p  =>  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
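The fairness calculation above (p = 5/6) can be checked mechanically. This sketch solves the linear fairness equation with exact arithmetic; the helper name is mine, not the slides':

```python
from fractions import Fraction

def fair_p(u1_if_p, u1_if_1mp, u2_if_p, u2_if_1mp):
    """Probability p equating the agents' expected utilities:
    p*u1_if_p + (1-p)*u1_if_1mp == p*u2_if_p + (1-p)*u2_if_1mp."""
    num = Fraction(u2_if_1mp - u1_if_1mp)
    den = Fraction((u1_if_p - u1_if_1mp) - (u2_if_p - u2_if_1mp))
    return num / den

# Deal 3: with probability p agent 1 delivers nothing (utilities 1 and 0);
# otherwise it delivers everything (utilities -2 and 3).
p = fair_p(1, -2, 0, 3)
print(p)  # 5/6
```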
74
Try again with other choice in negotiation set
• The deal (a, b); p means agent 1 does a with probability p and b with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• An all-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅); p] such that NS(m) = max over deals d of NS(d).
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as that is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume pick up costs 1 and set down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1 but not on the table. Agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we can use probability to make it "fair".
• Assign the (2, 6) cost split p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3)  (loses utility if it takes cost 6 for benefit 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus I can take an unfair division and make it fair.
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the person with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but it could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So with probability 1/2 each: 12(1/2) + (−4)(1/2) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation over joint plans, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle) – it isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute:
  – Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute:
  – price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  – price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum
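The weighting scheme is a plain linear score. A minimal sketch, assuming each attribute value has already been normalized to [0, 1]; the weights follow the slide, while the offer values are made up for illustration:

```python
def weighted_utility(offer, weights):
    # Linear additive utility: sum of weight * normalized attribute score.
    return sum(weights[attr] * offer[attr] for attr in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer = {"price": 0.8, "quality": 0.5, "support": 0.4}  # hypothetical scores
print(round(weighted_utility(offer, weights), 3))  # 0.655
```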
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
  – Exchange the missing information
  – Penalty for a lie
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
  for finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of subadditive:
  – Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  – Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I go above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  • Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip).
  • Need to deliver an empty letter (no good, but the deliverer won't discover the lie).
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
  – c(X ∪ Z) − c(X) >= c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular encourages truth telling more than the others.
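The three properties (subadditive ⊇ concave ⊇ modular) can be checked by brute force over a small task set. An illustrative sketch, using a fax-domain-style cost c(S) = |S| (one unit per destination), which should satisfy all three:

```python
from itertools import combinations

tasks = ("a", "b", "c")
subsets = [frozenset(s) for r in range(len(tasks) + 1)
           for s in combinations(tasks, r)]

def subadditive(c):
    # c(X U Y) <= c(X) + c(Y) for all X, Y
    return all(c(x | y) <= c(x) + c(y) for x in subsets for y in subsets)

def concave(c):
    # For X a subset of Y: c(Y U Z) - c(Y) <= c(X U Z) - c(X)
    return all(c(y | z) - c(y) <= c(x | z) - c(x)
               for x in subsets for y in subsets if x <= y
               for z in subsets)

def modular(c):
    # c(X U Y) == c(X) + c(Y) - c(X ∩ Y)
    return all(c(x | y) == c(x) + c(y) - c(x & y)
               for x in subsets for y in subsets)

fax_cost = len  # one unit of cost per fax destination
print(modular(fax_cost), concave(fax_cost), subadditive(fax_cost))  # True True True
```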
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) <= c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postman domain concave? (No, unless restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the fax domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of the characterization: relationships implied between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered (penalty).
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides its letter to c, its utility doesn't increase.
• If it tells the truth, p = 1/2.
• Expected utility under truth: (abc, ∅); 1/2 gives 5.
• Under the lie, p = 1/2 (as the apparent utility is the same).
• Expected utility (for agent 1) under the lie: (abc, ∅); 1/2 = (1/2)(0) + (1/2)(2) = 1 (as it still has to deliver the hidden letter).
(Diagram: the delivery graph, with edge lengths 1, 4, 4, 1.)
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are (4, 2), and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1's tasks | U(1) | Agent 2's tasks | U(2) (seems) | U(2) (actual)
a               | 2    | bc              | 4            | 4
b               | 4    | ac              | 2            | 2
bc              | 2    | a               | 4            | 2
ab              | 0    | c               | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; its utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 under (a, b); 1/2 is (1/2)(4) + (1/2)(2) = 3.
• Lie: (b, ca) is the logical division, as no probability is involved.
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h).
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); its utility rises from 1.5 to about 1.72. (If I deliver, I don't actually deliver h.)
• If it tells the truth, p (the probability of agent 1 delivering all) = 9/14, as:
  p(−1) + (1−p)(6) = p(4) + (1−p)(−3)  =>  14p = 9
• If it invents task h, p = 11/18, as:
  p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
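The two p values and the utility gain above can be reproduced with exact arithmetic (the solver helper is mine, not the slides'):

```python
from fractions import Fraction

def solve_p(u1_all, u1_none, u2_all, u2_none):
    """p equating the agents' expected utilities:
    p*u1_all + (1-p)*u1_none == p*u2_all + (1-p)*u2_none."""
    return Fraction(u2_none - u1_none,
                    (u1_all - u1_none) - (u2_all - u2_none))

p_truth = solve_p(-1, 6, 4, -3)   # truthful encounter
p_lie = solve_p(-3, 6, 4, -5)     # apparent utilities after the decoy letter to h
print(p_truth, p_lie)             # 9/14 11/18

# Agent 1's *real* utility in both cases uses its true payoffs (-1 and 6):
u_truth = p_truth * -1 + (1 - p_truth) * 6   # 21/14 = 1.5
u_lie = p_lie * -1 + (1 - p_lie) * 6         # 31/18, about 1.72
print(u_lie > u_truth)  # True: the decoy lie paid off
```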
107
Postmen – return to post office
(Diagrams: a concave example, a subadditive example where h is the decoy, and a phantom example.)
108
Non incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will seem to cost more than it really does).
• Example 3 (from the next slide): A1 hides its letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – unfair under the lie.
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides its letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((fae, bcd); 1/2) = 4 (each saves going to two nodes).
• Under the lie, dividing as (efd, cab); p: you always win and I always lose. Since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with (ab, cdef); p:
  p(4) + (1−p)(0) = p(2) + (1−p)(6)
  4p = −4p + 6
  p = 3/4
• The utility is actually (3/4)(6) + (1/4)(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
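The p = 3/4 step checks out with exact arithmetic (a verification sketch, not from the slides):

```python
from fractions import Fraction

# Apparent (post-lie) utilities: agent 1 gets 4 or 0, agent 2 gets 2 or 6.
# Solve p*4 + (1-p)*0 == p*2 + (1-p)*6  =>  8p = 6.
p = Fraction(6, 8)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6
print(p)  # 3/4

# Agent 1's true expected utility: 6 when it wins the good side, 0 otherwise
# (the hidden delivery to a eats the whole gain on the losing side).
print(p * 6 + (1 - p) * 0)  # 9/2, i.e. 4.5 > 4 for telling the truth
```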
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain.
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – cite previous decisions).
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures: intentional reminding
  – Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory condition.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Graph: the first agent's worth function, rising from 0 at 9:00, through 12:00, to 100 at 16:00.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
(Graph: utility vs. number of negotiation rounds; agent i's and agent j's offers converge over time to a point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable.
(Graph: utility vs. number of negotiation rounds; agent i's and agent j's utility curves never meet.)
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet to show the strength of their feelings.
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  – A attacks B = A u B or A r B
  – A defeats B = A u B, or (A r B and not B u A)
  – A strongly attacks B = A attacks B and not B u A
  – A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated.
• Assumption: an argument is true unless proven false.
131
Admissible Arguments ndash mutually defensible
1. An argument x is attacked by a set S if some member y of S attacks x (y → x).
2. An argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
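These definitions translate directly into code. A naive sketch over a small made-up attack graph (the graph is illustrative, not the one from the slides):

```python
# Attack relation: (x, y) means argument x attacks argument y.
attacks = {("a", "b"), ("b", "a"), ("b", "c"), ("d", "c")}

def conflict_free(s):
    # No member of s attacks another member of s.
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # Every attacker of x must itself be attacked by some member of s.
    attackers = {y for (y, t) in attacks if t == x}
    return all(any((z, y) in attacks for z in s) for y in attackers)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

print(admissible({"a"}))       # True: a's only attacker, b, is counter-attacked by a
print(admissible({"a", "d"}))  # True: d has no attackers at all
print(admissible({"c"}))       # False: c cannot defend itself against b or d
```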
132
(Diagram: arguments a, b, c, d with attack arrows.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
23
Jointly Improving Direction method
Iterate over:
• The mediator helps players criticize a tentative agreement (which could be the status quo).
• The mediator generates a compromise direction (where each of the k issues is a direction in k-space).
• The mediator helps players to find a jointly preferred outcome along the compromise direction, and then proposes a new tentative agreement.
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state, which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to an individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
  – Multiple attributes
    • Single attribute (price) – symmetric scenario (both benefit in the same way from a cheaper price)
    • Multiple attributes – several inter-related attributes, e.g. buying a car
  – The number of agents and the way they interact
    • One-to-one, e.g. single buyer and single seller
    • Many-to-one, e.g. multiple buyers and a single seller (auctions)
    • Many-to-many, e.g. multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
  – If one task requires less travel, both would benefit equally by having less travel
  – Utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric – we would benefit differently if roles were reversed
  – If you delivered the picnic table, you could just throw it in the back of your van. If I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or spoils before it can be used)
• May not even be clear what is up for negotiation ("I didn't realize not having any test was an option") (on the job… ask for stock options, a bigger office, work from home)
28
How many agents are involved
• One-to-one
• One-to-many (an auction is an example of one seller and many buyers)
• Many-to-many (could be divided into buyers and sellers, or all could be identical in role)
  – n(n-1)/2 possible pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  – Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
  – Since an agent can achieve its goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, the agent is worse off than performing its tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
  – where no agent agrees to execute tasks other than its own
  – utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple <T, Ag, c> where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: Tk is agent k's original assignment, Dk is its assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery
  Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries
  Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) ← the conflict deal (the original allocation)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure the deals, knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack"
• a for agent 1 → b or ab for agent 2
• b for agent 1 → a or ab for agent 2
• ab for agent 1 → ∅, a, b, or ab for agent 2
• ∅ for agent 1 → ab for agent 2
36
Utility Function for Agents
Given an encounter (T1, T2), the utility of a deal for each agent is just the difference of costs:
  Utilityk(δ) = c(Tk) - Costk(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal,
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
[Figure: a distribution point connected to city a and city b, each at distance 1; distance 2 between a and b]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
  δ is at least as good for every agent as δ':
    ∀k ∈ {1,2}: Utilityk(δ) ≥ Utilityk(δ')
  δ is better for some agent than δ':
    ∃k ∈ {1,2}: Utilityk(δ) > Utilityk(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'.
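The two conditions translate directly into code. A small sketch, assuming each deal's utilities are stored as a (u1, u2) pair; the sample entries are taken from the parcel example on the following slides:

```python
def dominates(d1, d2, utilities, weak=False):
    """d1 dominates d2 if it is at least as good for every agent and
    strictly better for at least one; weak domination drops the
    second condition."""
    u1, u2 = utilities[d1], utilities[d2]
    at_least_as_good = all(a >= b for a, b in zip(u1, u2))
    strictly_better = any(a > b for a, b in zip(u1, u2))
    return at_least_as_good and (weak or strictly_better)

# Utilities of three deals from the parcel example ("" stands for the
# empty assignment):
utilities = {("a", "b"): (0, 2), ("b", "ab"): (0, 0), ("ab", ""): (-2, 3)}
print(dominates(("a", "b"), ("b", "ab"), utilities))  # True
print(dominates(("a", "b"), ("ab", ""), utilities))   # False (worse for agent 2)
```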
39
Negotiation Set: Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (best deal for x without disadvantaging y)
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from the previous slide)
1. Utility1(a, b) = 0      Utility2(a, b) = 2
2. Utility1(b, a) = 0      Utility2(b, a) = 2
3. Utility1(ab, ∅) = -2    Utility2(ab, ∅) = 3
4. Utility1(∅, ab) = 1     Utility2(∅, ab) = 0
5. Utility1(a, ab) = 0     Utility2(a, ab) = 0
6. Utility1(b, ab) = 0     Utility2(b, ab) = 0
7. Utility1(ab, a) = -2    Utility2(ab, a) = 2
8. Utility1(ab, b) = -2    Utility2(ab, b) = 2
9. Utility1(ab, ab) = -2   Utility2(ab, ab) = 0
41
Individually Rational for Both
(eliminate any choices that are negative for either agent)
All deals:          Individually rational:
1. (a, b)           (a, b)
2. (b, a)           (b, a)
3. (ab, ∅)          (∅, ab)
4. (∅, ab)          (a, ab)
5. (a, ab)          (b, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
42
Pareto Optimal Deals
All deals:          Pareto optimal:
1. (a, b)           (a, b)
2. (b, a)           (b, a)
3. (ab, ∅)          (ab, ∅)
4. (∅, ab)          (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Deals 5–9 are each beaten by the (∅, ab) or (a, b) deals. Deal 3 is (-2, 3), but nothing beats 3 for agent 2, so it is Pareto optimal.
43
Negotiation Set
Individually rational deals:   Pareto optimal deals:   Negotiation set:
(a, b)                         (a, b)                  (a, b)
(b, a)                         (b, a)                  (b, a)
(∅, ab)                        (ab, ∅)                 (∅, ab)
(a, ab)                        (∅, ab)
(b, ab)
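The filtering on the last few slides can be reproduced mechanically. A sketch, assuming the cost function and encounter of the parcel example; it enumerates the nine pure deals, keeps the individually rational ones, and drops the dominated ones:

```python
from itertools import chain, combinations

# Cost function and encounter from the parcel example (no return trip):
# agent 1 starts with a letter to a, agent 2 with letters to a and b.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 3}
T1, T2 = frozenset("a"), frozenset("ab")
all_tasks = T1 | T2

def subsets(s):
    items = sorted(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

# The nine pure deals: every (D1, D2) whose union covers all tasks.
deals = [(d1, d2) for d1 in subsets(all_tasks) for d2 in subsets(all_tasks)
         if d1 | d2 == all_tasks]

def utility(deal):
    d1, d2 = deal
    return (cost[T1] - cost[d1], cost[T2] - cost[d2])

def dominated(d):
    ud = utility(d)
    return any(all(a >= b for a, b in zip(utility(o), ud)) and
               any(a > b for a, b in zip(utility(o), ud))
               for o in deals)

rational = [d for d in deals if all(u >= 0 for u in utility(d))]
negotiation_set = [d for d in rational if not dominated(d)]
for d1, d2 in negotiation_set:
    print(sorted(d1), sorted(d2), utility((d1, d2)))
```

This prints the three deals of the negotiation set above: (∅, ab), (a, b), and (b, a).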
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
[Axes: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: deals plotted as points A–E in the plane of (utility for agent i, utility for agent j). The circle delimits the space of all possible deals; dashed lines mark the utility of the conflict deal for each agent. The negotiation set (Pareto optimal + individually rational) is the arc of the circle above and to the right of the conflict deal.]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
• Concession protocol:
  – At each step t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that:
    • both deals are from the negotiation set
    • ∀i and t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t-1)) – "I propose something less desirable for me"
• Negotiation ending:
  – Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t-1)) for both agents
  – Agreement: ∃j ≠ i such that Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t)) – i's offer is at least as good for j as j's own offer
    • Only A: agree on δ(B,t) – A agrees with B's proposal
    • Only B: agree on δ(A,t) – B agrees with A's proposal
    • Both A and B: agree on the δ(k,t) such that π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
    • Both A and B, and π(δ(A,t)) = π(δ(B,t)): flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Applies to pure deals and mixed deals.
47
The Monotonic Concession Protocol – each agent moves in one direction, toward the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values.)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: explanation or critique of a deal
48
Condition to Consent an Agreement
If each agent finds that the deal proposed by the other is at least as good as or better than its own proposal:
  Utility1(δ2) ≥ Utility1(δ1)  and  Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages:
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the Monotonic Concession Protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
[Figure: agent 1 starts at agent 1's best deal; agent 2 starts at agent 2's best deal]
52
The Zeuthen Strategy
Q: I make a proposal in every round (but it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: each agent asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict compares its loss from making a concession (accepting the opponent's offer) with its loss from causing a conflict (standing by its current offer).
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

  riski = (Utilityi(δi) - Utilityi(δj)) / Utilityi(δi)

where δi and δj are the current offers of agent i and agent j, respectively.
risk is willingness to risk conflict (1 means perfectly willing to risk conflict).
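The formula above is a one-liner in code. A sketch, with the usual convention (an assumption here) that risk is 1 when the agent's own offer is already worth 0, since it then has nothing to lose in a conflict:

```python
def risk(u_own, u_other):
    """Zeuthen risk for one agent: the fraction of its current gain it
    would lose by accepting the opponent's offer. By convention, 1
    when its own offer is worth 0 (nothing to lose in a conflict)."""
    return 1.0 if u_own == 0 else (u_own - u_other) / u_own

# First offers of the parcel example a few slides ahead: agent 1 offers
# (∅, ab), worth (1, 0); agent 2 offers (a, b), worth (0, 2).
print(risk(1, 0))  # agent 1's risk: 1.0
print(risk(2, 0))  # agent 2's risk: 1.0
print(risk(4, 1))  # after a partial concession, risk would be 0.75
```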
55
Risk Evaluation
• riski measures the fraction of its current gain that agent i would lose by conceding. If it is close to one, the opponent's offer gives the agent little (and so the agent is more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as an initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: a line from Ai's best deal to Aj's best deal, with the conflict deal below; arrows mark "How much am I willing to risk a conflict?", "Maximum to gain from agreement", and "Maximum still hoped to gain"]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again in the next round. But not so much that you give up more than you need to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium – if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivered to a; agent 2 delivered to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1:       Utility of agent 2:
Utility1(a, b) = 0        Utility2(a, b) = 2
Utility1(b, a) = 0        Utility2(b, a) = 2
Utility1(∅, ab) = 1       Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1 at agent 1's best deal, agent 2 at agent 2's best deal; the risk calculation tells each "he should concede"]
Zeuthen does not reach a settlement here: neither will actually concede, as there is no middle ground (any concession leaves the conceder with the conflict utility of 0).
61
Parcel Delivery Domain: Example 2 (agents don't return to the distribution point)
[Figure: a ring road; the distribution point connects to a and to d at cost 7 each, with unit-cost edges a–b, b–c, and c–d]
Cost function:
c(∅)=0
c(a)=c(d)=7
c(b)=c(c)=c(ab)=c(cd)=8
c(bc)=c(abc)=c(bcd)=9
c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; e.g., (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain: Example 2 (Zeuthen works here; both concede on equal risk)
No. | Pure deal      | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)      | 0                 | 10
2   | (abc, d)       | 1                 | 3
3   | (ab, cd)       | 2                 | 2
4   | (a, bcd)       | 3                 | 1
5   | (∅, abcd)      | 10                | 0
    | Conflict deal  | 0                 | 0
[Figure: agent 1 concedes from deal 5 toward deal 3; agent 2 concedes from deal 1 toward deal 3]
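The rounds of this example can be simulated. A minimal sketch of the Zeuthen strategy under the Monotonic Concession Protocol, assuming deals are given only by their utility pairs (as in the table above) and that a minimal concession is always available:

```python
# Pure deals from the table above, as (utility to agent 1, utility to agent 2).
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(own, other, agent):
    """Zeuthen risk of `agent` given its own offer and the opponent's."""
    u_own, u_other = own[agent], other[agent]
    return 1.0 if u_own == 0 else (u_own - u_other) / u_own

def zeuthen(deals):
    # Each agent starts at its own best deal.
    offer = [max(deals, key=lambda d: d[0]), max(deals, key=lambda d: d[1])]
    while True:
        # Agreement: some agent likes the other's offer at least as much.
        if offer[1][0] >= offer[0][0]:
            return offer[1]
        if offer[0][1] >= offer[1][1]:
            return offer[0]
        r = [risk(offer[0], offer[1], 0), risk(offer[1], offer[0], 1)]
        conceders = [0, 1] if r[0] == r[1] else ([0] if r[0] < r[1] else [1])
        for i in conceders:
            # Minimal concession: among deals strictly better for the
            # opponent than the current offer, keep the one that costs
            # the conceder the least.
            j = 1 - i
            candidates = [d for d in deals if d[j] > offer[i][j]]
            offer[i] = max(candidates, key=lambda d: d[i])

print(zeuthen(deals))  # (2, 2)
```

Both agents start with equal risk (1), so both concede; their risks stay equal (2/3) in the second round, so both concede again and meet at (2, 2), matching the table.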
63
What bothers you about the previous agreement?
• The agents settle for (2, 2) utility rather than the higher total utility of (0, 10) of another choice
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises about the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, in the sense that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan J are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can always just do your original tasks
• With SOD, you may get in each other's way
• Don't accept partially completed plans
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – The goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
  – A common state exists, but perhaps it cannot be reached with the primitive operations of the domain (they could both travel together, but may need to know how to pick the other up)
  – There might be a reachable state which satisfies both, but it may be too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent would get a certain role in the plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a mixed deal, then
  costi(δ) = p·c(J)i + (1-p)·c(J)k
  where k is i's opponent – the role i plays with probability (1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• Recall the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: a distribution point connected to city a and city b, each at distance 1; distance 2 between a and b]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with a probability
• (∅, ab) : p means agent 1 does ∅ with probability p and ab with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = expected utility for agent 1
• (1-p)(3) + p(0) = expected utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 3p = 3 - 3p  =>  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
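Solving for the fair p is just a linear equation, so it can be done exactly. A sketch, assuming each agent's utility is given at the two extremes p = 1 and p = 0 (the function name `fair_p` is illustrative):

```python
from fractions import Fraction

def fair_p(u1_at_p1, u1_at_p0, u2_at_p1, u2_at_p0):
    """Solve p*u1_at_p1 + (1-p)*u1_at_p0 = p*u2_at_p1 + (1-p)*u2_at_p0
    for p; return None when no single p equalizes the utilities."""
    denom = (u1_at_p1 - u1_at_p0) - (u2_at_p1 - u2_at_p0)
    if denom == 0:
        return None
    return Fraction(u2_at_p0 - u1_at_p0, denom)

# Deal (∅, ab): agent 1's utility is 1 at p=1 and -2 at p=0;
# agent 2's utility is 0 at p=1 and 3 at p=0.
print(fair_p(1, -2, 0, 3))  # 5/6
```

The same function immediately shows why the next slide's deal has no fair p: both endpoints give agent 1 utility 0 and agent 2 utility 2, so the denominator is zero.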
74
Try again with the other choice in the negotiation set
• (a, b) : p means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = expected utility for agent 1
• (1-p)(2) + p(2) = expected utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal of the form δm = [(TA ∪ TB, ∅) : p] such that NS(δm) = max NS(δ)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win – as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cheap role costs two; the expensive role costs six, so the cost averages four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Give A1 the cost-2 role (leaving A2 the cost-6 role) p of the time
• Utility for agent 1 = p(1) + (1-p)(-3)  (it loses utility if it pays 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way – as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent's utility: 26 - 4 - 10 = 12
• Losing agent's utility: -4 (as it helped with the swap but achieved nothing)
• So with probability 1/2 each: 1/2(12) + 1/2(-4) = 4
• If they could have both been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing" – but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple <E, Ag, J, c>:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing each plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle) – worth isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
  – Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – Price: [5, 100], quality: [0, 10], support: [1, 5]
  – Try to find the Pareto optimum
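The weighted-sum option translates directly into code. A minimal sketch, assuming (this is an assumption, not part of the slide) that each attribute value has already been normalized to [0, 1] with 1 best, so a low price scores high; the two sample offers are invented for illustration:

```python
# Weights from the slide: price 60%, quality 15%, support 25%.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    """Weighted-sum utility over normalized attribute scores in [0, 1]."""
    return sum(WEIGHTS[attr] * offer[attr] for attr in WEIGHTS)

offer_a = {"price": 0.8, "quality": 0.5, "support": 0.4}  # cheap, so-so
offer_b = {"price": 0.5, "quality": 0.9, "support": 0.9}  # pricier, better
print(utility(offer_a), utility(offer_b))
```

With these weights the cheap offer scores 0.655 and the pricier one 0.66, so the heavy price weight almost, but not quite, outweighs the quality and support gap.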
89
Incomplete Information
• Agents don't know the tasks of others in a TOD
• Solution:
  – Exchange the missing information
  – Penalty for lying
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union of two task sets is at most the sum of the costs of the separate sets:
  for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of a subadditive domain:
  – Delivering to one city saves part of the distance to the other (in a tree arrangement)
• Example of a subadditive TOD where = holds rather than <:
  – Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of the lie being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  • "I need to pick something up at the store." (You can think of something for them to pick up; but if you are the one assigned, you won't bother to make the trip.)
  • "I need to deliver a letter" – an empty one (no good to anyone, but the deliverer won't discover the lie).
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• When another set of tasks Z is introduced:
  c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). "Rule beginning" is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth-telling more than the other domain classes
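The three domain classes (subadditive, concave, modular) are all properties of the cost function that can be tested by brute force over subsets. A sketch, assuming costs are stored per frozenset; note that the two-city parcel cost function from the earlier slides fails even subadditivity, since c(ab) = 3 > c(a) + c(b) = 2:

```python
from itertools import chain, combinations

def subsets(tasks):
    items = sorted(tasks)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

def is_subadditive(c, tasks):
    # c(X ∪ Y) <= c(X) + c(Y)
    return all(c[x | y] <= c[x] + c[y]
               for x in subsets(tasks) for y in subsets(tasks))

def is_concave(c, tasks):
    # For X ⊆ Y: c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in subsets(tasks) for y in subsets(tasks)
               for z in subsets(tasks) if x <= y)

def is_modular(c, tasks):
    # c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
    return all(c[x | y] == c[x] + c[y] - c[x & y]
               for x in subsets(tasks) for y in subsets(tasks))

# Two-city parcel cost function from the earlier slides:
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 3}
print(is_subadditive(cost, "ab"), is_concave(cost, "ab"),
      is_modular(cost, "ab"))  # False False False
```

By contrast, a fax-domain-style cost such as c(X) = |X| (independent per-task costs) passes all three checks, matching the claim that the fax domain is modular.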
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect Z to add more to the subset (as it is smaller)
• At your seats – is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as the courier was going that way anyway), but adds 2 to its superset Y (as the courier was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table characterizing the relationships (arrows show implied relationships between cells, and within the same domain attribute)
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (postmen return home)
FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2, and his expected utility under [(abc, ∅) : 1/2] is 5
• If he lies: p = 1/2 (as the apparent utility is the same), but his expected utility is 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out)
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (the lie inflates the assumed cost, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)

Agent 1's tasks | U(1) | Agent 2's tasks | Seeming U(2) | Actual U(2)
a               | 2    | bc              | 4            | 4
b               | 4    | ac              | 2            | 2
bc              | 2    | a               | 4            | 2
ab              | 0    | c               | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the expected utility for agent 1 under [(a, b) : 1/2] is 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, as a pure deal allows no probability split. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals, "Decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h too)
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO: lying helped
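The arithmetic above can be checked mechanically. A minimal sketch, assuming (as the slide states) that agent 1's true payoffs are -1 when it delivers everything and 6 when agent 2 does:

```python
from fractions import Fraction

def expected_u1(p):
    """Agent 1's true expected utility when it delivers everything
    with probability p (true payoffs -1 and 6, per the slide)."""
    return p * Fraction(-1) + (1 - p) * Fraction(6)

print(expected_u1(Fraction(9, 14)))   # truth: 3/2 (= 1.5)
print(expected_u1(Fraction(11, 18)))  # decoy lie: 31/18 (~ 1.72)
```

The decoy shifts p from 9/14 to 11/18, raising agent 1's expected utility from 1.5 to about 1.72.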
107
Postmen – return to post office
[Figures: the Concave domain example, the Subadditive domain example (h is the decoy), and the Phantom letter example]
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals, "Hide" lies can be beneficial (as you think I have less, so an increased load will cost me more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals, "Hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((fae), (bcd)):1/2 = 4 (each saves going to two nodes)
• Under the lie, divide as ((ef), (dcab)):p; you always win and I always lose. Since the work looks the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie with ((ab), (cdef)):p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6, so p = 3/4
• Utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally): zero utility for that
110
Modular
111
Conclusion
• In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: a negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law: cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate solution for applicability to current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repair for a past failure is reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff for agreed parties
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
Worth function for the first agent: [graph rising from worth 0 at 0900 to 100 at 1600]
Ref: Rosenschein & Zlotkin, 1994
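The worth curve can be sketched as a piecewise-linear function. The breakpoints (0 at 0900 rising to 100 at 1600) are read off the slide's axes, so treat them as illustrative:

```python
def worth_first_agent(hour):
    """Worth of a meeting time for the first agent: 0 at 0900,
    rising linearly to 100 at 1600 (illustrative breakpoints)."""
    if hour <= 9:
        return 0.0
    if hour >= 16:
        return 100.0
    return 100.0 * (hour - 9) / (16 - 9)

print(worth_first_agent(16))  # 100.0: the agent's ideal time
print(worth_first_agent(12))  # a sub-optimal but perhaps acceptable time
```

Accepting noon instead of 1600 costs the agent worth, which may be offset by other payoffs such as reduced travel.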
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
[Graph: utility vs. number of negotiation rounds; Agent i's offer falls and Agent j's rises over time until they meet at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds; Agent i's and Agent j's curves never cross, so there is no point of acceptance]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. Why did the agent pay so much for the car?
• Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive; doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B: A undercuts B, or A rebuts B
– A defeats B: A undercuts B, or (A rebuts B and B does not undercut A)
– A strongly attacks B: A attacks B and B does not undercut A
– A strongly undercuts B: A undercuts B and B does not undercut A
129
Proposition: hierarchy of attacks
– Undercuts = u
– Strongly undercuts = su = u - u⁻¹
– Strongly attacks = sa = (u ∪ r) - u⁻¹
– Defeats = d = u ∪ (r - u⁻¹)
– Attacks = a = u ∪ r
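These derived notions can be computed directly when undercut and rebut are given as sets of (attacker, target) pairs. A sketch; the tiny relations at the bottom are hypothetical, not from the slides:

```python
def inverse(rel):
    """The inverse relation r^-1: swap every (x, y) to (y, x)."""
    return {(y, x) for x, y in rel}

def derived_attacks(u, r):
    """Derive the attack notions from undercut (u) and rebut (r) relations."""
    return {
        "attacks": u | r,
        "defeats": u | (r - inverse(u)),
        "strongly_attacks": (u | r) - inverse(u),
        "strongly_undercuts": u - inverse(u),
    }

u = {("A", "B")}               # A undercuts B (hypothetical)
r = {("B", "A"), ("A", "C")}   # B rebuts A, A rebuts C (hypothetical)
print(derived_attacks(u, r))
```

For these relations, B's rebuttal of A drops out of "defeats" and "strongly attacks" because A undercuts B back.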
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is in (true) unless proven otherwise
131
Admissible Arguments – mutually defensible
1. argument x is attacked (relative to a set) if some y with y → x is not itself attacked by any member of the set
2. argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
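These definitions translate almost literally into code. A sketch over a small hypothetical attack relation (not necessarily the graph in the slide's figure):

```python
from itertools import combinations

ARGS = {"a", "b", "c", "d"}
# hypothetical attack relation: a and b attack each other, b attacks c, c attacks d
ATTACKS = {("a", "b"), ("b", "a"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    """No member of s attacks another member of s."""
    return not any((x, y) in ATTACKS for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is itself attacked by some member of s."""
    return all(any((z, y) in ATTACKS for z in s)
               for (y, t) in ATTACKS if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

for r in range(len(ARGS) + 1):
    for s in combinations(sorted(ARGS), r):
        if admissible(set(s)):
            print(set(s) or "{}")
```

For this relation the admissible sets are {}, {a}, {b}, {a, c} and {b, d}: each defends all of its members against every attacker.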
132
[Example argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable
133
An Example Abstract Argument System
24
Typical Negotiation Problems
Task-Oriented Domains (TOD): an agent's activity can be defined in terms of a set of tasks that it has to achieve. The target of a negotiation is to minimize the cost of completing the tasks.
State-Oriented Domains (SOD): each agent is concerned with moving the world from an initial state into one of a set of goal states. The target of a negotiation is to achieve a common goal. Main attribute: actions have side effects (positive/negative).
Worth-Oriented Domains (WOD): agents assign a worth to each potential state which captures its desirability for the agent. The target of a negotiation is to maximize mutual worth (rather than worth to the individual).
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes
• Single attribute (price): a symmetric scenario (both benefit in the same way from a cheaper price)
• Multiple attributes: several inter-related attributes, e.g., buying a car
– The number of agents and the way they interact
• One-to-one, e.g., a single buyer and a single seller
• Many-to-one, e.g., multiple buyers and a single seller: auctions
• Many-to-many, e.g., multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric: if roles were reversed, I would benefit the same way you would
– If one task requires less travel, both would benefit equally by having less travel
– utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric: we would benefit differently if roles were reversed
– if you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down and quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example: one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
– n(n-1)/2 negotiating pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents, such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always measured against the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal alone to the cost of its part of the deal
• If utility < 0, the agent is worse off than performing its tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
– no agent agrees to execute tasks other than its own
– utility = 0
31
Formalization of TOD
A Task-Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic: more tasks mean more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
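The two assumptions on c can be checked mechanically for a concrete cost table. A sketch, using the two-city parcel costs that appear later in these slides:

```python
def check_cost_function(cost):
    """Verify c(empty) = 0 and monotonicity: T1 subset of T2 => c(T1) <= c(T2)."""
    assert cost[frozenset()] == 0
    for s, cs in cost.items():
        for t, ct in cost.items():
            if s <= t:  # s is a subset of t
                assert cs <= ct, (set(s), set(t))
    return True

parcel_costs = {frozenset(): 0, frozenset("a"): 1,
                frozenset("b"): 1, frozenset("ab"): 3}
print(check_cost_function(parcel_costs))  # True
```

Note the table is also subadditive (c(ab) = 3 ≤ c(a) + c(b) + 1), but monotonicity is all the definition requires.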
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩ (T_k is agent k's original assignment; D_k is its assignment after the "deal"):
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, T_k ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Cost_k(δ) = c(D_k) (i.e., the cost to k of deal δ is the cost of D_k, k's part of the deal).
33
Examples of TOD
• Parcel Delivery
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figure the deals knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack"
• a for agent 1 → b or ab for agent 2
• b for agent 1 → a or ab for agent 2
• ab for agent 1 → ∅, a, b, or ab for agent 2
• ∅ for agent 1 → ab for agent 2
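The nine deals can be enumerated mechanically: a pure deal is any pair of task subsets whose union covers all tasks. A sketch:

```python
from itertools import combinations

def subsets(tasks):
    """All subsets of a task set, as frozensets."""
    tasks = sorted(tasks)
    return [frozenset(c) for r in range(len(tasks) + 1)
            for c in combinations(tasks, r)]

def pure_deals(t1, t2):
    """All (D1, D2) with D1 union D2 = T1 union T2 (overlap allowed)."""
    union = frozenset(t1) | frozenset(t2)
    return [(d1, d2) for d1 in subsets(union)
            for d2 in subsets(union) if d1 | d2 == union]

deals = pure_deals({"a"}, {"a", "b"})
print(len(deals))  # 9
```

With union {a, b} there are 4 × 4 = 16 subset pairs, of which exactly 9 cover both tasks, matching the slide's list.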
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utility_k(δ) = c(T_k) - Cost_k(δ) = c(T_k) - c(D_k)
where δ = (D1, D2) is a deal,
– c(T_k) is the stand-alone cost to agent k (the cost of achieving its goal with no help), and
– Cost_k(δ) is the cost of its part of the deal.
Note that the utility of the conflict deal is always 0.
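Utility_k is just two table lookups. A sketch, using the two-city cost table from the next slide:

```python
# Costs for the two-city parcel domain (values from the next slide)
cost = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}

def utility(original_tasks, deal_tasks):
    """Utility_k(delta) = c(T_k) - c(D_k)."""
    return cost[frozenset(original_tasks)] - cost[frozenset(deal_tasks)]

print(utility("a", ""))     # agent 1 hands off its delivery: 1
print(utility("ab", "ab"))  # agent 2 keeps its own tasks (conflict deal): 0
```

The conflict deal leaves each agent with its original tasks, so both lookups cancel and the utility is 0, as the slide notes.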
37
Parcel Delivery Domain (assuming they do not have to return home, like U-Haul)
[Figure: distribution point at distance 1 from city a and distance 1 from city b]
Cost function: c(∅) = 0; c(a) = 1; c(b) = 1; c(ab) = 3
Utility for agent 1 (originally has a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally has ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
δ is at least as good for every agent as δ': for all k in {1, 2}, Utility_k(δ) ≥ Utility_k(δ')
δ is better for some agent than δ': for some k in {1, 2}, Utility_k(δ) > Utility_k(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (you cannot improve the deal for one agent without disadvantaging the other)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
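Filtering the nine deals of the running example down to the negotiation set is a direct translation of these definitions (cost table as on the earlier slide):

```python
cost = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}
T = (frozenset("a"), frozenset("ab"))  # original task allocation

deals = [(frozenset(a), frozenset(b)) for a, b in
         [("a", "b"), ("b", "a"), ("ab", ""), ("", "ab"), ("a", "ab"),
          ("b", "ab"), ("ab", "a"), ("ab", "b"), ("ab", "ab")]]

def u(deal):
    """(Utility_1, Utility_2) of a deal: c(T_k) - c(D_k) for each agent."""
    return tuple(cost[T[k]] - cost[deal[k]] for k in (0, 1))

def dominates(d, e):
    return (all(x >= y for x, y in zip(u(d), u(e)))
            and any(x > y for x, y in zip(u(d), u(e))))

rational = [d for d in deals if all(x >= 0 for x in u(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in rational if d in pareto]
print(len(rational), len(pareto), len(ns))  # 5 4 3
```

This reproduces the counts on the following slides: 5 individually rational deals, 4 Pareto optimal deals, and a negotiation set of 3.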
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = -2
8. Utility1(ab, b) = -2
9. Utility1(ab, ab) = -2
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both(eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
The (ab, ∅) deal gives utilities (-2, 3): agent 1 is beaten elsewhere, but nothing beats 3 for agent 2
43
Negotiation Set
Negotiation Set: (a, b), (b, a), (∅, ab)
Individually Rational Deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto Optimal Deals: (a, b), (b, a), (ab, ∅), (∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?
Utility for i
Utility for j
45
Negotiation Set in Task-oriented Domains
[Figure: the space of all possible deals plotted as (utility for agent i, utility for agent j). A circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict utilities, and the negotiation set (Pareto optimal + individually rational) is the boundary arc above and to the right of the conflict deal.]
46
Negotiation Protocol: π(δ) = the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol (one-step protocol)
– Concession protocol:
• At t ≥ 0, A offers δ(A,t) and B offers δ(B,t) such that:
– both deals are from the negotiation set
– for all i and t > 0: Utility_i(δ(i,t)) ≤ Utility_i(δ(i,t-1)) (I propose something less desirable for me)
• Negotiation ending:
– Conflict: Utility_i(δ(i,t)) = Utility_i(δ(i,t-1)) for both agents
– Agreement: for some j ≠ i, Utility_j(δ(i,t)) ≥ Utility_j(δ(j,t))
• Only A ⇒ agree on δ(B,t) (A accepts B's proposal)
• Only B ⇒ agree on δ(A,t) (B accepts A's proposal)
• Both A and B ⇒ agree on the δ(k,t) with π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
• Both A and B with π(δ(A,t)) = π(δ(B,t)) ⇒ flip a coin (the product is the same, but the deals may not be equally good for each agent; flip a coin to decide which deal to use)
Pure deals
Mixed deal
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal
• Meta data: explanation or critique of the deal
48
Condition to Consent an Agreement
Each agent finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient: no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
Agent 1's best deal; agent 2's best deal
52
The Zeuthen Strategy
Q: I make a proposal in every round, though it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
How much am I willing to risk a conflict?
Agent 1's best deal; agent 2's best deal
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when the difference between its loss from conceding and its loss from taking the conflict deal (with respect to its current offer) is small.
• If both are equally willing to risk conflict, both concede
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agents i and j, respectively.
risk is willingness to risk conflict (1 is perfectly willing to risk).
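The risk formula in code, applied to the later Example 2, where each agent's own opening offer gives it utility 3 and the opponent's offer gives it 1:

```python
from fractions import Fraction

def risk(u_own_offer, u_their_offer):
    """Zeuthen risk: fraction of current utility lost by conceding.
    1 means perfectly willing to risk conflict."""
    if u_own_offer <= 0:
        return Fraction(1)  # nothing to lose from conflict
    return Fraction(u_own_offer - u_their_offer, u_own_offer)

print(risk(3, 1), risk(3, 1))  # 2/3 2/3 -> equal risk, so both concede
```

With equal risk on both sides, the Zeuthen rule says both agents concede, which is exactly what happens in that example.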
55
Risk Evaluation
• risk measures the fraction of utility you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as an initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: Ai's best deal, Aj's best deal, and the conflict deal on a line; "maximum to gain from agreement" vs. "maximum still hoped for" illustrate how much each agent is willing to risk a conflict]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again in the next round. But not so much that you give up more than you need to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability: in Nash equilibrium, if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive: players need to compute the entire negotiation set
– Communication burden: the negotiation process may involve several steps
59
Parcel Delivery Domain: recall agent 1 delivers to a; agent 2 delivers to a and b
Negotiation Set: (a, b), (b, a), (∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
Agent 1's best deal; agent 2's best deal: each thinks the other should concede.
Zeuthen does not reach a settlement here, as neither will concede; there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to distribution point)
[Figure: distribution point with cities a, b, c, d; edge weights 7, 7 and 1, 1, 1 as shown]
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse; a deal such as (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No Pure Deal Agent 1s Utility Agent 2s Utility
1 (abcd ) 0 10
2 (abc) d) 1 3
3 (ab cd) 2 2
4 (a bcd) 3 1
5 ( abcd) 10 0
Conflict deal 0 0
Concession order: agent 1 prefers deal 5, then 4, 3, 2, 1; agent 2 prefers the reverse
63
What bothers you about the previous agreement
• Both decide to get (2, 2) utility rather than the expected utility of (0, 10) for another choice
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example: slotted blocks world; blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together"; it is just the joining of plans. There is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In TOD, you cannot do another agent's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can just do your original task
• With SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– A common state may exist, but perhaps it cannot be reached with the primitive operations in the domain (both could travel together, but one may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive, and the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1: a cost of 6 for one agent and 2 for the other
• State 2: a cost of 5 for both agents
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one would get a certain plan
• Called a mixed deal: a deal with a probability attached. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p·c(J)i + (1−p)·c(J)k, where k is i's opponent (the role i plays with probability 1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the agent's role in the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming do not have to return home)
Figure: a distribution point connected to city a and city b, each at distance 1.
Cost function: c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3
Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with probability
• (∅, {a,b}):p means agent 1 does ∅ with probability p, and {a,b} with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
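The solve for p above can be checked mechanically. A minimal sketch in exact arithmetic (the outcome labels are mine; the utility values are the slide's):

```python
from fractions import Fraction

# The two all-or-nothing outcomes from the parcel example:
#   outcome A: agent 1 delivers nothing   -> utilities (1, 0)
#   outcome B: agent 1 delivers a and b   -> utilities (-2, 3)
u1_A, u2_A = Fraction(1), Fraction(0)
u1_B, u2_B = Fraction(-2), Fraction(3)

# Fair p makes the expected utilities equal:
#   p*u1_A + (1-p)*u1_B == p*u2_A + (1-p)*u2_B
p = (u2_B - u1_B) / (u1_A - u1_B - u2_A + u2_B)
expected_u1 = p * u1_A + (1 - p) * u1_B
expected_u2 = p * u2_A + (1 - p) * u2_B
print(p, expected_u1, expected_u2)  # 5/6 1/2 1/2
```

Both agents end up with an expected utility of 1/2, which no pure deal in this encounter could give them simultaneously.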
74
Try again with the other choice in the negotiation set
• ({a}, {b}):p means agent 1 does {a} with probability p, and {b} with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal of the form m = [(T_A ∪ T_B, ∅) : p], with NS(m) = max NS(d)
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias (like switching goals in soccer)
77
Examples: Cooperative
Each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (both goals are compatible)
• Assume a pick up costs 1 and a set down costs 1
• Mutually beneficial: each can pick up at the same time, costing each 2. Win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four, so the utility to each is 2
78
Examples: Compromise
Both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1 but not on the table. Agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? Can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign agent 1 the cheaper role (the 2/6 cost split) p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) (loses utility if it pays 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus I can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win. Could flip a coin to decide who wins: better than both losing. The weightings on the coin needn't be 50-50
• May make sense to have the person with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Examplesemi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(−4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents have available a set of joint plans; a joint plan is executed by several different agents
• Note: not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing each plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example: multi-agent Tile world (like an airport shuttle); it isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality and support: a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, ie you can't get it cheaper without compromising on quality
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
– Try to find the Pareto optimum
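The weighted-sum variant can be sketched directly. The 60/15/25 weights come from the slide; the normalized scores (1.0 best, 0.0 worst) and the two candidate offers are made-up illustrations:

```python
# Weighted-attribute utility: each *_score is normalized so that a cheaper
# price, higher quality, or better support yields a score closer to 1.0.
def utility(price_score, quality_score, support_score):
    return 0.60 * price_score + 0.15 * quality_score + 0.25 * support_score

offer_cheap = utility(price_score=0.9, quality_score=0.4, support_score=0.5)
offer_pricey = utility(price_score=0.3, quality_score=0.9, support_score=0.9)
print(offer_cheap > offer_pricey)  # the cheap offer wins under these weights
```

With price weighted at 60%, the cheaper offer scores 0.725 against 0.54; a different weighting could reverse the ranking.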
89
Incomplete Information
• Don't know the tasks of others in TOD
• Solution:
– Exchange missing information
– Penalty for a lie
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• the cost of the union is less than or equal to the sum of the costs of the separate sets
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– deliveries in opposite directions: doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at a store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). A rule beginning is a fixed point
• For example: what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system: Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (no, unless restricted to trees)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
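A quick way to see why a fax-like domain is modular: if each destination costs one connection regardless of what else is sent, the identity holds for every pair of task sets. A small sketch (the three destinations are made up):

```python
from itertools import combinations

# Fax-style cost function: one unit per distinct destination, so the cost
# of a task set is independent of how tasks are grouped.
def c(tasks):
    return len(set(tasks))

dests = ["x", "y", "z"]
subsets = [set(s) for r in range(4) for s in combinations(dests, r)]

# Modularity: c(X U Y) == c(X) + c(Y) - c(X n Y) for every pair of sets.
assert all(c(X | Y) == c(X) + c(Y) - c(X & Y) for X in subsets for Y in subsets)

# Modularity implies concavity: Z adds at least as much to X (a subset of Y)
# as it adds to Y.
assert all(c(Y | Z) - c(Y) <= c(X | Z) - c(X)
           for X in subsets for Y in subsets for Z in subsets if X <= Y)
print("fax-style cost function is modular, hence concave")
```

The postmen domain fails this check in general, since the cost of a set of deliveries depends on the route, not just on which destinations appear.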
101
3-dimensional table of characterization of relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals: "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util: (abc):1/2 = 5
• Lie: p = 1/2 (as the utility is the same)
• Expected util (for 1): (abc):1/2 = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie)
(Figure: delivery graph with edge costs 1, 4, 4, 1)
103
• FP2: in Subadditive TOD, any ONM over mixed deals: every "phantom" lie has a positive probability of being discovered (as, if the other person delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over mixed deals: no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over pure deals: no "decoy" lie is beneficial (modular tends to add the exact cost: hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are (4, 2), and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 gets | U(1) | Agent 2 gets | U(2) seems | U(2) actual
a            | 2    | bc           | 4          | 4
b            | 4    | ac           | 2          | 2
bc           | 2    | a            | 4          | 2
ab           | 0    | c            | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over pure deals: "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is (ab):1/2 = 1/2(4) + 1/2(2) = 3
• Lie: (b, ca) is the logical division, as there is no percentage. Util for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over all-or-nothing deals: "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as:
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• SO: lying helped
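The two probabilities and utilities on this slide can be reproduced by solving the equal-expected-utility equation once in general form (a sketch; the utility values are the slide's):

```python
from fractions import Fraction

def fair_p(uA1, uA2, uB1, uB2):
    # Probability p of outcome A (agent 1 delivers everything) solving
    # p*uA1 + (1-p)*uB1 == p*uA2 + (1-p)*uB2 (equal expected utilities).
    return Fraction(uB2 - uB1, (uA1 - uB1) - (uA2 - uB2))

# Truthful encounter: agent 1 gets -1 if it delivers everything and 6
# otherwise; agent 2 gets 4 and -3 respectively.
p_truth = fair_p(-1, 4, 6, -3)

# With the decoy letter to h, agent 1 appears to get -3 / 6 and agent 2
# appears to get 4 / -5 in the corresponding branches.
p_lie = fair_p(-3, 4, 6, -5)

# Agent 1's real expected utility uses its true values -1 / 6 in both
# cases, since the decoy letter is never actually produced.
u_truth = p_truth * (-1) + (1 - p_truth) * 6
u_lie = p_lie * (-1) + (1 - p_lie) * 6
print(p_truth, p_lie, u_truth, u_lie)  # 9/14 11/18 3/2 31/18
```

The lie shifts the agreed probability in agent 1's favor (11/18 instead of 9/14 of doing all the work becomes less likely for the favorable branch), raising its real expected utility from 3/2 to 31/18.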
107
Postmen: return to the post office
Concave
Subadditive (h is the decoy)
Phantom
108
Non incentive compatible fixed points
• FP7: in Modular TOD, any ONM over pure deals: "hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4: UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over mixed deals: "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(faebcd):1/2 = 4 (each saves going to two nodes)
• Under the lie, divide as (efdcab):p (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie: (abcdef):p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• Utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef 1/4 of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally): zero utility for that
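The p = 3/4 and 4.5 figures follow directly from the slide's equation (a sketch mirroring it in exact arithmetic):

```python
from fractions import Fraction

# Apparent utilities under the hide lie, read off the slide's equation
#   p*4 + (1-p)*0 == p*2 + (1-p)*6
# (agent 1 appears to get 4 or 0; agent 2 gets 2 or 6).
p = Fraction(6 - 0, (4 - 0) - (2 - 6))

# Agent 1's real expected utility: when it wins (probability 3/4) it saves
# 6; when it loses it must still deliver the hidden letter to a, so that
# branch is worth 0.
real_u = p * 6 + (1 - p) * 0
print(p, real_u)  # 3/4 9/2
```

9/2 = 4.5 exceeds the truthful utility of 4, which is exactly the FP8 claim that hide lies can pay off under mixed deals in modular domains.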
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law: cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– The repair for a past failure is reused: reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (eg reduced travel costs).
Figure: worth function for the first agent, rising from 0 at 0900 to 100 at 1600.
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
Figure: utilities of Agenti and Agentj plotted against the number of negotiation rounds, converging to a point of acceptance.
123
Utility Graphs - no agreement
• No agreement: Agentj finds the offer unacceptable
Figure: utilities of Agenti and Agentj plotted against the number of negotiation rounds, never converging.
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. Why did the agent pay so much for the car?
• Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive; doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation: an argument is a pair (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A undercuts B, or A rebuts B
– A defeats B = A undercuts B, or (A rebuts B and B does not undercut A)
– A strongly attacks B = A attacks B and B does not undercut A
– A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition Hierarchy of attacks
Undercuts: u
Strongly undercuts: su = u − u⁻¹
Strongly attacks: sa = (u ∪ r) − u⁻¹
Defeats: d = u ∪ (r − u⁻¹)
Attacks: a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked (with respect to a set of arguments) if some y with y → x is not itself attacked by a member of the set
2. argument x is acceptable if every attacker of x is attacked
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
Figure: four arguments a, b, c, d and the attacks among them.
Which sets of arguments can be true? c is always attacked; d is always acceptable.
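The admissibility definitions above can be run mechanically. The slide's figure is not fully legible, so the attack graph below is an assumed reconstruction (a and b attack each other, both attack c, and c attacks d); it does reproduce the stated behavior of c and d:

```python
from itertools import combinations

# Assumed attack relation for the four-argument example.
args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(S):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    # every attacker of x is itself attacked by some member of S
    return all(any((s, y) in attacks for s in S)
               for y in args if (y, x) in attacks)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

sets = [set(c) for r in range(5) for c in combinations(sorted(args), r)]
print([sorted(S) for S in sets if admissible(S)])
# [[], ['a'], ['b'], ['a', 'd'], ['b', 'd']]
```

Under this graph, d appears in admissible sets whenever a or b defends it, while c appears in none: exactly the "c is always attacked, d is always acceptable" observation.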
133
An Example Abstract Argument System
25
Complex Negotiations
• Some attributes that make the negotiation process complex are:
– Multiple attributes
• Single attribute (price): a symmetric scenario (both benefit in the same way from a cheaper price)
• Multiple attributes: several inter-related attributes, eg buying a car
– The number of agents and the way they interact
• One-to-one, eg a single buyer and a single seller
• Many-to-one, eg multiple buyers and a single seller: auctions
• Many-to-many, eg multiple buyers and multiple sellers
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
– If one task requires less travel, both would benefit equally by having less travel
– utility for a task is experienced the same way by whomever is assigned to that task
• Non-symmetric: we would benefit differently if roles were reversed
– if you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down, quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job… ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
– n(n−1)/2 pairs
29
Negotiation DomainsTask-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents; such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
– Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, it is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if agents fail to reach an agreement
– where no agent agrees to execute tasks other than its own
– utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → ℝ⁺ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): ∀ T1 ⊆ T2, c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩; T is the original assignment, D is the assignment after the "deal"
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate
• A pure deal on an encounter is a redistribution of tasks among the agents: (D1, D2) such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (ie the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery:
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal
• Database Queries:
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). You are doing a join as part of another operation, so please save the results for me
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b}) (the conflict deal)
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
35
Figure deals knowing union must be ab
• Choices for the first agent: ∅, {a}, {b}, {a,b}
• The second agent must "pick up the slack"
• {a} for agent 1: {b} | {a,b} (for agent 2)
• {b} for agent 1: {a} | {a,b}
• {a,b} for agent 1: ∅ | {a} | {b} | {a,b}
• ∅ for agent 1: {a,b}
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0
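The definition translates directly into code. A minimal sketch, instantiated with the parcel example (agent 1 starts with {a}, agent 2 with {a,b}; the empty string stands in for ∅):

```python
# Cost function from the parcel delivery example.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}

def c(tasks):
    return cost[frozenset(tasks)]

def utility(k_tasks, k_role):
    # stand-alone cost minus the cost of k's part of the deal
    return c(k_tasks) - c(k_role)

# Conflict deal (everyone keeps their own tasks): utility 0 for both.
print(utility("a", "a"), utility("ab", "ab"))  # 0 0
# Deal (empty, {a,b}): agent 1 gains 1, agent 2 gains nothing.
print(utility("a", ""), utility("ab", "ab"))  # 1 0
```

Modeling task sets as frozensets keeps the cost lookup independent of how a set is written, which mirrors assumption 2 of the formalization.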
37
Parcel Delivery Domain (assuming they do not have to return home; like U-Haul)
Figure: a distribution point connected to city a and city b, each at distance 1.
Cost function: c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3
Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
2
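The utility arithmetic on this slide can be sketched in a few lines of Python. This is an illustrative sketch, not code from the slides; the cost table encodes this slide's example (cities a and b at distance 1 from the distribution point, so visiting both costs 3).

```python
# Sketch of Utility_k(delta) = c(T_k) - c(D_k) for the parcel example.
# Delivery sets are strings of city names; the cost table is this slide's.

COST = {frozenset(): 0,
        frozenset("a"): 1,
        frozenset("b"): 1,
        frozenset("ab"): 3}

def cost(tasks):
    return COST[frozenset(tasks)]

def utility(agent_tasks, agent_part):
    """Utility_k(delta) = c(T_k) - c(D_k): stand-alone cost minus deal cost."""
    return cost(agent_tasks) - cost(agent_part)

T1, T2 = "a", "ab"           # agent 1 delivers to a; agent 2 to a and b

print(utility(T1, "a"))      # deal (a, b): agent 1 gets 0
print(utility(T1, "ab"))     # deal (ab, ∅): agent 1 gets -2
print(utility(T2, "b"))      # deal (a, b): agent 2 gets 2
print(utility(T2, ""))       # deal (ab, ∅): agent 2 gets 3
```

The printed values match entries 1 and 3 of the two utility lists above.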
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
δ is at least as good for every agent as δ':
∀k ∈ {1, 2}: Utility_k(δ) ≥ Utility_k(δ')
δ is better for some agent than δ':
∃k ∈ {1, 2}: Utility_k(δ) > Utility_k(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'.
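The two conditions above translate directly into code. A minimal sketch, assuming deals are represented by their utility vectors (u1, u2) as in the example slides:

```python
# Dominance checks over utility vectors (one entry per agent).

def weakly_dominates(d, e):
    """d is at least as good for every agent as e."""
    return all(dk >= ek for dk, ek in zip(d, e))

def dominates(d, e):
    """Weak dominance plus strictly better for at least one agent."""
    return weakly_dominates(d, e) and any(dk > ek for dk, ek in zip(d, e))

# From the parcel example: (a, b) gives (0, 2); (a, ab) gives (0, 0);
# (ab, ∅) gives (-2, 3).
print(dominates((0, 2), (0, 0)))         # True: better for agent 2, no worse for 1
print(dominates((0, 2), (-2, 3)))        # False: worse for agent 2
print(weakly_dominates((0, 0), (0, 0)))  # True: a deal weakly dominates itself
```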
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you have already).
• A deal δ is called Pareto optimal if there does not exist another deal δ' that dominates δ (the best deal for x without disadvantaging y).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from the previous slide)
Utilities for agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = -2
8. Utility1(ab, b) = -2
9. Utility1(ab, ab) = -2
Utilities for agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individually Rational for Both (eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
(Deals 5–9 are each beaten by the (a, b) deal; deal 3 is (-2, 3), but nothing beats 3 for agent 2.)
43
Negotiation Set
Individually rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection of the two): (a, b), (b, a), (∅, ab)
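The whole construction above can be checked by brute force. A sketch, assuming deals are keyed by the two agents' delivery sets (with "" for ∅) and using the utility table from the previous slides; the conflict deal (a, ab) has utility (0, 0):

```python
# Brute-force negotiation set for the parcel example:
# individually rational deals that no other deal dominates.

UTIL = {("a", "b"): (0, 2),   ("b", "a"): (0, 2),   ("ab", ""): (-2, 3),
        ("", "ab"): (1, 0),   ("a", "ab"): (0, 0),  ("b", "ab"): (0, 0),
        ("ab", "a"): (-2, 2), ("ab", "b"): (-2, 2), ("ab", "ab"): (-2, 0)}

def dominates(d, e):
    return all(x >= y for x, y in zip(d, e)) and any(x > y for x, y in zip(d, e))

# Weakly dominating the conflict deal (utility (0, 0)) means both utilities >= 0.
individually_rational = {d for d, u in UTIL.items() if u[0] >= 0 and u[1] >= 0}
pareto_optimal = {d for d, u in UTIL.items()
                  if not any(dominates(v, u) for v in UTIL.values())}
negotiation_set = individually_rational & pareto_optimal

print(sorted(negotiation_set))   # [('', 'ab'), ('a', 'b'), ('b', 'a')]
```

The result reproduces the three deals listed on this slide; (ab, ∅) is Pareto optimal but not individually rational, so it drops out.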
44
Negotiation Set Illustrated
• Create a scatter plot of the utility for i against the utility for j.
• Only the deals where both utilities are positive are individually rational for both (the origin is the conflict deal).
• Which are Pareto optimal?
[Axes: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: deals plotted by utility for agent i vs. utility for agent j. The circle delimits the space of all possible deals; the conflict deal sits at (utility of the conflict deal for i, utility of the conflict deal for j). The negotiation set (Pareto optimal + individually rational) is the boundary arc above and to the right of the conflict deal.]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol (a one-step protocol)
– Concession protocol
• At each t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that:
– both deals are from the negotiation set, and
– ∀i, ∀t > 0: Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t-1)) – I propose something less desirable for me.
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t-1)) (neither agent concedes)
– Agreement: ∃j ≠ i: Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
• Only A ⇒ agree on δ(B, t) (A agrees with B's proposal)
• Only B ⇒ agree on δ(A, t) (B agrees with A's proposal)
• Both A and B ⇒ agree on the δ(k, t) with π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Pure deals
Mixed deal
47
The Monotonic Concession Protocol – movement is in one direction, toward the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On each round, the agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same one).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values.)
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal.
• Meta-data: explanation or critique of a deal.
48
Condition to Consent to an Agreement
If each agent finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetric (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic concession protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling the other agent what you value.)
[Figure: agent 1's best deal vs. agent 2's best deal]
52
The Zeuthen Strategy
Q: I make a proposal in every round (though it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: between agent 1's best deal and agent 2's best deal – how much is each willing to risk a conflict?]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero).
– In case conflict occurs, you are not much worse off.
– So you are more willing to risk conflict.
An agent's willingness to risk conflict is measured by the ratio between its loss from making a concession and its loss from causing a conflict, with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
       = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agents i and j, respectively.
You have to calculate:
• how much you will lose if you make a concession and accept your opponent's offer, and
• how much you will lose if you stand still, which causes a conflict.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk it).
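A single Zeuthen round can be sketched from this formula. This is an illustrative sketch (function names are mine, not from the slides); it assumes each agent knows the utilities of both current offers, and that the agent with the lower risk concedes, with both conceding on a tie:

```python
# One round of the Zeuthen concession decision.

def risk(my_util_of_my_offer, my_util_of_their_offer):
    """risk_i = (Utility_i(d_i) - Utility_i(d_j)) / Utility_i(d_i)."""
    if my_util_of_my_offer == 0:
        return 1.0   # nothing to lose by conflict: fully willing to risk it
    return (my_util_of_my_offer - my_util_of_their_offer) / my_util_of_my_offer

def who_concedes(u1_own, u1_other, u2_own, u2_other):
    """Lower-risk agent concedes; equal risks mean both concede."""
    r1, r2 = risk(u1_own, u1_other), risk(u2_own, u2_other)
    if r1 == r2:
        return "both"
    return "agent 1" if r1 < r2 else "agent 2"

# Parcel example: agent 1 offers (∅, ab), worth (1, 0); agent 2 offers
# (a, b), worth (0, 2). Each offer gives the other agent 0.
print(who_concedes(1, 0, 2, 0))   # both: risks are 1 and 1
```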
55
Risk Evaluation
• risk_i measures the fraction of utility you still have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility function.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: between A_i's best deal and A_j's best deal, relative to the conflict deal – how much am I willing to risk a conflict? The maximum to gain from agreement vs. the maximum one can still hope to gain.]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again in the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
– Computationally expensive – the players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0, Utility1(b, a) = 0, Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2, Utility2(b, a) = 2, Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal vs. agent 2's best deal – each thinks the other should concede.]
Zeuthen does not reach a settlement here: neither agent will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (the agents don't return to the distribution point)
[Figure: distribution point connected to city a (distance 7) and city d (distance 7); cities b and c lie between a and d, with a–b, b–c, and c–d each at distance 1.]
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal. (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here; both concede on equal risk)
No. | Pure deal | Agent 1's utility | Agent 2's utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
  | Conflict deal | 0 | 0
Agent 1 concedes 5 → 4 → 3; agent 2 concedes 1 → 2 → 3.
63
What bothers you about the previous agreement?
• The agents decide to both get (2, 2) utility rather than the (0, 10) outcome whose total utility is higher.
• Is there a solution?
• Fairness versus higher global utility.
• Restrictions of this method: no promises about the future, and no sharing of utility.
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents.
• Example – slotted blocks world: blocks cannot go anywhere on the table, only into slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers can get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of the two plans. There is no joint goal.
• The actions taken by agent k in the joint plan J are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD, coordinated plans are never worse, as each agent can always just do its original task.
• With SOD, agents may get in each other's way.
• Don't accept partially completed plans.
The state oriented domain is a bit more powerful than TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform the tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of a Final State
• The goal of each agent is represented as the set of states it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car).
– A common state exists, but perhaps it cannot be reached with the primitive operations of the domain (they could travel together, but may need to know how to pick up the other).
– There might be a reachable state that satisfies both, but it is too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if the choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed Deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability with which each agent gets a certain plan.
• This is called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent – the role i plays with probability (1-p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the deal.
• For the postman example:
72
Parcel Delivery Domain (assuming the agents do not have to return home)
[Figure: distribution point with city a and city b, each at distance 1 from the distribution point and distance 2 from each other.]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (org a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (org ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• [(ab, ∅)] : p means agent 1 does ∅ with probability p and ab with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
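The linear equation above can be solved once in general form. A sketch (the helper name and parameterization are mine): if role split A happens with probability p, fairness means p·a1 + (1-p)·b1 = p·a2 + (1-p)·b2, which gives p = (b2 - b1) / ((a1 - b1) - (a2 - b2)).

```python
# Solve for the probability that equalizes the two agents' expected utilities.
from fractions import Fraction

def fair_p(u1_role_a, u1_role_b, u2_role_a, u2_role_b):
    """p such that p*a1 + (1-p)*b1 == p*a2 + (1-p)*b2 (exact arithmetic)."""
    num = Fraction(u2_role_b - u1_role_b)
    den = Fraction((u1_role_a - u1_role_b) - (u2_role_a - u2_role_b))
    return num / den          # raises ZeroDivisionError if no p exists

# Deal 3 as a lottery: with probability p agent 1 does ∅ (utility 1) and
# agent 2 does ab (utility 0); otherwise the roles swap (-2 and 3).
p = fair_p(1, -2, 0, 3)
print(p)                      # 5/6
```

For the (a, b)/(b, a) lottery on the next slide, both roles give each agent the same utility, so the denominator is 0 and no fair p exists — matching the "0 = 2, no solution" calculation there.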
74
Try again with the other choice in the negotiation set
• [(a, b)] : p means agent 1 does a with probability p and b with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this deal fair?
75
Mixed Deal
• All-or-nothing deal (one agent does everything): the mixed deal δ_m = [(T_A ∪ T_B, ∅)] : p such that δ_m is in the negotiation set and maximizes the product of the utilities.
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is in slot 1 and the black block in slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each agent 2 – a win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each is worse off than if the other agent weren't there
• Slotted blocks world: initially the white block is in slot 1, the black block in slot 2, and two gray blocks in slot 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair".
• Assign the cost split (2, 6) p of the time (A1 gets the easy role with probability p):
• Utility for agent 1 = p(1) + (1-p)(-3)   (it loses utility if it spends 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: Conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: Semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, a number pulled out of the air).
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest.
83
Example: Semi-cooperative, continued
• Winning agent's utility: 26 - 4 - 10 = 12
• Losing agent's utility: -4 (as it helped with the swap)
• So with probability 1/2 each: 12·(1/2) + (-4)·(1/2) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• The agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain: Definition
Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: the cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation: the agents may reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle): the result isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimality – find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
– price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum.
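The weighted-attribute scheme above is a one-liner in practice. A sketch — the slide only fixes the 60/15/25 split; the per-attribute scores below (normalized to [0, 1], higher is better) are illustrative assumptions:

```python
# Weighted-sum utility over multiple attributes (weights from the slide).

WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    """Weighted sum of per-attribute scores in [0, 1] (higher is better)."""
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

offer_a = {"price": 0.9, "quality": 0.4, "support": 0.5}   # cheap, mediocre
offer_b = {"price": 0.5, "quality": 0.9, "support": 0.9}   # pricier, better

print(utility(offer_a))   # ≈ 0.725
print(utility(offer_b))   # ≈ 0.66
```

With price weighted at 60%, the cheap offer wins here; changing the weights is exactly the "change the importance attached to an issue" move discussed under argumentation later.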
89
Incomplete Information
• Agents don't know the tasks of others in TOD.
• Solution:
– exchange the missing information
– penalty for lying
• Possible lies:
– false information
  • hiding letters
  • phantom letters
– not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union of two task sets is at most the sum of the costs of the separate sets:
for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strict subadditivity: delivering to one city saves distance to the other (in a tree arrangement).
• Example of subadditivity with equality (= rather than <): deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
91
Decoy Tasks
• We call producible phantom tasks decoy tasks (there is no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• "I need to pick something up at the store." (You can think of something for them to pick up; but if you are the one assigned, you won't bother to make the trip.)
• "I need to deliver a letter" – an empty letter (no good, but the deliverer won't discover the lie).
92
Incentive Compatible Mechanisms
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of an arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y, and another task set Z is introduced:
c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of the Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains; the rule's beginning is a fixed point).
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others.
97
For subadditive domain
98
Attributes of the task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect Z to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No – unless it is restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as you were going that way anyway), but adds 2 to its superset Y (as you were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as the costs are independent).
• Modularity implies concavity.
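The three properties (subadditive, concave, modular) can be checked mechanically over a small task set. A sketch under an assumed fax-like cost function (one unit per distinct destination, costs independent), which should come out modular — and hence concave and subadditive:

```python
# Brute-force checks of subadditivity, concavity, and modularity.
from itertools import chain, combinations

TASKS = {"a", "b", "c"}

def cost(ts):
    return len(ts)   # fax-like: each destination costs 1, independently

def subsets(s):
    s = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def is_subadditive(c, tasks):
    return all(c(x | y) <= c(x) + c(y)
               for x in subsets(tasks) for y in subsets(tasks))

def is_concave(c, tasks):
    # For every X ⊆ Y and every Z: adding Z to X costs at least as much.
    return all(c(x | z) - c(x) >= c(y | z) - c(y)
               for x in subsets(tasks) for y in subsets(tasks) if x <= y
               for z in subsets(tasks))

def is_modular(c, tasks):
    return all(c(x | y) == c(x) + c(y) - c(x & y)
               for x in subsets(tasks) for y in subsets(tasks))

print(is_subadditive(cost, TASKS), is_concave(cost, TASKS), is_modular(cost, TASKS))
# modular implies concave implies subadditive
```

Swapping in a different `cost` (e.g., shortest delivery tours on a graph with loops) lets you test the "postmen domain is not concave" claim from the earlier slide.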
101
3-dimensional table of characterization: relationships implied between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (postmen return home)
FP1: in a subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides its letter to c, its utility doesn't increase.
• If it tells the truth: p = 1/2, and its expected utility of [(abc, ∅)] : 1/2 is 5.
• If it lies: p = 1/2 (as the apparent utility is the same), and its expected utility (for agent 1) of [(abc, ∅)] : 1/2 is ½(0) + ½(2) = 1 (as it still has to deliver the hidden letter).
[Figure: delivery graph with edge costs 1, 4, 4, 1.]
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown – the apparent benefit is no different from the real benefit.
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(We assume there is some way of deciding who gets the better deal that is fair over time.)
Agent 1's part | U(1) | Agent 2's part | U(2) apparent | U(2) actual
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c, and his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of [(ab, ∅)] : 1/2 is ½(4) + ½(2) = 3.
• Lie: (bc, a) is the logical division, as there is no percentage split. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h too").
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 31/18 ≈ 1.72. (If I deliver, I don't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, since
p(-1) + (1-p)(6) = p(4) + (1-p)(-3) ⇒ 14p = 9
• If he invents task h, p = 11/18, since
p(-3) + (1-p)(6) = p(4) + (1-p)(-5) ⇒ 18p = 11
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
107
Postmen – return to the post office
[Figure labels: Concave; Subadditive (h is the decoy); Phantom.]
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (you think I have fewer tasks, so an increased load appears to cost me more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• (e, b): the utility for A1 (under the lie) is 0, and the utility for A2 (under the lie) is 4 – UNFAIR (under the lie).
• (b, e): the utility for A1 (under the lie) is 2, and the utility for A2 (under the lie) is 2.
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util([(fae, bcd)] : 1/2) = 4 (each saves going to two cities).
• Under the lie, dividing as [(efd, cab)] : p cannot work: one side always wins and the other always loses, and since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with the all-or-nothing deal [(abcdef, ∅)] : p:
p(4) + (1-p)(0) = p(2) + (1-p)(6)
4p = -4p + 6
p = 3/4
• The utility is actually (3/4)(6) + (1/4)(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or to create cooperative situations.
• Techniques required:
– representing and maintaining belief models
– reasoning about other agents' beliefs
– influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
– company
– union
– mediator
• Tasks:
– generation of a proposal
– generation of a counter-proposal based on feedback from the dissenting party
– persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides for the present negotiation (like in a court of law – citing previous decisions).
• Process:
– retrieve appropriate precedent cases from memory
– select the most appropriate case
– construct an appropriate solution
– evaluate the solution for applicability to the current case
– modify the solution appropriately
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– minimizes the need for information exchange
– avoids problems by reasoning from past failures: intentional reminding
– repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Derives an overall utility curve from the individual ones.
• Expresses the trade-offs an agent is willing to make.
• Properties of the proposed compromise:
– maximizes the joint payoff
– minimizes the payoff difference
118
Persuasive Argumentation
• Argumentation goals:
– ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– change the importance attached to an issue
– change the utility value of an issue
119
Narrowing Differences
• Get feedback from the rejecting party:
– objectionable issues
– reason for rejection
– importance attached to the issues
• Increase the payoff of the rejecting party by a greater amount than the payoff reduction for the agreed parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory condition.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, …
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent, rising from 0 to 100 over the times 9:00, 12:00, 16:00.]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Figure: utility vs. number of negotiation rounds; agent i's and agent j's offer curves converge to the point of acceptance.]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's offer curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B = A u B or A r B
  – A defeats B = A u B or (A r B and not B u A)
  – A strongly attacks B = A a B and not B u A
  – A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked by a set S if some member y of S attacks x (y → x)
2. argument x is acceptable with respect to S if every attacker of x is attacked by S
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are counter-attacked)
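These definitions are easy to check mechanically. A minimal sketch in Python, over a small made-up attack relation (the arguments and attacks below are illustrative, not the example on the slides):

```python
# Minimal sketch of the admissibility definitions over a made-up attack
# relation: (x, y) in `attacks` means "x attacks y".
attacks = {("b", "a"), ("c", "b"), ("d", "c")}

def attackers(x):
    return {u for (u, v) in attacks if v == x}

def conflict_free(S):
    # no member of S attacks another member of S
    return not any((u, v) in attacks for u in S for v in S)

def acceptable(x, S):
    # every attacker of x is itself attacked by some member of S
    return all(any((u, y) in attacks for u in S) for y in attackers(x))

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

print(admissible({"d"}))       # True: d has no attackers
print(admissible({"b", "d"}))  # True: d counter-attacks b's attacker c
print(admissible({"a"}))       # False: a's attacker b goes unanswered
```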
132
[Figure: four arguments a, b, c, d with attacks among them]
Which sets of arguments can be true? c is always attacked;
d is always acceptable.
133
An Example Abstract Argument System
26
Single issue negotiation
• Like money
• Symmetric (if roles were reversed, I would benefit the same way you would)
  – If one task requires less travel, both would benefit equally by having less travel
  – the utility of a task is experienced the same way by whomever is assigned to it
• Non-symmetric – we would benefit differently if roles were reversed
  – if you delivered the picnic table, you could just throw it in the back of your van; if I delivered it, I would have to rent a U-Haul to transport it (as my car is small)
27
Multiple Issue negotiation
• Could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down and quality goes up)
• Not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• May not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, a bigger office, working from home)
28
How many agents are involved
• One to one
• One to many (an auction is an example: one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
  – n(n−1)/2 pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  – Utility represents how much an agent has to gain from the deal (it is always measured relative to the original allocation)
  – Since an agent can achieve its goal on its own, it can compare the cost of achieving the goal alone to the cost of its part of the deal
• If utility < 0, the agent is worse off than performing its tasks on its own
• Conflict deal (stay with the status quo) if the agents fail to reach an agreement
  – no agent agrees to execute tasks other than its own
  – utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R⁺ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic: more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment and D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of the tasks among the agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of deal δ is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery: several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the travel cost of each courier is minimal.
• Database Queries: several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b}) – the conflict deal
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
35
Figure deals knowing union must be ab
• Choices for the first agent: ∅, {a}, {b}, {a,b}
• The second agent must "pick up the slack"
• {a} for agent 1 → {b} or {a,b} for agent 2
• {b} for agent 1 → {a} or {a,b}
• {a,b} for agent 1 → ∅, {a}, {b}, or {a,b}
• ∅ for agent 1 → {a,b}
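The counting above can be reproduced mechanically. A small sketch (Python; the two tasks a and b are from the running parcel example): a pure deal is an ordered pair of subsets whose union covers both tasks.

```python
from itertools import combinations

# A pure deal assigns each agent a subset of the tasks so that together
# they cover everything.
def subsets(items):
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

tasks = frozenset("ab")
deals = [(d1, d2)
         for d1 in subsets(tasks)
         for d2 in subsets(tasks)
         if d1 | d2 == tasks]

print(len(deals))  # 9 distinct pure deals, matching the enumeration above
```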
36
Utility Function for Agents
Given an encounter (T1, T2), the utility of a deal for each agent is just the difference of costs:
  Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
[Figure: distribution point with city a and city b at distance 1 each; a and b are distance 2 apart]
Cost function: c(∅) = 0, c({a}) = 1, c({b}) = 1, c({a,b}) = 3
Utility for agent 1 (originally a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
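The definition Utilityk(δ) = c(Tk) − c(Dk) can be applied directly to these costs. A minimal sketch (Python), using the encounter above (agent 1 originally assigned a, agent 2 assigned a and b):

```python
# Utility_k(delta) = c(T_k) - c(D_k) for the parcel delivery encounter.
cost = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}
T = (frozenset("a"), frozenset("ab"))  # original assignments (T1, T2)

def utility(k, deal):
    # k is 1 or 2; deal is a pair of task sets (D1, D2)
    return cost[T[k - 1]] - cost[deal[k - 1]]

best_for_1 = (frozenset(), frozenset("ab"))   # agent 1 does nothing
print(utility(1, best_for_1), utility(2, best_for_1))    # 1 0

worst_for_1 = (frozenset("ab"), frozenset())  # agent 1 does everything
print(utility(1, worst_for_1), utility(2, worst_for_1))  # -2 3
```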
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
  – δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utilityk(δ) ≥ Utilityk(δ′)
  – δ is better for some agent than δ′: ∃k ∈ {1,2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are both individual rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
Deal                 Utility1   Utility2
1. ({a}, {b})          0          2
2. ({b}, {a})          0          2
3. ({a,b}, ∅)         −2          3
4. (∅, {a,b})          1          0
5. ({a}, {a,b})        0          0
6. ({b}, {a,b})        0          0
7. ({a,b}, {a})       −2          2
8. ({a,b}, {b})       −2          2
9. ({a,b}, {a,b})     −2          0
41
Individual Rational for Both (eliminate any choices that are negative for either)
Of the nine deals, the individual rational ones (neither agent's utility is negative) are:
({a}, {b}), ({b}, {a}), (∅, {a,b}), ({a}, {a,b}), ({b}, {a,b})
42
Pareto Optimal Deals
Of the nine deals, the Pareto optimal ones are:
({a}, {b}), ({b}, {a}), ({a,b}, ∅), (∅, {a,b})
Deal 5, ({a}, {a,b}), is beaten by (∅, {a,b}); deal 3's utilities are (−2, 3), but nothing beats 3 for agent 2.
43
Negotiation Set
Negotiation set = individual rational ∩ Pareto optimal:
({a}, {b}), ({b}, {a}), (∅, {a,b})
Individual rational deals: ({a}, {b}), ({b}, {a}), (∅, {a,b}), ({a}, {a,b}), ({b}, {a,b})
Pareto optimal deals: ({a}, {b}), ({b}, {a}), ({a,b}, ∅), (∅, {a,b})
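Putting the last few slides together, the negotiation set can be computed by brute force. A sketch (Python) over the nine deals of the running example:

```python
from itertools import combinations

# Brute-force the negotiation set: agent 1 originally delivers to a
# (stand-alone cost 1), agent 2 to a and b (stand-alone cost 3).
cost = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}
tasks = frozenset("ab")
subsets = [frozenset(c) for r in range(3) for c in combinations("ab", r)]
deals = [(d1, d2) for d1 in subsets for d2 in subsets if d1 | d2 == tasks]

def utils(deal):
    return (cost[frozenset("a")] - cost[deal[0]],
            cost[frozenset("ab")] - cost[deal[1]])

def dominates(d, e):
    (a1, a2), (b1, b2) = utils(d), utils(e)
    return a1 >= b1 and a2 >= b2 and (a1, a2) != (b1, b2)

ir = [d for d in deals if all(u >= 0 for u in utils(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in deals if d in ir and d in pareto]

for d in ns:
    print(sorted(d[0]), sorted(d[1]), utils(d))  # the three deals above
```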
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?
[Scatter plot: utility for j vs. utility for i]
45
Negotiation Set in Task-oriented Domains
[Figure: deals plotted as utility for agent i vs. utility for agent j. A circle delimits the space of all possible deals; the conflict deal sits at the crossing of the two agents' conflict-deal utilities. The negotiation set (Pareto optimal + individual rational) is the arc of the boundary above and to the right of the conflict deal.]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
• Concession protocol:
  – At each t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that:
    • both deals are from the negotiation set
    • ∀i, t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – each round I propose something less desirable for me
• Negotiation ending:
  – Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t−1)) for both agents (no one concedes)
  – Agreement: ∃j ≠ i such that Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t)) – j likes i's offer at least as much as its own
    • Only A accepts ⇒ agree on δ(B,t)
    • Only B accepts ⇒ agree on δ(A,t)
    • Both accept ⇒ agree on the δ(k,t) with π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
    • Both accept and π(δ(A,t)) = π(δ(B,t)) ⇒ flip a coin (the product is the same but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Pure deals
Mixed deal
47
The Monotonic Concession Protocol ndash One direction move towards middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they may re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values.)
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal
• Metadata: explanation or critique of a deal
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than their own proposal:
  Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (it is a way of telling others what you value)
[Figure: agent 1's best deal and agent 2's best deal at opposite ends of the deal space]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: between agent 1's best deal and agent 2's best deal, each agent asks "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when the difference in utility between its current offer and the conflict deal is small (it has little left to lose).
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) /
         (utility agent i loses by not conceding and causing a conflict)
       = [Utility_i(δ_i) − Utility_i(δ_j)] / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i is agent i's willingness to risk conflict (1 means perfectly willing to risk it).
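The risk formula can be sketched directly (Python). The convention for a worthless own offer (risk = 1 when the denominator is 0) is an assumption, consistent with "1 is perfectly willing to risk":

```python
# risk_i = (U_i(own offer) - U_i(opponent's offer)) / U_i(own offer)
# Assumed convention: risk = 1 when the agent's own offer is worth 0 to it.
def risk(u_own, u_opp):
    return 1.0 if u_own == 0 else (u_own - u_opp) / u_own

# Utilities from the parcel example: agent 1's own offer is worth 1 to it,
# and agent 2's offer is worth 0 to agent 1.
print(risk(1, 0))  # 1.0 -> fully willing to risk conflict
print(risk(2, 0))  # 1.0 -> agent 2 likewise
print(risk(3, 1))  # 0.666... -> more to lose, less willing to risk
```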
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility function
• What an agent sets as its initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: the deal space runs from Ai's best deal to Aj's best deal, with the conflict deal below. Each agent weighs "how much am I willing to risk a conflict?" against the maximum it could gain from agreement and the maximum it can still hope to gain.]
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again in the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium: if one agent is using the strategy, the other can do no better than use it him/herself
• Disadvantages
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a, agent 2 delivers to a and b)
Negotiation set: ({a}, {b}), ({b}, {a}), (∅, {a,b})
First offers: agent 1 proposes (∅, {a,b}); agent 2 proposes ({a}, {b})
Utility of agent 1: Utility1({a}, {b}) = 0; Utility1({b}, {a}) = 0; Utility1(∅, {a,b}) = 1
Utility of agent 2: Utility2({a}, {b}) = 2; Utility2({b}, {a}) = 2; Utility2(∅, {a,b}) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each labeled "he should concede"]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: distribution point with roads of length 7 to city a and to city d; cities a–b, b–c and c–d are connected by roads of length 1]
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: ({a,b,c,d}, ∅), ({a,b,c}, {d}), ({a,b}, {c,d}), ({a}, {b,c,d}), (∅, {a,b,c,d})
Conflict deal: ({a,b,c,d}, {a,b,c,d})
All choices are individual rational, as neither agent can do worse than the conflict deal; e.g., ({a,c}, {b,d}) is dominated by ({a,b}, {c,d}).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal         | Agent 1's utility | Agent 2's utility
 1  | ({a,b,c,d}, ∅)    | 0                 | 10
 2  | ({a,b,c}, {d})    | 1                 | 3
 3  | ({a,b}, {c,d})    | 2                 | 2
 4  | ({a}, {b,c,d})    | 3                 | 1
 5  | (∅, {a,b,c,d})    | 10                | 0
Conflict deal: (0, 0)
Agent 1 concedes 5 → 4 → 3; agent 2 concedes 1 → 2 → 3.
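The concession sequence above (5 → 4 → 3 against 1 → 2 → 3) can be simulated. A simplified sketch (Python): "minimal concession" is approximated by stepping to the next deal in the agent's own preference order, which suffices for this example though it is not the general Zeuthen rule.

```python
# Simplified MCP + Zeuthen simulation over pure deals given as
# (agent 1 utility, agent 2 utility) pairs.
def zeuthen(deals):
    pref1 = sorted(deals, key=lambda d: -d[0])  # agent 1's preference order
    pref2 = sorted(deals, key=lambda d: -d[1])  # agent 2's preference order

    def risk(own, opp, agent):
        u = own[agent]
        return 1.0 if u == 0 else (u - opp[agent]) / u

    i = j = 0
    while True:
        o1, o2 = pref1[i], pref2[j]
        if o2[0] >= o1[0]:   # agent 1 likes agent 2's offer enough
            return o2
        if o1[1] >= o2[1]:   # agent 2 likes agent 1's offer enough
            return o1
        r1, r2 = risk(o1, o2, 0), risk(o2, o1, 1)
        if r1 <= r2:         # the lower-risk agent concedes;
            i += 1           # on equal risk both concede
        if r2 <= r1:
            j += 1

deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]
print(zeuthen(deals))  # (2, 2): risks are equal every round, so both concede
```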
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than, say, the (0, 10) split of another choice with a higher total
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can just do your original task
• With SOD, you may get in each other's way
• Don't accept partially completed plans
The state oriented domain is a bit more powerful than TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as the set of states it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – The goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
  – A common state exists, but perhaps it cannot be reached with the primitive operations of the domain (they could both travel together, but may need to know how to pick the other up)
  – There might be a reachable state which satisfies both, but it may be too expensive – they are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent will get a certain plan
• This is called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
  costi(δ) = p·c(J)i + (1−p)·c(J)k, where k is i's opponent – the role i plays with probability 1−p
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the agent's role in the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: distribution point with city a and city b at distance 1 each; a and b are distance 2 apart]
Cost function: c(∅) = 0, c({a}) = 1, c({b}) = 1, c({a,b}) = 3
Utility for agent 1 (originally a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with probability
• [({a,b}, ∅) : p] means agent 1 does ∅ with probability p and {a,b} with probability 1−p
• What should p be to be fair to both (equal expected utility)?
• (1−p)(−2) + p·1 = expected utility for agent 1
• (1−p)(3) + p·0 = expected utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
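The algebra above can be double-checked with exact fractions (Python):

```python
from fractions import Fraction

# Mixed version of deal 3: agent 1 plays the empty role with probability p
# and {a, b} with probability 1 - p.
def u1(p):
    return (1 - p) * -2 + p * 1   # -2 + 3p

def u2(p):
    return (1 - p) * 3 + p * 0    #  3 - 3p

# Setting -2 + 3p = 3 - 3p gives 6p = 5, i.e. p = 5/6.
p = Fraction(5, 6)
print(p, u1(p), u2(p))  # 5/6 1/2 1/2 -- both expect utility 1/2
```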
74
Try again with other choice in negotiation set
• [({a}, {b}) : p] means agent 1 does a with probability p and b with probability 1−p
• What should p be to be fair to both (equal expected utility)?
• (1−p)·0 + p·0 = expected utility for agent 1
• (1−p)·2 + p·2 = expected utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair? (Agent 1's utility is 0 in either role and agent 2's is 2 in either role, so no lottery over the two roles changes either expectation.)
75
Mixed deal
• All-or-nothing deal (one agent does everything): there is a mixed deal m = [(T1 ∪ T2, ∅) : p] such that π(m) = max over all deals δ of π(δ)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
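A quick check of the suggested exercise (dividing a total utility of 10 and looking for the split with the largest product):

```python
# Dividing a fixed total utility of 10: the product x * (10 - x)
# is maximized at the even split.
best = max(range(11), key=lambda x: x * (10 - x))
print(best, best * (10 - best))  # 5 25
```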
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each agent 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be 4, so the utility to each is 2
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at slot 1, the black block at slot 2, and two gray blocks at slot 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one agent acting alone is 2, while the costs in the joint plan average 4
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the roles (2, 6) – A1 takes the cost-2 role – p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) (it loses utility if it takes cost 6 for benefit 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p ⇒ p = 7/8
• Thus we can take an unfair division and make it fair
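Again the algebra can be checked with exact fractions (Python):

```python
from fractions import Fraction

# Worths 3 (A1) and 6 (A2); the easy role costs 2 and the hard role 6.
# A1 takes the easy role with probability p.
def u1(p):
    return p * 1 + (1 - p) * -3   # 4p - 3

def u2(p):
    return p * 0 + (1 - p) * 4    # 4 - 4p

# 4p - 3 = 4 - 4p  =>  8p = 7  =>  p = 7/8
p = Fraction(7, 8)
print(p, u1(p), u2(p))  # 7/8 1/2 1/2 -- equal expected utility
```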
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, numbers pulled out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, continued
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 26 − 24 = 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
  – E: the set of possible environment states
  – Ag: the set of possible agents
  – J: the set of possible joint plans
  – c: the cost of executing each plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support: a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
  - Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute
  - Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  - Price [5, 100], quality [0-10], support [1-5]
  - Try to find the Pareto optimum
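The weighted-attribute scheme above can be sketched in a few lines of Python; the weights follow the 60/15/25 split on the slide, but the normalized attribute scores are invented for illustration:

```python
# Linear additive utility: score = sum of weight * normalized attribute value.
# Attribute values are assumed normalized to [0, 1], where 1 is best
# (e.g. the lowest acceptable price maps to 1.0).

def weighted_utility(offer, weights):
    return sum(weights[attr] * value for attr, value in offer.items())

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer_a = {"price": 0.8, "quality": 0.5, "support": 0.4}  # cheap, mediocre
offer_b = {"price": 0.5, "quality": 0.9, "support": 0.9}  # pricier, better

print(round(weighted_utility(offer_a, weights), 3))  # 0.655
print(round(weighted_utility(offer_b, weights), 3))  # 0.66
```

With price weighted at 60%, the cheap offer and the high-quality offer score almost the same; a constraint-based scheme would instead simply reject any offer whose attributes fall outside the allowed ranges.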
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
  - Exchange missing information
  - Penalty for lying
• Possible lies:
  - False information
    • Hiding letters
    • Phantom letters
  - Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  - Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with = rather than <:
  - Deliveries in opposite directions: doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces, say electrical power costs where I get above a threshold and have to buy new equipment
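The subadditivity condition can be checked mechanically for a small task set. The sketch below uses invented delivery costs: a tree arrangement where two cities share a trunk road (subadditive), and a threshold case where the combined job needs extra equipment (not subadditive):

```python
from itertools import combinations

def powerset(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_subadditive(tasks, cost):
    # c(X U Y) <= c(X) + c(Y) for all finite subsets X, Y
    subsets = powerset(tasks)
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in subsets for y in subsets)

# Tree arrangement: a shared trunk road makes the combined trip cheaper.
cost = {frozenset(): 0,
        frozenset({"a"}): 3, frozenset({"b"}): 3,
        frozenset({"a", "b"}): 4}
print(is_subadditive({"a", "b"}, cost))  # True

# Threshold effect (extra equipment): combined cost exceeds the sum.
cost[frozenset({"a", "b"})] = 7
print(is_subadditive({"a", "b"}, cost))  # False
```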
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
  - Need to pick something up at a store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  - Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
  - c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); a rule beginning with FP is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task systems: Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as the agent was going that way anyway) but adds 2 to its superset Y (as the agent was going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes: Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
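Because fax-domain costs are independent per connection, modularity can be verified directly. The per-destination connection costs below are invented for illustration:

```python
from itertools import combinations

def powerset(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_modular(tasks, cost):
    # c(X U Y) == c(X) + c(Y) - c(X ∩ Y) for all subsets X, Y
    subsets = powerset(tasks)
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in subsets for y in subsets)

# Fax domain: the cost of a set of faxes is the sum of independent
# per-destination connection costs, which makes the cost function modular.
line_cost = {"a": 2, "b": 5, "c": 1}
tasks = set(line_cost)
cost = {s: sum(line_cost[t] for t in s) for s in powerset(tasks)}
print(is_modular(tasks, cost))  # True
```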
101
3-dimensional table of characterization of relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth: p = ½
• Expected util: the deal (abc) with p = ½ gives 5
• Lie: p = ½ (as the apparent utility is the same)
• Expected util (for 1): the deal (abc) with p = ½ is ½(0) + ½(2) = 1 (as he has to deliver the lie)
103
• FP2: in a Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other person delivers the phantom, you are found out)
• FP3: in a Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 gets | U(1) | Agent 2 gets | U(2) (seems) | U(2) (actual)
a            | 2    | bc           | 4            | 4
b            | 4    | ac           | 2            | 2
bc           | 2    | a            | 4            | 2
ab           | 0    | c            | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example from the next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so the utility for agent 1 on the deal (ab) is ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage split
• Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as p(-1) + (1-p)(6) = p(4) + (1-p)(-3), i.e., 14p = 9
• If he invents task h, p = 11/18, as p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
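The two probabilities come from equating the agents' expected utilities under the all-or-nothing deal; exact fractions make the check easy (the payoff numbers are the ones on the slide):

```python
from fractions import Fraction as F

def balance_p(a_win, a_lose, b_win, b_lose):
    # Solve p*a_win + (1-p)*a_lose == p*b_win + (1-p)*b_lose for p.
    return F(b_lose - a_lose, a_win - a_lose - b_win + b_lose)

# Truthful encounter: agent 1's payoffs are (-1, 6), agent 2's are (4, -3).
p_truth = balance_p(-1, 6, 4, -3)
print(p_truth)                            # 9/14
print(-1 * p_truth + 6 * (1 - p_truth))   # 3/2, i.e. 1.5

# With the decoy letter to h the declared payoffs become (-3, 6) and (4, -5),
# but agent 1's real expected utility still uses its true payoffs (-1, 6).
p_lie = balance_p(-3, 6, 4, -5)
print(p_lie)                              # 11/18
print(-1 * p_lie + 6 * (1 - p_lie))       # 31/18, about 1.72
```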
107
Postmen: return to post office
(Figure panels: Concave; Subadditive (h is the decoy); Phantom)
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: the deal (fae, bcd) with p = ½ gives utility 4 (you save going to two nodes)
• Under the lie, divide as (efd, cab) with probability p: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie, with (abc, def) and probability p:
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6
  p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally): zero utility for that
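The p = 3/4 step can be checked with exact arithmetic. The declared payoffs (4, 0) and (2, 6) are from the slide; the hidden letter makes agent 1's real winning payoff 6 rather than 4:

```python
from fractions import Fraction as F

# Declared split must satisfy p*4 + (1-p)*0 == p*2 + (1-p)*6, i.e. 8p = 6.
p = F(6, 8)
assert 4 * p == 2 * p + 6 * (1 - p)   # declared utilities balance
real_utility = 6 * p + 0 * (1 - p)    # real winning payoff is 6, not 4
print(p, real_utility)                # 3/4 9/2  (i.e. 4.5 > 4, so hiding pays)
```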
110
Modular
111
Conclusion
- In order to use negotiation protocols, it is necessary to know when protocols are appropriate
- TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  - Company
  - Union
  - Mediator
• Tasks:
  - Generation of proposals
  - Generation of counter-proposals based on feedback from the dissenting party
  - Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law: cite previous decisions)
• Process:
  - Retrieve appropriate precedent cases from memory
  - Select the most appropriate case
  - Construct an appropriate solution
  - Evaluate the solution for applicability to the current case
  - Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
• Advantages:
  - Minimizes the need for information exchange
  - Avoids problems by reasoning from past failures (intentional reminding)
  - Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  - Change the importance attached to an issue
  - Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreed parties
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs, 1400hrs, and so on.
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
(Graph: worth on the y-axis from 0 to 100, meeting time on the x-axis at 9, 12, and 16; the first agent's worth peaks at 16:00)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
(Graph: utility over the number of negotiation rounds; Agent i's and Agent j's offers converge to a point of acceptance)
123
Utility Graphs - no agreement
• No agreement
(Graph: utility over the number of negotiation rounds; Agent i's and Agent j's offers never cross, and Agent j finds the offer unacceptable)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  - Positions cannot be justified. Why did the agent pay so much for the car?
  - Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive; doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
where:
  - Database is a (possibly inconsistent) set of logical formulae
  - Sentence is a logical formula known as the conclusion
  - Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts):
  - A attacks B ≡ A →u B or A →r B
  - A defeats B ≡ A →u B or (A →r B and not B →u A)
  - A strongly attacks B ≡ A attacks B and not B →u A
  - A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = →u
Strongly undercuts = su = →u - →u⁻¹
Strongly attacks = sa = (→u ∪ →r) - →u⁻¹
Defeats = d = →u ∪ (→r - →u⁻¹)
Attacks = a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is true unless proven false
131
Admissible Arguments ndash mutually defensible
1. A set of arguments S attacks an argument x if some member y of S attacks x (y → x)
2. Argument x is acceptable (with respect to S) if every attacker of x is attacked by S
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
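These four definitions translate almost line-for-line into code. The attack graph below (a attacks b, d attacks b, b attacks c) is an invented example, not necessarily the one drawn on the slide:

```python
from itertools import combinations

attacks = {("a", "b"), ("d", "b"), ("b", "c")}   # (x, y) means x attacks y
args = {"a", "b", "c", "d"}

def set_attacks(s, x):                 # some member of s attacks x
    return any((y, x) in attacks for y in s)

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):                  # every attacker of x is attacked by s
    return all(set_attacks(s, y) for (y, z) in attacks if z == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [set(c) for r in range(len(args) + 1)
           for c in combinations(sorted(args), r)]
print([sorted(s) for s in subsets if admissible(s)])
# [[], ['a'], ['d'], ['a', 'c'], ['a', 'd'], ['c', 'd'], ['a', 'c', 'd']]
```

Here c is attacked, but any set containing a or d defends it; b itself can never be defended, matching the rule that an argument is "in" only if all its attackers are defeated.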
132
(Figure: attack graph over arguments a, b, c, d)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
27
Multiple Issue negotiation
• There could be hundreds of issues (cost, delivery date, size, quality)
• Some may be inter-related (as size goes down, cost goes down and quality goes up)
• It is not clear what a true concession is (larger may be cheaper, but harder to store, or it spoils before it can be used)
• It may not even be clear what is up for negotiation (I didn't realize not having any test was an option) (on the job: ask for stock options, a bigger office, work from home)
28
How many agents are involved
bull One to one
• One to many (an auction is an example of one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
  - n(n-1)/2 pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents, such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  - Utility represents how much an agent has to gain from the deal (it is always based on the change from the original allocation)
  - Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, the agent is worse off than performing the tasks on its own
• Conflict deal (stay with the status quo) if the agents fail to reach an agreement
  - where no agent agrees to execute tasks other than its own
  - utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple ⟨T, Ag, c⟩ where:
- T is a finite set of all possible tasks
- Ag = {A1, A2, …, An} is a list of participant agents
- c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned:
  D1 ∪ D2 = T1 ∪ T2
  Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Cost_k(δ) = c(Dk) (i.e., the cost to k of deal δ is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries
Several agents have access to a common database and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). You are doing a join as part of another operation, so please save the results for me.
34
Possible Deals
Consider an encounter from the parcel delivery domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) (the conflict deal)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figuring the deals, knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
  - a for agent 1: b | ab for agent 2
  - b for agent 1: a | ab
  - ab for agent 1: ∅ | a | b | ab
  - ∅ for agent 1: ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utility_k(δ) = c(Tk) - Cost_k(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal:
- c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
- Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home, like U-Haul)
(Figure: distribution point at distance 1 from city a and distance 1 from city b)
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
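These utilities all follow from Utility_k(δ) = c(T_k) - c(D_k); a short script reproduces the table (costs and original assignments as on the slide):

```python
# Encounter: T1 = {a}, T2 = {a, b}; costs from the slide.
cost = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}
T1, T2 = frozenset("a"), frozenset("ab")

def utility(original, new_part):
    # stand-alone cost minus the cost of this agent's part of the deal
    return cost[original] - cost[new_part]

for d1, d2 in [("a", "b"), ("b", "a"), ("ab", ""), ("", "ab")]:
    print((d1, d2), utility(T1, frozenset(d1)), utility(T2, frozenset(d2)))
# ('a', 'b') 0 2
# ('b', 'a') 0 2
# ('ab', '') -2 3
# ('', 'ab') 1 0
```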
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
  - δ is at least as good for every agent as δ': for all k ∈ {1, 2}, Utility_k(δ) ≥ Utility_k(δ')
  - δ is better for some agent than δ': for some k ∈ {1, 2}, Utility_k(δ) > Utility_k(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (it is no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = -2
8. Utility1(ab, b) = -2
9. Utility1(ab, ab) = -2

1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individually Rational for Both (eliminate any choices that are negative for either)
All deals:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
All deals:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
(Deals 5-9 are each beaten by some other deal; deal 3 gives (-2, 3), but nothing beats 3 for agent 2, so it is Pareto optimal.)
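Filtering the nine pure deals for individual rationality and Pareto optimality can be automated; the utility pairs below are the ones computed on the earlier slide:

```python
# deal -> (utility to agent 1, utility to agent 2), from the running example
deals = {
    ("a", "b"): (0, 2),   ("b", "a"): (0, 2),   ("ab", ""): (-2, 3),
    ("", "ab"): (1, 0),   ("a", "ab"): (0, 0),  ("b", "ab"): (0, 0),
    ("ab", "a"): (-2, 2), ("ab", "b"): (-2, 2), ("ab", "ab"): (-2, 0),
}

def dominates(u, v):
    # at least as good for both agents and strictly better for one
    return u[0] >= v[0] and u[1] >= v[1] and u != v

rational = {d for d, u in deals.items() if u[0] >= 0 and u[1] >= 0}
pareto = {d for d, u in deals.items()
          if not any(dominates(v, u) for v in deals.values())}
print(sorted(rational & pareto))
# [('', 'ab'), ('a', 'b'), ('b', 'a')]
```

The intersection matches the negotiation set on the slide: (ab, ∅) is Pareto optimal but not individually rational for agent 1, so it drops out.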
43
Negotiation Set
Negotiation set:
(a, b)
(b, a)
(∅, ab)
Individually rational deals:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
Pareto optimal deals:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
44
Negotiation Set illustrated
bull Create a scatter plot of the utility for i over the utility for j
• Only those deals where both utilities are non-negative are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
Utility for i
Utility for j
45
Negotiation Set in Task-oriented Domains
(Figure: deals A, B, C, D, E plotted by utility for agent i against utility for agent j; the circle delimits the space of all possible deals; dashed lines mark the utility of the conflict deal for each agent; the negotiation set (Pareto optimal + individually rational) is the arc beyond the conflict deal)
46
Negotiation Protocol
π(δ): the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol; a one-step protocol
• Concession protocol:
  - At t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that both deals are from the negotiation set and, for each agent i and t > 0, Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t-1)): I propose something less desirable for me
• Negotiation ending:
  - Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t-1)) for both agents
  - Agreement: for j ≠ i, Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
    • Only A: agree on δ(B, t) (A accepts B's proposal)
    • Only B: agree on δ(A, t) (B accepts A's proposal)
    • Both A and B: agree on the δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
    • Both A and B with π(δ(A)) = π(δ(B)): flip a coin (the product is the same, but the deals may not be the same for each agent; flip a coin to decide which deal to use)
(The agreement may be over pure deals or a mixed deal.)
47
The Monotonic Concession Protocol: one direction, move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (Assumes we know what the other agent values)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Metadata: an explanation or critique of the deal
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal they made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  - Symmetrically distributed (no agent plays a special role)
  - Ensures convergence
  - It will not go on indefinitely
• Disadvantages:
  - Agents can run into conflicts
  - Inefficient: no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy (a refinement of the monotonic protocol)
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling the other what you value).
(Figure: agent 1's best deal at one end, agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round (though it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(Figure: agent 1's best deal and agent 2's best deal, each agent asking "how much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
- You have lost most of your expected utility (it is closer to zero)
- In case conflict occurs, you are not much worse off
- So you are more willing to risk conflict
An agent is more willing to risk conflict when the difference between its loss from making a concession and its loss from taking the conflict deal (with respect to its current offer) is small.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk is the willingness to risk conflict (1 means perfectly willing to risk).
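As a quick numeric illustration of the formula (the utilities 8, 2, and 7 are invented):

```python
def risk(util_own_offer, util_their_offer):
    # fraction of current expected utility lost by accepting the other's offer
    return (util_own_offer - util_their_offer) / util_own_offer

print(risk(8, 2))   # 0.75  -- would lose 75% by conceding: willing to risk
print(risk(8, 7))   # 0.125 -- little left to gain: should concede
```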
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as an initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Figure: Ai's best deal and Aj's best deal at opposite ends, with the conflict deal; for each agent: "How much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum it still hopes to gain)
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  - Simple, and reflects the way human negotiations work
  - Stability: it is in Nash equilibrium; if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
  - Computationally expensive: players need to compute the entire negotiation set
  - Communication burden: the negotiation process may involve several steps
59
Parcel Delivery Domain. Recall: agent 1 delivers to a; agent 2 delivers to a and b.
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal

(Diagram: agent 1's best deal versus agent 2's best deal; each says the other should concede.)

Zeuthen does not reach a settlement here: neither agent will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)

(Diagram: a distribution point with a and d at distance 7, and b and c reached via extra legs of distance 1.)

Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10

Negotiation Set: ({a,b,c,d}, ∅), ({a,b,c}, {d}), ({a,b}, {c,d}), ({a}, {b,c,d}), (∅, {a,b,c,d})

Conflict Deal: ({a,b,c,d}, {a,b,c,d})

All these choices are individually rational, as neither agent can do worse than the conflict deal; a deal such as ({a,c}, {b,d}) is excluded because it is dominated by ({a,b}, {c,d}).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No. | Pure Deal          | Agent 1's Utility | Agent 2's Utility
1   | ({a,b,c,d}, ∅)     | 0                 | 10
2   | ({a,b,c}, {d})     | 1                 | 3
3   | ({a,b}, {c,d})     | 2                 | 2
4   | ({a}, {b,c,d})     | 3                 | 1
5   | (∅, {a,b,c,d})     | 10                | 0
    | Conflict deal      | 0                 | 0

Agent 1 concedes from deal 5 toward deal 3; agent 2 concedes from deal 1 toward deal 3.
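The table above can be run through a minimal simulation of the monotonic concession protocol with Zeuthen concessions (a sketch under my own assumptions: one-step concessions, deals listed from agent 1's favourite to agent 2's):

```python
def zeuthen(deals):
    """deals: (u1, u2) pairs ordered from agent 1's favourite to agent 2's.
    Each round the lower-risk agent concedes one step; on a tie both do."""
    def risk(own, other):
        return 1.0 if own == 0 else (own - other) / own

    i, j = 0, len(deals) - 1          # current offers of agent 1 and agent 2
    while i < j:
        r1 = risk(deals[i][0], deals[j][0])   # agent 1's willingness to risk
        r2 = risk(deals[j][1], deals[i][1])   # agent 2's willingness to risk
        if r1 <= r2:
            i += 1                    # agent 1 concedes
        if r2 <= r1:
            j -= 1                    # agent 2 concedes
    return deals[min(i, j)]

# Example 2's table: both risks stay equal, both concede twice, settle on (2, 2)
print(zeuthen([(10, 0), (3, 1), (2, 2), (1, 3), (0, 10)]))   # → (2, 2)
```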
63
What bothers you about the previous agreement?
• They decide to both get (2, 2) utility rather than the expected utility of (0, 10) for another choice
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises about the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: if one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example: slotted blocks world. Blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) means two workers can get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together": just the joining of plans. There is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• C(J)_k is the cost of k's role in joint plan J
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD coordinated plans are never worse, as you can always just do your original task
• With an SOD you may get in each other's way
• Don't accept partially completed plans
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as the set of states it would be happy with
• We look for a state in the intersection of the goals
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
  – The goals may contradict, so there is no mutually acceptable state (e.g. both need a car)
  – A common state exists, but perhaps it cannot be reached with the primitive operations of the domain (we could both travel together, but may need to know how to pick up another person)
  – A reachable state satisfies both, but may be too expensive: unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents
• State 1: costs 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent gets a certain plan
• Called a mixed deal: a deal with a probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
  Cost_i(δ) = p·C(J)_i + (1−p)·C(J)_k
  where k is i's opponent: the role i plays with probability 1−p
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
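The cost and utility of a mixed deal can be written out directly (function and parameter names are mine; `stand_alone_cost` is c(T_i), the agent's cost for doing its original tasks alone):

```python
def mixed_cost(p, own_role_cost, other_role_cost):
    """Cost_i(delta) for delta = (J, p): agent i performs its own role J_i
    with probability p and the opponent's role J_k with probability 1 - p."""
    return p * own_role_cost + (1 - p) * other_role_cost

def mixed_utility(stand_alone_cost, p, own_role_cost, other_role_cost):
    """Utility = cost of achieving the goal alone minus expected deal cost."""
    return stand_alone_cost - mixed_cost(p, own_role_cost, other_role_cost)
```

In the parcel example on the next slide, agent 1 (stand-alone cost 1) under the deal (∅, {a,b}) at p = 5/6 gets mixed_utility(1, 5/6, 0, 3) = 0.5.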
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)

(Diagram: a distribution point with city a and city b each at distance 1, and distance 2 between the cities.)

Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3

Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with a probability
• ⟨(∅, {a,b}), p⟩ means agent 1 does ∅ with probability p and {a,b} with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p·1 = utility for agent 1
• (1−p)(3) + p·0 = utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
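The fair p is the solution of a linear equation, so it can be solved once and for all; a small sketch with exact fractions (the function name and argument order are my assumptions):

```python
from fractions import Fraction

def fair_p(u1_hi, u1_lo, u2_hi, u2_lo):
    """Solve p*u1_hi + (1-p)*u1_lo == p*u2_hi + (1-p)*u2_lo for p.
    Returns None when the two expected utilities can never be equal."""
    a, b, c, d = map(Fraction, (u1_hi, u1_lo, u2_hi, u2_lo))
    denom = (a - b) - (c - d)
    if denom == 0:
        # parallel lines: equal everywhere or nowhere
        return Fraction(1, 2) if b == d else None
    return (d - b) / denom

# the slide's deal: agent 1 gets 1 with prob. p and -2 otherwise; agent 2 gets 0 / 3
print(fair_p(1, -2, 0, 3))   # → 5/6
```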
74
Try again with the other choice in the negotiation set
• ⟨({a}, {b}), p⟩ means agent 1 does a with probability p and b with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p·0 = utility for agent 1
• (1−p)(2) + p·2 = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair? (Agent 1's utility is 0 and agent 2's is 2 regardless of how the roles are assigned.)
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal δ_m = ⟨(T_A ∪ T_B, ∅), p⟩ with p chosen so that δ_m is in the negotiation set and maximizes the agreed measure over all deals
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as that is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias, like switching goals in soccer
77
Examples, Cooperative: each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2 (the goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial: each can pick up at the same time, costing each 2. Win: neither had to move the other block out of the way
• If done by one agent the cost would be four, so the utility to each is 2
78
Examples, Compromise: both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table. Agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one alone is two; doing both averages four
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the (2, 6) division of work p of the time
• Utility for agent 1 = p(1) + (1−p)(−3): it loses utility if it takes on cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p, so p = 7/8
• Thus we can take an unfair division and make it fair
81
Example, conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins: better than both losing. The weightings on the coin needn't be 50–50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example, semi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, pulled out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way on the rest
83
Example, semi-cooperative, continued
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as he helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume the collection of agents has available a set of joint plans; a joint plan is executed by several different agents
• Note: not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation covers a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective
• Example: multi-agent Tile world (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support: a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
  – Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – Price [5, 100], quality [0, 10], support [1, 5]
  – Try to find the Pareto optimum
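The weighted-attribute scheme above can be sketched as follows (the attribute names match the slide; normalizing each attribute score into [0, 1] first is my assumption):

```python
def weighted_utility(offer, weights):
    """Linear additive utility: attribute scores in [0, 1], weights summing to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(offer[name] * w for name, w in weights.items())

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer = {"price": 0.8, "quality": 0.6, "support": 0.4}   # normalized scores
print(weighted_utility(offer, weights))   # ≈ 0.67
```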
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
  – Exchange the missing information
  – Penalty for a lie
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (with = rather than <):
  – Deliveries in opposite directions: doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I go above a threshold and have to buy new equipment
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
  – Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  – Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced; then
  – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). A rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modularity encourages truth telling more than the other properties
97
For subadditive domain
98
Attributes of a task system: Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that a task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)

Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).

• Concavity implies subadditivity
• Modularity implies concavity
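The three cost-function attributes can be checked by brute force over subsets; a sketch with a toy two-task domain (all names and the example costs are mine):

```python
from itertools import chain, combinations

def subsets(tasks):
    return [frozenset(s) for s in chain.from_iterable(
        combinations(sorted(tasks), r) for r in range(len(tasks) + 1))]

def is_subadditive(c, tasks):
    S = subsets(tasks)
    return all(c[x | y] <= c[x] + c[y] for x in S for y in S)

def is_concave(c, tasks):
    S = subsets(tasks)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in S for y in S if x <= y for z in S)

def is_modular(c, tasks):
    S = subsets(tasks)
    return all(c[x | y] == c[x] + c[y] - c[x & y] for x in S for y in S)

F = frozenset
modular_c = {F(): 0, F("a"): 1, F("b"): 2, F("ab"): 3}    # fax-like: independent costs
concave_c = {F(): 0, F("a"): 1, F("b"): 2, F("ab"): 2.5}  # a shared leg saves 0.5
```

As the slide states, modularity implies concavity, which implies subadditivity; the second cost function is concave (hence subadditive) but not modular.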
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries

The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as the costs are independent)
• Modularity implies concavity
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial (given a penalty) because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Example: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2; expected utility of ⟨({a,b,c}, ∅), 1/2⟩ is 5
• Under the lie, p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1) of ⟨({a,b,c}, ∅), 1/2⟩ is then 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)

(Figure: the delivery tree, with edge costs 1, 4, 4, 1.)
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom, you are found out)
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost: hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below (the apparent benefit is no different from the real benefit).
Under truth the utilities are 4, 2, and someone has to get the better deal (under a pure deal), just as in this case. The lie makes no difference.
(We assume some way of deciding who gets the better deal that is fair over time.)

Agent 1's tasks | U(1) | Agent 2's tasks | U(2) (seems) | U(2) (actual)
a               | 2    | bc              | 4            | 4
b               | 4    | ac              | 2            | 2
bc              | 2    | a               | 4            | 2
ab              | 0    | c               | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example (next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 of ⟨({a}, {b}), 1/2⟩ is 1/2(4) + 1/2(2) = 3
• Lie: ({b,c}, {a}) is the logical division, as there is no percentage split
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (it changes the probability: "if you deliver, I make you deliver to h as well")
• Example 2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 ("if I deliver, I don't deliver h")
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, since
  p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, since
  p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
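The FP6 arithmetic checks out with exact fractions (a verification sketch; the helper name is mine, the utility figures are the slide's):

```python
from fractions import Fraction

def expected(p, u_if_deliver_all, u_if_not):
    """Expected utility in an all-or-nothing deal taken with probability p."""
    return p * u_if_deliver_all + (1 - p) * u_if_not

p_truth = Fraction(9, 14)   # solves p(-1) + (1-p)(6) == p(4) + (1-p)(-3)
p_lie = Fraction(11, 18)    # same equation with the decoy letter at h

# both probabilities really balance their respective encounters
assert expected(p_truth, -1, 6) == expected(p_truth, 4, -3)
assert expected(p_lie, -3, 6) == expected(p_lie, 4, -5)

# agent 1's true utility (against its real tasks) in both cases
u_truth = expected(p_truth, -1, 6)   # 21/14 = 1.5
u_lie = expected(p_lie, -1, 6)       # 31/18 ≈ 1.72
```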
107
Postmen: return to the post office

(Figures: a concave example; a subadditive example where h is the decoy; a phantom example.)
108
Non incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (you think I have fewer tasks, so an increased load will seem to cost more than it really does)
• Example 3 (next slide): A1 hides his letter to node b
• ({e}, {b}): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie)
• ({b}, {e}): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Example 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(⟨({f,a,e}, {b,c,d}), 1/2⟩) = 4 (each saves going to two nodes)
• Under the lie, dividing as ⟨({e,f}, {d,c,a,b}), p⟩, you always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced
• Try again under the lie with ⟨({a,b}, {c,d,e,f}), p⟩:
  p(4) + (1−p)(0) = p(2) + (1−p)(6)
  4p = −4p + 6 ⇒ p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I am assigned {c,d,e,f} (1/4 of the time) I still have to deliver to node a (after completing my agreed deliveries), so I end up going to 5 places, which is what I was assigned originally: zero utility there
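Again the numbers can be verified mechanically (a check sketch; the variable names are mine, the utility figures are the slide's):

```python
from fractions import Fraction

p = Fraction(3, 4)
# p balances the *apparent* utilities under the hide lie:
#   agent 1: p*4 + (1-p)*0      agent 2: p*2 + (1-p)*6
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6

# agent 1's actual utility, counting the hidden letter to a, is 6 when it
# wins the lottery and 0 when it loses
actual = p * 6 + (1 - p) * 0   # 9/2 = 4.5 > 4, so hiding paid off
```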
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: a negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER: a case study
• A program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law: cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).

(Figure: worth function for the first agent, rising from 0 at 9:00 to 100 at 16:00.)

Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement

(Figure: utility versus number of negotiation rounds; Agent i's and Agent j's utility curves converge at the point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable

(Figure: utility versus number of negotiation rounds; Agent i's and Agent j's utility curves never meet.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified. Why did the agent pay so much for the car?
  – Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database, and
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore cheese is good for you

Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B = A →u B or A →r B
  – A defeats B = A →u B or (A →r B and not B →u A)
  – A strongly attacks B = A →a B and not B →u A
  – A strongly undercuts B = A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of an argument (rather than the internals of individual arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is true unless proven false
131
Admissible Arguments: mutually defensible
1. argument x is attacked by a set S if some member y of S attacks x (y → x)
2. argument x is acceptable (with respect to S) if every attacker of x is itself attacked by S
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
(Figure: an attack graph over arguments a, b, c, and d.)

Which sets of arguments can be true? c is always attacked; d is always acceptable.
An Example Abstract Argument System
28
How many agents are involved?
• One to one
• One to many (an auction is an example: one seller and many buyers)
• Many to many (could be divided into buyers and sellers, or all could be identical in role)
  – n(n−1)/2 pairs
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents, such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
  – Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
  – Since an agent can achieve the goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, the agent is worse off than performing its tasks on its own
• Conflict deal (stay with the status quo) if the agents fail to reach an agreement
  – No agent agrees to execute tasks other than its own
  – Utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple <T, Ag, c> where
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment; D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned: D1 ∪ D2 = T1 ∪ T2.
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal).
33
Examples of TOD
• Parcel Delivery: Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries: Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab) – the conflict deal
35
Figure the deals knowing the union must be ab:
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
• a for agent 1 → b or ab for agent 2
• b for agent 1 → a or ab
• ab for agent 1 → ∅, a, b, or ab
• ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
[Figure: a distribution point with city a and city b, each at distance 1 from the point and distance 2 from each other.]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
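The utilities above follow directly from Utilityk(δ) = c(Tk) − c(Dk). A minimal sketch (the code and names are mine, not from the slides):

```python
# Parcel-delivery encounter: agent 1 originally delivers to a; agent 2 to a and b.
# Costs from the slide: c(empty)=0, c(a)=1, c(b)=1, c(ab)=3.
cost = {frozenset(): 0, frozenset('a'): 1, frozenset('b'): 1, frozenset('ab'): 3}

def utility(own_tasks, deal_part):
    """Utility_k(deal) = c(T_k) - c(D_k): stand-alone cost minus cost of k's part."""
    return cost[frozenset(own_tasks)] - cost[frozenset(deal_part)]

print(utility('a', 'a'))    # agent 1 under deal (a, b): 0
print(utility('a', ''))     # agent 1 under deal (empty, ab): 1
print(utility('ab', ''))    # agent 2 under deal (ab, empty): 3
```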
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1, 2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates δ′ if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (no worse than what you have already).
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y).
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from previous slide)
Agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
Agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individually Rational for Both (eliminate any choices that are negative for either)
All nine deals:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
The rest are beaten by an (a, b)-style deal. Deal 3, (ab, ∅), is (−2, 3), but nothing beats 3 for agent 2, so it is Pareto optimal.
43
Negotiation Set
Negotiation set: (a, b), (b, a), (∅, ab)
Individually rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
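The two filters above (individually rational, then Pareto optimal) can be applied mechanically to all nine deals. An illustrative sketch, not code from the slides:

```python
from itertools import product

# Costs and original assignment from the running parcel example.
cost = {frozenset(): 0, frozenset('a'): 1, frozenset('b'): 1, frozenset('ab'): 3}
stand_alone = (cost[frozenset('a')], cost[frozenset('ab')])  # agent 1 has {a}, agent 2 has {a,b}

# Enumerate pure deals: every pair of task sets whose union covers {a, b}.
deals = [(d1, d2) for d1, d2 in product(['', 'a', 'b', 'ab'], repeat=2)
         if set(d1) | set(d2) == {'a', 'b'}]

def utils(deal):
    return tuple(stand_alone[k] - cost[frozenset(deal[k])] for k in (0, 1))

def dominates(u, v):
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

rational = [d for d in deals if all(u >= 0 for u in utils(d))]
negotiation_set = [d for d in rational
                   if not any(dominates(utils(e), utils(d)) for e in deals)]
print(negotiation_set)   # the three deals listed on the slide
```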
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j
• Only those where both are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
[Axes: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: deals A–E plotted by utility for agent i vs. utility for agent j. The circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict-deal utilities. The negotiation set (Pareto optimal + individually rational) is the boundary arc that dominates the conflict deal.]
46
Negotiation Protocol
Π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: one-step protocol
– Concession protocol
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t) such that:
– both deals are from the negotiation set
– for each agent i and t > 0, Utilityi(δ(i, t)) ≤ Utilityi(δ(i, t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i, t)) = Utilityi(δ(i, t−1)) for both agents
– Agreement: for some j ≠ i, Utilityj(δ(i, t)) ≥ Utilityj(δ(j, t))
• Only A satisfied ⇒ agree on δ(B, t) (A agrees with B's proposal)
• Only B satisfied ⇒ agree on δ(A, t) (B agrees with A's proposal)
• Both A and B ⇒ agree on the δ(k, t) such that Π(δ(k)) = max{Π(δ(A)), Π(δ(B))}
• Both A and B, and Π(δ(A)) = Π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
[Slide margin labels: pure deals; mixed deal]
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal.
• Meta data: explanation or critique of a deal
48
Condition to Consent an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (It is a way of telling others what you value.)
[Figure: agent 1's best deal at one end, agent 2's best deal at the other.]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal at opposite ends; each asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when there is little difference between its loss from making a concession and its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk is the willingness to risk conflict (1 means perfectly willing to risk conflict).
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know what the other's utility is.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
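The risk formula can be sketched as a small helper. Treating risk as 1 when the own offer's utility is 0 is an assumption (it keeps the ratio defined at the conflict-deal boundary); the rest follows the definition above:

```python
def risk(own_util_own_offer, own_util_other_offer):
    """Zeuthen risk: fraction of current gain lost by accepting the other's offer.
    risk_i = (U_i(d_i) - U_i(d_j)) / U_i(d_i); defined as 1 when U_i(d_i) == 0."""
    if own_util_own_offer == 0:
        return 1.0
    return (own_util_own_offer - own_util_other_offer) / own_util_own_offer

# In the parcel example: agent 1's offer gives it 1, agent 2's offer gives it 0;
# agent 2's offer gives it 2, agent 1's offer gives it 0. Both risks are 1.
print(risk(1, 0), risk(2, 0))   # 1.0 1.0
```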
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: Ai's best deal and Aj's best deal at opposite ends, with the conflict deal below. For each agent: "How much am I willing to risk a conflict?" – the maximum to gain from agreement versus the maximum it still hopes to gain.]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you need to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium – if one agent is using the strategy, then the other can do no better than using it too
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivered to a; agent 2 delivered to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal at opposite ends, each labeled "he should concede".]
Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain: Example 2 (don't return to the distribution point)
[Figure: cities a, b, c, d in a line with unit distances between neighbors; the distribution point is at distance 7 from both a and d.]
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither can do worse; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain: Example 2 (Zeuthen works here; both concede on equal risk)
No. | Pure deal   | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)   | 0                 | 10
2   | (abc, d)    | 1                 | 3
3   | (ab, cd)    | 2                 | 2
4   | (a, bcd)    | 3                 | 1
5   | (∅, abcd)   | 10                | 0
Conflict deal: 0, 0
[Figure: agent 1 concedes down from deal 5 (5, 4, 3); agent 2 concedes up from deal 1 (1, 2); they meet at deal 3.]
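The table can be recomputed from the cost function on the previous slide; each agent's stand-alone cost is c(abcd) = 10. A sketch (the code is mine, not from the slides):

```python
# Costs from example 2 (cities a, b, c, d; no return to the distribution point).
cost = {'': 0, 'a': 7, 'd': 7, 'b': 8, 'c': 8, 'ab': 8, 'cd': 8,
        'bc': 9, 'abc': 9, 'bcd': 9, 'ad': 10, 'abd': 10, 'acd': 10, 'abcd': 10}
stand_alone = cost['abcd']  # conflict deal: each agent delivers everywhere itself

deals = [('abcd', ''), ('abc', 'd'), ('ab', 'cd'), ('a', 'bcd'), ('', 'abcd')]
rows = [(d1, d2, stand_alone - cost[d1], stand_alone - cost[d2]) for d1, d2 in deals]
for row in rows:
    print(row)   # e.g. ('ab', 'cd', 2, 2)
```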
63
What bothers you about the previous agreement?
• The agents decide to both get (2, 2) utility rather than, say, the (0, 10) of another choice, which has a higher total.
• Is there a solution?
• Fair versus higher global utility.
• Restrictions of this method (no promises for the future, and no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk.
• c(J)k is the cost of k's role in joint plan J.
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD, coordinated plans are never worse, as you can just do your original task.
• With SOD, you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (they will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
– A common state can be found, but perhaps it cannot be reached with the primitive operations of the domain (they could both travel together, but may need to know how to pick up another person)
– There might be a reachable state which satisfies both, but it may be too expensive – they are unwilling to expend the effort (i.e., we could save a bit if we carpooled, but it is too complicated for so little gain)
69
What if the choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent would get a certain plan.
• This is called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then costi(δ) = p·c(J)i + (1−p)·c(J)k, where k is i's opponent – the role i plays with probability (1−p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Figure: a distribution point with city a and city b, each at distance 1 from the point and distance 2 from each other.]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (∅, ab)p means agent 1 does ∅ with probability p and ab with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
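Since both expected utilities are linear in p, equating them gives one equation in one unknown. A small helper (my own sketch, not from the slides) that also exposes when no fair p exists:

```python
from fractions import Fraction

def fair_p(u1_at_p0, u1_at_p1, u2_at_p0, u2_at_p1):
    """Solve for p so both expected utilities match, where each agent's
    expected utility is (1-p)*u_at_p0 + p*u_at_p1. Returns None if the
    linear equation has no solution (parallel utility lines)."""
    # (1-p)a + p*b = (1-p)c + p*d  ->  p[(b-a) - (d-c)] = c - a
    denom = (u1_at_p1 - u1_at_p0) - (u2_at_p1 - u2_at_p0)
    if denom == 0:
        return None
    return Fraction(u2_at_p0 - u1_at_p0, denom)

print(fair_p(-2, 1, 3, 0))   # deal (empty, ab): p = 5/6
print(fair_p(0, 0, 2, 2))    # deal (a, b): utilities are constant -> None
```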
74
Try again with the other choice in the negotiation set
• (a, b)p means agent 1 does a with probability p and b with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): there is a mixed deal of this form, m = [(TA ∪ TB, ∅); p], whose utility product is maximal over the negotiation set: Π(m) = max over d in NS of Π(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both products are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at slot 1, black at slot 2, and two gray blocks at slot 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) split (agent 1 takes the cost-2 role) p of the time:
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it spends 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: Conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins; it's better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: Semi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers pulled out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example: Semi-cooperative, continued
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4
• If they could have both been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing" – but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple <E, Ag, J, c>:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle) – worth isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
– price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
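The weighted scheme can be sketched as follows. Normalizing each attribute onto [0, 1] using the constraint ranges, and treating lower price as better, are my assumptions; the slide only gives the weights and ranges:

```python
def score(price, quality, support, weights=(0.60, 0.15, 0.25)):
    """Weighted-attribute utility: each attribute normalized to [0, 1] first."""
    price_score = 1 - (price - 5) / (100 - 5)   # price in [5, 100]; cheaper is better
    quality_score = quality / 10                # quality in [0, 10]
    support_score = (support - 1) / (5 - 1)     # support in [1, 5]
    w_price, w_quality, w_support = weights
    return w_price * price_score + w_quality * quality_score + w_support * support_score

print(round(score(price=50, quality=8, support=4), 3))
```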
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditivity: delivering to one city saves distance to the other (in a tree arrangement)
• Example of a subadditive TOD with equality (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I go above a threshold and have to buy new equipment.
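The definition can be checked mechanically over a finite cost table. A sketch (the two example tables are illustrative, mirroring the opposite-directions and above-threshold cases above):

```python
from itertools import combinations

def is_subadditive(cost):
    """Check c(X | Y) <= c(X) + c(Y) for every pair of task sets in the table."""
    return all(cost[X | Y] <= cost[X] + cost[Y]
               for X, Y in combinations(cost, 2) if (X | Y) in cost)

f = frozenset
opposite = {f(): 0, f('a'): 1, f('b'): 1, f('ab'): 2}   # equality case: subadditive
threshold = {f(): 0, f('a'): 1, f('b'): 1, f('ab'): 3}  # union costs more than the sum
print(is_subadditive(opposite), is_subadditive(threshold))   # True False
```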
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at the store. (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
• Need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive Compatible Mechanisms
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth-telling more than the others.
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that a set of tasks Z adds to a set Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries:
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain:
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterization of relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2.
• Expected utility of (abc, ∅) with p = 1/2 is 5.
• Under the lie: p = 1/2 (as the apparent utility is the same).
• Expected utility (for agent 1) of (abc, ∅) with p = 1/2 is 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter).
[Figure: postmen network with edge costs 1, 4, 4, 1.]
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out).
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)
1's tasks | U(1) | 2's tasks | U(2) | Seeming U(2) (actual)
a         | 2    | bc        | 4    | 4
b         | 4    | ac        | 2    | 2
bc        | 2    | a         | 4    | 2
ab        | 0    | c         | 6    | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial.
• Example from the next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of (a, b) with p = 1/2 is 1/2(4) + 1/2(2) = 3.
• Lie: (bc, a) is the logical division, so no probability is needed.
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h too).
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72. (If I deliver, I don't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as:
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as:
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
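Both indifference probabilities come from equating the two agents' expected utilities in the all-or-nothing lottery; a sketch with the slide's numbers (the helper function is mine):

```python
from fractions import Fraction

def all_or_nothing_p(u1_if_1_delivers, u1_if_2_delivers, u2_if_1_delivers, u2_if_2_delivers):
    """p = probability agent 1 delivers everything; equalize expected utilities:
    p*a1 + (1-p)*b1 = p*a2 + (1-p)*b2."""
    a1, b1, a2, b2 = u1_if_1_delivers, u1_if_2_delivers, u2_if_1_delivers, u2_if_2_delivers
    return Fraction(b1 - b2, (a2 - b2) - (a1 - b1))

truth = all_or_nothing_p(-1, 6, 4, -3)   # 9/14
lie = all_or_nothing_p(-3, 6, 4, -5)     # 11/18
u_truth = truth * -1 + (1 - truth) * 6   # 21/14 = 1.5
u_lie = lie * -1 + (1 - lie) * 6         # 31/18 ~ 1.72: the decoy lie helps
print(truth, lie, float(u_truth), float(u_lie))
```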
107
Postmen – return to the post office
[Figure: example networks – one concave, one subadditive (where h is the decoy), and the phantom-letter case.]
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, "hide" lies can be beneficial (as you think I have fewer tasks, so an increased load appears to cost more than it really does).
• Ex3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial.
• Ex4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util(fae, bcd) with p = 1/2 is 4 (each saves going to two nodes).
• Under the lie, divide as (efd, cab)p: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with (abc, def)p:
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually (3/4)(6) + (1/4)(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally): zero utility for that.
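The p = 3/4 step and the actual utility can be checked with exact arithmetic (the 4/0 and 2/6 apparent utilities are the slide's numbers):

```python
from fractions import Fraction

# Apparent utilities under the hide lie: agent 1 gets 4 (win) or 0 (lose);
# agent 2 gets 2 (when 1 wins) or 6 (when 1 loses). Equalize: 4p = 2p + 6(1-p).
p = Fraction(6, 8)                 # 4p = -4p + 6  ->  p = 3/4
actual_u1 = p * 6 + (1 - p) * 0    # real utility once the hidden letter is counted
print(p, float(actual_u1))         # 3/4 4.5
```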
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when a protocol is appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repair for past failure is reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30 more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
[Worth function for first agent: worth rises from 0 to 100 over meeting times from 9:00 through 12:00 to 16:00]
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility vs. number of negotiations (time) for Agent i and Agent j; the two curves converge at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility vs. number of negotiations (time) for Agent i and Agent j; the curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified – why did the agent pay so much for the car?
– Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A and accept A implies B, then you must accept that B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – doesn't this seem reasonable?
126
Logic-Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds) where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
(each derived relation is a subset of the one below it: su ⊆ sa ⊆ d ⊆ a)
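These relations are just set operations on pairs of arguments. A sketch with a made-up undercut/rebut relation (the example relations u and r are hypothetical, chosen only to exercise the definitions):

```python
def inverse(rel):
    return {(b, a) for (a, b) in rel}

def derived(u, r):
    """Build the derived attack relations from undercut (u) and rebut (r)."""
    return {
        'attacks':            u | r,
        'defeats':            u | (r - inverse(u)),
        'strongly_attacks':   (u | r) - inverse(u),
        'strongly_undercuts': u - inverse(u),
    }

# Hypothetical relations over arguments 1..4 (for illustration only).
u = {(1, 2), (2, 1), (3, 4)}
r = {(1, 3), (4, 3)}
rel = derived(u, r)

# The subset hierarchy su ⊆ sa ⊆ d ⊆ a holds (here checked on the example).
hierarchy = (rel['strongly_undercuts'] <= rel['strongly_attacks']
             <= rel['defeats'] <= rel['attacks'])
```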
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set of arguments if some member y of that set attacks x (y → x)
2. argument x is acceptable with respect to a set if every attacker of x is attacked by a member of the set
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
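The slide's attack arrows do not survive in text form, so the sketch below assumes a hypothetical graph consistent with the notes (a and b attack each other; the unattacked d attacks c) and enumerates the admissible sets:

```python
from itertools import chain, combinations

args = ['a', 'b', 'c', 'd']
attacks = {('a', 'b'), ('b', 'a'), ('d', 'c')}   # hypothetical graph

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # every attacker of x is itself attacked by some member of s
    attackers = [y for (y, z) in attacks if z == x]
    return all(any((z, y) in attacks for z in s) for y in attackers)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

powerset = chain.from_iterable(combinations(args, n) for n in range(len(args) + 1))
admissible_sets = [set(s) for s in powerset if admissible(set(s))]
# c (attacked by the unattacked d) appears in no admissible set; d does
```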
133
An Example Abstract Argument System
29
Negotiation Domains: Task-oriented
• "Domains in which an agent's activity can be defined in terms of a set of tasks that it has to achieve" (Rosenschein & Zlotkin, 1994)
• An agent can carry out the tasks without interference (or help) from other agents – such as "who will deliver the mail"
• All resources are available to the agent
• Tasks are redistributed for the benefit of all agents
30
Task-oriented Domain Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always based on change from the original allocation)
– Since an agent can achieve its goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal
• If utility < 0, the agent is worse off than performing its tasks on its own
• Conflict deal (stay with the status quo): if agents fail to reach an agreement
– each agent executes only its own tasks
– utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple <T, Ag, c> where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: ℘(T) → ℝ⁺ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, meaning more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment; D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of the tasks among the agents, (D1, D2), such that all tasks are reassigned: D1 ∪ D2 = T1 ∪ T2. Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e. the cost to k of the deal is the cost of Dk, k's part of the deal).
33
Examples of TOD
• Parcel Delivery: several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries: several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figuring the deals, knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
– a for agent 1: b | ab (for agent 2)
– b for agent 1: a | ab
– ab for agent 1: ∅ | a | ab | b
– ∅ for agent 1: ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
[Figure: distribution point with roads of length 1 to city a and 1 to city b; the road between a and b has length 2]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally delivers a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally delivers ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
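These utilities follow mechanically from Utilityk(δ) = c(Tk) − c(Dk). A minimal sketch:

```python
# Cost function for the two-city parcel domain (no return trip).
cost = {frozenset(): 0, frozenset('a'): 1, frozenset('b'): 1, frozenset('ab'): 3}

def utility(original_tasks, deal_part):
    """Utility_k(deal) = c(T_k) - c(D_k)."""
    return cost[frozenset(original_tasks)] - cost[frozenset(deal_part)]

# Agent 1 originally delivers {a}; agent 2 delivers {a, b}.
print(utility('a', ''), utility('a', 'ab'))    # 1 -2
print(utility('ab', ''), utility('ab', 'ab'))  # 3 0
```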
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1,2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set Space of Negotiation
• A deal δ is called individually rational if δ weakly dominates the conflict deal (it is no worse than what you already have)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (no deal is better for one agent without disadvantaging the other)
• The set of all deals that are individually rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
Utility for agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = -2
8. Utility1(ab, b) = -2
9. Utility1(ab, ab) = -2
Utility for agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individually Rational for Both (eliminate any choices that are negative for either)
All nine deals: 1. (a, b)  2. (b, a)  3. (ab, ∅)  4. (∅, ab)  5. (a, ab)  6. (b, ab)  7. (ab, a)  8. (ab, b)  9. (ab, ab)
Individually rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
All nine deals: 1. (a, b)  2. (b, a)  3. (ab, ∅)  4. (∅, ab)  5. (a, ab)  6. (b, ab)  7. (ab, a)  8. (ab, b)  9. (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
The (ab, ∅) deal is (-2, 3), but nothing beats 3 for agent 2, so it is Pareto optimal.
43
Negotiation Set
Individually rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
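The whole pipeline (enumerate deals, filter individually rational, filter Pareto optimal, intersect) can be sketched for this example:

```python
from itertools import product

cost = {frozenset(): 0, frozenset('a'): 1, frozenset('b'): 1, frozenset('ab'): 3}
T = ('a', 'ab')   # original allocation; (T1, T2) is the conflict deal

def utilities(deal):
    return tuple(cost[frozenset(T[k])] - cost[frozenset(deal[k])] for k in (0, 1))

# Enumerate pure deals whose union covers all tasks {a, b}.
subsets = ['', 'a', 'b', 'ab']
deals = [d for d in product(subsets, repeat=2) if set(d[0]) | set(d[1]) == {'a', 'b'}]

conflict = utilities(T)   # (0, 0)

def dominates(u, v):
    return all(x >= y for x, y in zip(u, v)) and any(x > y for x, y in zip(u, v))

rational = [d for d in deals
            if all(x >= y for x, y in zip(utilities(d), conflict))]
pareto = [d for d in deals
          if not any(dominates(utilities(e), utilities(d)) for e in deals)]
negotiation_set = [d for d in rational if d in pareto]
# -> the three deals (a, b), (b, a) and (∅, ab)
```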
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i (one axis) against the utility for j (the other axis)
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?
45
Negotiation Set in Task-oriented Domains
[Figure: deals A–E plotted by utility for agent i vs. utility for agent j; the circle delimits the space of all possible deals; the conflict deal sits where the two agents' conflict-utility lines cross; the negotiation set (Pareto optimal + individually rational) is the boundary arc up and to the right of the conflict deal]
46
Negotiation Protocol: π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: one-step protocol
– concession protocol
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t) such that:
– both deals are from the negotiation set
– ∀i, t > 0: Utilityi(δ(i, t)) ≤ Utilityi(δ(i, t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i, t)) = Utilityi(δ(i, t−1)) (no one concedes)
– Agreement: ∃j ≠ i, Utilityj(δ(i, t)) ≥ Utilityj(δ(j, t))
• Only A ⇒ agree on δ(B, t): A agrees with B's proposal
• Only B ⇒ agree on δ(A, t): B agrees with A's proposal
• Both A, B ⇒ agree on δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A, B and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same deal)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta-data: explanation or critique of a deal
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than their own proposal:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (it is a way of telling others what you value)
[Figure: agent 1's best deal at one end of the negotiation set, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: each agent, at its end of the negotiation set, asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when the difference between its loss from making a concession and its loss from taking the conflict deal (with respect to its current offer) is small.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
riski = [Utilityi(δi) − Utilityi(δj)] / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j, respectively.
risk is willingness to risk conflict (1 means perfectly willing to risk).
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
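A sketch of the risk computation (using the common convention that risk is 1 when the agent's own offer is worth 0 to it):

```python
def risk(u_own, u_other):
    """Willingness to risk conflict: fraction of current gain lost by conceding."""
    return 1.0 if u_own == 0 else (u_own - u_other) / u_own

# Halfway example: my offer is worth 2 to me, yours is worth 1 to me.
print(risk(2, 1))   # 0.5

# Slide-59 situation: each agent's offer is worth 0 to the other,
# so both risks are 1 (both fully willing to risk conflict).
print(risk(1, 0), risk(2, 0))
```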
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: a scale from Ai's best deal to Aj's best deal, with the conflict deal below; annotations: "How much am I willing to risk a conflict?", "Maximum to gain from agreement", "Maximum still hoped to gain"]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivered to a; agent 2 delivered to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each annotated "He should concede"]
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to distribution point)
[Figure: distribution point with roads of length 7 to a and 7 to d; a–b, b–c, and c–d each have length 1]
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse; (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No.  Pure Deal     Agent 1's Utility  Agent 2's Utility
1    (abcd, ∅)     0                  10
2    (abc, d)      1                  3
3    (ab, cd)      2                  2
4    (a, bcd)      3                  1
5    (∅, abcd)     10                 0
Conflict deal      0                  0
Agent 1 concedes 5 → 4 → 3 while agent 2 concedes 1 → 2 → 3; they meet at deal 3.
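The concession path can be simulated. This sketch assumes each concession moves one step along the negotiation set toward the opponent, with both agents conceding on equal risk:

```python
# Negotiation set for Example 2, ordered from agent 1's best (index 0)
# to agent 2's best (index 4); (u1, u2) pairs from the table above.
deals = [(10, 0), (3, 1), (2, 2), (1, 3), (0, 10)]

def risk(own, other, agent):
    u_own, u_other = own[agent], other[agent]
    return 1.0 if u_own == 0 else (u_own - u_other) / u_own

i, j = 0, 4   # current offers of agent 1 and agent 2
# Loop until one agent finds the other's offer at least as good as its own.
while deals[j][0] < deals[i][0] and deals[i][1] < deals[j][1]:
    r1 = risk(deals[i], deals[j], 0)
    r2 = risk(deals[j], deals[i], 1)
    if r1 <= r2:
        i += 1   # agent 1 concedes one step (on equal risk, both concede)
    if r2 <= r1:
        j -= 1   # agent 2 concedes one step
# both converge on the (2, 2) deal
```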
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility, rather than the (0, 10) of another choice
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk
• C(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in future negotiations
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g. both need the car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but we may need to know how to pick up another person)
– There might be a reachable state which satisfies both, but it may be too expensive – the agents are unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent gets a certain plan
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p·C(J)i + (1−p)·C(J)k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the agent's role in the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: distribution point with roads of length 1 to city a and 1 to city b; the road between a and b has length 2]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally delivers a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally delivers ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (∅, ab)p means agent 1 does ∅ with probability p, and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• p(1) + (1−p)(−2) = utility for agent 1
• p(0) + (1−p)(3) = utility for agent 2
• p(1) + (1−p)(−2) = p(0) + (1−p)(3)
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
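The same calculation as a check:

```python
from fractions import Fraction

def expected(p, u_win, u_lose):
    return p * u_win + (1 - p) * u_lose

# Deal (∅, ab) with probability p: agent 1 does nothing w.p. p (utility 1)
# and everything w.p. 1-p (utility -2); agent 2's branches are 0 and 3.
p = Fraction(5, 6)   # from 3p - 2 = 3 - 3p
u1 = expected(p, 1, -2)
u2 = expected(p, 0, 3)
print(u1, u2)   # 1/2 1/2 – both agents expect the same utility
```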
74
Try again with other choice in negotiation set
• (a, b)p means agent 1 does a with probability p, and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• p(0) + (1−p)(0) = utility for agent 1
• p(2) + (1−p)(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): there is a mixed deal δm = [(TA ∪ TB, ∅); p] such that π(δm) = max over δ ∈ NS of π(δ)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
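A quick check of why maximizing the product favors fair splits:

```python
# Zero-sum split of 10 units of utility: the product u1*u2 is maximized
# at the equal (5, 5) split, which is why the product-maximizing
# mechanism favors fair divisions.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)   # (5, 5)
```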
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost of the easier role is two; averaged over both roles, the cost is four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the (2, 6) split of the costs p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes the cost-6 role for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
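Checking the p = 7/8 figure:

```python
from fractions import Fraction

# A1 values the goal at 3, A2 at 6; the two roles cost 2 (easy) and 6 (hard).
# With probability p, A1 takes the easy role (utility 3-2 = 1), else the
# hard one (3-6 = -3); A2 gets the complementary role (6-6 = 0, else 6-2 = 4).
p = Fraction(7, 8)   # from 4p - 3 = 4 - 4p
u1 = p * 1 + (1 - p) * (-3)
u2 = p * 0 + (1 - p) * 4
print(u1, u2)   # 1/2 1/2 – the unfair division is made fair in expectation
```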
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone): efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (numbers pulled out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as it helped with the swap)
• So with probability ½ each: ½(12) + ½(-4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
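A quick check of the expected-utility arithmetic above, using the slide's illustrative numbers:

```python
# Semi-cooperative gamble: share the cheap joint swap (cost 4), then flip a
# fair coin for whose remaining goal gets done.
win_util = 26 - 4 - 10   # winner: stand-alone cost saved, minus swap share and own rest
lose_util = -4           # loser: paid its share of the swap, goal unmet
coin = 0.5 * win_util + 0.5 * lose_util   # expected utility of the gamble

both_satisfied = 26 - 24  # if instead each pays 24 to satisfy both goals
print(coin, both_satisfied)  # 4.0 2
```

The gamble's expected utility (4) is double the guaranteed outcome (2), which is the doubling the slide points out.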
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing" – but how close you got to the goal.
85
Worth-oriented Domain: Definition
• Can be defined as a tuple <E, Ag, J, c>
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like the airport shuttle) – worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality: Pareto optimal. You need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price: [5,100], quality: [0,10], support: [1,5]
– Try to find the Pareto optimum
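The weighted-attribute scheme above can be sketched as follows. The weights are the slide's; the attribute scores and offers are invented for illustration, and scores are assumed normalized so that higher is better (a low price scores high):

```python
# Weighted-sum multi-attribute utility: Utility = price*60% + quality*15% + support*25%
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores: dict) -> float:
    """Weighted sum of normalized attribute scores in [0, 1]."""
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

# Two hypothetical offers: A is cheap, B is higher quality.
offer_a = {"price": 0.8, "quality": 0.5, "support": 0.6}
offer_b = {"price": 0.6, "quality": 0.9, "support": 0.7}
print(round(utility(offer_a), 3), round(utility(offer_b), 3))  # 0.705 0.67
```

With a price weight of 60%, the cheaper offer wins here; changing the weights changes the tradeoff the agent expresses.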
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – the union adds to a sub-cost:
– for finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of subadditivity: delivering to one city saves distance to the other (in a tree arrangement).
• Example of a subadditive TOD with = rather than <: deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
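The subadditivity condition can be verified by brute force for a small task set. A sketch (the cost values are hypothetical, standing in for deliveries on a tree where a and b share part of the road):

```python
# Check c(X ∪ Y) <= c(X) + c(Y) for all subsets X, Y of the task set.
from itertools import chain, combinations

def powerset(tasks):
    return [frozenset(s) for s in chain.from_iterable(
        combinations(tasks, r) for r in range(len(tasks) + 1))]

def is_subadditive(tasks, c):
    subsets = powerset(tasks)
    return all(c[x | y] <= c[x] + c[y] for x in subsets for y in subsets)

f = frozenset
# Hypothetical tree costs: delivering to both a and b shares a road segment.
c = {f(): 0, f("a"): 1, f("b"): 1, f("ab"): 1.5}
print(is_subadditive("ab", c))  # True
```

Setting c(ab) = 2.5 instead would make the check fail, matching the "above a threshold" counterexample.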
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip).
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie).
92
Incentive Compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks, Z, is introduced:
– c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the start of an arrow is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modularity encourages truth telling more than the others.
97
For subadditive domain
98
Attributes of task systems: Concavity
• c(Y ∪ Z) – c(Y) <= c(X ∪ Z) – c(X), for X ⊆ Y
• The cost that a task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
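Like subadditivity, both conditions can be checked exhaustively for a small task set. A sketch (the additive cost values are hypothetical; an additive cost function is modular, and hence concave):

```python
# Concave: for X ⊆ Y and any Z, c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X).
# Modular: c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y).
from itertools import chain, combinations

def powerset(tasks):
    return [frozenset(s) for s in chain.from_iterable(
        combinations(tasks, r) for r in range(len(tasks) + 1))]

def is_concave(tasks, c):
    ps = powerset(tasks)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in ps for y in ps if x <= y for z in ps)

def is_modular(tasks, c):
    ps = powerset(tasks)
    return all(c[x | y] == c[x] + c[y] - c[x & y] for x in ps for y in ps)

f = frozenset
# Hypothetical additive costs (independent deliveries, as in the fax domain):
c = {f(): 0, f("a"): 1, f("b"): 2, f("ab"): 3}
print(is_modular("ab", c), is_concave("ab", c))  # True True
```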
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes: Modularity
• c(X ∪ Y) = c(X) + c(Y) – c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of the Characterization of Relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial.
• Ex: A1 hides a letter to c; his utility doesn't increase.
• If he tells the truth, p = 1/2:
• Expected utility: ((abc), ∅) at p = 1/2 is 5
• Lie: p = 1/2 (as the declared utility is the same)
• Expected utility (for 1): ((abc), ∅) at p = 1/2 is ½(0) + ½(2) = 1 (as he still has to deliver for the lie)
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out).
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different than the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)

Agent 1 gets | U(1) | Agent 2 gets | U(2) seems | U(2) (actual)
a            | 2    | bc           | 4          | 4
b            | 4    | ac           | 2          | 2
bc           | 2    | a            | 4          | 2
ab           | 0    | c            | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial.
• Example (next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = ½, so the utility for agent 1 is ((ab), ∅) at ½ = ½(4) + ½(2) = 3
• Lie: ((bc), (a)) is the logical division, as there is no percentage; the utility for agent 1 is 6 (original cost) – 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, in any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability. (If you deliver, I make you deliver to h.)
• Ex 2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72. (If I deliver, I don't actually deliver to h.)
• If he tells the truth, p (agent 1's probability of delivering all) = 9/14, as:
p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
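The FP6 arithmetic can be verified exactly with rational arithmetic. A sketch (the payoff coefficients are the slide's; the solver is ours):

```python
# p is the probability that agent 1 delivers everything in the A-or-N deal,
# chosen so that both agents' declared expected utilities are equal.
from fractions import Fraction as F

def solve_p(a, b, c, d):
    """Solve p*a + (1-p)*b == p*c + (1-p)*d for p."""
    return F(d - b, a - b - c + d)

p_truth = solve_p(-1, 6, 4, -3)   # declared truthfully -> 9/14
p_lie = solve_p(-3, 6, 4, -5)     # with the decoy letter to h -> 11/18

def true_util_agent1(p):
    """Agent 1's actual expected utility: p(-1) + (1-p)(6)."""
    return p * (-1) + (1 - p) * 6

print(p_truth, true_util_agent1(p_truth))  # 9/14 3/2
print(p_lie, true_util_agent1(p_lie))      # 11/18 31/18 -- the lie helps
```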
107
Postmen – return to the post office
(Figure: a concave case, a subadditive case where h is the decoy, and a phantom case.)
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load appears to cost me more than it really does).
• Ex 3 (next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial.
• Ex 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: ((fae), (bcd)) at p = 1/2 gives utility 4 (each saves going to two nodes).
• Under the lie, divide as ((ef), (dcab)) at p: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie: ((ab), (cdef)) at p:
p(4) + (1-p)(0) = p(2) + (1-p)(6)
4p = -4p + 6, so p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
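The FP8 numbers can be checked the same way, again with exact rational arithmetic (the payoff coefficients are taken from the slide):

```python
# Under the hide lie, p equalizes the *declared* utilities, but agent 1's real
# expected utility comes out higher than the truthful utility of 4.
from fractions import Fraction as F

# Declared: p*4 + (1-p)*0 = p*2 + (1-p)*6  =>  4p = 6 - 4p  =>  p = 3/4
p = F(3, 4)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6

# Real: gain 6 when winning; 0 when losing, since agent 1 must still deliver
# the hidden letter to a after its agreed-upon deliveries.
real_util = p * 6 + (1 - p) * 0
print(real_util)  # 9/2, i.e. 4.5 > 4
```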
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– The repair for a past failure is reused, which reduces computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory behavior
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Figure: worth function for the first agent – worth rises from 0 to 100 across meeting times 9, 12, and 16.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs: convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's utility curves meet at the point of acceptance.)
123
Utility Graphs: no agreement
• No agreement: Agent j finds the offer unacceptable
(Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's utility curves never meet.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
• Grounds ⊆ Database
• Sentence can be proved from Grounds
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B = A u B or A r B
– A defeats B = A u B, or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is in (true) unless proven otherwise
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked by a set if some member y of the set has y → x
2. Argument x is acceptable with respect to a set if every attacker of x is attacked by the set
3. An argument set is conflict free if no two of its members attack each other
4. A set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
(Figure: an attack graph over arguments a, b, c, and d.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
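These definitions are directly executable. A sketch on a hypothetical attack graph (the slide's arrows are not recoverable, so the attacks a → b, b → c, d → c are invented for illustration):

```python
# Conflict-free and admissible sets over an abstract argument system.
attacks = {("a", "b"), ("b", "c"), ("d", "c")}
args = {"a", "b", "c", "d"}

def conflict_free(s):
    """No two members of s attack each other."""
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for y in args if (y, x) in attacks)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

print(admissible({"a", "d"}))  # True: unattacked arguments defend themselves
print(admissible({"b"}))       # False: b's attacker a is not counter-attacked
print(admissible({"c"}))       # False: c is attacked by the unattacked d
```

In this invented graph, c can never be in an admissible set (its attacker d is unattacked), echoing the slide's "c is always attacked; d is always acceptable".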
133
An Example Abstract Argument System
30
Task-oriented Domain: Definition
• How can an agent evaluate the utility of a specific deal?
– Utility represents how much an agent has to gain from the deal (it is always measured against the original allocation).
– Since an agent can achieve its goal on its own, it can compare the cost of achieving the goal on its own to the cost of its part of the deal.
• If utility < 0, the agent is worse off than performing its tasks on its own.
• Conflict deal (stay with the status quo): if the agents fail to reach an agreement,
– no agent agrees to execute tasks other than its own
– utility = 0
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple <T, Ag, c> where:
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic: more tasks mean more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) <= c(T2)
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment, and D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among agents, (D1, D2), such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Cost_k(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal).
33
Examples of TOD
• Parcel Delivery
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the travel cost to each courier is minimal.
• Database Queries
Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab) – the conflict deal
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
35
Figuring the deals, knowing the union must be ab:
• Choices for the first agent: a, b, ab, ∅
• The second agent must "pick up the slack":
• a for agent 1 ⇒ b or ab for agent 2
• b for agent 1 ⇒ a or ab
• ab for agent 1 ⇒ ∅, a, b, or ab
• ∅ for agent 1 ⇒ ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utility_k(δ) = c(Tk) - Cost_k(δ) = c(Tk) - c(Dk)
where δ = (D1, D2) is a deal:
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
(Figure: a distribution point with city a and city b each at distance 1, and distance 2 between a and b.)
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ': ∀k ∈ {1,2}: Utility_k(δ) >= Utility_k(δ')
– δ is better for some agent than δ': ∃k ∈ {1,2}: Utility_k(δ) > Utility_k(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have).
• A deal δ is called Pareto optimal if there does not exist another deal δ' that dominates δ (the best deal for x without disadvantaging y).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from the previous slide)

Deal        | Agent 1 | Agent 2
1. (a, b)   |  0      |  2
2. (b, a)   |  0      |  2
3. (ab, ∅)  | -2      |  3
4. (∅, ab)  |  1      |  0
5. (a, ab)  |  0      |  0
6. (b, ab)  |  0      |  0
7. (ab, a)  | -2      |  2
8. (ab, b)  | -2      |  2
9. (ab, ab) | -2      |  0
41
Individual Rational for Both (eliminate any choices that are negative for either)
Of the nine deals, the individual rational ones are:
(a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
Of the nine deals, the Pareto optimal ones are:
(a, b), (b, a), (ab, ∅), (∅, ab)
(a, ab) is beaten by the (∅, ab) deal; (ab, ∅) is (-2, 3), but nothing beats 3 for agent 2.
43
Negotiation Set
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
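The whole pipeline – enumerate deals, compute utilities, filter for individual rationality and Pareto optimality – can be sketched for this example in a few lines (a sketch, not the authors' code; it uses the cost function from the slides):

```python
# Negotiation set for the parcel example: T1 = {a}, T2 = {a, b}.
from itertools import chain, combinations

f = frozenset
cost = {f(): 0, f("a"): 1, f("b"): 1, f("ab"): 3}
T1, T2 = f("a"), f("ab")

def subsets(s):
    return [f(x) for x in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

# Pure deals: (D1, D2) with D1 ∪ D2 = {a, b} -- nine in all.
deals = [(d1, d2) for d1 in subsets("ab") for d2 in subsets("ab")
         if d1 | d2 == f("ab")]

def utils(deal):
    d1, d2 = deal
    return (cost[T1] - cost[d1], cost[T2] - cost[d2])

def dominates(d, e):
    ud, ue = utils(d), utils(e)
    return all(x >= y for x, y in zip(ud, ue)) and ud != ue

rational = [d for d in deals if all(u >= 0 for u in utils(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in rational if d in pareto]
for d in ns:
    print(sorted(d[0]), sorted(d[1]), utils(d))
```

The output is the three deals of the negotiation set: (a, b), (b, a), and (∅, ab).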
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j.
• Only the deals where both utilities are positive are individually rational (for both); the origin is the conflict deal.
• Which are Pareto optimal?
45
Negotiation Set in Task-oriented Domains
(Figure: utilities for agents i and j. The circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict utilities; the negotiation set is the arc of the boundary that is both Pareto optimal and individual rational.)
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol, or
– a concession protocol:
• At t >= 0, A offers δ(A,t) and B offers δ(B,t), such that:
– both deals are from the negotiation set, and
– ∀i, t > 0: Utility_i(δ(i,t)) <= Utility_i(δ(i,t-1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utility_i(δ(i,t)) = Utility_i(δ(i,t-1)) for both agents
– Agreement: ∃j ≠ i: Utility_j(δ(i,t)) >= Utility_j(δ(j,t))
• Only A ⇒ agree on δ(B,t) (A agrees with B's proposal)
• Only B ⇒ agree on δ(A,t) (B agrees with A's proposal)
• Both A and B ⇒ agree on the δ(k,t) such that π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
• Both A and B, with π(δ(A,t)) = π(δ(B,t)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Applies to pure deals and mixed deals.
47
The Monotonic Concession Protocol – one direction: move towards the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, the agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values.)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal.
• Metadata: an explanation or critique of a deal.
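The protocol loop above can be sketched generically. This is our sketch, not a reference implementation: the strategy functions are placeholders (a real agent would use, e.g., the Zeuthen strategy to decide whether and how far to concede), and the toy deals are just ranks 0 to 4:

```python
def mcp(util1, util2, strat1, strat2, max_rounds=100):
    """util_k: utility function of agent k; strat_k(last_offer) -> next offer."""
    o1, o2 = strat1(None), strat2(None)   # round 1: simultaneous first offers
    for _ in range(max_rounds):
        if util1(o2) >= util1(o1):        # agent 1 accepts agent 2's offer
            return o2
        if util2(o1) >= util2(o2):        # agent 2 accepts agent 1's offer
            return o1
        n1, n2 = strat1(o1), strat2(o2)
        # monotonicity: never offer the other agent less than before
        assert util2(n1) >= util2(o1) and util1(n2) >= util1(o2)
        if (n1, n2) == (o1, o2):          # neither concedes: conflict deal
            return "conflict"
        o1, o2 = n1, n2
    return "conflict"

# Toy run: each agent starts at its best deal and concedes one step per round.
u1 = lambda d: d          # agent 1 prefers high-numbered deals
u2 = lambda d: 4 - d      # agent 2 prefers low-numbered deals
s1 = lambda last: 4 if last is None else max(last - 1, 0)
s2 = lambda last: 0 if last is None else min(last + 1, 4)
print(mcp(u1, u2, s1, s2))  # 2
```

With these always-concede strategies the agents meet in the middle; two stand-still strategies would return the conflict deal instead.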
48
Condition to Consent to an Agreement
Each agent finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) >= Utility1(δ1) and Utility2(δ1) >= Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
(Figure: agent 1's best deal vs. agent 2's best deal.)
52
The Zeuthen Strategy
Q: I make a proposal in every round, but it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(Figure: agent 1's best deal vs. agent 2's best deal, each asking "How much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when the difference between its loss from making a concession and its loss from taking the conflict deal (with respect to its current offer) is small.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• how much you will lose if you make a concession and accept your opponent's offer, and
• how much you will lose if you stand still, which causes a conflict:
risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk a conflict).
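The risk formula is a one-liner; here it is applied to the parcel example from the slides, where agent 1 offers (∅, ab) and agent 2 offers (a, b):

```python
def risk(u_own, u_theirs):
    """risk_i = (U_i(own offer) - U_i(their offer)) / U_i(own offer)."""
    if u_own <= 0:
        return 1.0               # nothing left to lose: fully willing to risk
    return (u_own - u_theirs) / u_own

# Utilities from the example: U1(∅,ab) = 1, U1(a,b) = 0; U2(a,b) = 2, U2(∅,ab) = 0
r1 = risk(1, 0)   # agent 1's willingness to risk conflict
r2 = risk(2, 0)   # agent 2's willingness to risk conflict
print(r1, r2)     # 1.0 1.0
```

Both risks are 1, so under Zeuthen both should concede – which, as the conflict-deal slide notes, fails here because there is no deal in between to concede to.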
55
Risk Evaluation
• risk measures the fraction you still have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher.
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Figure: A_i's best deal and A_j's best deal at opposite ends, with the conflict deal between them; each agent weighs "How much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum it still hopes to gain.)
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, the other can do no better than use it him/herself
• Disadvantages
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation Set: (a, b), (b, a), (∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for both agents
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: agent 1's best deal and agent 2's best deal, each labeled "He should concede".]
Zeuthen does not reach a settlement: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to distribution point)
[Diagram: distribution point with a and d at distance 7, and b and c at distance 1 from their neighbors.]
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are IR, as neither agent can do worse than the conflict deal. (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No | Pure Deal     | Agent 1's Utility | Agent 2's Utility
1  | (abcd, ∅)     | 0                 | 10
2  | (abc, d)      | 1                 | 3
3  | (ab, cd)      | 2                 | 2
4  | (a, bcd)      | 3                 | 1
5  | (∅, abcd)     | 10                | 0
   | Conflict deal | 0                 | 0

Agent 1 starts at deal 5 and agent 2 at deal 1; they concede toward each other.
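The table above gives everything needed to run the protocol. Below is a minimal sketch of a monotonic-concession run with a Zeuthen-style rule; the concession step (take the best remaining deal for yourself that strictly improves the opponent's utility, staying above the conflict utility) is a simplification of "concede just enough to change the balance of risk":

```python
# Deals from the table above as (utility for agent 1, utility for agent 2).
DEALS = {'(abcd,-)': (0, 10), '(abc,d)': (1, 3), '(ab,cd)': (2, 2),
         '(a,bcd)': (3, 1), '(-,abcd)': (10, 0)}

def risk(own, other, me):
    """Zeuthen risk of agent `me`, given its own offer and the opponent's."""
    u_own, u_other = DEALS[own][me], DEALS[other][me]
    return 1.0 if u_own <= 0 else (u_own - u_other) / u_own

def concede(offer, me):
    """Minimal concession: the best remaining deal for me that strictly
    improves the opponent's utility over my current offer."""
    you = 1 - me
    better = [d for d in DEALS if DEALS[d][you] > DEALS[offer][you]
              and DEALS[d][me] > 0]      # stay above the conflict utility
    return max(better, key=lambda d: DEALS[d][me]) if better else None

def negotiate():
    offers = [max(DEALS, key=lambda d: DEALS[d][0]),   # agent 1 starts high
              max(DEALS, key=lambda d: DEALS[d][1])]   # agent 2 starts high
    while True:
        # Agreement: one offer already gives the other agent enough.
        for me in (0, 1):
            you = 1 - me
            if DEALS[offers[you]][me] >= DEALS[offers[me]][me]:
                return offers[you]
        r = [risk(offers[0], offers[1], 0), risk(offers[1], offers[0], 1)]
        conceders = [0, 1] if r[0] == r[1] else [int(r[1] < r[0])]
        new = {me: concede(offers[me], me) for me in conceders}
        if any(n is None for n in new.values()):
            return 'conflict'
        for me, d in new.items():
            offers[me] = d

print(negotiate())  # '(ab,cd)' -- the (2, 2) agreement
```

On these utilities both agents start at risk 1, concede together to deals 4 and 2 (risk 2/3 each), and then meet at deal 3, the (2, 2) agreement.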
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the (0, 10) utility of another choice
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD, coordinated plans are never worse, as you can just do your original task.
• With SOD, you may get in each other's way.
• Don't accept partially completed plans.
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform tasks, and cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with
• We look for a state in the intersection of the goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up the other)
– There may be a reachable state which satisfies both, but it is too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent will get a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the parcel delivery example:
72
Parcel Delivery Domain (assuming do not have to return home)
[Diagram: distribution point with city a and city b each at distance 1, and distance 2 between a and b.]
Cost function: c(∅)=0; c(a)=1; c(b)=1; c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability p
• (∅, ab):p means agent 1 does nothing with probability p and delivers ab with probability (1−p)
• What should p be to be fair to both (equal expected utility)?
• (1−p)(−2) + p·1 = expected utility for agent 1
• (1−p)(3) + p·0 = expected utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
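The fairness equation is linear in p, so it can be solved once and reused. A small sketch (encoding each deal as its pair of utilities is an assumption of this sketch):

```python
from fractions import Fraction

def fair_p(deal, swapped):
    """Find p so the mixed deal [deal : p, swapped : 1-p] gives both agents
    equal expected utility.  deal/swapped are (utility1, utility2) pairs.
    Returns None when no p in [0, 1] can equalize them."""
    (u1d, u2d), (u1s, u2s) = deal, swapped
    denom = (u1d - u1s) - (u2d - u2s)
    if denom == 0:
        # Either every p works (pick 1/2) or none does.
        return Fraction(1, 2) if u1s == u2s else None
    p = Fraction(u2s - u1s, denom)
    return p if 0 <= p <= 1 else None

# Deal 3 from the slide: with probability p agent 1 delivers nothing
# (utilities (1, 0)); otherwise it delivers both parcels (utilities (-2, 3)).
print(fair_p((1, 0), (-2, 3)))   # 5/6

# The (a, b) / (b, a) pair: utilities are (0, 2) either way.
print(fair_p((0, 2), (0, 2)))    # None
```

The second call reproduces the case on the next slide, where 0 = 2 has no solution and no p can make the deal fair.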
74
Try again with the other choice in the negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal expected utility)?
• (1−p)(0) + p·0 = expected utility for agent 1
• (1−p)(2) + p·2 = expected utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(T_A ∪ T_B, ∅); p] such that NS(m) = max NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
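The suggested exercise can be checked directly; dividing 10 units of utility in integer amounts:

```python
# Divide a total utility of 10 between two agents and see which split
# maximizes the product of utilities (the "fairer division" criterion).
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
for u1, u2 in splits:
    print(u1, u2, u1 * u2)
print('best split:', best)  # (5, 5) -- the even division maximizes the product
```

Lopsided splits such as (1, 9) have a product of 9, while the even (5, 5) split scores 25, which is why the product criterion favors fairness.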
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win – as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2; two gray blocks are at 3. Agent 1 wants black in 1 but not on the table. Agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but with unequal roles.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost of acting alone is two; in the joint plan the cost averages four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign A1 the easier role (the (2, 6) division) p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes the cost-6 role for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way – as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Examplesemi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So with probability ½ each: ½(12) + ½(−4) = 4
• If they could have both been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over joint plans, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle) – it isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
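One hedged sketch of the weighting approach, combining the 60/15/25 weights with the constraint ranges above. The normalization step (scaling each attribute to [0, 1], with price inverted so cheaper is better) and the two sample products are assumptions of this sketch, not from the slides:

```python
# Weights for the three attributes, as on the slide.
WEIGHTS = {'price': 0.60, 'quality': 0.15, 'support': 0.25}

def utility(price, quality, support):
    """Weighted multi-attribute utility; raw attributes are normalized
    to [0, 1] using the slide's constraint ranges."""
    scores = {
        'price': 1 - (price - 5) / (100 - 5),  # price in [5, 100], low is good
        'quality': quality / 10,               # quality in [0, 10]
        'support': (support - 1) / (5 - 1),    # support in [1, 5]
    }
    return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS)

# Two hypothetical software offers.
cheap = utility(price=20, quality=6, support=2)
fancy = utility(price=80, quality=9, support=5)
print(round(cheap, 3), round(fancy, 3))  # 0.658 0.511
```

With price weighted at 60%, the cheap offer wins overall despite its lower quality and support scores; changing the weights changes the ranking.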
89
Incomplete Information
• Agents don't know the tasks of others in TOD
• Solution:
– Exchange missing information
– Penalty for a lie
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
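The subadditivity condition is easy to test exhaustively for small task sets. In the sketch below, the no-return costs come from the earlier two-city slide (c(a)=c(b)=1, c(ab)=3); the with-return costs are assumed round-trip distances (each city 1 from the distribution point, 2 apart) added for contrast:

```python
from itertools import chain, combinations

def subsets(tasks):
    """All subsets of a task collection, including the empty set."""
    return chain.from_iterable(combinations(tasks, r)
                               for r in range(len(tasks) + 1))

def is_subadditive(cost, tasks):
    """Check c(X U Y) <= c(X) + c(Y) for all subsets X, Y of the task set."""
    for X in subsets(tasks):
        for Y in subsets(tasks):
            x, y = frozenset(X), frozenset(Y)
            if cost[x | y] > cost[x] + cost[y]:
                return False
    return True

# Two-city domain WITHOUT returning home: c(ab) = 3 > c(a) + c(b) = 2,
# so two one-parcel trips beat one combined trip -- not subadditive.
no_return = {frozenset(): 0, frozenset('a'): 1,
             frozenset('b'): 1, frozenset('ab'): 3}
print(is_subadditive(no_return, 'ab'))   # False

# Same map when the courier must return home: round trips cost 2, 2, 4.
with_return = {frozenset(): 0, frozenset('a'): 2,
               frozenset('b'): 2, frozenset('ab'): 4}
print(is_subadditive(with_return, 'ab'))  # True
```

This matches the later "return home" slides: postmen domains that return to the post office are subadditive, while the no-return variant need not be.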
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have 2 sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know each fact (diagonal arrows are between domains). The beginning of each rule is a fixed point.
• For example: what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others
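The modularity condition can be checked the same way as subadditivity. The fax-style cost function below (every connection costs 1, so c(X) = |X|) is a simplified stand-in for the fax domain discussed later:

```python
from itertools import chain, combinations

def subsets(tasks):
    """All subsets of a task collection, including the empty set."""
    return chain.from_iterable(combinations(tasks, r)
                               for r in range(len(tasks) + 1))

def is_modular(cost, tasks):
    """Check c(X U Y) == c(X) + c(Y) - c(X ^ Y) for all subsets X, Y."""
    for X in subsets(tasks):
        for Y in subsets(tasks):
            x, y = frozenset(X), frozenset(Y)
            if cost[x | y] != cost[x] + cost[y] - cost[x & y]:
                return False
    return True

# Fax-style costs: each destination has an independent connection cost of 1.
fax = {frozenset(s): len(s) for s in ['', 'a', 'b', 'ab']}
print(is_modular(fax, 'ab'))       # True

# The no-return parcel costs from the earlier slide are not modular:
# c(ab) = 3 != c(a) + c(b) - c({}) = 2.
parcels = {frozenset(): 0, frozenset('a'): 1,
           frozenset('b'): 1, frozenset('ab'): 3}
print(is_modular(parcels, 'ab'))   # False
```

Independent per-task costs make the union formula hold exactly, which is why the fax domain is the modular one.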
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost tasks Z add to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No – unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the fax domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of the characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2; expected utility: (abc, ∅):1/2 = .5
• Lie: p = 1/2 (as the apparent utility is the same)
• Expected utility (for 1): (abc, ∅):1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
[Diagram: postmen graph with edge costs 1, 4, 4, 1.]
103
• FP2: in subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out)
• FP3: in concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 | U(1) | Agent 2 | Seeming U(2) | Actual U(2)
a       | 2    | bc      | 4            | 4
b       | 4    | ac      | 2            | 2
bc      | 2    | a       | 4            | 2
ab      | 0    | c       | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial
• Example from the next slide: A1 creates a phantom letter at node c; his utility has risen from 3 to 4
• Truth: p = ½, so utility for agent 1 is (ab, ∅):½ = ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as it changes the probability (if you deliver, I make you deliver to h)
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3); 14p = 9
• If he invents task h, p = 11/18, as:
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
107
Postmen – return to post office
[Diagrams: a concave example, a subadditive example (h is the decoy), and a phantom example.]
108
Non incentive compatible fixed points
• FP7: in modular TOD, in any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(fae, bcd):½ = 4 (saves going to two)
• Under the lie, divide as (efd, cab):p (you always win and I always lose). Since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie: (abc, def):p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned cdef ¼ of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
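The FP8 arithmetic can be checked with exact rationals. The exact task splits under the lie are partly garbled in the slide, so this only reproduces the slide's equation and final utility:

```python
from fractions import Fraction

# Under the hide lie, agent 1's apparent payoffs are 4 and 0, and agent 2's
# are 2 and 6:  p(4) + (1-p)(0) = p(2) + (1-p)(6)  =>  8p = 6  =>  p = 3/4.
p = Fraction(6, 8)
assert 4 * p == 2 * p + (1 - p) * 6  # the balance equation holds

# Agent 1's real utility: the side it wins 3/4 of the time is really worth 6,
# and the losing side is worth 0 (it secretly delivers to a anyway).
real = p * 6 + (1 - p) * 0
print(p, real)  # 3/4 9/2  -- i.e. 4.5 > 4, so hiding the letter paid off
```

The 9/2 = 4.5 figure beats the truthful utility of 4, reproducing the slide's conclusion that the hide lie is beneficial here.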
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or to create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Changing the importance attached to an issue
– Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to meetings at 15:00 hrs, 14:00 hrs, …
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Graph: worth function for the first agent, rising from 0 at 9:00 through 12:00 to 100 at 16:00.]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility versus number of negotiation rounds; Agent i's and Agent j's curves converge to a point of acceptance.]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility versus number of negotiation rounds; Agent i's and Agent j's curves never meet.]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database, and
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts, →a = attacks):
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A →a B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• We write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked (w.r.t. a set) if some y attacks x (y → x) and no member of the set attacks y
2. Argument x is acceptable (w.r.t. a set) if every attacker of x is attacked by the set
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Diagram: argument graph over arguments a, b, c, d.]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
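The in/out rule from the previous slide can be computed as a grounded labelling. The attack graph below (a and b attack each other; d attacks c) is a hypothetical reconstruction chosen to match the slide's remarks, since the figure's edges are not recoverable from the text:

```python
# Grounded labelling of an abstract argument system: an argument is "in"
# when all of its attackers are "out", and "out" when some attacker is "in";
# anything never forced either way stays "undecided".
def grounded(arguments, attacks):
    label = {}
    changed = True
    while changed:
        changed = False
        for x in arguments:
            if x in label:
                continue
            attackers = [a for a, b in attacks if b == x]
            if all(label.get(a) == 'out' for a in attackers):
                label[x], changed = 'in', True   # every attacker is defeated
            elif any(label.get(a) == 'in' for a in attackers):
                label[x], changed = 'out', True  # it has an undefeated attacker
    return {x: label.get(x, 'undecided') for x in arguments}

# Hypothetical graph: a and b attack each other; d attacks c.
print(grounded('abcd', [('a', 'b'), ('b', 'a'), ('d', 'c')]))
# {'a': 'undecided', 'b': 'undecided', 'c': 'out', 'd': 'in'}
```

Here d is unattacked, so it is "in" and defeats c; a and b attack each other symmetrically, so neither is forced in or out.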
133
An Example Abstract Argument System
31
Formalization of TOD
A Task Oriented Domain (TOD) is a triple <T, Ag, c> where
– T is a finite set of all possible tasks
– Ag = {A1, A2, …, An} is a list of participant agents
– c: 2^T → R+ defines the cost of executing each subset of tasks
Assumptions on the cost function:
1. c(∅) = 0
2. The cost of a subset of tasks does not depend on who carries them out (an idealized situation)
3. The cost function is monotonic, which means more tasks, more cost (it can't cost less to take on more tasks): T1 ⊆ T2 implies c(T1) ≤ c(T2)
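The cost-function assumptions above can be checked mechanically. A minimal sketch in Python, using the parcel-delivery cost table that appears on a later slide (c(∅)=0, c(a)=c(b)=1, c(ab)=3); `is_monotonic` is a hypothetical helper, not part of the original formalization:

```python
from itertools import chain, combinations

# Cost table for the parcel-delivery encounter used on later slides
# (distribution point plus cities a and b, no return trip).
COST = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}

def subsets(s):
    s = list(s)
    return map(frozenset,
               chain.from_iterable(combinations(s, r) for r in range(len(s) + 1)))

# Monotonicity: T1 ⊆ T2 implies c(T1) <= c(T2).
def is_monotonic(cost, all_tasks):
    return all(cost[t1] <= cost[t2]
               for t2 in subsets(all_tasks)
               for t1 in subsets(t2))

print(is_monotonic(COST, "ab"))  # True
```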
32
Redistribution of Tasks
Given a TOD <T, {A1, A2}, c>: T is the original assignment, D is the assignment after the "deal".
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that the agents might want to reallocate.
• A pure deal on an encounter is a redistribution of tasks among the agents, (D1, D2), such that all tasks are reassigned: D1 ∪ D2 = T1 ∪ T2. Specifically, (D1, D2) = (T1, T2) is called the conflict deal.
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal).
33
Examples of TOD
• Parcel Delivery: Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries: Several agents have access to a common database, and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab) – the conflict deal
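The nine deals can be enumerated programmatically: a pure deal is any ordered pair of task subsets whose union covers all tasks. A small sketch (the helper `pure_deals` is illustrative, not from the original):

```python
from itertools import product

# Enumerate the pure deals for this encounter: every ordered pair of
# subsets (D1, D2) whose union covers all tasks {a, b}.
def pure_deals(all_tasks):
    tasks = sorted(all_tasks)
    subs = [frozenset(t for i, t in enumerate(tasks) if mask >> i & 1)
            for mask in range(2 ** len(tasks))]
    return [(d1, d2) for d1, d2 in product(subs, repeat=2)
            if d1 | d2 == frozenset(all_tasks)]

deals = pure_deals({"a", "b"})
print(len(deals))  # 9, matching the list above
```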
35
Figuring the deals, knowing the union must be ab:
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack":
– a for agent 1 → b | ab for agent 2
– b for agent 1 → a | ab
– ab for agent 1 → ∅ | a | b | ab
– ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal,
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help), and
– Costk(δ) is the cost of its part of the deal.
Note that the utility of the conflict deal is always 0.
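The definition above translates directly into code. A sketch using the parcel-delivery cost table from the next slide (`utility` here is an illustrative helper):

```python
# Utility of a deal for agent k: stand-alone cost minus the cost of the
# agent's share under the deal. Cost table from the parcel-delivery slide.
COST = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 3}

def utility(encounter, deal, k):
    # encounter = (T1, T2), deal = (D1, D2), k = 0 or 1
    return COST[encounter[k]] - COST[deal[k]]

encounter = (frozenset("a"), frozenset("ab"))  # agent 1 has a; agent 2 has a, b
conflict = encounter  # keeping your own tasks is the conflict deal
print(utility(encounter, conflict, 0))  # 0: the conflict deal has utility 0
print(utility(encounter, (frozenset(), frozenset("ab")), 0))  # 1
```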
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
[Figure: a distribution point connected to city a and to city b, each at distance 1; a and b are distance 2 apart.]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1, 2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (deal δ isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
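The two conditions above are a direct pair of quantified comparisons over utility vectors. A minimal sketch (the helper names are illustrative):

```python
# Dominance between deals, each represented by its pair of agent utilities.
def weakly_dominates(u, v):
    # u is at least as good as v for every agent
    return all(a >= b for a, b in zip(u, v))

def dominates(u, v):
    # weakly dominates, and strictly better for at least one agent
    return weakly_dominates(u, v) and any(a > b for a, b in zip(u, v))

print(dominates((0, 2), (-2, 0)))  # True
print(dominates((0, 2), (0, 2)))   # False: not strictly better for anyone
```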
39
Negotiation Set: the Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no agent does worse than what it already has).
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (you cannot make the deal better for x without disadvantaging y).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
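Both filters can be computed directly from the utility table on the next slide. A sketch (deal labels use "-" for the empty set; all names are illustrative):

```python
# Negotiation set for the parcel-delivery encounter: filter the nine pure
# deals down to those that are individual rational and Pareto optimal.
UTIL = {  # deal label: (utility for agent 1, utility for agent 2)
    "(a,b)": (0, 2),   "(b,a)": (0, 2),   "(ab,-)": (-2, 3),
    "(-,ab)": (1, 0),  "(a,ab)": (0, 0),  "(b,ab)": (0, 0),
    "(ab,a)": (-2, 2), "(ab,b)": (-2, 2), "(ab,ab)": (-2, 0),
}

def dominates(u, v):
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

individual_rational = {d for d, u in UTIL.items() if all(x >= 0 for x in u)}
pareto_optimal = {d for d, u in UTIL.items()
                  if not any(dominates(v, u) for v in UTIL.values())}
negotiation_set = sorted(individual_rational & pareto_optimal)
print(negotiation_set)  # ['(-,ab)', '(a,b)', '(b,a)'], as on the later slide
```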
40
Utility Function for Agents (example from the previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choice that is negative for either agent)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individual rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
Deals 5–9 are each beaten by one of the deals above. Note that deal 3, (ab, ∅), has utilities (−2, 3): although −2 is bad for agent 1, nothing beats 3 for agent 2, so it is still Pareto optimal.
43
Negotiation Set
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
44
Negotiation Set Illustrated
• Create a scatter plot of the utility for i against the utility for j.
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal).
• Which deals are Pareto optimal?
[Axes: utility for i vs. utility for j.]
45
Negotiation Set in Task-oriented Domains
[Figure: deals A–E plotted by utility for agent i vs. utility for agent j; dashed lines mark the utility of the conflict deal for each agent; the circle delimits the space of all possible deals; the arc of deals that are both Pareto optimal and individual rational is the negotiation set.]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ.
• Product-maximizing negotiation protocol: a one-step protocol.
• Concession protocol (stated here for pure deals; mixed deals come later):
– At each step t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that:
• both deals are from the negotiation set, and
• ∀i and t > 0: Utilityi(δ(i, t)) ≤ Utilityi(δ(i, t−1)) – "I propose something less desirable for me."
• Negotiation ends with:
– Conflict, when Utilityi(δ(i, t)) = Utilityi(δ(i, t−1)) (no one concedes), or
– Agreement, when ∃j ≠ i: Utilityj(δ(i, t)) ≥ Utilityj(δ(j, t)):
• only A ⇒ agree on δ(B, t) (A agrees with B's proposal)
• only B ⇒ agree on δ(A, t) (B agrees with A's proposal)
• both A and B ⇒ agree on δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• both A and B with π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the split may not be the same for each agent – flip a coin to decide which deal to use)
47
The Monotonic Concession Protocol – agents move in one direction, toward the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent can re-propose the same one).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal.
• Meta-data: an explanation or critique of the deal.
48
Condition to Consent to an Agreement
If each of the agents finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic concession protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling the other agent what you value.)
[Figure: agent 1's best deal at one end of the range, agent 2's best deal at the other.]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: each agent asks "How much am I willing to risk a conflict?", between agent 1's best deal and agent 2's best deal.]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero).
– In case conflict occurs, you are not much worse off.
– So you are more willing to risk conflict.
An agent is more willing to risk conflict when its loss from making a concession is large relative to its loss from taking the conflict deal, measured with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• how much you will lose if you make a concession and accept your opponent's offer, and
• how much you will lose if you stand still, which causes a conflict:
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agents i and j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk it).
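The risk formula above is a one-liner in code. A sketch, using the convention that a zero-utility current offer means the agent has nothing left to lose (`risk` is an illustrative helper):

```python
# Zeuthen risk: the fraction of its current offer's utility an agent
# would sacrifice by conceding, relative to losing it all in a conflict.
def risk(u_own_offer, u_other_offer):
    if u_own_offer <= 0:   # nothing left to gain: perfectly willing to risk
        return 1.0
    return (u_own_offer - u_other_offer) / u_own_offer

# The later parcel example: agent 1's own offer is worth 1 to it and the
# opponent's offer is worth 0; symmetrically 2 and 0 for agent 2.
print(risk(1, 0))  # 1.0
print(risk(2, 0))  # 1.0: both are perfectly willing to risk conflict
```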
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little so far (and are more willing to risk conflict).
• This assumes you know the other agent's utility.
• What one sets as an initial goal affects risk. If I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: the line of deals from Ai's best deal to Aj's best deal, with the conflict deal below; as an agent concedes, its "maximum to gain from agreement" shrinks to the "maximum it still hopes to gain", and with it its willingness to risk a conflict.]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
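The concession rule above can be sketched as a small decision function: the agent with the smaller risk (less to lose from conflict) concedes, and a tie means both concede. A minimal illustrative sketch, not the full protocol:

```python
# Who concedes under the Zeuthen strategy, given each agent's current risk.
def who_concedes(risk_1, risk_2):
    if risk_1 < risk_2:
        return {1}        # agent 1 has less to lose, so it concedes
    if risk_2 < risk_1:
        return {2}
    return {1, 2}         # equal risk: both concede

print(who_concedes(0.5, 1.0))  # {1}
print(who_concedes(1.0, 1.0))  # {1, 2}
```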
58
About the MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a, agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b).
Utility for agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility for agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent.
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each labeled "he should concede".]
Zeuthen does not reach a settlement here, as neither agent will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (agents don't return to the distribution point)
[Figure: a distribution point with cities a and d at distance 7, and b and c between them, each hop at distance 1.]
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All of these deals are individual rational, as no agent can do worse than going alone; (acbd) is dominated by (abcd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)
No. | Pure deal   | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)   | 0                 | 10
2   | (abc, d)    | 1                 | 3
3   | (ab, cd)    | 2                 | 2
4   | (a, bcd)    | 3                 | 1
5   | (∅, abcd)   | 10                | 0
–   | Conflict deal | 0               | 0
[Diagram: agent 1 starts at deal 5 and agent 2 at deal 1; they concede toward each other through 4, 3, 2.]
63
What bothers you about the previous agreement?
• The agents settle on (2, 2) utility rather than the (0, 10) of another choice, which has higher total utility.
• Is there a solution?
• Fairness versus higher global utility.
• Restrictions of this method: no promises for the future, and no sharing of utility.
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example: slotted blocks world – blocks cannot go anywhere on the table, only into slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk.
• c(J)k is the cost of k's role in joint plan J.
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD, coordinated plans are never worse, as you can always just do your original task.
• With an SOD, you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Inter-agent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs it).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of a Final State
• The goal of each agent is represented as a set of states it would be happy with.
• We look for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car).
– A common state exists, but it cannot be reached with the primitive operations in the domain (both could travel together, but the domain may lack an operation for picking up another person).
– A reachable state may satisfy both but be too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if the choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall (8 vs. 10), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed Deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain role in the plan.
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both agents.
71
Cost
• If δ = (J, p) is a mixed deal, then
cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k
where k is i's opponent – the role i plays with probability 1−p.
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: a distribution point connected to city a and to city b, each at distance 1; a and b are distance 2 apart.]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with a probability
• [(ab, ∅), p] means agent 1 does ∅ with probability p and ab with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = expected utility for agent 1
• (1−p)(3) + p(0) = expected utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p  ⇒  6p = 5  ⇒  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair.
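The derivation above is a linear equation in p and can be solved exactly. A sketch with a hypothetical helper `fair_p` (exact fractions avoid floating-point noise):

```python
from fractions import Fraction

# Solve p*u1_a + (1-p)*u1_b = p*u2_a + (1-p)*u2_b for p, where the *_a
# utilities apply when agent 1 is idle and the *_b utilities when it
# delivers everything (deal 3 above).
def fair_p(u1_a, u1_b, u2_a, u2_b):
    return Fraction(u2_b - u1_b, (u1_a - u1_b) - (u2_a - u2_b))

p = fair_p(1, -2, 0, 3)
print(p)  # 5/6, as derived above
```

For the next slide's deal, `fair_p(0, 0, 2, 2)` raises ZeroDivisionError: both roles give identical utilities to each agent, so no lottery can equalize them.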
74
Try again with the other choice in the negotiation set
• [(a, b), p] means agent 1 does a with probability p and b with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = expected utility for agent 1
• (1−p)(2) + p(2) = expected utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this deal fair? Agent 1's utility is 0 and agent 2's is 2 under either role assignment, so no lottery changes the split.
75
Mixed Deal
• All-or-nothing deal: one agent does everything, i.e., a mixed deal m = [(T1 ∪ T2, ∅), p]. Among deals in the negotiation set, such a deal achieves the maximal product of utilities: π(m) = max over δ in NS of π(δ).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no agent is motivated to change strategies. We choose the mechanism that maximizes the product of utilities (as that is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even when both products are the same, just to avoid possible bias – like switching goals at halftime in soccer.
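The "try dividing 10" exercise can be done exhaustively; the product is maximized at the even split, which is the intuition behind the product-maximizing mechanism. A quick sketch:

```python
# Divide a fixed total utility of 10 between two agents and find the
# split that maximizes the product u1 * u2.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)  # (5, 5): the even split maximizes the product
```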
77
Examples – Cooperative: each is helped by the joint plan
• Slotted blocks world: initially the white block is in slot 1 and the black block in slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up a block at the same time, costing each agent 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be 4, so the utility to each is 2.
78
Examples – Compromise: both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is in slot 1, black in slot 2, and two gray blocks in slot 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but with unequal roles.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost of achieving one goal is 2; the cost of achieving both averages 4 per agent.
• If both agents value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if they don't value the goal the same way? We can't really look at utility in the same way, as the other agent's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth of 6 to the final goal, we can use probability to make the deal "fair".
• Assign the roles (costing 2 and 6) with agent 1 taking the cheap role p of the time:
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it pays cost 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example – Conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weighting on the coin needn't be 50–50.
• It may make sense to have the agent with the highest worth get its way, as the total utility is greater (it would accomplish its goal alone anyway). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example – Semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's whole goal by itself costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers pulled out of the air).
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest.
83
Example – Semi-cooperative, continued
• Winning agent's utility: 26 − 4 − 10 = 12.
• Losing agent's utility: −4 (as it helped with the swap).
• So with probability 1/2 each: 1/2(12) + 1/2(−4) = 4 expected utility apiece.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is only 2 each.
• Note: they double their expected utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain: Definition
Can be defined as a tuple <E, Ag, J, c>:
• E: the set of possible environment states
• Ag: the set of possible agents
• J: the set of possible joint plans
• c: the cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective.
• Example: multi-agent tile world (like an airport shuttle) – worth isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of it, such as price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate utility?
• Weighting each attribute:
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
– price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum.
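The weighted-attribute scheme in the first bullet is straightforward to compute. A sketch, assuming attribute scores have been normalized to [0, 1] (the weights and example scores are illustrative):

```python
# Weighted-sum utility over multiple attributes (weights 60%/15%/25%).
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    return sum(WEIGHTS[attr] * score for attr, score in offer.items())

print(round(utility({"price": 0.5, "quality": 1.0, "support": 0.8}), 2))  # 0.65
```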
89
Incomplete Information
• Agents don't know the tasks of others in a TOD.
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information:
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union of two sets of tasks is at most the sum of the costs of the separate sets – the union adds up to a sub-cost:
for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive: delivering to one city saves part of the distance to the other (in a tree arrangement).
• Example of subadditive with equality (= rather than <): deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where exceeding a threshold forces buying new equipment.
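The definition can be verified over all subsets of a finite task set. A sketch using assumed return-to-base parcel costs (with the couriers returning home, c(a)=c(b)=2 and c(ab)=4, the domain is subadditive; the one-way costs from the earlier slides are not):

```python
from itertools import chain, combinations

# Subadditivity check: c(X ∪ Y) <= c(X) + c(Y) for all task subsets.
COST_RETURN = {frozenset(): 0, frozenset("a"): 2,
               frozenset("b"): 2, frozenset("ab"): 4}

def powerset(tasks):
    t = list(tasks)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(t, r) for r in range(len(t) + 1))]

def is_subadditive(cost, tasks):
    subs = powerset(tasks)
    return all(cost[x | y] <= cost[x] + cost[y] for x in subs for y in subs)

print(is_subadditive(COST_RETURN, "ab"))  # True
```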
91
Decoy Task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
– Needing to pick something up at a store: you can invent something for the others to pick up, but if you are the one assigned, you won't bother to make the trip.
– Needing to deliver an empty letter: no good to anyone, but the deliverer won't discover the lie.
92
Incentive Compatible Mechanisms
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of the arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• Take two sets of tasks X and Y where X ⊆ Y, and introduce another set of tasks Z:
c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
• The cost Z adds to the larger set Y cannot be greater than the cost it adds to the subset X.
95
Tentative Explanation of the Previous Chart
• I think the arrows show reasons we know each fact (diagonal arrows are between domains); a rule beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth-telling more than the others.
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost tasks Z add to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y. We expect Z to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No – unless restricted to trees.)
• Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as the agent was going that way anyway), but adds 2 to its superset Y (as the agent was going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database queries:
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The fax domain:
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs, minus the cost of their intersection.
• Only the fax domain is modular (as connection costs are independent).
• Modularity implies concavity.
101
3-dimensional table of the characterization of relationships (implied relationships between cells; implied relationships within the same domain attribute)
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (postmen return home)
FP1: in a subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides its letter to c, its utility does not increase.
• If it tells the truth, p = 1/2, and its expected utility under the all-or-nothing deal [(abc, ∅), 1/2] is 5.
• If it lies, p = 1/2 (as the apparent utility is the same), but its true expected utility is 1/2(0) + 1/2(2) = 1, as it still has to deliver the hidden letter itself.
[Figure: delivery graph with edge costs 1, 4, 4, 1.]
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out).
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (the assumed extra work would shift the assigned probabilities to reflect it).
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular costs add exactly – hard to win by lying).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(We assume there is some way of deciding who gets the better deal that is fair over time.)
Agent 1 | U(1) | Agent 2 | U(2) seems | U(2) actual
a       | 2    | bc      | 4          | 4
b       | 4    | ac      | 2          | 2
bc      | 2    | a       | 4          | 2
ab      | 0    | c       | 6          | 6
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial.
• Example (next slide): A1 creates a phantom letter at node c; its utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of [(a, b), 1/2] is 1/2(4) + 1/2(2) = 3.
• Lie: (bc, a) is the logical division, as a pure deal has no probability split. The utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (and not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h too").
• Example 2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); its utility rises from 1.5 to 31/18 ≈ 1.72. (If A1 delivers, it doesn't actually deliver to h.)
• If A1 tells the truth, p (the probability of agent 1 delivering everything) = 9/14, since
p(−1) + (1−p)(6) = p(4) + (1−p)(−3)  ⇒  14p = 9
• If A1 invents task h, p = 11/18, since
p(−3) + (1−p)(6) = p(4) + (1−p)(−5)  ⇒  18p = 11
• Utility at p = 9/14: p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility at p = 11/18: p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
107
Postmen – return to the post office
[Figure: three delivery graphs – a concave example, a subadditive example (h is the decoy), and a phantom example.]
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, "hiding" lies can be beneficial (you make me think you have less work, so an increased load appears to cost you more than it really does).
• Example 3 (next slide): A1 hides its letter to node b.
• (e, b): utility for A1 under the lie is 0; utility for A2 under the lie is 4 – unfair under the lie.
• (b, e): utility for A1 under the lie is 2; utility for A2 under the lie is 2.
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hiding" lies can be beneficial.
• Example 4: A1 hides its letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under the truth: Util([(fae, bcd), 1/2]) = 4 (each saves going to two nodes).
• Under the lie, a division like (ef, dcab) with probability p fails: you always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with [(ab, cdef), p]:
p(4) + (1−p)(0) = p(2) + (1−p)(6)
4p = −4p + 6
p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time), I still have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places – which is what I was assigned originally: zero utility from that.
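The FP8 arithmetic checks out the same way as FP6. A sketch verifying p = 3/4 and the 4.5 utility (`solve_p` is the same illustrative helper):

```python
from fractions import Fraction as F

# Mixed deal [(ab, cdef), p] under the hiding lie: apparent utilities are
# (4, 0) for one role split and (2, 6) for the other.
def solve_p(u_a, u_b, v_a, v_b):
    return F(v_b - u_b, (u_a - u_b) - (v_a - v_b))

p = solve_p(4, 0, 2, 6)
print(p)                    # 3/4
print(p * 6 + (1 - p) * 0)  # 9/2 = 4.5: A1's true utility under the lie
```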
110
[Figure: the modular postmen example referenced by FP7 and FP8.]
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: a negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
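The retrieve-and-select steps can be sketched as a nearest-neighbor lookup over case features. Everything below (case names, the wage/profit encoding, the similarity measure) is invented for illustration and is not from PERSUADER:

```python
# Illustrative sketch of case-based retrieval for negotiation precedents.
# Cases and features are hypothetical, not from the PERSUADER system.

def similarity(case_features, current_features):
    """Inverse-distance similarity over shared numeric features."""
    dist = sum((case_features[k] - current_features[k]) ** 2
               for k in current_features) ** 0.5
    return 1.0 / (1.0 + dist)

def retrieve(case_base, current):
    """Return the precedent case most similar to the current dispute."""
    return max(case_base, key=lambda c: similarity(c["features"], current))

# hypothetical precedents: wage disputes encoded by (wage increase, profitability)
case_base = [
    {"name": "mill-1979", "features": {"wage_pct": 4, "profit": 7},
     "solution": "4% raise"},
    {"name": "plant-1981", "features": {"wage_pct": 9, "profit": 2},
     "solution": "3% raise + job security"},
]
best = retrieve(case_base, {"wage_pct": 8, "profit": 3})
print(best["name"])  # plant-1981: the closer precedent
```

The remaining steps (evaluate, modify) would then adapt `best["solution"]` to the current dispute.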
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30 more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
Worth function for first agent
[figure: worth rises from 0 at 0900 hrs to 100 at 1600 hrs]
Ref: Rosenschein & Zlotkin, 1994
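The first agent's worth function from the figure can be sketched as a piecewise-linear curve; the linear shape between 0900 and 1600 is an assumption read off the figure:

```python
# Sketch of the first agent's worth function: 0 at 0900 hrs rising to 100 at
# 1600 hrs. Linearity between the endpoints is assumed from the figure.

def worth_agent1(hour):
    """Worth of a meeting at the given hour for the late-preferring agent."""
    if hour <= 9:
        return 0.0
    if hour >= 16:
        return 100.0
    return 100.0 * (hour - 9) / (16 - 9)

print(worth_agent1(16))   # 100.0: the preferred time
print(worth_agent1(12.5)) # 50.0: halfway between 0900 and 1600
```

Conceding from 1600 toward noon then corresponds to sliding down this curve.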
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[figure: utility of Agent i and Agent j vs. number of negotiations; the curves converge at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement – Agent j finds the offer unacceptable
[figure: utility of Agent i and Agent j vs. number of negotiations; the curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A and accept A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
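The hierarchy su ⊆ sa ⊆ d ⊆ a follows directly from the definitions above; a small sketch over relations as sets of (attacker, target) pairs makes it concrete (the toy u and r relations are invented for illustration):

```python
# Derived attack relations from undercut (u) and rebut (r) relations,
# represented as sets of (attacker, target) pairs, per the slide's algebra.

def inverse(rel):
    return {(b, a) for (a, b) in rel}

def derived(u, r):
    a = u | r                  # attacks
    d = u | (r - inverse(u))   # defeats
    sa = (u | r) - inverse(u)  # strongly attacks
    su = u - inverse(u)        # strongly undercuts
    return a, d, sa, su

# toy example: x undercuts y, y rebuts x, x rebuts z
u = {("x", "y")}
r = {("y", "x"), ("x", "z")}
a, d, sa, su = derived(u, r)
print(su <= sa <= d <= a)  # True: the hierarchy of attacks holds
```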
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked (w.r.t. a set of arguments) if some y with y → x is not itself attacked by any member of the set
2. argument x is acceptable if every attacker of x is attacked
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
[figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable
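These definitions can be run mechanically. The slide's attack graph was a figure, so the graph below is a hypothetical stand-in chosen to match the stated outcome (c always attacked, d always acceptable); the grounded-extension computation itself is the standard least fixed point of the "acceptable" operator:

```python
# Minimal grounded-extension computation for an abstract argument system.
# The attack graph is a hypothetical example: a and b attack each other,
# both attack c, and d is unattacked.

attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}
args = {"a", "b", "c", "d"}

def acceptable(x, s):
    """x is acceptable w.r.t. set s if s attacks every attacker of x."""
    attackers = {y for (y, z) in attacks if z == x}
    return all(any((m, y) in attacks for m in s) for y in attackers)

def grounded(args):
    """Iterate F(S) = {x : x acceptable w.r.t. S} to its least fixed point."""
    s = set()
    while True:
        nxt = {x for x in args if acceptable(x, s)}
        if nxt == s:
            return s
        s = nxt

print(sorted(grounded(args)))  # ['d']: only the unattacked argument is in
```

Here a and b defeat each other inconclusively, so neither (nor c, which they attack) makes it into the grounded extension.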
133
An Example Abstract Argument System
32
Redistribution of Tasks
Given a TOD ⟨T, {A1, A2}, c⟩: T is the original assignment; D is the assignment after the "deal"
• An encounter (instance) within the TOD is an ordered list (T1, T2) such that for all k, Tk ⊆ T. This is an original allocation of tasks that they might want to reallocate
• A pure deal on an encounter is a redistribution of tasks among the agents (D1, D2) such that all tasks are reassigned:
D1 ∪ D2 = T1 ∪ T2
Specifically, (D1, D2) = (T1, T2) is called the conflict deal
• For each deal δ = (D1, D2), the cost of the deal to agent k is Costk(δ) = c(Dk) (i.e., the cost to k of the deal is the cost of Dk, k's part of the deal)
33
Examples of TOD
• Parcel Delivery
Several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel to each courier is minimal.
• Database Queries
Several agents have access to a common database and each has to carry out a set of queries. The target of negotiation is to arrange queries so as to maximize the efficiency of database operations (Join, Projection, Union, Intersection, ...). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab) – the conflict deal
35
Figure deals knowing union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack"
• a for agent 1 → b | ab for agent 2
• b for agent 1 → a | ab
• ab for agent 1 → ∅ | a | b | ab
• ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0
37
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
[figure: distribution point at distance 1 from city a and from city b; a and b are distance 2 apart]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (orig: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (orig: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1, 2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates it (you cannot improve the deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
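The whole pipeline (enumerate deals, compute utilities, filter for individual rationality and Pareto optimality) is small enough to run; a sketch over this exact example, with the slide's cost function:

```python
# Recompute the parcel-delivery example: enumerate the pure deals, compute
# utilities, then filter for individual rationality and Pareto optimality.
from itertools import product

def cost(tasks):  # c(∅)=0, c(a)=1, c(b)=1, c(ab)=3, from the slide
    return {frozenset(): 0, frozenset("a"): 1,
            frozenset("b"): 1, frozenset("ab"): 3}[frozenset(tasks)]

T1, T2 = "a", "ab"  # stand-alone task sets of agents 1 and 2
deals = [(d1, d2) for d1, d2 in product(["", "a", "b", "ab"], repeat=2)
         if set(d1) | set(d2) == set(T1) | set(T2)]  # all tasks reassigned

def utility(deal):
    return (cost(T1) - cost(deal[0]), cost(T2) - cost(deal[1]))

def dominates(d, e):
    ud, ue = utility(d), utility(e)
    return (all(x >= y for x, y in zip(ud, ue))
            and any(x > y for x, y in zip(ud, ue)))

rational = [d for d in deals if all(u >= 0 for u in utility(d))]
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in rational if d in pareto]
print(ns)  # [('', 'ab'), ('a', 'b'), ('b', 'a')]: matches the slides
```

This reproduces the nine deals, the five individually rational ones, and the three-element negotiation set shown on the following slides.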
41
Individual Rational for Both (eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
(deal 3, (ab, ∅), is (−2, 3): dominated for agent 1, but nothing beats 3 for agent 2)
43
Negotiation Set
Negotiation Set: (a, b), (b, a), (∅, ab)
Individual Rational Deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto Optimal Deals: (a, b), (b, a), (ab, ∅), (∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?
[figure: axes Utility for i vs. Utility for j]
45
Negotiation Set in Task-oriented Domains
[figure: deals A, B, C, D, E plotted by utility for agent i vs. utility for agent j; the conflict deal sits at the utilities of the conflict deal for each agent; the circle delimits the space of all possible deals; the negotiation set (Pareto optimal + individual rational) is the boundary arc dominating the conflict deal]
46
Negotiation Protocol: π(δ) – product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: one-step protocol
– Concession protocol
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t) such that:
– both deals are from the negotiation set
– ∀i and t > 0: Utilityi(δ(i, t)) ≤ Utilityi(δ(i, t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i, t)) = Utilityi(δ(i, t−1))
– Agreement: ∃j ≠ i, Utilityj(δ(i, t)) ≥ Utilityj(δ(j, t))
• Only A ⇒ agree on δ(B, t) – A agrees with B's proposal
• Only B ⇒ agree on δ(A, t) – B agrees with A's proposal
• Both A and B ⇒ agree on the δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A and B with π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same but the split may not be the same for each agent – flip a coin to decide which deal to use)
(applies to pure deals and to mixed deals)
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession (assumes we know what the other agent values)
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal
• Meta-data: explanation or critique of the deal
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than their own proposal:
Utility1(δ2) ≥ Utility1(δ1) and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (it is a way of telling the other what you value)
[figure: Agent 1's best deal at one end, Agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession
How much am I willing to risk a conflict?
[figure: Agent 1's best deal at one end, Agent 2's best deal at the other]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when the difference between its loss from conceding and its loss from causing a conflict (with respect to its current offer) is small
• If both are equally willing to risk conflict, both concede
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
riski = [Utilityi(δi) − Utilityi(δj)] / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j respectively
riski is the willingness to risk conflict (1 is perfectly willing to risk)
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[figure: Ai's best deal and Aj's best deal at opposite ends, conflict deal below; arrows mark "How much am I willing to risk a conflict?", "Maximum to gain from agreement", "Maximum still hope to gain"]
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to
Q: What if both have equal risk?
A: Both concede
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain: recall agent 1 delivers to a, agent 2 delivers to a and b
Negotiation Set: (a, b), (b, a), (∅, ab)
First offers: Agent 1 offers (∅, ab); Agent 2 offers (a, b)
Utility of agent 1: Utility1(a, b) = 0, Utility1(b, a) = 0, Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2, Utility2(b, a) = 2, Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[figure: Agent 1's best deal and Agent 2's best deal, each labeled "He should concede"]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground
61
Parcel Delivery Domain, Example 2 (don't return to distribution point)
[figure: distribution point at distance 7 from both a and d; a–b, b–c, and c–d are each distance 1]
Cost function: c(∅)=0, c(a)=c(d)=7, c(b)=c(c)=c(ab)=c(cd)=8, c(bc)=c(abc)=c(bcd)=9, c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; e.g. (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)
No. | Pure Deal | Agent 1's Utility | Agent 2's Utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
Conflict deal | | 0 | 0
[figure: agent 1 concedes 5 → 4 → 3 while agent 2 concedes 1 → 2 → 3]
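The concession sequence in this table can be traced mechanically. The sketch below assumes the deals are ordered as in the table and that a concession means moving one deal toward the middle; the risk formula is the one from the earlier slide:

```python
# Zeuthen strategy on the five-deal table above: each agent starts at its
# best deal; the agent with lower risk concedes one step (both on a tie),
# until one finds the other's offer at least as good as its own.

utils = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]  # (agent 1, agent 2)

def risk(agent, mine, theirs):
    u_mine, u_theirs = utils[mine][agent], utils[theirs][agent]
    if u_mine == 0:
        return 1.0  # nothing left to lose: perfectly willing to risk conflict
    return (u_mine - u_theirs) / u_mine

i, j = 4, 0  # agent 1 proposes deal 5, agent 2 proposes deal 1
while utils[j][0] < utils[i][0] and utils[i][1] < utils[j][1]:
    r1, r2 = risk(0, i, j), risk(1, j, i)
    if r1 <= r2:
        i -= 1  # agent 1 concedes one step toward the middle
    if r2 <= r1:
        j += 1  # agent 2 concedes one step toward the middle
print(utils[i], utils[j])  # (2, 2) (2, 2): both arrive at deal 3
```

At each round the risks are equal (1, then 2/3), so both concede, and the agents meet at the (2, 2) deal, matching the slide.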
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than, say, the (0, 10) outcome with higher total utility
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy; in fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent; example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can always just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
The state-oriented domain is a bit more powerful than TOD
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in future negotiations
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1: costs 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent gets a certain plan
• Called a mixed deal – a deal with probability; compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p·c(J)i + (1−p)·c(J)k, where k is i's opponent (the role i plays with probability 1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[figure: distribution point at distance 1 from city a and from city b; a and b are distance 2 apart]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (orig: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (orig: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
73
Consider deal 3 with probability
• (ab, ∅)p means agent 1 does ∅ with probability p and ab with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
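Since both expected utilities are linear in p, the fair probability can be solved in one line; a sketch checking the 5/6 result with exact arithmetic:

```python
# Check the fair probability for mixed deal 3: agent 1 does nothing with
# probability p and everything with probability 1-p.
from fractions import Fraction

def expected(p, u_if_p, u_if_not):
    return p * u_if_p + (1 - p) * u_if_not

# agent 1: (1-p)(-2) + p(1) = -2 + 3p    agent 2: (1-p)(3) + p(0) = 3 - 3p
# linear equation a1 + b1*p = a2 + b2*p  =>  p = (a2 - a1) / (b1 - b2)
p = Fraction(3 - (-2), 3 - (-3))
print(p)                                      # 5/6
print(expected(p, 1, -2), expected(p, 0, 3))  # 1/2 1/2: equal for both
```

Both agents end up with expected utility 1/2, so the lottery equalizes an otherwise one-sided deal.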
74
Try again with other choice in negotiation set
• (a, b)p means agent 1 does a with probability p and b with probability 1−p
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(TA ∪ TB, ∅) : p] with m ∈ NS such that π(m) = max over deals d in NS of π(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
77
Examples, Cooperative: each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (both goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four, so the utility to each is 2
78
Examples, Compromise: both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but with unequal roles
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair"
• Assign the (2, 6) cost split (A1 takes the easy role) p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes cost 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing; the weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone): efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Examplesemi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability 1/2: (1/2)(12) + (1/2)(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle): it isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
  – Utility = price×0.60 + quality×0.15 + support×0.25
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – Price: [5, 100], quality: [0, 10], support: [1, 5]
  – Try to find the Pareto optimum
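A minimal sketch of the weighted-attribute scheme above. The 0.60/0.15/0.25 weights are the slide's; the offers and their attribute scores are made up, and `price_score` assumes prices were already converted so that higher means cheaper:

```python
# Sketch: scoring alternatives with the slide's attribute weights.
# Offers and attribute values are hypothetical illustration data.

def utility(price_score, quality, support):
    return 0.60 * price_score + 0.15 * quality + 0.25 * support

offers = {
    "offer_A": utility(price_score=8, quality=6, support=3),
    "offer_B": utility(price_score=5, quality=9, support=5),
}
best = max(offers, key=offers.get)  # pick the highest weighted utility
print(offers, best)
```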
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
  – Exchange missing information
  – Penalty for a lie
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y ⊆ T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – delivery to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  – deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces, e.g., electrical power costs where I get above a threshold and have to buy new equipment
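The subadditivity condition can be checked by brute force over a cost table; the table below is a made-up two-city example, not one from the slides:

```python
# Sketch: brute-force check that a task-set cost function is subadditive,
# i.e. c(X | Y) <= c(X) + c(Y) for all finite task sets X, Y.
cost = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 2}

def is_subadditive(cost):
    sets = list(cost)
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in sets for y in sets)

print(is_subadditive(cost))
```

With c(ab) raised to 3 instead of 2, the check would fail, since 3 > c(a) + c(b) = 2.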
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  • Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  • Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
  – c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know each fact (diagonal arrows are between domains); each rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At seats – is the postmen domain concave? (no, unless restricted to trees)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon
• Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
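The concavity and modularity conditions from the surrounding slides can be checked the same brute-force way; the cost function here (one unit per distinct fax destination) is a made-up fax-style instance, chosen because independent per-connection costs are exactly what makes the Fax Domain modular:

```python
# Sketch: brute-force checks of modularity and concavity for a task-set
# cost function (fax-style: cost = number of distinct destinations).
from itertools import combinations

tasks = "abc"
cost = {frozenset(s): len(s) for r in range(len(tasks) + 1)
        for s in combinations(tasks, r)}
sets = list(cost)

def is_modular(c):
    # c(X | Y) == c(X) + c(Y) - c(X & Y) for all X, Y
    return all(c[x | y] == c[x] + c[y] - c[x & y] for x in sets for y in sets)

def is_concave(c):
    # for every X subset of Y and any Z: c(Y|Z) - c(Y) <= c(X|Z) - c(X)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in sets for y in sets if x <= y for z in sets)

print(is_modular(cost), is_concave(cost))
```

As the slide states, modularity implies concavity, so both checks pass here.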
101
3-dimensional table of Characterization of Relationship: implied relationship between cells; implied relationship with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over A-or-N deals: "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util: ((abc), ∅) : 1/2 = 5
• Lie: p = 1/2 (as the apparent utility is the same)
• Expected util (for 1): ((abc), ∅) : 1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
103
• FP2: in Subadditive TOD, any ONM over mixed deals: every "phantom" lie has a positive probability of being discovered (if the other person delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over mixed deals: no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over pure deals: no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal), just as in this case; the lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 gets | U(1) | Agent 2 gets | U(2) seems | U(2) actual
a            |  2   | bc           | 4          | 4
b            |  4   | ac           | 2          | 2
bc           |  2   | a            | 4          | 2
ab           |  0   | c            | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over pure deals: "phantom" lies can be beneficial
• Example from next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so utility for agent 1 is ((ab), ∅) : ½ = ½(4) + ½(2) = 3
• Lie: (b, ca) is the logical division, as there is no percentage split
• Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals: "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
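The p computations above follow one pattern: choose p so both agents' expected utilities from the all-or-nothing lottery are equal. A sketch using FP6's payoff numbers:

```python
# Sketch of FP6's arithmetic: solve p*a_win + (1-p)*a_lose ==
# p*b_win + (1-p)*b_lose for the probability p that equalizes
# the two agents' expected utilities in an all-or-nothing deal.
from fractions import Fraction

def equalizing_p(a_win, a_lose, b_win, b_lose):
    # p*(a_win - a_lose - b_win + b_lose) == b_lose - a_lose
    return Fraction(b_lose - a_lose, a_win - a_lose - b_win + b_lose)

truth = equalizing_p(-1, 6, 4, -3)     # 9/14
lie   = equalizing_p(-3, 6, 4, -5)     # 11/18

def util_agent1(p):                    # agent 1's true payoffs: -1 or 6
    return p * -1 + (1 - p) * 6

print(truth, lie, util_agent1(truth), util_agent1(lie))
```

Agent 1's true expected utility rises from 21/14 to 31/18, matching the slide's 1.5 vs. 1.72.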
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, any ONM over pure deals: a "hide" lie can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over mixed deals: "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: util((fae), (bcd)) : 1/2 = 4 (save going to two)
• Under the lie, divide as ((efd), (cab)) : p – you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie: ((ab), (cdef)) : p
  p(4) + (1-p)(0) = p(2) + (1-p)(6), so 4p = -4p + 6, and p = 3/4
• Utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going 5 places (which is what I was assigned originally) – zero utility for that
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposal
  – Generation of counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures: intentional reminding
  – Repair for a past failure is reused, which reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Changing the importance attached to an issue
  – Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff for the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
[Figure: worth function for the first agent – worth rises from 0 at 9:00 through 12:00 to 100 at 16:00]
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offers converge over time to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement
• Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – the agents' offer curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts):
  – A attacks B ≡ A →u B or A →r B
  – A defeats B ≡ A →u B or (A →r B and not B →u A)
  – A strongly attacks B ≡ A attacks B and not B →u A
  – A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set if some member y of the set has y → x
2. Argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set
3. An argument set is conflict free if none of its members attack each other
4. A set is admissible if it is conflict free and each argument in it is acceptable (any attackers are attacked)
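These definitions can be checked mechanically (Dung-style). The attack graph below is a made-up example, not the one drawn on the next slide:

```python
# Sketch: conflict-freeness, acceptability, and admissibility for an
# abstract argument system.  (y, x) in `attacks` means y attacks x.
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is itself attacked by some member of s."""
    attackers = [y for (y, z) in attacks if z == x]
    return all(any((m, y) in attacks for m in s) for y in attackers)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

print(admissible({"a", "d"}), admissible({"c"}))
```

Here {a, d} is admissible (d's attacker c is attacked by a), while {c} is not, since c cannot defend itself against a or b.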
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
33
Examples of TOD
• Parcel Delivery: several couriers have to deliver sets of parcels to different cities. The target of negotiation is to reallocate deliveries so that the cost of travel for each courier is minimal.
• Database Queries: several agents have access to a common database and each has to carry out a set of queries. The target of negotiation is to arrange the queries so as to maximize the efficiency of database operations (join, projection, union, intersection, …). "You are doing a join as part of another operation, so please save the results for me."
34
Possible Deals
Consider an encounter from the Parcel Delivery Domain. Suppose we have two agents: both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab) – the conflict deal
35
Figure deals knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack"
• a for agent 1 → b | ab (for agent 2)
• b for agent 1 → a | ab
• ab for agent 1 → ∅ | a | b | ab
• ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
  Utility_k(δ) = c(T_k) − Cost_k(δ) = c(T_k) − c(D_k)
where δ = (D1, D2) is a deal:
– c(T_k) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
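A minimal sketch of Utility_k(δ) = c(T_k) − c(D_k), using the cost table of the parcel example on the next slide (c(a) = c(b) = 1, c(ab) = 3, no return trip):

```python
# Sketch of the deal utility function for the two-agent parcel example.
cost = {"": 0, "a": 1, "b": 1, "ab": 3}

def utility(standalone_tasks, deal_part):
    """Stand-alone cost minus the cost of the agent's part of the deal."""
    return cost[standalone_tasks] - cost[deal_part]

# Agent 1 originally delivers to a; agent 2 to both a and b.
u1 = utility("a", "")      # deal (empty, ab): agent 1 does nothing
u2 = utility("ab", "ab")   # agent 2 delivers everything
print(u1, u2)
```

This reproduces the slide's Utility1(∅, ab) = 1 and Utility2(∅, ab) = 0.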
37
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
[Figure: distribution point with city a and city b, each 1 unit away; a and b are 2 units apart]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (original task: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
  – δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utility_k(δ) ≥ Utility_k(δ′)
  – δ is better for some agent than δ′: ∃k ∈ {1,2}, Utility_k(δ) > Utility_k(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
40
Utility Function for Agents (example from the previous slide)
Agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = -2
8. Utility1(ab, b) = -2
9. Utility1(ab, ab) = -2
Agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
All deals:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individual rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
(a, ab) is beaten by the (∅, ab) deal; (ab, ∅) is (-2, 3), but nothing beats 3 for agent 2.
43
Negotiation Set
Individual rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
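The intersection above can be computed mechanically; the utility table transcribes the example from the preceding slides (the empty string stands for ∅):

```python
# Sketch: negotiation set = individual rational AND Pareto optimal deals.
utilities = {                      # deal (D1, D2): (utility1, utility2)
    ("a", "b"): (0, 2),   ("b", "a"): (0, 2),   ("ab", ""): (-2, 3),
    ("", "ab"): (1, 0),   ("a", "ab"): (0, 0),  ("b", "ab"): (0, 0),
    ("ab", "a"): (-2, 2), ("ab", "b"): (-2, 2), ("ab", "ab"): (-2, 0),
}

def dominates(u, v):
    """u is at least as good for both and strictly better for one."""
    return u[0] >= v[0] and u[1] >= v[1] and (u[0] > v[0] or u[1] > v[1])

individual_rational = {d for d, u in utilities.items()
                       if u[0] >= 0 and u[1] >= 0}
pareto_optimal = {d for d, u in utilities.items()
                  if not any(dominates(v, u) for v in utilities.values())}

negotiation_set = individual_rational & pareto_optimal
print(sorted(negotiation_set))
```

This recovers exactly the three deals listed above: (a, b), (b, a), and (∅, ab).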
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j
• Only deals where both utilities are positive are individually rational (for both); the origin is the conflict deal
• Which are Pareto optimal?
Utility for i
Utility for j
45
Negotiation Set in Task-oriented Domains
AC
B
D
E
Utility for agent i
Utility for agent j
Utility of conflict Deal for agent i
Utility of conflict Deal for agent j
Conflict deal
The circle delimits the space of all possible deals
Negotiation set
(pareto optimal+
Individual rational)
46
Negotiation Protocol: π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol:
  – One-step protocol
  – Concession protocol
• At t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that:
  – Both deals are from the negotiation set
  – ∀i and t > 0: Utility_i(δ(i,t)) ≤ Utility_i(δ(i,t-1)) – I propose something less desirable for me
• Negotiation ending:
  – Conflict: Utility_i(δ(i,t)) = Utility_i(δ(i,t-1))
  – Agreement: ∃j ≠ i, Utility_j(δ(i,t)) ≥ Utility_j(δ(j,t)) – some agent likes the other's proposal at least as much as its own
    • Only A ⇒ agree on δ(B,t)
    • Only B ⇒ agree on δ(A,t)
    • Both A and B ⇒ agree on the δ(k,t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
    • Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Pure deals
Mixeddeal
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession (this assumes we know what the other agent values)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Metadata: explanation or critique of the deal
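The rules above can be sketched as a protocol loop. The strategy objects (with a hypothetical `propose(round, last_own, last_other)` method), the utility table, and the violation handling are assumptions; the slides specify only the protocol itself:

```python
# Sketch of the Monotonic Concession Protocol loop.
# u maps deal -> (utility1, utility2); agents are strategy objects.
def monotonic_concession(u, agent1, agent2, conflict_deal, max_rounds=100):
    o1 = o2 = None
    for rnd in range(max_rounds):
        n1 = agent1.propose(rnd, o1, o2)
        n2 = agent2.propose(rnd, o2, o1)
        # monotonicity: never offer the other side less than before
        if o1 is not None and (u[n1][1] < u[o1][1] or u[n2][0] < u[o2][0]):
            raise ValueError("protocol violation: an offer got worse")
        if u[n2][0] >= u[n1][0]:      # agent 1 accepts agent 2's proposal
            return n2
        if u[n1][1] >= u[n2][1]:      # agent 2 accepts agent 1's proposal
            return n1
        if (n1, n2) == (o1, o2):      # neither conceded -> conflict
            return conflict_deal
        o1, o2 = n1, n2
    return conflict_deal
```

Any strategy plugged in must concede monotonically; the loop enforces that and falls back to the conflict deal when neither side moves.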
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than their own proposal:
  Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
bull Advantages
ndash Symmetrically distributed (no agent plays a special role)
ndash Ensures convergence
ndash It will not go on indefinitely
bull Disadvantages
ndash Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (It is a way of telling others what you value.)
Agent 1s best deal agent 2s best deal
52
The Zeuthen Strategy
Q: I make a proposal in every round, but it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal at opposite ends, each asking "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict reflects the difference between its loss from making a concession and its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk conflict, both concede
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• how much you will lose if you make a concession and accept your opponent's offer
• how much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is willingness to risk conflict (1 means perfectly willing to risk it).
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility function
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
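A minimal sketch of the risk formula above; the utilities in the example are made up, and the tie-handling convention (risk = 1 when your offer is worth 0) is a common one rather than something the slides state:

```python
# Sketch of Zeuthen risk: what fraction of my current offer's utility
# do I lose by conceding, relative to losing it all in a conflict?
def risk(own_util_of_own_offer, own_util_of_other_offer):
    if own_util_of_own_offer == 0:      # nothing to lose -> fully willing
        return 1.0
    return ((own_util_of_own_offer - own_util_of_other_offer)
            / own_util_of_own_offer)

# Hypothetical round: i values its offer at 4 and j's offer at 1;
# j values its own offer at 3 and i's offer at 2.
r_i = risk(4, 1)    # i stands to lose 3/4 of its gain by conceding
r_j = risk(3, 2)    # j stands to lose only 1/3

if r_i < r_j:
    concede = "i"
elif r_j < r_i:
    concede = "j"
else:
    concede = "both"
print(r_i, r_j, concede)
```

The agent with the lower risk value (here j) has less to lose by conceding and so concedes, per the Zeuthen rule.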
56
The Risk Factor
One way to think about which agent should
concede is to consider how much each has to loose
by running into conflict at that point
Ai best deal Aj best deal
Conflict deal
How much am I willing to risk a conflict
Maximum to gain from agreement
Maximum still hope to gain
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
bull Advantages
ndash Simple and reflects the way human negotiations work
ndash Stability ndash in Nash equilibrium ndash if one agent is using the strategy
then the other can do no better than using it himherself
bull Disadvantages
ndash Computationally expensive ndash players need to compute the entire
negotiation set
ndash Communication burden ndash negotiation process may involve
several steps
59
Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
Negotiation Set
(a b)
(b a)
( ab)
First offer
( ab)
(a b)
Agent 1
Agent 2
Utility of agent 1
Utility1(a b) = 0
Utility1(b a) = 0
Utility1( ab)=1
Utility of agent 2
Utility2(a b) =2
Utility2(b a) = 2
Utility2( ab)=0
Risk of conflict
1
1
Can they reach an agreement? Who will concede?
60
Conflict Deal
He should concede
Agent 1s best deal agent 2s best deal
He should concede
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain: Example 2 (don't return to the distribution point)
[Figure: delivery graph – a and d are each 7 units from the distribution point, with b and c at unit distances beyond]
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain: Example 2 (Zeuthen works here; both concede on equal risk)

No. | Pure Deal  | Agent 1's Utility | Agent 2's Utility
1   | (abcd, ∅)  | 0                 | 10
2   | (abc, d)   | 1                 | 3
3   | (ab, cd)   | 2                 | 2
4   | (a, bcd)   | 3                 | 1
5   | (∅, abcd)  | 10                | 0
–   | Conflict deal | 0              | 0

[Figure: agent 1 starts at deal 5 and agent 2 at deal 1, conceding from opposite ends toward deal 3]
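Assuming the table above, and that each agent concedes one step along its own preference order (deal indices 5→1 for agent 1, 1→5 for agent 2), the Zeuthen run can be sketched as:

```python
# Sketch: Zeuthen strategy on Example 2's deal table.
# deals maps index -> (agent 1's utility, agent 2's utility).
deals = {1: (0, 10), 2: (1, 3), 3: (2, 2), 4: (3, 1), 5: (10, 0)}

def risk(agent, own, other):
    mine = deals[own][agent]
    if mine == 0:
        return 1.0
    return (mine - deals[other][agent]) / mine

def zeuthen(offer1=5, offer2=1):
    while True:
        # agreement: one agent likes the other's offer at least as much
        if (deals[offer2][0] >= deals[offer1][0]
                or deals[offer1][1] >= deals[offer2][1]):
            return offer1 if deals[offer1][1] >= deals[offer2][1] else offer2
        r1 = risk(0, offer1, offer2)
        r2 = risk(1, offer2, offer1)
        if r1 <= r2:
            offer1 -= 1      # agent 1 concedes one step toward the middle
        if r2 <= r1:
            offer2 += 1      # agent 2 concedes one step toward the middle

print(zeuthen())   # both risks stay equal, so both concede each round
```

Both agents' risks are equal in every round, so both concede together (5/1, then 4/2) and meet at deal 3, the (2, 2) split.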
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of the (0, 10) choice
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, and no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role and written as J_k
• c(J)_k is the cost of k's role in joint plan J
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD, coordinated plans are never worse, as you can just do your original task
• With an SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– Goals may contradict, so there is no mutually acceptable state (e.g., both need a car).
– A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but may need to know how to pick up the other).
– There might be a reachable state which satisfies both, but it may be too expensive: unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that one agent would get a certain plan.
• This is called a mixed deal: a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent: the role i plays with probability 1−p.
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming do not have to return home)
[Figure: distribution point with city a and city b, each at distance 1; the distance between a and b is 2]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (original tasks: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with a probability
• (∅, ab):p means agent 1 does ∅ with probability p and ab with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
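The fairness calculation above can be checked mechanically: each agent's expected utility is linear in p, so setting the two lines equal gives p directly. A small sketch (the utilities are the ones from the parcel example; the helper name is my own):

```python
from fractions import Fraction

def fair_probability(u_a, u_b):
    """Solve for p so that both agents get equal expected utility.
    u_a, u_b: (utility_in_role_1, utility_in_role_2) for agents A and B,
    where role 1 happens with probability p and role 2 with 1 - p."""
    # Expected utility p*u[0] + (1-p)*u[1] is linear in p; equate the two lines.
    slope = (u_a[0] - u_a[1]) - (u_b[0] - u_b[1])
    if slope == 0:
        return None  # parallel lines: no single p equalizes the utilities
    return Fraction(u_b[1] - u_a[1], slope)

# Deal (∅, ab):p — agent 1 does nothing with probability p, everything with 1-p.
print(fair_probability((1, -2), (0, 3)))  # 5/6, matching the slide
# The (a, b):p deal from the next slide has no fair p:
print(fair_probability((0, 0), (2, 2)))   # None
```

The `None` case is exactly the "0 = 2, no solution" situation on the following slide: when each agent's utility does not depend on p, no lottery can equalize them.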
74
Try again with other choice in negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal of the form m = [(T_A ∪ T_B, ∅):p], chosen so that its product of utilities is maximal over all deals.
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero-sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias: like switching goals in soccer.
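The "try dividing a total utility of 10" exercise can be done by quick enumeration: the product u·(10−u) peaks at the even 5/5 split, which is why the product criterion favors fair divisions.

```python
# Divide a total utility of 10 (zero-sum) between two agents
# and see which split maximizes the product of utilities.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
for u1, u2 in splits:
    print(u1, u2, u1 * u2)
print("max product at", best)  # (5, 5): the most even division
```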
77
Examples, Cooperative: each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four, so the utility to each is 2.
78
Examples, Compromise: both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table. Agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) roles p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3): it loses utility if it takes the cost-6 role for a benefit of 3.
• Utility for agent 2 = p(0) + (1−p)(4).
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus I can take an unfair division and make it fair.
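The p = 7/8 above follows from the same equal-expected-utility calculation as before; a quick check with exact arithmetic (utilities per role taken from the slide):

```python
from fractions import Fraction

def expected(u_easy, u_hard, p):
    """Expected utility when the easy role comes up with probability p."""
    return p * u_easy + (1 - p) * u_hard

# Agent 1 (worth 3): utility 3-2 = 1 in the cheap role, 3-6 = -3 in the expensive one.
# Agent 2 (worth 6): utility 6-6 = 0 when agent 1 is cheap, 6-2 = 4 otherwise.
p = Fraction(7, 8)
u1 = expected(1, -3, p)
u2 = expected(0, 4, p)
print(u1, u2)  # both 1/2: the division is now fair
```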
81
Example, conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins: better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the person with the highest worth get his way, as his utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Examplesemi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12.
• Losing agent utility: −4 (as he helped with the swap).
• So with ½ probability each: ½(12) + ½(−4) = 4.
• If they could have both been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. We may reach a state that is a little worse than the ultimate objective.
• Example: multi-agent Tileworld (like an airport shuttle): it isn't just a specific state that counts, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as price, quality, and support: a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: we need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute:
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– Price [5, 100], quality [0, 10], support [1, 5]
– Try to find the Pareto optimum.
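The weighted-attribute scheme above can be realized as a simple weighted sum over normalized attribute scores. The weights come from the slide (60/15/25); the offers and their 0-1 scores below are made-up illustration data:

```python
# Weighted-sum multi-attribute utility; weights taken from the slide.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    """offer maps each attribute to a normalized score in [0, 1] (1 = best)."""
    return sum(WEIGHTS[attr] * score for attr, score in offer.items())

offers = {
    "cheap_basic": {"price": 0.9, "quality": 0.3, "support": 0.2},
    "premium":     {"price": 0.4, "quality": 0.9, "support": 0.9},
}
for name, offer in offers.items():
    print(name, round(utility(offer), 3))
# cheap_basic: 0.9*0.60 + 0.3*0.15 + 0.2*0.25 = 0.635
# premium:     0.4*0.60 + 0.9*0.15 + 0.9*0.25 = 0.600
```

With these (assumed) weights, the heavy emphasis on price makes the cheap offer score higher; a buyer who cares more about quality would change the weights, which is exactly the tradeoff the slide describes.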
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
– Exchange missing information
– Penalty for a lie
• Possible lies:
– False information:
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task-Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets: the union adds to a "sub-cost".
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive TOD (with = rather than <): deliveries in opposite directions; doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Needing to pick something up at the store. (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
• Needing to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task-Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
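The subadditive, concave, and modular conditions can all be checked by brute force over the subsets of a small task set. The cost function below (a fixed setup cost of 1 plus one unit per task) is an illustrative assumption, not from the slides:

```python
from itertools import combinations

TASKS = frozenset("abc")

def cost(s):
    # Illustrative cost: one unit of setup plus one unit per task.
    return 0 if not s else 1 + len(s)

def subsets(universe):
    return [frozenset(c) for r in range(len(universe) + 1)
            for c in combinations(sorted(universe), r)]

def is_subadditive(c, universe):
    return all(c(x | y) <= c(x) + c(y)
               for x in subsets(universe) for y in subsets(universe))

def is_concave(c, universe):
    # For X ⊆ Y: Z adds at least as much cost to X as it adds to Y.
    return all(c(x | z) - c(x) >= c(y | z) - c(y)
               for x in subsets(universe) for y in subsets(universe)
               for z in subsets(universe) if x <= y)

def is_modular(c, universe):
    return all(c(x | y) == c(x) + c(y) - c(x & y)
               for x in subsets(universe) for y in subsets(universe))

print(is_subadditive(cost, TASKS),  # True
      is_concave(cost, TASKS),      # True
      is_modular(cost, TASKS))      # False: the setup cost is shared, not summed
```

This matches the hierarchy on the later slides: the example is concave (hence subadditive) but not modular, since c({a} ∪ {b}) = 3 while c({a}) + c({b}) − c(∅) = 4.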
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular encourages truth-telling more than the others.
97
For subadditive domain
98
Attributesof task system-Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• Agents can exchange messages to be faxed.
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the fax domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth, p = 1/2; expected utility for (abc, ∅):1/2 is 5.
• Under the lie, p = 1/2 (as the apparent utility is the same); expected utility (for agent 1) of (abc, ∅):1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter).
103
• FP2: in subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out).
• FP3: in concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost: hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1's role | U(1) | Agent 2's role | U(2) (seems) | U(2) (actual)
a  | 2 | bc | 4 | 4
b  | 4 | ac | 2 | 2
bc | 2 | a  | 4 | 2
ab | 0 | c  | 6 | 6
105
Non-incentive compatible fixed points
• FP5: in concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example from the next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is (a, b):1/2 = ½(4) + ½(2) = 3.
• Lie: (bc, a) is the logical division, as there is no percentage. The utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability. (If you deliver, I make you deliver to h.)
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72. (If I deliver, I don't deliver to h.)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9.
• If he invents task h, p = 11/18, as
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5).
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
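Both probabilities in this example come from equating two expressions that are linear in p, so they can be reproduced exactly (the utilities are the ones given on the slide):

```python
from fractions import Fraction as F

def solve_p(x1, y1, x2, y2):
    """p such that p*x1 + (1-p)*y1 == p*x2 + (1-p)*y2."""
    return F(y2 - y1, (x1 - y1) - (x2 - y2))

# Truth: p(-1) + (1-p)(6) = p(4) + (1-p)(-3)
p_truth = solve_p(-1, 6, 4, -3)          # 9/14
# Decoy lie about task h: p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
p_lie = solve_p(-3, 6, 4, -5)            # 11/18
# Agent 1's *real* expected utility is p(-1) + (1-p)(6) in both cases,
# since the decoy letter never actually has to be delivered by him.
real = lambda p: p * F(-1) + (1 - p) * F(6)
print(p_truth, real(p_truth))  # 9/14, 3/2
print(p_lie, real(p_lie))      # 11/18, 31/18: the lie helped
```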
107
Postmen: return to the post office
[Figure: three delivery graphs, labeled concave; subadditive (h is the decoy); phantom]
108
Non incentive compatible fixed points
• FP7: in modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• (e, b): the utility for A1 (under the lie) is 0; the utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): the utility for A1 (under the lie) is 2; the utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((fae, bcd):1/2) = 4 (each saves going to two nodes).
• Under the lie, dividing as (efd, cab):p means you always win and I always lose. Since the work is the same, swapping cannot help: in a mixed deal the choices must be unbalanced.
• Try again under the lie: (abcdef, ∅):p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally): zero utility for that.
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or to create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law: cite previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Derives an overall utility curve from the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory: 30 more proposals.
• Without argumentation: fewer proposals and better solutions.
• No failure avoidance: more proposals with objections.
• No preference analysis: oscillatory condition.
• No feedback: communication overhead increased by 23.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent, ranging from 0 to 100 over meeting times from 9 to 16 hrs]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Figure: utility vs. number of negotiation rounds for Agent i and Agent j; the curves cross at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.
[Figure: utility vs. number of negotiation rounds for Agent i and Agent j; the curves never cross]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified. Why did the agent pay so much for the car?
– Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• Four modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept that B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet and shows the strength of their feelings.
4. Kisceral: appeals to the intuitive. "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: a pair (Sentence, Grounds) over a Database, where:
– Database is a (possibly inconsistent) set of logical formulae;
– Sentence is a logical formula, known as the conclusion;
– Grounds is a set of logical formulae such that Grounds ⊆ Database and Sentence can be proved from Grounds.
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B ≡ A u B or A r B
– A defeats B ≡ A u B or (A r B and not B u A)
– A strongly attacks B ≡ A a B and not B u A
– A strongly undercuts B ≡ A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments).
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: an argument is true unless proven false.
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked (by a set) if some member y of the set attacks it (y → x).
2. Argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
34
Possible Deals
Consider an encounter from the parcel delivery domain. Suppose we have two agents. Both agents have parcels to deliver to city a, and only agent 2 has parcels to deliver to city b. There are nine distinct pure deals in this encounter:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
plus the conflict deal.
35
Figure deals knowing union must be ab
• Choices for the first agent: a, b, ab, or ∅.
• The second agent must "pick up the slack":
• a for agent 1 → b | ab (for agent 2)
• b for agent 1 → a | ab
• ab for agent 1 → a | ab | b | ∅
• ∅ for agent 1 → ab
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, and is defined as follows:
Utility_k(δ) = c(T_k) − Cost_k(δ) = c(T_k) − c(D_k)
where δ = (D1, D2) is a deal:
– c(T_k) is the stand-alone cost to agent k (the cost of achieving its goal with no help);
– Cost_k(δ) is the cost of its part of the deal.
Note that the utility of the conflict deal is always 0.
37
Parcel Delivery Domain (assuming they do not have to return home, like U-Haul)
[Figure: distribution point with city a and city b, each at distance 1; the distance between a and b is 2]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (original tasks: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1, 2}, Utility_k(δ) ≥ Utility_k(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1, 2}, Utility_k(δ) > Utility_k(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone).
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
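The dominance definitions translate directly into code over utility vectors (the example utilities are the ones from the running parcel example):

```python
def dominates(d1, d2):
    """d1, d2: tuples of utilities, one entry per agent."""
    at_least_as_good = all(a >= b for a, b in zip(d1, d2))
    strictly_better = any(a > b for a, b in zip(d1, d2))
    return at_least_as_good and strictly_better

def weakly_dominates(d1, d2):
    return all(a >= b for a, b in zip(d1, d2))

# (a, b) gives utilities (0, 2); (b, ab) gives (0, 0).
print(dominates((0, 2), (0, 0)))        # True: better for agent 2, not worse for 1
print(dominates((0, 2), (0, 2)))        # False: no strict improvement anywhere
print(weakly_dominates((0, 2), (0, 2))) # True
```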
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you already have).
• A deal δ is called Pareto optimal if there does not exist another deal δ′ that dominates δ (the best deal for one agent without disadvantaging the other).
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from previous slide)
Deal: Utility1, Utility2
1. (a, b): 0, 2
2. (b, a): 0, 2
3. (ab, ∅): −2, 3
4. (∅, ab): 1, 0
5. (a, ab): 0, 0
6. (b, ab): 0, 0
7. (ab, a): −2, 2
8. (ab, b): −2, 2
9. (ab, ab): −2, 0
41
Individually Rational for Both (eliminate any choices that are negative for either)
All nine deals: (a, b), (b, a), (ab, ∅), (∅, ab), (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab)
Individually rational: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
All nine deals: (a, b), (b, a), (ab, ∅), (∅, ab), (a, ab), (b, ab), (ab, a), (ab, b), (ab, ab)
Pareto optimal: (a, b), (b, a), (ab, ∅), (∅, ab)
Note: (ab, ∅) has utilities (−2, 3); it survives because nothing beats 3 for agent 2.
43
Negotiation Set
Negotiation set (individually rational ∩ Pareto optimal): (a, b), (b, a), (∅, ab)
Individually rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
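The negotiation set on this slide can be recomputed from the nine utility pairs: filter for individual rationality (weakly dominates the conflict deal (0, 0)) and Pareto optimality (no other deal dominates), then intersect.

```python
# Utilities (agent 1, agent 2) for the nine pure deals of the example.
deals = {
    "(a, b)":  (0, 2),  "(b, a)":  (0, 2),  "(ab, ∅)": (-2, 3),
    "(∅, ab)": (1, 0),  "(a, ab)": (0, 0),  "(b, ab)": (0, 0),
    "(ab, a)": (-2, 2), "(ab, b)": (-2, 2), "(ab, ab)": (-2, 0),
}

def dominates(d1, d2):
    # All coordinates >= and the tuples differ => strict improvement somewhere.
    return all(a >= b for a, b in zip(d1, d2)) and d1 != d2

rational = {n for n, u in deals.items() if u[0] >= 0 and u[1] >= 0}
pareto = {n for n, u in deals.items()
          if not any(dominates(v, u) for v in deals.values())}
print(sorted(rational & pareto))  # ['(a, b)', '(b, a)', '(∅, ab)']
```

This reproduces the slide: five deals are individually rational, four are Pareto optimal, and the three in the intersection form the negotiation set.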
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j.
• Only the deals where both utilities are positive are individually rational for both (the origin is the conflict deal).
• Which are Pareto optimal?
[Axes: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: deals plotted by utility for agent i vs. utility for agent j, with points A through E marked. The circle delimits the space of all possible deals; the conflict deal sits at the intersection of the two agents' conflict-deal utilities; the negotiation set (Pareto optimal + individually rational) is the boundary arc above and to the right of the conflict deal.]
46
Negotiation Protocol: π(δ), the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol.
• Concession protocol:
– At each step t ≥ 0, A offers δ(A, t) and B offers δ(B, t), such that both deals are from the negotiation set and, for each agent i and t > 0, Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t−1)): each new proposal is less desirable for its proposer.
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t−1)) (neither agent concedes).
– Agreement: for some j ≠ i, Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t)):
• Only A accepts ⇒ agree on δ(B, t).
• Only B accepts ⇒ agree on δ(A, t).
• Both accept ⇒ agree on the δ(k, t) such that π(δ(k, t)) = max(π(δ(A, t)), π(δ(B, t))).
• Both accept and π(δ(A, t)) = π(δ(B, t)) ⇒ flip a coin (the product is the same, but the split may not be the same for each agent: flip a coin to decide which deal to use).
(The deals may be pure deals or mixed deals.)
47
The Monotonic Concession Protocol – concessions move in one direction, towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same deal later).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values.)
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal.
• Meta-data: an explanation or critique of a deal may accompany it.
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than their own proposal:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages
  – Symmetric (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic concession protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling the other agent what you value.)
[Figure: agent 1's best deal and agent 2's best deal at opposite ends of the deal space.]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: between agent 1's best deal and agent 2's best deal, each agent asks "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero).
– In case conflict occurs, you are not much worse off.
– You are therefore more willing to risk conflict.
An agent is more willing to risk conflict when the difference between its loss from making a concession and its loss from taking the conflict deal (with respect to its current offer) is small.
• If both agents are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer.
• How much you will lose if you stand still, which causes a conflict.

risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agent i and agent j respectively.
risk_i is willingness to risk conflict (1 means perfectly willing to risk conflict).
55
Risk Evaluation
• risk_i measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility function.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher.
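The risk formula above is a one-liner; the sketch below encodes it directly, with the common convention (assumed here, as the slides do not state it explicitly) that an agent whose own offer already yields zero utility has nothing to lose and so has risk 1:

```python
def risk(u_own_offer, u_their_offer):
    """Zeuthen willingness to risk conflict: the fraction of its current gain
    an agent would give up by conceding to the other's offer.
    Convention: a zero-utility own offer means nothing to lose, so risk is 1."""
    if u_own_offer == 0:
        return 1.0
    return (u_own_offer - u_their_offer) / u_own_offer
```

For example, an agent whose own offer gives it 3 while the opponent's offer gives it 1 has risk (3 - 1)/3 = 2/3; an agent whose own offer gives it utility 1 and who gets 0 from the opponent's offer has risk 1.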
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: between Ai's best deal and Aj's best deal, with the conflict deal marked; each agent weighs "how much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum it can still hope to gain.]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 must deliver to a; agent 2 must deliver to a and b)
Negotiation Set:
(a, b)
(b, a)
(∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b).
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for agent 1, 1 for agent 2.
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each labeled "he should concede".]
Zeuthen does not reach a settlement here: neither agent will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: distribution point with nodes a and d at distance 7, and nodes b and c between them at distances 1.]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; a deal such as (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

1. (abcd, ∅) – utilities (0, 10)
2. (abc, d) – utilities (1, 3)
3. (ab, cd) – utilities (2, 2)
4. (a, bcd) – utilities (3, 1)
5. (∅, abcd) – utilities (10, 0)
Conflict deal – utilities (0, 0)

Agent 1 prefers the deals in the order 5, 4, 3, 2, 1; agent 2 prefers 1, 2, 3, 4, 5.
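The rounds on this example can be traced with a small sketch of the MCP under the Zeuthen strategy. Two simplifications are assumed (they are not from the slides, but suffice for this table): a concession moves an agent to its next-best deal, and the agreement test breaks exact ties arbitrarily:

```python
# Deals and utilities from the table above (agent 0 = agent 1, agent 1 = agent 2).
deals = {
    ("abcd", ""): (0, 10),
    ("abc", "d"): (1, 3),
    ("ab", "cd"): (2, 2),
    ("a", "bcd"): (3, 1),
    ("", "abcd"): (10, 0),
}

def risk(own, other, agent):
    # willingness to risk conflict; 1.0 when the agent's own offer yields 0
    u_own, u_other = deals[own][agent], deals[other][agent]
    return 1.0 if u_own == 0 else (u_own - u_other) / u_own

def zeuthen():
    # each agent opens with its own best deal, ranked by its own utility
    prefs = {a: sorted(deals, key=lambda d: -deals[d][a]) for a in (0, 1)}
    offer = {a: prefs[a][0] for a in (0, 1)}
    while True:
        # agreement: an agent accepts an offer at least as good as its own
        if deals[offer[1]][0] >= deals[offer[0]][0]:
            return offer[1]
        if deals[offer[0]][1] >= deals[offer[1]][1]:
            return offer[0]
        r0, r1 = risk(offer[0], offer[1], 0), risk(offer[1], offer[0], 1)
        # the agent with the smaller risk concedes; both concede on a tie
        conceders = [0, 1] if r0 == r1 else [0 if r0 < r1 else 1]
        for a in conceders:
            offer[a] = prefs[a][prefs[a].index(offer[a]) + 1]
```

Tracing it: round 1 has both risks equal to 1, so both concede to (a, bcd) and (abc, d); round 2 has both risks 2/3, so both concede again and meet at (ab, cd) with utilities (2, 2).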
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the (0, 10) utility of another choice.
• Is there a solution?
• Fair versus higher global utility.
• Restrictions of this method (no promises for the future, no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD, coordinated plans are never worse, as you can always just do your original task.
• With an SOD, you may get in each other's way.
• Don't accept partially completed plans.
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs it).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
  – The goals may contradict, so there is no mutually acceptable state (e.g., both need the car).
  – A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (both could travel together, but one may need to know how to pick the other up).
  – There might be a reachable state which satisfies both, but it may be too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation? (Why should one agent agree to do more?)
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain plan.
• This is called a mixed deal – a deal with a probability attached. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent – the role i plays with probability 1-p.
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the agent's role in the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: distribution point with city a and city b at distance 1 each; a and b are distance 2 apart.]
Cost function:
c(∅) = 0
c(a) = 1
c(b) = 1
c(ab) = 3
Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (original task: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability:
• (∅, ab):p means agent 1 does ∅ with probability p and ab with probability 1-p.
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p·1 = utility for agent 1
• (1-p)·3 + p·0 = utility for agent 2
• (1-p)(-2) + p·1 = (1-p)·3 + p·0
• -2 + 2p + p = 3 - 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
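The algebra above is a single linear equation in p, so it can be checked exactly. The helper below is a hypothetical utility (not from the slides) that solves it with exact fractions:

```python
from fractions import Fraction

def fair_p(u1_deal, u1_swap, u2_deal, u2_swap):
    """Solve p*u1_deal + (1-p)*u1_swap == p*u2_deal + (1-p)*u2_swap for p.
    u*_deal: each agent's utility when the deal is played as written (prob p);
    u*_swap: each agent's utility when the roles are swapped (prob 1-p).
    Returns None when no p in [0, 1] equalizes the expected utilities."""
    denom = (u1_deal - u1_swap) - (u2_deal - u2_swap)
    if denom == 0:
        return None
    p = Fraction(u2_swap - u1_swap, denom)
    return p if 0 <= p <= 1 else None
```

With the slide's numbers, fair_p(1, -2, 0, 3) returns Fraction(5, 6); with the next slide's numbers, fair_p(0, 0, 2, 2) returns None, matching its "0 = 2, no solution".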
74
Try again with other choice in negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability 1-p.
• What should p be to be fair to both (equal utility)?
• (1-p)·0 + p·0 = utility for agent 1
• (1-p)·2 + p·2 = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair? (Agent 1 gets utility 0 and agent 2 gets utility 2 under either role assignment, so no lottery over the two roles changes the expected utilities.)
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(T_A ∪ T_B, ∅) : p] such that Π(m) = max over deals d in the NS of Π(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But this would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) cost split – A1 takes the cheaper role – p of the time.
• Utility for agent 1 = p·1 + (1-p)(-3): it loses utility if it takes the cost-6 role for a benefit of 3.
• Utility for agent 2 = p·0 + (1-p)·4.
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: Conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, a made-up number).
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, continued
• Winning agent's utility: 26 - 4 - 10 = 12.
• Losing agent's utility: -4 (as it helped with the swap).
• So with probability 1/2 each: (1/2)·12 + (1/2)·(-4) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then each agent's utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers joint plans, schedules, and goal relaxation: the agents may reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimality – we need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate utility?
• Weighting each attribute:
  – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
  – price 1, quality 2, support 3
• Using constraints on an attribute:
  – price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum.
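The weighting scheme above is a plain additive utility. A minimal sketch, in which the attribute scores and their normalization to [0, 1] (higher = better, so a cheap price maps to a high "price" score) are illustrative assumptions rather than slide content:

```python
def weighted_utility(offer, weights):
    """Additive multi-attribute utility with the slide's 60/15/25 weighting.
    Attribute scores are assumed pre-normalized to [0, 1], higher = better."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(weights[a] * offer[a] for a in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer = {"price": 0.4, "quality": 0.9, "support": 0.6}  # hypothetical scores
score = weighted_utility(offer, weights)  # 0.6*0.4 + 0.15*0.9 + 0.25*0.6
```

An agent can then rank candidate offers by this score, while the constraint-based variant would first filter out offers whose raw attributes fall outside the stated intervals.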
89
Incomplete Information
• In a TOD, agents don't know the tasks of the others.
• Solution:
  – Exchange the missing information.
  – Impose a penalty for lying.
• Possible lies:
  – False information: hiding letters; phantom letters.
  – Not carrying out a commitment.
90
Subadditive Task Oriented Domain
• The cost of the union of two task sets is at most the sum of the costs of the separate sets:
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of a strictly subadditive TOD: delivering to one destination saves distance to the other (in a tree arrangement).
• Example of a subadditive TOD with equality (= rather than <): deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces – say, electrical power costs where I exceed a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• "I need to pick something up at the store." (You can invent something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
• "I need to deliver an empty letter." (It does no good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
  – c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• The arrows show the reasons we know each fact (diagonal arrows go between domains); the rule at the beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on whether truth never helps or a lie sometimes helps.
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modularity encourages truth-telling more than the other properties.
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that task set Z adds to Y cannot be greater than the cost it adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• (At your seats: is the postmen domain concave? No – unless restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as the agent was going that way anyway) but adds 2 to its superset Y (as the agent was going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
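For small task systems, the three properties (subadditive, concave, modular) can be checked by brute force over all subsets. The cost dictionaries below are illustrative toy numbers, not taken from the slides:

```python
from itertools import combinations

def subsets(tasks):
    ts = sorted(tasks)
    return [frozenset(s) for r in range(len(ts) + 1)
            for s in combinations(ts, r)]

def is_subadditive(c, tasks):
    ss = subsets(tasks)
    return all(c[x | y] <= c[x] + c[y] for x in ss for y in ss)

def is_concave(c, tasks):
    # for X subset of Y: Z adds no more cost to Y than it adds to X
    ss = subsets(tasks)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in ss for y in ss if x <= y for z in ss)

def is_modular(c, tasks):
    ss = subsets(tasks)
    return all(c[x | y] == c[x] + c[y] - c[x & y] for x in ss for y in ss)

# Toy two-letter delivery costs (made-up numbers for illustration)
c_modular = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 2}
c_concave = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 1.5}
c_neither = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
```

On these examples the implications on the slide come out as expected: the modular cost is also concave and subadditive; the concave cost is subadditive but not modular; and the third cost (doing both tasks costs more than the sum) is not even subadditive.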
99
Examples of task systems
Database queries:
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The fax domain:
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the fax domain is modular (as its costs are independent).
• Modularity implies concavity.
101
A 3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships with the same domain attribute.
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered (given penalties).
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides a letter to c, his utility doesn't increase.
• If he tells the truth, p = 1/2; expected utility under (abc, ∅):1/2 is 5.
• If he lies, p = 1/2 (as the apparent utility is the same); expected utility (for A1) under (abc, ∅):1/2 = (1/2)·0 + (1/2)·2 = 1 (as he still has to deliver the hidden letter).
[Figure: delivery graph with edge costs 1, 4, 4, 1.]
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (less of an increased cost is assumed, so the probabilities are assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular costs tend to add exactly – it is hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(We assume there is some way of deciding who gets the better deal that is fair over time.)

Agent 1's role, U(1), agent 2's role, apparent U(2), actual U(2):
a, 2, bc, 4, 4
b, 4, ac, 2, 2
bc, 2, a, 4, 2
ab, 0, c, 6, 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is (ab):1/2 = (1/2)·4 + (1/2)·2 = 3.
• Lie: (b, ca) is the logical division, so no probability is needed. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h too").
• Example 2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 31/18 ≈ 1.72. (If A1 delivers, he doesn't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, since
  p·(-1) + (1-p)·6 = p·4 + (1-p)·(-3), i.e. 14p = 9.
• If he invents task h, p = 11/18, since
  p·(-3) + (1-p)·6 = p·4 + (1-p)·(-5).
• Utility at p = 9/14: p·(-1) + (1-p)·6 = -9/14 + 30/14 = 21/14 = 1.5.
• Utility at p = 11/18: -11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
107
Postmen – return to the post office
[Figure: three delivery graphs – a concave instance, a subadditive instance (with h as the decoy), and a phantom-letter instance.]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (you think I have fewer tasks, so an increased load appears to cost more than it really does).
• Example 3 (next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – unfair under the lie.
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((fae, bcd):1/2) = 4 (each saves going to two nodes).
• Under the lie, dividing as (efd, cab):p, you always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with (ab, cdef):p:
  p·4 + (1-p)·0 = p·2 + (1-p)·6
  4p = -4p + 6
  p = 3/4
• The utility is actually (3/4)·6 + (1/4)·0 = 4.5.
• Note: when I get assigned cdef (1/4 of the time), I still have to deliver to node a after completing my agreed-upon deliveries. So I end up going to 5 places – which is what I was assigned originally – for zero utility.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or to create cooperative situations.
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain.
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions).
• Process:
  – Retrieve appropriate precedent cases from memory.
  – Select the most appropriate case.
  – Construct an appropriate solution.
  – Evaluate the solution for applicability to the current case.
  – Modify the solution appropriately.
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  – Minimizes the need for information exchange.
  – Avoids problems by reasoning from past failures: intentional reminding.
  – Repairs for past failures are reused, reducing computation.
117
Negotiation Methods Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
  – Maximizes the joint payoff.
  – Minimizes the payoff difference.
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument.
• Increasing payoff:
  – Change the importance attached to an issue.
  – Change the utility value of an issue.
119
Narrowing differences
• Get feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increase the payoff of the rejecting party by a greater amount than the payoff reduction for the agreeing parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory behavior.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent, rising from 0 to 100 across the times 9, 12, 16.]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Figure: utility versus number of negotiation rounds; agent i's and agent j's curves converge over time to the point of acceptance.]
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable.
[Figure: utility versus number of negotiation rounds; agent i's and agent j's curves never meet.]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified. (Why did the agent pay so much for the car?)
  – Positions cannot be changed. (Initially I wanted a car with a sun roof, but I changed my preference during the buying process.)
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
• Basic form of argumentation: (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae;
  – Sentence is a logical formula known as the conclusion;
  – Grounds is a set of logical formulae such that Grounds ⊆ Database, and Sentence can be proved from Grounds.
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B: A u B or A r B
  – A defeats B: A u B, or (A r B and not B u A)
  – A strongly attacks B: A a B and not B u A
  – A strongly undercuts B: A u B and not B u A
129
Proposition: hierarchy of attacks
Undercuts: u
Strongly undercuts: su = u - u⁻¹
Strongly attacks: sa = (u ∪ r) - u⁻¹
Defeats: d = u ∪ (r - u⁻¹)
Attacks: a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of an argument (rather than the internals of the arguments).
• Writing x → y indicates:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: an argument holds ("is true") unless proven otherwise.
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked (with respect to a set of arguments S) if some attacker y of x (y → x) is not itself attacked by a member of S
2. argument x is acceptable (with respect to S) if every attacker of x is attacked by some member of S
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
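The definitions above translate directly into code. A minimal sketch, assuming the attack relation is given as a set of (attacker, attacked) pairs; the toy graph is hypothetical:

```python
# Dung-style checks over an attack relation given as (attacker, attacked) pairs.

def acceptable(x, s, attacks):
    # every attacker of x is attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def conflict_free(s, attacks):
    # no member of s attacks another member of s
    return not any((x, y) in attacks for x in s for y in s)

def admissible(s, attacks):
    # conflict free, and each member is acceptable w.r.t. s
    return conflict_free(s, attacks) and all(acceptable(x, s, attacks) for x in s)

# Toy graph: a and b attack each other, both attack c, and c attacks d.
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}
print(admissible({"a"}, attacks))       # a defends itself against b
print(admissible({"a", "d"}, attacks))  # d's attacker c is attacked by a
print(admissible({"c"}, attacks))       # c cannot defend against a and b
```

Here {a} and {a, d} come out admissible while {c} does not, matching the next slide's observation that c is always attacked and d is always acceptable.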
132
(Figure: attack graph over arguments a, b, c, d)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
35
Figure out the deals, knowing the union must be ab
• Choices for the first agent: ∅, a, b, ab
• The second agent must "pick up the slack" so that the union is ab
• a for agent 1 ⇒ b or ab for agent 2
• b for agent 1 ⇒ a or ab
• ab for agent 1 ⇒ ∅, a, b, or ab
• ∅ for agent 1 ⇒ ab
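This enumeration can be checked mechanically; a small sketch (the task names mirror the slide):

```python
# Enumerate all pure deals (D1, D2) whose union covers both deliveries.
from itertools import product

tasks = frozenset("ab")
subsets = [frozenset(s) for s in ("", "a", "b", "ab")]
deals = [(d1, d2) for d1, d2 in product(subsets, repeat=2)
         if d1 | d2 == tasks]
print(len(deals))  # 9 deals, matching the 9 utility entries on the later slides
```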
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, defined as follows:
Utility_k(δ) = c(T_k) − Cost_k(δ) = c(T_k) − c(D_k)
where δ = (D1, D2) is a deal
– c(T_k) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Cost_k(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0.
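The definition translates directly into code. A sketch using the cost function from the parcel-delivery slide that follows (c(∅)=0, c(a)=c(b)=1, c(ab)=3):

```python
# Utility_k(delta) = c(T_k) - c(D_k) for the two-city parcel domain.

def c(tasks):
    costs = {frozenset(): 0, frozenset("a"): 1,
             frozenset("b"): 1, frozenset("ab"): 3}
    return costs[frozenset(tasks)]

def utility(k, deal, stand_alone):
    """k is 0 or 1; deal = (D1, D2); stand_alone = (T1, T2)."""
    return c(stand_alone[k]) - c(deal[k])

# Agent 1 originally delivers to a; agent 2 to both a and b.
T = ("a", "ab")
print(utility(0, ("", "ab"), T))  # 1: agent 1 is freed of its delivery
print(utility(1, ("ab", ""), T))  # 3: agent 2 is freed of both deliveries
print(utility(0, ("ab", ""), T))  # -2: agent 1 takes on everything
```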
37
Parcel Delivery Domain (assuming agents do not have to return home – like U-Haul)
(Figure: distribution point with distance 1 to city a and to city b; distance 2 between the cities)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (orig. a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (orig. ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utility_k(δ) ≥ Utility_k(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1,2}, Utility_k(δ) > Utility_k(δ′)
• Deal δ weakly dominates δ′ if at least the first condition holds (δ isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′.
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for one agent without disadvantaging the other)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
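For a small example the negotiation set can be computed by brute force. A sketch, with the utility table copied from the parcel-delivery slides that follow:

```python
# Negotiation set = individual rational AND Pareto optimal deals.

utilities = {  # deal (D1, D2): (utility for agent 1, utility for agent 2)
    ("a", "b"): (0, 2),   ("b", "a"): (0, 2),   ("ab", ""): (-2, 3),
    ("", "ab"): (1, 0),   ("a", "ab"): (0, 0),  ("b", "ab"): (0, 0),
    ("ab", "a"): (-2, 2), ("ab", "b"): (-2, 2), ("ab", "ab"): (-2, 0),
}

def dominates(u, v):
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

# Individual rational: weakly dominates the conflict deal (0, 0).
ir = {d for d, u in utilities.items() if all(x >= 0 for x in u)}
# Pareto optimal: no other deal's utility vector dominates it.
po = {d for d, u in utilities.items()
      if not any(dominates(v, u) for v in utilities.values())}
print(sorted(ir & po))  # [('', 'ab'), ('a', 'b'), ('b', 'a')]
```

This reproduces the slides' answer: five deals are individual rational, four are Pareto optimal, and three are both.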
40
Utility Function for Agents (example from previous slide)
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individual rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
Deal (ab, ∅) gives utilities (−2, 3): it is not beaten, because nothing beats 3 for agent 2.
43
Negotiation Set
Negotiation set:
(a, b)
(b, a)
(∅, ab)
Individual rational deals:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
Pareto optimal deals:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which deals are Pareto optimal?
(Axes: utility for i vs. utility for j)
45
Negotiation Set in Task-oriented Domains
(Figure: deals A–E plotted by utility for agent i against utility for agent j. The circle delimits the space of all possible deals; the conflict deal sits at the agents' conflict utilities. The negotiation set is the arc of deals that are Pareto optimal and individual rational.)
46
Negotiation Protocol
• π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
• Concession protocol:
– At each step t ≥ 0, A offers δ(A,t) and B offers δ(B,t) such that:
• both deals are from the negotiation set
• ∀i and t > 0: Utility_i(δ(i,t)) ≤ Utility_i(δ(i,t−1)) – "I propose something less desirable for me"
– Negotiation ending:
• Conflict – Utility_i(δ(i,t)) = Utility_i(δ(i,t−1)) (no one concedes)
• Agreement – ∃j ≠ i: Utility_j(δ(i,t)) ≥ Utility_j(δ(j,t))
– only A ⇒ agree on δ(B,t) (A agrees with B's proposal)
– only B ⇒ agree on δ(A,t) (B agrees with A's proposal)
– both A and B ⇒ agree on the δ(k,t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
– both A and B with π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
(Applies to pure deals and mixed deals.)
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (Assumes we know what the other agent values.)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: an explanation or critique of the deal
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than their own proposal:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
bull Advantages
ndash Symmetrically distributed (no agent plays a special role)
ndash Ensures convergence
ndash It will not go on indefinitely
bull Disadvantages
ndash Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
(Figure: agent 1's best deal at one end, agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(Figure: agent 1's best deal at one end, agent 2's best deal at the other; each asks "How much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict is measured by the difference between its loss from making a concession and its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• how much you will lose if you make a concession and accept your opponent's offer
• how much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk).
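The risk formula is a one-liner in code. A sketch; the guard for a non-positive own-offer utility is an added assumption (an agent whose own offer is worth nothing has nothing left to lose):

```python
# risk_i = (U_i(own offer) - U_i(opponent's offer)) / U_i(own offer)

def risk(u_own, u_other):
    if u_own <= 0:
        return 1.0   # nothing left to gain: fully willing to risk conflict
    return (u_own - u_other) / u_own

print(risk(4, 1))  # 0.75: much to lose by conceding, willing to risk
print(risk(2, 1))  # 0.5: less to lose by accepting the other's offer
```

The agent with the lower risk value is the one who should concede.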
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know what the other agent's utility is
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Figure: spectrum from Ai's best deal to Aj's best deal, with the conflict deal outside it; labels: "How much am I willing to risk a conflict?", "Maximum to gain from agreement", "Maximum still hoped to gain")
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
bull Advantages
ndash Simple and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than using it him/herself
bull Disadvantages
ndash Computationally expensive ndash players need to compute the entire
negotiation set
ndash Communication burden ndash negotiation process may involve
several steps
59
Parcel Delivery Domain (recall: agent 1 delivered to a; agent 2 delivered to a and b)
Negotiation set:
(a, b)
(b, a)
(∅, ab)
First offers:
Agent 1: (∅, ab)
Agent 2: (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
"He should concede" – each agent points at the other (agent 1's best deal vs. agent 2's best deal).
Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain Example 2 (don't return to the distribution point)
(Figure: distribution point with a and d at distance 7; b and c in between with edges of length 1)
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individual rational, as neither agent can do worse than the conflict deal; a split such as (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal | Agent 1's utility | Agent 2's utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
– | Conflict deal | 0 | 0
Agent 1 concedes from deal 5 toward deal 3; agent 2 concedes from deal 1 toward deal 3.
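A sketch of how Zeuthen plays out on this table, with each agent conceding one deal at a time (the real strategy concedes just enough to flip the risk balance; stepping deal-by-deal is a simplification that happens to match here):

```python
# Zeuthen negotiation over the five pure deals of example 2.
deals = [("abcd", ""), ("abc", "d"), ("ab", "cd"), ("a", "bcd"), ("", "abcd")]
u1 = [0, 1, 2, 3, 10]   # agent 1's utility for each deal
u2 = [10, 3, 2, 1, 0]   # agent 2's utility for each deal

def risk(u_own, u_other):
    return 1.0 if u_own <= 0 else (u_own - u_other) / u_own

i, j = 4, 0  # each agent starts at its own best deal
while u1[j] < u1[i] and u2[i] < u2[j]:   # neither accepts the other's offer
    r1, r2 = risk(u1[i], u1[j]), risk(u2[j], u2[i])
    if r1 <= r2:
        i -= 1   # agent 1 concedes one step toward the middle
    if r2 <= r1:
        j += 1   # agent 2 concedes one step toward the middle
print(deals[i], deals[j])  # both converge on ('ab', 'cd')
```

By symmetry the risks are equal in every round, so both concede each time and the agents meet at the (ab, cd) deal with utilities (2, 2).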
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than an outcome such as (0, 10) with higher total utility
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future, and no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (they prefer a 51% chance of getting $100 over a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (both could travel together, but may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it is too expensive and the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1: a cost of 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation? (Why should one agent agree to do more?)
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent will get a certain role in the plan
• This is called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
Cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k
where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postmen example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
(Figure: distribution point with distance 1 to city a and to city b; distance 2 between the cities)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (orig. a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (orig. ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (ab, ∅):p means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
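The same algebra can be done with exact fractions. A sketch (the function solves the general equal-utility equation; its arguments are the two agents' utilities for each of the two possible assignments):

```python
from fractions import Fraction

def fair_p(u1_deal, u1_swap, u2_deal, u2_swap):
    """Solve (1-p)*u1_deal + p*u1_swap == (1-p)*u2_deal + p*u2_swap for p."""
    a = u1_swap - u1_deal - u2_swap + u2_deal   # coefficient of p
    b = u1_deal - u2_deal                       # constant term
    if a == 0:
        raise ValueError("no p can equalize the utilities")
    return Fraction(-b, a)

print(fair_p(-2, 1, 3, 0))  # 5/6, as derived above
```

For the (a, b):p deal on the next slide (utilities 0, 0 versus 2, 2), the coefficient of p is zero, so the function raises – there is no fair p.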
74
Try again with other choice in negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• An all-or-nothing deal (one agent does everything): a mixed deal δ_m = [(T_A ∪ T_B, ∅) : p] such that π(δ_m) = max over δ ∈ NS of π(δ)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose one which maximizes the product of the utilities (as that is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
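The suggested exercise is quick to run: splitting a fixed total of 10, the product of utilities peaks at the even 5/5 division.

```python
# Product of utilities for each integer split of a total utility of 10.
splits = {x: x * (10 - x) for x in range(11)}
best = max(splits, key=splits.get)
print(best, splits[best])  # 5 25: the even split maximizes the product
```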
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each agent 2. A win, as neither had to move the other block out of the way.
• If done by one agent the cost would be 4, so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is 2; the cost for both averages 4.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve it? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we can use probability to make it "fair"
• Assign the (2, 6) role split p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Both can't win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get its way, as the utility gained is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, numbers pulled out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as it helped with the swap)
• So with probability ½ each: 12(½) + (−4)(½) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation is over joint plans, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle): the worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute
– price: 1, quality: 2, support: 3
• Using constraints on an attribute
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
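A sketch of the weighted-sum option above, assuming each attribute is first normalized to a [0, 1] score (the offers and their numbers are illustrative):

```python
# Weighted-attribute utility: price 60%, quality 15%, support 25%.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    # weighted sum of normalized attribute scores
    return sum(WEIGHTS[attr] * score for attr, score in offer.items())

offer_a = {"price": 0.9, "quality": 0.5, "support": 0.2}  # cheap, mediocre
offer_b = {"price": 0.4, "quality": 0.9, "support": 0.9}  # pricey, solid
print(utility(offer_a) > utility(offer_b))  # True: price dominates at 60%
```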
89
Incomplete Information
• Agents don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is ≤ the sum of the costs of the separate sets – it adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditivity: delivering to one location saves distance to the other (in a tree arrangement)
• Example of a subadditive TOD with = rather than <: deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• When another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• Arrows show reasons we know a fact (diagonal arrows are between domains); the start of each rule is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others
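The three properties (subadditive, concave, modular) can be checked exhaustively for a small cost function. A sketch with hypothetical costs for two deliveries whose routes partly overlap:

```python
from itertools import combinations

TASKS = "ab"
COST = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
        frozenset("ab"): 1.5}   # hypothetical: the two routes overlap

def subsets():
    return [frozenset(c) for r in range(len(TASKS) + 1)
            for c in combinations(TASKS, r)]

def subadditive(c):
    return all(c[x | y] <= c[x] + c[y] for x in subsets() for y in subsets())

def concave(c):  # Z adds at least as much to a subset X as to its superset Y
    return all(c[x | z] - c[x] >= c[y | z] - c[y]
               for x in subsets() for y in subsets() if x <= y
               for z in subsets())

def modular(c):
    return all(c[x | y] == c[x] + c[y] - c[x & y]
               for x in subsets() for y in subsets())

print(subadditive(COST), concave(COST), modular(COST))  # True True False
```

With the earlier two-city costs (c(ab)=3 > c(a)+c(b)=2) the same check reports not subadditive, since visiting both cities costs more than the two solo trips combined.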
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that a set of tasks Z adds to Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No – unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• Agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of the characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Ex: if A1 hides a letter to c, his utility doesn't increase
• If he tells the truth: p = 1/2; expected utility of (abc, ∅):1/2 is 5
• Under the lie: p = 1/2 (as the apparent utility is the same); expected utility (for agent 1) of (abc, ∅):1/2 is ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
(Figure: delivery graph with edge costs 1, 4, 4, 1)
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out)
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different than the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just as in this case. The lie makes no difference.
We assume there is some way of deciding who gets the better deal that is fair over time.
Agent 1's part | U(1) | Agent 2's part | Seeming U(2) | Actual U(2)
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so the utility for agent 1 of (a, b):½ is ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as no probability is needed; the utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over All-or-Nothing deals – "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility rises from 1.5 to 31/18 ≈ 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as p(-1) + (1-p)6 = p(4) + (1-p)(-3), i.e., 14p = 9
• If he invents task h, p = 11/18, as p(-3) + (1-p)6 = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)6 = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)6 = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
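The FP6 arithmetic above can be checked mechanically. A small sketch, using the slide's payoff numbers (the helper names are mine):

```python
# Checking the FP6 arithmetic above. The payoff numbers are the
# slide's; the helper names are mine.
def indifference_p(u1_all, u1_none, u2_all, u2_none):
    """Solve p*u1_all + (1-p)*u1_none = p*u2_all + (1-p)*u2_none for p."""
    return (u2_none - u1_none) / ((u1_all - u1_none) - (u2_all - u2_none))

def expected_u1(p):
    # Agent 1's true payoffs: -1 if it delivers everything, 6 otherwise.
    return p * (-1) + (1 - p) * 6

p_truth = indifference_p(-1, 6, 4, -3)   # 9/14
p_decoy = indifference_p(-3, 6, 4, -5)   # 11/18: the decoy h shifts the split
# Agent 1's real payoffs are unchanged by the decoy; only p moves.
gain = expected_u1(p_decoy) - expected_u1(p_truth)   # positive: lying helped
```

The key point the code makes explicit: the decoy only changes the agreed probability p, while agent 1's real payoffs stay the same, so the shifted p is pure gain.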
107
Postmen – return to post office
(Figure: two delivery graphs – a Concave TOD with the phantom letter, and a Subadditive TOD where h is the decoy)
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals – a "hide" lie can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals – "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util({f,a,e}, {b,c,d}):½ = 4 (save going to two)
• Under the lie: divide as ({e,f}, {d,c,a,b}):p (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie: ({a,b}, {c,d,e,f}):p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6, so p = 3/4
• Utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned c,d,e,f (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going 5 places (which is what I was assigned originally) – zero utility to that
110
Modular
111
Conclusion
– In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter proposal based on feedback from dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate solution for applicability to current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repair for past failure is used; reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change importance attached to an issue
– Changing utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases payoff of the rejecting party by a greater amount than it reduces the payoff of the agreed parties
120
Experiments
• Without memory – 30 more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Figure: worth function for the first agent – worth rises from 0 at 9:00 toward 100 at 16:00, through 12:00.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Figure: utility vs. number of negotiations – Agent i's and Agent j's utility curves converge over time to the point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
(Figure: utility vs. number of negotiations – the agents' utility curves never cross.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept that B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A undercuts B or A rebuts B
– A defeats B = A undercuts B, or (A rebuts B and not B undercuts A)
– A strongly attacks B = A attacks B and not B undercuts A
– A strongly undercuts B = A undercuts B and not B undercuts A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned as to what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x)
2. Argument x is acceptable with respect to S if every attacker of x is attacked by a member of S
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
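These definitions are small enough to check by brute force. A sketch over a hypothetical attack relation (the slide's exact edges aren't recoverable, so the graph below is chosen to reproduce the next slide's claims: c can never be defended, while d has no attackers at all):

```python
from itertools import combinations

# Brute-force check of the definitions on a small abstract argument
# system. The attack relation is hypothetical (not the slide's figure).
args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("b", "c"), ("d", "c")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # Every attacker of x must itself be attacked by some member of s.
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

all_admissible = [set(c) for r in range(len(args) + 1)
                  for c in combinations(sorted(args), r)
                  if admissible(set(c))]
```

With these edges, {d} is admissible (no attackers), while {c} never is: c's attacker d cannot be attacked by anyone, matching the "c is always attacked, d is always acceptable" remark on the next slide.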
132
(Figure: attack graph over arguments a, b, c, d)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
36
Utility Function for Agents
Given an encounter (T1, T2), the utility function for each agent is just the difference of costs, and is defined as follows:
Utilityk(δ) = c(Tk) − Costk(δ) = c(Tk) − c(Dk)
where δ = (D1, D2) is a deal:
– c(Tk) is the stand-alone cost to agent k (the cost of achieving its goal with no help)
– Costk(δ) is the cost of its part of the deal
Note that the utility of the conflict deal is always 0
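The definition transcribes directly into code. A minimal sketch using the two-city parcel-delivery costs from the next slide (c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3); encoding task sets as frozensets is my choice:

```python
# Utility_k(deal) = c(T_k) - c(D_k) for the two-city parcel domain.
COST = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
TASKS = (frozenset("a"), frozenset("ab"))   # T1 = {a}, T2 = {a, b}

def utility(k, deal):
    """deal = (D1, D2); agent k's utility is its stand-alone cost minus its deal cost."""
    return COST[TASKS[k]] - COST[deal[k]]

conflict_deal = TASKS   # in the conflict deal each agent just does its own tasks
```

As the slide notes, the conflict deal always yields utility 0 for both agents, since each agent's deal cost equals its stand-alone cost.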
37
Parcel Delivery Domain (assuming agents do not have to return home – like Uhaul)
(Figure: distribution point with cost-1 edges to city a and city b, and a cost-2 edge between a and b)
Cost function: c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3
Utility for agent 1 (orig. a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (orig. ab):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1,2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
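For the parcel-delivery encounter, the negotiation set can be computed by brute force straight from these definitions. A sketch (the frozenset encoding of deals is my choice; the costs are the slide's):

```python
# Brute-force negotiation set for the parcel delivery encounter:
# agent 1 must deliver to a, agent 2 to a and b.
COST = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
TASKS = (frozenset("a"), frozenset("ab"))

def utility(k, deal):
    return COST[TASKS[k]] - COST[deal[k]]

parts = [frozenset(), frozenset("a"), frozenset("b"), frozenset("ab")]
# The nine pure deals whose parts jointly cover all tasks {a, b}.
deals = [(d1, d2) for d1 in parts for d2 in parts if d1 | d2 == frozenset("ab")]

def dominates(d, e):
    us = [(utility(k, d), utility(k, e)) for k in (0, 1)]
    return all(a >= b for a, b in us) and any(a > b for a, b in us)

ir = [d for d in deals if all(utility(k, d) >= 0 for k in (0, 1))]  # conflict utility is 0
pareto = [d for d in deals if not any(dominates(e, d) for e in deals)]
ns = [d for d in ir if d in pareto]
```

Running this reproduces the next few slides: five individually rational deals, four Pareto optimal deals, and a three-deal negotiation set.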
40
Utility Function for Agents (example from previous slide)
Utility for agent 1:
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
5. Utility1({a}, {a,b}) = 0
6. Utility1({b}, {a,b}) = 0
7. Utility1({a,b}, {a}) = -2
8. Utility1({a,b}, {b}) = -2
9. Utility1({a,b}, {a,b}) = -2
Utility for agent 2:
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
5. Utility2({a}, {a,b}) = 0
6. Utility2({b}, {a,b}) = 0
7. Utility2({a,b}, {a}) = 2
8. Utility2({a,b}, {b}) = 2
9. Utility2({a,b}, {a,b}) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b})
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
Individually rational:
({a}, {b})
({b}, {a})
(∅, {a,b})
({a}, {a,b})
({b}, {a,b})
42
Pareto Optimal Deals
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b})
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})
Pareto optimal:
({a}, {b})
({b}, {a})
({a,b}, ∅)
(∅, {a,b})
(({a,b}, ∅) yields (-2, 3); it survives because nothing beats 3 for agent 2.)
43
Negotiation Set
Negotiation Set:
({a}, {b})
({b}, {a})
(∅, {a,b})
Individually Rational Deals:
({a}, {b})
({b}, {a})
(∅, {a,b})
({a}, {a,b})
({b}, {a,b})
Pareto Optimal Deals:
({a}, {b})
({b}, {a})
({a,b}, ∅)
(∅, {a,b})
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
(Figure: scatter plot with axes "Utility for i" and "Utility for j")
45
Negotiation Set in Task-oriented Domains
(Figure: deals A, B, C, D, E plotted by utility for agent i vs. utility for agent j. The circle delimits the space of all possible deals; the conflict deal sits at the crossing of the two agents' conflict-deal utilities. The negotiation set (Pareto optimal + individually rational) is the boundary arc above and to the right of the conflict deal.)
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: one-step protocol
– Concession protocol
• At t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that:
– Both deals are from the negotiation set
– ∀i, t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t-1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t-1))
– Agreement: ∃j ≠ i, Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t))
• Only A ⇒ agree on δ(B,t) – A agrees with B's proposal
• Only B ⇒ agree on δ(A,t) – B agrees with A's proposal
• Both A and B ⇒ agree on the δ(k,t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A and B, and π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the split may not be the same for each agent – flip a coin to decide which deal to use)
(Applies to pure deals and mixed deals)
47
The Monotonic Concession Protocol – one direction, move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession. This assumes we know what the other agent values
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: explanation or critique of the deal
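The rules above can be sketched as a loop. This is a simplified model: each agent's (monotonically conceding) proposal sequence is given up front, and the utility numbers in the demo are hypothetical rather than computed from a domain:

```python
# A minimal sketch of the Monotonic Concession Protocol.
def run_mcp(offers_a, offers_b, util_a, util_b):
    """Return ('agreement', deal) or ('conflict', None)."""
    prev = None
    for offer_a, offer_b in zip(offers_a, offers_b):
        # An agent accepts if the other's offer is at least as good
        # for it as its own current proposal.
        a_accepts = util_a(offer_b) >= util_a(offer_a)
        b_accepts = util_b(offer_a) >= util_b(offer_b)
        if a_accepts and b_accepts:
            # Both accept: take the offer with the larger utility product.
            return "agreement", max((offer_a, offer_b),
                                    key=lambda d: util_a(d) * util_b(d))
        if a_accepts:
            return "agreement", offer_b
        if b_accepts:
            return "agreement", offer_a
        if prev == (offer_a, offer_b):   # neither agent conceded this round
            return "conflict", None
        prev = (offer_a, offer_b)
    return "conflict", None

ua = {"x": 3, "y": 2, "z": 1}   # agent A's utility per deal (made up)
ub = {"x": 1, "y": 2, "z": 3}   # agent B's utility per deal (made up)
result = run_mcp(["x", "y", "y"], ["z", "z", "y"], ua.get, ub.get)
```

Here both agents concede toward the middle deal "y" and the protocol ends in agreement; if both had stood still, it would have terminated with the conflict deal.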
48
Condition to Consent an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (it is a way of telling others what you value)
(Figure: agent 1's best deal at one end, agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(Figure: agent 1's best deal and agent 2's best deal at opposite ends; each asks "How much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot Thenndash You have lost your expected utility (closer to zero)ndash In case conflict occurs you are not much worse offndash You are more willing to risk conflictAn agent will be more willing to risk conflict if the
difference in utility between your loss in making an concession and your loss in taking a conflict deal with respect to your current offer
bull If both are equally willing to risk both concede
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
riski = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j, respectively.
risk is willingness to risk conflict (1 is perfectly willing to risk)
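A minimal sketch of this computation (the demo utilities are made up; the convention that risk = 1 when the current offer has utility 0 follows the slide's note, since conflict then costs that agent nothing extra):

```python
# Zeuthen risk: fraction of the current offer's utility that would be
# lost by conceding to the opponent's offer.
def risk(u_own_offer, u_other_offer):
    if u_own_offer == 0:
        return 1.0   # nothing to lose in a conflict: perfectly willing to risk
    return (u_own_offer - u_other_offer) / u_own_offer

# The agent with the LOWER risk has more to lose from conflict and concedes.
def who_concedes(u1_own, u1_other, u2_own, u2_other):
    r1, r2 = risk(u1_own, u1_other), risk(u2_own, u2_other)
    if r1 == r2:
        return "both"
    return "agent 1" if r1 < r2 else "agent 2"
```

For example, if conceding would cost agent 1 most of its utility but agent 2 only a little, agent 2 has the lower risk and should concede.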
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk)
• This assumes you know what the other's utility is
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Figure: a line from Ai's best deal to Aj's best deal, with the conflict deal below; arrows mark the "maximum to gain from agreement" and the "maximum still hoped to gain", and each agent asks "How much am I willing to risk a conflict?")
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) Not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
– Simple and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, the other can do no better than using it him/herself
• Disadvantages
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation Set:
({a}, {b})
({b}, {a})
(∅, {a,b})
First offers: agent 1 proposes (∅, {a,b}); agent 2 proposes ({a}, {b})
Utility of agent 1: Utility1({a}, {b}) = 0; Utility1({b}, {a}) = 0; Utility1(∅, {a,b}) = 1
Utility of agent 2: Utility2({a}, {b}) = 2; Utility2({b}, {a}) = 2; Utility2(∅, {a,b}) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
(Figure: agent 1's best deal vs. agent 2's best deal; each says "He should concede")
Zeuthen does not reach a settlement, as neither will concede when there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to distribution point)
(Figure: distribution point with cost-7 edges to a and to d, and unit-cost edges a–b, b–c, c–d)
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation Set: ({a,b,c,d}, ∅), ({a,b,c}, {d}), ({a,b}, {c,d}), ({a}, {b,c,d}), (∅, {a,b,c,d})
Conflict Deal: ({a,b,c,d}, {a,b,c,d})
All choices are IR, as neither agent can do worse than the conflict deal; ({a,c}, {b,d}) is dominated by ({a,b}, {c,d})
62
Parcel Delivery Domain, Example 2 (Zeuthen works here; both concede on equal risk)

No.  Pure Deal           Agent 1's Utility   Agent 2's Utility
1    ({a,b,c,d}, ∅)      0                   10
2    ({a,b,c}, {d})      1                   3
3    ({a,b}, {c,d})      2                   2
4    ({a}, {b,c,d})      3                   1
5    (∅, {a,b,c,d})      10                  0
     Conflict deal       0                   0

(Figure: concession path – agent 1 starts at deal 5 and agent 2 at deal 1; they concede 5→4→3 and 1→2→3.)
63
What bothers you about the previous agreement
• They decide to both get (2,2) utility rather than the (0,10) of another choice
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 over a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• Looking for a state in the intersection of goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but may need to know how to pick up another)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we carpooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1: a cost of 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p·c(J)i + (1-p)·c(J)k, where k is i's opponent – the role i plays with probability (1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postmen example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
(Figure: distribution point with cost-1 edges to city a and city b, and a cost-2 edge between a and b)
Cost function: c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3
Utility for agent 1 (orig. a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (orig. ab):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with probability
• ({a,b}, ∅):p means agent 1 does ∅ with probability p and {a,b} with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
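The equation-solving step generalizes. A sketch (the function name is mine): agent 1 gets utility u1_p with probability p and u1_q with probability 1-p, and symmetrically for agent 2; fairness means equal expected utility.

```python
# Solve p*u1_p + (1-p)*u1_q = p*u2_p + (1-p)*u2_q for p.
def fair_probability(u1_p, u1_q, u2_p, u2_q):
    denom = (u1_p - u1_q) - (u2_p - u2_q)
    if denom == 0:
        return None   # no single p can equalize the expected utilities
    return (u2_q - u1_q) / denom

p = fair_probability(1, -2, 0, 3)   # the deal above: p = 5/6
```

The `None` branch is exactly the situation on the next slide: when both outcomes give each agent a constant utility (0 vs. 2), no lottery over them can close the gap.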
74
Try again with other choice in negotiation set
• ({a}, {b}):p means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(TA ∪ TB, ∅):p] such that π(m) = max over d in NS of π(d)
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
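The slide's exercise is quick to run: split a fixed total utility of 10 two ways and watch where the product peaks.

```python
# Split a total utility of 10 between two agents; the product u1*u2
# is maximized at the even split.
splits = [(u, 10 - u) for u in range(11)]
products = {s: s[0] * s[1] for s in splits}
best = max(products, key=products.get)   # (5, 5), product 25
```

This is why product maximization is a reasonable fairness criterion: lopsided splits like (9, 1) score far lower (9) than the even split (25).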
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (both goals are compatible)
• Assume pick up costs 1 and set down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but with unequal roles
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair"
• Assign A1 the cost-2 role (and A2 the cost-6 role) p of the time
• Utility for agent 1 = p(1) + (1-p)(-3)  (it loses utility if it takes the cost-6 role for a benefit of 3)
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal: 4p - 3 = 4 - 4p ⇒ p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins: better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way – as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap is 4 (pulling numbers out of the air)
• Idea: work together to swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative (cont.)
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as he helped with the swap)
• So with probability ½ each: ½(12) + ½(-4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example - Multi-agent Tile world (like airport shuttle) - worth isn't just a specific state, but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support - a multiple set of attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal - need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
  - Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
  - Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  - Price: [5, 100], quality: [0-10], support: [1-5]
  - Try to find the Pareto optimum
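The weighted-sum and constraint views above can be sketched as follows. This is a minimal sketch: the function names are invented, the 60/15/25 weights and constraint ranges are taken from the slide, and the price argument is assumed to be a pre-normalized score (with a raw price you would first rescale so that cheaper scores higher).

```python
def weighted_utility(price_score, quality, support):
    """Linear weighted sum: utility = price*60% + quality*15% + support*25%."""
    return 0.60 * price_score + 0.15 * quality + 0.25 * support

def satisfies_constraints(price, quality, support):
    """Constraint view: price in [5,100], quality in [0,10], support in [1,5]."""
    return 5 <= price <= 100 and 0 <= quality <= 10 and 1 <= support <= 5
```

A deal can then be scored only after it passes the constraint filter, which is one simple way to search for the Pareto optimum.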
89
Incomplete Information
• Don't know the tasks of others in TOD
• Solution:
  - Exchange missing information
  - Penalty for a lie
• Possible lies:
  - False information
    • Hiding letters
    • Phantom letters
  - Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets - the union adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of subadditive:
  - Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  - Deliveries in opposite directions - doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
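The condition can be checked by brute force on a small domain. A sketch, where the helper names and the cost functions used in testing are invented, not from the slides:

```python
from itertools import chain, combinations

def all_subsets(tasks):
    """Every subset of a small task set, as frozensets."""
    return [frozenset(s) for s in chain.from_iterable(
        combinations(sorted(tasks), r) for r in range(len(tasks) + 1))]

def is_subadditive(tasks, cost):
    """True iff c(X U Y) <= c(X) + c(Y) for all finite X, Y in the domain."""
    subs = all_subsets(tasks)
    return all(cost(x | y) <= cost(x) + cost(y)
               for x in subs for y in subs)
```

For example, a cost equal to the number of tasks is subadditive, while a cost that grows faster than linearly in the set size (the "buy new equipment above a threshold" case) is not.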
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  • Need to pick something up at a store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  • Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
  - c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know each fact (diagonal arrows are between domains). The rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as sub-additive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system - Concavity
• c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X), for X ⊆ Y
• The cost tasks Z add to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At seats - is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway), but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies sub-additivity
• Modularity implies concavity
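Concavity can likewise be checked by brute force over all X ⊆ Y and all Z. A sketch with invented helper names; the cost functions in the test are illustrative examples, not from the slides:

```python
from itertools import chain, combinations

def all_subsets(tasks):
    """Every subset of a small task set, as frozensets."""
    return [frozenset(s) for s in chain.from_iterable(
        combinations(sorted(tasks), r) for r in range(len(tasks) + 1))]

def is_concave(tasks, cost):
    """Z may never add more to the superset Y than to the subset X (X ⊆ Y)."""
    subs = all_subsets(tasks)
    return all(cost(y | z) - cost(y) <= cost(x | z) - cost(x)
               for y in subs for x in subs if x <= y for z in subs)
```

A modular cost such as set size passes; a superlinear cost (where a new task costs more the more you already have, like the loop example above in reverse) fails.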
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes - Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
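A sketch of a modular cost function in the spirit of the fax domain: each destination has an independent connection fee, so costs add with no interaction and the modularity identity holds exactly. The fee table is invented for illustration.

```python
# Invented per-destination connection fees.
FEES = {'a': 3, 'b': 2, 'c': 4}

def fax_cost(dests):
    """Cost of faxing to a set of destinations: one independent fee each."""
    return sum(FEES[d] for d in dests)
```

Because each fee is counted once, c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y) for any X and Y.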
101
3-dimensional table of Characterization of Relationship: implied relationship between cells; implied relationship with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over A-or-N deals - "hiding" lies are not beneficial
• Ex: A1 hides the letter to c; his utility doesn't increase
• If he tells the truth: p = 1/2
• Expected util: (abc) at 1/2 = 5
• Lie: p = 1/2 (as utility is the same)
• Expected util (for 1): (abc) at 1/2 = ½(0) + ½(2) = 1 (as he has to deliver the lie)
103
• FP2: in Subadditive TOD, any ONM over Mixed deals - every "phantom" lie has a positive probability of being discovered (as, if the other person delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals - no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals - no "decoy" lie is beneficial (modular tends to add exact cost - hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie - benefits are shown (the apparent benefit is no different than the real benefit).
Under truth: the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) - JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 gets | U(1) | Agent 2 gets | U(2) seems | U(2) actual
a            |  2   | bc           |     4      |     4
b            |  4   | ac           |     2      |     2
bc           |  2   | a            |     4      |     2
ab           |  0   | c            |     6      |     6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals - "phantom" lies can be beneficial
• Example from next slide: A1 creates a phantom letter at node c; his utility has risen from 3 to 4
• Truth: p = ½, so utility for agent 1 is (a, b) at ½ = ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage
• Util for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals - "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO - lying helped
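The p values above come from setting the two agents' expected utilities equal. A small helper (invented name) that solves p·a + (1-p)·b = p·c + (1-p)·d for p reproduces them exactly; exact rational arithmetic avoids float noise:

```python
from fractions import Fraction

def equalizing_p(a, b, c, d):
    """Solve p*a + (1-p)*b == p*c + (1-p)*d for p.
    (a, b): one agent's utilities under the two outcomes of the lottery;
    (c, d): the other agent's utilities under the same two outcomes."""
    return Fraction(d - b, a - b - c + d)
```

With the truth-telling numbers this gives 9/14, and with the invented decoy task h it gives 11/18, matching the slide.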
107
Postmen - return to post office
(Figure labels: Concave; Subadditive (h is the decoy); Phantom)
108
Non incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals - "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 - UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals - "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(fae, bcd) at ½ = 4 (save going to two)
• Under the lie, divide as (ef, dcab) at p: you always win and I always lose. Since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie: (ab, cdef) at p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• Utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going 5 places (which is what I was assigned originally) - zero utility for that.
110
Modular
111
Conclusion
- In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
- TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  - Company
  - Union
  - Mediator
• Tasks:
  - Generation of proposal
  - Generation of counter proposal based on feedback from dissenting party
  - Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law - cite previous decisions)
• Process:
  - Retrieve appropriate precedent cases from memory
  - Select the most appropriate case
  - Construct an appropriate solution
  - Evaluate solution for applicability to current case
  - Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
• Advantages:
  - Minimizes need for information exchange
  - Avoids problems by reasoning from past failures: intentional reminding
  - Repair for past failure is used: reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  - Change importance attached to an issue
  - Changing utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases payoff of rejecting party by a greater amount than it reduces payoff for agreed parties
120
Experiments
• Without memory - 30 more proposals
• Without argumentation - fewer proposals and better solutions
• No failure avoidance - more proposals with objections
• No preference analysis - oscillatory condition
• No feedback - communication overhead increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
(Figure: worth function for the first agent - worth from 0 to 100 over meeting times 9, 12, 16)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually reach an agreement
(Figure: utility of Agent i and Agent j versus number of negotiation rounds; the curves converge to the point of acceptance)
123
Utility Graphs - no agreement
• No agreement
• Agent j finds the offer unacceptable
(Figure: utility of Agent i and Agent j versus number of negotiation rounds; the curves never meet, so no agreement is reached)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  • Positions cannot be justified - why did the agent pay so much for the car?
  • Positions cannot be changed - initially I wanted a car with a sun roof, but I changed preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical - "If you accept A, and accept A implies B, then you must accept that B"
2. Emotional - "How would you feel if it happened to you?"
3. Visceral - participant stamps their feet and shows the strength of their feelings
4. Kisceral - appeals to the intuitive - doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  - Database is a (possibly inconsistent) set of logical formulae
  - Sentence is a logical formula known as the conclusion
  - Grounds is a set of logical formulae such that:
    - Grounds ⊆ Database
    - Sentence can be proved from Grounds
  (we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate premise): milk isn't good for you if it's fatty
• Rebut (contradict conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
  - A attacks B = A undercuts B or A rebuts B
  - A defeats B = A undercuts B, or (A rebuts B and B does not undercut A)
  - A strongly attacks B = A attacks B and B does not undercut A
  - A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned as to what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption - true unless proven false
131
Admissible Arguments - mutually defensible
1. argument x is attacked by a set if some member y of the set attacks x (y → x)
2. argument x is acceptable with respect to a set if every attacker of x is attacked by the set
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true? c is always attacked;
d is always acceptable
133
An Example Abstract Argument System
37
Parcel Delivery Domain (assuming they do not have to return home - like U-Haul)
(Figure: distribution point with edges of cost 1 to city a and to city b; a and b joined by an edge of cost 2)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
38
Dominant Deals
• Deal δ dominates deal δ' if δ is better for at least one agent and not worse for the other, i.e.:
  - δ is at least as good for every agent as δ': for all k in {1,2}, Utilityk(δ) >= Utilityk(δ')
  - δ is better for some agent than δ': for some k in {1,2}, Utilityk(δ) > Utilityk(δ')
• Deal δ weakly dominates deal δ' if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ' if δ dominates or weakly dominates δ'
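A minimal sketch of these two dominance tests, with deals represented by their utility vectors (the function names are invented):

```python
def weakly_dominates(d1, d2):
    """d1 is at least as good as d2 for every agent."""
    return all(a >= b for a, b in zip(d1, d2))

def dominates(d1, d2):
    """Weak dominance, plus strictly better for at least one agent."""
    return weakly_dominates(d1, d2) and any(a > b for a, b in zip(d1, d2))
```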
39
Negotiation Set: Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
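Putting the two conditions together, the negotiation set can be computed by brute force from a utility table. A sketch: the function name and deal labels are invented (e.g. 'a/b' for agent 1 taking a and agent 2 taking b, '-' for the empty set); the utilities are the parcel-delivery values worked out on the following slides, whose conflict deal is worth (0, 0).

```python
DEALS = {'a/b': (0, 2), 'b/a': (0, 2), 'ab/-': (-2, 3), '-/ab': (1, 0),
         'a/ab': (0, 0), 'b/ab': (0, 0), 'ab/a': (-2, 2),
         'ab/b': (-2, 2), 'ab/ab': (-2, 0)}

def negotiation_set(deals, conflict):
    """deals: {name: (u1, u2)}; conflict: utilities of the conflict deal."""
    # individual rational: weakly dominates the conflict deal
    ir = {n: u for n, u in deals.items()
          if u[0] >= conflict[0] and u[1] >= conflict[1]}
    # Pareto optimal: no other deal is at least as good everywhere and
    # strictly better somewhere
    def dominated(u):
        return any(v[0] >= u[0] and v[1] >= u[1] and v != u
                   for v in deals.values())
    return {n for n, u in ir.items() if not dominated(u)}
```

Running this on the table reproduces the negotiation set derived by hand on the next few slides.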
40
Utility Function for Agents (example from previous slide)
Deal          Utility1   Utility2
1. (a, b)        0          2
2. (b, a)        0          2
3. (ab, ∅)      -2          3
4. (∅, ab)       1          0
5. (a, ab)       0          0
6. (b, ab)       0          0
7. (ab, a)      -2          2
8. (ab, b)      -2          2
9. (ab, ab)     -2          0
41
Individual Rational for Both (eliminate any choices that are negative for either)
All deals:
1. (a, b)   2. (b, a)   3. (ab, ∅)   4. (∅, ab)   5. (a, ab)
6. (b, ab)  7. (ab, a)  8. (ab, b)   9. (ab, ab)
Individual rational:
(a, b), (b, a), (∅, ab), (a, ab), (b, ab)
42
Pareto Optimal Deals
All deals:
1. (a, b)   2. (b, a)   3. (ab, ∅)   4. (∅, ab)   5. (a, ab)
6. (b, ab)  7. (ab, a)  8. (ab, b)   9. (ab, ab)
Pareto optimal:
(a, b), (b, a), (ab, ∅), (∅, ab)
The (ab, ∅) deal is (-2, 3): worse for agent 1, but nothing beats 3 for agent 2, so no deal dominates it.
43
Negotiation Set
Negotiation Set: (a, b), (b, a), (∅, ab)
Individual Rational Deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto Optimal Deals: (a, b), (b, a), (ab, ∅), (∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only deals where both utilities are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
(Axes: utility for i vs. utility for j)
45
Negotiation Set in Task-oriented Domains
(Figure: deal space plotted as utility for agent i vs. utility for agent j, with points labeled A, B, C, D, E. The circle delimits the space of all possible deals; the conflict deal sits at the utility of conflict for each agent. The negotiation set = Pareto optimal + individual rational deals.)
46
Negotiation Protocol: π(δ) = product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: one-step protocol
  - Concession protocol
• At t >= 0, A offers δ(A,t) and B offers δ(B,t), such that:
  - Both deals are from the negotiation set
  - For each agent i and t > 0: Utilityi(δ(i,t)) <= Utilityi(δ(i,t-1)) - I propose something less desirable for me
• Negotiation ending:
  - Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t-1))
  - Agreement: for j ≠ i, Utilityj(δ(i,t)) >= Utilityj(δ(j,t))
    • Only A => agree on δ(B,t) - A agrees with B's proposal
    • Only B => agree on δ(A,t) - B agrees with A's proposal
    • Both A, B => agree on the δ(k,t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
    • Both A, B and π(δ(A)) = π(δ(B)) => flip a coin (the product is the same, but the deals may not be the same for each agent - flip a coin to decide which deal to use)
(applies to pure deals and to mixed deals)
47
The Monotonic Concession Protocol - one direction, move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. Assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: explanation or critique of deal
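The round loop above can be sketched as follows. All names and numbers are invented for illustration: the agents' offers are given as pre-scripted sequences and utilities come from a shared table, whereas real agents would generate offers from a strategy (such as Zeuthen's, below).

```python
def mcp(u, offers_a, offers_b, conflict_deal):
    """u[deal] = (utility to A, utility to B). Returns the agreed deal,
    or conflict_deal if neither agent concedes in some round."""
    prev = None
    for offer_a, offer_b in zip(offers_a, offers_b):
        # agreement: an agent likes the other's offer at least as much as its own
        if u[offer_b][0] >= u[offer_a][0]:
            return offer_b
        if u[offer_a][1] >= u[offer_b][1]:
            return offer_a
        # neither agent conceded since the previous round -> conflict deal
        if prev == (offer_a, offer_b):
            return conflict_deal
        prev = (offer_a, offer_b)
    return conflict_deal
```

If B eventually concedes to a deal A already prefers to its own offer, the loop returns that deal; if both stand still for a round, it returns the conflict deal.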
48
Condition to Consent an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) >= Utility1(δ1), and
Utility2(δ1) >= Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  - Symmetrically distributed (no agent plays a special role)
  - Ensures convergence
  - It will not go on indefinitely
• Disadvantages:
  - Agents can run into conflicts
  - Inefficient - no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy - a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
(Figure: agent 1's best deal at one end, agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round, though it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(Figure: agent 1's best deal and agent 2's best deal, each asking "how much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
- You have lost most of your expected utility (it is closer to zero)
- In case conflict occurs, you are not much worse off
- You are more willing to risk conflict
An agent's willingness to risk conflict compares what it would lose by making a concession against what it would lose by taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) /
         (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively
risk_i is willingness to risk conflict (1 is perfectly willing to risk)
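A one-line sketch of the risk formula, assuming (as in the slides' examples) that the conflict deal is worth 0 to each agent; the function name is invented:

```python
def risk(u_own_offer, u_other_offer):
    """Fraction of its current expected gain agent i loses by standing firm."""
    if u_own_offer == 0:
        return 1.0  # nothing to gain from its own offer: fully willing to risk
    return (u_own_offer - u_other_offer) / u_own_offer
```

Under the Zeuthen strategy, the agent with the smaller risk value concedes; equal risks mean both concede.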
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility.
• What one sets as an initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Figure: Ai's best deal and Aj's best deal at opposite ends, with the conflict deal below; arrows mark "how much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum still hoped for.)
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) - otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  - Simple, and reflects the way human negotiations work
  - Stability - in Nash equilibrium - if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
  - Computationally expensive - players need to compute the entire negotiation set
  - Communication burden - the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivered to a; agent 2 delivered to a and b)
Negotiation Set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
(Figure: agent 1's best deal and agent 2's best deal at opposite ends; each thinks "he should concede")
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain: Example 2 (don't return to distribution point)
(Figure: distribution point with edges of cost 7 to a and to d; chain a-b, b-c, c-d with edges of cost 1)
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure Deal    | Agent 1's Utility | Agent 2's Utility
1   | (abcd, ∅)    |  0 | 10
2   | (abc, d)     |  1 |  3
3   | (ab, cd)     |  2 |  2
4   | (a, bcd)     |  3 |  1
5   | (∅, abcd)    | 10 |  0
Conflict deal: (0, 0)
(agent 1 concedes from deal 5 toward deal 3; agent 2 concedes from deal 1 toward deal 3)
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility, rather than the expected utility of (0, 10) from another choice
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future, and no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (superset of TOD)
• Have side effects - an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example - slotted blocks world: blocks cannot go anywhere on the table - only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" - just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, and written as Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can just do your original task
• With SOD, you may get in each other's way
• Don't accept partially completed plans
The state oriented domain is a bit more powerful than TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 over a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with
• Looking for a state in the intersection of goals
• Possibilities:
  - Both can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
  - Goals may contradict, so there is no mutually acceptable state (e.g. both need a car)
  - Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
  - There might be a reachable state which satisfies both, but it may be too expensive - unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p·c(J)i + (1−p)·c(J)k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the parcel delivery example:
72
Parcel Delivery Domain (assuming they do not have to return home)

[Diagram: a distribution point connected to city a and city b, each at distance 1]

Cost function: c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3

Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…

Utility for agent 2 (originally assigned a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with probability
• ({a,b}, ∅):p means agent 1 does ∅ with probability p and {a,b} with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p·1 = utility for agent 1
• (1−p)(3) + p·0 = utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p ⟹ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
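The arithmetic above can be checked mechanically; a small sketch (the utility numbers are taken from the slide, with exact fractions used to avoid rounding):

```python
from fractions import Fraction

# Mixed deal 3: with probability p agent 1 does nothing (agent 2 delivers
# both); with probability 1-p agent 1 delivers both.  Utilities from the slide:
# agent 1: 1 if it does nothing, -2 if it delivers both
# agent 2: 3 if it does nothing,  0 if it delivers both
def u1(p):  # expected utility for agent 1
    return p * 1 + (1 - p) * (-2)

def u2(p):  # expected utility for agent 2
    return p * 0 + (1 - p) * 3

# The fair p equates the two expected utilities: 3p - 2 = 3 - 3p  =>  p = 5/6
p = Fraction(5, 6)
assert u1(p) == u2(p) == Fraction(1, 2)
print(p, u1(p))  # 5/6 1/2
```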
74
Try again with the other choice in the negotiation set
• ({a}, {b}):p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p·0 = utility for agent 1
• (1−p)(2) + p·2 = utility for agent 2
• Setting them equal gives 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything, with some probability): a mixed deal m = [(TA ∪ TB, ∅) : p] from the negotiation set such that π(m) = max over d in NS of π(d), where π is the product of the utilities
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick up costs 1 and a set down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2. A win – as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) cost split p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes cost 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus I can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to have the agent with the highest worth get his way – as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So with 1/2 probability each: 12·(1/2) + (−4)·(1/2) = 4
• If they could have both been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing" – but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle) – it isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
– Try to find the Pareto optimum
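The weighting and constraint approaches can be sketched as follows. The weights and ranges are the slide's numbers; the price-normalization step and the sample offer are illustrative assumptions (price is a cost, so it must be inverted onto a score before weighting):

```python
# Weighted attributes: utility = price*0.60 + quality*0.15 + support*0.25,
# where the price term is first mapped onto a 0-10 score (assumption:
# cheaper is better, scored linearly over the allowed range).
def weighted_utility(price, quality, support, price_range=(5, 100)):
    lo, hi = price_range
    price_score = 10 * (hi - price) / (hi - lo)   # cheaper -> higher score
    return 0.60 * price_score + 0.15 * quality + 0.25 * support

# Constraint check: price in [5,100], quality in [0,10], support in [1,5]
def acceptable(price, quality, support):
    return 5 <= price <= 100 and 0 <= quality <= 10 and 1 <= support <= 5

offer = dict(price=40, quality=8, support=4)   # hypothetical offer
assert acceptable(**offer)
print(round(weighted_utility(**offer), 2))
```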
89
Incomplete Information
• Agents don't know the tasks of others in TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
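A brute-force check can make the definition concrete. This sketch uses a made-up tree-style cost function (one shared trunk edge plus one edge per city), not the slide's exact map:

```python
from itertools import combinations

# Tree-style delivery cost (illustrative assumption): 1 for the shared trunk
# edge, plus 1 edge per city visited; the empty set costs nothing.
def cost(tasks):
    tasks = frozenset(tasks)
    return 0 if not tasks else 1 + len(tasks)

# Exhaustively verify c(X u Y) <= c(X) + c(Y) over all subset pairs.
def is_subadditive(cost, all_tasks):
    sets = [frozenset(s) for r in range(len(all_tasks) + 1)
            for s in combinations(all_tasks, r)]
    return all(cost(x | y) <= cost(x) + cost(y) for x in sets for y in sets)

print(is_subadditive(cost, {'a', 'b', 'c'}))  # True
```

The saving comes from the shared trunk edge: c({a, b}) = 3 < c({a}) + c({b}) = 4.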
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at a store (I can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domain
• We have 2 sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); each rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost tasks Z add to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats – is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies sub-additivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
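The fax-domain intuition can be checked exhaustively. Here tasks are modeled simply as destination nodes, with cost equal to the number of connections opened (an illustrative simplification of the domain):

```python
from itertools import combinations

# Fax-domain style cost (assumption): one connection per distinct
# destination, so the cost of a task set is just its size.
def cost(destinations):
    return len(destinations)

# Exhaustively verify c(X u Y) = c(X) + c(Y) - c(X n Y) over all subset pairs.
def is_modular(cost, tasks):
    subs = [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]
    return all(cost(x | y) == cost(x) + cost(y) - cost(x & y)
               for x in subs for y in subs)

print(is_modular(cost, {'paris', 'rome', 'tokyo'}))  # True
```

Modularity holds here because connection costs are fully independent: sharing is possible only on the literal intersection of the task sets.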
101
3-dimensional table of Characterization of Relationship: implied relationship between cells; implied relationship with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over A-or-N deals – "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected utility: ({a,b,c}, ∅):1/2 = 5
• Under the lie, p = 1/2 (as the apparent utility is the same)
• Expected utility (for 1): ({a,b,c}, ∅):1/2 = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
[Diagram: delivery network with edge costs 1, 4, 4, 1]
103
• FP2: in Subadditive TOD, any ONM over Mixed deals – every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals – no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals – no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown in the table (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1's tasks | U(1) | Agent 2's tasks | U(2) seems | U(2) actual
{a}   | 2 | {b,c} | 4 | 4
{b}   | 4 | {a,c} | 2 | 2
{b,c} | 2 | {a}   | 4 | 2
{a,b} | 0 | {c}   | 6 | 6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals – "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is ({a,b}, ∅):1/2 = 1/2(4) + 1/2(2) = 3
• Lie: ({b,c}, {a}) is the logical division, as there is no percentage
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals – "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b,c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⟹ 14p = 9
• If he invents task h, p = 11/18, as
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
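The p values follow from solving the indifference equation p·uA + (1−p)·uB = p·uC + (1−p)·uD; a quick check with the slide's payoffs:

```python
from fractions import Fraction as F

# All-or-nothing deal: with probability p agent 1 delivers everything,
# otherwise agent 2 does.  Solve p*a + (1-p)*b = p*c + (1-p)*d for p.
def solve(a, b, c, d):
    return F(b - d, (c - d) - (a - b))

p_truth = solve(-1, 6, 4, -3)   # truthful declaration
p_lie = solve(-3, 6, 4, -5)     # with the decoy letter to h
assert p_truth == F(9, 14) and p_lie == F(11, 18)

u = lambda p: p * (-1) + (1 - p) * 6   # agent 1's TRUE expected utility
assert u(p_truth) == F(3, 2)           # 1.5
assert u(p_lie) == F(31, 18)           # ~1.72: the decoy lie helped
print(p_truth, p_lie, u(p_lie))
```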
107
Postmen – return to post office
[Diagrams: a concave example; a subadditive example (h is the decoy); a phantom example]
108
Non incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals – "hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex3 (from the next slide): A1 hides his letter to node b
• ({e}, {b}): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• ({b}, {e}): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals – "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(({f,a,e}, {b,c,d}):1/2) = 4 (each saves going to two places)
• Under the lie, dividing as ({e,f,d,c}, {a,b}):p, you always win and I always lose. Since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie: ({a,b}, {c,d,e,f}):p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned c,d,e,f (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
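The FP8 arithmetic can be replayed the same way (the payoff numbers come from the slide; the deal encoding is shorthand):

```python
from fractions import Fraction as F

# Under the lie, the deal ({a,b}, {c,d,e,f}) gets probability p from the
# apparent indifference equation  p*4 + (1-p)*0 = p*2 + (1-p)*6.
p = F(3, 4)                       # 4p = -4p + 6  =>  p = 3/4
assert p * 4 == p * 2 + (1 - p) * 6

# Agent 1's ACTUAL expected utility: 6 when it wins the light side, 0 when
# it must still deliver the hidden letter after doing c,d,e,f.
actual = p * 6 + (1 - p) * 0
assert actual == F(9, 2)          # 4.5 > 4, so hiding the letter paid off
print(p, actual)
```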
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (like in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, which reduces computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes the joint payoff
– Minimizes the payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).

[Graph: worth function for the first agent, rising from 0 to 100 over the times 9:00, 12:00, 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement

[Graph: utility vs. number of negotiation rounds; agent i's and agent j's offers converge over time to the point of acceptance]
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable

[Graph: utility vs. number of negotiation rounds; agent i's and agent j's offer curves never meet, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
• Grounds ⊆ Database
• Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked by a set if some member y of the set attacks it (y → x)
2. argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
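These four definitions translate almost directly into code. A minimal sketch over a hypothetical attack graph (the chain a → b → c → d is made up for illustration):

```python
# Dung-style abstract argumentation: attacks is a set of (attacker, target)
# pairs.  A set S is conflict-free if no member attacks another; x is
# acceptable w.r.t. S if every attacker of x is attacked by some member of S.
def conflict_free(S, attacks):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S, attacks):
    return all(any((z, y) in attacks for z in S)
               for (y, t) in attacks if t == x)

def admissible(S, attacks):
    return conflict_free(S, attacks) and all(acceptable(x, S, attacks) for x in S)

attacks = {('a', 'b'), ('b', 'c'), ('c', 'd')}   # hypothetical graph
assert admissible({'a', 'c'}, attacks)      # a defends c against b
assert not admissible({'b', 'd'}, attacks)  # b's attacker a goes unanswered
print("ok")
```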
132
[Diagram: argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
38
Dominant Deals
• Deal δ dominates deal δ′ if δ is better for at least one agent and not worse for the other, i.e.:
– δ is at least as good for every agent as δ′: ∀k ∈ {1,2}, Utilityk(δ) ≥ Utilityk(δ′)
– δ is better for some agent than δ′: ∃k ∈ {1,2}, Utilityk(δ) > Utilityk(δ′)
• Deal δ weakly dominates deal δ′ if at least the first condition holds (the deal isn't worse for anyone)
Any reasonable agent would prefer (or go along with) δ over δ′ if δ dominates or weakly dominates δ′
39
Negotiation Set Space of Negotiation
• A deal δ is called individual rational if δ weakly dominates the conflict deal (it is no worse than what you have already)
• A deal δ is called Pareto optimal if there does not exist another deal that dominates δ (the best deal for x without disadvantaging y)
• The set of all deals that are individual rational and Pareto optimal is called the negotiation set (NS)
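Putting the definitions together, the negotiation set for the nine-deal parcel example can be computed directly from the utility pairs listed on the next slide (the string encoding of each deal is just shorthand):

```python
# Deals as (agent 1's tasks, agent 2's tasks) with (Utility1, Utility2).
utils = {
    ('a', 'b'): (0, 2),   ('b', 'a'): (0, 2),   ('ab', ''): (-2, 3),
    ('', 'ab'): (1, 0),   ('a', 'ab'): (0, 0),  ('b', 'ab'): (0, 0),
    ('ab', 'a'): (-2, 2), ('ab', 'b'): (-2, 2), ('ab', 'ab'): (-2, 0),
}

def dominates(u, v):   # better for one agent, not worse for the other
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

conflict = (0, 0)      # each agent just performs its own original tasks
rational = {d for d, u in utils.items()
            if all(a >= b for a, b in zip(u, conflict))}
pareto = {d for d, u in utils.items()
          if not any(dominates(v, u) for v in utils.values())}

NS = rational & pareto
assert NS == {('a', 'b'), ('b', 'a'), ('', 'ab')}
print(sorted(NS))
```

This reproduces the next slides: five individually rational deals, four Pareto optimal deals, and their three-deal intersection.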
40
Utility Function for Agents (example from previous slide)
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
5. Utility1({a}, {a,b}) = 0
6. Utility1({b}, {a,b}) = 0
7. Utility1({a,b}, {a}) = −2
8. Utility1({a,b}, {b}) = −2
9. Utility1({a,b}, {a,b}) = −2

1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
5. Utility2({a}, {a,b}) = 0
6. Utility2({b}, {a,b}) = 0
7. Utility2({a,b}, {a}) = 2
8. Utility2({a,b}, {b}) = 2
9. Utility2({a,b}, {a,b}) = 0
41
Individual Rational for Both (eliminate any choices that are negative for either)
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b})
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})

Individually rational:
({a}, {b})
({b}, {a})
(∅, {a,b})
({a}, {a,b})
({b}, {a,b})
42
Pareto Optimal Deals
1. ({a}, {b})
2. ({b}, {a})
3. ({a,b}, ∅)
4. (∅, {a,b})
5. ({a}, {a,b})
6. ({b}, {a,b})
7. ({a,b}, {a})
8. ({a,b}, {b})
9. ({a,b}, {a,b})

Pareto optimal:
({a}, {b})
({b}, {a})
({a,b}, ∅)
(∅, {a,b})
(Deals 5 and 6 are beaten by the (∅, {a,b}) deal; deal 3 gives (−2, 3), but nothing beats 3 for agent 2.)
43
Negotiation Set

Negotiation set:
({a}, {b})
({b}, {a})
(∅, {a,b})

Individually rational deals:
({a}, {b})
({b}, {a})
(∅, {a,b})
({a}, {a,b})
({b}, {a,b})

Pareto optimal deals:
({a}, {b})
({b}, {a})
({a,b}, ∅)
(∅, {a,b})
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only the deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?

[Scatter plot: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Diagram: deals A–E plotted by utility for agent i vs. utility for agent j. The circle delimits the space of all possible deals; the conflict deal sits at the pair of conflict utilities; the negotiation set (Pareto optimal + individual rational) is the boundary arc above and to the right of the conflict deal.]
46
Negotiation Protocol: π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol (a one-step protocol)
• Concession protocol:
• At t ≥ 0, A offers δ(A,t) and B offers δ(B,t), such that:
– both deals are from the negotiation set
– ∀i and t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t−1)) (no one concedes)
– Agreement: ∃j ≠ i, Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t))
• Only A ⟹ agree on δ(B,t) (A agrees with B's proposal)
• Only B ⟹ agree on δ(A,t) (B agrees with A's proposal)
• Both A and B ⟹ agree on δ(k,t) such that π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
• Both A and B and π(δ(A,t)) = π(δ(B,t)) ⟹ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Pure deals
Mixed deal
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta data: an explanation or critique of the deal
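The rules above can be sketched as a protocol loop. This is a minimal sketch under a toy concession strategy (each agent moves one step along a fixed list of offers; the deals and utility functions are made up, and a real strategy such as Zeuthen's decides who concedes and by how much):

```python
# Monotonic Concession Protocol skeleton: both agents propose each round;
# agreement when an agent likes the opponent's offer at least as much as
# its own; the conflict deal when neither can concede further.
def mcp(offers_a, offers_b, ua, ub, conflict_deal):
    i = j = 0
    while True:
        a, b = offers_a[i], offers_b[j]
        if ua(b) >= ua(a):
            return b                      # A consents to B's proposal
        if ub(a) >= ub(b):
            return a                      # B consents to A's proposal
        conceded = False
        if i + 1 < len(offers_a):         # A concedes one step
            i, conceded = i + 1, True
        if j + 1 < len(offers_b):         # B concedes one step
            j, conceded = j + 1, True
        if not conceded:
            return conflict_deal          # no one moved: conflict

# Toy deals: a deal is agent A's share of 10 units of work.
ua = lambda d: 10 - d                     # A wants a small share
ub = lambda d: d                          # B wants A to take a large share
print(mcp([0, 2, 4, 6], [10, 8, 6, 4], ua, ub, None))
```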
48
Condition to Consent an Agreement
If each agent finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (it is a way of telling others what you value)
[Diagram: agent 1's best deal at one end, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(Diagram: between agent 1's best deal and agent 2's best deal, each agent asks "How much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost expected utility (your current offer is worth closer to zero)
– If conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when the difference between its loss from making a concession and its loss from taking the conflict deal (with respect to its current offer) is small.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agents i and j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk a conflict).
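The formula translates directly into code. A sketch; the convention that risk is 1 when an agent's own offer is worth 0 mirrors the note that 1 means perfectly willing to risk conflict, and the second example uses hypothetical numbers:

```python
def risk(u_own_offer, u_their_offer):
    """Zeuthen risk: fraction of current utility lost by causing a conflict."""
    if u_own_offer == 0:
        return 1.0                      # nothing to lose: fully willing to risk
    return (u_own_offer - u_their_offer) / u_own_offer

# Own offer worth 1, opponent's offer worth 0 (as in the parcel example a
# few slides on): the agent is fully willing to risk conflict.
print(risk(1, 0))    # 1.0
print(risk(10, 4))   # 0.6 -- hypothetical numbers: 60% of utility at stake
```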
55
Risk Evaluation
• risk_i measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility function.
• What one sets as an initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(Diagram: deals ranging from Ai's best deal to Aj's best deal, with the conflict deal marked; each agent weighs how much it is willing to risk a conflict against the maximum it can gain from agreement and the maximum it still hopes to gain.)
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again in the next round.) But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
bull Advantages
ndash Simple and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy,
then the other can do no better than using it him/herself
bull Disadvantages
ndash Computationally expensive ndash players need to compute the entire
negotiation set
ndash Communication burden ndash negotiation process may involve
several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for agent 1, 1 for agent 2
Can they reach an agreement? Who will concede?
60
Conflict Deal
(Diagram: across the range from agent 1's best deal to agent 2's best deal, each agent insists "He should concede" about the other.)
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(Diagram: a distribution point with destinations a, b, c, d; a and d are at distance 7 from the distribution point, with b and c between them on unit-length links.)
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are IR (individually rational), as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal | Agent 1's utility | Agent 2's utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
Conflict deal | | 0 | 0
(Diagram: deals ordered 5, 4, 3, 2, 1 from agent 1's best to agent 2's best.)
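The negotiation in this table can be simulated. A minimal Python sketch of the Zeuthen strategy, in which each agent opens with its own best deal and "minimal concession" is simplified to "the next deal that strictly improves the opponent's utility" (deal names and utilities are from the table above; ∅ marks an empty role):

```python
deals = {
    "(abcd, ∅)": (0, 10),
    "(abc, d)": (1, 3),
    "(ab, cd)": (2, 2),
    "(a, bcd)": (3, 1),
    "(∅, abcd)": (10, 0),
}

def risk(my_offer, their_offer, me):
    u_mine, u_theirs = deals[my_offer][me], deals[their_offer][me]
    return 1.0 if u_mine == 0 else (u_mine - u_theirs) / u_mine

def concede(offer, me, them):
    # Smallest concession: among deals giving the opponent strictly more,
    # pick the one best for the conceding agent.
    better = [d for d in deals if deals[d][them] > deals[offer][them]]
    return max(better, key=lambda d: deals[d][me])

def zeuthen():
    offer1 = max(deals, key=lambda d: deals[d][0])   # each opens with its best
    offer2 = max(deals, key=lambda d: deals[d][1])
    while True:
        if deals[offer2][0] >= deals[offer1][0]:     # agent 1 accepts
            return offer2
        if deals[offer1][1] >= deals[offer2][1]:     # agent 2 accepts
            return offer1
        r1, r2 = risk(offer1, offer2, 0), risk(offer2, offer1, 1)
        if r1 <= r2:                                 # lower risk concedes;
            offer1 = concede(offer1, 0, 1)           # both concede on a tie
        if r2 <= r1:
            offer2 = concede(offer2, 1, 0)

print(zeuthen())   # "(ab, cd)" -- equal risks each round, so both concede
```

Both agents face equal risk in every round, so both concede twice and meet at (ab, cd) with utilities (2, 2), matching the slide's remark that Zeuthen works here.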
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of (0, 10) for another choice
bull Is there a solution
bull Fair versus higher global utility
bull Restrictions of this method (no promises for future or sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
bull Have side effects - agent doing one action might hinder or help another agent Example on(whitegray) has side effect of clear(black)
bull Negotiation develop joint plans and schedules for the agents to help and not hinder other agents
bull Example ndash Slotted blocks world -blocks cannot go anywhere on table ndash only in slots (restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD coordinated plans are never worse, as you can just do your original task.
• With SOD you may get in each other's way.
• Don't accept partially completed plans.
State oriented domain is a bit more powerful than TOD
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform all tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We look for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation? (Why should one agent agree to do more?)
70
Mixed deal
bull Instead of picking the plan that is unfair to one agent (but better overall) use a lottery
bull Assign a probability that one would get a certain plan
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming do not have to return home)
Distribution point
(Diagram: the distribution point is 1 from city a and 1 from city b; a and b are 2 apart.)
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• ⟨(∅, ab) : p⟩ means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p  ⇒  6p = 5  ⇒  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
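A quick check of this arithmetic with exact fractions (agent 1 does nothing with probability p and delivers everything with probability 1−p):

```python
from fractions import Fraction

# Expected utilities for the mixed deal <(∅, ab) : p> from the slide.
def u1(p): return (1 - p) * Fraction(-2) + p * Fraction(1)
def u2(p): return (1 - p) * Fraction(3) + p * Fraction(0)

p = Fraction(5, 6)
print(u1(p), u2(p))   # both 1/2: the lottery is fair at p = 5/6
```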
74
Try again with other choice in negotiation set
• ⟨(a, b) : p⟩ means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• An all-or-nothing deal (one agent does everything) is a mixed deal of the form m = ⟨(T_A ∪ T_B, ∅) : p⟩, with π(m) = max over deals d in the negotiation set of π(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (a fairer division). Try dividing a total utility of 10 (zero-sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
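The suggested exercise is quick to run: splitting a fixed total of 10, the product u·(10−u) is maximized at the even split, which is why the product rule favors fair divisions.

```python
# Divide a total utility of 10 between two agents in every integer way
# and find the split whose product of utilities is largest.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best, best[0] * best[1])   # the 5/5 split, product 25 (vs e.g. 8*2=16)
```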
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other's block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is at 1 and black at 2, with two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but with unequal roles.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair".
• Assign the (2, 6) role split – A1 the cost-2 role, A2 the cost-6 role – p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3)   (it loses utility if it pays 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
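Both fairness equations on these slides are linear in p, so one small solver covers them. `fair_p` is an illustrative helper, not from the text: each argument is (utility if outcome A, utility if outcome B), where outcome A occurs with probability p.

```python
from fractions import Fraction

def fair_p(agent1, agent2):
    (a0, a1), (b0, b1) = agent1, agent2
    # Solve p*a0 + (1-p)*a1 = p*b0 + (1-p)*b1 for p.
    return Fraction(b1 - a1, (a0 - a1) - (b0 - b1))

print(fair_p((1, -3), (0, 4)))   # 7/8, the split derived above
print(fair_p((1, -2), (0, 3)))   # 5/6, the earlier parcel-delivery lottery
```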
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (again, numbers out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability ½ each: ½(12) + ½(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then utility is 26 − 24 = 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing" – but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle) – it isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of it, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate utility?
• Weighting each attribute:
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute:
– price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
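The weighted-attribute rule can be sketched as follows. The 60/15/25 weights come from the slide; normalizing each attribute onto [0, 1] and treating a lower price as better are added assumptions for illustration:

```python
def norm(x, lo, hi):
    """Scale an attribute value onto [0, 1]."""
    return (x - lo) / (hi - lo)

def utility(price, quality, support):
    cheapness = 1 - norm(price, 5, 100)       # price in [5, 100]; lower is better
    return (0.60 * cheapness
            + 0.15 * norm(quality, 0, 10)     # quality in [0, 10]
            + 0.25 * norm(support, 1, 5))     # support in [1, 5]

print(utility(price=50, quality=8, support=4))   # ≈ 0.62
```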
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves the distance to the other (in a tree arrangement)
• Example of subadditive TOD with equality (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces – say, electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
– Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
– Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule at the beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help).
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others
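These domain properties can be checked mechanically on small cost tables. The two domains below are illustrative: a fax-style domain with independent per-destination costs (modular by construction), and two deliveries down one road where serving the farther stop covers the nearer one (subadditive but not modular):

```python
from itertools import chain, combinations

def subsets(tasks):
    return [frozenset(s) for s in chain.from_iterable(
        combinations(sorted(tasks), r) for r in range(len(tasks) + 1))]

def is_subadditive(tasks, c):
    return all(c[x | y] <= c[x] + c[y]
               for x in subsets(tasks) for y in subsets(tasks))

def is_modular(tasks, c):
    return all(c[x | y] == c[x] + c[y] - c[x & y]
               for x in subsets(tasks) for y in subsets(tasks))

# Fax-style domain: independent per-destination connection costs just add.
fax_cost = {"a": 3, "b": 5}
c_fax = {s: sum(fax_cost[t] for t in s) for s in subsets(fax_cost)}
print(is_modular({"a", "b"}, c_fax))        # True

# One road, no return: b (distance 5) passes a (distance 3), so c(ab) = 5.
c_road = {frozenset(): 0, frozenset("a"): 3,
          frozenset("b"): 5, frozenset("ab"): 5}
print(is_subadditive({"a", "b"}, c_road))   # True
print(is_modular({"a", "b"}, c_road))       # False: 5 != 3 + 5 - 0
```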
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway), but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents send faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the fax domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of the characterization of relationships: implied relationships between cells; implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides a letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2
• Expected utility: ⟨(abc) : 1/2⟩ = 5
• Under the lie: p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1): ⟨(abc) : 1/2⟩ = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
103
• FP2: in subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out)
• FP3: in concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just as in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's role | U(1) | Agent 2's role | U(2) (seems) | U(2) (actual)
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6
105
Non-incentive compatible fixed points
• FP5: in concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = ½, so the utility for agent 1 under ⟨(ab) : ½⟩ is ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage split
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (it changes the probability: "if you deliver, I make you deliver to h")
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to ≈1.72. (If A1 delivers, he doesn't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as:
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as:
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
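The FP6 probabilities can be re-derived by solving the linear equation for p; this is just a check of the slide's arithmetic, with the equalizing equation p·u1_deliver + (1−p)·u1_not = p·u2_deliver + (1−p)·u2_not:

```python
from fractions import Fraction as F

def solve_p(u1_deliver, u1_not, u2_deliver, u2_not):
    # Solve p*u1_deliver + (1-p)*u1_not = p*u2_deliver + (1-p)*u2_not for p.
    return F(u1_not - u2_not, (u2_deliver - u2_not) - (u1_deliver - u1_not))

p_truth = solve_p(-1, 6, 4, -3)   # 9/14
p_decoy = solve_p(-3, 6, 4, -5)   # 11/18

# Agent 1's *actual* expected utility uses -1 either way, since the decoy
# letter to h is never really delivered by its inventor.
def actual_u1(p): return p * F(-1) + (1 - p) * F(6)
print(actual_u1(p_truth))   # 3/2  = 1.5
print(actual_u1(p_decoy))   # 31/18, about 1.72 -- the lie helped
```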
107
Postmen – return to the post office
(Diagrams: the concave example with the phantom letter, and the subadditive example where h is the decoy.)
108
Non incentive compatible fixed points
• FP7: in modular TOD, in any ONM over pure deals, a "hide" lie can be beneficial (you think I have less, so an increased load will cost more than it really does)
• Example 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial
• Example 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(⟨(fae, bcd) : ½⟩) = 4 (each saves going to two nodes)
• Under the lie, dividing as ⟨(ef, dcab) : p⟩ (you always win and I always lose): since the work is the same, swapping cannot help – in a mixed deal the choices must be unbalanced
• Try again under the lie with ⟨(abc, def) : p⟩:
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually:
• ¾(6) + ¼(0) = 4.5
• Note: when I get assigned cdef (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counterproposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused; reduces computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
Worth function for the first agent
(Graph: worth rising from 0 to 100 over meeting times 0900, 1200, 1600.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Graph: utility versus number of negotiation rounds; Agent_i's and Agent_j's offer curves converge to a point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: Agent_j finds the offer unacceptable
(Graph: utility versus number of negotiation rounds; Agent_i's and Agent_j's offer curves never meet.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified. Why did the agent pay so much for the car?
– Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: a pair (Sentence, Grounds) over a database Δ, where:
– Δ is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula, known as the conclusion
– Grounds is a set of logical formulae such that Grounds ⊆ Δ and Sentence can be proved from Grounds
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is in (true) unless proven otherwise
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x)
2. Argument x is acceptable (with respect to S) if every attacker of x is attacked by S
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
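These definitions are easy to check by brute force. The attack relation below is a hypothetical stand-in for the diagram on the next slide (a and b attack each other and both attack c; d is unattacked), chosen so that, as that slide says, c is always attacked and d is always acceptable:

```python
from itertools import chain, combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # every attacker of x is itself attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

all_sets = chain.from_iterable(combinations(sorted(args), r) for r in range(5))
result = [set(s) for s in all_sets if admissible(set(s))]
print(result)   # c never appears; d appears alone and alongside a or b
```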
132
(Diagram: an attack graph over arguments a, b, c, and d.)
Which sets of arguments can be "in"? c is always attacked;
d is always acceptable.
133
An Example Abstract Argument System
-
39
Negotiation Set: Space of Negotiation
• A deal is called individually rational if it weakly dominates the conflict deal (it is no worse than what you already have).
• A deal is called Pareto optimal if no other deal dominates it (you cannot make the deal better for one agent without disadvantaging the other).
• The set of all deals that are both individually rational and Pareto optimal is called the negotiation set (NS).
40
Utility Function for Agents (example from previous slide)
Agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = -2
8. Utility1(ab, b) = -2
9. Utility1(ab, ab) = -2
Agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individually Rational for Both (eliminate any deal that is negative for either agent)
All deals:
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Individually rational:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
The remaining deals are each beaten by one of these. Deal 3, (ab, ∅), gives utilities (-2, 3); although -2 is bad for agent 1, nothing beats 3 for agent 2, so it is still Pareto optimal.
43
Negotiation Set
Negotiation set:
(a, b)
(b, a)
(∅, ab)
Individually rational deals:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
Pareto optimal deals:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
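The filtering steps above can be sketched in code. This is my own illustration (not from the slides): it recomputes the negotiation set for the two-letter parcel example, using the utility tables from the earlier slides. The empty task set ∅ is written as an empty string.

```python
# Utilities for each pure deal (agent 1's tasks, agent 2's tasks),
# taken from the tables on the previous slides.
u1 = {("a", "b"): 0, ("b", "a"): 0, ("ab", ""): -2, ("", "ab"): 1,
      ("a", "ab"): 0, ("b", "ab"): 0, ("ab", "a"): -2, ("ab", "b"): -2,
      ("ab", "ab"): -2}
u2 = {("a", "b"): 2, ("b", "a"): 2, ("ab", ""): 3, ("", "ab"): 0,
      ("a", "ab"): 0, ("b", "ab"): 0, ("ab", "a"): 2, ("ab", "b"): 2,
      ("ab", "ab"): 0}
deals = list(u1)

# Individually rational: weakly dominates the conflict deal,
# i.e. utility >= 0 for both agents.
ir = [d for d in deals if u1[d] >= 0 and u2[d] >= 0]

# Pareto optimal: no other deal is at least as good for both agents
# and strictly better for at least one.
def dominated(d):
    return any(u1[e] >= u1[d] and u2[e] >= u2[d]
               and (u1[e], u2[e]) != (u1[d], u2[d])
               for e in deals)

po = [d for d in deals if not dominated(d)]

# Negotiation set = individually rational AND Pareto optimal.
ns = [d for d in ir if d in po]
```

Running this reproduces the three slides: `ir` matches the individually rational list, `po` the Pareto optimal list, and `ns` comes out as (a, b), (b, a), (∅, ab).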
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j.
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal).
• Which are Pareto optimal?
[Axes: utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: utility for agent i (horizontal) vs. utility for agent j (vertical). The conflict deal sits at the two agents' conflict utilities. A circle delimits the space of all possible deals. The negotiation set (Pareto optimal + individually rational) is the boundary arc above and to the right of the conflict deal.]
46
Negotiation Protocol
• Product-maximizing negotiation protocol: a one-step protocol that picks the deal maximizing the product of the two agents' utilities.
• Concession protocol:
– At each step t >= 0, A offers δ(A, t) and B offers δ(B, t), such that both deals are from the negotiation set and, for t > 0, Utility_i(δ(i, t)) <= Utility_i(δ(i, t-1)): each agent proposes something no more desirable for itself than its previous offer.
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t-1)) for both agents (no one concedes).
– Agreement: for some j ≠ i, Utility_j(δ(i, t)) >= Utility_j(δ(j, t)).
• If only A agrees, the agents settle on δ(B, t); if only B agrees, on δ(A, t); if both agree, on whichever of δ(A, t), δ(B, t) has the larger utility product; if both agree and the products are equal, flip a coin (the product is the same, but the split may not be the same for each agent, so a coin decides which deal to use).
• Applies to pure deals and to mixed deals.
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same deal later).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values.
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal.
• Metadata: an explanation or critique may accompany a deal.
48
Condition to Consent to an Agreement
If each agent finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) >= Utility1(δ1)
and
Utility2(δ1) >= Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role).
– Ensures convergence; it will not go on indefinitely.
• Disadvantages:
– Agents can run into conflicts.
– Inefficient: no guarantee that an agreement will be reached quickly.
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is also a way of telling the other agent what you value.)
[Diagram: agent 1's best deal at one end, agent 2's best deal at the other.]
52
The Zeuthen Strategy
Q: I make a proposal in every round (possibly the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
Each agent asks itself: how much am I willing to risk a conflict?
[Diagram: agent 1's best deal at one end, agent 2's best deal at the other.]
53
Willingness to Risk Conflict
Suppose you have already conceded a lot. Then:
– Your expected utility from the deal is low (closer to zero).
– In case conflict occurs, you are not much worse off.
– So you are more willing to risk conflict.
An agent's willingness to risk conflict is measured by comparing the utility it would lose by making a concession with the utility it would lose by taking the conflict deal, relative to its current offer.
• If both agents are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer.
• How much you will lose if you stand still, which causes a conflict.
risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 = perfectly willing to risk it).
55
Risk Evaluation
• risk_i measures the fraction of utility you still stand to gain. If it is close to one, you have gained little so far (and are more willing to risk conflict).
• This assumes you know the other agent's utility function.
• What you set as your initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher.
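The risk formula above can be written as a small helper. This is my own sketch (not from the slides); `risk` and `who_concedes` are hypothetical names.

```python
def risk(u_own_offer, u_their_offer):
    """Zeuthen risk: risk_i = (U_i(delta_i) - U_i(delta_j)) / U_i(delta_i).

    The fraction of its current utility an agent would lose by conceding,
    i.e. its willingness to risk conflict (1 = perfectly willing).
    """
    if u_own_offer <= 0:
        # Nothing to lose by conflict: fully willing to risk it.
        return 1.0
    return (u_own_offer - u_their_offer) / u_own_offer

def who_concedes(r1, r2):
    """The agent with the smaller risk has more to lose and should concede."""
    if r1 == r2:
        return "both"
    return "agent1" if r1 < r2 else "agent2"
```

For example, an agent whose own offer is worth 4 and whose opponent's offer is worth 2 has risk (4 - 2) / 4 = 0.5, and would concede to an opponent whose risk is 1.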
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Diagram: agent i's best deal and agent j's best deal at opposite ends, with the conflict deal between them; each agent weighs "how much am I willing to risk a conflict?", the maximum it could gain from agreement, and the maximum it still hopes to gain.]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work.
– Stability: in Nash equilibrium; if one agent is using the strategy, the other can do no better than using it him/herself.
• Disadvantages:
– Computationally expensive: players need to compute the entire negotiation set.
– Communication burden: the negotiation process may involve several steps.
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent.
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: agent 1's best deal and agent 2's best deal; each thinks the other should concede.]
Zeuthen does not reach a settlement here: neither agent will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: distribution point connected to a and d at cost 7 each; edges a-b, b-c, c-d at cost 1 each.]
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All of these deals are individually rational, since an agent can do no worse than the conflict deal; a non-contiguous split such as (ac, bd) is dominated by the contiguous splits listed above.
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)
No.  Pure deal       Agent 1's utility   Agent 2's utility
1    (abcd, ∅)       0                   10
2    (abc, d)        1                   3
3    (ab, cd)        2                   2
4    (a, bcd)        3                   1
5    (∅, abcd)       10                  0
     Conflict deal   0                   0
Offer sequence: agent 1 proposes deals 5, 4, 3; agent 2 proposes deals 1, 2, 3.
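The table above can be fed into a toy simulation of the monotonic concession protocol with the Zeuthen strategy. This is my own sketch, not code from the slides; it assumes both agents know both utility columns and that a concession means moving one step along the list of deals.

```python
# (U1, U2) for deals 1..5 from the slide's table.
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(u_own, u_other):
    """Zeuthen willingness to risk conflict."""
    return 1.0 if u_own <= 0 else (u_own - u_other) / u_own

i, j = 4, 0                  # agent 1 opens with deal 5, agent 2 with deal 1
trace = [(i + 1, j + 1)]     # record offers as 1-based deal numbers
while True:
    u1_own, u1_off = deals[i][0], deals[j][0]   # agent 1's view of both offers
    u2_own, u2_off = deals[j][1], deals[i][1]   # agent 2's view of both offers
    if u1_off >= u1_own or u2_off >= u2_own:    # someone's offer is acceptable
        break
    r1, r2 = risk(u1_own, u1_off), risk(u2_own, u2_off)
    if r1 <= r2:             # lower (or equal) risk concedes; equal -> both
        i -= 1
    if r2 <= r1:
        j += 1
    trace.append((i + 1, j + 1))

# trace is [(5, 1), (4, 2), (3, 3)]: equal risks at every round, so both
# concede each time and they agree on deal 3 with utilities (2, 2).
```

This reproduces the offer sequence on the slide: 5, 4, 3 for agent 1 against 1, 2, 3 for agent 2.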
63
What bothers you about the previous agreement?
• The agents both settle for (2, 2) utility rather than gambling on a choice like (0, 10), which has a higher total.
• Is there a better solution?
• Fairness versus higher global utility.
• Restrictions of this method: no promises about the future, and no sharing of utility.
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect clear(black).
• Negotiation: develop joint plans and schedules so that agents help, and do not hinder, each other.
• Example: slotted blocks world. Blocks cannot go just anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together"; it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In a TOD you cannot do another agent's task as a side effect of doing yours, or get in their way.
• In a TOD coordinated plans are never worse, as you can always just do your original task.
• With an SOD you may get in each other's way.
• Don't accept partially completed plans.
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car).
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (they could travel together, but may need to know how to pick the other up).
– There might be a reachable state that satisfies both, but it may be too expensive, and the agents are unwilling to expend the effort (i.e., we could save a bit by car-pooling, but it is too complicated for so little gain).
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall (8 vs. 10), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent would get a certain role.
• This is called a mixed deal: a deal with a probability attached. Compute the probability so that the expected utility is the same for both agents.
71
Cost
• If δ = (J, p) is a mixed deal, then cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent: agent i plays its own role with probability p and k's role with probability (1-p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the agent's part of the joint plan.
• See the parcel delivery example:
72
Parcel Delivery Domain (assuming the agents do not have to return home)
[Figure: distribution point connected to city a and city b at distance 1 each; a-b edge of length 2.]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (original task: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• [(∅, ab); p] means agent 1 does ∅ with probability p and ab with probability (1-p).
• What should p be to be fair to both (equal expected utility)?
• (1-p)(-2) + p(1) = expected utility for agent 1
• (1-p)(3) + p(0) = expected utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p, so 6p = 5 and p = 5/6.
• If agent 1 does no deliveries 5/6 of the time, the deal is fair.
74
Try again with other choice in negotiation set
• [(a, b); p] means agent 1 delivers a with probability p and b with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = 0 = utility for agent 1
• (1-p)(2) + p(2) = 2 = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair? Both roles give each agent the same utility, so the probability changes nothing.
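The two calculations above can be generalized into a small solver. This is a hypothetical helper, not part of the slides: it finds the probability p that equalizes the two agents' expected utilities in a mixed deal, or reports that no such p exists.

```python
from fractions import Fraction

def fair_p(u1_role, u1_swap, u2_role, u2_swap):
    """Solve p*u1_role + (1-p)*u1_swap == p*u2_role + (1-p)*u2_swap for p.

    u*_role is an agent's utility when agent 1 plays its nominal role
    (probability p); u*_swap is its utility when the roles are swapped.
    Returns an exact Fraction, or None if the utilities can never be equal.
    """
    denom = (u1_role - u1_swap) - (u2_role - u2_swap)
    if denom == 0:
        return None
    return Fraction(u2_swap - u1_swap, denom)

# Deal 3: agent 1 does nothing with probability p, everything with 1-p.
print(fair_p(1, -2, 0, 3))   # 5/6, as derived on the slide

# Deal (a, b): both roles give each agent the same utility, so no p works.
print(fair_p(0, 0, 2, 2))    # None
```

Using exact fractions avoids floating-point noise in probabilities like 5/6.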
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅); p], with p chosen so that m maximizes the product of utilities over the negotiation set: NS(m) = max_d NS(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of utilities, as this is a fairer division. Try dividing a total utility of 10 (zero-sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias (like switching goals in soccer).
77
Examples: Cooperative (each is helped by the joint plan)
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2 (the goals are compatible).
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• Done by one agent alone the cost would be 4, so the utility to each is 2.
78
Examples: Compromise (both can succeed, but each does worse than if the other agent weren't there)
• Slotted blocks world: initially the white block is at slot 1, the black block at slot 2, and two gray blocks at slot 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one agent working alone is 2; achieving both goals averages a cost of 4.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, since the other agent's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve a common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we can use probability to make it "fair".
• Assign the (2, 6) split, with A1 taking the cost-2 role, with probability p.
• Utility for agent 1 = p(1) + (1-p)(-3): it loses utility if it spends 6 for a benefit of 3.
• Utility for agent 2 = p(0) + (1-p)(4).
• Solving for p by setting the utilities equal: 4p - 3 = 4 - 4p, so p = 7/8.
• Thus an unfair division can be made fair.
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins; better than both losing. The weighting of the coin needn't be 50-50.
• It may make sense to let the agent with the highest worth get its way, as the total utility is greater (that agent would accomplish its goal alone anyway). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1' swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers pulled out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way on the rest.
83
Example: semi-cooperative, continued
• Winning agent's utility: 26 - 4 - 10 = 12.
• Losing agent's utility: -4 (it helped with the swap).
• So with probability 1/2 each way, the expected utility is (1/2)(12) + (1/2)(-4) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their expected utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (removing the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation over a joint plan, schedules, and goal relaxation; the agents may reach a state that is a little worse than the ultimate objective.
• Example: multi-agent tile world (like an airport shuttle); worth isn't just a specific state but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support: a multiple set of attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate utility?
• Weighting each attribute:
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum.
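The weighted-attribute scheme can be sketched as follows. The 60/15/25 weights come from the slide; the normalization of raw attribute scores to [0, 1] and the sample numbers are my own assumptions, since the slides leave the scales unspecified.

```python
def utility(price_score, quality_score, support_score):
    """Weighted utility over three attributes.

    Each *_score is assumed already normalized to [0, 1], higher = better
    (so a cheap price maps to a high price_score).
    """
    return 0.60 * price_score + 0.15 * quality_score + 0.25 * support_score

# Two hypothetical software offers.
offer_a = utility(price_score=0.8, quality_score=0.5, support_score=0.6)
offer_b = utility(price_score=0.5, quality_score=0.9, support_score=0.9)

# With price weighted at 60%, the cheaper offer A (utility ~0.705)
# beats the higher-quality offer B (utility ~0.66).
best = max(("A", offer_a), ("B", offer_b), key=lambda kv: kv[1])
```

Changing the weights changes the winner, which is exactly the tradeoff the slide is pointing at.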
89
Incomplete Information
• In a TOD, agents don't know the tasks of others.
• Solution:
– Exchange the missing information.
– Impose a penalty for lying.
• Possible lies:
– False information:
  • Hiding letters.
  • Phantom letters.
– Not carrying out a commitment.
90
Subadditive Task-Oriented Domain
• The cost of the union of two task sets is at most the sum of the costs of the separate sets:
for finite X, Y ⊆ T: c(X ∪ Y) <= c(X) + c(Y)
• Example of subadditivity: delivering to one location saves distance to the other (in a tree arrangement).
• Example of a subadditive TOD with equality (= rather than <): deliveries in opposite directions; doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where exceeding a threshold forces me to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
– "I need to pick something up at the store." (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
– "I need to deliver an empty letter." (Useless, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task-Oriented Domain
• For any two task sets X ⊆ Y and any additional task set Z:
c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
(Z adds at least as much cost to the smaller set as it adds to the larger.)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows go between domains); the start of each rule is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), since an all-or-nothing deal is just a mixed deal where one role is empty. The direction of the relationship may depend on whether we are talking about truth (never helps) or a lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributes of a task system: Concavity
• c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that task set Z adds to Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as the subset is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (you were going that way anyway) but adds 2 to its superset Y (you were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database queries:
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange results of queries and sub-queries.
The fax domain:
• Agents send faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• Agents can exchange messages to be faxed.
100
Attributes: Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Of the examples, only the fax domain is modular (as the costs are independent).
• Modularity implies concavity.
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships within the same domain attribute.
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2, and his expected utility under [(abc, ∅); 1/2] is 5.
• If he lies: p = 1/2 (the apparent utilities are the same), but his expected utility under [(abc, ∅); 1/2] is (1/2)(0) + (1/2)(2) = 1, as he still has to deliver the hidden letter.
[Figure: delivery graph with edge costs 1, 4, 4, 1.]
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out).
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modularity tends to add the exact cost, so it is hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are (4, 2), and someone has to get the better deal (under a pure deal), just as in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)
Agent 1's role   U(1)   Agent 2's role   apparent U(2)   actual U(2)
a                2      bc               4               4
b                4      ac               2               2
bc               2      a                4               2
ab               0      c                6               6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 under [(ab, ∅); 1/2] is (1/2)(4) + (1/2)(2) = 3.
• Lie: (bc, a) is the logical division, with no probability split; the utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h").
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to about 1.72. (If A1 delivers, he doesn't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, since
p(-1) + (1-p)(6) = p(4) + (1-p)(-3), i.e., 14p = 9.
• If he invents task h, p = 11/18, since
p(-3) + (1-p)(6) = p(4) + (1-p)(-5), i.e., 18p = 11.
• Utility(p = 9/14) = p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) = p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
107
Postmen domain: return to the post office
[Figures: a concave example, a subadditive example (h is the decoy), and a phantom example.]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (you think I have less, so an increased load appears to cost me more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie).
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b; but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility becomes 4.5 > 4 (the utility of telling the truth).
• Under truth: utility of [(fae, bcd); 1/2] = 4 (each saves going to two nodes).
• Under the lie, dividing as [(efd, cab); p] cannot help: you always win and I always lose, and since the work is the same, swapping changes nothing. In a mixed deal the choices must be unbalanced.
• Try again under the lie with [(abc, def); p]:
p(4) + (1-p)(0) = p(2) + (1-p)(6)
4p = -4p + 6
p = 3/4
• A1's actual utility is (3/4)(6) + (1/4)(0) = 4.5.
• Note: when A1 is assigned cdef (1/4 of the time), he still has to deliver to node a after completing his agreed deliveries, so he ends up going to 5 places, which is what he was assigned originally: zero utility from that branch.
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when each protocol is appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models.
– Reasoning about other agents' beliefs.
– Influencing other agents' intentions and beliefs.
114
PERSUADER: a case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal.
– Generation of a counter-proposal based on feedback from the dissenting party.
– Persuasive argumentation.
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law: cite previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory.
– Select the most appropriate case.
– Construct an appropriate solution.
– Evaluate the solution for applicability to the current case.
– Modify the solution appropriately.
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange.
– Avoids problems by reasoning from past failures (intentional reminding).
– Repairs used for past failures are reused, which reduces computation.
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Derives an overall utility curve from the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff.
– Minimizes payoff difference.
118
Persuasive argumentation
• Argumentation goals
– ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff
– change the importance attached to an issue
– change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day. Both prefer today to tomorrow. While the first agent assigns the
highest worth to a meeting at 16:00, she also assigns progressively
smaller worths to a meeting at 15:00, 14:00, … By showing flexibility
and accepting a sub-optimal time, an agent can accept a lower worth,
which may have other payoffs (e.g. reduced travel costs).
[Figure: worth function for the first agent – worth rises from 0 at 9:00 to 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
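The first agent's worth function can be sketched in code. The linear shape and the 9:00–16:00 range are assumptions read off the plot (only the endpoints 0 and 100 and the axis ticks 9, 12, 16 are given):

```python
def worth_agent1(hour):
    """Worth of a meeting time for the first agent: 0 at 9:00 rising
    linearly to 100 at 16:00 (linearity is an assumption)."""
    return max(0.0, 100.0 * (hour - 9) / (16 - 9))
```

Accepting 15:00 instead of 16:00 then means accepting a slightly lower worth, which is the flexibility described above.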
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offer curves converge over time to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offer curves never cross, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified – why did the agent pay so much for the car?
– Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994)
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that
1. Grounds ⊆ Database, and
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
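The (Sentence, Grounds) scheme above can be sketched in code. To keep "can be proved from" simple, this toy version restricts the database to facts plus Horn rules and uses forward chaining; the data format is an assumption for illustration, not from the slides:

```python
def entails(grounds, goal):
    """grounds: a set of fact strings plus (premises, conclusion) Horn rules."""
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if isinstance(g, tuple)]
    changed = True
    while changed:  # forward-chain until no new facts are derived
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return goal in facts

def is_argument(database, sentence, grounds):
    """(sentence, grounds) is an argument iff grounds come from the
    database and the sentence can be proved from the grounds."""
    return grounds <= database and entails(grounds, sentence)
```

With database {A, A→B}, the pair (B, database) qualifies as an argument, since B is derivable from the grounds.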
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature, where "A →u B" means A undercuts B and "A →r B" means A rebuts B:
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A attacks B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
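The derived notions of attack listed above translate directly into predicates. Here the primitive relations undercuts (u) and rebuts (r) are passed in as functions; the encoding is a sketch, not from the slides:

```python
def attacks(u, r, A, B):
    return u(A, B) or r(A, B)

def defeats(u, r, A, B):
    return u(A, B) or (r(A, B) and not u(B, A))

def strongly_attacks(u, r, A, B):
    return attacks(u, r, A, B) and not u(B, A)

def strongly_undercuts(u, r, A, B):
    return u(A, B) and not u(B, A)
```

Note that strongly undercutting implies strongly attacking, which implies defeating, which implies attacking – the hierarchy of the next slide.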
129
Proposition: hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
(so su ⊆ sa ⊆ d ⊆ a)
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments – mutually defensible
For a set S of arguments:
1. argument x is attacked by S if some member y of S attacks x (y → x)
2. argument x is acceptable with respect to S if every attacker of x is attacked by S
3. S is conflict-free if no two of its members attack each other
4. S is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
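The in/out labelling described above can be computed by iterating the two rules until nothing changes (the grounded labelling). This helper is a sketch introduced here, not from the slides:

```python
def grounded_labelling(arguments, attack):
    """attack: set of (x, y) pairs meaning 'x attacks y'.
    Returns each argument's label: 'in', 'out', or None (undecided)."""
    label = {a: None for a in arguments}
    changed = True
    while changed:
        changed = False
        for a in arguments:
            if label[a] is not None:
                continue
            attackers = [x for (x, y) in attack if y == a]
            if all(label[x] == 'out' for x in attackers):
                label[a] = 'in'    # every attacker is defeated (none for unattacked)
                changed = True
            elif any(label[x] == 'in' for x in attackers):
                label[a] = 'out'   # has an undefeated attacker
                changed = True
    return label
```

For a chain a → b → c: a is in (unattacked), b is out, and c is back in because its only attacker is defeated.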
132
[Figure: an abstract argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
40
Utility Function for Agents (example from previous slide)
Utility for agent 1:
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
5. Utility1(a, ab) = 0
6. Utility1(b, ab) = 0
7. Utility1(ab, a) = −2
8. Utility1(ab, b) = −2
9. Utility1(ab, ab) = −2
Utility for agent 2:
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
5. Utility2(a, ab) = 0
6. Utility2(b, ab) = 0
7. Utility2(ab, a) = 2
8. Utility2(ab, b) = 2
9. Utility2(ab, ab) = 0
41
Individually Rational for Both (eliminate any choices that are negative for either)
1 (a b)
2 (b a)
3 (ab )
4 ( ab)
5 (a ab)
6 (b ab)
7 (ab a)
8 (ab b)
9 (ab ab)
Individually rational:
(a b)
(b a)
( ab)
(a ab)
(b ab)
42
Pareto Optimal Deals
1. (a b)
2 (b a)
3 (ab )
4 ( ab)
5 (a ab)
6 (b ab)
7 (ab a)
8 (ab b)
9 (ab ab)
Pareto optimal:
(a b)
(b a)
(ab )
( ab)
Isn't (ab, ∅) beaten? That deal is (−2, 3), but nothing beats 3 for agent 2, so it remains Pareto optimal.
43
Negotiation Set
Negotiation Set
(a b)
(b a)
( ab)
Individual Rational Deals
(a b)
(b a)
( ab)
(a ab)
(b ab)
Pareto Optimal Deals
(a b)
(b a)
(ab )
( ab)
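The negotiation set above can be recomputed from the utility tables of the earlier slide: a deal is individually rational if neither agent's utility is negative (the conflict deal gives 0), and Pareto optimal if no other deal is at least as good for both agents and strictly better for one. The tuple encoding ('' for the empty task set) is a convention introduced here:

```python
deals = {  # deal -> (utility to agent 1, utility to agent 2), from slide 40
    ('a', 'b'): (0, 2), ('b', 'a'): (0, 2), ('ab', ''): (-2, 3),
    ('', 'ab'): (1, 0), ('a', 'ab'): (0, 0), ('b', 'ab'): (0, 0),
    ('ab', 'a'): (-2, 2), ('ab', 'b'): (-2, 2), ('ab', 'ab'): (-2, 0),
}

# individually rational: no agent does worse than the conflict deal (utility 0)
ir = {d for d, (u1, u2) in deals.items() if u1 >= 0 and u2 >= 0}

# Pareto optimal: not weakly dominated by any other utility vector
pareto = {d for d, (u1, u2) in deals.items()
          if not any(v1 >= u1 and v2 >= u2 and (v1, v2) != (u1, u2)
                     for v1, v2 in deals.values())}

negotiation_set = ir & pareto
```

This reproduces the three slides above: five individually rational deals, four Pareto optimal deals, and their intersection (a b), (b a), (∅ ab) as the negotiation set.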
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?
[Figure: scatter plot, utility for i vs. utility for j]
45
Negotiation Set in Task-oriented Domains
[Figure: utility for agent i vs. utility for agent j; the circle delimits the space of all possible deals; the conflict deal sits at the crossing of each agent's conflict-deal utility; the negotiation set (Pareto optimal + individually rational) is the arc of the circle above and to the right of the conflict deal; points A–E label example deals]
46
Negotiation Protocol
π(δ) – the product of the two agents' utilities from deal δ
• Product-maximizing negotiation protocol: a one-step protocol
– Concession protocol
• At each step t ≥ 0, A offers δ(A,t) and B offers δ(B,t) such that
– both deals are from the negotiation set
– ∀i, t > 0: Utilityi(δ(i,t)) ≤ Utilityi(δ(i,t−1)) – I propose something less desirable for me
• Negotiation ending
– Conflict: Utilityi(δ(i,t)) = Utilityi(δ(i,t−1)) for both agents
– Agreement: ∃ j ≠ i, Utilityj(δ(i,t)) ≥ Utilityj(δ(j,t))
• Only A ⇒ agree on δ(B,t) (A agrees with B's proposal)
• Only B ⇒ agree on δ(A,t) (B agrees with A's proposal)
• Both A and B ⇒ agree on δ(k,t) such that π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
• Both A and B and π(δ(A,t)) = π(δ(B,t)) ⇒ flip a coin (the product is the same but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Applies to pure deals and mixed deals
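The product-maximizing selection rule above is a one-liner once the negotiation set and utilities are known (tie-breaking by coin flip is omitted from this sketch):

```python
def product_maximizing_deal(negotiation_set, u1, u2):
    """Pick the deal whose product of the two agents' utilities is maximal."""
    return max(negotiation_set, key=lambda d: u1(d) * u2(d))

# Example using the pure deals and utilities of the later Example 2 slide:
deals = [('d1', 0, 10), ('d2', 1, 3), ('d3', 2, 2), ('d4', 3, 1), ('d5', 10, 0)]
best = product_maximizing_deal(deals, lambda d: d[1], lambda d: d[2])
```

The products are 0, 3, 4, 3, 0, so the (2, 2) deal wins even though two other deals have a higher total.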
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession (assumes we know what the other agent values)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta-data: explanation or critique of the deal
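The rules above can be sketched as a loop. The utility functions and concession strategies are placeholders introduced here (the protocol itself does not fix them); agents propose simultaneously and may never lower the opponent's utility:

```python
def monotonic_concession(u1, u2, next_offer1, next_offer2, conflict):
    """next_offerK(None) gives agent K's round-1 proposal;
    next_offerK(prev) gives its next proposal (stand still or concede)."""
    o1, o2 = next_offer1(None), next_offer2(None)
    while True:
        # agreement: one agent likes the opponent's proposal at least as much as its own
        if u1(o2) >= u1(o1):
            return o2
        if u2(o1) >= u2(o2):
            return o1
        n1, n2 = next_offer1(o1), next_offer2(o2)
        # neither agent conceded (neither raised the opponent's utility) -> conflict deal
        if u2(n1) <= u2(o1) and u1(n2) <= u1(o2):
            return conflict
        o1, o2 = n1, n2

# Toy example: deals are prices 0..10; agent 1 prefers high, agent 2 low,
# and each concedes by one unit per round, meeting in the middle.
u1 = lambda d: d
u2 = lambda d: 10 - d
offer1 = lambda prev: 10 if prev is None else prev - 1
offer2 = lambda prev: 0 if prev is None else prev + 1
```

With these toy strategies the agents converge on the midpoint deal 5.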
48
Condition to Consent an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
bull Advantages
ndash Symmetrically distributed (no agent plays a special role)
ndash Ensures convergence
ndash It will not go on indefinitely
bull Disadvantages
ndash Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be
reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
[Figure: a spectrum of deals from agent 1's best deal to agent 2's best deal]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: the deal spectrum from agent 1's best deal to agent 2's best deal; each agent asks "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– you have lost most of your expected utility (it is closer to zero)
– in case conflict occurs, you are not much worse off
– so you are more willing to risk conflict
An agent's willingness to risk conflict compares its loss from making a concession with its loss from taking the conflict deal, relative to its current offer.
• If both agents are equally willing to risk conflict, both concede
54
Risk Evaluation
riski = (utility agent i loses by conceding and accepting agent j's offer) ÷ (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
riski = (Utilityi(δi) − Utilityi(δj)) / Utilityi(δi)
where δi and δj are the current offers of agent i and agent j respectively
riski is the willingness to risk conflict (1 means perfectly willing to risk)
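The risk formula above, with the usual convention that an agent whose current offer is already worth 0 is perfectly willing to risk conflict:

```python
def risk(u_own_offer, u_other_offer):
    """Zeuthen willingness to risk conflict for one agent:
    (utility of own offer - utility of opponent's offer) / utility of own offer.
    Convention: risk = 1 when the agent's own offer has utility 0."""
    if u_own_offer == 0:
        return 1.0
    return (u_own_offer - u_other_offer) / u_own_offer
```

In the parcel example a few slides later, both agents value the opponent's offer at 0, so both compute risk 1 and neither has a reason to concede.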
55
Risk Evaluation
• riski measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: the deal spectrum from Ai's best deal to Aj's best deal with the conflict deal marked; arrows show the maximum to gain from agreement and the maximum still hoped for – how much am I willing to risk a conflict?]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
bull Advantages
ndash Simple and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy,
then the other can do no better than use it him/herself
bull Disadvantages
ndash Computationally expensive ndash players need to compute the entire
negotiation set
ndash Communication burden ndash negotiation process may involve
several steps
59
Parcel Delivery Domain: recall agent 1 delivered to a; agent 2 delivered to a and b
Negotiation Set
(a b)
(b a)
( ab)
First offers: agent 1 proposes ( ab); agent 2 proposes (a b)
Utility of agent 1
Utility1(a b) = 0
Utility1(b a) = 0
Utility1( ab)=1
Utility of agent 2
Utility2(a b) =2
Utility2(b a) = 2
Utility2( ab)=0
Risk of conflict: 1 for agent 1, 1 for agent 2
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal at opposite ends, each agent saying "he should concede"]
Zeuthen does not reach a settlement: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain: Example 2 (don't return to the distribution point)
[Figure: distribution point with routes of length 7 to a and to d; b and c lie between them, connected by edges of length 1]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as an agent can't do worse than the conflict deal; (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No.  Pure deal    Agent 1's utility  Agent 2's utility
1    (abcd, ∅)    0                  10
2    (abc, d)     1                  3
3    (ab, cd)     2                  2
4    (a, bcd)     3                  1
5    (∅, abcd)    10                 0
     Conflict deal 0                 0
[Figure: agent 1 and agent 2 concede toward each other across deals 5 4 3 2 1]
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of (0, 10) for another choice
bull Is there a solution
bull Fair versus higher global utility
bull Restrictions of this method (no promises for future or sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy; in fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder each other
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform all tasks, and cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g. both need the car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up another person)
– There might be a reachable state which satisfies both, but it may be too expensive – they are unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• This is called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p·c(J)i + (1−p)·c(J)k
where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming do not have to return home)
[Figure: distribution point at distance 1 from city a and distance 1 from city b; a and b are 2 apart]
Cost function:
c(∅) = 0
c(a) = 1
c(b) = 1
c(ab) = 3
Utility for agent 1 (originally assigned a)
1 Utility1(a b) = 0
2 Utility1(b a) = 0
3 Utility1(a b ) = -2
4 Utility1( a b) = 1
hellip
Utility for agent 2 (originally assigned ab)
1 Utility2(a b) = 2
2 Utility2(b a) = 2
3 Utility2(a b ) = 3
4 Utility2( a b) = 0
hellip
73
Consider deal 3 with probability
• (∅, ab) : p means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
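The arithmetic above can be checked with a tiny helper that solves the linear equation for p; the parameter names (utilities under each of the two pure outcomes) are notation introduced here:

```python
from fractions import Fraction

def fair_probability(u1_A, u1_B, u2_A, u2_B):
    """Outcome A happens with probability (1-p), outcome B with probability p.
    Solve (1-p)*u1_A + p*u1_B == (1-p)*u2_A + p*u2_B for p exactly."""
    denom = (u1_B - u1_A) - (u2_B - u2_A)
    if denom == 0:
        return None  # the utilities can never be equalized
    return Fraction(u2_A - u1_A, denom)
```

For the deal above, fair_probability(-2, 1, 3, 0) gives 5/6; for the deal on the next slide, where both outcomes give (0, 2), no equalizing p exists.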
74
Try again with other choice in negotiation set
• (a, b) : p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(TA ∪ TB, ∅); p] such that NS(m) = max over deals d of NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
77
Examples: Cooperative – each agent is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2 (both goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be 4 – so the utility to each is 2
78
Examples: Compromise – both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is 2; the cost for both averages 4
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other agent's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair"
• Assign agent 1 the cost-2 role (and agent 2 the cost-6 role) p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
4p − 3 = 4 − 4p
p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Both can't win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, numbers pulled out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability 1/2 each: ½(12) + ½(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• The agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over joint plans, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute
– Price 1, quality 2, support 3
• Using constraints on an attribute
– Price [5, 100], quality [0, 10], support [1, 5]
– Try to find the Pareto optimum
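The weighting scheme above, as a small helper. The attribute scores here are assumed to be normalized to [0, 1] (an assumption, since the slide gives only the weights):

```python
def weighted_utility(scores, weights):
    """Linear multi-attribute utility: sum of weight * normalized score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[k] * scores[k] for k in weights)

u = weighted_utility({'price': 0.5, 'quality': 0.8, 'support': 0.4},
                     {'price': 0.60, 'quality': 0.15, 'support': 0.25})
```

With these made-up scores the utility is 0.60·0.5 + 0.15·0.8 + 0.25·0.4 = 0.52.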
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditivity:
– delivering to one saves distance to the other (in a tree arrangement)
• Example of a subadditive TOD with = rather than <:
– deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment
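The subadditivity condition above can be checked mechanically over a cost table keyed by task sets. The tree-shaped example costs here are hypothetical (visiting both a and b shares part of the route), not from the slides:

```python
def is_subadditive(cost):
    """cost: dict mapping frozenset of tasks -> cost.
    Checks c(X | Y) <= c(X) + c(Y) for all pairs in the table."""
    sets = list(cost)
    return all(cost[x | y] <= cost[x] + cost[y] for x in sets for y in sets)

# Hypothetical tree-shaped delivery domain: the route to a and the route
# to b share a trunk, so doing both (cost 3) is cheaper than 2 + 2.
tree = {frozenset(): 0, frozenset('a'): 2, frozenset('b'): 2, frozenset('ab'): 3}
```

A domain where combining tasks costs more than their sum (c(ab) = 5 below) fails the check, matching the power-threshold example above.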
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y where X is a subset of Y
• Another task set Z is introduced; then
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); each rule begins at a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one role is empty. The direction of the relationship may depend on whether truth (never helps) or lying (sometimes helps) is involved
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others
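The modularity equation above is also easy to check over a cost table. The fax-like cost table below is hypothetical (each destination has an independent cost, so costs add exactly):

```python
def is_modular(cost):
    """cost: dict mapping frozenset of tasks -> cost.
    Checks c(X | Y) == c(X) + c(Y) - c(X & Y) for all pairs in the table."""
    sets = list(cost)
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in sets for y in sets)

# Hypothetical fax-like domain: destination costs are independent, so they add.
fax = {frozenset(): 0, frozenset('a'): 2, frozenset('b'): 3, frozenset('ab'): 5}
```

A tree-shaped domain where routes share a trunk (c(ab) = 3 < 2 + 2 below) is subadditive but not modular, matching the claim that only the fax domain is modular.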
97
For subadditive domain
98
Attributes of a task system – concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No – unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon
• Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database queries
• Agents have access to a common DB and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The fax domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the fax domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides the letter to c; his utility doesn't increase
• If he tells the truth: p = 1/2; expected util of (abc, ∅):1/2 = .5
• Lie: p = 1/2 (as the declared utility is the same); expected util (for 1) of (abc, ∅):1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
103
• FP2: in Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out)
• FP3: in Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are (4, 2), and someone has to get the better deal (under a pure deal) – JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1   U(1)   Agent 2   U(2) seems   U(2) (act)
a         2      bc        4            4
b         4      ac        2            2
bc        2      a         4            2
ab        0      c         6            6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility has risen from 3 to 4
• Truth: p = ½, so utility for agent 1 of (ab, ∅):½ = ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as pure deals have no percentages; util for agent 1 is 6 (org cost) − 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: "if you deliver, I make you deliver to h")
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as p(−1) + (1−p)6 = p(4) + (1−p)(−3), i.e. 14p = 9
• If he invents task h, p = 11/18, as p(−3) + (1−p)6 = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)6 = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)6 = −11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
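The two indifference probabilities come from equating the agents' expected utilities. A quick check with exact fractions, using the slide's payoff numbers:

```python
from fractions import Fraction

def indifference_p(u_all, u_none, v_all, v_none):
    """p with  p*u_all + (1-p)*u_none == p*v_all + (1-p)*v_none,
    i.e. the probability making the two agents' expected utilities equal."""
    return Fraction(v_none - u_none, u_all - u_none - v_all + v_none)

# Truthful declaration: agent 1's payoffs are (-1, 6), agent 2's are (4, -3)
p_truth = indifference_p(-1, 6, 4, -3)     # 9/14
# With the decoy letter to h, the claimed payoffs shift to (-3, 6) and (4, -5)
p_lie = indifference_p(-3, 6, 4, -5)       # 11/18

def true_utility(p):
    # agent 1's real payoffs are still (-1, 6): he never delivers to h
    return p * Fraction(-1) + (1 - p) * 6

print(p_truth, true_utility(p_truth))      # 9/14 3/2
print(p_lie, true_utility(p_lie))          # 11/18 31/18  (~1.72: the lie helped)
```

Using `Fraction` avoids the rounding that hides why 21/14 and 31/18 differ.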
107
Postmen – return to post office
Concave
Subadditive (h is decoy)
Phantom
108
Non incentive compatible fixed points
• FP7: in Modular TOD, for any ONM over Pure deals, "hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4; UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: util of (fae, bcd):1/2 = 4 (each saves going to two places)
• Under the lie, divide as (efd, cab):p – you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with (ab, cdef):p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned cdef ¼ of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
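The p = 3/4 above can be verified the same way; the payoff pairs are the slide's (which task sets produce them is my reading of the garbled deal notation):

```python
from fractions import Fraction

# Apparent payoffs after agent 1 hides the letter to a (slide's numbers):
#   agent 1: 4 if assigned ab, 0 if assigned cdef
#   agent 2: 2 if agent 1 takes ab, 6 otherwise
# A fair p solves  p*4 + (1-p)*0 == p*2 + (1-p)*6
p = Fraction(6 - 0, (4 - 0) - (2 - 6))   # 3/4

# Agent 1's REAL payoffs: delivering ab also covers the hidden letter (worth 6),
# while being assigned cdef still forces an extra trip to a (worth 0)
real_utility = p * 6 + (1 - p) * 0
print(p, real_utility)   # 3/4 9/2, i.e. 4.5 > 4, the truthful utility
```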
110
Modular
111
Conclusion
• In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for the first agent: [Figure: worth rises from 0 to 100 as the meeting time moves from 0900 through 1200 to 1600]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offer curves converge to the point of acceptance]
123
Utility Graphs - no agreement
• No agreement
• Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offer curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation: a pair (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
   • Grounds ⊆ Database
   • Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
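The two kinds of attack are easy to mechanize once arguments are given a structure. A toy encoding of the cheese argument (the representation is illustrative, not from any library):

```python
# An argument = (conclusion, set of premises); "-X" denotes the negation of X.
cheese = ("cheese_good", {"milk_good", "cheese_from_milk"})
fatty  = ("-milk_good", {"milk_fatty"})              # denies a premise
bones  = ("-cheese_good", {"cheese_bad_for_bones"})  # denies the conclusion

def undercuts(a, b):
    """a undercuts b if a's conclusion negates one of b's premises."""
    return a[0].startswith("-") and a[0][1:] in b[1]

def rebuts(a, b):
    """a rebuts b if a's conclusion negates b's conclusion."""
    return a[0] == "-" + b[0]

print(undercuts(fatty, cheese))   # True
print(rebuts(bones, cheese))      # True
print(undercuts(bones, cheese))   # False: it attacks no premise, only the conclusion
```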
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B ≡ A u B or A r B
– A defeats B ≡ A u B or (A r B and not B u A)
– A strongly attacks B ≡ A a B and not B u A
– A strongly undercuts B ≡ A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked by a set if some y with y → x is not itself attacked by any member of the set
2. argument x is acceptable (w.r.t. a set) if every attacker of x is attacked by the set
3. an argument set is conflict free if none of its members attack each other
4. a set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
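These definitions can be brute-forced for small systems. A sketch; since the slide's actual attack edges are not recoverable from the figure, the graph below is a made-up chain:

```python
from itertools import chain, combinations

def admissible_sets(args, attacks):
    """Brute-force the admissible sets of an abstract argument system;
    `attacks` is a set of (attacker, target) pairs."""
    def attacked_by(S, x):              # does some member of S attack x?
        return any((y, x) in attacks for y in S)
    def conflict_free(S):
        return not any((x, y) in attacks for x in S for y in S)
    def acceptable(x, S):               # every attacker of x is attacked by S
        return all(attacked_by(S, y) for (y, t) in attacks if t == x)
    all_subsets = chain.from_iterable(
        combinations(args, r) for r in range(len(args) + 1))
    return [set(S) for S in all_subsets
            if conflict_free(S) and all(acceptable(x, S) for x in S)]

# Hypothetical attack chain: a -> b, b -> c, c -> d
attacks = {("a", "b"), ("b", "c"), ("c", "d")}
print(admissible_sets("abcd", attacks))
# three sets: {}, {a}, and {a, c}; any set containing b fails,
# because b's attacker a is never attacked
```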
133
An Example Abstract Argument System
41
Individual Rational for Both (eliminate any choices that are negative for either)
All deals:        Individually rational:
1. (a, b)         (a, b)
2. (b, a)         (b, a)
3. (ab, ∅)        (∅, ab)
4. (∅, ab)        (a, ab)
5. (a, ab)        (b, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
42
Pareto Optimal Deals
All deals:        Pareto optimal:
1. (a, b)         (a, b)
2. (b, a)         (b, a)
3. (ab, ∅)        (ab, ∅)
4. (∅, ab)        (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Deals 5 and 6 are beaten by the (∅, ab) deal; deal 3 is (−2, 3), but nothing beats 3 for agent 2.
43
Negotiation Set
Individually rational deals: (a, b), (b, a), (∅, ab), (a, ab), (b, ab)
Pareto optimal deals: (a, b), (b, a), (ab, ∅), (∅, ab)
Negotiation set (the intersection): (a, b), (b, a), (∅, ab)
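The three sets on this slide can be recomputed mechanically from the example's cost function (agent 1 owns the letter to a, agent 2 the letters to a and b; `''` stands for the empty task set):

```python
from itertools import product

# Subadditive cost of visiting a set of cities
c = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
cost = lambda s: c[frozenset(s)]
T1, T2 = "a", "ab"                        # the agents' own task sets

# All deals (d1, d2) whose union covers every city that must be visited
deals = [(d1, d2) for d1, d2 in product(["", "a", "b", "ab"], repeat=2)
         if set(d1) | set(d2) >= set(T1) | set(T2)]

def u(deal):
    """Utilities relative to each agent delivering its own letters."""
    d1, d2 = deal
    return (cost(T1) - cost(d1), cost(T2) - cost(d2))

ir = [d for d in deals if u(d)[0] >= 0 and u(d)[1] >= 0]
po = [d for d in deals
      if not any(u(e)[0] >= u(d)[0] and u(e)[1] >= u(d)[1] and u(e) != u(d)
                 for e in deals)]
ns = [d for d in ir if d in po]
print(ns)   # [('', 'ab'), ('a', 'b'), ('b', 'a')] -- the negotiation set
```

Enumerating the 9 covering deals reproduces the individually rational, Pareto optimal, and negotiation sets from the last three slides.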
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j
• Only those deals where both utilities are positive are individually rational for both (the origin is the conflict deal)
• Which are Pareto optimal?
45
Negotiation Set in Task-oriented Domains
[Figure: circle in the (utility for agent i, utility for agent j) plane delimiting the space of all possible deals; the conflict deal sits at the agents' conflict utilities, and the negotiation set (Pareto optimal + individually rational deals) is the part of the boundary beyond it; points A–E mark example deals]
46
Negotiation Protocol
• π(δ) – the product of the two agents' utilities from deal δ
• Product-maximizing negotiation protocol: a one-step protocol
• Concession protocol:
– At t >= 0, A offers δ(A,t) and B offers δ(B,t), such that both deals are from the negotiation set, and for t > 0, Utility_i(δ(i,t)) <= Utility_i(δ(i,t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utility_i(δ(i,t)) = Utility_i(δ(i,t−1)) for both agents
– Agreement: for j ≠ i, Utility_j(δ(i,t)) >= Utility_j(δ(j,t))
• Only A ⇒ agree on δ(B,t); only B ⇒ agree on δ(A,t) – either agrees with the other's proposal
• Both A and B ⇒ agree on δ(k,t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A and B with π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Pure deals
Mixed deal
47
The Monotonic Concession Protocol ndash One direction move towards middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they may re-propose the same one later)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession. This assumes we know what the other agent values
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Metadata: explanation or critique of a deal
48
Condition to Consent an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
Agent 1s best deal agent 2s best deal
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: between agent 1's best deal and agent 2's best deal, each agent asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict depends on the difference between its loss from making a concession and its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) /
         (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 is perfectly willing to risk).
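The formula reduces to a one-liner; a minimal sketch, applied to the first-round offers of the earlier parcel example:

```python
def risk(u_own: float, u_other: float) -> float:
    """Zeuthen willingness to risk conflict.
    u_own   = Utility_i(own current offer)
    u_other = Utility_i(opponent's current offer)"""
    if u_own == 0:               # nothing left to lose: fully willing to risk
        return 1.0
    return (u_own - u_other) / u_own

# Agent 1 offers (∅, ab): worth 1 to itself, 0 under the opponent's offer.
# Agent 2 offers (a, b):  worth 2 to itself, 0 under the opponent's offer.
print(risk(1, 0), risk(2, 0))    # 1.0 1.0 -> equal risk, so both must concede
```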
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: line from Ai's best deal to Aj's best deal with the conflict deal marked; annotations: "How much am I willing to risk a conflict?", "Maximum to gain from agreement", "Maximum still hoped to gain"]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – it is in Nash equilibrium: if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: Agent 1 proposes (∅, ab); Agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: each agent, at its own best deal, insists "He should concede"]
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain Example 2 (don't return to the distribution point)
[Figure: distribution point with a and d at distance 7, and b and c along a path with edges of length 1]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; deals such as (ac, bd) are dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No.  Pure deal      Agent 1's utility   Agent 2's utility
1    (abcd, ∅)      0                   10
2    (abc, d)       1                   3
3    (ab, cd)       2                   2
4    (a, bcd)       3                   1
5    (∅, abcd)      10                  0
     Conflict deal  0                   0
Agent 1 concedes 5 → 4 → 3; agent 2 concedes 1 → 2 → 3.
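The concession sequence can be simulated directly from the table. A minimal sketch of MCP with the Zeuthen rule (the one-deal-per-step concession is an assumption; deal numbers index the table above):

```python
from fractions import Fraction

# Pure deals, as deal number -> (agent 1 utility, agent 2 utility)
deals = {1: (0, 10), 2: (1, 3), 3: (2, 2), 4: (3, 1), 5: (10, 0)}

def risk(agent, own, other):
    """Zeuthen risk of `agent` (0 or 1) holding deal `own` against offer `other`."""
    u_own, u_other = deals[own][agent], deals[other][agent]
    return Fraction(1) if u_own == 0 else Fraction(u_own - u_other, u_own)

offer1, offer2 = 5, 1            # each agent starts with its own best deal
while True:
    # agreement: some agent likes the opponent's offer at least as much as its own
    if deals[offer2][0] >= deals[offer1][0] or deals[offer1][1] >= deals[offer2][1]:
        break
    r1, r2 = risk(0, offer1, offer2), risk(1, offer2, offer1)
    if r1 <= r2:                 # agent 1 concedes (both concede on equal risk)
        offer1 -= 1
    if r2 <= r1:
        offer2 += 1
print(offer1, offer2)            # 3 3 -> they meet at (ab, cd) with utilities (2, 2)
```

The risks are equal in every round, so both agents concede each time, reproducing the 5 → 4 → 3 and 1 → 2 → 3 sequences.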
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of a (0, 10) split on another choice
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) means two workers can get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can just do your original task
• With SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 over a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in future negotiations
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
bull Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming do not have to return home)
[Figure: distribution point with city a and city b at distance 1 each, and a–b at distance 2]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (orig. a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
Utility for agent 2 (orig. ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
73
Consider deal 3 with probability
• (ab, ∅):p means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p, so 6p = 5 and p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
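The p = 5/6 computation generalizes to any pair of two-outcome utilities. With exact fractions:

```python
from fractions import Fraction

def fair_p(u1_hi, u1_lo, u2_hi, u2_lo):
    """p with  p*u1_hi + (1-p)*u1_lo == p*u2_hi + (1-p)*u2_lo."""
    return Fraction(u2_lo - u1_lo, (u1_hi - u1_lo) - (u2_hi - u2_lo))

# Deal 3 = (ab, ∅): with probability p agent 1 delivers nothing (utility 1),
# with probability 1-p it delivers everything (utility -2); agent 2 mirrors: 0 vs 3
p = fair_p(1, -2, 0, 3)
print(p)                            # 5/6
print(p * 1 + (1 - p) * (-2))       # 1/2 -- both agents' expected utility
```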
74
Try again with other choice in negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(T_A ∪ T_B, ∅) : p] such that NS(m) = max_d NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
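The suggested exercise is immediate to run:

```python
# Split 10 units of utility between two agents: the product u * (10 - u)
# peaks at the even split, which is why the product measure favours fairness.
best = max(range(11), key=lambda u: u * (10 - u))
print(best, best * (10 - best))   # 5 25
```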
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (the goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four, so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but with unequal roles
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the (2, 6) division p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) (it loses utility if it spends 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Examplesemi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So with ½ probability each: ½(12) + ½(−4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle) – isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price(60%) + quality(15%) + support(25%)
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price [5,100], quality [0-10], support [1-5]
– Try to find the Pareto optimum
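The weighted-attribute scheme above can be sketched as follows (the weights are the slide's; the attribute scores are made-up illustration values):

```python
# Weighted-sum utility over attributes: each attribute score is scaled
# by its weight and the results are summed.
def utility(scores, weights):
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must total 100%
    return sum(weights[attr] * scores[attr] for attr in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer = {"price": 7.0, "quality": 9.0, "support": 6.0}  # hypothetical ratings
print(utility(offer, weights))  # 0.6*7 + 0.15*9 + 0.25*6
```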
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for a lie
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• the cost of the union is less than or equal to the sum of the costs of the separate sets – it adds a sub-cost
• for finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of subadditive:
– delivering to one saves the distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment.
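A brute-force check of the subadditivity condition on a toy cost function (the stop-counting cost is my own illustration, not from the slides):

```python
from itertools import chain, combinations

def subsets(tasks):
    """All subsets of a finite task set, as frozensets."""
    items = list(tasks)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))]

def is_subadditive(cost, tasks):
    """Check c(X ∪ Y) <= c(X) + c(Y) for all X, Y ⊆ tasks."""
    ss = subsets(tasks)
    return all(cost(x | y) <= cost(x) + cost(y) for x in ss for y in ss)

# Counting distinct stops is subadditive: shared stops are only paid for once.
print(is_subadditive(len, {"a", "b", "c"}))  # True
```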
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at a store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have 2 sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
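The same kind of brute-force check works for the concavity condition just stated (again with an illustrative cost function of my own):

```python
from itertools import chain, combinations

def subsets(tasks):
    items = list(tasks)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))]

def is_concave(cost, tasks):
    """For all X ⊆ Y and any Z: c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)."""
    ss = subsets(tasks)
    return all(cost(x | z) - cost(x) >= cost(y | z) - cost(y)
               for x in ss for y in ss if x <= y for z in ss)

# Counting stops is concave: Z adds |Z \ X| new stops, and a subset X
# has at least as many stops still missing as its superset Y.
print(is_concave(len, {"a", "b", "c"}))  # True
```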
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). Rule: the beginning is a fixed point.
• For example: what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is just a mixed deal in which one side of the division is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, and concave is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
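The modularity identity can be verified the same way (illustrative stop-counting cost again):

```python
from itertools import chain, combinations

def subsets(tasks):
    items = list(tasks)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))]

def is_modular(cost, tasks):
    """Check c(X ∪ Y) == c(X) + c(Y) - c(X ∩ Y) for all X, Y ⊆ tasks."""
    ss = subsets(tasks)
    return all(cost(x | y) == cost(x) + cost(y) - cost(x & y)
               for x in ss for y in ss)

# Counting stops is modular, by inclusion-exclusion on set sizes.
print(is_modular(len, {"a", "b", "c"}))  # True
```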
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) – c(Y) <= c(X ∪ Z) – c(X)
• The cost tasks Z add to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At seats – is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) – c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of Characterization of Relationships
(arrows: implied relationship between cells; implied relationship with the same domain attribute)
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth: p = 1/2
• Expected util: ⟨(abc), 1/2⟩ = 5
• Lie: p = 1/2 (as utility is the same)
• Expected util (for 1): ⟨(abc), 1/2⟩ = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie)
(figure: delivery graph with edge costs 1, 4, 4, 1)
103
• FP2: in a Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as if the other person delivers the phantom, you are found out)
• FP3: in a Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie – the benefits are shown below
(the apparent benefit is no different than the real benefit)
Under truth: the utilities are 4/2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 | U(1) | Agent 2 | U(2) seems | U(2) actual
a       | 2    | bc      | 4          | 4
b       | 4    | ac      | 2          | 2
bc      | 2    | a       | 4          | 2
ab      | 0    | c       | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example from the next slide: A1 creates a phantom letter at node c; his utility has risen from 3 to 4
• Truth: p = 1/2, so utility for agent 1 is ⟨(a, b), 1/2⟩ = 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage; util for agent 1 is 6 (orig. cost) – 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(-1) + (1-p)6 = p(4) + (1-p)(-3); 14p = 9
• If he invents task h, p = 11/18, as
• p(-3) + (1-p)6 = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)6 = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)6 = -11/18 + 42/18 = 31/18 = 1.72
• SO – lying helped
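The FP6 arithmetic above can be reproduced with exact fractions (the payoff numbers are the slide's; the helper name `equalizing_p` is mine):

```python
from fractions import Fraction

def equalizing_p(a_win, a_lose, b_win, b_lose):
    """p such that p*a_win + (1-p)*a_lose == p*b_win + (1-p)*b_lose."""
    return Fraction(b_lose - a_lose, (a_win - a_lose) - (b_win - b_lose))

p_truth = equalizing_p(-1, 6, 4, -3)   # from p(-1)+(1-p)6 = p(4)+(1-p)(-3)
p_decoy = equalizing_p(-3, 6, 4, -5)   # using the decoy-inflated payoffs

def real_util(p):                      # agent 1's true payoffs are -1 / 6
    return p * (-1) + (1 - p) * 6

print(p_truth, p_decoy)                # 9/14 11/18
print(real_util(p_truth))              # 3/2  (= 1.5)
print(real_util(p_decoy))              # 31/18 (about 1.72): the decoy lie helped
```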
107
Postmen – return to post office
(figures: the concave example; the subadditive example, where h is the decoy; the phantom example)
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util⟨(fa, ebcd), 1/2⟩ = 4 (save going to two)
• Under the lie, divide as ⟨(ef, dcab), p⟩: you always win and I always lose. Since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie: ⟨(ab, cdef), p⟩
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• Utility is actually:
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally). Zero utility for that.
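The p = 3/4 step in Ex. 4 checks out with exact arithmetic (numbers from the slide):

```python
from fractions import Fraction

# Apparent (post-lie) utilities: agent 1 gets 4 or 0, agent 2 gets 2 or 6.
# Equalize p*4 + (1-p)*0 == p*2 + (1-p)*6  ->  8p = 6  ->  p = 3/4.
p = Fraction(6 - 0, (4 - 0) - (2 - 6))
print(p)  # 3/4

# Agent 1's real winning payoff is 6, since the hidden letter to a gets
# delivered on the winning route anyway.
expected = p * 6 + (1 - p) * 0
print(expected)  # 9/2, i.e. 4.5 > 4: hiding the letter paid off
```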
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case-Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– The repair for a past failure is reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Changing the importance attached to an issue
– Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(graph: worth function for the first agent – worth rises from 0 at 9:00 toward 100 at 16:00)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(graph: utility vs. number of negotiation rounds – Agent i's and Agent j's offer curves cross at the point of acceptance)
123
Utility Graphs – no agreement
• No agreement
• Agent j finds the offer unacceptable
(graph: utility vs. number of negotiation rounds – Agent i's and Agent j's offer curves never cross)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – doesn't this seem reasonable?
126
Logic-Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
• Grounds ⊆ Database
• Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature (→u is "undercuts", →r is "rebuts"):
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A →a B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
129
Proposition: Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u - →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) - →u⁻¹
Defeats = →d = →u ∪ (→r - →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Writing x → y indicates:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set if some member y of the set attacks x (y → x)
2. argument x is acceptable with respect to a set if every attacker of x is attacked by the set
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
(figure: argument graph over arguments a, b, c, d)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
42
Pareto Optimal Deals
1. (a, b)
2. (b, a)
3. (ab, ∅)
4. (∅, ab)
5. (a, ab)
6. (b, ab)
7. (ab, a)
8. (ab, b)
9. (ab, ab)
Pareto optimal:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
The remaining deals are each beaten by one of these. Deal 3, (ab, ∅), is (-2, 3), but nothing beats the 3 for agent 2.
43
Negotiation Set
Negotiation set:
(a, b)
(b, a)
(∅, ab)
Individually rational deals:
(a, b)
(b, a)
(∅, ab)
(a, ab)
(b, ab)
Pareto optimal deals:
(a, b)
(b, a)
(ab, ∅)
(∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only those where both are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?
(axes: utility for i vs. utility for j)
45
Negotiation Set in Task-oriented Domains
(figure: deals A, B, C, D, E plotted by utility for agent i against utility for agent j; the conflict deal marks the utility of conflict for each agent; the circle delimits the space of all possible deals; the negotiation set is the region that is Pareto optimal + individually rational)
46
Negotiation Protocol: π(δ) – the product of the two agents' utilities from δ
• product-maximizing negotiation protocol (one-step protocol)
– concession protocol
• At t >= 0, A offers δ(A, t) and B offers δ(B, t), such that:
– both deals are from the negotiation set
– ∀i and t > 0: Utility_i(δ(i, t)) <= Utility_i(δ(i, t-1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t-1))
– Agreement: ∃j ≠ i, Utility_j(δ(i, t)) >= Utility_j(δ(j, t))
• Only A => agree on δ(B, t): A agrees with B's proposal
• Only B => agree on δ(A, t): B agrees with A's proposal
• Both A and B => agree on the δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A and B with π(δ(A)) = π(δ(B)) => flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
(applies over pure deals or a mixed deal)
47
The Monotonic Concession Protocol – one direction: move towards the middle
The rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. Assumes we know what the other agent values.
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta-data: explanation or critique of the deal
48
Condition to Consent an Agreement
If both of the agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) >= Utility1(δ1)
and
Utility2(δ1) >= Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (It is a way of telling others what you value.)
(figure: agent 1's best deal at one end, agent 2's best deal at the other)
52
The Zeuthen Strategy
Q: I make a proposal in every round, but it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
(figure: agent 1's best deal and agent 2's best deal, each asking "how much am I willing to risk a conflict?")
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– You are more willing to risk conflict
An agent's willingness to risk conflict depends on the difference between its loss from making a concession and its loss from taking the conflict deal, measured with respect to its current offer.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 is perfectly willing to risk).
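A small sketch of the risk computation defined above (the deal utilities are illustrative values, not from the slides):

```python
def risk(u_own, u_other):
    """Zeuthen risk: (U_i(own offer) - U_i(other's offer)) / U_i(own offer)."""
    if u_own == 0:
        return 1.0  # nothing left to gain: fully willing to risk conflict
    return (u_own - u_other) / u_own

# The agent with the lower risk has more to lose from conflict and concedes.
r1, r2 = risk(10, 2), risk(4, 2)
print(r1, r2)  # 0.8 0.5 -> the second agent concedes
```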
55
Risk Evaluation
bull risk measures the fraction you have left to gain If it is close to one you have gained little (and are more willing to risk)
bull This assumes you know what others utility is
bull What one sets as initial goal affects risk If I set an impossible goal my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
(figure: A_i's best deal, A_j's best deal, and the conflict deal; "how much am I willing to risk a conflict?"; the maximum to gain from agreement; the maximum still hoped to gain)
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) Not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium – if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain: recall agent 1 delivered to a; agent 2 delivered to a and b
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for agent 1, 1 for agent 2
Can they reach an agreement? Who will concede?
60
Conflict Deal
(figure: agent 1's best deal and agent 2's best deal, each labeled "he should concede")
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
(figure: distribution point with a and d at distance 7, and b and c along the way at distances 1, 1, 1)
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than alone; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No. | Pure deal     | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)     | 0                 | 10
2   | (abc, d)      | 1                 | 3
3   | (ab, cd)      | 2                 | 2
4   | (a, bcd)      | 3                 | 1
5   | (∅, abcd)     | 10                | 0
    | Conflict deal | 0                 | 0

(deals ordered from agent 1's favorite to agent 2's favorite: 5, 4, 3, 2, 1)
63
What bothers you about the previous agreement?
• They decide to both get (2, 2) utility, rather than the expected utility of (0, 10) for another choice
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future, or sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Has side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD coordinated plans are never worse, as you can just do your original task
• With an SOD you may get in each other's way
• Don't accept partially completed plans
The state-oriented domain is a bit more powerful than the TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• Looking for a state in the intersection of the goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one would get a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent – the role i plays with probability (1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)

[Figure: a distribution point connected to city a and city b, each at distance 1]

Cost function:
c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…

Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• ⟨(∅, ab); p⟩ means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
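The computation above can be sketched as a small solver: given each agent's utility under the two possible role assignments, it returns the probability p that equalizes the expected utilities. The helper name `fair_p` and the argument order are illustrative, not standard notation; the numbers are the slide's deal 3.

```python
from fractions import Fraction

def fair_p(u1_if_p, u1_if_not, u2_if_p, u2_if_not):
    """Solve p*u1_if_p + (1-p)*u1_if_not = p*u2_if_p + (1-p)*u2_if_not for p."""
    denom = (u1_if_p - u1_if_not) - (u2_if_p - u2_if_not)
    if denom == 0:
        return None  # expected utilities never cross, so no fair p exists
    p = Fraction(u2_if_not - u1_if_not, denom)
    return p if 0 <= p <= 1 else None

# Deal 3: with probability p agent 1 delivers nothing (utility 1),
# otherwise it delivers both parcels (utility -2); agent 2 mirrors.
print(fair_p(1, -2, 0, 3))  # 5/6
```

The same function also explains the next slide's failure case: `fair_p(0, 0, 2, 2)` returns `None`, because the deal (a, b) gives each agent the same utility under either role, so no probability can close the 0 vs. 2 gap.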
74
Try again with the other choice in the negotiation set
• ⟨(a, b); p⟩ means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = ⟨(T_A ∪ T_B, ∅); p⟩ such that NS(m) = max over deals d of NS(d)
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
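The slide's suggested exercise can be checked directly: splitting a fixed total of 10 between the two agents, the product of utilities peaks at the even split, which is why the product measure favors fair divisions.

```python
# Enumerate every integer split of a total utility of 10 between two agents
# and find the split whose product of utilities is largest.
splits = [(u, 10 - u) for u in range(11)]
products = {s: s[0] * s[1] for s in splits}
best = max(products, key=products.get)
print(best, products[best])  # (5, 5) 25
```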
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1 but not on the table. Agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the (2, 6) cost split (agent 1 takes the cheaper role) p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it pays 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus I can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win. Could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way – as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to do the swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(−4) = 4
• If they could have both been satisfied, assume the cost for each is 24. Then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing" – but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing each plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like the airport shuttle) – the worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = Price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
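A minimal sketch of the weighted-attribute option, assuming the slide's 60/15/25 weights are percentages and that each attribute has already been normalized to a 0–1 satisfaction score (so a cheaper price scores higher). The two offers are invented for illustration.

```python
# Hypothetical weights from the slide, as fractions of 1.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    # Linear weighted sum over normalized 0-1 attribute scores.
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

offer_a = {"price": 0.9, "quality": 0.4, "support": 0.6}  # cheap, mediocre
offer_b = {"price": 0.5, "quality": 1.0, "support": 0.8}  # pricier, high quality
print(utility(offer_a), utility(offer_b))
```

With price weighted at 60%, the cheap offer wins (0.75 vs. 0.65) even though it is worse on both other attributes, which illustrates how the weighting encodes the agent's tradeoffs.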
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for a lie
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where exceeding a threshold forces me to buy new equipment
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule at the arrow's start is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that the tasks Z add to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats – is the postmen domain concave? (No, unless restricted to trees)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon
• Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
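The two implications can be verified by brute force on a small example. The per-task costs below are invented, chosen to mimic the fax domain, where connection costs are independent and therefore add exactly (the modular case); the checks then confirm that this cost function is also concave and subadditive.

```python
from itertools import combinations

# Hypothetical independent per-task costs (fax-domain style).
per_task = {"a": 3, "b": 2, "c": 4}

def c(s):
    # Costs add exactly because tasks are independent: the modular case.
    return sum(per_task[t] for t in s)

subsets = [frozenset(combo)
           for r in range(len(per_task) + 1)
           for combo in combinations(per_task, r)]

modular = all(c(x | y) == c(x) + c(y) - c(x & y) for x in subsets for y in subsets)
concave = all(c(y | z) - c(y) <= c(x | z) - c(x)
              for x in subsets for y in subsets for z in subsets if x <= y)
subadditive = all(c(x | y) <= c(x) + c(y) for x in subsets for y in subsets)

print(modular, concave, subadditive)  # True True True
```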
99
Examples of task systems

Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries

The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• Agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of the characterization of relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Ex: if A1 hides his letter to c, his utility doesn't increase
• If he tells the truth: p = 1/2
• Expected utility: ⟨(abc, ∅); 1/2⟩ = 5
• Under the lie: p = 1/2 (as the apparent utility is the same)
• Expected utility (for A1): ⟨(abc, ∅); 1/2⟩ = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in a Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out)
• FP3: in a Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are (4, 2), and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
δ1   U(1)   δ2   U(2) seems   U(2) actual
a    2      bc   4            4
b    4      ac   2            2
bc   2      a    4            2
ab   0      c    6            6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): if A1 creates a phantom letter at node c, his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is ⟨(a, b); 1/2⟩ = 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, so no probability is needed
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from the next slide): if A1 lies with a decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is), his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility (p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility (p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
107
Postmen – return to the post office
Concave
Subadditive (h is the decoy)
Phantom
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under the truth: ⟨(fae, bcd); 1/2⟩ = 4 (saves going to two nodes)
• Under the lie, dividing as ⟨(efd, cab); p⟩: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie with ⟨(ab, cdef); p⟩:
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).

[Figure: worth function for the first agent, rising from 0 at 9:00 through 12:00 to 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement

[Figure: utility plotted against the number of negotiation rounds; Agent i's and Agent j's offers converge to a point of acceptance]
123
Utility Graphs – no agreement
• No agreement: Agent j finds the offer unacceptable

[Figure: utility plotted against the number of negotiation rounds; Agent i's and Agent j's offer curves never reach a point of acceptance]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A and accept that A implies B, then you must accept that B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of an argument over a database: (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
• Grounds ⊆ Database
• Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments – mutually defensible
1. Argument x is attacked (with respect to a set of arguments) if some y with y → x is not itself attacked by a member of the set
2. Argument x is acceptable if every attacker of x is attacked
3. An argument set is conflict free if none of its members attack each other
4. A set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
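A small sketch that enumerates the admissible sets for the four arguments. The attack edges are an assumption, since the slide's figure is not reproduced here; they are chosen so that, as the slide says, c is always attacked and d is always acceptable.

```python
from itertools import chain, combinations

# Assumed attack graph: a and b attack each other, both attack c, c attacks d.
args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # Every attacker of x is itself attacked by some member of s.
    return all(any((z, y) in attacks for z in s)
               for y in args if (y, x) in attacks)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [frozenset(c) for c in chain.from_iterable(
    combinations(sorted(args), r) for r in range(len(args) + 1))]
admissible_sets = [set(s) for s in subsets if admissible(s)]
print(admissible_sets)
```

Under these assumed edges, the admissible sets are the empty set, {a}, {b}, {a, d}, and {b, d}: c appears in none of them (it cannot be defended against both a and b), while d can always be defended by whichever of a or b is in.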
133
An Example Abstract Argument System
43
Negotiation Set

Negotiation set:
(a, b), (b, a), (∅, ab)

Individually rational deals:
(a, b), (b, a), (∅, ab), (a, ab), (b, ab)

Pareto optimal deals:
(a, b), (b, a), (ab, ∅), (∅, ab)
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i over the utility for j
• Only deals where both utilities are positive are individually rational (for both) (the origin is the conflict deal)
• Which are Pareto optimal?

[Figure: scatter plot with axes "utility for i" and "utility for j"]
45
Negotiation Set in Task-oriented Domains

[Figure: the circle delimits the space of all possible deals (points A–E), plotted as utility for agent i vs. utility for agent j; the conflict deal sits at the two agents' conflict-deal utilities, and the negotiation set (Pareto optimal + individually rational) is the arc between them]
46
Negotiation Protocol
• π(δ) – the product of the two agents' utilities from δ
• Product-maximizing negotiation protocol: a one-step protocol
– Concession protocol:
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t) such that:
– both deals are from the negotiation set
– ∀i, t > 0: Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t−1)) for both agents
– Agreement: ∃ j ≠ i such that Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
• Only A ⇒ agree on δ(B, t) – A agrees with B's proposal
• Only B ⇒ agree on δ(A, t) – B agrees with A's proposal
• Both A and B ⇒ agree on the δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A and B with π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the split may not be the same for each agent – flip a coin to decide which deal to use)
(applies to pure deals and mixed deals)
47
The Monotonic Concession Protocol – one direction, move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (they can re-propose the same one)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. This assumes we know what the other agent values
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Meta-data: explanation or critique of the deal
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
[Figure: agent 1's best deal and agent 2's best deal at opposite ends of the deal space]
52
The Zeuthen Strategy
Q: I make a proposal in every round, though it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal, each asking "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict compares what it would lose by making a concession with what it would lose by causing a conflict, relative to its current offer
• If both are equally willing to risk conflict, both concede
54
Risk Evaluation

risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agents i and j, respectively.

risk is the willingness to risk conflict (1 means perfectly willing to risk it)
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know what the other's utility is
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
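The risk ratio above can be written directly. This is a sketch; treating a zero-utility offer as risk 1 is one common convention, since an agent demanding nothing loses nothing extra in a conflict.

```python
def risk(u_own_offer, u_their_offer):
    # Zeuthen risk: the fraction of the utility an agent currently demands
    # that it would give up by accepting the opponent's offer instead.
    if u_own_offer == 0:
        return 1.0  # demanding nothing: conflict costs nothing extra
    return (u_own_offer - u_their_offer) / u_own_offer

# An agent whose own offer is worth 3 to it, facing an offer worth 1,
# stands to lose 2/3 of its demand by conceding:
print(risk(3, 1))
print(risk(10, 0))  # would lose everything by conceding, so risk is 1.0
```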
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: A_i's best deal and A_j's best deal with the conflict deal between them; each agent weighs "how much am I willing to risk a conflict?" against the maximum to gain from agreement and the maximum it can still hope to gain]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium – if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain: recall agent 1 delivered to a; agent 2 delivered to a and b

Negotiation set: (a, b), (b, a), (∅, ab)

Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1

Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0

First offers: agent 1 offers (∅, ab); agent 2 offers (a, b). The risk of conflict is 1 for each.

Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: agent 1's best deal and agent 2's best deal at opposite ends, each labeled "he should concede".]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Diagram: distribution point with nodes a and d at distance 7, and nodes b and c connected by edges of length 1.]
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No.  Pure deal     Agent 1's utility  Agent 2's utility
1    (abcd, ∅)     0                  10
2    (abc, d)      1                  3
3    (ab, cd)      2                  2
4    (a, bcd)      3                  1
5    (∅, abcd)     10                 0
     Conflict deal 0                  0

Preference order: agent 1: 5, 4, 3, 2, 1; agent 2: 1, 2, 3, 4, 5.
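The table above is enough to simulate the Zeuthen strategy. A sketch under stated assumptions: each agent starts at its best deal, and a conceding agent steps to its next-preferred deal (the function name and data layout are mine):

```python
def zeuthen(deals):
    """deals: list of (utility1, utility2) pairs.
    Each agent starts with its best deal and concedes one step at a
    time; the agent with less to lose (lower risk) concedes."""
    order1 = sorted(deals, key=lambda d: -d[0])  # agent 1's preference
    order2 = sorted(deals, key=lambda d: -d[1])  # agent 2's preference
    i1 = i2 = 0
    while True:
        p1, p2 = order1[i1], order2[i2]
        if p2[0] >= p1[0]:       # agent 1 accepts agent 2's offer
            return p2
        if p1[1] >= p2[1]:       # agent 2 accepts agent 1's offer
            return p1
        r1 = (p1[0] - p2[0]) / p1[0] if p1[0] else 1.0
        r2 = (p2[1] - p1[1]) / p2[1] if p2[1] else 1.0
        if r1 <= r2:
            i1 += 1              # agent 1 concedes (both concede on a tie)
        if r2 <= r1:
            i2 += 1

deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]
print(zeuthen(deals))  # (2, 2): equal risks at every round, both concede
```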
63
What bothers you about the previous agreement?
• They decide to both get (2, 2) utility rather than the expected utility of (0, 10) for another choice.
• Is there a solution?
• Fair versus higher global utility.
• Restrictions of this method (no promises for the future, no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example: slotted blocks world; blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together"; it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• C(J)_k is the cost of k's role in joint plan J.
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD, coordinated plans are never worse, as you can always just do your original task.
• With SOD, you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Inter-agent comparison of utility: common utility units.
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
– Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– The goals may contradict, so there is no mutually acceptable state (e.g., both need a car).
– We can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up the other).
– There might be a reachable state which satisfies both, but it may be too expensive; we are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents.
• State 1: a cost of 6 for one agent and 2 for the other.
• State 2: costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that one agent would get a certain plan.
• Called a mixed deal: a deal with probability p. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent (the role i plays with probability 1-p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Diagram: distribution point with city a and city b, each at distance 1; a and b are distance 2 apart.]
Cost function: c(∅)=0; c(a)=1; c(b)=1; c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• [(∅, ab) : p] means agent 1 does ∅ with probability p and ab with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
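The fair probability can be recovered by solving the linear equation above with exact arithmetic. A small helper (the function name and parameter layout are mine):

```python
from fractions import Fraction

def fair_p(u1_at_p, u1_at_1mp, u2_at_p, u2_at_1mp):
    """Solve p*u1_at_p + (1-p)*u1_at_1mp == p*u2_at_p + (1-p)*u2_at_1mp.
    Returns the fair p, or None when no single p equalizes the utilities."""
    a = u1_at_p - u1_at_1mp   # slope of agent 1's expected utility in p
    b = u2_at_p - u2_at_1mp   # slope of agent 2's expected utility in p
    if a == b:
        # parallel lines: either equal for every p or never equal
        return None
    return Fraction(u2_at_1mp - u1_at_1mp, a - b)

# Deal 3 above: agent 1 gets 1 with prob p, -2 otherwise; agent 2 gets 0 or 3.
print(fair_p(1, -2, 0, 3))  # 5/6
```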
74
Try again with the other choice in the negotiation set
• [(a, b) : p] means agent 1 does a with probability p and b with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅) : p] chosen so that the product of utilities is maximized over all deals.
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias (like switching goals in soccer).
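The "divide 10 units" exercise above can be run directly; the product is maximized at the even split:

```python
# Splitting a fixed total of 10 utility between two agents:
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best, best[0] * best[1])  # (5, 5) 25: the even split maximizes the product
```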
77
Examples: Cooperative (each is helped by the joint plan)
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up at the same time, costing each 2. A win, as neither had to move the other's block out of the way.
• If done by one agent, the cost would be four, so the utility to each is 2.
78
Examples: Compromise (both can succeed, but worse for both than if the other agent weren't there)
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign A1 the cheaper role (of the costs 2 and 6) p of the time.
• Utility for agent 1 = p(1) + (1-p)(-3): it loses utility if it pays 6 for a benefit of 3.
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal: 4p - 3 = 4 - 4p ⇒ p = 7/8
• Thus we can take an unfair division and make it fair.
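Checking the arithmetic above with exact fractions (the worth and cost numbers are the slide's):

```python
from fractions import Fraction

p = Fraction(7, 8)
u1 = p * 1 + (1 - p) * (-3)   # agent 1: worth 3, role costs 2 or 6
u2 = p * 0 + (1 - p) * 4      # agent 2: worth 6, role costs 6 or 2
print(u1, u2)  # 1/2 1/2: equal, so p = 7/8 makes the division fair
```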
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins; better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, continued
• Winning agent's utility: 26 - 4 - 10 = 12
• Losing agent's utility: -4 (as he helped with the swap)
• So with probability 1/2 each: (1/2)(12) + (1/2)(-4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
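The expected-utility comparison above, verified with exact arithmetic (costs are the slide's made-up numbers):

```python
from fractions import Fraction

win = 26 - 4 - 10   # winner: solo cost saved, minus swap and "rest" costs = 12
lose = -4           # loser only helped with the swap
expected = Fraction(1, 2) * win + Fraction(1, 2) * lose
both_satisfied = 26 - 24   # if both goals met at cost 24 each, utility is 2
print(expected, both_satisfied)  # 4 2: gambling doubles the expected utility
```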
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example: multi-agent tile world (like an airport shuttle); it isn't just a specific state that matters, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support: a multiple set of attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
– Utility = price×0.60 + quality×0.15 + support×0.25
• Rating/ranking each attribute:
– price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum.
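A sketch of the weighted-attribute scoring above (the weights are the slide's; the sample offers and the 0-to-1 score normalization are my assumptions):

```python
# Each attribute is scored in [0, 1] (1 = best); weights sum to 1.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    """Weighted sum over the offer's attribute scores."""
    return sum(WEIGHTS[attr] * score for attr, score in offer.items())

cheap_ok   = {"price": 0.9, "quality": 0.5, "support": 0.4}
pricey_top = {"price": 0.4, "quality": 1.0, "support": 0.9}
# With price weighted at 0.60, the cheap offer wins despite lower quality:
print(utility(cheap_ok), utility(pricey_top))
```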
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive TOD with equality (= rather than <):
– Deliveries in opposite directions: doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
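A generic check over an explicit cost table; the two small cost functions below are hypothetical, mirroring the tree-delivery and threshold examples:

```python
def is_subadditive(cost):
    """cost maps frozensets of tasks to costs; checks
    c(X | Y) <= c(X) + c(Y) for all pairs with a known union."""
    sets = list(cost)
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in sets for y in sets if (x | y) in cost)

f = frozenset
tree = {f(): 0, f("a"): 2, f("b"): 2, f("ab"): 3}       # shared trunk saves distance
threshold = {f(): 0, f("a"): 1, f("b"): 1, f("ab"): 3}  # doing both exceeds the sum
print(is_subadditive(tree), is_subadditive(threshold))  # True False
```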
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• "I need to pick something up at the store." (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip.)
• "I need to deliver an empty letter." (No good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
– c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The beginning of each rule is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular encourages truth telling more than the others.
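The three attribute definitions (modular, concave, subadditive) can be checked mechanically over a full cost table. A sketch with made-up independent task costs (a fax-domain-style modular cost, which should pass all three checks, since modularity implies concavity implies subadditivity):

```python
from itertools import combinations

def powerset(items):
    items = list(items)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

# Modular cost: each task has an independent price (fax-domain style).
price = {"a": 1, "b": 2, "c": 4}
cost = {s: sum(price[t] for t in s) for s in powerset(price)}
subsets = list(cost)

modular = all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
              for x in subsets for y in subsets)
concave = all(cost[x | z] - cost[x] >= cost[y | z] - cost[y]
              for x in subsets for y in subsets if x <= y
              for z in subsets)
subadditive = all(cost[x | y] <= cost[x] + cost[y]
                  for x in subsets for y in subsets)
print(modular, concave, subadditive)  # True True True
```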
97
For subadditive domain
98
Attributes of task system: Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset of Y.
• We expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes: Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs, minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterization: relationships implied between cells, and with the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Ex: if A1 hides the letter to c, his utility doesn't increase.
• If he tells the truth, p = 1/2.
• Expected utility: [(abc, ∅) : 1/2] = 5
• Lie: p = 1/2 (as the apparent utility is the same).
• Expected utility (for agent 1): [(abc, ∅) : 1/2] = (1/2)(0) + (1/2)(2) = 1 (as he still has to deliver the hidden letter).
103
• FP2: in subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out).
• FP3: in concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; it is hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are (4, 2), and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
We assume we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks  U(1)  Agent 2's tasks  U(2) (seems)  U(2) (actual)
a                2     bc               4             4
b                4     ac               2             2
bc               2     a                4             2
ab               0     c                6             6
105
Non-incentive compatible fixed points
• FP5: in concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is [(ab, ∅) : 1/2] = (1/2)(4) + (1/2)(2) = 3.
• Lie: (bc, a) is the logical division, as no probability split is needed.
• Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as the lie changes the probability: if you deliver, I make you deliver to h as well).
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up bc is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver h).
• If he tells the truth, p (of agent 1 delivering everything) = 9/14, as:
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as:
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5) ⇒ 18p = 11
• Utility (p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility (p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So: lying helped.
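The probabilities and payoffs quoted above check out with exact arithmetic:

```python
from fractions import Fraction as F

# Truth: p(-1) + (1-p)*6 == p*4 + (1-p)*(-3)  =>  14p = 9
p_truth = F(9, 14)
assert p_truth * -1 + (1 - p_truth) * 6 == p_truth * 4 + (1 - p_truth) * -3

# Decoy letter to h: p(-3) + (1-p)*6 == p*4 + (1-p)*(-5)  =>  18p = 11
p_lie = F(11, 18)
assert p_lie * -3 + (1 - p_lie) * 6 == p_lie * 4 + (1 - p_lie) * -5

# Agent 1's real payoff is p(-1) + (1-p)*6 in both cases:
for p in (p_truth, p_lie):
    print(p, p * -1 + (1 - p) * 6)
# 9/14 -> 21/14 = 1.5; 11/18 -> 31/18, about 1.72: lying helped
```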
107
Postmen: return to post office
[Diagrams: a concave example; a subadditive example (h is the decoy); a phantom example.]
108
Non-incentive compatible fixed points
• FP7: in modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (you think I have fewer tasks, so an increased load appears to cost more than it really does).
• Ex 3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Ex 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under the truth: [(fae, bcd) : 1/2] = 4 (each saves going to two nodes).
• Under the lie, dividing as [(efd, cab) : p], one side always wins and the other always loses. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with [(ab, cdef) : p]:
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually (3/4)(6) + (1/4)(0) = 4.5
• Note: when I am assigned cdef (1/4 of the time), I still have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places, which is what I was assigned originally: zero utility for that branch.
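The FP8 arithmetic, verified exactly:

```python
from fractions import Fraction as F

# Under the "hide" lie, split as (ab, cdef) with probability p:
# p*4 + (1-p)*0 == p*2 + (1-p)*6  =>  8p = 6
p = F(3, 4)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6

# A1's actual utility: 3/4 of the time utility 6, else 0 (still visits a):
print(p * 6 + (1 - p) * 0)  # 9/2 = 4.5 > 4, the utility of telling the truth
```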
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER: case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law: cite previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory: 30% more proposals.
• Without argumentation: fewer proposals and better solutions.
• No failure avoidance: more proposals with objections.
• No preference analysis: oscillatory condition.
• No feedback: communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, and so on. By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Plot: worth function for the first agent, rising from 0 to 100 across meeting times 9:00, 12:00, 16:00.]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs: convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Plot: utility vs. number of negotiation rounds for Agent i and Agent j, whose curves converge to a point of acceptance.]
123
Utility Graphs: no agreement
• No agreement: Agent j finds the offer unacceptable.
[Plot: utility vs. number of negotiation rounds for Agent i and Agent j; the curves never meet.]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. Why did the agent pay so much for the car?
• Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet and shows the strength of their feelings.
4. Kisceral: appeals to the intuitive. "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
• grounds ⊆ database, and
• the sentence can be proved from the grounds
(we give reasons for our conclusions)
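A minimal sketch of checking an argument (Sentence, Grounds): the grounds must be a subset of the database and must entail the sentence. Here entailment is simple forward chaining over Horn rules; the encoding of formulae is my assumption:

```python
# A formula is either an atom like "A", or a Horn rule (("A", "B"), "C")
# meaning "A and B imply C".
def entails(grounds, sentence):
    """Forward-chain over the Horn rules in grounds until no new
    atoms are derived; report whether sentence was derived."""
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if not isinstance(g, str)]
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in facts and all(b in facts for b in body):
                facts.add(head)
                changed = True
    return sentence in facts

database = {"A", (("A",), "B"), "unrelated_fact"}
grounds = {"A", (("A",), "B")}
# Valid argument: the grounds are in the database and prove B.
print(grounds <= database and entails(grounds, "B"))  # True
```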
127
Attacking Arguments
• "Milk is good for you."
• "Cheese is made from milk."
• "Cheese is good for you."
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A undercuts B or A rebuts B
– A defeats B = A undercuts B, or (A rebuts B and B does not undercut A)
– A strongly attacks B = A attacks B and B does not undercut A
– A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition: hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of the arguments).
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: true unless proven false.
131
Admissible Arguments (mutually defensible)
1. Argument x is attacked by a set if some member y of the set attacks x (y → x).
2. Argument x is acceptable with respect to a set if every attacker of x is attacked by the set.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
[Attack graph over arguments a, b, c, d.]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
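A sketch computing the admissible sets by brute force. The attack edges are my assumption for the pictured graph (a and b attack each other, both attack c, and c attacks d), chosen to match "c is always attacked; d is always acceptable":

```python
from itertools import combinations

args = "abcd"
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # every attacker of x is itself attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for y in args if (y, x) in attacks)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [set(c) for r in range(5) for c in combinations(args, r)]
print(sorted("".join(sorted(s)) for s in subsets if admissible(s)))
# ['', 'a', 'ad', 'b', 'bd']: c never appears; d needs a defender against c
```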
133
An Example Abstract Argument System
44
Negotiation Set illustrated
• Create a scatter plot of the utility for i against the utility for j.
• Only deals where both utilities are positive are individually rational (for both); the origin is the conflict deal.
• Which are Pareto optimal?
[Plot: utility for i vs. utility for j.]
45
Negotiation Set in Task-oriented Domains
[Diagram: deals A, B, C, D, E plotted by utility for agent i vs. utility for agent j; lines mark the utility of the conflict deal for each agent; the circle delimits the space of all possible deals; the negotiation set (Pareto optimal + individually rational) is the arc between them.]
46
Negotiation Protocol
• π(δ) – the product of the two agents' utilities from deal δ
• Product-maximizing negotiation protocol: a one-step protocol
• Concession protocol: at each step t ≥ 0, A offers δ(A, t) and B offers δ(B, t) such that
  – both deals are from the negotiation set
  – for each agent i and t > 0, Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t−1)) – each new proposal is less desirable for its proposer than the last
• Negotiation ends with
  – conflict, if Utility_i(δ(i, t)) = Utility_i(δ(i, t−1)) – neither agent concedes
  – agreement, if for some j ≠ i, Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
• If only A agrees => accept δ(B, t); if only B agrees => accept δ(A, t)
• If both agree => accept the δ(k, t) with π(δ(k, t)) = max(π(δ(A, t)), π(δ(B, t)))
• If both agree and π(δ(A, t)) = π(δ(B, t)) => flip a coin (the product is the same, but the split may not be the same for each agent – flip a coin to decide which deal to use)
• Applies to pure deals and mixed deals
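As a sketch of the one-step, product-maximizing rule above (the deal utilities here are invented for illustration):

```python
# Hypothetical deals from a negotiation set, as (utility_A, utility_B) pairs.
deals = [(6, 1), (4, 4), (1, 6)]

def nash_product(deal):
    """Product of the two agents' utilities for a deal."""
    u_a, u_b = deal
    return u_a * u_b

# One-step protocol: each agent proposes its favourite among the
# product-maximizing deals; here (4, 4) uniquely maximizes the product,
# so both propose it and agreement is immediate.
best = max(nash_product(d) for d in deals)
candidates = [d for d in deals if nash_product(d) == best]
agreement = candidates[0]  # a tie would be settled by a coin flip
print(agreement)  # (4, 4)
```

With several product-maximizing deals of equal product, the coin flip only decides which agent gets the better side; the product itself is unchanged.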
47
The Monotonic Concession Protocol – agents move in one direction, toward the middle
Rules of this protocol:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same deal)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as its own proposal
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession (this assumes we know what the other agent values)
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal
• Metadata: an explanation or critique may accompany a deal
48
Condition for Consenting to an Agreement
Agreement is reached if each agent finds the deal proposed by the other at least as good as its own proposal:
Utility1(δ2) ≥ Utility1(δ1) and Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages
  – Symmetric (no agent plays a special role)
  – Ensures convergence; it will not go on indefinitely
• Disadvantages
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy answers three questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic concession protocol
Q: What should my first proposal be?
A: Your best deal among all possible deals in the negotiation set (this is also a way of telling the other agent what you value)
[Figure: agent 1's best deal and agent 2's best deal at opposite ends of the negotiation set]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal, each agent asking "how much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– Your expected utility from the deal is already low (closer to zero)
– If conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict compares the utility lost by making a concession against the utility lost by causing a conflict, with respect to its current offer.
• If both agents are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agents i and j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk it).
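The risk formula above can be sketched directly (the example utilities are made up; treating a zero-utility offer as risk 1 is the usual convention, since such an agent has nothing to lose):

```python
from fractions import Fraction

def risk(u_own_offer, u_other_offer):
    """Zeuthen risk for agent i:
    (Utility_i(delta_i) - Utility_i(delta_j)) / Utility_i(delta_i).
    An agent whose own offer is worth 0 to it is fully willing
    to risk conflict."""
    if u_own_offer == 0:
        return Fraction(1)
    return Fraction(u_own_offer - u_other_offer, u_own_offer)

# Agent i values its own offer at 6 and the opponent's offer at 2:
print(risk(6, 2))  # 2/3 -- two thirds of the potential gain is still at stake
```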
55
Risk Evaluation
• risk measures the fraction you still have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility function
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: deals ranging from Ai's best deal to Aj's best deal, with the conflict deal below; arrows mark each agent's "maximum to gain from agreement" and "maximum still hoped for" – how much am I willing to risk a conflict?]
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again in the next round – but not so much that you give up more than you needed to.
Q What if both have equal risk
A Both concede
58
About MCP and Zeuthen Strategies
bull Advantages
– Simple and reflects the way human negotiation works
– Stability – it is in Nash equilibrium: if one agent is using the strategy, the other can do no better than use it him/herself
bull Disadvantages
ndash Computationally expensive ndash players need to compute the entire
negotiation set
ndash Communication burden ndash negotiation process may involve
several steps
59
Parcel Delivery Domain (recall: agent 1 must deliver to a; agent 2 must deliver to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)
Utility for agent 1:
• Utility1(a, b) = 0
• Utility1(b, a) = 0
• Utility1(∅, ab) = 1
Utility for agent 2:
• Utility2(a, b) = 2
• Utility2(b, a) = 2
• Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent.
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each marked "he should concede"]
Zeuthen does not reach a settlement here: neither agent will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: the distribution point connects to a and to d, each at distance 7; a–b, b–c, and c–d are each distance 1]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, since neither agent can do worse than the conflict deal; e.g., (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here; both concede on equal risk)

No. | Pure deal  | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)  | 0                 | 10
2   | (abc, d)   | 1                 | 3
3   | (ab, cd)   | 2                 | 2
4   | (a, bcd)   | 3                 | 1
5   | (∅, abcd)  | 10                | 0
Conflict deal   | 0                 | 0

Agent 1's preference order over the deals is 5, 4, 3, 2, 1; agent 2's is the reverse.
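A minimal simulation of Zeuthen-style concession on the table above. The concession rule is simplified to "step one deal down your own preference order", which is enough for this symmetric example:

```python
from fractions import Fraction

# Pure deals from the table, as (agent 1 utility, agent 2 utility) pairs.
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(u_own, u_other):
    """Willingness to risk conflict, given the utility of my own offer
    to me and the utility of the opponent's offer to me."""
    return Fraction(1) if u_own == 0 else Fraction(u_own - u_other, u_own)

order1 = sorted(deals, key=lambda d: -d[0])  # agent 1's preference order
order2 = sorted(deals, key=lambda d: -d[1])  # agent 2's preference order
i1 = i2 = 0
while True:
    p1, p2 = order1[i1], order2[i2]
    # agreement: an agent likes the other's offer at least as much as its own
    if p2[0] >= p1[0] or p1[1] >= p2[1]:
        agreement = p2 if p2[0] >= p1[0] else p1
        break
    r1, r2 = risk(p1[0], p2[0]), risk(p2[1], p1[1])
    if r1 <= r2:           # the agent with lower risk concedes;
        i1 += 1            # on equal risk, both concede
    if r2 <= r1:
        i2 += 1
print(agreement)  # both concede twice and meet at (2, 2)
```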
63
What bothers you about the previous agreement?
• The agents settle for (2, 2) utility rather than a deal like (0, 10) with higher total utility
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method: no promises about the future and no sharing of utility
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy; in fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect clear(black)
• Negotiation: develop joint plans and schedules so that agents help, and do not hinder, each other
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of the plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD coordinated plans are never worse, as you can always just do your original task
• With an SOD you may get in each other's way
• Don't accept partially completed plans
• A state-oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all agents can perform all tasks, and a task's cost is the same regardless of which agent performs it)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of the Final State
• Each agent's goal is represented as the set of states it would be happy with
• We look for a state in the intersection of the goals
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – The goals contradict, so there is no mutually acceptable state (e.g., both need the car)
  – A common state exists, but it cannot be reached with the primitive operations of the domain (they could both travel together, but they may need to know how to pick up the other)
  – There is a reachable state that satisfies both, but it is too expensive – the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents
• State 1 costs one agent 6 and the other agent 2
• State 2 costs both agents 5
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability of getting each plan
• This is called a mixed deal – a deal with a probability attached. Compute the probability so that the expected utility is the same for both agents.
71
Cost
• If δ = (J, p) is a mixed deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – i plays k's role with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
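A small sketch of the expected-cost and utility formulas above (the numbers in the usage line are illustrative, not taken from the slides):

```python
from fractions import Fraction

def expected_cost(p, cost_own_role, cost_other_role):
    """cost_i of a mixed deal (J, p): agent i performs its own role with
    probability p and its opponent's role with probability 1 - p."""
    return p * cost_own_role + (1 - p) * cost_other_role

def utility(standalone_cost, p, cost_own_role, cost_other_role):
    """Utility = cost of achieving the goal alone minus the expected
    cost under the joint plan."""
    return standalone_cost - expected_cost(p, cost_own_role, cost_other_role)

# Illustrative: standalone cost 3, roles costing 1 and 3, p = 1/2.
print(utility(3, Fraction(1, 2), 1, 3))  # 3 - 2 = 1
```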
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: distribution point with city a and city b each at distance 1; a and b are distance 2 apart]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with a probability
• The mixed deal [(∅, ab) : p] means agent 1 does nothing with probability p and delivers ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p  =>  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair
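The fair-probability calculation generalizes to a one-line solver (`fair_p` is a hypothetical helper name; the second call previews the next slide's no-solution case):

```python
from fractions import Fraction

def fair_p(a1, b1, a2, b2):
    """Solve p*a1 + (1-p)*b1 == p*a2 + (1-p)*b2 for p, where a/b are
    each agent's utilities under the two role assignments.  Returns
    None when no probability equalizes the utilities."""
    denom = (a1 - b1) - (a2 - b2)
    if denom == 0:
        return None
    return Fraction(b2 - b1, denom)

# Deal 3: agent 1 idle (u = 1) vs. delivering ab (u = -2);
# agent 2 correspondingly gets 0 vs. 3.
print(fair_p(1, -2, 0, 3))   # 5/6
# The (a, b) split: utilities 0/0 and 2/2 -- no p works.
print(fair_p(0, 0, 2, 2))    # None
```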
74
Try again with the other choice in the negotiation set
• The mixed deal [(a, b) : p] means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2 – no solution
• Can you see why no p can make this fair? (Agent 1's utility is 0 and agent 2's is 2 under either assignment, so the lottery changes nothing.)
75
Mixed deal
• All-or-nothing deal: one agent does everything, i.e., a mixed deal m = [(T_A ∪ T_B, ∅) : p] chosen so that NS(m) = max over deals d of NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the mechanism that maximizes the product of utilities (as this is a fairer division). Try dividing a total utility of 10 (zero-sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (the goals are compatible)
• Assume a pickup costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be 4 – so the utility to each is 2
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved: the cost for one is 2, while the cost for both averages 4
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make the division "fair"
• Assign the (2, 6) cost split (A1 takes the cheap role) with probability p
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it pays cost 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Setting the utilities equal: 4p − 3 = 4 − 4p, so p = 7/8
• Thus I can take an unfair division and make it fair
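The p = 7/8 calculation above, verified with exact arithmetic:

```python
from fractions import Fraction

# A1's utilities: easy role (cost 2, worth 3) -> 1; hard role (cost 6) -> -3.
# A2's utilities: hard role (cost 6, worth 6) -> 0; easy role (cost 2) -> 4.
# Choose p = P(A1 gets the easy role) so the expected utilities match:
#   p*1 + (1-p)*(-3) == p*0 + (1-p)*4
p = Fraction(7, 8)
u1 = p * 1 + (1 - p) * (-3)
u2 = p * 0 + (1 - p) * 4
print(u1, u2)  # 1/2 1/2 -- both agents expect the same gain
```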
81
Example: conflict
• I want black on white (in slot 1); you want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The coin's weighting needn't be 50–50
• It may make sense to let the agent with the highest worth get its way, as the total utility is greater (it would accomplish its goal alone): efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of the two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, numbers out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, continued
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability 1/2 each: ½(12) + ½(−4) = 4
• If both could have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest worth
• We assume the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain: Definition
• Can be defined as a tuple <E, Ag, J, c>
• E: the set of possible environment states
• Ag: the set of possible agents
• J: the set of possible joint plans
• c: the cost of executing a plan
86
Worth-Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over joint plans, schedules, and goal relaxation; the agents may reach a state a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of it, such as price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimality – find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate utility?
• Weighting each attribute
  – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – Price: [5, 100], quality: [0, 10], support: [1, 5]
  – Try to find the Pareto optimum
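A sketch of the weighted-attribute approach; the attribute scores below are invented and assumed to already be normalized to [0, 1]:

```python
# Weights from the slide: price 60%, quality 15%, support 25%.
weights = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    """Weighted sum of normalized attribute scores."""
    return sum(weights[attr] * scores[attr] for attr in weights)

# Hypothetical offer scored per attribute:
offer = {"price": 0.8, "quality": 0.5, "support": 1.0}
print(round(utility(offer), 3))  # 0.805
```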
89
Incomplete Information
• Agents don't know the tasks of others in a TOD
• Solution:
  – Exchange the missing information
  – Penalize lying
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task-Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditivity: delivering to one saves distance to the other (in a tree arrangement)
• Example of a subadditive TOD with equality (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where exceeding a threshold forces me to buy new equipment
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  – Needing to pick something up at a store (you can invent something for the other agent to pick up, but if you are the one assigned, you just won't bother making the trip)
  – Needing to deliver an empty letter (useless, but the deliverer won't discover the lie)
92
Incentive-Compatible Mechanisms
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of the arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task-Oriented Domain
• Take two task sets X and Y where X is a subset of Y, and introduce another task set Z; then
  – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of the Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows go between domains); each chain starts at a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modularity encourages truth-telling more than the other domain classes
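A toy check of modularity, using a simplified fax-domain cost model (one unit per distinct destination, since connection costs are independent; the destinations are made up):

```python
# Cost of a task set = number of distinct destinations to connect to.
def c(tasks):
    return len(set(tasks))

X, Y = {"a", "b"}, {"b", "c"}
# Modularity: c(X u Y) == c(X) + c(Y) - c(X n Y)
print(c(X | Y), c(X) + c(Y) - c(X & Y))  # 3 3
```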
97
For a subadditive domain:
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (the agent was going that way anyway) but adds 2 to its superset Y (the agent was going around the loop)
• Concavity implies subadditivity; modularity implies concavity
99
Examples of task systems
Database queries
• Agents have access to a common database, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The fax domain
• Agents send faxes to locations on a telephone network
• Multiple faxes can be sent once the connection to the receiving node is established
• Agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the fax domain is modular (as connection costs are independent)
• Modularity implies concavity
101
3-dimensional table characterizing the relationships (implied relationships between cells, and within the same domain attribute)
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive-Compatible Fixed Points (FP) (postmen return home)
FP1: in a subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase
• If he tells the truth: p = 1/2, and his expected utility from the deal [(abc) : 1/2] is 5
• If he lies: p = 1/2 (as the apparent utility is the same), but his true expected utility from [(abc) : 1/2] is ½(0) + ½(2) = 1, as he still has to deliver the hidden letter himself
[Figure: delivery network for the example]
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out)
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular costs add exactly – it is hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.

Agent 1's tasks | U(1) | Agent 2's tasks | U(2) apparent | U(2) actual
a               | 2    | bc              | 4             | 4
b               | 4    | ac              | 2             | 2
bc              | 2    | a               | 4             | 2
ab              | 0    | c               | 6             | 6

Under the lie, the apparent benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just as in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial
• Example (next slide): A1 creates a phantom letter at node c, and his utility rises from 3 to 4
• Truth: p = 1/2, so agent 1's utility from [(ab) : 1/2] = ½(4) + ½(2) = 3
• Lie: (b, ca) is the logical division, as no probability is needed; agent 1's utility is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful): the lie changes the probability – "if you deliver, I make you deliver to h as well"
• Example (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is), and his utility rises from 1.5 to 1.72 (if A1 delivers, he doesn't actually deliver to h)
• Truth: p (the probability that agent 1 delivers everything) = 9/14, since p(−1) + (1−p)(6) = p(4) + (1−p)(−3) gives 14p = 9
• With the invented task h: p = 11/18, since p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility at p = 9/14: p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility at p = 11/18: p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
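The FP6 arithmetic above, verified with exact fractions:

```python
from fractions import Fraction

# Truth: p solves p*(-1) + (1-p)*6 == p*4 + (1-p)*(-3)  ->  14p = 9
p_truth = Fraction(9, 14)
# Decoy letter to h: p solves p*(-3) + (1-p)*6 == p*4 + (1-p)*(-5)
p_lie = Fraction(11, 18)
# A1's *actual* utility uses -1 either way (he never delivers the decoy):
u_truth = p_truth * -1 + (1 - p_truth) * 6
u_lie = p_lie * -1 + (1 - p_lie) * 6
print(u_truth, u_lie)  # 3/2 and 31/18 -- the decoy lie helps
```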
107
Postmen – return to the post office
[Figures: a concave example, a subadditive example (where h is the decoy letter), and a phantom example]
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (you think I have fewer tasks, so an increased load appears to cost me more than it really does)
• Example (next slide): A1 hides his letter to node b
• Deal (e, b): under the lie, A1's utility is 0 and A2's utility is 4 – unfair under the lie
• Deal (b, e): under the lie, A1's utility is 2 and A2's utility is 2
• So I get sent to b, but I really needed to go there anyway; my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial
• Example: A1 hides his letter to node a; A1's utility becomes 4.5 > 4 (the utility of telling the truth)
• Under truth: the utility of [(fae, bcd) : 1/2] is 4 (each agent saves going to two nodes)
• Under the lie, dividing as [(ef, dcab) : p] cannot help: one side always wins and the other always loses, and since the workloads are the same, swapping gains nothing – in a mixed deal the two choices must be unbalanced
• Try again under the lie with [(ab, cdef) : p]: p(4) + (1−p)(0) = p(2) + (1−p)(6), so 4p = −4p + 6 and p = 3/4
• A1's actual utility is ¾(6) + ¼(0) = 4.5
• Note: when A1 is assigned cdef (¼ of the time), he still has to deliver to node a after completing the agreed deliveries, so he ends up visiting the 5 places he was originally assigned – zero utility from that branch
110
Modular
111
Conclusion
– To use negotiation protocols, it is necessary to know when each protocol is appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: a negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain
• Agents: the company, the union, and a mediator
• Tasks:
  – Generation of a proposal
  – Generation of a counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides for the present negotiation (as in a court of law – citing previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Reuses repairs for past failures, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes the payoff difference
118
Persuasive argumentation
• Argumentation goals: ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Get feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increase the payoff of the rejecting party by a greater amount than the payoff reduction for the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory behavior
• No feedback – communication overhead increased by 23%
121
Multiple-Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent, rising from 0 at 9:00 hrs to 100 at 16:00 hrs]
(Ref: Rosenschein & Zlotkin, 1994)
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility versus number of negotiation rounds; agent i's and agent j's utility curves approach each other over time and meet at the point of acceptance]
123
Utility Graphs – no agreement
• No agreement: agent j finds the offer unacceptable
[Figure: utility versus number of negotiation rounds; agent i's and agent j's utility curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sunroof, but I changed my preference during the buying process
125
• Four modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet to show the strength of their feelings
4. Kisceral – appeals to intuition: "Doesn't this seem reasonable?"
126
Logic-Based Argumentation
• Basic form of an argument: Database ⊢ (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula, known as the conclusion
  – Grounds is a set of logical formulae such that Grounds ⊆ Database and Sentence can be proved from Grounds
• (We give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B ≡ A →u B or A →r B
  – A defeats B ≡ A →u B or (A →r B and not B →u A)
  – A strongly attacks B ≡ A →a B and not B →u A
  – A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition: Hierarchy of attacks
• Undercuts = →u
• Strongly undercuts = →su = →u − →u⁻¹
• Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
• Defeats = →d = →u ∪ (→r − →u⁻¹)
• Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of an argument (rather than the internals of individual arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption: an argument stands unless proven defeated
131
Admissible Arguments – mutually defensible
1. Argument x is attacked (with respect to a set S) if some attacker y of x (y → x) is not itself attacked by any member of S
2. Argument x is acceptable (with respect to S) if every attacker of x is attacked by S
3. A set of arguments is conflict-free if no member attacks another member
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
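Definitions 2–4 can be sketched directly; the attack relation here (a attacks b, b attacks c, d uninvolved) is a hypothetical example, not necessarily the graph on the next slide:

```python
# Hypothetical abstract argument system: a -> b -> c, with d standing alone.
attacks = {("a", "b"), ("b", "c")}
arguments = {"a", "b", "c", "d"}

def conflict_free(s):
    """No member of s attacks another member of s."""
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is itself attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for (y, target) in attacks if target == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

print(admissible({"a", "c", "d"}))  # True: a defends c, d is unattacked
print(admissible({"b"}))            # False: b cannot answer a's attack
```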
132
[Figure: attack graph over arguments a, b, c, and d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
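The in/out rule above can be turned into a small sketch. The slide's actual attack arrows between a, b, c, and d are not recoverable, so the graph below is a hypothetical one chosen to match the stated conclusions (c always attacked, d always acceptable):

```python
def grounded_labelling(arguments, attacks):
    """Label arguments of an abstract argument system (Dung-style).
    attacks is a set of (x, y) pairs meaning "x attacks y".
    An argument is "in" if all of its attackers are "out";
    it is "out" if at least one attacker is "in"."""
    label = {}
    changed = True
    while changed:
        changed = False
        for x in arguments:
            if x in label:
                continue
            attackers = [y for (y, z) in attacks if z == x]
            if all(label.get(y) == "out" for y in attackers):
                label[x] = "in"        # every attacker is defeated
                changed = True
            elif any(label.get(y) == "in" for y in attackers):
                label[x] = "out"       # has an undefeated attacker
                changed = True
    return label                        # unlabelled arguments stay undecided

# Hypothetical graph: d attacks c, a attacks c, and a and b attack each other.
lab = grounded_labelling({"a", "b", "c", "d"},
                         {("d", "c"), ("a", "c"), ("a", "b"), ("b", "a")})
# d is unattacked, so it is "in"; c is then "out"; a and b stay undecided.
```

The mutually attacking a and b are why "which sets can be true" has more than one answer: {a, d} and {b, d} are both admissible, and this cautious labelling commits to neither.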
133
An Example Abstract Argument System
45
Negotiation Set in Task-oriented Domains
[Figure: deals A, B, C, D, and E plotted by utility for agent i versus utility for agent j. The circle delimits the space of all possible deals; the conflict deal fixes each agent's minimum acceptable utility; the negotiation set (Pareto optimal + individually rational deals) is the arc beyond both conflict utilities.]
46
Negotiation Protocol – π(δ): the product of the two agents' utilities from deal δ
• Product-maximizing negotiation protocol: a one-step protocol
– Concession protocol:
• At t >= 0, A offers δ(A,t) and B offers δ(B,t), such that:
– both deals are from the negotiation set
– ∀i and t > 0: Utility_i(δ(i,t)) <= Utility_i(δ(i,t−1)) – I propose something less desirable for me
• Negotiation ending:
– Conflict: Utility_i(δ(i,t)) = Utility_i(δ(i,t−1)) (no one concedes)
– Agreement: ∃ j ≠ i such that Utility_j(δ(i,t)) >= Utility_j(δ(j,t))
• Only A consents => agree on δ(B,t) (A accepts B's proposal)
• Only B consents => agree on δ(A,t) (B accepts A's proposal)
• Both A and B => agree on δ(k,t) such that π(δ(k,t)) = max(π(δ(A,t)), π(δ(B,t)))
• Both A and B, and π(δ(A,t)) = π(δ(B,t)) => flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
Applies to pure deals and to mixed deals.
47
The Monotonic Concession Protocol – one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (and may re-propose the same one later)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession (assumes we know what the other agent values)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Metadata: explanation or critique of the deal
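The round structure above can be sketched in code. This is a minimal illustration, not an implementation from the slides: the deal space, utilities, and concession strategies below are all toy assumptions.

```python
def monotonic_concession(u1, u2, strategy1, strategy2, conflict_deal, max_rounds=100):
    """Sketch of the Monotonic Concession Protocol.
    strategy_i(own_last, other_last) returns the agent's next offer, and is
    assumed never to lower the opponent's utility versus its previous offer."""
    offer1 = strategy1(None, None)      # round 1: simultaneous proposals
    offer2 = strategy2(None, None)
    for _ in range(max_rounds):
        # agreement: someone finds the other's deal at least as good as its own
        if u1(offer2) >= u1(offer1):
            return offer2
        if u2(offer1) >= u2(offer2):
            return offer1
        new1 = strategy1(offer1, offer2)
        new2 = strategy2(offer2, offer1)
        if new1 == offer1 and new2 == offer2:
            return conflict_deal        # neither agent conceded
        offer1, offer2 = new1, new2
    return conflict_deal

# Toy deal space: integers 0..10; agent 1 prefers high values, agent 2 low.
u1 = lambda d: d
u2 = lambda d: 10 - d
s1 = lambda own, other: 10 if own is None else own - 1   # concede by 1 each round
s2 = lambda own, other: 0 if own is None else own + 1
deal = monotonic_concession(u1, u2, s1, s2, conflict_deal=None)   # meets at 5
```

With these symmetric one-step concessions the agents meet in the middle; a stubborn strategy that never concedes would instead end in the conflict deal.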
48
Condition to Consent an Agreement
If each agent finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) >= Utility1(δ1) and
Utility2(δ1) >= Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (a way of telling others what you value)
[Figure: deal line running from agent 1's best deal to agent 2's best deal]
52
The Zeuthen Strategy
Q: I make a proposal in every round (though it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: deal line from agent 1's best deal to agent 2's best deal; each agent asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict depends on the difference between its loss from making a concession and its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively
risk_i is the willingness to risk conflict (1 is perfectly willing to risk)
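The risk formula on this slide can be written directly. This is a sketch: the utility function and offers below are made-up values, and conflict is taken to have zero utility, as in the slides.

```python
def risk(u, own_offer, other_offer):
    """Zeuthen willingness to risk conflict:
    risk = (U(own offer) - U(other's offer)) / U(own offer).
    Defined as 1 (perfectly willing to risk) when the own offer is worth 0,
    since such an agent loses nothing by holding out."""
    u_own = u(own_offer)
    if u_own <= 0:
        return 1.0
    return (u_own - u(other_offer)) / u_own

# Made-up example: my offer is worth 4 to me, yours is worth 2 to me.
r = risk(lambda d: {"mine": 4, "yours": 2}[d], "mine", "yours")   # (4-2)/4 = 0.5
```

The agent with the lower risk has more to lose from conflict, so it is the one that should concede.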
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know what the other's utility is
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: deal line from Ai's best deal to Aj's best deal, with the conflict deal marked; each agent weighs "How much am I willing to risk a conflict?" against the maximum to gain from agreement and the maximum it can still hope to gain]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivered to a; agent 2 delivered to a and b)

Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 offers (∅, ab); agent 2 offers (a, b)

Utility of agent 1:          Utility of agent 2:
Utility1(a, b) = 0           Utility2(a, b) = 2
Utility1(b, a) = 0           Utility2(b, a) = 2
Utility1(∅, ab) = 1          Utility2(∅, ab) = 0

Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
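Plugging the utilities above into the risk formula from slide 54 shows the deadlock (a sketch; ∅ is written as the empty string):

```python
# Utilities from the slide: deal (x, y) means agent 1 delivers x, agent 2 delivers y.
u1 = {("", "ab"): 1, ("a", "b"): 0, ("b", "a"): 0}   # agent 1
u2 = {("", "ab"): 0, ("a", "b"): 2, ("b", "a"): 2}   # agent 2

def zeuthen_risk(u, own, other):
    # risk = (U(own) - U(other's offer)) / U(own); 1 when own offer is worth 0
    return 1.0 if u[own] <= 0 else (u[own] - u[other]) / u[own]

# Agent 1 offers ("", "ab"); agent 2 offers ("a", "b").
r1 = zeuthen_risk(u1, ("", "ab"), ("a", "b"))   # (1 - 0) / 1 = 1.0
r2 = zeuthen_risk(u2, ("a", "b"), ("", "ab"))   # (2 - 0) / 2 = 1.0
```

Both risks are 1, so Zeuthen says both should concede – but every alternative deal gives at least one agent its conflict utility, so there is no middle ground to concede to, which is the point of the next slide.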
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal at opposite ends; each agent thinks "He should concede"]
Zeuthen does not reach a settlement: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: distribution point with edges of cost 7 to a and to d; unit-cost edges a–b, b–c, c–d]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than alone; e.g., (acbd) is dominated by (abcd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No. | Pure deal    | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)    | 0                 | 10
2   | (abc, d)     | 1                 | 3
3   | (ab, cd)     | 2                 | 2
4   | (a, bcd)     | 3                 | 1
5   | (∅, abcd)    | 10                | 0
Conflict deal: 0, 0

Agent 1 concedes along deals 5 → 4 → 3; agent 2 along 1 → 2 → 3.
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility, rather than the expected utility of the (0, 10) / (10, 0) choices
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, and no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• C(J)_k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 over a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
– Both goals can be achieved at a gain to both (e.g., travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (both could travel together, but may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall) but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• Called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J : p) is a mixed deal, then
cost_i(δ) = p·c(J_i) + (1−p)·c(J_k), where k is i's opponent – i plays k's role with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the parcel delivery example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Figure: distribution point with edges of cost 1 to city a and to city b, and an edge of cost 2 between a and b]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (∅, ab):p means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p => p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
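The arithmetic above can be checked with exact rationals (the utilities are the slide's; `Fraction` avoids float rounding):

```python
from fractions import Fraction

# Deal (∅, ab):p — agent 1 does nothing with probability p, everything with 1-p.
#   U1(p) = (1-p)(-2) + p(1) = 3p - 2
#   U2(p) = (1-p)(3)  + p(0) = 3 - 3p
# Fairness: 3p - 2 = 3 - 3p  =>  6p = 5  =>  p = 5/6
p = Fraction(5, 6)
u1 = (1 - p) * -2 + p * 1
u2 = (1 - p) * 3 + p * 0
# both agents expect utility 1/2
```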
74
Try again with other choice in negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair? (Agent 1 gets 0 and agent 2 gets 2 under either assignment, so no lottery between them changes either expectation.)
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(T_A ∪ T_B, ∅) : p] that maximizes the product of the agents' utilities over the negotiation set
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both products are the same, just to avoid possible bias – like switching goals in soccer
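The suggested exercise – split a total utility of 10 and watch the product – is a couple of lines of code (a sketch; integer splits only):

```python
# Every way to split a total utility of 10 between the two agents.
splits = [(u1, 10 - u1) for u1 in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])   # the even split (5, 5)
```

The product u1·u2 peaks at the equal division, which is why product maximization reads as the "fair" choice.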
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (both goals are compatible)
• Assume each pick-up costs 1 and each set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but with unequal roles
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we can use probability to make it "fair"
• Assign agent 1 the cheap role of the (2, 6) split p of the time
• Utility for agent 1 = p(1) + (1−p)(−3): it loses utility if it spends 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, numbers out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as it helped with the swap)
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle) – it isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price [5, 100], quality [0, 10], support [1, 5]
– Try to find the Pareto optimum
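The weighted-attribute option can be sketched as below; the 60/15/25 weights are the slide's, while the attribute scores (normalized to [0, 1]) are made up for illustration:

```python
def utility(offer, weights):
    """Weighted-sum utility over normalized attribute scores in [0, 1]."""
    return sum(weights[attr] * offer[attr] for attr in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer = {"price": 0.8, "quality": 0.5, "support": 0.4}   # hypothetical scores
u = utility(offer, weights)   # 0.48 + 0.075 + 0.10 = 0.655
```

Normalizing each attribute first matters: raw prices and 1-to-5 support ratings live on different scales, and the weights only mean anything once the scales agree.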
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange the missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of (strictly) subadditive: delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive with equality (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces – say, electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) >= c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); each rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) <= c(X ∪ Z) − c(X), where X ⊆ Y
• The cost that task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No – unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the fax domain is modular (as the costs are independent)
• Modularity implies concavity
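The three properties (subadditive, concave, modular) can be checked mechanically for small task sets. A sketch; the fax-domain cost below is modelled simply as the number of distinct destinations, which is an assumption for illustration:

```python
from itertools import chain, combinations

def all_subsets(tasks):
    s = list(tasks)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

def is_subadditive(cost, tasks):
    """c(X ∪ Y) <= c(X) + c(Y) for all X, Y."""
    subs = all_subsets(tasks)
    return all(cost(x | y) <= cost(x) + cost(y) for x in subs for y in subs)

def is_concave(cost, tasks):
    """c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X) whenever X ⊆ Y."""
    subs = all_subsets(tasks)
    return all(cost(y | z) - cost(y) <= cost(x | z) - cost(x)
               for x in subs for y in subs if x <= y for z in subs)

def is_modular(cost, tasks):
    """c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)."""
    subs = all_subsets(tasks)
    return all(cost(x | y) == cost(x) + cost(y) - cost(x & y)
               for x in subs for y in subs)

fax_cost = len                # fax domain: one unit per distinct destination
destinations = {"a", "b", "c"}
# len is modular (hence concave, hence subadditive);
# a flat "any nonempty trip costs 1" function is subadditive but not modular.
```

Brute force over all subset pairs is exponential, so this only scales to toy task sets, but it makes the implication chain (modular ⇒ concave ⇒ subadditive) easy to spot-check.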
101
3-dimensional table of the characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Example: if A1 hides his letter to c, his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected utility: (abc, ∅):1/2 = 5
• Under the lie, p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1): (abc, ∅):1/2 = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph with edge costs 1, 4, 4, and 1]
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent is assigned the phantom delivery, you are found out)
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)

Agent 1's role | U(1) | Agent 2's role | U(2) (seems) | U(2) (actual)
a              | 2    | bc             | 4            | 4
b              | 4    | ac             | 2            | 2
bc             | 2    | a              | 4            | 2
ab             | 0    | c              | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is (a, b):1/2 = 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, as no probability is needed. Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability (if you deliver, I make you deliver to h)
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (of agent 1 delivering everything) = 9/14, as:
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as:
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
107
Postmen ndash return to postoffice
[Figure: delivery graphs for the concave example (phantom letter) and the subadditive example (decoy letter to h)]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load appears to cost me more than it really does)
• Example 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – unfair under the lie
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial
• Example 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(fae, bcd):1/2 = 4 (each saves going to two nodes)
• Under the lie, dividing as (efd, cab):p means you always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced
• Try again under the lie, with (ab, cdef):p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually:
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I still have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility in that case
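The p = 3/4 claim and the real payoff of the hide lie check out numerically (payoffs from the slide; exact rationals):

```python
from fractions import Fraction

# Apparent utilities under the lie, deal (ab, cdef) with probability p:
#   agent 1: 4p + 0(1-p)      agent 2: 2p + 6(1-p)
# Equal apparent utility: 4p = 6 - 4p  =>  p = 3/4
p = Fraction(3, 4)
apparent1 = p * 4 + (1 - p) * 0
apparent2 = p * 2 + (1 - p) * 6
# Agent 1's REAL utility: the "ab" role already covers his hidden letter,
# so it is really worth 6 to him; the "cdef" role is worth 0 (he still visits a).
real_u1 = p * 6 + (1 - p) * 0    # 9/2 = 4.5 > 4 (the truthful utility)
```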
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
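The two compromise properties can be expressed directly as a selection rule: prefer the deal with the highest joint payoff, breaking ties in favor of the smallest payoff difference. A small sketch with invented payoff pairs:

```python
def pick_compromise(deals):
    """deals: list of (payoff_agent1, payoff_agent2) pairs.
    Maximize joint payoff; break ties by minimizing the payoff difference."""
    return max(deals, key=lambda d: (d[0] + d[1], -abs(d[0] - d[1])))

deals = [(6, 2), (5, 5), (4, 4), (9, 0)]
best = pick_compromise(deals)   # joint payoffs are 8, 10, 8, 9
```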
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Changing the importance attached to an issue
  – Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, and so on. By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
[Graph: worth function for the first agent – worth rises from 0 to 100 over the day, with axis ticks at 9, 12, and 16.]
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's utility curves converge over time to a point of acceptance.]
123
Utility Graphs - no agreement
• No agreement – Agent j finds the offer unacceptable.
[Graph: utility vs. number of negotiation rounds – the two agents' curves never meet, so no agreement is reached.]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified. Why did the agent pay so much for the car?
  – Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings.
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
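A toy version of this check, under the simplifying assumption that formulae are atoms plus rules written as pairs `(a, b)` meaning a → b, so "proved from" reduces to forward chaining. The milk/cheese atoms below are just the running example:

```python
def consequences(grounds):
    """Close a set of atoms under rules of the form (body, head) = body -> head."""
    atoms = {g for g in grounds if isinstance(g, str)}
    rules = {g for g in grounds if isinstance(g, tuple)}
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body in atoms and head not in atoms:
                atoms.add(head)
                changed = True
    return atoms

def is_argument(database, sentence, grounds):
    """(sentence, grounds) is an argument: grounds come from the database
    and the sentence is provable from the grounds."""
    return grounds <= database and sentence in consequences(grounds)

database = {"milk_good", ("milk_good", "cheese_good"), "rain"}
grounds = {"milk_good", ("milk_good", "cheese_good")}
```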
127
Attacking Arguments
The argument:
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B ≡ A u B or A r B
  – A defeats B ≡ A u B, or (A r B and not B u A)
  – A strongly attacks B ≡ A a B and not B u A
  – A strongly undercuts B ≡ A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption – an argument is true unless proven false.
131
Admissible Arguments ndash mutually defensible
With respect to a set of arguments S:
1. argument x is attacked by S if some member y of S attacks x (y → x)
2. argument x is acceptable if every attacker of x is attacked
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
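These four definitions translate almost word for word into code. A sketch over an invented attack relation (a → b, b → c, d → c), in which c is always attacked and d is always acceptable:

```python
def attacked_by(s, x, attacks):
    """True if some member of set s attacks argument x."""
    return any((y, x) in attacks for y in s)

def conflict_free(s, attacks):
    """No member of s attacks another member of s."""
    return not any((y, x) in attacks for y in s for x in s)

def acceptable(x, s, attacks):
    """Every attacker of x is itself attacked by s."""
    attackers = {y for (y, z) in attacks if z == x}
    return all(attacked_by(s, y, attacks) for y in attackers)

def admissible(s, attacks):
    return conflict_free(s, attacks) and all(acceptable(x, s, attacks) for x in s)

# a attacks b, b attacks c, d attacks c
attacks = {("a", "b"), ("b", "c"), ("d", "c")}
```

For example, {a} and {d} are admissible, while {b} is not (its attacker a goes unanswered).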
132
[Diagram: attack graph over arguments a, b, c, and d.]
Which sets of arguments can be accepted? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
46
Negotiation Protocol: π(δ) – the product of the two agents' utilities from deal δ
• Product-maximizing negotiation protocol (one-step protocol)
  – Concession protocol
• At t ≥ 0, A offers δ(A, t) and B offers δ(B, t) such that:
  – Both deals are from the negotiation set
  – ∀i, t > 0: Utility_i(δ(i, t)) ≤ Utility_i(δ(i, t−1)) – I propose something less desirable for me
• Negotiation ending:
  – Conflict: Utility_i(δ(i, t)) = Utility_i(δ(i, t−1)) for both agents
  – Agreement: ∃ j ≠ i such that Utility_j(δ(i, t)) ≥ Utility_j(δ(j, t))
• Only A ⇒ agree on δ(B, t) – A agrees with B's proposal
• Only B ⇒ agree on δ(A, t) – B agrees with A's proposal
• Both A and B ⇒ agree on δ(k, t) such that π(δ(k)) = max(π(δ(A)), π(δ(B)))
• Both A and B with π(δ(A)) = π(δ(B)) ⇒ flip a coin (the product is the same, but the deals may not be the same for each agent – flip a coin to decide which deal to use)
(Applies to both pure deals and mixed deals.)
47
The Monotonic Concession Protocol ndash One direction move towards middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds.
• On round 1, agents simultaneously propose a deal from the negotiation set (an agent may re-propose the same one).
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal.
• If no agreement is reached, negotiation proceeds to another round of simultaneous proposals.
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round. It can either stand still or make a concession. (This assumes we know what the other agent values.)
• If neither agent makes a concession in some round, negotiation terminates with the conflict deal.
• Meta-data: an explanation or critique of the deal.
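A runnable sketch of these rules under one simple (assumed) strategy: each agent concedes one step toward the other every round. The deals and utilities are invented; with this always-concede strategy agreement is guaranteed, whereas in the real protocol a round with no concession would end in the conflict deal.

```python
def mcp(deals, u1, u2):
    """Monotonic Concession Protocol sketch.
    deals is ordered from agent 1's favourite to agent 2's favourite;
    u1/u2 map a deal to that agent's utility."""
    i, j = 0, len(deals) - 1            # each agent's current offer
    while True:
        if u1(deals[j]) >= u1(deals[i]):   # 2's offer is good enough for 1
            return deals[j]
        if u2(deals[i]) >= u2(deals[j]):   # 1's offer is good enough for 2
            return deals[i]
        i, j = i + 1, j - 1                # both concede one step

# Invented pure deals written as (utility to agent 1, utility to agent 2)
deals = [(10, 0), (3, 1), (2, 2), (1, 3), (0, 10)]
agreement = mcp(deals, lambda d: d[0], lambda d: d[1])
```

Starting from opposite ends, the agents meet in the middle at the (2, 2) deal.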
48
Condition to Consent an Agreement
If each agent finds that the deal proposed by the other is at least as good as or better than the proposal it made:
Utility1(δ2) ≥ Utility1(δ1) and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages:
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (This is a way of telling others what you value.)
[Diagram: negotiation range with agent 1's best deal at one end and agent 2's best deal at the other.]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Diagram: negotiation range between agent 1's best deal and agent 2's best deal, with each agent asking "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero).
– In case conflict occurs, you are not much worse off.
– So you are more willing to risk conflict.
An agent's willingness to risk conflict is measured by the difference between its loss from making a concession and its loss from taking the conflict deal, relative to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) ÷ (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer.
• How much you will lose if you stand still, which causes a conflict.
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk conflict).
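The formula transcribes directly into code; the zero-denominator convention (an agent whose current offer is worth 0 has nothing to lose, so its risk is 1) is an assumption of this sketch.

```python
def risk(u_own_offer, u_their_offer):
    """Zeuthen risk: (U_i(own offer) - U_i(their offer)) / U_i(own offer).
    Returns 1.0 when the own offer is worth nothing (nothing to lose)."""
    if u_own_offer == 0:
        return 1.0
    return (u_own_offer - u_their_offer) / u_own_offer

# Example: my current offer gives me 4, accepting yours would give me 1,
# so my risk is (4 - 1) / 4 = 0.75.  The agent with the LOWER risk concedes.
```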
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Diagram: negotiation range between Ai's best deal and Aj's best deal, showing the conflict deal, "How much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum still hoped for.]
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again in the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium: if one agent is using the strategy, the other can do no better than use it him/herself
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 must deliver to a; agent 2 must deliver to a and b)

Negotiation set: ({a}, {b}), ({b}, {a}), (∅, {a,b})

Utility of agent 1:
  Utility1({a}, {b}) = 0
  Utility1({b}, {a}) = 0
  Utility1(∅, {a,b}) = 1

Utility of agent 2:
  Utility2({a}, {b}) = 2
  Utility2({b}, {a}) = 2
  Utility2(∅, {a,b}) = 0

First offers: agent 1 proposes (∅, {a,b}); agent 2 proposes ({a}, {b}). Each agent's risk of conflict is 1.
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: agent 1's best deal and agent 2's best deal, each agent insisting the other should concede.]
Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Diagram: distribution point with edges of length 7 to nodes a and d, and a path a–b–c–d with edges of length 1.]
Cost function:
  c(∅) = 0
  c(a) = c(d) = 7
  c(b) = c(c) = c(ab) = c(cd) = 8
  c(bc) = c(abc) = c(bcd) = 9
  c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: ({a,b,c,d}, ∅), ({a,b,c}, {d}), ({a,b}, {c,d}), ({a}, {b,c,d}), (∅, {a,b,c,d})
Conflict deal: ({a,b,c,d}, {a,b,c,d})
All choices are individually rational, as neither agent can do worse; ({a,c}, {b,d}) is dominated by ({a,b}, {c,d}).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal          | Agent 1's utility | Agent 2's utility
1   | ({a,b,c,d}, ∅)     | 0                 | 10
2   | ({a,b,c}, {d})     | 1                 | 3
3   | ({a,b}, {c,d})     | 2                 | 2
4   | ({a}, {b,c,d})     | 3                 | 1
5   | (∅, {a,b,c,d})     | 10                | 0
    | Conflict deal      | 0                 | 0
Concession path: agent 1 starts at deal 5 and agent 2 at deal 1; both concede step by step until they meet at deal 3.
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of the (0, 10) outcome under another choice.
• Is there a solution?
• Fair versus higher global utility.
• Restrictions of this method (no promises for the future, and no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect clear(black).
• Negotiation: develop joint plans and schedules for the agents, so that they help and do not hinder each other.
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) means two workers can get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD coordinated plans are never worse, as you can just do your original task.
• With an SOD you may get in each other's way.
• Don't accept partially completed plans.
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 over a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs a task).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We look for a state in the intersection of the goals.
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost).
  – The goals may contradict, so there is no mutually acceptable state (e.g. both need the car).
  – A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but we may need a way to pick the other up).
  – There might be a reachable state which satisfies both, but it may be too expensive – the agents are unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall, but State 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain role in the plan.
• This is called a mixed deal – a deal with a probability attached. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• Postman example: (see the next slide)
72
Parcel Delivery Domain (assuming do not have to return home)
[Diagram: distribution point with edges of length 1 to city a and city b, and distance 2 between the two cities.]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with probability
• (∅, {a,b}) : p means agent 1 does nothing with probability p and delivers {a,b} with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
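The arithmetic above can be checked mechanically: with expected utilities linear in p, the fair p solves a single linear equation. The function name and argument order here are my own.

```python
from fractions import Fraction

def fair_p(a1, b1, a2, b2):
    """Solve p*a1 + (1-p)*b1 == p*a2 + (1-p)*b2 for p.
    (a1, b1): agent 1's utilities in the two outcomes;
    (a2, b2): agent 2's utilities in the same two outcomes.
    Returns None when the utilities can never be equalized."""
    denom = a1 - b1 - a2 + b2
    if denom == 0:
        return None
    return Fraction(b2 - b1, denom)

# Deal 3 above: with probability p agent 1 idles (utility 1, agent 2 gets 0);
# otherwise agent 1 delivers both (utility -2, agent 2 gets 3).
p = fair_p(1, -2, 0, 3)
```

The same helper shows why the ({a}, {b}) : p deal on the next slide has no fair p: both of agent 1's outcomes are worth 0 and both of agent 2's are worth 2, so the equation degenerates.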
74
Try again with other choice in negotiation set
• ({a}, {b}) : p means agent 1 delivers a with probability p and b with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = 0 = utility for agent 1
• (1−p)(2) + p(2) = 2 = utility for agent 2
• 0 = 2 has no solution.
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal δ_m = [(T_A ∪ T_B, ∅) : p] such that NS(δ_m) = max_δ NS(δ).
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the mechanism that maximizes the product of the utilities (as this is a fairer division). Try dividing a total utility of 10 (zero-sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
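A two-line check of the "divide a total utility of 10" exercise: the product u·(10−u) peaks at the even split, which is why the product-maximizing mechanism is the fairer choice.

```python
# Enumerate all integer splits of a total utility of 10 between two agents
# and pick the split whose product of utilities is largest.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
```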
77
Examples, Cooperative: each agent is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each agent 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples, Compromise: both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at slot 1, the black block at slot 2, and two gray blocks at slot 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner or roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth of 6 to the final goal, we can use probability to make it "fair".
• Assign the (2, 6) division (A1 takes the cheap role) p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3) (it loses utility if it takes cost 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p ⇒ p = 7/8
• Thus an unfair division can be made fair.
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• Both can't win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to let the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Examplesemi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (also pulled out of the air).
• Idea: work together to do the swap, then flip a coin to see who gets his way for the rest.
83
Example semi-cooperative cont
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as it helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(−4) = 4
• If both could have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
  – E: set of possible environment states
  – Ag: set of possible agents
  – J: set of possible joint plans
  – c: cost of executing each plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation; agents may reach a state that is a little worse than the ultimate objective.
• Example – the multi-agent tileworld (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality.
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute:
  – Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute:
  – price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  – price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum
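A sketch of the weighted-attribute scheme. The weights match the slide; the candidate offers and the "larger is better" normalization (price entered as a cheapness score in [0, 1]) are assumptions of the example.

```python
def utility(offer, weights):
    """Weighted sum of normalized attribute scores (all in [0, 1])."""
    return sum(weights[attr] * value for attr, value in offer.items())

weights = {"cheapness": 0.60, "quality": 0.15, "support": 0.25}
offers = [
    {"cheapness": 0.9, "quality": 0.2, "support": 0.4},   # cheap, low quality
    {"cheapness": 0.5, "quality": 0.9, "support": 0.8},   # pricier, high quality
]
scores = [round(utility(o, weights), 3) for o in offers]
```

With price weighted at 60%, the cheap offer wins here even though the other dominates on quality and support; different weights would reverse the choice.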
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
  – Exchange missing information
  – Penalty for lying
• Possible lies:
  – False information:
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
  for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one location saves distance to the other (in a tree arrangement).
• Example of a subadditive TOD with equality (= rather than <): deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  – You need to pick something up at the store. (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
  – You need to deliver an empty letter. (It does no good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• When another set of tasks Z is introduced:
  c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows go between domains). The rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth-telling more than the others.
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y. Expect Z to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless it is restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database queries:
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The fax domain:
• Agents send faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• Agents can exchange messages to be faxed.
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the fax domain is modular (as the costs are independent).
• Modularity implies concavity.
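The three attributes – subadditive, concave, modular – can be brute-force checked on small task sets. The two example cost tables below (a fax-style independent-cost domain and a shared-route "tree" domain) are invented for illustration:

```python
from itertools import combinations

def subsets(tasks):
    return [frozenset(c) for r in range(len(tasks) + 1)
            for c in combinations(tasks, r)]

def is_subadditive(c, tasks):
    S = subsets(tasks)
    return all(c[x | y] <= c[x] + c[y] for x in S for y in S)

def is_concave(c, tasks):
    # For every X subset of Y, Z adds no more to Y than it adds to X.
    S = subsets(tasks)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in S for y in S if x <= y for z in S)

def is_modular(c, tasks):
    S = subsets(tasks)
    return all(c[x | y] == c[x] + c[y] - c[x & y] for x in S for y in S)

tasks = {"a", "b"}
# Fax-style domain: each task costs 3 independently -> modular.
fax = {s: 3 * len(s) for s in subsets(tasks)}
# Tree-style delivery: the two routes share a leg -> subadditive, not modular.
tree = {frozenset(): 0, frozenset("a"): 2, frozenset("b"): 2,
        frozenset("ab"): 3}
```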
101
3-dimensional table of the characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute.
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Ex: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2, expected utility ({a,b,c}, ∅) : 1/2 = 5.
• Under the lie: p = 1/2 (as the apparent utility is the same).
• Expected utility (for agent 1): ({a,b,c}, ∅) : 1/2 = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter).
[Diagram: delivery graph with edge weights 1, 4, 4, 1.]
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out).
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular domains tend to add the exact cost – it is hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) — just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks  U(1)   Agent 2's tasks  U(2) (seems)  U(2) (actual)
a                2      bc               4             4
b                4      ac               2             2
bc               2      a                4             2
ab               0      c                6             6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility has risen from 3 to 4
• Truth: p = 1/2, so utility for agent 1 of ⟨(ab), ∅⟩ at 1/2 is ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as a pure deal has no probability; utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: "If you deliver, I make you deliver to h")
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is); his utility has risen from 1.5 to 31/18 ≈ 1.72 (if I deliver, I don't deliver to h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
  p(−1) + (1−p)(6) = p(4) + (1−p)(−3), i.e., 14p = 9
• If he invents task h, p = 11/18, as
  p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• SO — lying helped
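The two values of p above come from solving linear indifference equations. A small Python check (the helper name is mine) reproduces the slide's numbers exactly:

```python
from fractions import Fraction as F

def indifference_p(u1_all, u1_none, u2_all, u2_none):
    """Solve p*u1_all + (1-p)*u1_none == p*u2_all + (1-p)*u2_none for p.
    Here 'all' means agent 1 delivers everything, 'none' means agent 2 does."""
    denom = (u1_all - u1_none) - (u2_all - u2_none)
    return F(u2_none - u1_none, denom)

p_truth = indifference_p(-1, 6, 4, -3)   # truth-telling case
p_lie = indifference_p(-3, 6, 4, -5)     # decoy-letter case
u1 = lambda p: p * F(-1) + (1 - p) * 6   # agent 1's true expected utility

assert p_truth == F(9, 14) and p_lie == F(11, 18)
assert u1(p_truth) == F(3, 2)            # 1.5 under the truth
assert u1(p_lie) == F(31, 18)            # ~1.72 under the decoy lie
```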
107
Postmen — return to the post office
[Figure: concave case; subadditive case (h is a decoy); phantom case]
108
Non incentive compatible fixed points
• FP7: in Modular TOD, for any ONM over Pure deals, "hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 — UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(⟨(fae, bcd), 1/2⟩) = 4 (saves going to two)
• Under the lie, divide as ⟨(efd, cab), p⟩ (you always win and I always lose; since the work is the same, swapping cannot help — in a mixed deal the choices must be unbalanced)
• Try again under the lie with ⟨(ab, cdef), p⟩:
  p(4) + (1−p)(0) = p(2) + (1−p)(6)
  4p = −4p + 6
  p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned cdef ¼ of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) — zero utility for that
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER — case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposal
  – Generation of counter proposal based on feedback from dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law — cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate solution for applicability to current case
  – Modify the solution appropriately
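As an illustration only, the retrieve step can be sketched as nearest-case lookup over a toy case base (the cases, features, and similarity metric below are invented; PERSUADER's real memory and metrics are far richer):

```python
# Toy case base for the labor-relations domain; the contract texts
# are invented for illustration.
CASES = [
    {"features": {"industry": "transit", "issue": "wages"},
     "solution": "3% raise phased over two years"},
    {"features": {"industry": "transit", "issue": "pensions"},
     "solution": "employer match raised to 5%"},
    {"features": {"industry": "steel", "issue": "wages"},
     "solution": "2% raise plus profit sharing"},
]

def similarity(problem, case_features):
    """Conceptual similarity: here just a count of matching features."""
    return sum(1 for k, v in problem.items() if case_features.get(k) == v)

def retrieve(problem):
    """Step 1 of the process: retrieve the most similar precedent case."""
    return max(CASES, key=lambda c: similarity(problem, c["features"]))

precedent = retrieve({"industry": "transit", "issue": "wages"})
assert precedent["solution"] == "3% raise phased over two years"
```

The selected precedent would then be adapted and evaluated, per the remaining steps of the process.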
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Repair for a past failure is reused; reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory — 30 more proposals
• Without argumentation — fewer proposals and better solutions
• No failure avoidance — more proposals with objections
• No preference analysis — oscillatory condition
• No feedback — communication overhead increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs, 1400hrs, and so on.
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
[Plot: worth function for the first agent — worth ranging from 0 to 100 over meeting times 9:00, 12:00, and 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility of Agent i and Agent j versus number of negotiation rounds, converging to the point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility of Agent i and Agent j versus number of negotiation rounds; the curves never cross]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified — why did the agent pay so much for the car?
  – Positions cannot be changed — initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
  1. Logical — "If you accept A and accept A implies B, then you must accept B"
  2. Emotional — "How would you feel if it happened to you?"
  3. Visceral — participant stamps their feet and shows the strength of their feelings
  4. Kisceral — appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
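The "Sentence can be proved from Grounds" condition can be illustrated with a tiny forward-chaining prover over definite clauses (a propositional simplification I am adding; real argumentation systems use richer logics):

```python
def proves(grounds, sentence):
    """Forward chaining over definite clauses (premises, conclusion).
    A fact is a clause with no premises."""
    known = set()
    changed = True
    while changed:
        changed = False
        for premises, conclusion in grounds:
            if conclusion not in known and all(p in known for p in premises):
                known.add(conclusion)
                changed = True
    return sentence in known

# The cheese argument: the grounds support the conclusion "good(cheese)"
grounds = [
    ((), "good(milk)"),
    ((), "madefrom(cheese, milk)"),
    (("good(milk)", "madefrom(cheese, milk)"), "good(cheese)"),
]
assert proves(grounds, "good(cheese)")
assert not proves(grounds, "good(chalk)")
```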
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B ≡ A →u B or A →r B
  – A defeats B ≡ A →u B or (A →r B and not B →u A)
  – A strongly attacks B ≡ A →a B and not B →u A
  – A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption: an argument is true unless proven false
131
Admissible Arguments — mutually defensible
1. argument x is attacked by a set S if some member y of S attacks x (y → x)
2. argument x is acceptable with respect to S if every attacker of x is attacked by S
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be "in"? c is always attacked;
d is always acceptable
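Since the attack-graph figure is lost, here is a Python sketch over an assumed relation consistent with the slide's claims (my assumption: a and b attack each other, both attack c, and c attacks d); it enumerates the admissible sets using the definitions above:

```python
from itertools import chain, combinations

ARGS = {"a", "b", "c", "d"}
# Assumed attack relation (the original figure is lost)
ATTACKS = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    """No member of s attacks another member of s."""
    return not any((x, y) in ATTACKS for x in s for y in s)

def acceptable(arg, s):
    """Every attacker of arg is itself attacked by some member of s."""
    return all(any((z, y) in ATTACKS for z in s)
               for y in ARGS if (y, arg) in ATTACKS)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [set(c) for c in chain.from_iterable(
    combinations(sorted(ARGS), r) for r in range(len(ARGS) + 1))]
adm = [s for s in subsets if admissible(s)]

assert all("c" not in s for s in adm)            # c is always attacked
assert {"a", "d"} in adm and {"b", "d"} in adm   # d is always defensible
```

Under this assumed graph, c never appears in an admissible set (defending it would require both a and b, which conflict), while d can always be defended by whichever of a or b is in.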
133
An Example Abstract Argument System
47
The Monotonic Concession Protocol — one direction: move towards the middle
Rules of this protocol are as follows:
• Negotiation proceeds in rounds
• On round 1, agents simultaneously propose a deal from the negotiation set (and may re-propose the same one later)
• Agreement is reached if one agent finds that the deal proposed by the other is at least as good as or better than its own proposal
• If no agreement is reached, then negotiation proceeds to another round of simultaneous proposals
• An agent is not allowed to offer the other agent less (in terms of utility) than it did in the previous round; it can either stand still or make a concession (this assumes we know what the other agent values)
• If neither agent makes a concession in some round, then negotiation terminates with the conflict deal
• Metadata: explanation or critique of the deal
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal each made:
Utility1(δ2) ≥ Utility1(δ1)
and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages
  – Agents can run into conflicts
  – Inefficient — no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy — a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
[Diagram: agent 1's best deal … agent 2's best deal]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Diagram: agent 1's best deal … agent 2's best deal, each asking "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict is measured by the difference in utility between what it loses by making a concession and what it loses by taking the conflict deal, relative to its current offer.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer)
         / (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agents i and j respectively.
risk_i is willingness to risk conflict (1 is perfectly willing to risk).
55
Risk Evaluation
• risk measures the fraction of utility you have left to gain; if it is close to one, you have gained little (and are more willing to risk conflict)
• This assumes you know the other agent's utility function
• What you set as your initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
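The risk formula is a one-liner in code; a sketch (with my convention, matching the note above, that a zero-utility offer means nothing left to lose, so risk = 1):

```python
def risk(utility_own_offer, utility_their_offer):
    """Zeuthen risk: the fraction of its current utility an agent would
    lose by accepting the opponent's offer instead of standing firm.
    Convention (mine): an offer worth 0 means nothing to lose, risk = 1."""
    if utility_own_offer == 0:
        return 1.0
    return (utility_own_offer - utility_their_offer) / utility_own_offer

assert risk(10, 2) == 0.8   # most utility still at stake: high willingness
assert risk(4, 4) == 0.0    # their offer is as good as mine: concede freely
assert risk(0, 5) == 1.0
```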
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Diagram: the span from Ai's best deal to Aj's best deal, marking the conflict deal, how much each is willing to risk a conflict, the maximum to gain from agreement, and the maximum each can still hope to gain]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) — otherwise it will just be your turn to concede again in the next round — but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
  – Simple and reflects the way human negotiations work
  – Stability — in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages
  – Computationally expensive — players need to compute the entire negotiation set
  – Communication burden — the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)

Negotiation set: (a, b), (b, a), (∅, ab)
First offers: Agent 1 proposes (∅, ab); Agent 2 proposes (a, b)

Utility of agent 1:          Utility of agent 2:
Utility1(a, b) = 0           Utility2(a, b) = 2
Utility1(b, a) = 0           Utility2(b, a) = 2
Utility1(∅, ab) = 1          Utility2(∅, ab) = 0

Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: agent 1's best deal … agent 2's best deal — each thinks the other should concede]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: distribution point with nodes a and d at distance 7, and nodes b and c connected by edges of length 1]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10

Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse; (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)
No.  Pure deal       Agent 1's utility  Agent 2's utility
1    (abcd, ∅)       0                  10
2    (abc, d)        1                  3
3    (ab, cd)        2                  2
4    (a, bcd)        3                  1
5    (∅, abcd)       10                 0
     Conflict deal   0                  0
[Diagram: deals 5, 4, 3, 2, 1 ordered from agent 1's best to agent 2's best]
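Using the utility table above, the Zeuthen strategy can be simulated. In this sketch (mine, with the simplification that a concession is one step along the deal ordering rather than "just enough to flip the risk balance"), both agents concede twice on equal risk and meet at deal 3, the (2, 2) outcome:

```python
from fractions import Fraction as F

# Agent utilities for pure deals 1..5 from the table (index 0..4)
U1 = [0, 1, 2, 3, 10]
U2 = [10, 3, 2, 1, 0]

def risk(u_own, u_other):
    """Willingness to risk conflict given your offer and theirs."""
    return F(1) if u_own == 0 else F(u_own - u_other, u_own)

def zeuthen(u1, u2):
    """Run MCP with Zeuthen concessions; return the accepted deal index."""
    i, j = len(u1) - 1, 0          # each agent opens with its own best deal
    while True:
        # accept if the opponent's offer is at least as good as your own
        if u1[j] >= u1[i]:
            return j
        if u2[i] >= u2[j]:
            return i
        r1 = risk(u1[i], u1[j])    # agent 1's willingness to risk conflict
        r2 = risk(u2[j], u2[i])
        if r1 <= r2:
            i -= 1                 # the less risk-willing agent concedes
        if r2 <= r1:
            j += 1                 # on equal risk, both concede

assert zeuthen(U1, U2) == 2        # deal 3: (ab, cd) with utilities (2, 2)
```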
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the expected utility of (0, 10) from another choice
• Is there a better solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises about the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy the other can do no better than use it himself
• Generally Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy; in fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects — an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example — slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" — just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• C(J)_k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state oriented domain is a bit more powerful than TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform all tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as the set of states that it would be happy with
• We are looking for a state in the intersection of the goals
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – Goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
  – A common state exists, but perhaps it cannot be reached with the primitive operations of the domain (they could both travel together, but may need to know how to pick up the other)
  – There might be a reachable state which satisfies both, but it may be too expensive and neither is willing to expend the effort (i.e., we could save a bit by car-pooling, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent will get a certain plan
• This is called a mixed deal — a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = ⟨J, p⟩ is a deal, then
  cost_i(δ) = p·C(J)_i + (1−p)·C(J)_k, where k is i's opponent — the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming do not have to return home)
[Figure: distribution point with city a and city b, each at distance 1; a and b are distance 2 apart]
Cost function:
c(∅) = 0
c(a) = 1
c(b) = 1
c(ab) = 3

Utility for agent 1 (originally a):    Utility for agent 2 (originally ab):
1. Utility1(a, b) = 0                  1. Utility2(a, b) = 2
2. Utility1(b, a) = 0                  2. Utility2(b, a) = 2
3. Utility1(ab, ∅) = −2                3. Utility2(ab, ∅) = 3
4. Utility1(∅, ab) = 1                 4. Utility2(∅, ab) = 0
73
Consider deal 3 with probability
• ⟨(∅, ab), p⟩ means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
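The same linear solve works for any pair of role assignments; here is a small helper (the name is mine) that also reports when no fair p exists, the situation the next slide runs into:

```python
from fractions import Fraction as F

def fair_p(u1_if_p, u1_other, u2_if_p, u2_other):
    """p equating expected utilities:
    p*u1_if_p + (1-p)*u1_other == p*u2_if_p + (1-p)*u2_other."""
    denom = (u1_if_p - u1_other) - (u2_if_p - u2_other)
    if denom == 0:
        return None                # utilities never cross: no fair p
    p = F(u2_other - u1_other, denom)
    return p if 0 <= p <= 1 else None

# Deal 3: with probability p agent 1 does nothing, else everything
assert fair_p(1, -2, 0, 3) == F(5, 6)
# Deal (a, b): the utilities are constant 0 and 2, so no p equalizes them
assert fair_p(0, 0, 2, 2) is None
```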
74
Try again with other choice in negotiation set
• ⟨(a, b), p⟩ means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = ⟨(T_A ∪ T_B, ∅), p⟩ such that NS(m) = max over deals d of NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as this is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias — like switching goals in soccer
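The suggested exercise is quick to run: over integer splits of a total utility of 10, the product u1·u2 peaks at the even split, which is why product-maximizing reads as "fairer":

```python
# Zero-sum split of 10 units of utility between two agents: the
# product u1 * u2 is maximized by the even 5/5 split.
splits = [(x, 10 - x) for x in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
assert best == (5, 5)
```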
77
Examples: Cooperative — each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (the goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial: each can pick up at the same time, costing each 2 — a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be 4, so the utility to each is 2
78
Examples: Compromise — both can succeed, but each is worse off than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) — but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility the same way, as the other agent's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair"
• Assign the division (2, 6) — A1 takes the cost-2 role, A2 the cost-6 role — p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) — it loses utility if it pays cost 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
  4p − 3 = 4 − 4p
  p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins — better than both losing. The weighting of the coin needn't be 50-50
• It may make sense to have the agent with the higher worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 2 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again pulled out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as it helped with the swap)
• So with probability ½ each: ½(12) + ½(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans — a joint plan is executed by several different agents
• Note — not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing each plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation is over joint plans, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
• Example — multi-agent tile world (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support — a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
  – Utility = price·0.60 + quality·0.15 + support·0.25
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum
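A sketch combining the weights and constraint ranges above; normalizing each attribute to [0, 1] over its range, and treating price as a cost (scoring its complement), are my assumptions rather than something the slides specify:

```python
def utility(price, quality, support, weights=(0.60, 0.15, 0.25)):
    """Weighted multi-attribute utility. Each attribute is normalized to
    [0, 1] over its constraint range; price is a cost, so its score is
    the complement."""
    price_score = 1 - (price - 5) / (100 - 5)   # price in [5, 100]
    quality_score = quality / 10                 # quality in [0, 10]
    support_score = (support - 1) / (5 - 1)      # support in [1, 5]
    w_p, w_q, w_s = weights
    return w_p * price_score + w_q * quality_score + w_s * support_score

cheap_good = utility(price=20, quality=8, support=4)
pricey_poor = utility(price=90, quality=4, support=2)
assert cheap_good > pricey_poor
```

With price weighted at 0.60, a cheap offering with decent quality and support dominates an expensive one with poor scores, as the assertion checks.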
89
Incomplete Information
• Agents don't know the tasks of others in TOD
• Solution:
  – Exchange missing information
  – Penalty for lying
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is less than or equal to the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one location saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with = rather than <: deliveries in opposite directions — doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
• "Need to pick something up at the store" (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• "Need to deliver an empty letter" (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another task set Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know these facts (diagonal arrows go between domains); the rule at an arrow's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one side of the division is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as sub-additive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modularity encourages truth-telling more than the other attributes
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as the agent was going that way anyway) but adds 2 to its superset Y (as the agent was going around the loop).
• Concavity implies sub-additivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
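The implication chain (modular ⇒ concave ⇒ subadditive) can be checked exhaustively on a small fax-domain-style cost function, where cost is one unit per distinct destination so costs are independent. This is an illustrative sketch, not taken from the slides:

```python
from itertools import combinations

def c(tasks):
    # Fax-domain-style cost: one unit per distinct destination.
    return len(tasks)

universe = frozenset("abc")
subsets = [frozenset(s) for r in range(len(universe) + 1)
           for s in combinations(sorted(universe), r)]

modular = all(c(X | Y) == c(X) + c(Y) - c(X & Y)
              for X in subsets for Y in subsets)
concave = all(c(Y | Z) - c(Y) <= c(X | Z) - c(X)
              for X in subsets for Y in subsets if X <= Y
              for Z in subsets)
subadditive = all(c(X | Y) <= c(X) + c(Y)
                  for X in subsets for Y in subsets)

# A modular cost function satisfies all three properties.
print(modular, concave, subadditive)
```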
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over A-or-N deals – "hiding" lies are not beneficial
• Ex: A1 hides the letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected utility: ((abc, ∅); 1/2) = 5
• Under the lie, p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1) under the lie: ((abc, ∅); 1/2) = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
103
• FP2: in Subadditive TOD, any ONM over Mixed deals – every "phantom" lie has a positive probability of being discovered (as if the other agent is assigned the phantom delivery, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals – no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals – no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)

Agent 1   U(1)   Agent 2   U(2) (seems)   U(2) (actual)
a         2      bc        4              4
b         4      ac        2              2
bc        2      a         4              2
ab        0      c         6              6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals – "phantom" lies can be beneficial
• Example from the next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is ((ab, ∅); 1/2) = 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage; the utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
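The FP5 arithmetic above, spelled out (utilities as given on the slide):

```python
from fractions import Fraction

# Truth: the all-or-nothing deal over (ab) is split at p = 1/2,
# giving agent 1 expected utility 1/2*4 + 1/2*2 = 3.
p = Fraction(1, 2)
truth_utility = p * 4 + (1 - p) * 2

# Phantom lie: the resulting pure deal assigns agent 1 a role of cost 2
# against a standalone cost of 6, for utility 6 - 2 = 4.
lie_utility = 6 - 2

print(truth_utility, lie_utility)  # 3 4 -> the phantom lie is beneficial
```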
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals – "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to about 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(−1) + (1−p)6 = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as
• p(−3) + (1−p)6 = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)6 = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)6 = −11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
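The two indifference equations on this slide can be solved mechanically; this sketch reproduces p = 9/14, p = 11/18, and the real utilities 21/14 and 31/18:

```python
from fractions import Fraction

def equalizing_p(a_win, a_lose, b_win, b_lose):
    # Solve p*a_win + (1-p)*a_lose == p*b_win + (1-p)*b_lose for p.
    return Fraction(b_lose - a_lose, (a_win - a_lose) - (b_win - b_lose))

# Truth: p(-1) + (1-p)6 == p(4) + (1-p)(-3)
p_truth = equalizing_p(-1, 6, 4, -3)          # 9/14
# Decoy lie (invented task h): p(-3) + (1-p)6 == p(4) + (1-p)(-5)
p_lie = equalizing_p(-3, 6, 4, -5)            # 11/18

# Agent 1's REAL payoffs are -1 (deliver all) and 6 (deliver nothing) either way.
u_truth = p_truth * (-1) + (1 - p_truth) * 6  # 21/14 = 1.5
u_lie = p_lie * (-1) + (1 - p_lie) * 6        # 31/18 ~ 1.72

print(p_truth, p_lie, float(u_truth), float(u_lie))
```

The decoy shifts the split in agent 1's favor, so the lie pays.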
107
Postmen – return to post office
[Figures: concave example; subadditive example (h is the decoy); phantom example]
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals – "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR under the lie
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals – "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((fae, bcd); 1/2) = 4 (each saves going to two nodes)
• Under the lie, dividing as ((efd, cab); p) means you always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with ((ab, cdef); p):
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed deliveries), so I end up going to 5 places – which is what I was assigned originally. Zero utility for that.
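The FP8 probability and the 4.5 can be checked the same way:

```python
from fractions import Fraction

# Apparent utilities under the hide lie, deal ((ab, cdef); p):
# agent 1: p*4 + (1-p)*0,  agent 2: p*2 + (1-p)*6.  Equalizing gives 8p = 6.
p = Fraction(6, 8)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6

# Agent 1's REAL utility: when it wins (probability p) it actually saves 6,
# not 4, because the hidden letter to a lies along its agreed route; when it
# loses, it must still deliver to a afterwards, for zero utility.
real_utility = p * 6 + (1 - p) * 0
print(p, float(real_utility))  # 3/4 4.5
```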
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: a negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – a case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, which reduces computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Changing the importance attached to an issue
– Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she also assigns progressively smaller worths to a meeting at 15:00, 14:00, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Graph: worth function for the first agent – worth rises from 0 to 100 as the meeting time moves from 9:00 through 12:00 to 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's offers converge over time to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's offers never cross, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
• Grounds ⊆ Database
• Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A →a B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
129
Proposition: hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x)
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
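Since the slide's attack graph survives only as a description ("c is always attacked; d is always acceptable"), here is a sketch with a hypothetical graph chosen to match that description, plus a brute-force admissibility check:

```python
from itertools import combinations

# Hypothetical attack graph (the original figure is lost): a and b attack each
# other and both attack c; d is unattacked.
arguments = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}

def conflict_free(S):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    # Every attacker of x must itself be attacked by some member of S.
    attackers = [y for (y, z) in attacks if z == x]
    return all(any((z, y) in attacks for z in S) for y in attackers)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

subsets = [set(c) for r in range(len(arguments) + 1)
           for c in combinations(sorted(arguments), r)]
print([sorted(S) for S in subsets if admissible(S)])
```

With this graph, c appears in no admissible set (it cannot defend against both a and b), while d may be added to any admissible set since it is unattacked.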
133
An Example Abstract Argument System
48
Condition to Consent an Agreement
If both agents find that the deal proposed by the other is at least as good as or better than the proposal they made:
Utility1(δ2) ≥ Utility1(δ1), and
Utility2(δ1) ≥ Utility2(δ2)
49
The Monotonic Concession Protocol
• Advantages:
– Symmetrically distributed (no agent plays a special role)
– Ensures convergence
– It will not go on indefinitely
• Disadvantages:
– Agents can run into conflicts
– Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set (this is a way of telling others what you value)
[Diagram: agent 1's best deal at one end of the negotiation space, agent 2's best deal at the other]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Diagram: agent 1's best deal and agent 2's best deal at opposite ends, each asking "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent is more willing to risk conflict when there is little difference between its loss from making a concession and its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk conflict).
55
Risk Evaluation
• risk_i measures the fraction of utility you still stand to gain: if it is close to one, you have gained little so far (and are more willing to risk conflict)
• This assumes you know the other agents' utilities
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk conflict is always higher
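A sketch of the risk comparison. The zero-utility convention and the use of the slide-62 offers (a, bcd) vs. (abc, d) as the current proposals are assumptions drawn from the surrounding slides:

```python
def risk(u_own, u_other):
    # risk_i = (U_i(own offer) - U_i(other's offer)) / U_i(own offer).
    # Convention: an agent whose own offer is worth nothing to it has nothing
    # to lose from conflict, so its risk is 1.
    if u_own <= 0:
        return 1.0
    return (u_own - u_other) / u_own

def who_concedes(u1_own, u1_other, u2_own, u2_other):
    # The agent with strictly smaller willingness to risk conflict concedes;
    # on a tie, both concede.
    r1, r2 = risk(u1_own, u1_other), risk(u2_own, u2_other)
    if r1 < r2:
        return "agent 1"
    if r2 < r1:
        return "agent 2"
    return "both"

# Slide-62 offers: agent 1 proposes (a, bcd) -> utilities (3, 1);
# agent 2 proposes (abc, d) -> utilities (1, 3). Risks are equal (2/3 each).
print(who_concedes(3, 1, 3, 1))  # both
```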
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Diagram: A_i's best deal and A_j's best deal with the conflict deal between them, annotated with "How much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum still hoped for]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: agent 1's best deal and agent 2's best deal, each agent saying "He should concede"]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: distribution point with branches to a and d at distance 7; b and c connected by unit-length edges]
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than acting alone; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)
No.  Pure deal       Agent 1's utility   Agent 2's utility
1    (abcd, ∅)       0                   10
2    (abc, d)        1                   3
3    (ab, cd)        2                   2
4    (a, bcd)        3                   1
5    (∅, abcd)       10                  0
     Conflict deal   0                   0
[Diagram: deals ordered 5, 4, 3, 2, 1 from agent 1's best to agent 2's best]
63
What bothers you about the previous agreement?
• They decide to both get (2, 2) utility rather than the expected utilities of (0, 10) or (10, 0) from another choice
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future and no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers can get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD, coordinated plans are never worse, as you can just do your original task
• With SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with
• We look for a state in the intersection of the goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
– We can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1: costs 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper overall (8 versus 10), but state 2 is more equal. How can we get cooperation? (Why should one agent agree to do more?)
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability with which each agent gets a certain role in the plan
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
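Using the costs from the previous slide (6/2 versus 5/5), a lottery over state 1's two roles beats state 2 for both agents:

```python
from fractions import Fraction

# State 1 costs (6, 2), split unevenly; state 2 costs (5, 5).
# Lottery over who takes state 1's expensive role:
# p*6 + (1-p)*2 == p*2 + (1-p)*6  ->  p = 1/2.
p = Fraction(1, 2)
agent1_cost = p * 6 + (1 - p) * 2
agent2_cost = p * 2 + (1 - p) * 6
print(agent1_cost, agent2_cost)  # 4 4: both prefer this to paying 5 in state 2
```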
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the deal
• Postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: distribution point with city a and city b at distance 1 on either side, and distance 2 between the cities]
Cost function: c(∅)=0; c(a)=1; c(b)=1; c(ab)=3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
73
Consider deal 3 with probability
• ((ab, ∅); p) means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p, so p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair
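Checking the p = 5/6 arithmetic:

```python
from fractions import Fraction

# Deal 3: agent 1 gets the empty set with probability p (utility 1) and both
# deliveries with probability 1-p (utility -2); agent 2's utilities are 0 and 3.
p = Fraction(5, 6)
u1 = (1 - p) * (-2) + p * 1
u2 = (1 - p) * 3 + p * 0
print(p, u1, u2)  # 5/6 1/2 1/2 -> equal expected utilities, so the deal is fair
```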
74
Try again with the other choice in the negotiation set
• ((a, b); p) means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1 = 0
• (1−p)(2) + p(2) = utility for agent 2 = 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair? (Both of agent 1's possible roles have the same utility, so mixing changes nothing.)
75
Mixed deal
• All-or-nothing deal (one agent does everything with some probability): a mixed deal of the form m = ((T_A ∪ T_B, ∅); p), such that NS(m) = max NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no agent is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero-sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
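The slide's exercise of dividing a total utility of 10 can be done by brute force; the product is maximized by the even split:

```python
# Divide a fixed total utility of 10 between two agents and compare products.
splits = [(x, 10 - x) for x in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best, best[0] * best[1])  # (5, 5) 25 -> the even split maximizes the product
```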
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2 (the goals are compatible).
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four, so the utility to each is 2.
Examples: Compromise – both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but this would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other agent's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair".
• Assign the (2, 6) cost split p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes the cost-6 role for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but it could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, continued
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as he helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing": what matters is how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
  – E: set of possible environment states
  – Ag: set of possible agents
  – J: set of possible joint plans
  – c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tileworld (like the airport shuttle) – it isn't just a specific state that counts, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
  – Utility = price(60%) + quality(15%) + support(25%)
• Rating/ranking each attribute
  – price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – price [5, 100], quality [0-10], support [1-5]
  – Try to find the Pareto optimum
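The weighted-attribute option can be sketched as a scoring function. This is a hedged illustration, not the slide's exact scheme: it assumes each attribute has been pre-scaled to [0, 1], with price inverted (1.0 = cheapest) so that bigger is always better.

```python
# Weighted-sum utility over multiple attributes (weights from the slide).
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    """Score an offer whose attribute values are normalized to [0, 1]."""
    return sum(WEIGHTS[attr] * offer[attr] for attr in WEIGHTS)

# Two hypothetical offers: a cheap adequate one and a pricey high-quality one.
cheap_ok  = {"price": 0.9, "quality": 0.5, "support": 0.4}
pricey_hq = {"price": 0.3, "quality": 1.0, "support": 0.9}
print(utility(cheap_ok), utility(pricey_hq))  # 0.715 0.555
```

With price weighted at 60%, the cheap offer wins despite the other's better quality and support.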
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
  – Exchange missing information
  – Penalty for lies
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• the cost of the union ≤ the sum of the costs of the separate sets – it adds to a sub-cost
• for finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of a subadditive TOD:
  – delivering to one location saves distance to the other (in a tree arrangement)
• Example of a subadditive TOD (= rather than <):
  – deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
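The definition can be checked exhaustively on a small task set. A minimal sketch with invented delivery costs (the 1.5 and 3 are illustrative only):

```python
# Check subadditivity, c(X ∪ Y) <= c(X) + c(Y), over all subset pairs.
from itertools import combinations

def powerset(s):
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def is_subadditive(tasks, c):
    return all(c[x | y] <= c[x] + c[y]
               for x in powerset(tasks) for y in powerset(tasks))

tasks = {"a", "b"}
fs = frozenset
# Tree-like delivery: visiting a shortens the trip to b.
cost = {fs(): 0, fs("a"): 1, fs("b"): 1, fs("ab"): 1.5}
print(is_subadditive(tasks, cost))  # True

cost[fs("ab")] = 3  # doing both costs more than the sum of the parts
print(is_subadditive(tasks, cost))  # False
```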
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  – Need to pick something up at a store (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip)
  – Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
  – c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). Rule: the beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice: modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), where X ⊆ Y
• The cost task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats – is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies sub-additivity
• Modularity implies concavity
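The implication chain stated above (modularity ⇒ concavity ⇒ subadditivity) can be verified numerically for a small cost table. A sketch with invented per-task prices; a modular cost function like the fax domain's is just additive:

```python
# Check modularity, concavity, and subadditivity on one cost function.
from itertools import combinations

TASKS = ("a", "b", "c")
fs = frozenset

def subsets(s):
    return [fs(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def is_modular(c):
    S = subsets(TASKS)
    return all(c[x | y] == c[x] + c[y] - c[x & y] for x in S for y in S)

def is_concave(c):
    S = subsets(TASKS)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in S for y in S if x <= y for z in S)  # x <= y: subset test

def is_subadditive(c):
    S = subsets(TASKS)
    return all(c[x | y] <= c[x] + c[y] for x in S for y in S)

# Independent per-task costs (hypothetical), so c is exactly modular.
price = {"a": 2, "b": 3, "c": 5}
c = {s: sum(price[t] for t in s) for s in subsets(TASKS)}
print(is_modular(c), is_concave(c), is_subadditive(c))  # True True True
```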
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of Characterization of Relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over All-or-Nothing deals, "hiding" lies are not beneficial
• Ex: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth: p = 1/2
• Expected util: ((abc), 1/2) = 5
• Lie: p = 1/2 (as the apparent utility is the same)
• Expected util (for 1): ((abc), 1/2) = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph for the example, with edge costs]
103
• FP2: in Subadditive TOD, any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other person delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals, no "decoy" lie is beneficial (as less added cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under truth, the utilities are (4, 2), and someone has to get the better deal (under a pure deal) – JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks   U(1)   Agent 2's tasks   U(2) (seems)   U(2) (actual)
a                 2      bc                4              4
b                 4      ac                2              2
bc                2      a                 4              2
ab                0      c                 6              6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is ((a, b), 1/2) = 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division, as a pure deal has no probabilities. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over All-or-Nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: "if you deliver, I make you deliver to h")
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as:
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
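The FP6 arithmetic above can be re-derived exactly with rationals. A sketch using only the payoff pairs given on the slide (`equalizing_p` is our own helper name):

```python
# In an all-or-nothing deal, p is set so both agents' APPARENT expected
# utilities are equal; agent 1's real utility is then evaluated at that p.
from fractions import Fraction

def equalizing_p(u1_all, u1_none, u2_if_u1_all, u2_if_u1_none):
    """Solve p*u1_all + (1-p)*u1_none == p*u2_if_u1_all + (1-p)*u2_if_u1_none."""
    num = Fraction(u2_if_u1_none - u1_none)
    den = Fraction((u1_all - u1_none) - (u2_if_u1_all - u2_if_u1_none))
    return num / den

# Truth:  p(-1) + (1-p)(6) = p(4) + (1-p)(-3)  ->  p = 9/14
p_truth = equalizing_p(-1, 6, 4, -3)
# Decoy letter to h:  p(-3) + (1-p)(6) = p(4) + (1-p)(-5)  ->  p = 11/18
p_lie = equalizing_p(-3, 6, 4, -5)

def true_utility(p):  # agent 1's REAL payoffs: -1 if it delivers all, else 6
    return p * (-1) + (1 - p) * 6

print(p_truth, true_utility(p_truth))  # 9/14 3/2
print(p_lie, true_utility(p_lie))      # 11/18 31/18  (~1.72: lying helped)
```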
107
Postmen – return to post office
[Figures: Concave; Subadditive (h is the decoy); Phantom]
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals, "hide" lies can be beneficial (as you think I have less, so the increased load appears to cost more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((fae, bcd), 1/2) = 4 (saves going to two)
• Under the lie, dividing as ((ef, dcab), p) fails: you always win and I always lose, and since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie: ((ab, cdef), p)
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally): zero utility for that.
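The FP8 numbers above check out with exact rationals. A sketch of the slide's derivation (payoffs as given there):

```python
# Hide lie in a modular TOD, mixed deal (ab, cdef):
# p is fixed by the APPARENT utilities, then agent 1's real utility follows.
from fractions import Fraction

# Apparent: p*4 + (1-p)*0 for agent 1, p*2 + (1-p)*6 for agent 2.
# Solve 4p = 2p + 6(1-p)  ->  8p = 6  ->  p = 3/4.
p = Fraction(6, 8)

# Real utility for agent 1: 6 when it wins the deal; 0 when it gets cdef,
# since it must still deliver the hidden letter to a (5 stops, as originally).
real_u1 = p * 6 + (1 - p) * 0
print(p, real_u1)  # 3/4 9/2  (4.5 > 4, the truthful utility)
```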
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when such protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures: intentional reminding
  – Repairs for past failures are reused: reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – change the importance attached to an issue
  – change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rises from 0 to 100 as the meeting time moves from 9:00 through 12:00 to 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds – agent i's and agent j's utility curves converge to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – agent i's and agent j's utility curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (→u is "undercuts", →r is "rebuts"):
  – A attacks B = A →u B or A →r B
  – A defeats B = A →u B or (A →r B and not B →u A)
  – A strongly attacks B = A →a B and not B →u A
  – A strongly undercuts B = A →u B and not B →u A
129
Proposition: Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set S if some member y of S attacks x (y → x)
2. argument x is acceptable with respect to S if every attacker of x is attacked by S
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
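The definitions above can be checked by brute force on a small system. The attack relation below is invented to match the slide's remarks ("c is always attacked, d is always acceptable"): a and b attack each other and both attack c, while d neither attacks nor is attacked.

```python
# Enumerate the admissible sets of a tiny abstract argument system.
from itertools import combinations

ARGS = {"a", "b", "c", "d"}
ATTACKS = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}  # hypothetical

def conflict_free(S):
    return not any((x, y) in ATTACKS for x in S for y in S)

def acceptable(x, S):
    # every attacker of x is itself attacked by some member of S
    return all(any((z, y) in ATTACKS for z in S)
               for (y, t) in ATTACKS if t == x)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

sets = [set(c) for r in range(len(ARGS) + 1)
        for c in combinations(sorted(ARGS), r)]
for S in sets:
    if admissible(S):
        print(sorted(S))
```

Here c belongs to no admissible set (it is always attacked), while {d} is admissible on its own (it has no attackers).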
133
An Example Abstract Argument System
49
The Monotonic Concession Protocol
• Advantages:
  – Symmetrically distributed (no agent plays a special role)
  – Ensures convergence
  – It will not go on indefinitely
• Disadvantages:
  – Agents can run into conflicts
  – Inefficient – no guarantee that an agreement will be reached quickly
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a negotiation strategy is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: Your best deal among all possible deals in the negotiation set. (It is a way of telling others what you value.)
[Figure: agent 1's best deal vs. agent 2's best deal]
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal, each agent asking "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– You are more willing to risk conflict
An agent's willingness to risk conflict compares what it would lose by conceding (accepting the other's offer) with what it would lose by standing firm and causing a conflict, relative to its current offer.
• If both are equally willing to risk, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agents i and j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk).
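The formula translates directly into code. A sketch with illustrative utilities (the 10/4/7 values are invented); under Zeuthen, the agent with the lower risk concedes:

```python
# Zeuthen risk: risk_i = (U_i(d_i) - U_i(d_j)) / U_i(d_i).
def risk(u_own_offer, u_their_offer):
    if u_own_offer == 0:
        return 1.0  # nothing left to gain: perfectly willing to risk conflict
    return (u_own_offer - u_their_offer) / u_own_offer

# Agent 1 values its own offer at 10 and agent 2's offer at 4;
# agent 2 values its own offer at 10 and agent 1's offer at 7.
r1 = risk(10, 4)  # 0.6
r2 = risk(10, 7)  # 0.3
print("agent 2 concedes" if r2 < r1 else "agent 1 concedes")
```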
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk).
• This assumes you know the other agent's utility.
• What one sets as an initial goal affects risk. If I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: A_i's best deal, A_j's best deal, and the conflict deal – showing "How much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum still hoped for]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple and reflects the way human negotiations work
  – Stability – in Nash equilibrium – if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each labeled "He should concede"]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: distribution point, with a and d at distance 7 and b and c at distance 1 apart]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than alone; (ac, bd) is dominated by (ab, cd)
62
Parcel Delivery Domain, Example 2 (Zeuthen works here; both concede on equal risk)

No.  Pure Deal       Agent 1's Utility   Agent 2's Utility
1    (abcd, ∅)       0                   10
2    (abc, d)        1                   3
3    (ab, cd)        2                   2
4    (a, bcd)        3                   1
5    (∅, abcd)       10                  0
     Conflict deal   0                   0

Concession path: agent 1 offers deal 5, then 4, then 3; agent 2 offers deal 1, then 2, then 3.
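The concession path can be simulated. A sketch of MCP with Zeuthen risks over the five pure deals above; it assumes (as the table suggests) that a concession means stepping one deal toward the middle:

```python
# Zeuthen/MCP over the slide's deals: (agent 1 utility, agent 2 utility).
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]  # deals 1..5

def risk(mine, theirs, agent):
    own = mine[agent]
    return 1.0 if own == 0 else (own - theirs[agent]) / own

i, j = 4, 0  # agent 1 starts at deal 5, agent 2 at deal 1
while i != j:
    r1 = risk(deals[i], deals[j], 0)
    r2 = risk(deals[j], deals[i], 1)
    if r1 <= r2:
        i -= 1  # agent 1 concedes one step toward the middle
    if r2 <= r1:
        j += 1  # agent 2 concedes one step toward the middle
print(deals[i])  # (2, 2): both reach the (ab, cd) split
```

The risks are equal at every step, so both agents concede each round, meeting at deal 3 just as the slide describes.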
63
What bothers you about the previous agreement
• Decided to both get (2, 2) utility, rather than the (0, 10) utility of another choice with a higher total
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future, or sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, so they help and do not hinder each other
• Example – slotted blocks world: blocks cannot go anywhere on the table – only in slots (a restricted resource)
• Note how this simple change (slots) means two workers can get in each other's way, even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD, coordinated plans are never worse, as you can just do your original task
• With an SOD, you may get in each other's way
• Don't accept partially completed plans
A state-oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs it)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• Looking for a state in the intersection of the goals
• Possibilities:
  – Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – Goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
  – Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
  – There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but State 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
  cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent – the role i plays with probability (1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the deal
• For the postmen example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Figure: distribution point with city a and city b at distance 1 each; a to b distance 2]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• ((ab, ∅), p) means agent 1 does ∅ with probability p and ab with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p, so p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
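The same calculation as code, using the utilities from the table above (the helper name `fair_p` is our own):

```python
# Find p making the mixed deal ((ab, ∅), p) fair in expectation.
from fractions import Fraction

def fair_p(u1_p, u1_not, u2_p, u2_not):
    """Solve p*u1_p + (1-p)*u1_not == p*u2_p + (1-p)*u2_not for p."""
    return Fraction(u2_not - u1_not, (u1_p - u1_not) - (u2_p - u2_not))

# With probability p, agent 1 delivers nothing (utility 1) and agent 2
# delivers both (utility 0); otherwise the roles flip (-2 and 3).
p = fair_p(1, -2, 0, 3)
print(p)  # 5/6
```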
74
Try again with other choice in negotiation set
• ((a, b), p) means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅) : p] such that NS(m) = max_d NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero-sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
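The suggested exercise takes one line to run: of all integer splits of a total utility of 10, the even split maximizes the product.

```python
# Which split (u, 10-u) maximizes the product of utilities?
best = max(((u, 10 - u) for u in range(11)), key=lambda s: s[0] * s[1])
print(best)  # (5, 5)
```

For example, 3·7 = 21 and 4·6 = 24, but 5·5 = 25: the product rewards equality, which is why it is used as the fairness criterion here.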
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1. Agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be 4 – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve a common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) role split (A1 takes the cost-2 role) p of the time.
• Utility for agent 1 = p(1) + (1-p)(-3) – he loses utility if he pays cost 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus I can take an unfair division and make it fair.
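A sketch of the fair-probability computation above (the worths 3 and 6 and the role costs 2 and 6 are the slide's numbers):

```python
from fractions import Fraction

# Agent 1 values the goal at 3, agent 2 at 6; the easy role costs 2,
# the hard role costs 6. Let p = probability that agent 1 gets the easy role.
def u1(p):
    return p * (3 - 2) + (1 - p) * (3 - 6)   # = 4p - 3

def u2(p):
    return p * (6 - 6) + (1 - p) * (6 - 2)   # = 4 - 4p

# Fair p solves u1(p) = u2(p):  4p - 3 = 4 - 4p  =>  8p = 7
p = Fraction(7, 8)
assert u1(p) == u2(p) == Fraction(1, 2)
```

At p = 7/8 both agents expect utility 1/2, so the unequal role split becomes fair in expectation.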
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way – as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 2 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as he helped with the swap)
• So with 1/2 probability each: 1/2(12) + 1/2(-4) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
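The arithmetic above can be checked directly (all costs are the slide's made-up numbers):

```python
standalone_cost = 26   # cost to achieve one's whole goal alone
swap_share = 4         # each agent's share of the cooperative swap
rest_cost = 10         # the coin-flip winner finishes the rest alone

win = standalone_cost - swap_share - rest_cost    # 12: goal met at cost 14
lose = -swap_share                                # -4: helped swap, goal unmet
expected = 0.5 * win + 0.5 * lose                 # 4.0 with a fair coin

# If instead both goals were fully satisfied at cost 24 each:
both_satisfied = standalone_cost - 24             # utility 2 each
assert expected == 4.0 and both_satisfied == 2
```

The expected utility of gambling (4) is double the utility of the guaranteed joint plan (2), which is the slide's point about risking the goal.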
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest worth.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing" – but how close you got to the goal.
85
Worth-oriented Domain: Definition
• Can be defined as a tuple:
  ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tile world (like an airport shuttle) – worth isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute
  – Utility = price*60% + quality*15% + support*25%
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – Price: [5, 100], quality: [0-10], support: [1-5]
  – Try to find the Pareto optimum
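A minimal sketch of the weighted-attribute approach, assuming hypothetical 0-10 scores per attribute (the weights are the slide's; the two offers are invented for illustration):

```python
# Weights from the slide; attribute scores are hypothetical 0-10 values
# where a higher price score means a *better* (cheaper) price.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

cheap = {"price": 9, "quality": 4, "support": 3}   # utility = 5.4 + 0.6 + 0.75
solid = {"price": 6, "quality": 9, "support": 8}   # utility = 3.6 + 1.35 + 2.0
assert utility(solid) > utility(cheap)
```

With these weights the better-supported offer wins despite its worse price; changing the weights changes the tradeoff the agent is willing to make.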
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
  – Exchange missing information
  – Penalty for a lie
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
  for finite X, Y ⊆ T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive:
  – Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  – Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
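A small sketch of checking subadditivity over an explicit cost table (the delivery costs here are hypothetical, chosen so one shared trip is cheaper than two separate ones):

```python
# Hypothetical delivery costs: one trip can cover both addresses.
cost = {
    frozenset(): 0,
    frozenset({'a'}): 2,
    frozenset({'b'}): 2,
    frozenset({'a', 'b'}): 3,   # shared trip: cheaper than 2 + 2
}

def is_subadditive(cost):
    """Check c(X ∪ Y) <= c(X) + c(Y) for every pair of task sets."""
    sets = list(cost)
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in sets for y in sets)

assert is_subadditive(cost)

# Counterexample: doing both costs more than the sum (e.g. a capacity threshold).
cost_bad = {**cost, frozenset({'a', 'b'}): 5}
assert not is_subadditive(cost_bad)
```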
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  • Need to pick something up at the store. (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
  • Need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks, Z, is introduced:
  c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows run between domains). The rule at the beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular encourages truth telling more than the others.
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange the messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
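The fax-domain claim can be illustrated with per-connection costs (hypothetical numbers); fully independent, additive costs satisfy the modularity equation exactly:

```python
# Fax-domain sketch: each connection has an independent cost, so the cost of a
# set of tasks is just the sum of per-task costs, which makes the domain modular.
per_fax = {'a': 3, 'b': 1, 'c': 2}   # hypothetical per-connection costs

def c(tasks):
    return sum(per_fax[t] for t in tasks)

X, Y = {'a', 'b'}, {'b', 'c'}
# Modularity: c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
assert c(X | Y) == c(X) + c(Y) - c(X & Y)   # 6 == 4 + 3 - 1
```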
101
3-dimensional table of Characterization of Relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because (with a stiff enough penalty) they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial.
• Ex: if A1 hides the letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2
• Expected util: ((∅, abc), 1/2) = .5
• Under the lie: p = 1/2 (as the utility appears the same)
• Expected util (for 1): ((∅, abc), 1/2) = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
103
• FP2: in a Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out).
• FP3: in a Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown in the table (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks  U(1)  Agent 2's tasks  U(2) (seems)  U(2) (actual)
a                2     bc               4             4
b                4     ac               2             2
bc               2     a                4             2
ab               0     c                6             6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is ((∅, ab), 1/2) = 1/2(4) + 1/2(2) = 3.
• Lie: (bca) is the logical division, as no probability is needed.
• Util for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h).
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72. (If I deliver, I don't deliver h.)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 = 1.72
• So – lying helped.
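The two indifference equations above are linear in p and can be solved mechanically (a sketch; the payoff numbers are the slide's):

```python
from fractions import Fraction

def solve_p(a1, b1, a2, b2):
    """Solve a1*p + b1*(1-p) == a2*p + b2*(1-p) for p."""
    # Rearranged: p * ((a1 - b1) - (a2 - b2)) = b2 - b1
    return Fraction(b2 - b1, (a1 - b1) - (a2 - b2))

# Truth: agent 1's payoffs are -1 (delivers all) / 6; agent 2's are 4 / -3.
p_truth = solve_p(-1, 6, 4, -3)          # 9/14
# Decoy lie (extra letter to h): -3 / 6 versus 4 / -5.
p_lie = solve_p(-3, 6, 4, -5)            # 11/18

# Agent 1's *true* expected utility in either case: p*(-1) + (1-p)*6
u = lambda p: p * (-1) + (1 - p) * 6
assert p_truth == Fraction(9, 14) and p_lie == Fraction(11, 18)
assert u(p_lie) > u(p_truth)             # 31/18 ≈ 1.72 > 21/14 = 1.5
```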
107
Postmen – return to post office
[Diagrams: Concave; Subadditive (h is the decoy); Phantom]
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost more than it really does).
• Ex 3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Ex 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under the truth: Util((f, aebcd), 1/2) = 4 (saves going to two).
• Under the lie, divide as ((ef, dcab), p)? Then you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie: ((∅, abcdef), p)
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually:
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility in that case.
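A quick check of the p = 3/4 computation (the payoffs are the slide's):

```python
from fractions import Fraction

# Apparent indifference under the hide lie:
# p*4 + (1-p)*0 == p*2 + (1-p)*6  =>  8p = 6  =>  p = 3/4
p = Fraction(3, 4)
assert p * 4 == p * 2 + (1 - p) * 6

# Agent 1's true utility: worth 6 when assigned nothing (prob p),
# worth 0 when assigned everything (the hidden letter to a must go anyway).
true_utility = p * 6 + (1 - p) * 0
assert true_utility == Fraction(9, 2)    # 4.5 > 4, the truthful utility
```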
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposal
  – Generation of counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – The repair for a past failure is reused, which reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreed parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Graph: worth function for the first agent, rising from 0 to 100 between 0900 and 1600]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility vs. number of negotiation rounds; Agent i's and Agent j's offers converge to the point of acceptance]
123
Utility Graphs – no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds; Agent i's and Agent j's curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  • Positions cannot be justified – why did the agent pay so much for the car?
  • Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
  1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
  2. Emotional – "How would you feel if it happened to you?"
  3. Visceral – participants stamp their feet and show the strength of their feelings
  4. Kisceral – appeals to the intuitive – "Doesn't this seem reasonable?"
126
Logic-Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database, and
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B ≡ A undercuts B or A rebuts B
  – A defeats B ≡ A undercuts B, or (A rebuts B and B does not undercut A)
  – A strongly attacks B ≡ A attacks B and B does not undercut A
  – A strongly undercuts B ≡ A undercuts B and B does not undercut A
129
Proposition: Hierarchy of attacks
Undercuts: u
Strongly undercuts: su = u - u⁻¹
Strongly attacks: sa = (u ∪ r) - u⁻¹
Defeats: d = u ∪ (r - u⁻¹)
Attacks: a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Writing x → y indicates:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument holds unless proven otherwise
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set if some member y of the set attacks x (y → x).
2. Argument x is acceptable with respect to a set if every attacker of x is attacked by the set.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
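The definitions above can be turned into a brute-force check. The attack graph below is hypothetical (the slide's exact graph isn't reproduced here), chosen so that c is always attacked and d is never attacked:

```python
from itertools import chain, combinations

# Hypothetical attack graph: a and b attack each other; b and d attack c.
args = {'a', 'b', 'c', 'd'}
attacks = {('a', 'b'), ('b', 'a'), ('b', 'c'), ('d', 'c')}

def conflict_free(S):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    """Every attacker of x is itself attacked by some member of S."""
    return all(any((z, y) in attacks for z in S)
               for (y, t) in attacks if t == x)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

subsets = chain.from_iterable(combinations(args, r) for r in range(len(args) + 1))
adm = [set(S) for S in subsets if admissible(set(S))]
assert set() in adm and {'d'} in adm     # d has no attackers, so {d} defends itself
assert not admissible({'c'})             # c's attackers (b, d) go unanswered
```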
132
[Attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked;
d is always acceptable.
133
An Example Abstract Argument System
50
Negotiation Strategy
Given the negotiation space and the Monotonic Concession Protocol, a strategy of negotiation is an answer to the following questions:
• What should an agent's first proposal be?
• On any given round, who should concede?
• If an agent concedes, then how much should it concede?
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (It is a way of telling others what you value.)
[Diagram: Agent 1's best deal ... Agent 2's best deal]
52
The Zeuthen Strategy
Q: I make a proposal in every round, but it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Diagram: Agent 1's best deal ... Agent 2's best deal; each asks "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost most of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– So you are more willing to risk conflict
An agent's willingness to risk conflict compares its loss from making a concession with its loss from reaching the conflict deal, relative to its current offer: the smaller the remaining gain, the more willing it is to risk conflict.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict

risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agent i and agent j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk it).
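The risk formula can be written down directly (a sketch; the convention risk = 1 when an agent's own offer is worth 0 to it matches the upcoming parcel example):

```python
def risk(u_own, u_their):
    """Zeuthen risk for one agent.
    u_own   = my utility for my own current offer
    u_their = my utility for the opponent's current offer."""
    if u_own == 0:
        return 1.0   # nothing left to lose: fully willing to risk conflict
    return (u_own - u_their) / u_own

# Parcel example: my offer is worth 10 to me, the opponent's is worth 0.
assert risk(10, 0) == 1.0
assert risk(3, 1) == 2 / 3   # after conceding, less is left to gain
```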
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Diagram: A_i's best deal, A_j's best deal, and the conflict deal; "How much am I willing to risk a conflict?"; maximum to gain from agreement; maximum still hoped for]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose) – otherwise it will just be your turn to concede again at the next round – but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium – if one agent is using the strategy, the other can do no better than use it him/herself
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)

Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)

Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0

Risk of conflict: 1 for each agent.
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: at agent 1's best deal and agent 2's best deal, each thinks the other should concede]
Zeuthen does not reach a settlement here, as neither will concede when there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Diagram: a and d at distance 7 from the distribution point; b and c at distance 1]
Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10

Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (acbd) is dominated by (abcd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No.  Pure deal      Agent 1's utility  Agent 2's utility
1    (abcd, ∅)      0                  10
2    (abc, d)       1                  3
3    (ab, cd)       2                  2
4    (a, bcd)       3                  1
5    (∅, abcd)      10                 0
     Conflict deal  0                  0

Concessions: agent 1 moves from deal 5 toward deal 1; agent 2 moves from deal 1 toward deal 5.
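A simplified simulation of the Zeuthen rounds on this deal table (assumptions: a concession moves an agent to its next-best deal, agreement means identical offers, and equal risks make both concede):

```python
# Deals as (u1, u2) pairs from the table above.
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(own, their):
    # Willingness to risk conflict; 1.0 when nothing is left to lose.
    return 1.0 if own == 0 else (own - their) / own

def zeuthen(deals):
    pref1 = sorted(deals, key=lambda d: -d[0])   # agent 1's preference order
    pref2 = sorted(deals, key=lambda d: -d[1])   # agent 2's preference order
    i = j = 0                                    # each starts at its best deal
    while pref1[i] != pref2[j]:
        r1 = risk(pref1[i][0], pref2[j][0])
        r2 = risk(pref2[j][1], pref1[i][1])
        if r1 <= r2:
            i += 1                               # agent 1 concedes (both on a tie)
        if r2 <= r1:
            j += 1
    return pref1[i]

assert zeuthen(deals) == (2, 2)
```

Both risks are 1.0 in the first round and 2/3 in the second, so both agents concede twice and meet at the (2, 2) deal, matching the slide.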
63
What bothers you about the previous agreement?
• The agents settle for (2, 2) utility rather than a lottery over the (0, 10) and (10, 0) outcomes.
• Is there a solution?
• Fairness versus higher global utility.
• Restrictions of this method (no promises for the future or sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example – slotted blocks world: blocks cannot go just anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) means two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD, coordinated plans are never worse, as you can just do your original task.
• With an SOD, you may get in each other's way.
• Don't accept partially completed plans.
A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and the cost is the same regardless of the agent performing them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as the set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – The goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
  – We can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but we may need to know how to pick up another person)
  – There might be a reachable state which satisfies both, but it may be too expensive – they are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain plan.
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
  cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k
  where k is i's opponent – i plays k's role with probability (1-p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Diagram: distribution point; city a and city b, each 1 unit away; a to b is 2 units]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
73
Consider deal 3 with probability p
• (∅, ab) : p means agent 1 does ∅ with probability p and ab with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
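The p = 5/6 computation above can be checked numerically; the utilities are those of deals 3 and 4 from the parcel delivery table:

```python
# Equalizing expected utilities for the mixed deal above. Agent 1 gets -2
# if it delivers ab and 1 if it delivers nothing; agent 2 gets the mirror
# values 3 and 0.

def expected_u1(p):
    return (1 - p) * (-2) + p * 1

def expected_u2(p):
    return (1 - p) * 3 + p * 0

p = 5 / 6                      # from -2 + 3p = 3 - 3p
assert abs(expected_u1(p) - expected_u2(p)) < 1e-12
assert abs(expected_u1(p) - 0.5) < 1e-12   # both expect utility 1/2
```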
74
Try again with the other choice in the negotiation set
• (a, b) : p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal of the form m = [(T_A ∪ T_B, ∅) : p] such that NS(m) = max NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose one which maximizes the product of the utilities (as this is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way.
• If done by one agent the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2, with two gray blocks at slot 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) division of labor p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it takes the cost-6 role for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
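The p = 7/8 arithmetic above can be verified exactly:

```python
from fractions import Fraction

# The worth-3 vs worth-6 compromise above: find p so that the two expected
# utilities are equal.

def u1(p):   # agent 1: worth 3; cheap role costs 2, expensive role costs 6
    return p * 1 + (1 - p) * (-3)

def u2(p):   # agent 2: worth 6; expensive role costs 6, cheap role costs 2
    return p * 0 + (1 - p) * 4

# 4p - 3 = 4 - 4p  =>  8p = 7
p = Fraction(7, 8)
assert u1(p) == u2(p) == Fraction(1, 2)
```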
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• Both can't win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to have the person with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 2 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example semi-cooperative cont
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability ½ each: ½(12) + ½(−4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – this is not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent tile world (like an airport shuttle): it isn't just a specific state that matters, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price [5–100], quality [0–10], support [1–5]
– Try to find the Pareto optimum
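The weighted-attribute scheme above can be sketched as follows; the 60/15/25 weights are the slide's, while the offer's attribute scores are made-up inputs normalized to [0, 1]:

```python
# A sketch of the weighted-attribute utility above. Weights are from the
# slide; the example offer's scores are illustrative.

WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    """Weighted sum over the attributes (each score in [0, 1])."""
    return sum(w * scores[attr] for attr, w in WEIGHTS.items())

offer = {"price": 0.8, "quality": 0.5, "support": 0.6}
assert abs(utility(offer) - 0.705) < 1e-9   # 0.48 + 0.075 + 0.15
```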
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task-Oriented Domain
• The cost of the union of two sets of tasks is at most the sum of the costs of the separate sets
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of (strictly) subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with = rather than <:
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
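The subadditivity condition can be checked mechanically against an explicit cost table. The table below is a made-up tree-style delivery example (b lies beyond a on the same branch, so one trip to b covers a as well):

```python
# Subadditivity check: c(X U Y) <= c(X) + c(Y) for every pair in the table.
# The round-trip costs 2 and 4 are illustrative.

cost = {
    frozenset(): 0,
    frozenset("a"): 2,
    frozenset("b"): 4,
    frozenset("ab"): 4,   # delivering to b passes through a
}

def is_subadditive(cost):
    keys = list(cost)
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in keys for y in keys if x | y in cost)

assert is_subadditive(cost)
```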
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at the store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task-Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at an arrow's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular encourages truth-telling more than the others.
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as it was going that way anyway), but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
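The concavity condition can be checked exhaustively on a small task system. The cost table below is made up: each task has an independent cost, giving a modular (hence concave) system:

```python
from itertools import combinations

# Concavity check: for all X subset-of Y and all Z,
#   c(Y U Z) - c(Y) <= c(X U Z) - c(X).
# Per-task costs are illustrative; independent costs make the system modular,
# and modularity implies concavity.

task_cost = {"a": 2, "b": 3, "c": 1}

def c(tasks):
    return sum(task_cost[t] for t in tasks)

def subsets(items):
    return [frozenset(s) for r in range(len(items) + 1)
            for s in combinations(items, r)]

def is_concave(items):
    subs = subsets(items)
    return all(c(y | z) - c(y) <= c(x | z) - c(x)
               for x in subs for y in subs if x <= y for z in subs)

assert is_concave("abc")
```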
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
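The modularity equation can likewise be checked exhaustively; the per-destination connection costs below are made-up, fax-style independent costs:

```python
from itertools import combinations

# Modularity check for a fax-style cost table: every destination has an
# independent connection cost, so c(X U Y) = c(X) + c(Y) - c(X & Y).

line_cost = {"a": 1, "b": 2, "c": 4}

def c(dests):
    return sum(line_cost[d] for d in dests)

def subsets(items):
    return [frozenset(s) for r in range(len(items) + 1)
            for s in combinations(items, r)]

def is_modular(items):
    subs = subsets(items)
    return all(c(x | y) == c(x) + c(y) - c(x & y)
               for x in subs for y in subs)

assert is_modular("abc")
```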
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides a letter to c – his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util for (∅, abc) at 1/2 = 5
• Lie: p = 1/2 (as the utility appears the same)
• Expected util (for 1) for (∅, abc) at 1/2 = ½(0) + ½(2) = 1 (as he has to deliver the lie)
[Figure: delivery graph for the example; edge distances 1, 4, 4, 1]
103
• FP2: in a Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other person delivers the phantom, you are found out)
• FP3: in a Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks   U(1)   Agent 2's tasks   U(2) (seems)   U(2) (actual)
a                 2      bc                4              4
b                 4      ac                2              2
bc                2      a                 4              2
ab                0      c                 6              6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c – his utility rises from 3 to 4
• Truth: p = ½, so the utility for agent 1 is (∅, ab) at ½ = ½(4) + ½(2) = 3
• Lie: (b, ca) is the logical division, as there is no probability split
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is) – his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
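The FP6 arithmetic above can be reproduced exactly: equalize the apparent expected utilities to find p, then evaluate agent 1's true expected utility at each p:

```python
from fractions import Fraction

# FP6 arithmetic: solve p*a1 + (1-p)*b1 = p*a2 + (1-p)*b2 for p, using the
# apparent utilities, then evaluate agent 1's real utility at each p.

def fair_p(a1, b1, a2, b2):
    return Fraction(b2 - b1, (a1 - b1) - (a2 - b2))

def true_u1(p):
    """Agent 1's real utility: -1 if it delivers everything, 6 otherwise."""
    return p * Fraction(-1) + (1 - p) * 6

p_truth = fair_p(-1, 6, 4, -3)    # 9/14
p_lie   = fair_p(-3, 6, 4, -5)    # 11/18: the decoy h shifts the balance
assert p_truth == Fraction(9, 14) and p_lie == Fraction(11, 18)
assert true_u1(p_truth) == Fraction(3, 2)     # 1.5 under truth
assert true_u1(p_lie) == Fraction(31, 18)     # ~1.72: the lie pays off
```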
107
Postmen – return to post office
[Figures: the concave example; the subadditive example (h is the decoy); the phantom example]
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over Pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under the truth: Util(fae, bcd) at 1/2 = 4 (each saves going to two nodes)
• Under the lie, divide as (efd, cab) : p – you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with (ab, cdef) : p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually
• (3/4)(6) + (1/4)(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
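The FP8 arithmetic above, worked exactly. The deal is balanced using the apparent (post-lie) utilities, while the liar's real expected utility is higher:

```python
from fractions import Fraction

# FP8: apparent utilities are agent 1: p*4 + (1-p)*0, agent 2: p*2 + (1-p)*6.
# Setting them equal: 4p = 6 - 4p  =>  p = 3/4.
p = Fraction(6 - 0, (4 - 0) + (6 - 2))
assert p == Fraction(3, 4)

# Real utility for the liar: the winning role is worth 6 (the hidden letter
# lies on the route anyway); the losing role is worth 0 (the hidden letter
# must still be delivered afterwards).
real_utility = p * 6 + (1 - p) * 0
assert real_utility == Fraction(9, 2)   # 4.5 > 4, so hiding is beneficial
```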
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– The repair for a past failure is reused, reducing computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Changing the importance attached to an issue
– Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory behavior
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rises from 0 to 100 across meeting times from 9:00 through 12:00 to 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's utilities converge to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's utility curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept that B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database, and
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
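A minimal sketch of the (Sentence, Grounds) idea, restricted to Horn-style rules and checked by forward chaining; the milk/cheese atoms anticipate the example on the next slide and are purely illustrative:

```python
# Forward chaining: the sentence is provable from the grounds if repeatedly
# firing rules (premise_set, conclusion) eventually derives it.

def proves(grounds, rules, sentence):
    derived = set(grounds)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and premises <= derived:
                derived.add(conclusion)
                changed = True
    return sentence in derived

grounds = {"milk_good", "cheese_from_milk"}
rules = [({"milk_good", "cheese_from_milk"}, "cheese_good")]
assert proves(grounds, rules, "cheese_good")
```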
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A →a B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
129
Proposition: hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: true unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set if some y with y → x is itself attacked by no member of the set
2. argument x is acceptable (with respect to a set) if every attacker of x is attacked by a member of the set
3. an argument set is conflict-free if no two of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: four arguments a, b, c, d with attacks among them]
Which sets of arguments can be true? c is always attacked;
d is always acceptable.
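The definitions above can be checked mechanically. The attack relation below is a made-up stand-in for the slide's diagram (which is not reproduced here), chosen to be consistent with its remark that c is always attacked and d is always acceptable:

```python
# Admissibility checker over an assumed attack relation:
# a and b attack each other, and b attacks c; d is untouched.

attacks = {("a", "b"), ("b", "a"), ("b", "c")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

assert admissible({"d"})            # d has no attackers
assert admissible({"a"})            # a defends itself against b
assert not admissible({"c"})        # c cannot counter b's attack
assert not conflict_free({"a", "b"})
```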
133
An Example Abstract Argument System
51
The Zeuthen Strategy – a refinement of the monotonic protocol
Q: What should my first proposal be?
A: The best deal for you among all possible deals in the negotiation set. (It is a way of telling others what you value.)
Agent 1's best deal … agent 2's best deal
52
The Zeuthen Strategy
Q: I make a proposal in every round (it may be the same as last time). Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: agent 1's best deal and agent 2's best deal at either end, each asking "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– You are more willing to risk conflict
An agent's willingness to risk conflict depends on the difference between its loss from making a concession and its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agents i and j, respectively.
risk_i is the willingness to risk conflict (1 means perfectly willing to risk it).
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility function.
• What one sets as the initial goal affects risk. If I set an impossible goal, my willingness to risk is always higher.
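The risk formula can be sketched as a function. The numbers below come from the table in Example 2 later in the section (agent 1 offering the deal worth 3 to itself, while agent 2's offer would give it 1); the conflict deal is assumed to have utility 0:

```python
# Zeuthen risk: risk_i = (U_i(d_i) - U_i(d_j)) / U_i(d_i), taken as 1 when
# the agent's own offer is worth nothing to it (nothing left to lose).

def risk(u_own_offer, u_other_offer):
    if u_own_offer == 0:
        return 1.0
    return (u_own_offer - u_other_offer) / u_own_offer

r1 = risk(3, 1)   # agent 1: 3 under its own offer, 1 under agent 2's
r2 = risk(3, 1)   # agent 2 is in the mirror position
assert r1 == r2   # equal risk: both concede
```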
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Figure: the range between Ai's best deal and Aj's best deal, with the conflict deal marked – showing "How much am I willing to risk a conflict?", the maximum to gain from agreement, and the maximum still hoped for]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again in the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivered to a; agent 2 delivered to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1: Utility1(a, b) = 0; Utility1(b, a) = 0; Utility1(∅, ab) = 1
Utility of agent 2: Utility2(a, b) = 2; Utility2(b, a) = 2; Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal, each labeled "He should concede"]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: distribution point with nodes a and d at distance 7, and nodes b and c between them at distance 1 apart]
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than alone. (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No.  Pure Deal       Agent 1's Utility   Agent 2's Utility
1    (abcd, ∅)       0                   10
2    (abc, d)        1                   3
3    (ab, cd)        2                   2
4    (a, bcd)        3                   1
5    (∅, abcd)       10                  0
     Conflict deal   0                   0
Agent 1 concedes from deal 5 toward deal 3; agent 2 concedes from deal 1 toward deal 3.
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility, rather than the expected utility of (0, 10) for another choice
• Is there a solution?
• Fair versus higher global utility
• Restrictions of this method (no promises for the future, and no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State-Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• There are side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) means two workers can get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk.
• c(J)k is the cost of k's role in joint plan J.
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD coordinated plans are never worse, as you can just do your original task.
• With an SOD you may get in each other's way.
• Don't accept partially completed plans.
The state-oriented domain is a bit more powerful than the TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (preferring a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of the agent performing them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with.
• Looking for a state in the intersection of the goals.
• Possibilities:
  - Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
  - The goals may contradict, so there is no mutually acceptable state (e.g., both need the car).
  - A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another).
  - There might be a reachable state which satisfies both, but it may be too expensive - unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability with which each agent gets a given role.
• This is called a mixed deal: a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
  cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k
  where k is i's opponent: the role i plays with probability (1-p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)

[Figure: a distribution point with city a and city b, each at distance 1]

Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...

Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
73
Consider deal 3 with a probability
• (ab, ∅):p means agent 1 does ∅ with probability p and ab with probability (1-p).
• What should p be to be fair to both (equal utility)?
  (1-p)(-2) + p(1) = utility for agent 1
  (1-p)(3) + p(0) = utility for agent 2
  (1-p)(-2) + p(1) = (1-p)(3) + p(0)
  -2 + 2p + p = 3 - 3p  =>  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
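The same algebra can be checked mechanically; the `fair_p` helper below is an illustrative name, not from the slides.

```python
def fair_p(u1_hi, u1_lo, u2_hi, u2_lo):
    """Solve p*u1_hi + (1-p)*u1_lo == p*u2_hi + (1-p)*u2_lo for p."""
    return (u2_lo - u1_lo) / ((u1_hi - u1_lo) - (u2_hi - u2_lo))

# Deal 3: with probability p agent 1 does nothing (utility 1 for agent 1,
# 0 for agent 2); with probability 1-p agent 1 does both deliveries
# (utility -2 for agent 1, 3 for agent 2).
p = fair_p(1, -2, 0, 3)
u1 = p * 1 + (1 - p) * (-2)
u2 = p * 0 + (1 - p) * 3
```

Here p comes out to 5/6 and both expected utilities equal 0.5. For the (a, b) / (b, a) deals on the next slide, the denominator is zero, mirroring the slide's "no solution".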
74
Try again with the other choice in the negotiation set
• (a, b):p means agent 1 does a with probability p and b with probability (1-p).
• What should p be to be fair to both (equal utility)?
  (1-p)(0) + p(0) = utility for agent 1
  (1-p)(2) + p(2) = utility for agent 2
  0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal of the form m = [(T_A ∪ T_B, ∅) : p], chosen so that NS(m) = max_d NS(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias - like switching goals in soccer.
77
Examples - Cooperative: each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up at the same time, costing each 2 - a win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four, so the utility to each is 2.
78
Examples - Compromise: both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at slot 1, the black block at slot 2, and two gray blocks at slot 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair".
• Assign the roles (2, 6) - agent 1 takes the cost-2 role - p of the time.
• Utility for agent 1 = p(1) + (1-p)(-3) (he loses utility if he takes the cost-6 role for a benefit of 3).
• Utility for agent 2 = p(0) + (1-p)(4).
• Solving for p by setting the utilities equal:
  4p - 3 = 4 - 4p
  p = 7/8
• Thus an unfair division can be made fair.
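The arithmetic can be checked directly. A sketch; the 0.5 expected utility each agent ends up with is implied by, but not stated on, the slide.

```python
# Agent 1 (worth 3): easy role costs 2 -> utility 1; hard role costs 6 -> utility -3.
# Agent 2 (worth 6): hard role -> utility 0; easy role -> utility 4.
# Fairness: p*1 + (1-p)*(-3) == p*0 + (1-p)*4, i.e. 4p - 3 == 4 - 4p.
p = 7 / 8
u1 = p * 1 + (1 - p) * (-3)
u2 = p * 0 + (1 - p) * 4
```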
81
Example - conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins; that is better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (and he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example - semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest.
83
Example - semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12.
• Losing agent utility: -4 (as he helped with the swap).
• So with probability 1/2 each: (1/2)(12) + (1/2)(-4) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: the set of possible environment states
• Ag: the set of possible agents
• J: the set of possible joint plans
• c: the cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers joint plans, schedules, and goal relaxation: the agents may reach a state that is a little worse than the ultimate objective.
• Example - multi-agent tileworld (like an airport shuttle): worth isn't just a specific state but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of it, such as price, quality, and support: a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
  - Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute:
  - Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  - Price in [5, 100], quality in [0, 10], support in [1, 5]
  - Try to find the Pareto optimum
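The weighting and constraint styles above could be sketched as follows. The weights come from the slide; the [0, 1] normalization and the function names are illustrative assumptions.

```python
def weighted_utility(price_score, quality_score, support_score):
    # Weighting each attribute: 60% price, 15% quality, 25% support.
    # Each score is assumed to be already normalized to [0, 1].
    return 0.60 * price_score + 0.15 * quality_score + 0.25 * support_score

def satisfies_constraints(price, quality, support):
    # Constraint style: price in [5, 100], quality in [0, 10], support in [1, 5].
    return 5 <= price <= 100 and 0 <= quality <= 10 and 1 <= support <= 5
```

A Pareto search would then keep only the offers for which no other offer is at least as good on every attribute and strictly better on one.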
89
Incomplete Information
• Agents don't know the tasks of others in a TOD.
• Solution:
  - Exchange the missing information
  - Penalty for lying
• Possible lies:
  - False information
    • Hiding letters
    • Phantom letters
  - Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets - it adds to a sub-cost.
• For finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of subadditive:
  - Delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive TOD (= rather than <):
  - Deliveries in opposite directions - doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  - Need to pick something up at a store. (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
  - Need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive Compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• When another set of tasks Z is introduced:
  c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); a rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help).
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others.
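The three cost-function properties (subadditive, concave, modular) can be checked by brute force over a small task set. This is a sketch with two made-up cost functions, not the slides' postman costs.

```python
from itertools import combinations

def powerset(tasks):
    return [frozenset(c) for r in range(len(tasks) + 1)
            for c in combinations(tasks, r)]

def is_subadditive(tasks, c):
    # c(X u Y) <= c(X) + c(Y) for all finite X, Y
    subs = powerset(tasks)
    return all(c(x | y) <= c(x) + c(y) for x in subs for y in subs)

def is_concave(tasks, c):
    # c(Y u Z) - c(Y) <= c(X u Z) - c(X) whenever X is a subset of Y
    subs = powerset(tasks)
    return all(c(y | z) - c(y) <= c(x | z) - c(x)
               for x in subs for y in subs if x <= y
               for z in subs)

def is_modular(tasks, c):
    # c(X u Y) == c(X) + c(Y) - c(X n Y)
    subs = powerset(tasks)
    return all(c(x | y) == c(x) + c(y) - c(x & y)
               for x in subs for y in subs)

tasks = {"a", "b", "c"}
# Fax-like cost: pay 1 per distinct destination -> modular.
per_destination = lambda s: len(s)
# One-trip cost: any nonempty set costs 2 (a single trip covers everything)
# -> subadditive and concave, but not modular.
one_trip = lambda s: 2 if s else 0
```

Running the checkers confirms the inclusion chain the slides describe: modular implies concave, and concave implies subadditive.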
97
For subadditive domain
98
Attributes of a task system - Concavity
• c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
• The cost that a task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless it is restricted to trees.)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.

The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes - Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as the costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute.
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2.
• Expected utility: (abc, ∅):1/2 = .5
• Under the lie: p = 1/2 (as the apparent utility is the same).
• Expected utility (for agent 1): (abc, ∅):1/2 = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter).
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out).
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost - hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) - just like in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)

  Agent 1's tasks  U(1)  Agent 2's tasks  U(2) seems  U(2) actual
  a                2     bc               4           4
  b                4     ac               2           2
  bc               2     a                4           2
  ab               0     c                6           6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is (ab, ∅):1/2 = 1/2(4) + 1/2(2) = 3.
• Lie: (b, ca) is the logical division, as a pure deal has no probabilities.
• Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability. ("If you deliver, I make you deliver to h too.")
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72. (If A1 delivers, he doesn't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as:
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3)  =>  14p = 9
• If he invents task h, p = 11/18, as:
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
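The two equilibrium probabilities and the resulting utilities can be verified numerically. A sketch: `fair_p` is an illustrative helper, and the payoff numbers are the ones on the slide.

```python
def fair_p(u1_all, u1_none, u2_all, u2_none):
    # Solve p*u1_all + (1-p)*u1_none == p*u2_all + (1-p)*u2_none for p.
    return (u2_none - u1_none) / ((u1_all - u1_none) - (u2_all - u2_none))

p_truth = fair_p(-1, 6, 4, -3)   # truthful encounter -> 9/14
p_lie   = fair_p(-3, 6, 4, -5)   # with the decoy letter to h -> 11/18

def true_utility_agent1(p):
    # Agent 1's real payoffs are -1 (delivers all) and 6 (delivers nothing).
    return p * (-1) + (1 - p) * 6

u_truth = true_utility_agent1(p_truth)   # 21/14 = 1.5
u_lie   = true_utility_agent1(p_lie)     # 31/18, about 1.72
```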
107
Postmen - return to post office
[Figures: a concave example; a subadditive example (h is the decoy); a phantom example]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost me more than it really does).
• Ex 3 (from next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial.
• Ex 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((fae, bcd):1/2) = 4 (each saves going to two nodes).
• Under the lie, dividing as (ef, dcab):p, you always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with (ab, cdef):p:
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6
  p = 3/4
• A1's utility is actually 3/4(6) + 1/4(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) - zero utility for that.
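The p = 3/4 calculation and the 4.5 utility can be checked directly; this sketch uses only the slide's numbers.

```python
# Apparent fairness under the lie: p*4 + (1-p)*0 == p*2 + (1-p)*6.
p = 3 / 4
apparent_1 = p * 4 + (1 - p) * 0
apparent_2 = p * 2 + (1 - p) * 6
# Agent 1's real utility: the branch that looks worth 4 is actually worth 6
# (the hidden letter's delivery was needed anyway); the other branch is worth 0.
real_1 = p * 6 + (1 - p) * 0
```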
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER - case study
• A program to resolve problems in the labor relations domain.
• Agents:
  - Company
  - Union
  - Mediator
• Tasks:
  - Generation of proposals
  - Generation of counter-proposals based on feedback from the dissenting party
  - Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law: citing previous decisions).
• Process:
  - Retrieve appropriate precedent cases from memory
  - Select the most appropriate case
  - Construct an appropriate solution
  - Evaluate the solution for applicability to the current case
  - Modify the solution appropriately
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  - Minimizes the need for information exchange
  - Avoids problems by reasoning from past failures (intentional reminding)
  - Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Derives an overall utility curve from the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  - Change the importance attached to an issue
  - Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory: 30% more proposals.
• Without argumentation: fewer proposals and better solutions.
• Without failure avoidance: more proposals with objections.
• Without preference analysis: oscillatory behavior.
• Without feedback: communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ...

By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).

[Figure: worth function for the first agent, rising from 0 at 0900 through 1200 to 100 at 1600]

Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.

[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's offers converge to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.

[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's curves never meet]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  - Positions cannot be justified. (Why did the agent pay so much for the car?)
  - Positions cannot be changed. (Initially I wanted a car with a sunroof, but I changed my preference during the buying process.)
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet and shows the strength of their feelings.
4. Kisceral: appeals to the intuitive. "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: an argument over a Database is a pair (Sentence, Grounds), where:
  - Database is a (possibly inconsistent) set of logical formulae;
  - Sentence is a logical formula, known as the conclusion;
  - Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database, and
    2. Sentence can be proved from Grounds.
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  - A attacks B = A u B or A r B
  - A defeats B = A u B or (A r B and not B u A)
  - A strongly attacks B = A a B and not B u A
  - A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
• Undercuts = u
• Strongly undercuts = su = u - u⁻¹
• Strongly attacks = sa = (u ∪ r) - u⁻¹
• Defeats = d = u ∪ (r - u⁻¹)
• Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments).
• Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: an argument is true unless proven false.
131
Admissible Arguments - mutually defensible
1. Argument x is attacked by a set of arguments if some member of the set attacks x.
2. Argument x is acceptable with respect to a set if every attacker of x is attacked by the set.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable with respect to it (any attackers are attacked).
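Admissibility as defined above is easy to check by brute force. A minimal sketch, where the attack relation is a hypothetical graph (the slide's figure is not reproduced here).

```python
def conflict_free(S, attacks):
    # No member of S attacks another member of S.
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S, attacks):
    # Every attacker of x is attacked by some member of S.
    attackers = {y for (y, z) in attacks if z == x}
    return all(any((s, y) in attacks for s in S) for y in attackers)

def admissible(S, attacks):
    return conflict_free(S, attacks) and all(acceptable(x, S, attacks) for x in S)

# Hypothetical attack graph: a and b attack each other, and b attacks c.
attacks = {("a", "b"), ("b", "a"), ("b", "c")}
```

On this graph, {a, c} is admissible (a counter-attacks b, defending both members), while {c} alone is not, since nothing in it attacks c's attacker b.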
132
[Figure: four arguments a, b, c, d with attack arrows among them]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
52
The Zeuthen Strategy
Q: I make a proposal in every round, but it may be the same as last time. Do I need to make a concession in this round?
A: If you are not willing to risk a conflict, you should make a concession.
[Figure: a line from agent 1's best deal to agent 2's best deal; each agent asks, "How much am I willing to risk a conflict?"]
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
• You have lost most of your expected utility (it is closer to zero).
• In case conflict occurs, you are not much worse off.
• So you are more willing to risk conflict.
An agent is more willing to risk conflict when its loss from making a concession is large compared with its loss from taking the conflict deal, measured with respect to its current offer.
• If both are equally willing to risk conflict, both concede.
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)

You have to calculate:
• how much you will lose if you make a concession and accept your opponent's offer;
• how much you will lose if you stand still, which causes a conflict.

risk_i = (Utility_i(δ_i) - Utility_i(δ_j)) / Utility_i(δ_i)

where δ_i and δ_j are the current offers of agent i and agent j, respectively.

risk_i is the willingness to risk conflict (1 means perfectly willing to risk).
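The risk formula can be written directly. A sketch: the example utilities are made up to show the decision rule; they are not from a slide.

```python
def risk(u_own_offer, u_their_offer):
    # risk_i = (U_i(own offer) - U_i(their offer)) / U_i(own offer);
    # an agent with nothing to gain is taken as fully willing to risk conflict.
    if u_own_offer <= 0:
        return 1.0
    return (u_own_offer - u_their_offer) / u_own_offer

# Hypothetical round: agent 1's own offer is worth 10 to it, and agent 2's
# offer is worth 2 to it; agent 2's own offer is worth 10 to it, and agent
# 1's offer is worth 4 to it.
r1 = risk(10, 2)   # 0.8
r2 = risk(10, 4)   # 0.6
conceder = "agent 2" if r2 < r1 else "agent 1"
```

The agent with the lower risk value (here agent 2) has more to lose from conflict and should concede.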
55
Risk Evaluation
• risk measures the fraction you have left to gain: if it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility.
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher.
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.

[Figure: a line from Ai's best deal to Aj's best deal, with the conflict deal below; each agent weighs "How much am I willing to risk a conflict?" against the maximum to be gained from agreement and the maximum it can still hope to gain]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you need to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
  - Simple, and reflects the way human negotiations work.
  - Stability: in Nash equilibrium, if one agent is using the strategy, then the other can do no better than using it him/herself.
• Disadvantages:
  - Computationally expensive: players need to compute the entire negotiation set.
  - Communication burden: the negotiation process may involve several steps.
59
Parcel Delivery Domain (recall: agent 1 delivers to a, agent 2 delivers to a and b)

Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)

Utility of agent 1:
  Utility1(a, b) = 0
  Utility1(b, a) = 0
  Utility1(∅, ab) = 1

Utility of agent 2:
  Utility2(a, b) = 2
  Utility2(b, a) = 2
  Utility2(∅, ab) = 0

Risk of conflict: 1 for each agent.
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: agent 1's best deal and agent 2's best deal at opposite ends; each thinks "He should concede"]
Zeuthen does not reach a settlement here, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)

[Figure: distribution point connected to a (distance 7) and d (distance 7); between them a-b, b-c, and c-d are each distance 1]

Cost function:
  c(∅) = 0
  c(a) = c(d) = 7
  c(b) = c(c) = c(ab) = c(cd) = 8
  c(bc) = c(abc) = c(bcd) = 9
  c(ad) = c(abd) = c(acd) = c(abcd) = 10

Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than alone; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

  No.  Pure deal       Agent 1's utility  Agent 2's utility
  1    (abcd, ∅)       0                  10
  2    (abc, d)        1                  3
  3    (ab, cd)        2                  2
  4    (a, bcd)        3                  1
  5    (∅, abcd)       10                 0
       Conflict deal   0                  0

Agent 1 starts at deal 5 and agent 2 at deal 1; with equal risk at every round, both concede (5 → 4, 1 → 2) until they meet at deal 3.
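The whole negotiation on this slide can be simulated with the monotonic concession protocol plus the Zeuthen rule. A sketch: the deal labels and the tie-breaking detail (both concede on equal risk) are assumptions consistent with the slide.

```python
def risk(u_own, u_other):
    # Willingness to risk conflict, given the utility of one's own offer
    # and the utility of the opponent's current offer.
    return 1.0 if u_own <= 0 else (u_own - u_other) / u_own

def zeuthen(prefs1, prefs2, u1, u2):
    """prefs1/prefs2: each agent's deals ordered best-first.
    u1, u2: dicts mapping a deal to that agent's utility.
    Assumes the lists converge before either runs out of deals."""
    a = b = 0
    while True:
        d1, d2 = prefs1[a], prefs2[b]
        if u1[d2] >= u1[d1]:
            return d2            # agent 1 accepts agent 2's offer
        if u2[d1] >= u2[d2]:
            return d1            # agent 2 accepts agent 1's offer
        r1, r2 = risk(u1[d1], u1[d2]), risk(u2[d2], u2[d1])
        if r1 <= r2:
            a += 1               # agent 1 concedes
        if r2 <= r1:
            b += 1               # agent 2 concedes (on a tie, both move)

deals = ["abcd|-", "abc|d", "ab|cd", "a|bcd", "-|abcd"]
u1 = {"abcd|-": 0, "abc|d": 1, "ab|cd": 2, "a|bcd": 3, "-|abcd": 10}
u2 = {"abcd|-": 10, "abc|d": 3, "ab|cd": 2, "a|bcd": 1, "-|abcd": 0}
agreement = zeuthen(deals[::-1], deals, u1, u2)
```

On the slide's utilities the risks are equal in every round, so both agents concede together and the simulation settles on (ab, cd), as the slide title says.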
63
What bothers you about the previous agreement
bull Decide to both get (22) utility rather than the expected utility of (010) for another choice
bull Is there a solution
bull Fair versus higher global utility
bull Restrictions of this method (no promises for future or sharing of utility)
64
Nash Equilibrium
bullThe Zeuthen strategy is in Nash equilibrium under the assumption that when one agent is using the strategy the other can do no better than use it himselfbullGenerally Nash equilibrium is not applicable in negotiation setting because it requires both sides utility function bullIt is of particular interest to the designer of automated agents It does away with any need for secrecy on the part of the programmer since first step reveals true desiresbullAn agentrsquos strategy can be publicly known and no other agent designer can exploit the information by choosing a different strategy In fact it is desirable that the strategy be known to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example: slotted blocks world; blocks cannot go anywhere on the table, only in slots (restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated
66
• Joint plan is used to mean "what they both do", not "what they do together"; it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, and written as Jk
• C(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way
• Don't accept partially completed plans
A state oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents maximize expected utility (will prefer a 51% chance of getting $100 over a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform all tasks, and cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with
• Looking for a state in the intersection of goals
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g. both need a car)
– A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive; unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall) but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent gets a certain plan
• Called a mixed deal: a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then costi(δ) = p·c(J)i + (1−p)·c(J)k, where k is i's opponent: i plays k's role with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Graph: distribution point with unit-cost edges to city a and city b]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (original task: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
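The cost and utility definitions above can be sketched in code. This is a minimal illustration, not the book's implementation; the cost table and task assignments are the two-city example from this slide, and `cost`/`utility` are hypothetical helper names following costi(δ) = p·c(J)i + (1−p)·c(J)k.

```python
from fractions import Fraction

# Sketch: expected cost and utility of a (possibly mixed) deal in a TOD.
# With probability p each agent performs its own role; otherwise roles swap.
def cost(deal, p, agent, c):
    mine, theirs = deal[agent], deal[1 - agent]
    return p * c[mine] + (1 - p) * c[theirs]

def utility(deal, p, agent, standalone, c):
    # utility = cost of achieving the goal alone minus expected cost under the deal
    return c[standalone[agent]] - cost(deal, p, agent, c)

# Two-city domain: c(emptyset)=0, c(a)=1, c(b)=1, c(ab)=3
c = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
standalone = [frozenset("a"), frozenset("ab")]  # agent 1 must serve a; agent 2 both
deal = (frozenset(), frozenset("ab"))           # agent 1 idle, agent 2 does both
print(utility(deal, 1, 0, standalone, c))            # 1  (pure deal)
print(utility(deal, 1, 1, standalone, c))            # 0
print(utility(deal, Fraction(5, 6), 0, standalone, c))  # 1/2, equal at the fair p
```

At p = 5/6 both agents end up with expected utility 1/2, which is the "fair" probability derived on the next slide.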
73
Consider deal 3 with probability
• (∅, ab): p means agent 1 delivers nothing with probability p, and delivers to both a and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p·1 = utility for agent 1
• (1−p)(3) + p·0 = utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p ⇒ 6p = 5 ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
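The linear equation above can be solved generically. A small sketch (the helper name `fair_p` is an assumption, not from the slides) that reproduces p = 5/6:

```python
from fractions import Fraction

# Solve p*u1_p + (1-p)*u1_not == p*u2_p + (1-p)*u2_not for p.
# u1_p is agent 1's utility when the probability-p branch happens, etc.
def fair_p(u1_p, u1_not, u2_p, u2_not):
    den = (u1_p - u1_not) - (u2_p - u2_not)
    if den == 0:
        return None  # the utilities can never be equalized
    return Fraction(u2_not - u1_not, den)

print(fair_p(1, -2, 0, 3))  # 5/6 (deal 3 from this slide)
print(fair_p(0, 0, 2, 2))   # None (the other deal: 0 = 2 has no solution)
```

The `None` case is exactly the failure shown on the next slide, where agent 1's utility is 0 and agent 2's is 2 regardless of p.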
74
Try again with the other choice in the negotiation set
• (a, b): p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p·0 = utility for agent 1
• (1−p)(2) + p·2 = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(TA ∪ TB, ∅); p] such that NS(m) = max over deals d of NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias; like switching goals in soccer.
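The "try dividing 10" exercise can be done by brute force. A quick sketch showing that the product of utilities is maximized at the even split:

```python
# Divide a fixed total utility of 10 between two agents and pick the split
# that maximizes the product of utilities (the fairness criterion above).
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)  # (5, 5)
```

Any unequal split, such as (6, 4) with product 24, scores below (5, 5) with product 25, which is why the product criterion pushes toward fair divisions.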
77
Examples, Cooperative: Each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1. Agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick up costs 1 and a set down costs 1
• Mutually beneficial: each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four, so the utility to each is 2
78
Examples, Compromise: Both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one alone is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) cost split, with agent 1 taking the easy role p of the time
• Utility for agent 1 = p(1) + (1−p)(−3): loses utility if it takes cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus I can take an unfair division and make it fair
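The worth/cost lottery above can be solved in general. A hedged sketch (the helper `fair_p` and its parameter names are assumptions) that reproduces p = 7/8 for worths 3 and 6 with role costs 2 and 6:

```python
from fractions import Fraction

# Agent 1 gets the easy role with probability p; otherwise roles swap.
# u1(p) and u2(p) are linear, so equality gives one linear equation in p.
def fair_p(w1, w2, easy, hard):
    u1 = lambda p: p * (w1 - easy) + (1 - p) * (w1 - hard)
    u2 = lambda p: p * (w2 - hard) + (1 - p) * (w2 - easy)
    den = (u1(1) - u1(0)) - (u2(1) - u2(0))
    return Fraction(u2(0) - u1(0), den)

print(fair_p(3, 6, 2, 6))  # 7/8
```

At p = 7/8 both agents have expected utility 1/2, matching the slide's equation 4p − 3 = 4 − 4p.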
81
Example, conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win. Could flip a coin to decide who wins; better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the person with the highest worth get his way, as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example, semi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, then flip a coin to see who gets his way for the rest
83
Example, semi-cooperative cont.
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
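The arithmetic behind "they double their utility" checks out directly; a tiny verification sketch using the slide's (admittedly invented) numbers:

```python
# Semi-cooperative example: coin-flip lottery versus fully satisfying both.
win = 26 - 4 - 10          # winner: standalone cost saved, minus swap and rest
lose = -4                  # loser still paid for the cooperative swap
coin = (win + lose) / 2    # expected utility under a fair coin
both_satisfied = 26 - 24   # utility if both goals were fully achieved
print(win, lose, coin, both_satisfied)  # 12 -4 4.0 2
```

The lottery's expected utility (4) is double the guaranteed outcome (2), at the cost of a 1/2 chance of not achieving the goal.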
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with highest value
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents
• Note: not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example: multi-agent Tileworld (like an airport shuttle); worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support: a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price·0.60 + quality·0.15 + support·0.25
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price [5-100], quality [0-10], support [1-5]
– Try to find the Pareto optimum
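The weighted-sum scheme above is one line of code. A sketch with hypothetical attribute scores (assumed normalized to [0, 1], higher = better, which the slide does not specify):

```python
# Weighted-attribute utility with the slide's weights: 60% price, 15% quality,
# 25% support. The scores passed in below are illustrative only.
def weighted_utility(price, quality, support):
    return 0.60 * price + 0.15 * quality + 0.25 * support

print(round(weighted_utility(0.5, 0.8, 0.6), 2))  # 0.57
```

The ranking and constraint schemes from the slide would instead sort offers lexicographically or filter them to the feasible box before searching for a Pareto optimum.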
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information (hiding letters; phantom letters)
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y ⊆ T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <): deliveries in opposite directions; doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment.
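The subadditivity condition can be checked by brute force over a cost table. A sketch using two toy two-city delivery tables (an assumption: unit distance to each city, with and without returning home):

```python
from itertools import chain, combinations

def powerset(tasks):
    items = list(tasks)
    return [frozenset(s) for s in
            chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))]

# Subadditive iff c(X u Y) <= c(X) + c(Y) for every pair of task subsets.
def subadditive(c, tasks):
    subs = powerset(tasks)
    return all(c[x | y] <= c[x] + c[y] for x in subs for y in subs)

with_return = {frozenset(): 0, frozenset("a"): 2, frozenset("b"): 2, frozenset("ab"): 4}
no_return = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
print(subadditive(with_return, "ab"))  # True
print(subadditive(no_return, "ab"))    # False: c(ab) = 3 > c(a) + c(b) = 2
```

Note that dropping the return trip breaks subadditivity here: visiting both cities in one tour costs more than the two separate one-way trips combined.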
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• When another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). An arrow's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modularity encourages truth telling more than the other classes
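Modularity is also easy to verify mechanically. A sketch for a fax-like domain where a task is a destination node and each distinct connection costs one unit (an assumption about the cost model; the destinations below are made up). Under that cost, modularity is just inclusion-exclusion on set sizes:

```python
from itertools import chain, combinations

# One unit of cost per established connection (i.e. per distinct destination).
def cost(destinations):
    return len(destinations)

# Modular iff c(X u Y) == c(X) + c(Y) - c(X n Y) for all pairs of subsets.
def modular(universe):
    subs = [frozenset(s) for s in
            chain.from_iterable(combinations(sorted(universe), r)
                                for r in range(len(universe) + 1))]
    return all(cost(x | y) == cost(x) + cost(y) - cost(x & y)
               for x in subs for y in subs)

print(modular({"paris", "rome", "oslo"}))  # True
```

This is the identity |X ∪ Y| = |X| + |Y| − |X ∩ Y|, which is why the fax domain is the modular one on the later slide: connection costs are independent of each other.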
97
For subadditive domain
98
Attributes of task systems: Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
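Concavity can be checked the same brute-force way as the other attributes. A sketch over toy two-city delivery cost tables (an assumption: unit distance to each city, with and without returning home):

```python
from itertools import chain, combinations

def powerset(tasks):
    items = list(tasks)
    return [frozenset(s) for s in
            chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))]

# Concave iff for all X subset of Y and any Z:
# c(X u Z) - c(X) >= c(Y u Z) - c(Y)
def concave(c, tasks):
    subs = powerset(tasks)
    return all(c[x | z] - c[x] >= c[y | z] - c[y]
               for x in subs for y in subs if x <= y for z in subs)

with_return = {frozenset(): 0, frozenset("a"): 2, frozenset("b"): 2, frozenset("ab"): 4}
no_return = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
print(concave(with_return, "ab"))  # True (this table happens to be modular)
print(concave(no_return, "ab"))    # False: Z = {b} adds 1 to {} but 2 to {a}
```

The failing case mirrors the slide's loop example: the extra task is cheap on its own but expensive once combined with the larger set.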
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes: Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization: relationships implied between cells, and within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Ex: if A1 hides a letter to c, his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util: ⟨(abc, ∅); 1/2⟩ = 5
• Lie: p = 1/2 (as the apparent utility is the same)
• Expected util (for 1): ⟨(abc, ∅); 1/2⟩ = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
103
• FP2: in subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out)
• FP3: in concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the apparent benefits are shown below; the apparent benefit is no different from the real benefit.
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks | U(1) | Agent 2's tasks | U(2) seems | U(2) actual
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6
105
Non-incentive compatible fixed points
• FP5: in concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so utility for agent 1 is ⟨(a, b); 1/2⟩ = ½(4) + ½(2) = 3
• Lie: (b, ca) is the logical division, as no probability is needed
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
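The FP6 arithmetic above can be reproduced exactly; a small sketch (the helper name `balance` is an assumption) that solves both balance equations and compares agent 1's real expected utility:

```python
from fractions import Fraction

# Solve p*u1_all + (1-p)*u1_none == p*u2_all + (1-p)*u2_none for p,
# where p is the probability that agent 1 delivers everything.
def balance(u1_all, u1_none, u2_all, u2_none):
    return Fraction(u2_none - u1_none, (u1_all - u1_none) - (u2_all - u2_none))

truth = balance(-1, 6, 4, -3)            # apparent utilities without the decoy
lie = balance(-3, 6, 4, -5)              # apparent utilities with decoy task h
real = lambda p: p * (-1) + (1 - p) * 6  # agent 1's true expected utility
print(truth, lie)              # 9/14 11/18
print(real(truth), real(lie))  # 3/2 31/18
```

The lie shifts p from 9/14 to 11/18, raising agent 1's true expected utility from 3/2 (1.5) to 31/18 (about 1.72).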
107
Postmen, return to post office
[Figures: a concave example, a subadditive example (h is the decoy), and a phantom example]
108
Non-incentive compatible fixed points
• FP7: in modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so the increased load appears to cost more than it really does)
• Ex 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util⟨(fae, bcd); 1/2⟩ = 4 (saves going to two nodes)
• Under the lie, dividing as ⟨(efd, cab); p⟩: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with ⟨(ab, cdef); p⟩:
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually
• (3/4)(6) + (1/4)(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places, which is what I was assigned originally: zero utility for that.
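The FP8 numbers can be checked in a couple of lines. This sketch takes the slide's apparent utilities as given (agent 1: 4 or 0; agent 2: 2 or 6 under the hiding lie), which is my reading of the garbled original:

```python
from fractions import Fraction

# Balance the apparent utilities: p(4) + (1-p)(0) = p(2) + (1-p)(6)
p = Fraction(6 - 0, (4 - 0) - (2 - 6))  # 8p = 6
# The hidden letter makes agent 1's real utility 6 whenever it wins the lottery.
real_utility = p * 6 + (1 - p) * 0
print(p, real_utility)  # 3/4 9/2
```

So the lie yields 9/2 = 4.5 expected utility, beating the 4 that truth-telling gives.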
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER: case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law: cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreed parties
120
Experiments
• Without memory: 30 more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, …
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g. reduced travel costs).
[Figure: worth function for the first agent, rising from 0 at 0900 to 100 at 1600]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs: convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's offer curves converge to a point of acceptance]
123
Utility Graphs: no agreement
• No agreement
[Figure: utility vs. number of negotiation rounds; the curves never meet, and Agent j finds the offer unacceptable]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. Why did the agent pay so much for the car?
• Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• Four modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive; "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: a pair (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that Grounds ⊆ Database, and Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
• Undercuts = u
• Strongly undercuts = su = u − u⁻¹
• Strongly attacks = sa = (u ∪ r) − u⁻¹
• Defeats = d = u ∪ (r − u⁻¹)
• Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Writing x → y indicates:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is true unless proven false
131
Admissible Arguments (mutually defensible)
1. Argument x is attacked by a set of arguments if some member y of the set attacks x (y → x)
2. Argument x is acceptable with respect to a set if every attacker of x is attacked by the set
3. An argument set is conflict free if none of its members attack each other
4. A set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
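The definitions above can be run mechanically on a small graph. The attack relation below is an assumption, reconstructed so that it matches the slide's claims (a and b attack each other and both attack c; c attacks d); the original figure is lost:

```python
from itertools import chain, combinations

attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}
args = {"a", "b", "c", "d"}

def attackers(x):
    return {y for (y, z) in attacks if z == x}

def acceptable(x, s):
    # every attacker of x is itself attacked by some member of s
    return all(any((z, y) in attacks for z in s) for y in attackers(x))

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [set(c) for c in
           chain.from_iterable(combinations(sorted(args), r) for r in range(5))]
print([sorted(s) for s in subsets if admissible(s)])
# [[], ['a'], ['b'], ['a', 'd'], ['b', 'd']]
```

Under this assumed graph, c never appears in an admissible set (it is always attacked), while d can always be defended by whichever of a or b is accepted.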
133
An Example Abstract Argument System
53
Willingness to Risk Conflict
Suppose you have conceded a lot. Then:
– You have lost much of your expected utility (it is closer to zero)
– In case conflict occurs, you are not much worse off
– You are more willing to risk conflict
An agent's willingness to risk conflict is measured by the difference in utility between its loss from making a concession and its loss from taking the conflict deal, with respect to its current offer.
• If both are equally willing to risk conflict, both concede
54
Risk Evaluation
risk_i = (utility agent i loses by conceding and accepting agent j's offer) / (utility agent i loses by not conceding and causing a conflict)
You have to calculate:
• How much you will lose if you make a concession and accept your opponent's offer
• How much you will lose if you stand still, which causes a conflict
risk_i = (Utility_i(δ_i) − Utility_i(δ_j)) / Utility_i(δ_i)
where δ_i and δ_j are the current offers of agent i and agent j, respectively
risk is willingness to risk conflict (1 is perfectly willing to risk)
55
Risk Evaluation
• risk measures the fraction you have left to gain. If it is close to one, you have gained little (and are more willing to risk conflict).
• This assumes you know the other agent's utility
• What one sets as the initial goal affects risk: if I set an impossible goal, my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Diagram: a line from Ai's best deal to Aj's best deal, with the conflict deal below; for each agent, the gap between its current offer and the conflict deal is the maximum it can gain from agreement, and the gap to its best deal is the maximum it still hopes to gain. How much am I willing to risk a conflict?]
57
The Zeuthen Strategy
Q: If I concede, then how much should I concede?
A: Enough to change the balance of risk (who has more to lose), otherwise it will just be your turn to concede again at the next round; but not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability: in Nash equilibrium, if one agent is using the strategy then the other can do no better than using it him/herself
• Disadvantages:
– Computationally expensive: players need to compute the entire negotiation set
– Communication burden: the negotiation process may involve several steps
59
Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
Negotiation Set
(a b)
(b a)
( ab)
First offer
( ab)
(a b)
Agent 1
Agent 2
Utility of agent 1
Utility1(a b) = 0
Utility1(b a) = 0
Utility1( ab)=1
Utility of agent 2
Utility2(a b) =2
Utility2(b a) = 2
Utility2( ab)=0
Risk of conflict
1
1
Can they reach an agreementWho will concede
60
Conflict Deal
He should concede
Agent 1s best deal agent 2s best deal
He should concede
Zeuthen does not reach a settlement as neither will concede as there is no middle ground
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Graph: distribution point connected to a and to d at distance 7; b and c lie between them, at distance 1 from each neighbor]
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal    | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)    | 0                 | 10
2   | (abc, d)     | 1                 | 3
3   | (ab, cd)     | 2                 | 2
4   | (a, bcd)     | 3                 | 1
5   | (∅, abcd)    | 10                | 0
Conflict deal     | 0                 | 0
[Diagram: agent 1 concedes from deal 5 toward deal 3; agent 2 concedes from deal 1 toward deal 3]
63
What bothers you about the previous agreement
• They settle for (2, 2) utility rather than the (0, 10) utility of another choice, even though that choice has higher total utility.
• Is there a better solution?
• Fairness versus higher global utility.
• Restrictions of this method: no promises about the future, no sharing of utility.
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents.
• Example: slotted blocks world. Blocks cannot go just anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) means two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together"; it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in joint plan J are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD coordinated plans are never worse, as you can just do your original task.
• With SOD you may get in each other's way.
• Don't accept partially completed plans.
• A state-oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation.
3. Inter-agent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, at the same cost regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
 – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
 – The goals may contradict, so there is no mutually acceptable state (e.g., both need a car).
 – A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but we may need to know how to pick up another person).
 – A reachable state may satisfy both but be too expensive, so the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent would get a certain plan.
• This is called a mixed deal: a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a mixed deal, then
 cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent: i plays k's role with probability 1−p.
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Graph: distribution point with city a and city b each at distance 1; a and b are 2 apart]
Cost function: c(∅) = 0; c(a) = 1; c(b) = 1; c(ab) = 3
Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with a probability
• ({a,b}, ∅)_p means agent 1 does nothing with probability p, and delivers to both a and b with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p, so p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
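The p = 5/6 above can be checked by solving the equal-expected-utility equation directly. A small sketch (variable names are mine; the utilities are the ones from the table on the previous slide):

```python
from fractions import Fraction

# Mixed deal ({a,b}, none)_p: with probability p agent 1 does nothing (utility 1),
# with probability 1-p it delivers everything (utility -2). Mirrored for agent 2.
u1_none, u1_all = Fraction(1), Fraction(-2)   # agent 1's utilities
u2_none, u2_all = Fraction(0), Fraction(3)    # agent 2's utilities

# Solve p*u1_none + (1-p)*u1_all == p*u2_none + (1-p)*u2_all for p:
p = (u2_all - u1_all) / ((u1_none - u1_all) - (u2_none - u2_all))
print(p)                                 # 5/6
print(p * u1_none + (1 - p) * u1_all)    # 1/2: each agent's expected utility
```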
74
Try again with other choice in negotiation set
• ({a}, {b})_p means agent 1 does a with probability p and b with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair? (Agent 1's utility is 0 either way.)
75
Mixed deal
• An all-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅), p] chosen so that NS(m) = max over deals d of NS(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of the utilities (as that is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias (like switching goals in soccer).
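The suggested exercise can be done in one line: among all integer ways to split a total utility of 10 between two agents, the product is maximized by the equal split. A quick sketch:

```python
# Split a fixed total utility of 10 between two agents and find the split
# that maximizes the product of their utilities.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)   # (5, 5): the product-maximizing division is the equal one
```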
77
Examples: Cooperative (each is helped by the joint plan)
• Slotted blocks world: initially the white block is in slot 1 and the black block in slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent the cost would be four, so the utility to each is 2.
78
Examples: Compromise (both can succeed, but each does worse than if the other agent weren't there)
• Slotted blocks world: initially the white block is in slot 1 and the black block in slot 2, with two gray blocks in slot 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we could use probability to make it "fair".
• Assign (2, 6) p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3): he loses utility if he expends effort 6 for a benefit of 3.
• Utility for agent 2 = p(0) + (1−p)(4).
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p, so p = 7/8.
• Thus an unfair division can be made fair.
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins; that is better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Examplesemi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (again, numbers out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way on the rest.
83
Example semi-cooperative cont
• Winning agent's utility: 26 − 4 − 10 = 12.
• Losing agent's utility: −4 (as it helped with the swap).
• So with probability 1/2 each: (1/2)(12) + (1/2)(−4) = 4.
• If both could have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing"; what matters is how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a joint plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation over a joint plan, schedules, and goal relaxation: we may reach a state that is a little worse than the ultimate objective.
• Example: multi-agent tile world (like an airport shuttle); it isn't just a specific state that counts, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of it, such as price, quality, and support: a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on the other attributes).
88
How can we calculate Utility
• Weighting each attribute:
 – Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute:
 – Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
 – Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
 – Try to find the Pareto optimum
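The first and third approaches above can be sketched as follows. The weights and ranges are the illustrative ones from the slide; the scoring convention (each attribute pre-normalized so that higher is better, so a cheaper price scores higher) is my assumption:

```python
def weighted_utility(price_score, quality_score, support_score):
    """Weighted-attribute utility, assuming each attribute is already scored
    on a common 0-1 scale where higher is better."""
    return 0.60 * price_score + 0.15 * quality_score + 0.25 * support_score

def within_constraints(price, quality, support):
    """Constraint-based filter: an offer is acceptable only inside the given ranges."""
    return 5 <= price <= 100 and 0 <= quality <= 10 and 1 <= support <= 5

# An offer scoring 0.8 on price, 0.6 on quality, 0.9 on support:
print(weighted_utility(0.8, 0.6, 0.9))   # 0.795
```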
89
Incomplete Information
• Don't know the tasks of others in TOD.
• Solution:
 – Exchange missing information
 – Penalty for lying
• Possible lies:
 – False information
  • Hiding letters
  • Phantom letters
 – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets.
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive TOD with equality (= rather than <): deliveries in opposite directions; doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
 – "I need to pick something up at the store." (I can invent something for them to pick up, but if I am the one assigned, I won't bother to make the trip.)
 – "I need to deliver an empty letter." (It does no good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y, and another task set Z is introduced:
 – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know each fact (diagonal arrows go between domains); the rule at an arrow's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth-telling more than the others.
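The three cost-function properties (subadditive, concave, modular) can be checked by brute force over all subsets of a small task set. A sketch, with an assumed per-letter cost table standing in for the fax domain (where costs are independent, so all three properties hold):

```python
from itertools import combinations

def powerset(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_subadditive(tasks, c):
    ps = powerset(tasks)
    return all(c[x | y] <= c[x] + c[y] for x in ps for y in ps)

def is_concave(tasks, c):
    # For X a subset of Y: c(Y u Z) - c(Y) <= c(X u Z) - c(X)
    ps = powerset(tasks)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in ps for y in ps if x <= y for z in ps)

def is_modular(tasks, c):
    ps = powerset(tasks)
    return all(c[x | y] == c[x] + c[y] - c[x & y] for x in ps for y in ps)

# Fax-domain-style costs: each destination has an independent cost,
# and the cost of a set of faxes is just the sum.
per_fax = {"a": 1, "b": 2, "c": 3}
tasks = set(per_fax)
cost = {s: sum(per_fax[t] for t in s) for s in powerset(tasks)}
print(is_modular(tasks, cost), is_concave(tasks, cost), is_subadditive(tasks, cost))
# True True True
```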
97
For subadditive domain
98
Attributes of a task system: Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies sub-additivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange results of queries and sub-queries.
The Fax Domain
• Agents send faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• Agents can exchange messages to be faxed.
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth, p = 1/2: expected utility of ({a,b,c}, ∅)_1/2 is 1.5.
• If he lies, p = 1/2 (as the apparent utility is the same): expected utility (for agent 1) is (1/2)(0) + (1/2)(2) = 1, as he still has to deliver the hidden letter.
[Figure: delivery graph for the example]
103
• FP2: in subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out).
• FP3: in concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (a smaller cost increase is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just as in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks | U(1) | Agent 2's tasks | U(2) (seems) | U(2) (actual)
a               | 2    | bc              | 4            | 4
b               | 4    | ac              | 2            | 2
bc              | 2    | a               | 4            | 2
ab              | 0    | c               | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial.
• Example (next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of ({a,b}, ∅)_1/2 is (1/2)(4) + (1/2)(2) = 3.
• Lie: (bc, a) is the logical division, as there is no percentage split. The utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h too").
• Example 2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to about 1.72. (If A1 delivers, he doesn't actually deliver to h.)
• If he tells the truth, p (probability of agent 1 delivering all) = 9/14, since
 p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9.
• If he invents task h, p = 11/18, since
 p(−3) + (1−p)(6) = p(4) + (1−p)(−5).
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
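Both probabilities (9/14 and 11/18) fall out of the same equal-expected-utility calculation, so they can be verified with one helper. A sketch (the helper's name and argument order are mine):

```python
from fractions import Fraction

def fair_p(u1_if_1_delivers, u1_if_2_delivers, u2_if_1_delivers, u2_if_2_delivers):
    """Probability p that agent 1 delivers everything, chosen so that both agents
    have equal expected utility in the all-or-nothing deal:
    p*u1_if_1_delivers + (1-p)*u1_if_2_delivers
      == p*u2_if_1_delivers + (1-p)*u2_if_2_delivers."""
    num = u2_if_2_delivers - u1_if_2_delivers
    den = (u1_if_1_delivers - u1_if_2_delivers) - (u2_if_1_delivers - u2_if_2_delivers)
    return Fraction(num, den)

print(fair_p(-1, 6, 4, -3))   # 9/14  (truth)
print(fair_p(-3, 6, 4, -5))   # 11/18 (with the decoy letter to h)

# Agent 1's real expected utility, evaluated with the TRUE payoffs (-1 and 6):
for p in (fair_p(-1, 6, 4, -3), fair_p(-3, 6, 4, -5)):
    print(p * -1 + (1 - p) * 6)   # 3/2, then 31/18: the lie raises his utility
```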
107
Postmen ndash return to postoffice
[Figures: postmen-domain graphs (agents return to the post office): a concave example; a subadditive example (h is the decoy); the phantom-letter example]
108
Non incentive compatible fixed points
• FP7: in modular TOD, in any ONM over pure deals, a "hide" lie can be beneficial (you think I have fewer tasks, so an increased load appears to cost more than it really does).
• Example 3 (next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair under the lie.
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a. A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((fae, bcd)_1/2) = 4 (each saves going to two nodes).
• Under the lie, dividing as (ef, dcab)_p: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with (abcd, ef)_p:
 p(4) + (1−p)(0) = p(2) + (1−p)(6), so 4p = −4p + 6 and p = 3/4.
• The utility is actually (3/4)(6) + (1/4)(0) = 4.5.
• Note: when I get assigned c, d, e, f (1/4 of the time), I still have to deliver to node a (after completing my agreed deliveries). So I end up going to 5 places, which is what I was assigned originally: zero utility for that case.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
 – Representing and maintaining belief models
 – Reasoning about other agents' beliefs
 – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain.
• Agents:
 – Company
 – Union
 – Mediator
• Tasks:
 – Generation of proposals
 – Generation of counter-proposals based on feedback from the dissenting party
 – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law: cite previous decisions).
• Process:
 – Retrieve appropriate precedent cases from memory
 – Select the most appropriate case
 – Construct an appropriate solution
 – Evaluate the solution for applicability to the current case
 – Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
 – Minimizes the need for information exchange
 – Avoids problems by reasoning from past failures (intentional reminding)
 – Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
 – Maximizes joint payoff
 – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
 – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
 – Change the importance attached to an issue
 – Change the utility value of an issue
119
Narrowing differences
• Get feedback from the rejecting party:
 – Objectionable issues
 – Reason for rejection
 – Importance attached to issues
• Increase the payoff of the rejecting party by a greater amount than the payoff reduction for the agreeing parties.
120
Experiments
• Without memory: 30% more proposals.
• Without argumentation: fewer proposals and better solutions.
• No failure avoidance: more proposals with objections.
• No preference analysis: oscillatory behavior.
• No feedback: communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, and so on. By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
[Plot: worth function for the first agent, rising from 0 at 9:00 toward 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Graph: utility vs. number of negotiation rounds; agent i's and agent j's offers converge to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable.
[Graph: utility vs. number of negotiation rounds; agent i's and agent j's offer curves never meet]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. (Why did the agent pay so much for the car?)
• Positions cannot be changed. (Initially I wanted a car with a sun roof, but I changed preference during the buying process.)
125
• Four modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet and shows the strength of their feelings.
4. Kisceral: appeals to the intuitive. "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: a pair (Sentence, Grounds), where:
 – Database is a (possibly inconsistent) set of logical formulae
 – Sentence is a logical formula known as the conclusion
 – Grounds is a set of logical formulae such that Grounds ⊆ Database and Sentence can be proved from Grounds
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
 – A attacks B: A →u B or A →r B
 – A defeats B: A →u B, or (A →r B and not B →u A)
 – A strongly attacks B: A →a B and not B →u A
 – A strongly undercuts B: A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments).
• Writing x → y indicates:
 – "argument x attacks argument y"
 – "x is a counterexample of y"
 – "x is an attacker of y"
 where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: an argument is true unless proven false.
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x).
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable with respect to it (any attackers are attacked).
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? In this graph, c is always attacked; d is always acceptable.
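The "in unless proven out" idea from the previous slide can be computed by iterating the acceptability operator from the empty set, which yields the grounded extension. A sketch with an assumed attack graph (the slide's actual figure is not reproduced here, so the edges below are illustrative):

```python
# Assumed attack relation over four arguments: a attacks b; b and d attack c.
attacks = {("a", "b"), ("b", "c"), ("d", "c")}
arguments = {"a", "b", "c", "d"}

def acceptable(x, s):
    """x is acceptable w.r.t. set s if s attacks every attacker of x."""
    attackers = {y for (y, z) in attacks if z == x}
    return all(any((d, y) in attacks for d in s) for y in attackers)

# Grounded extension: least fixpoint of the acceptability operator.
grounded = set()
while True:
    nxt = {x for x in arguments if acceptable(x, grounded)}
    if nxt == grounded:
        break
    grounded = nxt
print(sorted(grounded))   # ['a', 'd']: the unattacked arguments are in; b and c are out
```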
133
An Example Abstract Argument System
54
Risk Evaluation
riski= utility agent i loses by conceding and accepting agent js offer
utility agent 1 loses by not conceding and causing a conflict
You have to calculatebull How much you will lose if you make a concession and
accept your opponents offerbull How much you will lose if you stand still which causes a
conflict
=Utilityi (i )-Utilityi (j )
Utilityi (i )
where i and i are the current offer of agent i and j respectively
risk is willingness to risk conflict (1 is perfectly willing to risk)risk is willingness to risk conflict (1 is perfectly willing to risk)
55
Risk Evaluation
bull risk measures the fraction you have left to gain If it is close to one you have gained little (and are more willing to risk)
bull This assumes you know what others utility is
bull What one sets as initial goal affects risk If I set an impossible goal my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should
concede is to consider how much each has to loose
by running into conflict at that point
Ai best deal Aj best deal
Conflict deal
How much am I willing to risk a conflict
Maximum to gain from agreement
Maximum still hope to gain
57
The Zeuthen Strategy
Q If I concedes then how much should I concede
A Enough to change the balance of risk (who has more to lose) (Otherwise it will just be your turn to concede again at the next round) Not so much that you give up more than you needed to
Q What if both have equal risk
A Both concede
58
About MCP and Zeuthen Strategies
• Advantages:
– Simple, and reflects the way human negotiations work
– Stability: in Nash equilibrium; if one agent is using the strategy, then the other can do no better than use it too
• Disadvantages:
– Computationally expensive: players need to compute the entire negotiation set
– Communication burden: the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)

Negotiation Set: ({a}, {b}), ({b}, {a}), (∅, {a,b})

First offers: Agent 1 proposes (∅, {a,b}); Agent 2 proposes ({a}, {b}).

Utility of agent 1:
Utility1({a}, {b}) = 0
Utility1({b}, {a}) = 0
Utility1(∅, {a,b}) = 1

Utility of agent 2:
Utility2({a}, {b}) = 2
Utility2({b}, {a}) = 2
Utility2(∅, {a,b}) = 0

Risk of conflict: 1 for each agent.

Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: each agent points at the other's best deal, saying "he should concede".]
Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)

[Figure: distribution point with a and d at distance 7, and b and c between them at unit distances.]

Cost function:
c(∅) = 0
c(a) = c(d) = 7
c(b) = c(c) = c(ab) = c(cd) = 8
c(bc) = c(abc) = c(bcd) = 9
c(ad) = c(abd) = c(acd) = c(abcd) = 10

Negotiation Set: ({a,b,c,d}, ∅), ({a,b,c}, {d}), ({a,b}, {c,d}), ({a}, {b,c,d}), (∅, {a,b,c,d})

Conflict Deal: ({a,b,c,d}, {a,b,c,d})

All choices are individually rational, as neither agent can do worse than the conflict deal. ({a,c}, {b,d}) is dominated by ({a,b}, {c,d}).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure Deal | Agent 1's Utility | Agent 2's Utility
1 | ({a,b,c,d}, ∅) | 0 | 10
2 | ({a,b,c}, {d}) | 1 | 3
3 | ({a,b}, {c,d}) | 2 | 2
4 | ({a}, {b,c,d}) | 3 | 1
5 | (∅, {a,b,c,d}) | 10 | 0
Conflict deal | | 0 | 0
[Figure: agent 1 concedes from deal 5 and agent 2 from deal 1, meeting at deal 3.]
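The concession process can be sketched in code. This is a simplified reading of the Zeuthen strategy (one-step concessions rather than "just enough to flip the risk balance"), using the deal numbers and utilities from the table above:

```python
def zeuthen(deals, u1, u2):
    """Run a simplified Zeuthen strategy over a finite negotiation set,
    ordered from agent 1's best deal to agent 2's best deal.  The agent
    with the lower willingness to risk conflict concedes; on a tie,
    both concede.  Concession is by one step, a simplification."""
    def risk(u_own, u_other):
        return 1.0 if u_own == 0 else (u_own - u_other) / u_own

    i, j = 0, len(deals) - 1            # current proposals of agents 1 and 2
    while i < j:
        r1 = risk(u1[deals[i]], u1[deals[j]])
        r2 = risk(u2[deals[j]], u2[deals[i]])
        if r1 <= r2:
            i += 1                      # agent 1 risks less: it concedes
        if r2 <= r1:
            j -= 1                      # agent 2 risks less (or tie): concedes
    return deals[i]

# Pure deals 5..1 from the table, ordered from agent 1's best to agent 2's best.
U1 = {5: 10, 4: 3, 3: 2, 2: 1, 1: 0}
U2 = {5: 0, 4: 1, 3: 2, 2: 3, 1: 10}
print(zeuthen([5, 4, 3, 2, 1], U1, U2))  # 3: equal risk, so both concede twice
```

Both agents start with risk 1, concede together, tie again at risk 2/3, and settle on deal 3 with utilities (2, 2), matching the slide.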
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility rather than the (0, 10) utility of another choice.
• Is there a solution?
• Fairness versus higher global utility.
• Restrictions of this method: no promises for the future, and no sharing of utility.
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder the other agents.
• Example: slotted blocks world. Blocks cannot go just anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) means two workers can get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together": it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD coordinated plans are never worse, as you can just do your original task.
• With an SOD you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car)
– A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (we could both travel together, but we may need to know how to pick up the other)
– There might be a reachable state which satisfies both, but it may be too expensive and we are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit the agents fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain role.
• This is called a mixed deal: a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent: the role i plays with probability (1-p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
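A minimal sketch of these two definitions (the function names are mine; the worked numbers come from the parcel example that follows, where agent 1's standalone cost is 1):

```python
def mixed_deal_cost(p: float, c_own: float, c_opp: float) -> float:
    """cost_i(delta) for delta = (J, p): agent i performs its own role in J
    with probability p and its opponent's role with probability 1 - p."""
    return p * c_own + (1 - p) * c_opp

def mixed_deal_utility(standalone_cost: float, p: float,
                       c_own: float, c_opp: float) -> float:
    """Utility = cost of achieving the goal alone minus expected cost."""
    return standalone_cost - mixed_deal_cost(p, c_own, c_opp)

# Agent 1 alone delivers to a (cost 1).  Under the deal it delivers
# {a,b} (cost 3) or nothing (cost 0), each with probability 1/2.
print(mixed_deal_utility(1, 0.5, 3, 0))  # -0.5
```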
72
Parcel Delivery Domain (assuming they do not have to return home)

[Figure: distribution point with city a and city b, each at distance 1.]

Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally delivers to a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
…

Utility for agent 2 (originally delivers to a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with a probability
• ({a,b}, ∅):p means agent 1 does ∅ with probability p and {a,b} with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
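The same algebra can be packaged as a small solver (a hypothetical helper; the role labels are mine):

```python
def fair_probability(u1_roleA, u1_roleB, u2_roleA, u2_roleB):
    """Solve p*u1_roleA + (1-p)*u1_roleB = p*u2_roleA + (1-p)*u2_roleB.

    Role A is what agent 1 does with probability p (agent 2 takes the
    complementary role).  Raises ValueError when no p can equalize the
    utilities, as in the slide's next example."""
    denom = (u1_roleA - u1_roleB) - (u2_roleA - u2_roleB)
    if denom == 0:
        raise ValueError("no probability equalizes the expected utilities")
    return (u2_roleB - u1_roleB) / denom

# Deal 3: with probability p agent 1 delivers nothing (utility 1 for it,
# 0 for agent 2); otherwise it delivers both (utility -2 for it, 3 for 2).
print(fair_probability(1, -2, 0, 3))  # p = 5/6, about 0.833
```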
74
Try again with the other choice in the negotiation set
• ({a}, {b}):p means agent 1 does {a} with probability p and {b} with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair? (Agent 1 gets utility 0 and agent 2 gets 2 under either role assignment, so no lottery between the two changes the gap.)
75
Mixed deal
• All-or-nothing deal (one agent does everything, with some probability): a mixed deal m = [(T_A ∪ T_B, ∅) ; p] with NS(m) = max NS(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as that is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even when both are equally good, just to avoid possible bias – like switching goals in soccer.
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each picks up at the same time, costing each 2. A win, as neither had to move the other's block out of the way.
• If done by one agent, the cost would be four, so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2, with two gray blocks at slot 3. Agent 1 wants black in 1, but not on the table; agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved: the cost for one is two, while the cost for both averages four.
• If both value it the same way, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve the common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the roles with probability p:
• Utility for agent 1 = p(1) + (1-p)(-3) (it loses utility if it spends 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins; that is better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get its way, as the utility is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest.
83
Example semi-cooperative cont
• Winning agent's utility: 26 - 4 - 10 = 12.
• Losing agent's utility: -4 (as it helped with the swap).
• So with probability ½ each: 12(½) + (-4)(½) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• A worth-oriented domain can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: the set of possible environment states
• Ag: the set of possible agents
• J: the set of possible joint plans
• c: the cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. It may reach a state that is a little worse than the ultimate objective.
• Example: multi-agent Tileworld (like the airport shuttle); it isn't just a specific state that matters, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support: a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: we need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate utility?
• Weighting each attribute:
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute:
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
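A sketch of the weighted-attribute option. The offers, the 60/15/25 weights from the slide, and the normalization are illustrative assumptions: attributes are assumed pre-scaled to [0, 1], with price inverted so that a higher score means cheaper.

```python
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer: dict) -> float:
    """Linear additive utility over normalized attribute scores."""
    return sum(WEIGHTS[attr] * value for attr, value in offer.items())

offers = [
    {"price": 0.9, "quality": 0.4, "support": 0.7},   # cheap, mediocre
    {"price": 0.5, "quality": 0.9, "support": 0.8},   # pricier, high quality
]
best = max(offers, key=utility)
print(utility(best))  # ~0.775: the cheap offer wins under a 60% price weight
```

With a quality constraint such as quality ≥ 0.5, the first offer would be filtered out before scoring, which is the constraint-based variant above.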
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
– Exchange the missing information
– Impose a penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
for finite X, Y ⊆ T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive TOD with equality (= rather than <): deliveries in opposite directions; doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where above a threshold I have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks.
• Examples:
• "I need to pick something up at the store." (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
• "I need to deliver an empty letter." (It does no good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• When another set of tasks Z is introduced:
c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows go between domains), with each rule's beginning being a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on whether it concerns truth (never helps) or lies (sometimes help).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is in turn a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular encourages truth telling more than the others.
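The three cost-function properties (subadditive, concave, modular) can be verified by brute force on small task sets. The two example cost functions below are mine: an additive, fax-like cost, and a threshold cost echoing the electrical-power example.

```python
from itertools import combinations

def subsets(tasks):
    ts = sorted(tasks)
    return [frozenset(c) for r in range(len(ts) + 1)
            for c in combinations(ts, r)]

def is_subadditive(c, tasks):
    # c(X ∪ Y) <= c(X) + c(Y) for all X, Y
    ss = subsets(tasks)
    return all(c[x | y] <= c[x] + c[y] for x in ss for y in ss)

def is_concave(c, tasks):
    # c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X) whenever X ⊆ Y
    ss = subsets(tasks)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in ss for y in ss if x <= y for z in ss)

def is_modular(c, tasks):
    # c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
    ss = subsets(tasks)
    return all(c[x | y] == c[x] + c[y] - c[x & y] for x in ss for y in ss)

# Additive (fax-like) costs: modular, hence concave, hence subadditive.
fax = {s: len(s) for s in subsets("ab")}
print(is_modular(fax, "ab"), is_concave(fax, "ab"), is_subadditive(fax, "ab"))
# Threshold costs (second task triggers new equipment): not even subadditive.
power = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1,
         frozenset("ab"): 7}
print(is_subadditive(power, "ab"))  # False
```

This also makes the implication chain checkable in the modular ⇒ concave ⇒ subadditive direction on any small example.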
97
For subadditive domain
98
Attributes of a task system: Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), where X ⊆ Y.
• The cost that a task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes of a task system: Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as the costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterization: relationships implied between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides a letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2, and the expected utility of [(abc) : 1/2] = 5.
• Under the lie: p = 1/2 (as the apparent utility is the same), and the expected utility (for 1) of [(abc) : 1/2] = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter).
[Figure: delivery map with edge costs 1, 4, 4, 1.]
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent is assigned the phantom delivery, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below (the apparent benefit is no different than the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)

Agent 1's tasks | U(1) | Agent 2's tasks | Seeming U(2) | Actual U(2)
a | 2 | b, c | 4 | 4
b | 4 | a, c | 2 | 2
b, c | 2 | a | 4 | 2
a, b | 0 | c | 6 | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = ½, so the utility for agent 1 is ½(4) + ½(2) = 3.
• Lie: ({b}, {c,a}) is the logical division, as no probability is needed.
• Utility for agent 1 is 6 (original cost) - 2 (cost under the deal) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h).
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72. (If I deliver, I don't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9.
• If he invents task h, p = 11/18, as
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5).
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
107
Postmen – return to post office
[Figure: delivery maps for the concave example (phantom letter) and the subadditive example (h is the decoy).]
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have fewer tasks, so an increased load appears to cost more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• ({e}, {b}): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• ({b}, {e}): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under the truth: Util([({f,a}, {e,b,c,d}) : ½]) = 4 (each saves going to two nodes).
• Under the lie, dividing as [({e,f}, {d,c,a,b}) : p] means you always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with [({a,b}, {c,d,e,f}) : p]:
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5.
• Note: when I get assigned c, d, e, f (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally): zero utility from that.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions, or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law: cite previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory: 30% more proposals.
• Without argumentation: fewer proposals and better solutions.
• No failure avoidance: more proposals with objections.
• No preference analysis: oscillatory condition.
• No feedback: communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).

[Figure: worth function for the first agent, rising from 0 at 9 to 100 at 16.]

Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.

[Figure: utility versus number of negotiation rounds; Agent i's and Agent j's utility curves converge to a point of acceptance.]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.

[Figure: utility versus number of negotiation rounds; Agent i's and Agent j's utility curves never meet.]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. (Why did the agent pay so much for the car?)
• Positions cannot be changed. (Initially I wanted a car with a sun roof, but I changed my preference during the buying process.)
125
• Four modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and you accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet and shows the strength of their feelings.
4. Kisceral: appeals to the intuitive. "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of an argument (rather than the internals of individual arguments).
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all of its attackers are defeated.
• Assumption: an argument is true unless proven false.
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x).
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
[Figure: abstract argument graph over arguments a, b, c, d.]
Which sets of arguments can hold together? c is always attacked; d is always acceptable.
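The definitions above can be tested by brute force. The attack relation below is a hypothetical reading of the slide's figure, chosen so that (as the slide says) c is always attacked and d is always acceptable:

```python
from itertools import chain, combinations

def admissible_sets(args, attacks):
    """All admissible subsets: conflict-free sets whose members are each
    acceptable (every attacker is counter-attacked by the set itself)."""
    def acceptable(x, S):
        return all(any((z, y) in attacks for z in S)
                   for y in args if (y, x) in attacks)
    candidates = chain.from_iterable(combinations(sorted(args), r)
                                     for r in range(len(args) + 1))
    result = []
    for S in map(frozenset, candidates):
        conflict_free = not any((x, y) in attacks for x in S for y in S)
        if conflict_free and all(acceptable(x, S) for x in S):
            result.append(S)
    return result

# Hypothetical attack relation: a -> b, b -> c, d -> c.
attacks = {("a", "b"), ("b", "c"), ("d", "c")}
for s in admissible_sets("abcd", attacks):
    print(sorted(s))  # [], ['a'], ['d'], ['a', 'd']
```

Under this relation c appears in no admissible set (its attacker d is never counter-attacked), while d, being unattacked, appears freely: the mutual-defense idea in definition 4.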
133
An Example Abstract Argument System
55
Risk Evaluation
bull risk measures the fraction you have left to gain If it is close to one you have gained little (and are more willing to risk)
bull This assumes you know what others utility is
bull What one sets as initial goal affects risk If I set an impossible goal my willingness to risk is always higher
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.

[Figure: a line from Ai's best deal to Aj's best deal, with the conflict deal marked; arrows show the maximum to gain from agreement and the maximum one can still hope to gain – how much am I willing to risk a conflict?]
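The willingness-to-risk calculation above can be sketched as a small function. This is a toy illustration (the conflict deal is assumed to have utility 0, and the sample numbers are made up):

```python
def risk(utility_own_offer, utility_their_offer):
    """Zeuthen risk: the fraction of its current potential gain an agent
    would lose by running into conflict (conflict utility assumed 0).

    risk = (U(own offer) - U(their offer)) / U(own offer)

    If an agent gains nothing from its own offer, it has nothing to lose,
    so its willingness to risk conflict is maximal (1)."""
    if utility_own_offer <= 0:
        return 1.0
    return (utility_own_offer - utility_their_offer) / utility_own_offer

# An agent offered nothing by the opponent risks everything it hopes to gain:
print(risk(10, 0))   # 1.0
# An agent already offered most of what it wants has little left to gain:
print(risk(10, 8))   # 0.2
```

The agent with the lower risk value has more to lose from conflict, and so should be the one to concede.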
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose). (Otherwise it will just be your turn to concede again at the next round.) But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
58
About MCP and Zeuthen Strategies
• Advantages
– Simple, and reflects the way human negotiations work
– Stability – in Nash equilibrium: if one agent is using the strategy, then the other can do no better than use it him/herself
• Disadvantages
– Computationally expensive – players need to compute the entire negotiation set
– Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)

First offers: Agent 1 offers (∅, ab); Agent 2 offers (a, b)

Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1

Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0

Risk of conflict: 1 for both agents

Can they reach an agreement? Who will concede?
60
Conflict Deal
[Figure: Agent 1's best deal and Agent 2's best deal sit at opposite ends with the conflict deal between them; from each side, "he should concede" points at the other agent.]
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain Example 2 (don't return to distribution point)

[Figure: a distribution point with a and d at distance 7 on either side, and b and c between them at unit distances.]

Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10

Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)

Conflict deal: (abcd, abcd)

All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal | Agent 1's utility | Agent 2's utility
1 | (abcd, ∅) | 0 | 10
2 | (abc, d) | 1 | 3
3 | (ab, cd) | 2 | 2
4 | (a, bcd) | 3 | 1
5 | (∅, abcd) | 10 | 0
Conflict deal | | 0 | 0

[Agent 1 concedes downward from deal 5, agent 2 upward from deal 1; they meet at deal 3.]
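The concession process over this table can be simulated. This is a toy sketch: the utility pairs are copied from the slide, the conflict utility is 0, and "concede" is simplified to a one-step move toward the opponent:

```python
# Pure deals from the slide, as (agent 1 utility, agent 2 utility),
# ordered from agent 2's favorite (index 0) to agent 1's favorite (index 4).
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(own, other, agent):
    """Willingness to risk conflict for one agent (conflict utility 0)."""
    u_own, u_other = deals[own][agent], deals[other][agent]
    return 1.0 if u_own <= 0 else (u_own - u_other) / u_own

i, j = 4, 0  # agent 1 proposes deal 5, agent 2 proposes deal 1
while i != j:
    r1, r2 = risk(i, j, 0), risk(j, i, 1)
    if r1 <= r2:   # agent 1 has no more to lose than agent 2: concede
        i -= 1
    if r2 <= r1:   # equal risk means both concede
        j += 1

print(deals[i])  # agreement at the middle deal (2, 2)
```

On the first round both risks are 1, so both concede; on the second round both risks are 2/3, so both concede again and meet at deal 3.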
63
What bothers you about the previous agreement
• They both settle for (2, 2) utility rather than a deal such as (0, 10) with higher total utility.
• Is there a solution?
• Fairness versus higher global utility.
• Restrictions of this method: no promises for the future, no sharing of utility.
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium: under the assumption that one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example: slotted blocks world – blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk.
• c(J)k is the cost of k's role in joint plan J.
• In TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD, coordinated plans are never worse, as you can just do your original task.
• With SOD, you may get in each other's way.
• Don't accept partially completed plans.
• The state-oriented domain is a bit more powerful than TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with.
• Looking for a state in the intersection of the goals.
• Possibilities:
– Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
– Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that one agent gets a certain plan.
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)

[Figure: distribution point with city a and city b each at distance 1; a and b are 2 apart.]

Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…

Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• [(ab, ∅) : p] means agent 1 does ∅ with probability p and ab with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⟹ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
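The algebra above can be checked with exact rational arithmetic. A small sketch – the helper `fair_p` and its parameter names are illustrative, not from the original:

```python
from fractions import Fraction

def fair_p(u1_all, u1_none, u2_all, u2_none):
    """Solve (1-p)*u1_all + p*u1_none = (1-p)*u2_all + p*u2_none for p.

    u1_all / u1_none: agent 1's utility when it delivers everything / nothing;
    u2_all / u2_none: agent 2's utility in the same two outcomes."""
    # Rearranged: p * [(u1_none - u1_all) - (u2_none - u2_all)] = u2_all - u1_all
    denom = (u1_none - u1_all) - (u2_none - u2_all)
    return Fraction(u2_all - u1_all, denom)

# Slide values: agent 1 gets -2 delivering ab and +1 doing nothing;
# agent 2 gets 3 when agent 1 delivers ab and 0 otherwise.
p = fair_p(u1_all=-2, u1_none=1, u2_all=3, u2_none=0)
print(p)  # 5/6
```

Using `Fraction` avoids floating-point round-off, so the result matches the hand-derived 5/6 exactly.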
74
Try again with the other choice in the negotiation set
• [(a, b) : p] means agent 1 does a with probability p and b with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅) : p], chosen so that NS(m) = max NS(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use the one which maximizes the product of the utilities (as this is a fairer division). Try dividing a total utility of 10 (zero-sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
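The suggested exercise can be checked directly: splitting a fixed total of 10 between the two agents, the product of the utilities peaks at the even split.

```python
# All integer ways to split a total utility of 10 between two agents.
splits = [(u, 10 - u) for u in range(11)]

# The product-maximizing split is the fairest one.
best = max(splits, key=lambda s: s[0] * s[1])
print(best)               # (5, 5)
print(best[0] * best[1])  # 25
```

Lopsided splits such as (9, 1) give a product of only 9, which is why the product-of-utilities criterion favors even divisions.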
77
Examples: Cooperative. Each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four, so the utility to each is 2.
78
Examples: Compromise. Both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for doing both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other agent's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) division p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3): he loses utility if he takes cost 6 for benefit 3.
• Utility for agent 2 = p(0) + (1−p)(4).
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So with probability ½ each: ½(12) + ½(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• The agent's goal is to bring about the state of the environment with the highest worth.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle) – it isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute
– Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price [5, 100], quality [0, 10], support [1, 5]
– Try to find the Pareto optimum
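The weighted-attribute option above can be sketched as a function. The normalization scheme is an assumption of this sketch (the slide gives only the weights and ranges): each attribute is scaled to [0, 1], and the price score is inverted since lower prices are better.

```python
def utility(price, quality, support):
    """Hypothetical weighted-attribute utility using the slide's weights
    (60% price, 15% quality, 25% support) and constraint ranges."""
    price_score = 1 - (price - 5) / (100 - 5)   # price in [5, 100]; lower is better
    quality_score = quality / 10                # quality in [0, 10]
    support_score = (support - 1) / (5 - 1)     # support in [1, 5]
    return 0.60 * price_score + 0.15 * quality_score + 0.25 * support_score

# A cheap, high-quality, well-supported offer scores the maximum:
print(utility(price=5, quality=10, support=5))   # 1.0
# The worst offer on every attribute scores the minimum:
print(utility(price=100, quality=0, support=1))  # 0.0
```

Without some normalization like this, the raw attribute scales would dominate the weights, which is the usual pitfall of weighted-sum scoring.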
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
– Exchange the missing information
– Penalty for a lie
• Possible lies:
– False information
  • Hiding letters
  • Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost.
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive: delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive with equality (= rather than <): deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at the store. (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip.)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie).
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another task set Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth-telling more than the others.
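The three cost-function properties (subadditive ⊇ concave ⊇ modular) can be checked by brute force on a small task set. A sketch with made-up cost functions – the fax-like cost (one unit per distinct destination) is modular, while a cost with a fixed startup charge is concave but not modular:

```python
from itertools import chain, combinations

def powerset(tasks):
    s = list(tasks)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def is_subadditive(c, tasks):
    # c(X ∪ Y) <= c(X) + c(Y) for all finite X, Y
    return all(c(x | y) <= c(x) + c(y)
               for x in powerset(tasks) for y in powerset(tasks))

def is_concave(c, tasks):
    # For all X ⊆ Y and any Z: c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
    return all(c(y | z) - c(y) <= c(x | z) - c(x)
               for y in powerset(tasks) for x in powerset(y)
               for z in powerset(tasks))

def is_modular(c, tasks):
    # c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
    return all(c(x | y) == c(x) + c(y) - c(x & y)
               for x in powerset(tasks) for y in powerset(tasks))

tasks = {'a', 'b', 'c'}

fax_cost = len  # fax-domain-like: one unit per distinct destination
print(is_modular(fax_cost, tasks), is_concave(fax_cost, tasks),
      is_subadditive(fax_cost, tasks))  # True True True

# A fixed startup charge breaks modularity but preserves concavity.
startup = lambda x: 0 if not x else 1 + len(x)
print(is_modular(startup, tasks), is_concave(startup, tasks))  # False True
```

This mirrors the containment chain on the slides: every modular domain passes the concavity check, and every concave domain passes the subadditivity check.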
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
• Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it had to go around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.

The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterization of relationships (implied relationships between cells; implied relationships within the same domain attribute)
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2
• Expected utility: [(abc, ∅) : 1/2] = 5
• Under the lie: p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1): [(abc, ∅) : 1/2] = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph with edge costs 1, 4, 4, 1.]
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)

Agent 1's tasks | U(1) | Agent 2's tasks | U(2) seems | U(2) actual
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is [(a, b) : 1/2] = ½(4) + ½(2) = 3.
• Lie: (bc, a) is the logical division, as no probability is needed. The utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability. (If you deliver, I make you deliver to h.)
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to about 1.72. (If I deliver, I don't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as:
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⟹ 14p = 9
• If he invents task h, p = 11/18, as:
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
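The two equal-utility equations above can be verified mechanically. A sketch using exact rationals – the helper `solve_equal` and its parameter names are illustrative:

```python
from fractions import Fraction

def solve_equal(u1_if_a1, u1_if_a2, u2_if_a1, u2_if_a2):
    """Solve p*u1_if_a1 + (1-p)*u1_if_a2 = p*u2_if_a1 + (1-p)*u2_if_a2
    for p, where p is the probability that agent 1 delivers everything."""
    denom = (u1_if_a1 - u1_if_a2) - (u2_if_a1 - u2_if_a2)
    return Fraction(u2_if_a2 - u1_if_a2, denom)

# Truth: p(-1) + (1-p)(6) = p(4) + (1-p)(-3)
p_truth = solve_equal(-1, 6, 4, -3)
# Decoy lie about task h: p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
p_lie = solve_equal(-3, 6, 4, -5)

# Agent 1's *real* utility in both cases is p(-1) + (1-p)(6),
# since under the lie he never actually delivers to h.
u_truth = p_truth * -1 + (1 - p_truth) * 6
u_lie = p_lie * -1 + (1 - p_lie) * 6
print(p_truth, u_truth)  # 9/14 3/2
print(p_lie, u_lie)      # 11/18 31/18
```

The decoy shifts the agreed probability from 9/14 to 11/18, raising agent 1's real expected utility from 3/2 to 31/18: the lie pays off exactly as the slide claims.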
107
Postmen – return to the post office
[Figures: a concave example graph; a subadditive example graph (h is the decoy); the phantom letter is marked.]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost me more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util([(fae, bcd) : 1/2]) = 4 (each saves going to two nodes).
• Under the lie, dividing as [(ef, dcab) : p] (you always win and I always lose) cannot work: since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with [(abcdef, ∅) : p]:
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned cdef (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (like in a court of law – citing previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused: reduces computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory condition.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, …
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rises from 0 at 0900 to 100 at 1600, with axis marks at 9, 12, and 16.]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Figure: utility versus number of negotiation rounds – Agent i's and Agent j's utility curves converge over time to the point of acceptance.]
123
Utility Graphs - no agreement
• No agreement – Agent j finds the offer unacceptable.
[Figure: utility versus number of negotiation rounds – Agent i's and Agent j's utility curves never meet, so there is no agreement.]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings.
4. Kisceral – appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula, known as the conclusion
– Grounds is a set of logical formulae such that:
  1. Grounds ⊆ Database, and
  2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B ≡ A u B or A r B
– A defeats B ≡ A u B or (A r B and not B u A)
– A strongly attacks B ≡ A a B and not B u A
– A strongly undercuts B ≡ A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption – true unless proven false.
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x).
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
[Figure: argument graph over arguments a, b, c, d.]
Which sets of arguments can be true? c is always attacked;
d is always acceptable.
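The admissibility definitions can be checked by enumeration. The slide's figure did not survive conversion, so the attack relation below (a and b attack each other, both attack c, and c attacks d) is an assumed reconstruction, chosen to be consistent with "c is always attacked" and "d is always acceptable":

```python
from itertools import chain, combinations

# Assumed attack graph (the original figure was lost):
# a <-> b, a -> c, b -> c, c -> d
args = {'a', 'b', 'c', 'd'}
attacks = {('a', 'b'), ('b', 'a'), ('a', 'c'), ('b', 'c'), ('c', 'd')}

def conflict_free(s):
    """No two members of s attack each other."""
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is itself attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for y in args if (y, x) in attacks)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [frozenset(c) for c in chain.from_iterable(
    combinations(sorted(args), r) for r in range(len(args) + 1))]
adm = [set(s) for s in subsets if admissible(s)]
print(adm)  # the admissible sets: ∅, {a}, {b}, {a,d}, {b,d}
```

Under these assumed edges, c appears in no admissible set (it is always attacked), while d appears whenever a or b is present to defend it – matching the slide's two observations.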
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
56
The Risk Factor
One way to think about which agent should concede is to consider how much each has to lose by running into conflict at that point.
[Diagram: the line of deals running from Ai's best deal to Aj's best deal, with the conflict deal marked; spans labeled "maximum to gain from agreement" and "maximum still hope to gain"]
How much am I willing to risk a conflict?
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Enough to change the balance of risk (who has more to lose); otherwise it will just be your turn to concede again at the next round. But not so much that you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
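The concession rule can be made concrete. Below is a minimal sketch (the function names and four-argument interface are mine, not from the slides) of the Zeuthen risk computation: each agent's willingness to risk conflict is the fraction of its current gain it would give up by accepting the opponent's offer, and the agent with the smaller risk concedes.

```python
from fractions import Fraction

def risk(util_own_offer, util_other_offer):
    # Willingness to risk conflict: the fraction of its current gain an
    # agent loses by accepting the opponent's offer instead of its own.
    # Conventionally 1 when the agent's own offer is worth nothing to it.
    if util_own_offer == 0:
        return Fraction(1)
    return Fraction(util_own_offer - util_other_offer, util_own_offer)

def who_concedes(u1_own, u1_other, u2_own, u2_other):
    # The agent with the smaller risk concedes; on a tie, both concede.
    r1 = risk(u1_own, u1_other)
    r2 = risk(u2_own, u2_other)
    if r1 == r2:
        return "both"
    return "agent 1" if r1 < r2 else "agent 2"
```

With the utilities from the parcel delivery example a few slides ahead (agent 1's offer gives it 1 and gives agent 2 nothing; agent 2's offer gives it 2 and gives agent 1 nothing), both risks equal 1, so both must concede.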
58
About MCP and Zeuthen Strategies
• Advantages
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium: if one agent is using the strategy, the other can do no better than use it him/herself
• Disadvantages
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 must deliver to a; agent 2 to a and b)
Negotiation set: (a, b), (b, a), (∅, ab)
First offers: agent 1 proposes (∅, ab); agent 2 proposes (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
[Diagram: agent 1's best deal and agent 2's best deal, each labeled "he should concede", with the conflict deal between them]
Zeuthen does not reach a settlement here: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to the distribution point)
[Figure: nodes a, b, c, d in a line with unit edges between them; the distribution point connects to a and to d at cost 7 each]
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than alone; e.g., (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No. | Pure deal     | Agent 1's utility | Agent 2's utility
1   | (abcd, ∅)     | 0                 | 10
2   | (abc, d)      | 1                 | 3
3   | (ab, cd)      | 2                 | 2
4   | (a, bcd)      | 3                 | 1
5   | (∅, abcd)     | 10                | 0
    | Conflict deal | 0                 | 0
[Diagram: agent 1 concedes from deal 5 toward deal 1; agent 2 concedes from deal 1 toward deal 5]
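Run over the table above, the monotonic concession protocol with the Zeuthen rule can be sketched as follows. This is a simplified model (names and the one-deal-per-concession step are my own assumptions): each agent starts at its own best pure deal and the agent with lower risk moves one deal along the list.

```python
from fractions import Fraction

# (utility to agent 1, utility to agent 2) for pure deals 1..5 above
deals = [(0, 10), (1, 3), (2, 2), (3, 1), (10, 0)]

def risk(own, other, me):
    # Zeuthen risk of the agent whose current offer is `own`, facing `other`;
    # `me` selects that agent's utility component.
    u_own, u_other = own[me], other[me]
    return Fraction(1) if u_own == 0 else Fraction(u_own - u_other, u_own)

def zeuthen(deals):
    i, j = len(deals) - 1, 0      # each agent starts at its own best deal
    while i > j:
        r1 = risk(deals[i], deals[j], 0)
        r2 = risk(deals[j], deals[i], 1)
        if r1 <= r2:
            i -= 1                # agent 1 concedes (both concede on a tie)
        if r2 <= r1:
            j += 1                # agent 2 concedes
    return deals[i]
```

On this table both agents face equal risk at every round, so both concede twice and meet at deal 3 with utilities (2, 2), matching the slide title.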
63
What bothers you about the previous agreement
• They decide to both get (2, 2) utility, rather than the expected utility of (0, 10) for another choice.
• Is there a solution?
• Fair versus higher global utility.
• Restrictions of this method: no promises for the future, and no sharing of utility.
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings, because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of the plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk.
• c(J)k is the cost of k's role in joint plan J.
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD, coordinated plans are never worse, as you can just do your original task.
• With an SOD, you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all can perform all tasks, and the cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with.
• Looking for a state in the intersection of the goals.
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
  – The goals may contradict, so there is no mutually acceptable state (e.g., both need a car).
  – A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but may need to know how to pick up another).
  – There might be a reachable state which satisfies both, but it may be too expensive – neither is willing to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices donrsquot benefit others fairly
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that one agent gets a certain role.
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
  cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• Postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Figure: distribution point at cost 1 from city a and cost 1 from city b, with cost 2 between the cities]
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (original task: a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (original task: ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
73
Consider deal 3 with probability
• (∅, ab); p means agent 1 delivers ∅ with probability p and ab with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p  ⇒  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
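The algebra above generalizes: both expected utilities are linear in p, so the fair p (if any) solves a single linear equation. A small sketch (the function name and argument convention are mine), using exact fractions:

```python
from fractions import Fraction

def fair_probability(u1_at_p1, u1_at_p0, u2_at_p1, u2_at_p0):
    # U1(p) = p*u1_at_p1 + (1-p)*u1_at_p0
    # U2(p) = p*u2_at_p1 + (1-p)*u2_at_p0
    # Solve U1(p) = U2(p); return None when no single p in [0, 1] works.
    denom = (u1_at_p1 - u1_at_p0) - (u2_at_p1 - u2_at_p0)
    if denom == 0:
        return None  # the two lines are parallel: no crossing point
    p = Fraction(u2_at_p0 - u1_at_p0, denom)
    return p if 0 <= p <= 1 else None
```

On this slide's deal, `fair_probability(1, -2, 0, 3)` gives 5/6; on the next slide's deal, where both utilities are constant, it returns None.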
74
Try again with other choice in negotiation set
• (a, b); p means agent 1 delivers a with probability p and b with probability (1−p).
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this deal fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(T_A ∪ T_B, ∅); p] such that NS(m) = max_d NS(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
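The "try dividing a total of 10" exercise can be checked directly. This sketch enumerates the integer splits of a fixed total and confirms that the product of utilities is maximized by the equal split:

```python
def best_split(total):
    # All integer divisions (x, total - x) of a fixed total utility,
    # ranked by the product of the two agents' utilities.
    return max(((x, total - x) for x in range(total + 1)),
               key=lambda split: split[0] * split[1])
```

`best_split(10)` returns (5, 5): the product x·(10−x) peaks at the even division, which is why the product-maximizing rule reads as the fairer one.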
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pickup costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) cost division, with agent 1 paying 2, p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3): it loses utility if it takes cost 6 for benefit 3.
• Utility for agent 2 = p(0) + (1−p)(4).
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p, so p = 7/8.
• Thus we can take an unfair division and make it fair.
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers picked for illustration).
• A cooperative swap costs 4 (again, an illustrative number).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, continued
• Winning agent utility: 26 − 4 − 10 = 12.
• Losing agent utility: −4 (as he helped with the swap).
• So, with probability 1/2 each: 12(1/2) + (−4)(1/2) = 4.
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain: Definition
Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation over a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle): what counts isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
  – Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute:
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  – Price [5, 100], quality [0, 10], support [1, 5]
  – Try to find the Pareto optimum
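The weighting scheme above is just a linear combination. A sketch using the slide's weights (the offer's attribute scores below are made-up values on a 0–1 scale, for illustration only):

```python
def weighted_utility(scores, weights):
    # Linear multi-attribute utility; weights are fractions summing to 1.
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[attr] * weights[attr] for attr in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer = {"price": 0.5, "quality": 0.8, "support": 0.6}  # hypothetical scores
```

With these numbers, `weighted_utility(offer, weights)` is 0.57; comparing such scalar utilities across offers is what lets an agent rank multi-attribute proposals.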
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
  – Exchange the missing information
  – Penalty for a lie
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost:
  for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of (strictly) subadditive: delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive with equality (= rather than <): deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  • Need to pick something up at the store. (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
  • Need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive Compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
  c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• The arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the beginning of an arrow is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) versus lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is in turn a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modularity encourages truth telling more than the other attributes.
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No – unless restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as the costs are independent).
• Modularity implies concavity.
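These attributes are easy to check by brute force on small examples. The sketch below (helper names are mine) verifies them for a toy fax-like cost function, where each task is a destination and the cost is the number of distinct connections – which is modular, and therefore concave and subadditive:

```python
from itertools import combinations

def subsets(tasks):
    # All subsets of a finite task set, as frozensets.
    ts = list(tasks)
    return [frozenset(c) for r in range(len(ts) + 1)
            for c in combinations(ts, r)]

def is_subadditive(cost, tasks):
    # c(X ∪ Y) <= c(X) + c(Y) for all X, Y
    return all(cost(X | Y) <= cost(X) + cost(Y)
               for X in subsets(tasks) for Y in subsets(tasks))

def is_concave(cost, tasks):
    # For X ⊆ Y: c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
    return all(cost(Y | Z) - cost(Y) <= cost(X | Z) - cost(X)
               for X in subsets(tasks) for Y in subsets(tasks) if X <= Y
               for Z in subsets(tasks))

def is_modular(cost, tasks):
    # c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y) for all X, Y
    return all(cost(X | Y) == cost(X) + cost(Y) - cost(X & Y)
               for X in subsets(tasks) for Y in subsets(tasks))

fax_tasks = frozenset({"a", "b", "c"})  # one connection per destination
fax_cost = len                          # cost = number of connections
```

All three checks pass for `fax_cost`, illustrating the implication chain modular ⇒ concave ⇒ subadditive on this example.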
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships within the same domain attribute.
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides the letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2, expected utility of [(abc, ∅); 1/2] = 5.
• Under the lie: p = 1/2 (as the apparent utility is the same); expected utility (for agent 1) of [(abc, ∅); 1/2] = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter).
[Figure: delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out).
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit):

Agent 1 takes | U(1) | Agent 2 takes | U(2) (apparent) | U(2) (actual)
a             | 2    | bc            | 4               | 4
b             | 4    | ac            | 2               | 2
bc            | 2    | a             | 4               | 2
ab            | 0    | c             | 6               | 6

Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just as in this case. The lie makes no difference.
(We assume there is some way of deciding who gets the better deal that is fair over time.)
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of [(a, b); 1/2] = 1/2(4) + 1/2(2) = 3.
• Lie: (bc, a) is the logical division, as there is no percentage split.
• The utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability. ("If you deliver, I make you deliver to h too.")
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to about 1.72. (If A1 delivers, he doesn't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as
  p(−1) + (1−p)(6) = p(4) + (1−p)(−3), i.e., 14p = 9.
• If he invents task h, p = 11/18, as
  p(−3) + (1−p)(6) = p(4) + (1−p)(−5), i.e., 18p = 11.
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
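The probabilities above can be verified with exact arithmetic. This sketch solves the linear equation p·a + (1−p)·b = p·c + (1−p)·d for both the truthful and the decoy encounter:

```python
from fractions import Fraction

def solve_p(a, b, c, d):
    # p*a + (1-p)*b = p*c + (1-p)*d  =>  p = (d - b) / ((a - b) - (c - d))
    return Fraction(d - b, (a - b) - (c - d))

p_truth = solve_p(-1, 6, 4, -3)             # truthful encounter: 9/14
p_decoy = solve_p(-3, 6, 4, -5)             # with the decoy letter: 11/18
u_truth = p_truth * -1 + (1 - p_truth) * 6  # 21/14 = 1.5
u_decoy = p_decoy * -1 + (1 - p_decoy) * 6  # 31/18 ≈ 1.72
```

The lie strictly increases agent 1's expected utility, which is exactly the FP6 claim.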
107
Postmen – return to the post office
[Figures: the concave example, the subadditive example (h is the decoy), and the phantom example referenced by FP5 and FP6]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util([(fae, bcd); 1/2]) = 4 (each saves going to two nodes).
• Under the lie, dividing as [(ef, dcab); p] cannot help: you always win and I always lose, and since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with [(ab, cdef); p]:
  p(4) + (1−p)(0) = p(2) + (1−p)(6)
  4p = −4p + 6, so p = 3/4.
• The utility is actually 3/4(6) + 1/4(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time), I still have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
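The FP8 arithmetic checks out the same way, again with exact fractions:

```python
from fractions import Fraction

# Solve p(4) + (1-p)(0) = p(2) + (1-p)(6) for the mixed deal under the lie
p = Fraction(6 - 0, (4 - 0) - (2 - 6))  # 3/4
lie_utility = p * 6 + (1 - p) * 0       # what the lying agent really gets
truth_utility = Fraction(4)             # utility of telling the truth
```

`lie_utility` comes out to 9/2 = 4.5, strictly better than the truthful 4, so hiding the letter pays off.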
110
[Figure: the modular domain example for FP7/FP8]
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: a negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions).
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures: intentional reminding
  – Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• Without failure avoidance – more proposals with objections.
• Without preference analysis – oscillatory condition.
• Without feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rising from 0 at 0900 to 100 at 1600]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Graph: utility vs. number of negotiation rounds; agent i's and agent j's offers converge at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable.
[Graph: utility vs. number of negotiation rounds; agent i's and agent j's offer curves never meet]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  • Positions cannot be justified – why did the agent pay so much for the car?
  • Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings.
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic-Based Argumentation
• Basic form of an argument: a pair (Sentence, Grounds), relative to a Database, where:
  – Database is a (possibly inconsistent) set of logical formulae;
  – Sentence is a logical formula, known as the conclusion;
  – Grounds is a set of logical formulae such that Grounds ⊆ Database, and Sentence can be proved from Grounds.
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B = A undercuts B or A rebuts B
  – A defeats B = A undercuts B, or (A rebuts B and B does not undercut A)
  – A strongly attacks B = A attacks B and B does not undercut A
  – A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments).
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated.
• Assumption: an argument is true unless proven false.
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x).
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can hold? c is always attacked;
d is always acceptable.
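The definitions above can be checked mechanically. The attack relation below is hypothetical (the slide's exact graph over a, b, c, d is not recoverable from the text), chosen so that c is always attacked and d is always acceptable, matching the slide's claims:

```python
# Hypothetical attack relation: a and b attack each other; b and d attack c.
attacks = {("a", "b"), ("b", "a"), ("b", "c"), ("d", "c")}

def conflict_free(S):
    # No member of S attacks another member of S.
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    # Every attacker of x is itself attacked by some member of S.
    return all(any((z, attacker) in attacks for z in S)
               for (attacker, target) in attacks if target == x)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)
```

Here {d} and {a, d} are admissible, while no admissible set contains c: both of c's attackers (b, d) would have to be attacked, but nothing attacks d.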
133
An Example Abstract Argument System
57
The Zeuthen Strategy
Q: If I concede, how much should I concede?
A: Just enough to change the balance of risk, so the other agent (who then has more to lose) must concede at the next round. Concede less and it will just be your turn to concede again; concede more and you give up more than you needed to.
Q: What if both have equal risk?
A: Both concede.
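The risk comparison behind the strategy can be sketched in a few lines (a toy illustration, not from the slides; the conflict deal is assumed to give utility 0):

```python
def risk(u_own_deal, u_their_deal):
    """Zeuthen risk: fraction of utility an agent loses by accepting the
    opponent's offer rather than holding out (conflict utility = 0)."""
    if u_own_deal == 0:
        return 1.0  # nothing to lose, so maximally willing to risk conflict
    return (u_own_deal - u_their_deal) / u_own_deal

def who_concedes(u1_own, u1_other, u2_own, u2_other):
    """The agent with the smaller risk concedes; equal risks mean both do."""
    r1, r2 = risk(u1_own, u1_other), risk(u2_own, u2_other)
    if r1 == r2:
        return "both"
    return "agent 1" if r1 < r2 else "agent 2"

# Slide 59's numbers: each agent gets 0 from the other's proposal,
# so both risks are 1 and both must concede.
print(who_concedes(1, 0, 2, 0))  # both
```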
58
About MCP and Zeuthen Strategies
• Advantages
– Simple, and reflects the way human negotiations work
– Stability: in Nash equilibrium, if one agent is using the strategy then the other can do no better than use it him/herself
• Disadvantages
– Computationally expensive: players need to compute the entire negotiation set
– Communication burden: the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 must deliver to a; agent 2 must deliver to a and b)
Negotiation Set: (a, b), (b, a), (∅, ab)
First offers: Agent 1 proposes (∅, ab); Agent 2 proposes (a, b)
Utility of agent 1:
Utility1(a, b) = 0
Utility1(b, a) = 0
Utility1(∅, ab) = 1
Utility of agent 2:
Utility2(a, b) = 2
Utility2(b, a) = 2
Utility2(∅, ab) = 0
Risk of conflict: 1 for each agent
Can they reach an agreement? Who will concede?
60
Conflict Deal
From agent 1's best deal, agent 2 should concede; from agent 2's best deal, agent 1 should concede. Zeuthen does not reach a settlement: neither will concede, as there is no middle ground.
61
Parcel Delivery Domain Example 2 (don't return to distribution point)
(Figure: the distribution point connects to a and d at cost 7 each; b and c lie between them along unit-cost edges.)
Cost function: c(∅) = 0; c(a) = c(d) = 7; c(b) = c(c) = c(ab) = c(cd) = 8; c(bc) = c(abc) = c(bcd) = 9; c(ad) = c(abd) = c(acd) = c(abcd) = 10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)

No.  Pure Deal    Agent 1's Utility  Agent 2's Utility
1    (abcd, ∅)    0                  10
2    (abc, d)     1                  3
3    (ab, cd)     2                  2
4    (a, bcd)     3                  1
5    (∅, abcd)    10                 0
     Conflict deal 0                 0

Agent 1 starts at deal 5 and agent 2 at deal 1; each concedes toward deal 3 (5 → 4 → 3 and 1 → 2 → 3).
63
What bothers you about the previous agreement?
• They both get (2, 2) utility rather than, say, (0, 10) for another choice.
• Is there a solution?
• Fair versus higher global utility.
• Restrictions of this method (no promises for the future, no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules so the agents help and do not hinder each other.
• Example: slotted blocks world. Blocks cannot go just anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together"; it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk.
• c(J)k is the cost of k's role in joint plan J.
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD coordinated plans are never worse, as you can always just do your original task.
• With an SOD you may get in each other's way.
• Don't accept partially completed plans.
• A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents maximize expected utility (they prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit itself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We look for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– The goals may contradict, so there is no mutually acceptable state (e.g., both need the car).
– A common state exists, but perhaps it cannot be reached with the primitive operations of the domain (both could travel together, but neither knows how to pick up the other).
– There may be a reachable state which satisfies both, but it is too expensive and the agents are unwilling to expend the effort (we could save a bit by car-pooling, but it is too complicated for so little gain).
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents.
• State 1 costs 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation? (Why should one agent agree to do more?)
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent gets a certain role in the plan.
• This is called a mixed deal: a deal with a probability attached. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then costi(δ) = p·c(J)i + (1 − p)·c(J)k, where k is i's opponent (the role i plays with probability 1 − p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)

(Figure: distribution point with city a and city b, each at distance 1 from it and 2 from each other.)
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• A mixed version of deal 3: agent 1 does ∅ with probability p and ab with probability 1 − p (agent 2 does the complement).
• What should p be to be fair to both (equal utility)?
• (1 − p)(−2) + p(1) = utility for agent 1
• (1 − p)(3) + p(0) = utility for agent 2
• Setting them equal: −2 + 2p + p = 3 − 3p, so p = 5/6.
• If agent 1 does no deliveries 5/6 of the time, it is fair.
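The same computation in code (a sketch; the four inputs are each agent's utility under the two role splits, taken from the slide):

```python
from fractions import Fraction

def fair_p(u1_roleA, u1_roleB, u2_roleA, u2_roleB):
    """p such that choosing role split A with probability p (else B)
    gives both agents the same expected utility; None if impossible."""
    denom = (u1_roleA - u1_roleB) - (u2_roleA - u2_roleB)
    if denom == 0:
        return None  # the expected utilities can never be equalized
    p = Fraction(u2_roleB - u1_roleB, denom)
    return p if 0 <= p <= 1 else None

# Split A: agent 1 delivers nothing (u1 = 1, u2 = 0);
# split B: agent 1 delivers everything (u1 = -2, u2 = 3).
print(fair_p(1, -2, 0, 3))  # 5/6
```

The degenerate case on the next slide, where both splits give the agents the same utilities, makes the denominator zero and returns None.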
74
Try again with other choice in negotiation set
• A mixed deal over (a, b) and (b, a): agent 1 does a with probability p and b with probability 1 − p.
• What should p be to be fair to both (equal utility)?
• (1 − p)(0) + p(0) = utility for agent 1
• (1 − p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair? (Agent 1 gets 0 and agent 2 gets 2 under either assignment, so no probability changes the split.)
75
Mixed deal
• All-or-nothing deal: one agent does everything, i.e., a mixed deal of the form m = [(T_A ∪ T_B, ∅); p], with p chosen so that NS(m) = max over deals d of NS(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias, like switching goals in soccer.
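A quick check of the "maximize the product" suggestion above (toy numbers, not from the deck):

```python
# Dividing a total utility of 10 between two agents: the product u*(10-u)
# is maximized by the equal split, which is why the product criterion
# favors fair divisions.
splits = {u: u * (10 - u) for u in range(11)}
best = max(splits, key=splits.get)
print(best, splits[best])  # 5 25
```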
77
Examples: Cooperative. Each is helped by the joint plan.
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up at the same time, costing each agent 2. A win, as neither had to move the other block out of the way.
• If done by one agent the cost would be 4, so the utility to each is 2.
78
Examples: Compromise. Both can succeed, but worse for both than if the other agent weren't there.
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is 2; the cost for both averages 4.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other agent's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we can use probability to make it "fair".
• Assign agent 1 the cost-2 role (and agent 2 the cost-6 role) with probability p.
• Utility for agent 1 = p(1) + (1 − p)(−3): he loses utility if he pays cost 6 for a benefit of 3.
• Utility for agent 2 = p(0) + (1 − p)(4).
• Setting the utilities equal: 4p − 3 = 4 − 4p, so p = 7/8.
• Thus we can take an unfair division and make it fair.
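A quick check of that arithmetic (the role utilities 1/−3 and 0/4 come from the bullets above):

```python
from fractions import Fraction

# Agent 1's utility: p*(1) + (1-p)*(-3); agent 2's: p*(0) + (1-p)*(4).
# Setting them equal: 4p - 3 = 4 - 4p  =>  p = 7/8.
p = Fraction(7, 8)
u1 = p * 1 + (1 - p) * (-3)
u2 = p * 0 + (1 - p) * 4
print(p, u1, u2)  # 7/8 1/2 1/2
```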
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins; that is better than both losing. The weighting on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way on the rest.
83
Example: semi-cooperative, cont.
• Winning agent's utility: 26 − 4 − 10 = 12.
• Losing agent's utility: −4 (he helped with the swap).
• So with probability 1/2 each way: (1/2)(12) + (1/2)(−4) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest worth.
• We assume the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation over a joint plan, schedules, and goal relaxation: the agents may settle on a state that is a little worse than the ultimate objective.
• Example: multi-agent tile world (like an airport shuttle). Worth isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of it, such as price, quality, and support: a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute:
– Utility = 0.60·price + 0.15·quality + 0.25·support
• Rating/ranking each attribute:
– price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum.
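The weighted-attribute approach can be sketched as follows (the offer values and the 0-10 scoring scale are hypothetical; the 60/15/25 weights are the slide's):

```python
def utility(offer, weights):
    """Weighted-sum multi-attribute utility over normalized scores."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[attr] * offer[attr] for attr in weights)

# Hypothetical offer scored 0-10 per attribute (higher is better, so a
# *low* price must already be converted into a *high* price score).
offer = {"price": 7.0, "quality": 9.0, "support": 4.0}
weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
print(round(utility(offer, weights), 2))  # 6.55
```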
89
Incomplete Information
• In a TOD, an agent doesn't know the tasks of the others.
• Solution:
– Exchange the missing information.
– Impose a penalty for lying.
• Possible lies:
– False information:
  • Hiding letters.
  • Phantom letters.
– Not carrying out a commitment.
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
 for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive: delivering to one destination saves distance to the other (in a tree arrangement).
• Example of subadditive with equality (= rather than <): deliveries in opposite directions; doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I go above a threshold and have to buy new equipment.
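The definition can be checked by brute force over all subset pairs (a sketch; the no-return cost table is slide 72's, and the return-home variant is my assumption for contrast):

```python
from itertools import combinations

def is_subadditive(tasks, cost):
    """Check c(X u Y) <= c(X) + c(Y) for all task subsets X, Y;
    `cost` maps frozensets of tasks to execution costs."""
    subsets = [frozenset(s) for r in range(len(tasks) + 1)
               for s in combinations(tasks, r)]
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in subsets for y in subsets)

# Slide 72's "don't return home" costs: backtracking makes c(ab) = 3,
# which exceeds c(a) + c(b) = 2, so that domain is NOT subadditive.
no_return = {frozenset(): 0, frozenset("a"): 1,
             frozenset("b"): 1, frozenset("ab"): 3}
# With a return to the distribution point, costs double and c(ab) = 4 <= 4.
with_return = {frozenset(): 0, frozenset("a"): 2,
               frozenset("b"): 2, frozenset("ab"): 4}
print(is_subadditive("ab", no_return), is_subadditive("ab", with_return))
# False True
```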
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
– "I need to pick something up at the store." (You can invent something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
– "I need to deliver a letter" when the envelope is empty. (No good to anyone, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• Take two task sets X and Y with X ⊆ Y, and introduce another set of tasks Z. Then:
 c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
(Z adds at least as much cost to the smaller set as to the larger.)
95
Tentative Explanation of Previous Chart
• The arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the start of an arrow is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on whether we speak of truth (never helps) or of lies (sometimes help).
• The relationships can also go between domains: subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth-telling more than the others.
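The modularity identity can also be checked by brute force (a sketch; the unit-cost-per-destination table mimics the fax domain discussed later, and the delivery table is slide 72's):

```python
from itertools import combinations

def is_modular(tasks, cost):
    """Check c(X u Y) == c(X) + c(Y) - c(X n Y) for all task subsets."""
    subsets = [frozenset(s) for r in range(len(tasks) + 1)
               for s in combinations(tasks, r)]
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in subsets for y in subsets)

# Fax-domain-style costs: one unit per distinct destination, so shared
# destinations are counted once and the identity holds exactly.
fax = {frozenset(s): len(s) for r in range(3) for s in combinations("ab", r)}
# Slide 72's delivery costs are not modular: c(ab) = 3 != 1 + 1 - 0.
delivery = {frozenset(): 0, frozenset("a"): 1,
            frozenset("b"): 1, frozenset("ab"): 3}
print(is_modular("ab", fax), is_modular("ab", delivery))  # True False
```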
97
For subadditive domain
98
Attributes of task system: Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y.
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y. Expect Z to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
• Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (the agent was going that way anyway), but adds 2 to its superset Y (the agent was going around the loop).
• Concavity implies subadditivity; modularity implies concavity.
99
Examples of task systems
Database queries
• Agents have access to a common database, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The fax domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• Agents can exchange messages to be faxed.
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs, minus the cost of their intersection.
• Of our examples, only the fax domain is modular (costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute.
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered (given a stiff enough penalty).
102
Incentive Compatible Fixed Points (FP) (postmen return home)

FP1: In a subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2, and his expected utility under [(abc, ∅); 1/2] is 5.
• If he lies: p = 1/2 (as the apparent utility is the same), but his true expected utility under [(abc, ∅); 1/2] is 1/2(0) + 1/2(2) = 1, as he still has to deliver the hidden letter.
(Figure: delivery graph with edge costs 1, 4, 4, 1.)
103
• FP2: In a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out).
• FP3: In a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: In a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular costs add exactly, so it is hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c. Under the lie, the apparent benefits are shown below; the apparent benefit is no different from the real benefit.

Agent 1 gets   U(1)   Agent 2 gets   U(2) (seems)   U(2) (actual)
a              2      bc             4              4
b              4      ac             2              2
bc             2      a              4              2
ab             0      c              6              6

Under the truth the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just as in this case. The lie makes no difference. (We assume some way of deciding who gets the better deal that is fair over time.)
105
Non-incentive compatible fixed points
• FP5: In a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial.
• Example (next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of [(a, b); 1/2] = 1/2(4) + 1/2(2) = 3.
• Lie: (bc, a) is the logical division, with no probability needed. Utility for agent 1 = 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: In a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability: "if you deliver, I make you deliver to h too."
• Example (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to about 1.72. (If A1 ends up delivering, he doesn't actually deliver to h.)
• If he tells the truth, p (the probability that agent 1 delivers everything) = 9/14, since
 p(−1) + (1 − p)(6) = p(4) + (1 − p)(−3), i.e., 14p = 9.
• If he invents task h, p = 11/18, since
 p(−3) + (1 − p)(6) = p(4) + (1 − p)(−5).
• Utility at p = 9/14: p(−1) + (1 − p)(6) = −9/14 + 30/14 = 21/14 = 1.5.
• Utility at p = 11/18: p(−1) + (1 − p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
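The FP6 arithmetic can be verified mechanically (the payoff numbers are the slide's; the function names are mine):

```python
from fractions import Fraction

def equalizing_p(u1_all, u1_none, u2_all, u2_none):
    """p at which the apparent expected utilities of the all-or-nothing
    deal are equal: p*u1_all + (1-p)*u1_none = p*u2_all + (1-p)*u2_none,
    where 'all' means agent 1 delivers everything."""
    return Fraction(u2_none - u1_none,
                    (u1_all - u1_none) - (u2_all - u2_none))

real = lambda p: p * (-1) + (1 - p) * 6   # agent 1's true payoffs

p_true = equalizing_p(-1, 6, 4, -3)  # truthful encounter
p_lie = equalizing_p(-3, 6, 4, -5)   # after the decoy letter to h
print(p_true, real(p_true))  # 9/14 3/2
print(p_lie, real(p_lie))    # 11/18 31/18, about 1.72: lying helped
```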
107
Postmen example (return to post office)
(Figures: three instances: a concave one, a subadditive one in which h is the decoy, and a phantom-letter one.)
108
Non incentive compatible fixed points
• FP7: In a modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (the other agent thinks you have fewer tasks, so an increased load appears to cost more than it really does).
• Example (next slide): A1 hides his letter to node b.
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair under the lie.
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So A1 is sent to b, but he really needed to go there anyway, so his utility is actually 4 (as he doesn't go to e).
109
• FP8: In a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial.
• Example (next slide): A1 hides his letter to node a. A1's utility becomes 4.5 > 4 (the utility of telling the truth).
• Under the truth: utility of [(fae, bcd); 1/2] = 4 (each agent saves going to two nodes).
• Under the lie, dividing as [(efd, cab); p] would mean one agent always wins and the other always loses; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with [(ab, cdef); p]:
 p(4) + (1 − p)(0) = p(2) + (1 − p)(6) ⟹ 4p = −4p + 6 ⟹ p = 3/4.
• A1's utility is actually 3/4(6) + 1/4(0) = 4.5.
• Note: when A1 is assigned cdef (1/4 of the time), he still has to deliver to node a after completing his agreed deliveries. So he ends up going to 5 places, which is what he was assigned originally: zero utility for that branch.
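A quick check of the FP8 numbers (the apparent utilities 4/0 and 2/6 come from the bullets above):

```python
from fractions import Fraction

# Apparent utilities under the lie: A1 assigned ab gives (4, 2); A1
# assigned cdef gives (0, 6).  Equalize: 4p = 2p + 6(1-p) => p = 3/4.
p = Fraction(3, 4)
assert p * 4 == p * 2 + (1 - p) * 6
# A1's real utility: 6 when he wins the toss, 0 when he must still
# deliver the hidden letter to a on top of cdef.
real_utility = p * 6 + (1 - p) * 0
print(real_utility)  # 9/2, i.e. 4.5 > 4, so hiding the letter paid off
```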
110
(Figure: the modular postmen example.)
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models.
– Reasoning about other agents' beliefs.
– Influencing other agents' intentions and beliefs.
114
PERSUADER - case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals.
– Generation of counter-proposals based on feedback from the dissenting party.
– Persuasive argumentation.
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law: cite previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory.
– Select the most appropriate case.
– Construct an appropriate solution.
– Evaluate the solution for applicability to the current case.
– Modify the solution appropriately.
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange.
– Avoids problems by reasoning from past failures: intentional reminding.
– Repairs for past failures are reused, reducing computation.
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff.
– Minimizes payoff difference.
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument.
• Increasing payoff:
– Change the importance attached to an issue.
– Change the utility value of an issue.
119
Narrowing differences
• Get feedback from the rejecting party:
– Objectionable issues.
– Reason for rejection.
– Importance attached to issues.
• Increase the payoff of the rejecting party by a greater amount than the reduction in payoff for the agreeing parties.
120
Experiments
• Without memory: 30% more proposals.
• Without argumentation: fewer proposals and better solutions.
• No failure avoidance: more proposals with objections.
• No preference analysis: oscillatory behavior.
• No feedback: communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she also assigns progressively smaller worths to a meeting at 15:00, 14:00, … By showing flexibility and accepting a sub-optimal time, an agent accepts a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Graph: worth function for the first agent, rising from 0 at 9:00 through 12:00 to 100 at 16:00.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
(Graph: utility vs. number of negotiation rounds; agent i's and agent j's offers converge over time to the point of acceptance.)
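The convergence the graph depicts can be mimicked with a toy simulation (all numbers hypothetical; a real MCP concedes in steps dictated by the risk comparison, not by a fixed amount):

```python
def concession_rounds(demand_i, demand_j, step_i, step_j, pie=100):
    """Toy monotonic concession: each agent lowers its demanded share of
    a fixed pie by a constant step until the demands are compatible."""
    rounds = 0
    while demand_i + demand_j > pie:
        demand_i -= step_i
        demand_j -= step_j
        rounds += 1
    return rounds, demand_i, demand_j

print(concession_rounds(90, 80, 5, 5))  # (7, 55, 45)
```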
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable.
(Graph: utility vs. number of negotiation rounds; the two agents' offer curves never meet.)
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified. Why did the agent pay so much for the car?
– Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet to show the strength of their feelings.
4. Kisceral: appeals to the intuitive. "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: an argument is a pair (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae;
– Sentence is a logical formula known as the conclusion;
– Grounds is a set of logical formulae such that Grounds ⊆ Database and Sentence can be proved from Grounds.
• (We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B = A u B or A r B
– A defeats B = A u B, or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument system (rather than the internals of individual arguments).
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
 where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated.
• Assumption: true unless proven false.
131
Admissible Arguments - mutually defensible
1. A set S attacks an argument x if some member y of S attacks x (y → x).
2. An argument x is acceptable (with respect to S) if every attacker of x is attacked by S.
3. An argument set is conflict-free if no two of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attacker is attacked).
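These definitions translate directly into a brute-force enumeration (a sketch; the four-argument attack relation at the bottom is a hypothetical example, not the one drawn on the next slide):

```python
from itertools import chain, combinations

def admissible_sets(args, attacks):
    """Enumerate admissible sets: conflict-free sets of arguments that
    counter-attack every attacker of a member.  `attacks` holds
    (attacker, target) pairs."""
    def set_attacks(s, x):
        return any((y, x) in attacks for y in s)
    def conflict_free(s):
        return not any((x, y) in attacks for x in s for y in s)
    def acceptable(x, s):
        return all(set_attacks(s, y) for (y, t) in attacks if t == x)
    candidates = chain.from_iterable(
        combinations(args, r) for r in range(len(args) + 1))
    return [set(s) for s in candidates
            if conflict_free(s) and all(acceptable(x, set(s)) for x in s)]

# A hypothetical system: a attacks b, b attacks c, d stands alone.
# {c} alone is not admissible (it cannot answer b), but {a, c} is:
# a defends c by attacking b.
for s in admissible_sets("abcd", {("a", "b"), ("b", "c")}):
    print(sorted(s))
```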
132
(Figure: an argument graph over arguments a, b, c, d.)
Which sets of arguments can be defended? Here c is always attacked, and d is always acceptable.
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
58
About MCP and Zeuthen Strategies
• Advantages:
  – Simple, and reflects the way human negotiations work
  – Stability – in Nash equilibrium: if one agent is using the strategy, the other can do no better than using it him/herself
• Disadvantages:
  – Computationally expensive – players need to compute the entire negotiation set
  – Communication burden – the negotiation process may involve several steps
59
Parcel Delivery Domain (recall: agent 1 delivers to a; agent 2 delivers to a and b)
Negotiation set: ({a},{b}), ({b},{a}), (∅,{a,b})
First offers: agent 1 proposes (∅,{a,b}); agent 2 proposes ({a},{b})
Utility of agent 1: Utility1({a},{b}) = 0; Utility1({b},{a}) = 0; Utility1(∅,{a,b}) = 1
Utility of agent 2: Utility2({a},{b}) = 2; Utility2({b},{a}) = 2; Utility2(∅,{a,b}) = 0
Risk of conflict: 1 for agent 1, 1 for agent 2
Can they reach an agreement? Who will concede?
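The "risk of conflict: 1, 1" values come from Zeuthen's risk formula; a minimal sketch (the function name is ours):

```python
# Zeuthen risk: risk_i = (U_i(own offer) - U_i(opponent's offer)) / U_i(own offer),
# conventionally 1 when the agent's own offer is worth nothing to it
# (it loses nothing by standing firm). The agent with the LOWER risk concedes.
def risk(u_own_offer, u_their_offer):
    if u_own_offer == 0:
        return 1.0
    return (u_own_offer - u_their_offer) / u_own_offer

# This slide: agent 1's offer (∅,{a,b}) is worth 1 to itself, 0 under agent 2's offer;
# agent 2's offer ({a},{b}) is worth 2 to itself, 0 under agent 1's offer.
print(risk(1, 0), risk(2, 0))  # 1.0 1.0
```

Both risks are equal (and maximal), so neither agent is forced to concede – which is exactly the deadlock the next slide describes.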
60
Conflict Deal
At agent 1's best deal, agent 2 should concede; at agent 2's best deal, agent 1 should concede.
Zeuthen does not reach a settlement, as neither will concede: there is no middle ground.
61
Parcel Delivery Domain, Example 2 (don't return to distribution point)
(Figure: distribution point with destinations a, b, c, d; edge labels 7, 7 and 1, 1, 1.)
Cost function: c(∅)=0; c({a})=c({d})=7; c({b})=c({c})=c({a,b})=c({c,d})=8; c({b,c})=c({a,b,c})=c({b,c,d})=9; c({a,d})=c({a,b,d})=c({a,c,d})=c({a,b,c,d})=10
Negotiation set: ({a,b,c,d},∅), ({a,b,c},{d}), ({a,b},{c,d}), ({a},{b,c,d}), (∅,{a,b,c,d})
Conflict deal: ({a,b,c,d},{a,b,c,d})
All choices are individually rational, as neither agent can do worse than the conflict deal; ({a,c},{b,d}) is dominated by ({a,b},{c,d}).
62
Parcel Delivery Domain, Example 2 (Zeuthen works here: both concede on equal risk)

No. | Pure Deal         | Agent 1's Utility | Agent 2's Utility
1   | ({a,b,c,d},∅)     | 0  | 10
2   | ({a,b,c},{d})     | 1  | 3
3   | ({a,b},{c,d})     | 2  | 2
4   | ({a},{b,c,d})     | 3  | 1
5   | (∅,{a,b,c,d})     | 10 | 0
    | Conflict deal     | 0  | 0

(Agent 1 starts at deal 5 and agent 2 at deal 1; they concede toward deal 3.)
63
What bothers you about the previous agreement?
• They settle on (2,2) utility rather than a choice such as (0,10) with a higher total utility
• Is there a solution?
• Fairness versus higher global utility
• Restrictions of this method (no promises for the future, no sharing of utility)
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions
• It is of particular interest to the designer of automated agents: it does away with any need for secrecy on the part of the programmer, since the first step reveals true desires
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white,gray) has the side effect of clear(black)
• Negotiation: develop joint plans and schedules for the agents to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes two workers get in each other's way even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written Jk
• c(J)k is the cost of k's role in joint plan J
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way
• In TOD coordinated plans are never worse, as you can just do your original task
• With SOD you may get in each other's way. Don't accept partially completed plans
• A state oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform all tasks, and cost is the same regardless of which agent performs it)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• Looking for a state in the intersection of the goals
• Possibilities:
  – Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
  – A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
  – There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then costi(δ) = p·c(J)i + (1−p)·c(J)k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the deal: utilityi(δ) = c(Ti) − costi(δ)
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
(Figure: distribution point with city a and city b, each 1 away; a and b are 2 apart.)
Cost function: c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3
Utility for agent 1 (originally assigned a):
1. Utility1({a},{b}) = 0
2. Utility1({b},{a}) = 0
3. Utility1({a,b},∅) = −2
4. Utility1(∅,{a,b}) = 1
…
Utility for agent 2 (originally assigned a and b):
1. Utility2({a},{b}) = 2
2. Utility2({b},{a}) = 2
3. Utility2({a,b},∅) = 3
4. Utility2(∅,{a,b}) = 0
…
73
Consider deal 3 with probability
• ({a,b},∅):p means agent 1 does nothing with probability p, and delivers {a,b} with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p·1 = utility for agent 1
• (1−p)(3) + p·0 = utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
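The equal-utility calculation can be done mechanically for any two-outcome lottery; a sketch (the function name is ours):

```python
from fractions import Fraction

def fair_probability(u1_a, u1_b, u2_a, u2_b):
    """Probability p of outcome B (outcome A happens with 1-p) that equalizes
    (1-p)*u1_a + p*u1_b and (1-p)*u2_a + p*u2_b; None if no p in [0,1] works."""
    denom = (u1_b - u1_a) - (u2_b - u2_a)
    if denom == 0:
        return None  # the two utility lines are parallel: no equalizing p
    p = Fraction(u2_a - u1_a, denom)
    return p if 0 <= p <= 1 else None

# Deal 3: outcome A = agent 1 delivers {a,b} (utilities -2 and 3),
# outcome B = agent 1 delivers nothing (utilities 1 and 0).
print(fair_probability(-2, 1, 3, 0))   # 5/6
# The ({a},{b}) deal of the next slide: utilities are 0/0 and 2/2 regardless.
print(fair_probability(0, 0, 2, 2))    # None
```

The second call shows why no p can make ({a},{b}) fair: each agent's utility is the same under both outcomes, so the lottery changes nothing.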
74
Try again with the other choice in the negotiation set
• ({a},{b}):p means agent 1 does a with probability p, and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p·0 = utility for agent 1
• (1−p)(2) + p·2 = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal of the form m = [(T_A ∪ T_B, ∅); p], chosen so that m does as well as the best deal: NS(m) = max over deals d of NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of utilities (a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
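The "divide a total utility of 10" exercise can be checked directly; a tiny sketch:

```python
# Splitting a fixed total utility of 10 between two agents: the product of
# utilities is maximized at the equal split, which is why maximizing the
# product reads as the fairer division.
splits = [(a, 10 - a) for a in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best, best[0] * best[1])  # (5, 5) 25
```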
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at slot 1 and the black block at slot 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent the cost would be four, so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one alone is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth of 6 to the final goal, we can use probability to make it "fair"
• Assign the (2,6) cost division (A1 takes the cheap role) p of the time
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it pays 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p, so p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win. Could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (illustrative numbers)
• A cooperative swap costs 4 (again, an illustrative number)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as he helped with the swap)
• So with probability 1/2 each: ½(12) + ½(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then each utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (removing the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain: Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
  – E: set of possible environment states
  – Ag: set of possible agents
  – J: set of possible joint plans
  – c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle): worth isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
  – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – Price: [5,100], quality: [0,10], support: [1,5]
  – Try to find the Pareto optimum
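The weighted-attribute option can be sketched directly. The weights are the ones on the slide; the two offers and the assumption that each attribute is pre-scored into [0,1] (price scored higher when cheaper) are ours, for illustration:

```python
# Weighted-attribute utility: 60% price, 15% quality, 25% support.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    # Weighted sum over normalized attribute scores in [0, 1].
    return sum(w * offer[attr] for attr, w in WEIGHTS.items())

offer_a = {"price": 0.9, "quality": 0.5, "support": 0.4}  # cheap, mediocre
offer_b = {"price": 0.6, "quality": 0.9, "support": 0.8}  # pricier, better
print(utility(offer_a), utility(offer_b))
```

With price weighted so heavily, the cheap-but-mediocre offer wins (0.715 vs. 0.695); real attribute values would need normalizing onto a common scale first for the weighted sum to be meaningful.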
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
  – Exchange the missing information
  – Penalty for lying
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – combining adds only a "sub-cost"
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with equality (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I go above a threshold and have to buy new equipment
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
  – Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  – Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive Compatible Mechanisms
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• Take two task sets X and Y, where X is a subset of Y
• Another task set Z is introduced
• c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• Arrows show the reasons we know each fact (diagonal arrows go between domains); the rule at an arrow's start is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one side of the pure deal is empty. The direction of the relationship may depend on truth (never helps) or lying (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others
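The three cost-function attributes (subadditive, concave, modular) can be checked by brute force on a small task set. A sketch, using a hypothetical independent-cost function in the style of the fax domain:

```python
from itertools import chain, combinations

def powerset(tasks):
    return [frozenset(s) for s in
            chain.from_iterable(combinations(tasks, r) for r in range(len(tasks) + 1))]

def is_subadditive(c, tasks):   # c(X ∪ Y) <= c(X) + c(Y)
    P = powerset(tasks)
    return all(c(x | y) <= c(x) + c(y) for x in P for y in P)

def is_concave(c, tasks):       # X ⊆ Y implies c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
    P = powerset(tasks)
    return all(c(x | z) - c(x) >= c(y | z) - c(y)
               for x in P for y in P if x <= y for z in P)

def is_modular(c, tasks):       # c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
    P = powerset(tasks)
    return all(c(x | y) == c(x) + c(y) - c(x & y) for x in P for y in P)

# Independent per-task costs, as in the fax domain: modular,
# hence concave, hence subadditive.
cost = {"a": 1, "b": 2, "c": 3}
c = lambda s: sum(cost[t] for t in s)
print(is_modular(c, cost), is_concave(c, cost), is_subadditive(c, cost))  # True True True
```

Swapping in a cost function with a loop (like the concavity counterexample two slides down) would make `is_concave` return False while `is_subadditive` still holds.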
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X ⊆ Y
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs, minus the cost of their intersection
• Only the fax domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization of relationships (implied relationships between cells; implied relationships within the same domain attribute)
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
• FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Example: if A1 hides his letter to c, his utility doesn't increase
• If he tells the truth: p = 1/2, and his expected utility for ({a,b,c},∅):1/2 is 5
• If he lies: p = 1/2 (as the apparent utility is the same), and his expected utility for ({a,b,c},∅):1/2 is ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
(Figure: delivery graph with edge labels 1, 4, 4, 1.)
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out)
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular adds the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are (4,2), and someone has to get the better deal (under a pure deal) – just as in this case. The lie makes no difference.
(Assume we have some way, fair over time, of deciding who gets the better deal.)

Agent 1 takes | U(1) | Agent 2 takes | U(2) apparent | U(2) actual
a             | 2    | b,c           | 4             | 4
b             | 4    | a,c           | 2             | 2
b,c           | 2    | a             | 4             | 2
a,b           | 0    | c             | 6             | 6
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so agent 1's utility for ({a},{b}):1/2 is ½(4) + ½(2) = 3
• Lie: ({b,c},{a}) is the natural division, with no probability needed; agent 1's utility is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as the lie changes the probability: if you deliver, I make you deliver to h too)
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b,c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, since p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, since p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
107
Postmen – return to post office
(Figures for the previous fixed points: a concave example, a subadditive example where h is the decoy task, and a phantom example.)
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, "hide" lies can be beneficial (you think I have fewer tasks, so an increased load appears to cost more than it really does)
• Ex3 (from the next slide): A1 hides his letter to node b
• ({e},{b}): under the lie, the utility for A1 is 0 and for A2 is 4 – unfair (under the lie)
• ({b},{e}): under the lie, the utility for A1 is 2 and for A2 is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: the utility of ({f,a,e},{b,c,d}):1/2 is 4 for each (each saves going to two nodes)
• Under the lie, dividing as ({e,f},{d,c,a,b}):p (you always win and I always lose): since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie with ({a,b,c},{d,e,f}):p
• p(4) + (1−p)(0) = p(2) + (1−p)(6) ⇒ 4p = −4p + 6 ⇒ p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I am assigned c,d,e,f (¼ of the time) I still have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places – which is what I was assigned originally: zero utility from that
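The FP8 arithmetic checks out with exact fractions; a sketch using the apparent utilities from the example (agent 1 gets 4 or 0, agent 2 gets 2 or 6, depending on which side of the mixed deal comes up):

```python
from fractions import Fraction

# Solve p*4 + (1-p)*0 == p*2 + (1-p)*6 for p.
p = Fraction(6 - 0, (4 - 0) - (2 - 6))

# Agent 1's true utility under the lie: 6 when it wins the toss, 0 when it
# loses (it must still deliver the hidden letter to a either way).
actual = p * 6 + (1 - p) * 0
print(p, actual)  # 3/4 9/2
```

The lie earns 9/2 = 4.5, beating the honest utility of 4, which is exactly why this fixed point is not incentive compatible.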
110
(Figure: the modular postmen example used by FP7 and FP8.)
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when each protocol is appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case-Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures: intentional reminding
  – Repairs for past failures are reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreed parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Figure: worth function for the first agent – worth rises from 0 to 100 across the times 9:00, 12:00, 16:00.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Figure: utility vs. number of negotiation rounds; agent i's and agent j's curves meet at the point of acceptance.)
123
Utility Graphs – no agreement
• No agreement: agent j finds the offer unacceptable
(Figure: utility vs. number of negotiation rounds; agent i's and agent j's curves never meet.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic-Based Argumentation
• Basic form of argumentation: an argument is a pair ⟨Sentence, Grounds⟩ over a Database, where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula, known as the conclusion
  – Grounds is a set of logical formulae such that Grounds ⊆ Database and Sentence can be proved from Grounds
• (We give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  – A attacks B: A u B or A r B
  – A defeats B: A u B, or (A r B and not B u A)
  – A strongly attacks B: A attacks B, and not B u A
  – A strongly undercuts B: A u B, and not B u A
129
Proposition: Hierarchy of attacks
• Undercuts = u
• Strongly undercuts = su = u − u⁻¹
• Strongly attacks = sa = (u ∪ r) − u⁻¹
• Defeats = d = u ∪ (r − u⁻¹)
• Attacks = a = u ∪ r
(where u = undercuts, r = rebuts, and u⁻¹ is the inverse of u)
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption: an argument stands unless proven otherwise
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set S if some member y of S has y → x
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
(Figure: abstract argument system over arguments a, b, c, d with attack arrows.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
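The admissibility definitions can be checked by brute force on a small system. A sketch; the attack relation below is hypothetical (the slide's figure is not reproduced here): d attacks c, c attacks b, b attacks a.

```python
from itertools import chain, combinations

attacks = {("d", "c"), ("c", "b"), ("b", "a")}
arguments = {"a", "b", "c", "d"}

def conflict_free(s):
    # no member of s attacks another member of s
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # every attacker y of x is itself attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = chain.from_iterable(combinations(sorted(arguments), r)
                              for r in range(len(arguments) + 1))
print([set(s) for s in subsets if admissible(set(s))])
# the admissible sets are {}, {d}, and {b, d}
```

Here d is unattacked (always acceptable), and {b, d} is admissible because d defends b against its attacker c – the "mutually defensible" idea of the previous slide.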
59
Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
Negotiation Set
(a b)
(b a)
( ab)
First offer
( ab)
(a b)
Agent 1
Agent 2
Utility of agent 1
Utility1(a b) = 0
Utility1(b a) = 0
Utility1( ab)=1
Utility of agent 2
Utility2(a b) =2
Utility2(b a) = 2
Utility2( ab)=0
Risk of conflict
1
1
Can they reach an agreementWho will concede
60
Conflict Deal
Agent 1's best deal vs. agent 2's best deal: each agent says the other should concede.
Zeuthen does not reach a settlement, as neither will concede; there is no middle ground.
61
Parcel Delivery Domain Example 2 (don't return to distribution point)
(Figure: distribution point; a and d at distance 7; b and c at distance 1 apart along the way)
Cost function: c(∅)=0; c(a)=c(d)=7; c(b)=c(c)=c(ab)=c(cd)=8; c(bc)=c(abc)=c(bcd)=9; c(ad)=c(abd)=c(acd)=c(abcd)=10
Negotiation Set: (abcd, ∅), (abc, d), (ab, cd), (a, bcd), (∅, abcd)
Conflict Deal: (abcd, abcd)
All choices are individually rational, as neither agent can do worse than the conflict deal; (ac, bd) is dominated by (ab, cd).
62
Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
No.  Pure Deal    Agent 1's Utility  Agent 2's Utility
1    (abcd, ∅)    0                  10
2    (abc, d)     1                  3
3    (ab, cd)     2                  2
4    (a, bcd)     3                  1
5    (∅, abcd)    10                 0
Conflict deal     0                  0
(Agent 1 prefers deal 5 down to deal 1; agent 2 prefers deal 1 down to deal 5.)
63
What bothers you about the previous agreement?
• They decide to both get (2, 2) utility rather than, say, (0, 10) from another choice.
• Is there a solution?
• Fair versus higher global utility.
• Restrictions of this method (no promises for the future, no sharing of utility).
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD).
• Actions have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents.
• Example: slotted blocks world; blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together"; it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written Jk.
• C(J)k is the cost of k's role in joint plan J.
• In a TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In a TOD coordinated plans are never worse, as you can just do your original task.
• With an SOD you may get in each other's way.
• Don't accept partially completed plans.
• A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all agents can perform all tasks, and cost is the same regardless of which agent performs them).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with.
• We are looking for a state in the intersection of the goals.
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
  – The goals may contradict, so there is no mutually acceptable state (e.g., both need a car).
  – A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (they could both travel together, but one may need to know how to pick the other up).
  – There might be a reachable state which satisfies both, but it may be too expensive and the agents are unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper overall, but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that one agent will get a certain plan.
• Called a mixed deal: a deal with a probability attached. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·C(J)_i + (1−p)·C(J)_k, where k is i's opponent (the role i plays with probability 1−p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost of the joint plan.
• For the parcel delivery example:
72
Parcel Delivery Domain (assuming they do not have to return home)
(Figure: distribution point; city a and city b each at distance 1)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…

Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (∅, ab):p means agent 1 does ∅ with probability p, and ab with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p·1 = utility for agent 1
• (1−p)(3) + p·0 = utility for agent 2
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p => p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
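The fair probability can be computed mechanically for any pair of utilities that are linear in p. A sketch reproducing the slide's arithmetic with exact fractions:

```python
from fractions import Fraction

def fair_probability(u1_at_p0, u1_at_p1, u2_at_p0, u2_at_p1):
    """Probability p equalising two utilities linear in p, where
    u_i(p) = (1-p)*u_i_at_p0 + p*u_i_at_p1."""
    a1, b1, a2, b2 = map(Fraction, (u1_at_p0, u1_at_p1, u2_at_p0, u2_at_p1))
    # (1-p)a1 + p*b1 = (1-p)a2 + p*b2  =>  p = (a2-a1) / ((a2-a1) + (b1-b2))
    return (a2 - a1) / ((a2 - a1) + (b1 - b2))

# Deal (-, ab):p -- agent 1 delivers nothing with probability p,
# everything with probability 1-p.
p = fair_probability(-2, 1, 3, 0)
print(p)  # 5/6
```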
74
Try again with the other choice in the negotiation set
• (a, b):p means agent 1 does a with probability p, and b with probability 1−p.
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p·0 = utility for agent 1
• (1−p)(2) + p·2 = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this fair? (Agent 1's utility is 0 for every p, while agent 2's is 2 for every p.)
75
Mixed deal
• An all-or-nothing deal has one agent do everything: a mixed deal m = [(TA∪TB, ∅):p] such that NS(m) = max NS(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias, like switching goals in soccer.
77
Examples: Cooperative (each is helped by the joint plan)
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial: each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent the cost would be four, so the utility to each is 2.
78
Examples: Compromise (both can succeed, but worse for both than if the other agent weren't there)
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one agent alone is two; the cost of doing both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) cost split p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3): he loses utility if he pays a cost of 6 for a benefit of 3.
• Utility for agent 2 = p(0) + (1−p)(4).
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p, so p = 7/8.
• Thus we can take an unfair division and make it fair.
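A quick check of the arithmetic above, taking the utilities exactly as stated on the slide:

```python
from fractions import Fraction

# With probability p agent 1 gets the easy (cost-2) role.
u1 = lambda p: p * 1 + (1 - p) * (-3)  # worth 3: gains 1 easy, loses 3 hard
u2 = lambda p: p * 0 + (1 - p) * 4     # worth 6: breaks even hard, gains 4 easy

# u1(p) == u2(p): 4p - 3 = 4 - 4p  =>  8p = 7
p = Fraction(7, 8)
print(p, u1(p), u2(p))  # 7/8 1/2 1/2 -- both agents expect the same utility
```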
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins: better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (again, numbers out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest.
83
Example semi-cooperative cont
• Winning agent's utility: 26 − 4 − 10 = 12.
• Losing agent's utility: −4 (as he helped with the swap).
• So with probability 1/2 each: 1/2·(12) + 1/2·(−4) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
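The doubling claim checks out, taking the slide's (admittedly invented) costs as given:

```python
from fractions import Fraction as F

alone = 26          # cost to achieve one agent's full goal alone
swap_together = 4   # cost of the cooperative swap
rest = 10           # remaining cost for the coin-flip winner

winner = alone - swap_together - rest   # 12: goal achieved cheaply
loser = -swap_together                  # -4: helped with the swap for nothing
gamble = F(1, 2) * winner + F(1, 2) * loser
both_satisfied = alone - 24             # utility if each could be satisfied at cost 24
print(gamble, both_satisfied)  # 4 2 -- gambling doubles the expected utility
```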
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation over a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example: multi-agent Tileworld (like an airport shuttle); it isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support: a multiple set of attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate utility?
• Weighting each attribute:
  – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
  – price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  – price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum.
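The weighted-attribute rule is straightforward to implement. The two offers and their [0, 1] attribute scores below are hypothetical (the slide gives only the weights):

```python
def weighted_utility(scores: dict, weights: dict) -> float:
    """Linear additive utility over attribute scores normalised to [0, 1].
    Weights must sum to 1 (here: price 60%, quality 15%, support 25%)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * scores[k] for k in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
# Hypothetical offers; e.g. a price score of 1.0 means the cheapest acceptable price.
offer_a = {"price": 0.9, "quality": 0.5, "support": 0.4}
offer_b = {"price": 0.6, "quality": 0.9, "support": 0.8}
print(round(weighted_utility(offer_a, weights), 3))  # 0.715
print(round(weighted_utility(offer_b, weights), 3))  # 0.695
```

With price weighted so heavily, the cheaper offer wins despite its lower quality and support scores.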
89
Incomplete Information
• Agents don't know the tasks of others in a TOD.
• Solution:
  – Exchange the missing information.
  – Impose a penalty for lying.
• Possible lies:
  – False information:
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment.
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive: delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive with equality (= rather than <): deliveries in opposite directions; doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces, say electrical power costs where usage above a threshold forces buying new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  – Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip).
  – Need to deliver an empty letter (no good, but the deliverer won't discover the lie).
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y, and another task set Z is introduced. Then:
  – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is just a mixed deal in which one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others.
97
For subadditive domain
98
Attributes of a task system: concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
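The three cost-function properties can be checked mechanically for small task sets. A sketch, using a hypothetical two-city delivery cost function with a fixed trip overhead (not one of the slide's examples):

```python
from itertools import chain, combinations

def powerset(tasks):
    return [frozenset(s) for s in chain.from_iterable(
        combinations(tasks, r) for r in range(len(tasks) + 1))]

def is_subadditive(c, tasks):
    return all(c[x | y] <= c[x] + c[y]
               for x in powerset(tasks) for y in powerset(tasks))

def is_concave(c, tasks):
    # Z adds no more to a superset Y than to a subset X
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in powerset(tasks) for y in powerset(tasks) if x <= y
               for z in powerset(tasks))

def is_modular(c, tasks):
    return all(c[x | y] == c[x] + c[y] - c[x & y]
               for x in powerset(tasks) for y in powerset(tasks))

# Each city costs 1 to visit, plus a fixed overhead of 1 for making any trip.
tasks = {"a", "b"}
c = {s: (len(s) + 1 if s else 0) for s in powerset(tasks)}
print(is_subadditive(c, tasks), is_concave(c, tasks), is_modular(c, tasks))
# True True False -- the shared overhead makes it concave but not modular
```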
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.

The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes of a task system: modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterization of relationships (implied relationships between cells; implied relationships within the same domain attribute)
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides his letter to c, his utility doesn't increase.
• If he tells the truth: p = 1/2, and his expected utility under (abc, ∅):1/2 is 1.5.
• If he lies: p = 1/2 (as the apparent utility is the same), and his expected utility under (abc, ∅):1/2 is 1/2·(0) + 1/2·(2) = 1 (as he still has to deliver the hidden letter).
(Figure: delivery graph with edge costs 1, 4, 4, 1.)
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below: the apparent benefit is no different from the real benefit.
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
(We assume we have some fair-over-time way of deciding who gets the better deal.)

Agent 1's role  U(1)  Agent 2's role  U(2) apparent  U(2) actual
a               2     bc              4              4
b               4     ac              2              2
bc              2     a               4              2
ab              0     c               6              6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 under (a, b):1/2 is 1/2·(4) + 1/2·(2) = 3.
• Lie: (bc, a) is the logical division, as there is no percentage split. The utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as the lie changes the probability: "if you deliver, I make you deliver to h").
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72 ("if I deliver, I don't deliver h").
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9.
• If he invents task h, p = 11/18, as p(−3) + (1−p)(6) = p(4) + (1−p)(−5).
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
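The two probabilities and the utility gain can be verified mechanically, taking the apparent utilities as given on the slide:

```python
from fractions import Fraction as F

def equalising_p(a1, b1, a2, b2):
    """Solve p*a1 + (1-p)*b1 == p*a2 + (1-p)*b2 for p."""
    return F(b2 - b1, (b2 - b1) + (a1 - a2))

# Truth: agent 1's utilities (-1 if he delivers all, 6 otherwise)
# vs agent 2's (4 if agent 1 delivers all, -3 otherwise).
p_truth = equalising_p(-1, 6, 4, -3)   # 9/14
# The decoy letter to h shifts agent 1's *apparent* utilities to -3 and -5.
p_lie = equalising_p(-3, 6, 4, -5)     # 11/18
# Agent 1's *real* expected utility is p*(-1) + (1-p)*6 either way.
real = lambda p: p * F(-1) + (1 - p) * F(6)
print(real(p_truth), real(p_lie))  # 3/2 31/18 -- the lie pays off
```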
107
Postmen – return to post office
(Figure: three example graphs: concave; subadditive, where h is the decoy; phantom.)
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so the increased load appears to cost more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util(fae, bcd):1/2 = 4 (each saves going to two nodes).
• Under the lie, divide as (efd, cab):p? Then one agent always wins and the other always loses; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie: (ab, cdef):p.
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6, so p = 3/4.
• The utility is actually 3/4·(6) + 1/4·(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places, which is what I was assigned originally: zero utility for that.
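Verifying the FP8 arithmetic, with the apparent and real utilities taken as given:

```python
from fractions import Fraction as F

# Apparent utilities after agent 1 hides his letter to a, under (ab, cdef):p.
u1_apparent = lambda p: p * 4 + (1 - p) * 0
u2 = lambda p: p * 2 + (1 - p) * 6

p = F(3, 4)                         # solves 4p = 2p + 6(1-p)
assert u1_apparent(p) == u2(p) == 3  # the deal looks fair to agent 2

# Agent 1's real utility: the good role is worth 6 to him, the bad role 0
# (he still has to deliver the hidden letter afterwards).
u1_real = p * 6 + (1 - p) * 0
print(u1_real)  # 9/2 -- better than the truthful utility of 4
```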
110
Modular
(Figure: delivery graph for Examples 3 and 4.)
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law, citing previous decisions).
• Process:
  – Retrieve appropriate precedent cases from memory.
  – Select the most appropriate case.
  – Construct an appropriate solution.
  – Evaluate the solution for applicability to the current case.
  – Modify the solution appropriately.
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  – Minimizes the need for information exchange.
  – Avoids problems by reasoning from past failures (intentional reminding).
  – Repairs for past failures are reused, reducing computation.
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the trade-offs an agent is willing to make.
• Properties of the proposed compromise:
  – Maximizes joint payoff.
  – Minimizes payoff difference.
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument.
• Increasing payoff:
  – Change the importance attached to an issue.
  – Change the utility value of an issue.
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory: 30% more proposals.
• Without argumentation: fewer proposals and better solutions.
• No failure avoidance: more proposals with objections.
• No preference analysis: oscillatory condition.
• No feedback: communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).

(Figure: worth function for the first agent, rising from 0 to 100 over the hours 9, 12, 16.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• They eventually reach an agreement.
(Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's utility curves converge to a point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.
(Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's utility curves never meet.)
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified. Why did the agent pay so much for the car?
  – Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: a participant stamps their feet to show the strength of their feelings.
4. Kisceral: appeals to the intuitive; "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae;
  – Sentence is a logical formula known as the conclusion;
  – Grounds is a set of logical formulae, with Grounds ⊆ Database and Sentence provable from Grounds.
  (We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B: A u B or A r B
  – A defeats B: A u B or (A r B and not B u A)
  – A strongly attacks B: A a B and not B u A
  – A strongly undercuts B: A u B and not B u A
129
Proposition: hierarchy of attacks
• Undercuts = u
• Strongly undercuts = su = u − u⁻¹
• Strongly attacks = sa = (u ∪ r) − u⁻¹
• Defeats = d = u ∪ (r − u⁻¹)
• Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Write x → y to indicate:
  – "argument x attacks argument y";
  – "x is a counterexample of y";
  – "x is an attacker of y";
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: an argument is in unless proven otherwise (true unless proven false).
131
Admissible Arguments ndash mutually defensible
1. An argument x is attacked by a set S if some y attacks x (y → x) and no member of S attacks y.
2. An argument x is acceptable (with respect to S) if every attacker of x is itself attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
(Figure: four arguments a, b, c, d with attack arrows.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
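The admissibility definitions can be checked by brute force. The attack relation below is an assumption chosen to match the slide's conclusion (a and b attack each other, both attack c, and d is unattacked); the slide itself does not spell the arrows out:

```python
from itertools import chain, combinations

# Hypothetical attack relation for the four-argument example.
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}
args = {"a", "b", "c", "d"}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is itself attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = chain.from_iterable(
    combinations(sorted(args), r) for r in range(len(args) + 1))
for s in map(set, subsets):
    if admissible(s):
        print(sorted(s))
# c appears in no admissible set; d appears freely, as the slide says.
```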
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
60
Conflict Deal
He should concede
Agent 1s best deal agent 2s best deal
He should concede
Zeuthen does not reach a settlement as neither will concede as there is no middle ground
61
Parcel Delivery Domain Example 2 (donrsquot return to dist point)Distribution Point
a d
7 7
Cost functionc()=0c(a)=c(d)=7c(b)=c(c)=c(ab)=c(cd)=8c(bc)=c(abc)=c(bcd)=9c(ad)=c(abd)=c(acd)=c(abcd)=10
b c1 1 1
Negotiation Set (abcd ) (abc) d) (ab cd) (a bcd) ( abcd)
Conflict Deal (abcd abcd)
All choices are IR as canrsquot do worse (acbd) is dominated by (abcd)
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No Pure Deal Agent 1s Utility Agent 2s Utility
1 (abcd ) 0 10
2 (abc) d) 1 3
3 (ab cd) 2 2
4 (a bcd) 3 1
5 ( abcd) 10 0
Conflict deal 0 0
agent 1 agent 25 4 3 2 1
63
What bothers you about the previous agreement
bull Decide to both get (22) utility rather than the expected utility of (010) for another choice
bull Is there a solution
bull Fair versus higher global utility
bull Restrictions of this method (no promises for future or sharing of utility)
64
Nash Equilibrium
bullThe Zeuthen strategy is in Nash equilibrium under the assumption that when one agent is using the strategy the other can do no better than use it himselfbullGenerally Nash equilibrium is not applicable in negotiation setting because it requires both sides utility function bullIt is of particular interest to the designer of automated agents It does away with any need for secrecy on the part of the programmer since first step reveals true desiresbullAn agentrsquos strategy can be publicly known and no other agent designer can exploit the information by choosing a different strategy In fact it is desirable that the strategy be known to avoid inadvertent conflicts
65
State Oriented Domain
• Goals are acceptable final states (superset of TOD)
• Have side effects: an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example: Slotted blocks world. Blocks cannot go anywhere on the table, only in slots (a restricted resource).
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated.
66
• "Joint plan" is used to mean "what they both do", not "what they do together"; it is just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k.
• c(J)_k is the cost of k's role in joint plan J.
• In TOD you cannot do another's task as a side effect of doing yours, or get in their way.
• In TOD coordinated plans are never worse, as you can just do your original task.
• With SOD you may get in each other's way.
• Don't accept partially completed plans.
A state oriented domain is a bit more powerful than a TOD.
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs it)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with
• Looking for a state in the intersection of the goals
• Possibilities:
- Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
- Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
- Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
- There might be a reachable state which satisfies both, but it may be too expensive and the agents unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that each agent would get a certain plan.
• Called a mixed deal: a deal with probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent; i plays k's role with probability (1-p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
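The cost and utility formulas above can be sketched in code. This is an illustrative sketch (the function names are mine, not from the lecture); exact fractions keep the arithmetic honest.

```python
from fractions import Fraction

def expected_cost(p, cost_own_role, cost_other_role):
    # cost_i(delta) = p * c(J)_i + (1 - p) * c(J)_k
    return p * cost_own_role + (1 - p) * cost_other_role

def expected_utility(standalone_cost, p, cost_own_role, cost_other_role):
    # utility = cost of achieving the goal alone minus expected cost under the deal
    return standalone_cost - expected_cost(p, cost_own_role, cost_other_role)

# Postman example (next slide) with the fair p = 5/6 derived on slide 73:
# agent 1 (alone cost 1) does nothing 5/6 of the time, everything otherwise.
u1 = expected_utility(1, Fraction(5, 6), 0, 3)
u2 = expected_utility(3, Fraction(5, 6), 3, 0)
```

With the fair probability, both expected utilities come out equal (1/2 each), which is the point of the mixed deal.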
72
Parcel Delivery Domain (assuming they do not have to return home)
(Diagram: distribution point with city a and city b, each at distance 1 from it and distance 2 from each other.)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3

Utility for agent 1 (originally assigned a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally assigned ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• (∅, ab : p) means agent 1 does ∅ with probability p and ab with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p, so p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair.
74
Try again with the other choice in the negotiation set
• (a, b : p) means agent 1 does a with probability p and b with probability (1-p).
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
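The two p calculations above follow one pattern: solve a linear equation that equates the agents' expected utilities. A small sketch (the helper name is mine, not from the slides) reproduces both results:

```python
from fractions import Fraction

def fair_p(u1_hi, u1_lo, u2_hi, u2_lo):
    """p solving p*u1_hi + (1-p)*u1_lo == p*u2_hi + (1-p)*u2_lo.

    u1_hi/u2_hi are the agents' utilities for the outcome chosen with
    probability p; returns None when no p can equalize the utilities.
    """
    denom = (u1_hi - u1_lo) - (u2_hi - u2_lo)
    if denom == 0:
        return None  # utilities are parallel in p, as in deal (a, b)
    return Fraction(u2_lo - u1_lo, denom)

p_deal3 = fair_p(1, -2, 0, 3)      # slide 73's mix: Fraction(5, 6)
p_deal1 = fair_p(0, 0, 2, 2)       # slide 74's deal (a, b): None
```

A real implementation would also reject solutions outside [0, 1], since p is a probability.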
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅) : p] such that NS(m) = max NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias, like switching goals in soccer.
77
Examples, Cooperative: Each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1. Agent 2 wants white in 2. (Both goals are compatible.)
• Assume picking up is cost 1 and setting down is cost 1.
• Mutually beneficial: each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four, so the utility to each is 2.
78
Examples, Compromise: Both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• But together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign roles (2, 6) p of the time.
• Utility for agent 1 = p(1) + (1-p)(-3): he loses utility if he takes the cost-6 role for a benefit of 3.
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
Example, conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins; better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example, semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself is 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap is 4 (pulling numbers out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example, semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as he helped with the swap)
• So with 1/2 probability each way: (1/2)(12) + (1/2)(-4) = 4
• If they could have both been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain: Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example: Multi-agent Tile world (like an airport shuttle); it isn't just a specific state but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support: a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute
- Utility = price×0.60 + quality×0.15 + support×0.25
• Rating/ranking each attribute
- Price: 1, quality: 2, support: 3
• Using constraints on an attribute
- Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
- Try to find the Pareto optimum
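The weighted-attribute rule above can be sketched as follows. The attribute scores are hypothetical normalized values; only the 60/15/25 weight split comes from the slide.

```python
def weighted_utility(scores, weights):
    # scores: attribute values normalized to [0, 1]; weights must sum to 1
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[name] * w for name, w in weights.items())

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offer = {"price": 0.7, "quality": 0.9, "support": 0.5}  # hypothetical scores
u = weighted_utility(offer, weights)  # 0.42 + 0.135 + 0.125 = 0.68
```

Normalizing each attribute first matters: raw prices and 0-10 quality ratings are not on a common scale, so the weights alone would not express the intended tradeoffs.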
89
Incomplete Information
• Don't know the tasks of others in TOD
• Solution:
- Exchange missing information
- Penalty for lying
• Possible lies:
- False information
• Hiding letters
• Phantom letters
- Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets; the union adds up to a sub-cost:
for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
- Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
- Deliveries in opposite directions: doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
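The subadditivity condition can be checked by brute force over all subsets of tasks. A sketch (the function names and the second cost table are mine; the first cost table is the parcel domain of slide 72):

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(c) for r in range(len(tasks) + 1)
            for c in combinations(tasks, r)]

def is_subadditive(cost, tasks):
    # c(X ∪ Y) <= c(X) + c(Y) for every pair of subsets X, Y
    subs = all_subsets(tasks)
    return all(cost[x | y] <= cost[x] + cost[y] for x in subs for y in subs)

# Parcel domain of slide 72: c(ab) = 3 > c(a) + c(b) = 2, so NOT subadditive
parcel = {frozenset(): 0, frozenset("a"): 1,
          frozenset("b"): 1, frozenset("ab"): 3}

# Hypothetical tree-like variant where one trip covers both: subadditive
tree = {frozenset(): 0, frozenset("a"): 1,
        frozenset("b"): 1, frozenset("ab"): 2}
```

Brute force is exponential in the number of tasks, which is fine for classroom-sized examples.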
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at a store. (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
• Need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
- c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); each rule begins at a fixed point.
• For example: what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others.
97
For subadditive domain
98
Attributes of task system: Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
• Example: Y is all shaded (blue) nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as it was going that way anyway), but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems

Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.

The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes: Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as the costs are independent).
• Modularity implies concavity.
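Concavity and modularity can be checked the same brute-force way as subadditivity. A sketch (helper and cost table are mine; a per-destination-cost table mimics the independent costs of the fax domain):

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(c) for r in range(len(tasks) + 1)
            for c in combinations(tasks, r)]

def is_concave(cost, tasks):
    # for X ⊆ Y and any Z: c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
    subs = all_subsets(tasks)
    return all(cost[y | z] - cost[y] <= cost[x | z] - cost[x]
               for x in subs for y in subs if x <= y for z in subs)

def is_modular(cost, tasks):
    # c(X ∪ Y) == c(X) + c(Y) - c(X ∩ Y)
    subs = all_subsets(tasks)
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in subs for y in subs)

# fax-like costs: each destination costs 1 independently, so modular
fax = {frozenset(): 0, frozenset("a"): 1,
       frozenset("b"): 1, frozenset("ab"): 2}
```

Running both checks on the same table also illustrates the implication chain: every modular cost function passes the concavity check as well.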
101
3-dimensional table of characterization: relationships implied between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Ex: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth, p = 1/2.
• Expected utility of ((abc, ∅) : 1/2) = 5
• Under the lie, p = 1/2 (as the apparent utility is the same).
• Expected utility (for agent 1) of ((abc, ∅) : 1/2) = (1/2)(0) + (1/2)(2) = 1 (as he still has to deliver the hidden letter).
(Diagram: delivery graph with edge costs 1, 4, 4, 1.)
103
• FP2: in subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out).
• FP3: in concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 gets | U(1) | Agent 2 gets | U(2) (seems) | U(2) (actual)
a            | 2    | bc           | 4            | 4
b            | 4    | ac           | 2            | 2
bc           | 2    | a            | 4            | 2
ab           | 0    | c            | 6            | 6
105
Non-incentive-compatible fixed points
• FP5: in concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is ((a, b) : 1/2) = (1/2)(4) + (1/2)(2) = 3.
• Lie: (b, ca) is the logical division, as no percentage is needed.
• Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h).
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up bc is worse for agent 1 than it is); his utility rises from 1.5 to about 1.72. (If I deliver, I don't deliver h.)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
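The FP6 arithmetic above can be verified with exact fractions; the solver name is mine, and the payoff numbers are the ones from the slide.

```python
from fractions import Fraction

def eq_p(u1_p, u1_q, u2_p, u2_q):
    # p solving p*u1_p + (1-p)*u1_q == p*u2_p + (1-p)*u2_q
    return Fraction(u2_q - u1_q, (u1_p - u1_q) - (u2_p - u2_q))

p_truth = eq_p(-1, 6, 4, -3)                 # 9/14
p_lie = eq_p(-3, 6, 4, -5)                   # 11/18 (decoy shifts the split)
u_truth = p_truth * -1 + (1 - p_truth) * 6   # 21/14 = 1.5
u_lie = p_lie * -1 + (1 - p_lie) * 6         # 31/18, about 1.72: lying helped
```

Note that agent 1's true payoffs (-1 and 6) are used in both utility lines; only the negotiated probability changes under the lie.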
107
Postmen: return to post office
(Diagrams: the concave example; the subadditive example, where h is the decoy; the phantom example.)
108
Non-incentive-compatible fixed points
• FP7: in modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does).
• Ex 3 (from next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial.
• Ex 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((fae, bcd) : 1/2) = 4 (saves going to two nodes).
• Under the lie, divide as (efd, cab : p): you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with (ab, cdef : p):
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually:
• (3/4)(6) + (1/4)(0) = 4.5
• Note: when I get assigned cdef 1/4 of the time, I still have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally); zero utility from that.
110
(Diagram: the modular example.)
111
Conclusion
- In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
- TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
- Representing and maintaining belief models
- Reasoning about other agents' beliefs
- Influencing other agents' intentions and beliefs
114
PERSUADER: case study
• Program to resolve problems in the labor relations domain
• Agents:
- Company
- Union
- Mediator
• Tasks:
- Generation of proposal
- Generation of counter-proposal based on feedback from the dissenting party
- Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (like in a court of law: cite previous decisions)
• Process:
- Retrieve appropriate precedent cases from memory
- Select the most appropriate case
- Construct an appropriate solution
- Evaluate the solution for applicability to the current case
- Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
- Minimizes the need for information exchange
- Avoids problems by reasoning from past failures (intentional reminding)
- Repairs for past failures are reused; reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
- Maximizes joint payoff
- Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
- Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
- Change the importance attached to an issue
- Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
- Objectionable issues
- Reason for rejection
- Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Graph: worth function for the first agent; worth rises from 0 to 100 as the meeting time moves from 9:00 through 12:00 to 16:00.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs: convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Graph: utility versus number of negotiation rounds; Agent i's and Agent j's utility curves converge at the point of acceptance.)
123
Utility Graphs: no agreement
• No agreement
(Graph: utility versus number of negotiation rounds; the curves never cross, and Agent j finds the offer unacceptable.)
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified. Why did the agent pay so much for the car?
• Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet and shows the strength of their feelings.
4. Kisceral: appeals to the intuitive; doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
⟨Database, (Sentence, Grounds)⟩, where:
- Database is a (possibly inconsistent) set of logical formulae
- Sentence is a logical formula known as the conclusion
- Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database, and
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
- A attacks B = A →u B or A →r B
- A defeats B = A →u B or (A →r B and not B →u A)
- A strongly attacks B = A attacks B and not B →u A
- A strongly undercuts B = A →u B and not B →u A
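These derived notions are just set operations on the undercut relation u and the rebut relation r. A sketch (the pair-of-tuples representation and the example relations are mine, not from the slides):

```python
def derived_attacks(u, r):
    """u and r are sets of (attacker, target) pairs: undercuts and rebuts."""
    u_inv = {(y, x) for (x, y) in u}  # u^-1: the reversed undercut pairs
    return {
        "attacks": u | r,
        "defeats": u | (r - u_inv),
        "strongly_attacks": (u | r) - u_inv,
        "strongly_undercuts": u - u_inv,
    }

# hypothetical example: A and B undercut each other, C undercuts B, B rebuts C
rel = derived_attacks(u={("A", "B"), ("B", "A"), ("C", "B")},
                      r={("B", "C")})
```

Here only C's undercut of B survives as a strong attack: every other attack is answered by an undercut in the opposite direction.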
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Writing x → y indicates:
- "argument x attacks argument y"
- "x is a counterexample of y"
- "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: true unless proven false.
131
Admissible Arguments (mutually defensible)
1. Argument x is attacked (with respect to a set of arguments) if no member attacks y, where y → x.
2. Argument x is acceptable if every attacker of x is attacked.
3. An argument set is conflict free if none of its members attack each other.
4. A set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked).
132
(Diagram: arguments a, b, c, d connected by an attack relation.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
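Definitions 1-4 above translate directly into a checker. The attack relation below is hypothetical (the slide's actual graph for a, b, c, d is not recoverable from the text):

```python
def conflict_free(s, attacks):
    # no member of s attacks another member of s
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s, attacks, args):
    # every attacker of x is itself attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for y in args if (y, x) in attacks)

def admissible(s, attacks, args):
    return conflict_free(s, attacks) and \
           all(acceptable(x, s, attacks, args) for x in s)

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "c"), ("c", "d")}  # hypothetical chain
ok = admissible({"a", "c"}, attacks, args)   # a defends c against b
bad = admissible({"b"}, attacks, args)       # b's attacker a goes unanswered
```

In the chain example, {a, c} is admissible because a is unattacked and a counterattacks c's only attacker, while {b} is not, since nothing in the set answers a.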
133
An Example Abstract Argument System
61
Parcel Delivery Domain Example 2 (donrsquot return to dist point)Distribution Point
a d
7 7
Cost functionc()=0c(a)=c(d)=7c(b)=c(c)=c(ab)=c(cd)=8c(bc)=c(abc)=c(bcd)=9c(ad)=c(abd)=c(acd)=c(abcd)=10
b c1 1 1
Negotiation Set (abcd ) (abc) d) (ab cd) (a bcd) ( abcd)
Conflict Deal (abcd abcd)
All choices are IR as canrsquot do worse (acbd) is dominated by (abcd)
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No Pure Deal Agent 1s Utility Agent 2s Utility
1 (abcd ) 0 10
2 (abc) d) 1 3
3 (ab cd) 2 2
4 (a bcd) 3 1
5 ( abcd) 10 0
Conflict deal 0 0
agent 1 agent 25 4 3 2 1
63
What bothers you about the previous agreement
bull Decide to both get (22) utility rather than the expected utility of (010) for another choice
bull Is there a solution
bull Fair versus higher global utility
bull Restrictions of this method (no promises for future or sharing of utility)
64
Nash Equilibrium
bullThe Zeuthen strategy is in Nash equilibrium under the assumption that when one agent is using the strategy the other can do no better than use it himselfbullGenerally Nash equilibrium is not applicable in negotiation setting because it requires both sides utility function bullIt is of particular interest to the designer of automated agents It does away with any need for secrecy on the part of the programmer since first step reveals true desiresbullAn agentrsquos strategy can be publicly known and no other agent designer can exploit the information by choosing a different strategy In fact it is desirable that the strategy be known to avoid inadvertent conflicts
65
State Oriented Domainbull Goals are acceptable final states (superset of TOD)
bull Have side effects - agent doing one action might hinder or help another agent Example on(whitegray) has side effect of clear(black)
bull Negotiation develop joint plans and schedules for the agents to help and not hinder other agents
bull Example ndash Slotted blocks world -blocks cannot go anywhere on table ndash only in slots (restricted resource)
bull Note how this simple change (slots) makes it so two workers get in each ohterrsquos way even if goals are unrelated
66
bull Joint plan is used to mean ldquowhat they both dordquo not ldquowhat they do togetherrdquo ndash just the joining of plans There is no joint goal
bull The actions taken by agent k in the joint plan are called krsquos role and is written as Jk
bull C(J)k is the cost of krsquos role in joint plan Jbull In TOD you cannot do anotherrsquos task as a side effect of
doing yours or get in their way bull In TOD coordinated plans are never worse as you can
just do your original taskbull With SOD you may get in each otherrsquos waybull Donrsquot accept partially completed plans
State oriented domain is a bit more powerful than TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50).
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation.
3. Interagent comparison of utility: common utility units.
4. Symmetric abilities (all can perform all tasks, and the cost is the same regardless of which agent performs it).
5. Binding commitments.
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement).
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with.
• Looking for a state in the intersection of the goals.
• Possibilities:
– Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost).
– The goals may contradict, so there is no mutually acceptable state (e.g., both need a car).
– A common state exists, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another).
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain).
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents.
• State 1 has a cost of 6 for one agent and 2 for the other.
• State 2 costs both agents 5.
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery.
• Assign a probability that one agent would get a certain plan.
• This is called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p·c(Ji) + (1−p)·c(Jk), where k is i's opponent – the role i plays with probability (1−p).
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan.
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home)
[Figure: a distribution point connected to city a and city b at cost 1 each; traveling between a and b costs 2]
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (orig. a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = −2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (orig. ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
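The mixed-deal cost formula from the previous slide can be sketched numerically with the costs given for this example (c(∅)=0, c(a)=1, c(b)=1, c(ab)=3). The helper names are illustrative, not from the text:

```python
# Expected cost and utility of a mixed deal in the parcel delivery domain.
cost = {(): 0, ("a",): 1, ("b",): 1, ("a", "b"): 3}

def expected_cost(role, other_role, p):
    """cost_i(delta) = p*c(J_i) + (1-p)*c(J_k) for deal delta = (J, p)."""
    return p * cost[role] + (1 - p) * cost[other_role]

def utility(standalone, role, other_role, p):
    """Utility = cost of achieving the goal alone - expected cost under the deal."""
    return standalone - expected_cost(role, other_role, p)

# Deal ((∅, ab); 5/6): agent 1 (standalone cost 1) does nothing with prob 5/6;
# agent 2 (standalone cost 3) then carries both letters with prob 5/6.
print(utility(1, (), ("a", "b"), 5/6),
      utility(3, ("a", "b"), (), 5/6))   # ≈ 0.5 for each agent
```

With p = 5/6 (derived on the next slide) both agents end up with the same expected utility.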
Consider deal 3, with probability
• ((∅, ab); p) means agent 1 does ∅ with probability p and ab with probability (1−p).
• What should p be to be fair to both (equal utility)?
• p(1) + (1−p)(−2) = utility for agent 1
• p(0) + (1−p)(3) = utility for agent 2
• p(1) + (1−p)(−2) = p(0) + (1−p)(3)
• −2 + 2p + p = 3 − 3p  ⇒  p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair.
74
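The computation above just equates two expected utilities that are linear in p; a small sketch (the helper name is illustrative):

```python
from fractions import Fraction

def fair_p(uA_at0, uA_at1, uB_at0, uB_at1):
    """Equate two expected utilities that are linear in p and solve for p.
    uX_at0 / uX_at1 are agent X's utilities at p = 0 and p = 1."""
    return Fraction(uB_at0 - uA_at0, (uA_at1 - uA_at0) - (uB_at1 - uB_at0))

# Deal 3, ((∅, ab); p): agent 1 has utility -2 at p=0 (does ab) and 1 at p=1
# (does nothing); agent 2 has 3 at p=0 and 0 at p=1.
p = fair_p(-2, 1, 3, 0)
print(p)   # 5/6
```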
Try again with the other choice in the negotiation set
• ((a, b); p) means agent 1 does a with probability p and b with probability (1−p).
• What should p be to be fair to both (equal utility)?
• p(0) + (1−p)(0) = utility for agent 1
• p(2) + (1−p)(2) = utility for agent 2
• 0 = 2: no solution.
• Can you see why we can't use a p to make this deal fair?
75
Mixed deal
• An all-or-nothing deal (one agent does everything): the mixed deal m = [(TA ∪ TB, ∅); p] such that NS(m) = maxd NS(d).
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before.
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as this is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even when both are the same, just to avoid possible bias – like switching goals in soccer.
77
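The slide's exercise can be checked directly: dividing a total utility of 10 between two agents, the product is maximized by the equal split.

```python
# Enumerate integer splits of a total utility of 10 and find the one
# maximizing the product of utilities (the Nash bargaining choice).
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)          # (5, 5)
print(5 * 5, 6 * 4)  # 25 24 -- any unequal split gives a smaller product
```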
Examples: Cooperative
Each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (The goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise
Both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) cost split p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3): he loses utility if he takes the cost-6 role for a benefit of 3.
• Utility for agent 2 = p(0) + (1−p)(4).
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus we can take an unfair division and make it fair.
81
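The p = 7/8 result can be verified numerically, with the role assignments read as on the slide (with probability p, A1 takes the cost-2 role and A2 the cost-6 role):

```python
from fractions import Fraction

# A1 values the goal at 3, A2 at 6.
def u1(p):  # A1: utility 1 in the cost-2 role, -3 in the cost-6 role
    return p * 1 + (1 - p) * (-3)

def u2(p):  # A2: utility 0 in the cost-6 role, 4 in the cost-2 role
    return p * 0 + (1 - p) * 4

p = Fraction(7, 8)
print(u1(p), u2(p))   # 1/2 1/2 -- both agents end up with equal expected utility
```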
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50.
• It may make sense to have the agent with the highest worth get his way, as his utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, continued
• Winning agent utility: 26 − 4 − 10 = 12.
• Losing agent utility: −4 (as he helped with the swap).
• So with ½ probability each: ½(12) + ½(−4) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note: not "all or nothing" – but how close you got to the goal.
85
Worth-oriented Domain: Definition
• Can be defined as a tuple <E, Ag, J, c>:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
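A minimal sketch of the tuple <E, Ag, J, c> as a data structure, with a per-agent worth function added per the surrounding slides (worth of the final state minus the agent's share of the plan cost). All names are illustrative, not from Rosenschein & Zlotkin:

```python
from dataclasses import dataclass
from typing import Callable, List, Set

@dataclass
class WorthOrientedDomain:
    states: Set[str]                    # E: possible environment states
    agents: List[str]                   # Ag: the agents
    joint_plans: List[str]              # J: possible joint plans
    cost: Callable[[str, str], float]   # c(plan, agent): cost of agent's role

def utility(dom, agent, worth, plan, final_state):
    """Worth of the state reached minus the agent's cost in the joint plan."""
    return worth[final_state] - dom.cost(plan, agent)

# Toy usage: one agent, one plan costing 2, reaching a state worth 5.
dom = WorthOrientedDomain({"s0", "s1"}, ["a1"], ["p0"], lambda plan, ag: 2)
print(utility(dom, "a1", {"s1": 5}, "p0", "s1"))   # 3
```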
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle): worth isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
89
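The weighted-attribute rule can be sketched as follows. The offers, and the assumption that each attribute is already normalized to a common 0–10 scale, are illustrative (the slide does not say how to normalize):

```python
# Weighted-attribute utility: Utility = price*60% + quality*15% + support*25%.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    """Weighted sum of attribute scores (assumed pre-normalized to 0-10)."""
    return sum(WEIGHTS[attr] * score for attr, score in offer.items())

cheap_but_poor = {"price": 9, "quality": 3, "support": 2}
pricier_but_good = {"price": 6, "quality": 9, "support": 8}
print(round(utility(cheap_but_poor), 2),
      round(utility(pricier_but_good), 2))   # 6.35 6.95
```

With these weights, the better quality and support outweigh the worse price, illustrating the tradeoff the slide describes.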
Incomplete Information
• Don't know the tasks of others in a TOD.
• Solution:
– Exchange missing information
– Penalty for a lie
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with equality (= rather than <):
– deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I go above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at the store (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip).
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie).
92
Incentive-compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know this fact (diagonal arrows are between domains). A rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular encourages truth telling more than the others.
97
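The three properties (subadditive, concave, modular) can be checked by brute force for small task sets. The cost functions below are illustrative toys: independent per-task costs stand in for the fax domain, and a threshold cost mimics the power-equipment example above:

```python
from itertools import combinations

def subsets(s):
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def is_subadditive(tasks, c):
    # c(X u Y) <= c(X) + c(Y)
    return all(c[x | y] <= c[x] + c[y]
               for x in subsets(tasks) for y in subsets(tasks))

def is_concave(tasks, c):
    # for X subset of Y: c(Y u Z) - c(Y) <= c(X u Z) - c(X)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in subsets(tasks) for y in subsets(tasks) if x <= y
               for z in subsets(tasks))

def is_modular(tasks, c):
    # c(X u Y) = c(X) + c(Y) - c(X n Y)
    return all(c[x | y] == c[x] + c[y] - c[x & y]
               for x in subsets(tasks) for y in subsets(tasks))

tasks = frozenset({"a", "b", "c"})
# Independent per-task costs: modular, hence concave, hence subadditive.
modular_cost = {s: len(s) for s in subsets(tasks)}
# Threshold costs (two or more tasks force an equipment purchase): not subadditive.
threshold_cost = {s: len(s) if len(s) < 2 else 3 for s in subsets(tasks)}

print(is_modular(tasks, modular_cost),
      is_concave(tasks, modular_cost),
      is_subadditive(tasks, modular_cost))    # True True True
print(is_subadditive(tasks, threshold_cost))  # False
```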
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X), for X a subset of Y
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway), but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies sub-additivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange the messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterization – implied relationships between cells and with the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2.
• Expected utility: ((abc), ∅) at p = 1/2 is 5.
• Under the lie: p = 1/2 (as the declared utility is the same).
• Expected utility (for 1): ((abc), ∅) at p = 1/2 is ½(0) + ½(2) = 1 (as he still has to deliver the lie).
[Figure: delivery network with edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as if the other agent delivers the phantom, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

1's role | U(1) | 2's role | U(2) seems | U(2) actual
a        | 2    | bc       | 4          | 4
b        | 4    | ac       | 2          | 2
bc       | 2    | a        | 4          | 2
ab       | 0    | c        | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is ((a, b); 1/2) = ½(4) + ½(2) = 3.
• Lie: (bc, a) is the logical division, as there is no percentage.
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h).
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h).
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9.
• If he invents task h, p = 11/18, as:
p(−3) + (1−p)(6) = p(4) + (1−p)(−5).
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
107
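The FP6 arithmetic can be checked mechanically; `fair_p` is an assumed helper (not from the text) that solves the linear balance equation for p:

```python
from fractions import Fraction

def fair_p(uA_at0, uA_at1, uB_at0, uB_at1):
    """Equate two expected utilities that are linear in p and solve for p."""
    return Fraction(uB_at0 - uA_at0, (uA_at1 - uA_at0) - (uB_at1 - uB_at0))

# Truth: p(-1) + (1-p)(6) = p(4) + (1-p)(-3)  ->  p = 9/14
p_truth = fair_p(6, -1, -3, 4)
# With the decoy h: p(-3) + (1-p)(6) = p(4) + (1-p)(-5)  ->  p = 11/18
p_lie = fair_p(6, -3, -5, 4)

def real_utility(p):   # agent 1's true expected utility: p(-1) + (1-p)(6)
    return p * (-1) + (1 - p) * 6

print(p_truth, p_lie)                              # 9/14 11/18
print(real_utility(p_truth), real_utility(p_lie))  # 3/2 31/18
```

The lie shifts p in agent 1's favor, raising his true expected utility from 3/2 to 31/18.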
Postmen – return to post office
[Figures: concave example; subadditive example (h is the decoy); phantom example]
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost more than it really does).
• Example 3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Example 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((fae, bcd); 1/2) = 4 (saves going to two).
• Under the lie, divide as ((efd, cab); p)? You always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with ((abcdef, ∅); p):
p(4) + (1−p)(0) = p(2) + (1−p)(6)
4p = −4p + 6
p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5.
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
110
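The FP8 arithmetic, checked with exact fractions:

```python
from fractions import Fraction

# Balance the deal ((abcdef, ∅); p): p(4) + (1-p)(0) = p(2) + (1-p)(6).
p = Fraction(3, 4)                      # from 4p = -4p + 6
assert 4 * p == 2 * p + 6 * (1 - p)     # the balance equation holds

# A1's actual expected utility: 6 with prob p, 0 with prob 1-p
# (when assigned cdef he still secretly delivers his hidden letter to a).
print(p * 6 + (1 - p) * 0)              # 9/2, i.e. 4.5 > 4 under truth
```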
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs used for past failures are reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreed parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory condition.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rising from 0 to 100 across meeting times 9:00, 12:00, 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Figure: utility vs. number of negotiation rounds over time – Agent i's and Agent j's offer curves converge to a point of acceptance]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.
[Figure: utility vs. number of negotiation rounds over time – Agent i's and Agent j's offer curves never meet]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings.
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic-Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae;
– Sentence is a logical formula known as the conclusion;
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database;
2. Sentence can be proved from Grounds.
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
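A sketch of undercut/rebut and the derived notions on the milk/cheese example. The argument representation (premises, conclusion) and the `~` negation convention are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Arg:
    name: str
    premises: frozenset   # literals, e.g. "good(milk)"
    conclusion: str

def neg(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def undercuts(a, b):      # a's conclusion denies one of b's premises
    return any(a.conclusion == neg(p) for p in b.premises)

def rebuts(a, b):         # a's conclusion denies b's conclusion
    return a.conclusion == neg(b.conclusion)

def attacks(a, b):        # attack = undercut or rebut
    return undercuts(a, b) or rebuts(a, b)

def defeats(a, b):        # undercut, or rebut not undercut back
    return undercuts(a, b) or (rebuts(a, b) and not undercuts(b, a))

cheese = Arg("c1", frozenset({"good(milk)", "from(cheese,milk)"}), "good(cheese)")
fatty  = Arg("c2", frozenset({"fatty(milk)"}), "~good(milk)")              # undercut
bones  = Arg("c3", frozenset({"bad_for_bones(cheese)"}), "~good(cheese)")  # rebut

print(undercuts(fatty, cheese), rebuts(bones, cheese))  # True True
```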
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: true unless proven false.
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set of arguments if some member y of the set attacks x (y → x).
2. Argument x is acceptable with respect to a set if every attacker of x is attacked by the set.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable with respect to it (any attackers are attacked).
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be accepted? c is always attacked;
d is always acceptable.
133
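Admissible sets can be computed by brute force. The slide does not give the attack relation explicitly, so the graph below is an assumed one that is consistent with "c is always attacked, d is always acceptable":

```python
from itertools import chain, combinations

args = {"a", "b", "c", "d"}
# Assumed attack relation: a and b attack each other and both attack c;
# c attacks d.
attack_rel = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(S):
    return not any((x, y) in attack_rel for x in S for y in S)

def acceptable(x, S):
    # every attacker of x is attacked by some member of S
    return all(any((z, y) in attack_rel for z in S)
               for (y, t) in attack_rel if t == x)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

subsets = chain.from_iterable(combinations(sorted(args), r) for r in range(5))
adm = [set(S) for S in subsets if admissible(set(S))]
print(adm)   # c never appears; d appears only with a defender (a or b)
```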
An Example Abstract Argument System
62
Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
No Pure Deal Agent 1s Utility Agent 2s Utility
1 (abcd ) 0 10
2 (abc) d) 1 3
3 (ab cd) 2 2
4 (a bcd) 3 1
5 ( abcd) 10 0
Conflict deal 0 0
agent 1 agent 25 4 3 2 1
63
What bothers you about the previous agreement
bull Decide to both get (22) utility rather than the expected utility of (010) for another choice
bull Is there a solution
bull Fair versus higher global utility
bull Restrictions of this method (no promises for future or sharing of utility)
64
Nash Equilibrium
bullThe Zeuthen strategy is in Nash equilibrium under the assumption that when one agent is using the strategy the other can do no better than use it himselfbullGenerally Nash equilibrium is not applicable in negotiation setting because it requires both sides utility function bullIt is of particular interest to the designer of automated agents It does away with any need for secrecy on the part of the programmer since first step reveals true desiresbullAn agentrsquos strategy can be publicly known and no other agent designer can exploit the information by choosing a different strategy In fact it is desirable that the strategy be known to avoid inadvertent conflicts
65
State Oriented Domainbull Goals are acceptable final states (superset of TOD)
bull Have side effects - agent doing one action might hinder or help another agent Example on(whitegray) has side effect of clear(black)
bull Negotiation develop joint plans and schedules for the agents to help and not hinder other agents
bull Example ndash Slotted blocks world -blocks cannot go anywhere on table ndash only in slots (restricted resource)
bull Note how this simple change (slots) makes it so two workers get in each ohterrsquos way even if goals are unrelated
66
bull Joint plan is used to mean ldquowhat they both dordquo not ldquowhat they do togetherrdquo ndash just the joining of plans There is no joint goal
bull The actions taken by agent k in the joint plan are called krsquos role and is written as Jk
bull C(J)k is the cost of krsquos role in joint plan Jbull In TOD you cannot do anotherrsquos task as a side effect of
doing yours or get in their way bull In TOD coordinated plans are never worse as you can
just do your original taskbull With SOD you may get in each otherrsquos waybull Donrsquot accept partially completed plans
State oriented domain is a bit more powerful than TOD
67
Assumptions of SOD1 Agents will maximize expected utility (will prefer
51 chance of getting $100 than a sure $50)2 Agent cannot commit himself (as part of current
negotiation) to behavior in future negotiation3 Interagent comparison of utility common utility
units4 Symmetric abilities (all can perform tasks and cost
is same regardless of agent performing)5 Binding commitments6 No explicit utility transfer (no ldquomoneyrdquo that can be
used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
bull Goal of each agent is represented as a set of states that they would be happy with
bull Looking for a state in intersection of goalsbull Possibilities
ndash Both can be achieved at gain to both (eg travel to same location and split cost)
ndash Goals may contradict so no mutually acceptable state (eg both need a car)
ndash Can find common state but perhaps it cannot be reached with the primitive operations in the domain (could both travel together but may need to know how to pickup another)
ndash Might be a reachable state which satisfies both but may be too expensive ndash unwilling to expend effort (ie we could save a bit if we car-pooled but is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
bull Suppose there are two states that satisfy both agents
bull State 1 one has a cost of 6 for one agent and 2 for the other
bull State 2 costs both agents 5bull State 1 is cheaper (overall) but state 2 is
more equal How can we get cooperation (as why should one agent agree to do more)
70
Mixed deal
bull Instead of picking the plan that is unfair to one agent (but better overall) use a lottery
bull Assign a probability that one would get a certain plan
bull Called a mixed deal ndash deal with probability Compute probabilty so that expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
cost_i(δ) = p·c(J)_i + (1−p)·c(J)_k, where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming agents do not have to return home)
[Figure: a distribution point linked to city a and city b, each at distance 1; a and b are distance 2 apart]
Cost function: c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3
Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned {a,b}):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
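The utility numbers in the tables follow mechanically from the cost function: an agent's utility under a deal is the cost of its original assignment minus the cost of its share. A small sketch (the tuple encoding of task sets is my own):

```python
# Utility of a deal = cost of the agent's original assignment alone,
# minus the cost of its share under the deal.
cost = {(): 0, ('a',): 1, ('b',): 1, ('a', 'b'): 3}  # c() from the slide

def utility(original, share):
    return cost[original] - cost[share]

# Agent 1 was originally assigned {a}; agent 2 was assigned {a, b}.
for share1, share2 in [(('a',), ('b',)), (('b',), ('a',)),
                       (('a', 'b'), ()), ((), ('a', 'b'))]:
    print(share1, share2, utility(('a',), share1), utility(('a', 'b'), share2))
# The third deal gives agent 1 utility -2 and agent 2 utility 3,
# matching row 3 of the tables above.
```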
73
Consider deal 3 with probability
• In the mixed deal, agent 1 does ∅ with probability p and {a,b} with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(−2) + p(1) = utility for agent 1
• (1−p)(3) + p(0) = utility for agent 2
• (1−p)(−2) + p(1) = (1−p)(3) + p(0)
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair
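The fairness condition is a linear equation in p, so it can be solved and checked directly; a quick sketch using the utilities from the previous slide (variable names are mine):

```python
from fractions import Fraction

# With probability p agent 1 does nothing (utility 1 for agent 1, 0 for
# agent 2); otherwise agent 1 does {a,b} (utility -2 for 1, 3 for 2).
u1_hi, u1_lo = 1, -2
u2_hi, u2_lo = 0, 3
# Equal expected utility: p*u1_hi + (1-p)*u1_lo = p*u2_hi + (1-p)*u2_lo
p = Fraction(u2_lo - u1_lo, (u1_hi - u1_lo) - (u2_hi - u2_lo))
print(p)  # 5/6, matching the slide
```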
74
Try again with the other choice in the negotiation set
• Here agent 1 does {a} with probability p and {b} with probability (1−p)
• What should p be to be fair to both (equal utility)?
• (1−p)(0) + p(0) = utility for agent 1
• (1−p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair? (Each agent gets the same utility under either assignment, so mixing cannot close the gap)
75
Mixed deal
• All-or-nothing deal (one agent does everything), such that
– the mixed deal m = [(T_A ∪ T_B, ∅); p] satisfies NS(m) = max over d of NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the mechanism which maximizes the product of the utilities (as this is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
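A one-line enumeration illustrates why maximizing the product of utilities picks the equal division of the 10 units suggested above:

```python
# Splitting a fixed total utility of 10 between two agents:
# the product of utilities is maximized by the even 5/5 split.
splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)  # (5, 5)
```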
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (both goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each agent 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but each is worse off than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one agent working alone is two; under the joint plan the cost averages four per agent
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the roles (2, 6) p of the time (A1 takes the cost-2 role with probability p)
• Utility for agent 1 = p(1) + (1−p)(−3) – it loses utility if it pays 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus I can take an unfair division and make it fair
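The p = 7/8 value can be verified directly; the variable names below are mine:

```python
from fractions import Fraction

# Worths: A1 values the goal at 3, A2 at 6; the two roles cost 2 and 6.
# With probability p, A1 takes the cheap role: u1 = 3-2 = 1,  u2 = 6-6 = 0.
# Otherwise A1 takes the expensive role:       u1 = 3-6 = -3, u2 = 6-2 = 4.
u1_cheap, u1_dear = 1, -3
u2_cheap, u2_dear = 0, 4
p = Fraction(u2_dear - u1_dear, (u1_cheap - u1_dear) - (u2_cheap - u2_dear))
u1 = p * u1_cheap + (1 - p) * u1_dear
u2 = p * u2_cheap + (1 - p) * u2_dear
print(p, u1, u2)  # 7/8 1/2 1/2 - the unfair division is fair in expectation
```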
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of the same two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as he helped with the swap)
• So with probability ½ each: ½(12) + ½(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing" – but how close you got to the goal
85
Worth-oriented Domain: Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle) – worth isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price: [5, 100], quality: [0, 10], support: [1, 5]
– Try to find the Pareto optimum
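One hedged reading of the weighted-attribute scheme above (normalizing each attribute to [0, 1] is my assumption; the slide only gives the weights and the ranges):

```python
# Combine the three attributes with weights 60% / 15% / 25%, after
# normalizing each to [0, 1] with higher = better (my assumption).
def utility(price, quality, support):
    price_score = 1 - (price - 5) / (100 - 5)    # cheaper is better
    quality_score = quality / 10
    support_score = (support - 1) / (5 - 1)
    return 0.60 * price_score + 0.15 * quality_score + 0.25 * support_score

print(utility(price=5, quality=10, support=5))   # best on every attribute, ~1.0
print(utility(price=100, quality=0, support=1))  # worst on every attribute, 0.0
```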
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange the missing information
– Impose a penalty for lying
• Possible lies:
– False information:
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets:
for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of a subadditive domain:
– Delivering to one location saves distance to the other (in a tree arrangement)
• Example of a subadditive TOD with equality (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where exceeding a threshold forces me to buy new equipment
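A small sketch of the subadditivity test. The costs assume the return-home variant of the two-city parcel domain (each city at distance 1 from the distribution point, the cities 2 apart); note the earlier no-return version is not subadditive, since c({a,b}) = 3 > 1 + 1:

```python
# Return-home costs (assumed): a round trip to one city costs 2;
# visiting both (out to a, across to b, back home) costs 4.
cost = {frozenset(): 0, frozenset('a'): 2, frozenset('b'): 2, frozenset('ab'): 4}

def is_subadditive(cost):
    tasks = list(cost)
    return all(cost[x | y] <= cost[x] + cost[y] for x in tasks for y in tasks)

print(is_subadditive(cost))  # True: delivering both reuses the trip out
```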
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• "I need to pick something up at the store" (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• "I need to deliver an empty letter" (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced; then
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the beginning of a chain is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help)
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others
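The modularity identity holds exactly for any cost function that is a sum of independent per-task costs, as in the fax domain; the specific costs below are made up:

```python
# A modular cost function in the spirit of the fax domain: each
# destination has an independent connection cost, so
# c(X | Y) = c(X) + c(Y) - c(X & Y) for all X, Y.
line_cost = {'a': 1, 'b': 2, 'c': 4}  # assumed per-destination costs

def c(tasks):
    return sum(line_cost[t] for t in tasks)

sets = [frozenset(s) for s in ('', 'a', 'b', 'ab', 'bc', 'abc')]
assert all(c(x | y) == c(x) + c(y) - c(x & y) for x in sets for y in sets)
```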
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect Z to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless it is restricted to trees)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Example: if A1 hides his letter to c, his utility doesn't increase
• If he tells the truth: p = ½
• Expected utility of the all-or-nothing deal over {a,b,c} with p = ½ is 5
• Lie: p = ½ (as the apparent utility is the same)
• Expected utility (for agent 1) = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter himself)
[Figure: delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as if the other agent delivers the phantom, you are found out)
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different than the real benefit):

Agent 1 gets | U(1) | Agent 2 gets | U(2) seems | U(2) actual
a | 2 | bc | 4 | 4
b | 4 | ac | 2 | 2
bc | 2 | a | 4 | 2
ab | 0 | c | 6 | 6

Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
105
Non-incentive-compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example from the next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so the utility for agent 1 is ½(4) + ½(2) = 3
• Lie: ({b,c}, {a}) is the logical division, as there is no percentage split
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful) (as the lie changes the probability: if you deliver, I make you deliver to h)
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as:
p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as:
p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
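The FP6 arithmetic can be replayed directly; the helper below solves the equal-expected-utility equation for p in each case:

```python
from fractions import Fraction

def solve_p(u1_all, u1_none, u2_all, u2_none):
    # p*u1_all + (1-p)*u1_none = p*u2_all + (1-p)*u2_none
    return Fraction(u2_none - u1_none, (u1_all - u1_none) - (u2_all - u2_none))

p_truth = solve_p(-1, 6, 4, -3)  # 9/14
p_lie = solve_p(-3, 6, 4, -5)    # 11/18 (apparent utilities under the decoy)

def real_utility(p):
    # Agent 1's real utility is always p*(-1) + (1-p)*6.
    return p * (-1) + (1 - p) * 6

print(real_utility(p_truth), real_utility(p_lie))  # 3/2 vs 31/18: the lie helps
```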
107
Postmen – return to post office
[Figures: example delivery graphs – one concave, one subadditive (h is the decoy), and one for the phantom case]
108
Non-incentive-compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have fewer tasks, so an increased load seems to cost more than it really does)
• Example 3 (from the next slide): A1 hides his letter to node b
• ({e}, {b}): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• ({b}, {e}): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Example 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: the deal splits the nodes as ({f,a,e}, {b,c,d}) with p = ½, giving utility 4 (each saves going to two nodes)
• Under the lie, divide as ({e,f}, {d,c,a,b}) with probability p? (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie with ({a,b}, {c,d,e,f}) and probability p:
p(4) + (1−p)(0) = p(2) + (1−p)(6)
4p = −4p + 6
p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that
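The FP8 probability and utility can be verified directly:

```python
from fractions import Fraction

# Under the hide lie with division ({a,b}, {c,d,e,f}):
# p*4 + (1-p)*0 = p*2 + (1-p)*6  =>  8p = 6  =>  p = 3/4.
p = Fraction(6 - 0, (4 - 0) - (2 - 6))
# Agent 1's actual expected utility: 6 when it does {a,b}, 0 otherwise.
actual = p * 6 + (1 - p) * 0
print(p, actual)  # 3/4 and 9/2 (= 4.5 > 4, so hiding the letter paid off)
```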
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory behavior
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: the first agent's worth function, rising from 0 to 100 across the times 9:00, 12:00, 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility over the number of negotiation rounds – agent i's and agent j's utility curves converge to a point of acceptance]
123
Utility Graphs – no agreement
• No agreement: agent j finds the offer unacceptable
[Figure: utility over the number of negotiation rounds – agent i's and agent j's utility curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept that B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database, and
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A →a B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
129
Proposition: Hierarchy of attacks
• Undercuts = →u
• Strongly undercuts = →su = →u − →u⁻¹
• Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
• Defeats = →d = →u ∪ (→r − →u⁻¹)
• Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Writing x → y indicates:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set if some attacker y of x (y → x) is not itself attacked by any member of the set
2. argument x is acceptable with respect to a set if every attacker of x is attacked by the set
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked;
d is always acceptable
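The definitions above can be run on a small attack graph. The slide's actual graph is not recoverable from the text, so the attack relation below is an assumption chosen to match the remarks (a and b attack each other and both attack c; d is unattacked):

```python
from itertools import chain, combinations

args = {'a', 'b', 'c', 'd'}
attacks = {('a', 'b'), ('b', 'a'), ('a', 'c'), ('b', 'c')}  # assumed graph

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # Every attacker of x must be attacked by some member of s.
    return all(any((z, y) in attacks for z in s)
               for y in args if (y, x) in attacks)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = chain.from_iterable(combinations(sorted(args), r) for r in range(5))
good = [set(s) for s in subsets if admissible(set(s))]
print(good)  # c appears in no admissible set; d, having no attackers, can
             # join any admissible set
```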
133
An Example Abstract Argument System
63
What bothers you about the previous agreement
bull Decide to both get (22) utility rather than the expected utility of (010) for another choice
bull Is there a solution
bull Fair versus higher global utility
bull Restrictions of this method (no promises for future or sharing of utility)
64
Nash Equilibrium
bullThe Zeuthen strategy is in Nash equilibrium under the assumption that when one agent is using the strategy the other can do no better than use it himselfbullGenerally Nash equilibrium is not applicable in negotiation setting because it requires both sides utility function bullIt is of particular interest to the designer of automated agents It does away with any need for secrecy on the part of the programmer since first step reveals true desiresbullAn agentrsquos strategy can be publicly known and no other agent designer can exploit the information by choosing a different strategy In fact it is desirable that the strategy be known to avoid inadvertent conflicts
65
State Oriented Domainbull Goals are acceptable final states (superset of TOD)
bull Have side effects - agent doing one action might hinder or help another agent Example on(whitegray) has side effect of clear(black)
bull Negotiation develop joint plans and schedules for the agents to help and not hinder other agents
bull Example ndash Slotted blocks world -blocks cannot go anywhere on table ndash only in slots (restricted resource)
bull Note how this simple change (slots) makes it so two workers get in each ohterrsquos way even if goals are unrelated
66
bull Joint plan is used to mean ldquowhat they both dordquo not ldquowhat they do togetherrdquo ndash just the joining of plans There is no joint goal
bull The actions taken by agent k in the joint plan are called krsquos role and is written as Jk
bull C(J)k is the cost of krsquos role in joint plan Jbull In TOD you cannot do anotherrsquos task as a side effect of
doing yours or get in their way bull In TOD coordinated plans are never worse as you can
just do your original taskbull With SOD you may get in each otherrsquos waybull Donrsquot accept partially completed plans
State oriented domain is a bit more powerful than TOD
67
Assumptions of SOD1 Agents will maximize expected utility (will prefer
51 chance of getting $100 than a sure $50)2 Agent cannot commit himself (as part of current
negotiation) to behavior in future negotiation3 Interagent comparison of utility common utility
units4 Symmetric abilities (all can perform tasks and cost
is same regardless of agent performing)5 Binding commitments6 No explicit utility transfer (no ldquomoneyrdquo that can be
used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
bull Goal of each agent is represented as a set of states that they would be happy with
bull Looking for a state in intersection of goalsbull Possibilities
ndash Both can be achieved at gain to both (eg travel to same location and split cost)
ndash Goals may contradict so no mutually acceptable state (eg both need a car)
ndash Can find common state but perhaps it cannot be reached with the primitive operations in the domain (could both travel together but may need to know how to pickup another)
ndash Might be a reachable state which satisfies both but may be too expensive ndash unwilling to expend effort (ie we could save a bit if we car-pooled but is too complicated for so little gain)
69
What if choices donrsquot benefit others fairly
bull Suppose there are two states that satisfy both agents
bull State 1 one has a cost of 6 for one agent and 2 for the other
bull State 2 costs both agents 5bull State 1 is cheaper (overall) but state 2 is
more equal How can we get cooperation (as why should one agent agree to do more)
70
Mixed deal
bull Instead of picking the plan that is unfair to one agent (but better overall) use a lottery
bull Assign a probability that one would get a certain plan
bull Called a mixed deal ndash deal with probability Compute probabilty so that expected utility is the same for both
71
Cost
bull If = (Jp) is a deal then
costi() = pc(J)i + (1-p)c(J)k where k is irsquos opponent -the role i plays with (1-p) probability
bull Utility is simply difference between cost of achieving goal alone and expected utility of joint plan
bull For postman Example
72
Parcel Delivery Domain (assuming do not have to return home)
Distribution Point
city a city b
1 1
Cost functionc()=0c(a)=1c(b)=1c(ab)=3
Utility for agent 1 (org a)
1 Utility1(a b) = 0
2 Utility1(b a) = 0
3 Utility1(a b ) = -2
4 Utility1( a b) = 1
hellip
Utility for agent 2 (org ab)
1 Utility2(a b) = 2
2 Utility2(b a) = 2
3 Utility2(a b ) = 3
4 Utility2( a b) = 0
hellip
2
73
Consider deal 3 with probability
bull (ab)p means agent 1 does with p probabilty and ab with (1-p) probabilty
bull What should p be to be fair to both (equal utility)bull (1-p)(-2) + p1 = utility for agent 1bull (1-p)(3) + p0 = utility for agent 2bull (1-p)(-2) + p1= (1-p)(3) + p0 bull -2+2p+p = 3-3p =gt p=56bull If agent 1 does no deliveries 56 of the time it is
fair
74
Try again with other choice in negotiation set
bull (ab)p means agent 1 does a with p probabilty and b with (1-p) probabilty
bull What should p be to be fair to both (equal utility)
bull (1-p)(0) + p0 = utility for agent 1bull (1-p)(2) + p2 = utility for agent 2bull 0=2 no solutionbull Can you see why we canrsquot use a p to
make this fair
75
Mixed deal
bull All or nothing deal (one does everything) such that ndash mixed deal m = [(TATB )p] NS (m) = maxNS(d)
bull Mixed deal makes the solution space of deals continuous rather than discrete as it was before
76
bull A symmetric mechanism is in equilibrium if no one is motivated to change strategies We choose to use one which maximizes the product of utilities (as is a fairer division) Try dividing a total utility of 10 (zero sum) various ways to see when product is maximized
bull We may flip between choices even if both are the same just to avoid possible bias ndash like switching goals in soccer
77
Examples: Cooperative
Each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other's block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise
Both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2, 6) division (A1 takes the cheap role, A2 the expensive one) p of the time
• Utility for agent 1 = p(1) + (1-p)(-3): he loses utility if he takes a cost of 6 for a benefit of 3
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus I can take an unfair division and make it fair.
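A quick check of the arithmetic above (utilities as given on the slide; p is the probability that A1 gets the cheaper role):

```python
p = 7 / 8
u1 = p * 1 + (1 - p) * (-3)   # A1: worth 3, role cost 2 or 6
u2 = p * 0 + (1 - p) * 4      # A2: worth 6, role cost 6 or 2
# Both come out to 0.5, so the weighted coin makes the division fair
```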
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win. Could flip a coin to decide who wins: better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the person with the highest worth get his way, as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1′ swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as he helped with the swap)
• So with probability 1/2 each: 12(1/2) + (-4)(1/2) = 4
• If they could have both been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle): it isn't just a specific state that counts, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of it, such as price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
  – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
  – price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum
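A sketch of the weighted-attribute calculation. The 60/15/25 weights come from the slide; the offers and their pre-normalized scores (1.0 = best, with price already inverted so cheaper scores higher) are invented for illustration:

```python
def utility(offer, weights):
    # Weighted sum over the attributes named in `weights`
    return sum(weights[attr] * offer[attr] for attr in weights)

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
offers = {  # hypothetical offers, scores normalized to [0, 1]
    "A": {"price": 0.9, "quality": 0.4, "support": 0.6},
    "B": {"price": 0.5, "quality": 0.9, "support": 0.8},
}
best = max(offers, key=lambda name: utility(offers[name], weights))
# With price weighted at 60%, the cheap offer "A" wins despite lower quality
```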
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
  – Exchange missing information
  – Penalty for lying
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – combining task sets can only save cost
• For finite X, Y ⊆ T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive: delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment.
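The subadditivity condition can be checked by enumeration over a small cost table. In this sketch the first table mirrors the earlier non-returning parcel example (c(ab)=3 exceeds c(a)+c(b)=2, so it fails), while a tree-like table with c(ab)=2 passes:

```python
def is_subadditive(cost):
    # c(X ∪ Y) <= c(X) + c(Y) for every pair of task sets in the table
    sets = list(cost)
    return all(cost[x | y] <= cost[x] + cost[y] for x in sets for y in sets)

parcel = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
tree = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 2}
```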
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at the store. (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip.)
• Need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced. Then:
  – c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule at the beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that a task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postman domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
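Concavity can likewise be checked by enumeration. A sketch with two cost tables: a modular one, c(S) = |S|, which is additive and so should pass, and the non-returning parcel table from earlier, which should fail:

```python
def is_concave(cost):
    # For X ⊆ Y: c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y), for all Z in the table
    sets = list(cost)
    for x in sets:
        for y in sets:
            if not x <= y:        # frozenset <= is the subset test
                continue
            for z in sets:
                if cost[x | z] - cost[x] < cost[y | z] - cost[y]:
                    return False
    return True

sets_ab = [frozenset(s) for s in ("", "a", "b", "ab")]
modular = {s: len(s) for s in sets_ab}   # additive cost, hence concave
parcel = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
```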
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization: relationships implied between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Ex: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth: p = 1/2
• Expected utility of ((abc, ∅); 1/2) = 5
• Under the lie: p = 1/2 (as the apparent utility is the same)
• Expected utility (for 1) of ((abc, ∅); 1/2) = (1/2)(0) + (1/2)(2) = 1 (as he still has to deliver the hidden letter himself)
[Figure: delivery graph; edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out)
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth: the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks | U(1) | Agent 2's tasks | U(2) (seems) | U(2) (actual)
a               | 2    | bc              | 4            | 4
b               | 4    | ac              | 2            | 2
bc              | 2    | a               | 4            | 2
ab              | 0    | c               | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 of ((a, b); 1/2) is (1/2)(4) + (1/2)(2) = 3
• Lie: (b, ca) is the logical division, as no probability is needed. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability: if you deliver, I make you deliver to h as well
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as:
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped.
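The fractions above can be re-derived exactly. A sketch using the slide's utility figures (the two outcomes are "agent 1 delivers everything" vs. "agent 2 does"):

```python
from fractions import Fraction

def equal_p(u1_all, u1_none, u2_all, u2_none):
    # Solve p*u1_all + (1-p)*u1_none == p*u2_all + (1-p)*u2_none for p
    return Fraction(u1_none - u2_none, (u2_all - u2_none) - (u1_all - u1_none))

p_truth = equal_p(-1, 6, 4, -3)              # Fraction(9, 14)
p_lie = equal_p(-3, 6, 4, -5)                # Fraction(11, 18)
u_truth = p_truth * -1 + (1 - p_truth) * 6   # 21/14 = 1.5
u_lie = p_lie * -1 + (1 - p_lie) * 6         # 31/18 ~ 1.72, so the decoy pays
```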
107
Postmen – return to post office
[Figure: three example graphs – a concave example, a subadditive example (h is the decoy), and a phantom example]
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load appears to cost more than it really does)
• Ex3 (from next slide): A1 hides his letter to node b
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((fae, bcd); 1/2) = 4 (each saves going to two nodes)
• Under the lie, dividing as ((efd, cab); p) means you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with ((abc, def); p):
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6
  p = 3/4
• The utility is actually (3/4)(6) + (1/4)(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places, which is what I was assigned originally: zero utility for that.
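Checking FP8's arithmetic exactly (numbers from the slide; the division labels are as reconstructed above):

```python
from fractions import Fraction

# Apparent utilities under the lie: 4 or 0 for agent 1, 2 or 6 for agent 2.
# Fairness: p*4 + (1-p)*0 == p*2 + (1-p)*6  =>  8p = 6
p = Fraction(6, 8)
u_apparent = p * 4              # 3, equal for both sides on paper
u_actual = p * 6 + (1 - p) * 0  # agent 1's true utility: 9/2 = 4.5 > 4
```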
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures: intentional reminding
  – Repairs for past failures are reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
Worth function for first agent
[Graph: worth rises from 0 to 100 as the meeting time moves from 9:00 through 12:00 to 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
[Graph: utility vs. number of negotiation rounds; Agent i's and Agent j's offers converge over time to the point of acceptance]
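The convergence picture can be mimicked with a toy single-issue concession loop (all numbers invented): each agent starts at its ideal price and concedes a fixed step per round until the offers cross.

```python
def negotiate(seller=10.0, buyer=0.0, step=1.0, max_rounds=100):
    for rounds in range(1, max_rounds + 1):
        seller -= step            # seller lowers the asking price
        buyer += step             # buyer raises the offer
        if buyer >= seller:       # point of acceptance
            return rounds, (seller + buyer) / 2
    return None                   # no agreement within max_rounds

result = negotiate()              # meets after 5 rounds at price 5.0
```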
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds; Agent i's and Agent j's offers never cross]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
where:
  Database is a (possibly inconsistent) set of logical formulae;
  Sentence is a logical formula known as the conclusion;
  Grounds is a set of logical formulae such that:
    Grounds ⊆ Database, and
    Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  – A attacks B  ≡  A u B or A r B
  – A defeats B  ≡  A u B, or (A r B and not B u A)
  – A strongly attacks B  ≡  A a B and not B u A
  – A strongly undercuts B  ≡  A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
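These relation-algebra definitions compute directly over sets of pairs. A sketch with an invented undercut/rebut instance:

```python
u = {("a", "b")}                       # a undercuts b (invented pairs)
r = {("b", "a"), ("c", "a")}           # b and c rebut a

def inv(rel):
    return {(y, x) for (x, y) in rel}  # the inverse relation, rel⁻¹

attacks = u | r
defeats = u | (r - inv(u))
strongly_attacks = (u | r) - inv(u)
strongly_undercuts = u - inv(u)
```

Because a undercuts b back, b's rebuttal of a counts as an attack but not as a defeat.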
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked by a set of arguments if some member y of the set attacks x (y → x)
2. Argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
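The four definitions above translate directly into checks over an attack relation. A sketch with a made-up relation, where a pair (x, y) means "x attacks y":

```python
att = {("a", "b"), ("b", "a"), ("b", "c")}

def conflict_free(S, att):
    return not any((x, y) in att for x in S for y in S)

def acceptable(x, S, att):
    # every attacker of x is itself attacked by some member of S
    return all(any((z, y) in att for z in S)
               for (y, target) in att if target == x)

def admissible(S, att):
    return conflict_free(S, att) and all(acceptable(x, S, att) for x in S)
```

Here {a} and {a, c} are admissible (a answers b's attacks), while {c} alone is not, since nothing in it attacks c's attacker b.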
132
[Figure: argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked;
d is always acceptable.
133
An Example Abstract Argument System
64
Nash Equilibrium
• The Zeuthen strategy is in Nash equilibrium, under the assumption that when one agent is using the strategy, the other can do no better than use it himself.
• Generally, Nash equilibrium is not applicable in negotiation settings because it requires both sides' utility functions.
• It is of particular interest to the designer of automated agents. It does away with any need for secrecy on the part of the programmer, since the first step reveals true desires.
• An agent's strategy can be publicly known, and no other agent designer can exploit the information by choosing a different strategy. In fact, it is desirable that the strategy be known, to avoid inadvertent conflicts.
65
State Oriented Domain
• Goals are acceptable final states (a superset of TOD)
• Actions have side effects – an agent doing one action might hinder or help another agent. Example: on(white, gray) has the side effect of clear(black).
• Negotiation: develop joint plans and schedules for the agents, to help and not hinder other agents
• Example – slotted blocks world: blocks cannot go anywhere on the table, only in slots (a restricted resource)
• Note how this simple change (slots) makes it so two workers get in each other's way, even if their goals are unrelated
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans. There is no joint goal.
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD, coordinated plans are never worse, as you can just do your original task
• With an SOD, you may get in each other's way
• Don't accept partially completed plans
A state oriented domain is a bit more powerful than a TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that it would be happy with
• Looking for a state in the intersection of the goals
• Possibilities:
  – Both goals can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – The goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
  – A common state can be found, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
  – There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent would get a certain plan
• Called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J; p) is a deal, then cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent (the role i plays with probability 1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the deal
• For the postman example:
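A sketch of these formulas applied to the earlier parcel example (agent 1 originally delivers to a, standalone cost 1; agent 2 to a and b, standalone cost 3; the deal gives agent 1 the empty role with probability p = 5/6):

```python
def cost(role_costs, p, i):
    # cost_i(delta) = p*c(J)_i + (1-p)*c(J)_k, where k is i's opponent
    k = 1 - i
    return p * role_costs[i] + (1 - p) * role_costs[k]

def utility(standalone, role_costs, p, i):
    return standalone[i] - cost(role_costs, p, i)

standalone = [1, 3]      # each agent's cost of doing its own tasks alone
role_costs = [0, 3]      # deal (none, ab): agent 1 does nothing, agent 2 does both
u1 = utility(standalone, role_costs, 5 / 6, 0)   # 0.5
u2 = utility(standalone, role_costs, 5 / 6, 1)   # 0.5 -- the fair mixed deal
```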
72
Parcel Delivery Domain (assuming do not have to return home)
Distribution Point
city a city b
1 1
Cost functionc()=0c(a)=1c(b)=1c(ab)=3
Utility for agent 1 (org a)
1 Utility1(a b) = 0
2 Utility1(b a) = 0
3 Utility1(a b ) = -2
4 Utility1( a b) = 1
hellip
Utility for agent 2 (org ab)
1 Utility2(a b) = 2
2 Utility2(b a) = 2
3 Utility2(a b ) = 3
4 Utility2( a b) = 0
hellip
2
73
Consider deal 3 with probability
bull (ab)p means agent 1 does with p probabilty and ab with (1-p) probabilty
bull What should p be to be fair to both (equal utility)bull (1-p)(-2) + p1 = utility for agent 1bull (1-p)(3) + p0 = utility for agent 2bull (1-p)(-2) + p1= (1-p)(3) + p0 bull -2+2p+p = 3-3p =gt p=56bull If agent 1 does no deliveries 56 of the time it is
fair
74
Try again with other choice in negotiation set
bull (ab)p means agent 1 does a with p probabilty and b with (1-p) probabilty
bull What should p be to be fair to both (equal utility)
bull (1-p)(0) + p0 = utility for agent 1bull (1-p)(2) + p2 = utility for agent 2bull 0=2 no solutionbull Can you see why we canrsquot use a p to
make this fair
75
Mixed deal
bull All or nothing deal (one does everything) such that ndash mixed deal m = [(TATB )p] NS (m) = maxNS(d)
bull Mixed deal makes the solution space of deals continuous rather than discrete as it was before
76
bull A symmetric mechanism is in equilibrium if no one is motivated to change strategies We choose to use one which maximizes the product of utilities (as is a fairer division) Try dividing a total utility of 10 (zero sum) various ways to see when product is maximized
bull We may flip between choices even if both are the same just to avoid possible bias ndash like switching goals in soccer
77
Examples CooperativeEach is helped by joint plan
bull Slotted blocks world initially white block is at 1 and black block at 2 Agent 1 wants black in 1 Agent 2 wants white in 2 (Both goals are compatible)
bull Assume pick up is cost 1 and set down is onebull Mutually beneficial ndash each can pick up at the
same time costing each 2 ndash Win ndash as didnrsquot have to move other block out of the way
bull If done by one cost would be four ndash so utility to each is 2
78
Examples CompromiseBoth can succeed but worse for both
than if other agent werenrsquot therebull Slotted blocks world initially white block is at 1 and black block
at 2 two gray blocks at 3 Agent 1 wants black in 1 but not on table Agent 2 wants white in 2 but not directly on table
bull Alone agent 1 could just pick up black and place on white Similarly for agent 2 But would undo others goal
bull But together all blocks must be picked up and put down Best plan one agent picks up black while other agent rearranges (cost 6 for one 2 for other)
bull Can both be happy but unequal roles
79
Choices
bull Maybe each goal doesnrsquot need to be achieved Cost for one is two Cost for both averages four
bull If both value it the same flip a coin to decide who does most of the work p=12
bull What if we donrsquot value the goal the same way Canrsquot really look at utility in same way as the other personrsquos goals changes the original plan
80
Compromise continuedbull Who should get to do the easier role bull If you value it more shouldnrsquot you do more of the work to achieve a
common goal What does this mean if partnerroommate doesnrsquot value a clean house or a good meal
bull Look at worth If A1 assigns worth (utility) of 3 and A2 assigns worth (utility) of 6 to final goal we could use probability to make it ldquofairrdquo
bull Assign (26) p of the timebull Utilty for agent 1= p(1) + (1-p)(-3) loses utilty if takes 6 for benefit 3bull Utility for agent 2 = p(0) + (1-p)4bull Solving for p by setting utitlies equalbull 4p-3 = 4-4pbull p = 78bull Thus I can take an unfair division and make it fair
81
Example conflictbull I want black on white (in slot 1)bull You want white on black (in slot 1)bull Canrsquot both win Could flip a coin to decide who
wins Better than both losing Weightings on coin neednrsquot be 50-50
bull May make sense to have person with highest worth get his way ndash as utility is greater (Would accomplish his goal alone) Efficient but not fair
bull What if we could transfer half of the gained utility to the other agent This is not normally allowed but could work out well
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent’s goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as it helped with the swap)
• So with probability 1/2 each: 12(1/2) + (-4)(1/2) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• “Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent” (Rosenschein & Zlotkin, 1994)
• An agent’s goal is to bring about the state of the environment with the highest value.
• We assume the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not “all or nothing”, but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like the airport shuttle) – worth isn’t just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality, and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can’t get it cheaper without compromising on quality.
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price(60%) + quality(15%) + support(25%)
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price [5,100], quality [0-10], support [1-5]
– Try to find the Pareto optimum
89
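The weighted-attribute scheme in the first bullet is a linear additive utility. A minimal sketch, assuming attribute scores normalized to [0, 1] (the offer’s scores are made up for illustration; the 60/15/25 weights are the ones above):

```python
def weighted_utility(scores, weights):
    """Linear additive utility: sum of weight * normalized score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights are percentages
    return sum(weights[attr] * scores[attr] for attr in weights)

# Hypothetical offer: fairly cheap, high quality, decent support.
offer = {"price": 0.4, "quality": 0.9, "support": 0.6}
weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
print(weighted_utility(offer, weights))  # 0.525
```

Changing the weights changes which offers an agent prefers, which is exactly the lever persuasive argumentation (later slides) tries to move.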
Incomplete Information
• Don’t know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for a lie
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – the union adds a sub-cost:
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
91
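The condition c(X ∪ Y) ≤ c(X) + c(Y) can be checked exhaustively for a small task set. A sketch with a tree-shaped delivery cost table (the numbers are assumed, not from the slides):

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_subadditive(tasks, cost):
    """cost maps frozensets of tasks to a non-negative cost."""
    subsets = all_subsets(tasks)
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in subsets for y in subsets)

# Tree-shaped delivery: a and b share a trunk, so doing both is cheaper
# than doing each separately (assumed numbers).
cost = {frozenset(): 0, frozenset("a"): 2, frozenset("b"): 2,
        frozenset("ab"): 3}
print(is_subadditive("ab", cost))  # True
```

Raising c({a, b}) above 4 (the threshold example in the last bullet) makes the check fail.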
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won’t bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won’t discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
– c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
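Concavity can likewise be checked exhaustively: for every X ⊆ Y and every Z, the cost Z adds to Y must not exceed the cost it adds to X. A sketch with an assumed tree-shaped cost table:

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_concave(tasks, cost):
    """Check c(Y u Z) - c(Y) <= c(X u Z) - c(X) for all X subset of Y."""
    subsets = all_subsets(tasks)
    return all(cost[y | z] - cost[y] <= cost[x | z] - cost[x]
               for x in subsets for y in subsets if x <= y
               for z in subsets)

# Tree-shaped delivery costs (assumed numbers): shared trunk for a and b.
cost = {frozenset(): 0, frozenset("a"): 2, frozenset("b"): 2,
        frozenset("ab"): 3}
print(is_concave("ab", cost))  # True
```

As the later loop example shows, the general postmen domain fails this check; tree-shaped cost tables like the one above pass it.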
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the arrow’s beginning is a fixed point.
• For example: what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don’t have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
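The modular equality can be tested the same way. A sketch using fax-style independent per-task costs (the numbers are assumed); a set’s cost is just the sum over its members, which satisfies the equality exactly:

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_modular(tasks, cost):
    """Check c(X u Y) == c(X) + c(Y) - c(X n Y) for all X, Y."""
    subsets = all_subsets(tasks)
    return all(cost[x | y] == cost[x] + cost[y] - cost[x & y]
               for x in subsets for y in subsets)

# Fax-domain style: each connection has an independent cost (assumed numbers).
per_task = {"a": 3, "b": 1}
cost = {s: sum(per_task[t] for t in s) for s in all_subsets("ab")}
print(is_modular("ab", cost))  # True
```

A cost table with any shared saving (e.g. c({a, b}) < c({a}) + c({b}) with an empty intersection) fails the equality, which is why only the Fax Domain among the later examples is modular.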
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as the subset is smaller).
• At your seats – is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization: relationships implied between cells, and implied relationships with the same attribute in another domain
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, “hiding” lies are not beneficial.
• Ex: A1 hides its letter to c; its utility doesn’t increase.
• If it tells the truth, p = 1/2; expected utility of ⟨(a b c), 1/2⟩ = 1.5
• Under the lie, p = 1/2 (as the apparent utility is the same); expected utility (for A1) of ⟨(a b c), 1/2⟩ = 1/2(0) + 1/2(2) = 1 (as A1 still has to deliver the hidden letter)
103
• FP2: in a Subadditive TOD, for any ONM over Mixed deals, every “phantom” lie has a positive probability of being discovered (as, if the other agent delivers the phantom, you are found out)
• FP3: in a Concave TOD, for any ONM over Mixed deals, no “decoy” lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over Pure deals, no “decoy” lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I’m assuming we have some way of deciding who gets the better deal that is fair over time.

A1 gets | U(1) | A2 gets | U(2) seems | U(2) actual
a       | 2    | bc      | 4          | 4
b       | 4    | ac      | 2          | 2
bc      | 2    | a       | 4          | 2
ab      | 0    | c       | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over Pure deals, “phantom” lies can be beneficial.
• Example (from next slide): A1 creates a phantom letter at node c; its utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of ⟨(a)(b), 1/2⟩ = 1/2(4) + 1/2(2) = 3
• Lie: (bc)(a) is the logical division, as there is no percentage; utility for agent 1 = 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, “decoy” lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h too).
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); its utility rises from 1.5 to 1.72 (if I deliver, I don’t actually deliver h).
• If A1 tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If A1 invents task h, p = 11/18, as
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped.
107
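The two probabilities in the FP6 example come from equating each side’s apparent expected utility for the two all-or-nothing outcomes, then plugging A1’s real payoffs back in. A sketch reproducing the arithmetic (the payoffs are the slide’s numbers):

```python
from fractions import Fraction

def indifference_p(a1_win, a1_lose, a2_win, a2_lose):
    """p with p*a1_win + (1-p)*a1_lose == p*a2_win + (1-p)*a2_lose."""
    return Fraction(a2_lose - a1_lose, (a1_win - a1_lose) - (a2_win - a2_lose))

p_truth = indifference_p(-1, 6, 4, -3)   # 9/14
p_lie = indifference_p(-3, 6, 4, -5)     # 11/18, using the decoy's fake costs

def a1_true_utility(p):
    return p * -1 + (1 - p) * 6          # A1's real payoffs either way

print(a1_true_utility(p_truth))  # 3/2
print(a1_true_utility(p_lie))    # 31/18 > 3/2, so the decoy lie helped
```

The lie shifts the lottery in A1’s favor (p drops from 9/14 to 11/18), while A1’s real payoffs are unchanged, which is exactly why the decoy is beneficial.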
Postmen – return to post office
[Figures: example delivery graphs – the concave case, the subadditive case (h is the decoy), and the phantom case]
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over Pure deals, a “hide” lie can be beneficial (as you think I have fewer tasks, so an increased load will cost me more than it really does).
• Ex 3 (from next slide): A1 hides its letter to node b.
• (e)(b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie).
• (b)(e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don’t go to e).
109
• FP8: in a Modular TOD, for any ONM over Mixed deals, “hide” lies can be beneficial.
• Ex 4: A1 hides its letter to node a.
• A1’s utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util(⟨(f a e)(b c d), 1/2⟩) = 4 (each saves going to two nodes).
• Under the lie, dividing as ⟨(e f d)(c a b), p⟩: one side always wins and the other always loses. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie: ⟨(a b)(c d e f), p⟩
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6, so p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned c, d, e, f (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents’ beliefs
– Influencing other agents’ intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
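The retrieve/select steps above are essentially nearest-neighbor search over stored cases. A minimal sketch, assuming cases are described by feature sets and compared with Jaccard similarity (both choices are illustrative, not PERSUADER’s actual representation):

```python
def jaccard(a, b):
    """Similarity of two feature sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def retrieve(case_base, problem):
    """Return the precedent whose problem features best match the new one."""
    return max(case_base, key=lambda case: jaccard(case["features"], problem))

# Hypothetical labor-relations precedents.
case_base = [
    {"features": {"wages", "night-shift"}, "solution": "3% raise"},
    {"features": {"wages", "pension"}, "solution": "2% raise + pension"},
]
best = retrieve(case_base, {"wages", "pension", "overtime"})
print(best["solution"])  # 2% raise + pension
```

The later construct/evaluate/modify steps would then adapt the retrieved solution rather than plan from scratch.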
Case-Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent’s beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600hrs, she also assigns progressively smaller worths to a meeting at 1500hrs, 1400hrs, …
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Chart: worth function for the first agent – worth rises from 0 to 100 over the hours 9, 12, 16]
Ref: Rosenschein & Zlotkin, 1994
122
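The first agent’s worth function can be sketched as a simple ramp over meeting times; the linear shape is an assumption read off the chart (worth 0 at 0900hrs rising to 100 at 1600hrs):

```python
def worth_agent1(hour):
    """Assumed linear worth: 0 at hour 9 rising to 100 at hour 16."""
    clamped = max(9.0, min(16.0, hour))
    return (clamped - 9.0) / 7.0 * 100.0

print(worth_agent1(16))  # 100.0: the agent's ideal time
print(worth_agent1(9))   # 0.0
print(worth_agent1(15))  # ~85.7: a slightly earlier slot costs a little worth
```

Conceding from 1600 to 1500 costs this agent about 14 points of worth, which it can trade against side payoffs such as reduced travel costs.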
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility vs. number of negotiation rounds – Agent i’s and Agent j’s curves approach over time and meet at the point of acceptance]
123
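The convergence picture can be simulated with a toy concession loop: both agents start at their ideal offers and concede a fixed step each round until each values the opponent’s offer at least as much as its own. All numbers here are illustrative:

```python
def concession_rounds(u_i, u_j, step=1.0, ideal_i=10.0, ideal_j=0.0):
    """Rounds until the agents' offers cross (toy concession protocol)."""
    offer_i, offer_j, rounds = ideal_i, ideal_j, 0
    while u_i(offer_j) < u_i(offer_i) and u_j(offer_i) < u_j(offer_j):
        offer_i -= step   # agent i concedes toward j
        offer_j += step   # agent j concedes toward i
        rounds += 1
    return rounds

# Opposed linear utilities over a shared 0..10 issue scale (assumed).
print(concession_rounds(lambda x: x, lambda x: 10 - x))  # 5: meet in the middle
```

If one agent’s loop never concedes (step 0 on its side), the offers never cross, which is the no-agreement picture on the next slide.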
Utility Graphs – no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds – Agent i’s and Agent j’s curves never cross]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – “If you accept A and accept A implies B, then you must accept that B”
2. Emotional – “How would you feel if it happened to you?”
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: doesn’t this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database, and
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn’t good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
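Undercutting and rebutting can be told apart mechanically once arguments carry explicit grounds and a conclusion. A sketch using a toy string encoding ("~" for negation; the propositions are the cheese example):

```python
def neg(p):
    """Toy negation on string propositions."""
    return p[1:] if p.startswith("~") else "~" + p

def rebuts(a, b):
    """a's conclusion contradicts b's conclusion."""
    return a["conclusion"] == neg(b["conclusion"])

def undercuts(a, b):
    """a's conclusion contradicts one of b's grounds (premises)."""
    return any(a["conclusion"] == neg(g) for g in b["grounds"])

cheese = {"grounds": ["milk is good", "cheese is made from milk"],
          "conclusion": "cheese is good"}
fatty = {"grounds": ["milk is fatty"], "conclusion": "~milk is good"}
bones = {"grounds": ["cheese harms bones"], "conclusion": "~cheese is good"}

print(undercuts(fatty, cheese), rebuts(fatty, cheese))  # True False
print(undercuts(bones, cheese), rebuts(bones, cheese))  # False True
```

The derived notions on the next slide (defeats, strongly attacks, …) are just set-algebra combinations of these two relations.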
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B: A u B or A r B
– A defeats B: A u B, or (A r B and not B u A)
– A strongly attacks B: A a B and not B u A
– A strongly undercuts B: A u B and not B u A
129
Proposition: hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
– “argument x attacks argument y”
– “x is a counterexample of y”
– “x is an attacker of y”
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation “→” saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all of its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments – mutually defensible
1. Argument x is attacked (with respect to a set) if some attacker y of x (y → x) is not attacked by any member of the set
2. Argument x is acceptable if every attacker of x is attacked
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
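Admissibility can be computed by brute force on small attack graphs. A sketch; since the figure’s arrows are not recoverable here, the attack relation below is an assumed example in which c is always attacked and d is always defensible:

```python
from itertools import combinations

# Assumed attack graph: a and b attack each other, both attack c, c attacks d.
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}
args = {"a", "b", "c", "d"}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is counter-attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

all_sets = [frozenset(c) for r in range(len(args) + 1)
            for c in combinations(sorted(args), r)]
print(sorted(sorted(s) for s in all_sets if admissible(s)))
# c appears in no admissible set; d is admissible once a or b defends it
```

Note that {d} alone is not admissible (nothing in it counter-attacks c), which is the “mutually defensible” point: d needs a defender such as a or b in the set.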
An Example Abstract Argument System
Examples CompromiseBoth can succeed but worse for both
than if other agent werenrsquot therebull Slotted blocks world initially white block is at 1 and black block
at 2 two gray blocks at 3 Agent 1 wants black in 1 but not on table Agent 2 wants white in 2 but not directly on table
bull Alone agent 1 could just pick up black and place on white Similarly for agent 2 But would undo others goal
bull But together all blocks must be picked up and put down Best plan one agent picks up black while other agent rearranges (cost 6 for one 2 for other)
bull Can both be happy but unequal roles
79
Choices
bull Maybe each goal doesnrsquot need to be achieved Cost for one is two Cost for both averages four
bull If both value it the same flip a coin to decide who does most of the work p=12
bull What if we donrsquot value the goal the same way Canrsquot really look at utility in same way as the other personrsquos goals changes the original plan
80
Compromise continuedbull Who should get to do the easier role bull If you value it more shouldnrsquot you do more of the work to achieve a
common goal What does this mean if partnerroommate doesnrsquot value a clean house or a good meal
bull Look at worth If A1 assigns worth (utility) of 3 and A2 assigns worth (utility) of 6 to final goal we could use probability to make it ldquofairrdquo
bull Assign (26) p of the timebull Utilty for agent 1= p(1) + (1-p)(-3) loses utilty if takes 6 for benefit 3bull Utility for agent 2 = p(0) + (1-p)4bull Solving for p by setting utitlies equalbull 4p-3 = 4-4pbull p = 78bull Thus I can take an unfair division and make it fair
81
Example conflictbull I want black on white (in slot 1)bull You want white on black (in slot 1)bull Canrsquot both win Could flip a coin to decide who
wins Better than both losing Weightings on coin neednrsquot be 50-50
bull May make sense to have person with highest worth get his way ndash as utility is greater (Would accomplish his goal alone) Efficient but not fair
bull What if we could transfer half of the gained utility to the other agent This is not normally allowed but could work out well
82
Examplesemi-cooperative
bull Both agents want contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
bull Both have (possibly) conflicting goals for other slots
bull To accomplish one Agentrsquos goal by oneself is 26 8 for each swap and 10 for rest (pulling numbers out of the air)
bull Cooperative swap is 4 (pulling numbers out of air)
bull Idea work together to swap and then flip coin to see who gets his way for rest
83
Example semi-cooperative cont
bull Winning agent utility 26-4-10 = 12bull Losing agent utility -4 (as helped with swap)bull So with frac12 probability 1212 -412 = 4bull If they could have both been satisfied assume
cost for each is 24 Then utility is 2bull Note they double their utility if they are willing
to risk not achieving the goalbull Note kept just the joint part of the plan that was
more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
bull rdquoDomains where agents assign a worth to each
potential state (of the environment) which captures
its desirability for the agentrdquo (Rosenschein amp Zlotkin 1994)
bull agentrsquos goal is to bring about the state of the environment with
highest value
bull we assume that the collection of agents have available a set of
joint plans ndash a joint plan is executed by several different agents
bull Note ndash not rdquoall or nothingrdquo ndash but how close you got to goal
85
Worth-oriented Domain Definition
bull Can be defined as a tuple
EAgJc
bull E set of possible envirinment states
bull Ag set of possible agents
bull J set of possible joint plans
bull C cost of executing the plan
86
Worth Oriented Domain
bull Rates the acceptability of final statesbull Allows partially completed goalsbull Negotiation a joint plan schedules and goal relaxation May
reach a state that might be a little worse that the ultimate objective
bull Example ndash Multi-agent Tile world (like airport shuttle) ndash isnrsquot just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
bull If you want to pay for some software then you might consider
several attributes of the software such as the price quality and
support ndash multiple set of attributes
bull You may be willing to pay more if the quality is above a given limit
ie you canrsquot get it cheaper without compromising on quality
Pareto Optimal ndash Need to find the price for acceptable quality and
support (without compromising on some attributes)
88
How can we calculate Utility
bull Weighting each attribute
ndash Utility = Price60 + quality15 + support25
bull Ratingranking each attribute
ndash Price 1 quality 2 support 3
bull Using constraints on an attribute
ndash Price[5100] quality[0-10] support[1-5]
ndash Try to find the pareto optimum
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task-Oriented Domain
• the cost of the union is at most the sum of the costs of the separate sets – the union adds up to a "sub" cost
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – delivering to one saves distance to the other (in a tree arrangement)
• Example of a subadditive TOD (= rather than <):
  – deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment
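The subadditivity condition can be checked mechanically by enumerating all pairs of task subsets. A small sketch; the cost tables are invented, not from the slides:

```python
from itertools import combinations

def is_subadditive(cost, tasks):
    """Check c(X ∪ Y) <= c(X) + c(Y) for every pair of subsets of `tasks`."""
    subsets = [frozenset(s) for r in range(len(tasks) + 1)
               for s in combinations(tasks, r)]
    return all(cost[x | y] <= cost[x] + cost[y]
               for x in subsets for y in subsets)

# Toy delivery costs: serving a and b together shares part of the route.
c = {frozenset(): 0, frozenset("a"): 1,
     frozenset("b"): 1, frozenset("ab"): 1.5}
print(is_subadditive(c, "ab"))   # True

c[frozenset("ab")] = 3           # now doing both costs more than the sum
print(is_subadditive(c, "ab"))   # False
```

The second table models the power-threshold example above: the union costs more than the parts, so the domain is not subadditive.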
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
  • Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  • Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task-Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
  – c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know this fact (diagonal arrows are between domains). A rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost tasks Z add to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• You might expect it to add more to the subset (as it is smaller)
• At seats – is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
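The chain modular ⇒ concave ⇒ subadditive can be spot-checked by brute force. A sketch using an invented per-task cost table; independent costs make the domain modular, as in the fax domain:

```python
from itertools import combinations

def powerset(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_concave(cost, tasks):
    """c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X) for all X ⊆ Y and all Z."""
    sets = powerset(tasks)
    return all(cost[y | z] - cost[y] <= cost[x | z] - cost[x]
               for x in sets for y in sets if x <= y for z in sets)

def is_subadditive(cost, tasks):
    sets = powerset(tasks)
    return all(cost[x | y] <= cost[x] + cost[y] for x in sets for y in sets)

per_task = {"a": 1, "b": 2, "c": 3}   # independent costs => modular
cost = {s: sum(per_task[t] for t in s) for s in powerset("abc")}
print(is_concave(cost, "abc"), is_subadditive(cost, "abc"))  # True True
```

A modular cost function passes both checks, which is consistent with the implications stated above.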
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• Agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization: relationships implied between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides his letter to c – his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util: (abc) at 1/2 = 5
• Lie: p = 1/2 (as the apparent utility is the same)
• Expected util (for 1): (abc) at 1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the lie)
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out)
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 gets  U(1)   Agent 2 gets  U(2) (seems)  U(2) (actual)
a             2      bc            4             4
b             4      ac            2             2
bc            2      a             4             2
ab            0      c             6             6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so the utility for agent 1 is (ab) at ½ = ½(4) + ½(2) = 3
• Lie: (bca) is the logical division, as there is no probability split. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, under any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3), i.e., 14p = 9
• If he invents task h, p = 11/18, as
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility at p = 9/14 is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility at p = 11/18 is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
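The FP6 arithmetic can be verified with exact fractions. A sketch (the helper name is mine): p is chosen so the two agents' linear expected utilities are equal, and agent 1's real utility is evaluated at each p.

```python
from fractions import Fraction as F

# Solve p*a1_hi + (1-p)*a1_lo == p*a2_hi + (1-p)*a2_lo for p,
# where *_hi is an agent's utility on the p-branch and *_lo on the other.
def fair_p(a1_hi, a1_lo, a2_hi, a2_lo):
    return F(a2_lo - a1_lo, (a1_hi - a1_lo) - (a2_hi - a2_lo))

# Truth: agent 1 gets p(-1) + (1-p)(6); agent 2 gets p(4) + (1-p)(-3)
p_truth = fair_p(-1, 6, 4, -3)                 # 9/14
u_truth = p_truth * (-1) + (1 - p_truth) * 6   # 21/14 = 1.5

# Decoy lie: apparent utilities p(-3) + (1-p)(6) vs p(4) + (1-p)(-5),
# but agent 1's real utility is still p(-1) + (1-p)(6)
p_lie = fair_p(-3, 6, 4, -5)                   # 11/18
u_lie = p_lie * (-1) + (1 - p_lie) * 6         # 31/18, about 1.72

print(p_truth, u_truth, p_lie, u_lie)
```

The lie shifts p from 9/14 to 11/18, raising agent 1's real expected utility from 3/2 to 31/18, matching the slide's 1.5 → 1.72.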
107
Postmen – return to the post office
(Figure: example graphs labeled Concave, Subadditive (h is a decoy), and Phantom.)
108
Non incentive compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (you think I have less, so the increased load will cost more than it really does)
• Ex3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR under the lie
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: util of (faebcd) at ½ = 4 (each saves going to two nodes)
• Under the lie, divide as (efdcab) at p (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie: (abcdef) at p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef, 1/4 of the time I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going 5 places (which is what I was assigned originally) – zero utility for that
110
(Figure: modular domain example for the hide-lie fixed points.)
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposal
  – Generation of counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – The repair for a past failure is reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30 more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she assigns progressively smaller worths to a meeting at 15:00, 14:00, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
Worth function for the first agent:
(Figure: worth declines from 100 at 16:00 toward 0 at 9:00; axis ticks at 9, 12, 16.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Figure: utility of Agenti and Agentj versus the number of negotiation rounds, meeting at the point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: Agentj finds the offer unacceptable
(Figure: utility of Agenti and Agentj versus the number of negotiation rounds, with no point of acceptance.)
124
Argumentation
bull The process of attempting to convince others of
something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    • Grounds ⊆ Database
    • Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B = A u B or A r B
  – A defeats B = A u B, or (A r B and not B u A)
  – A strongly attacks B = A a B and not B u A
  – A strongly undercuts B = A u B and not B u A
129
Proposition: hierarchy of attacks (as relations; R⁻¹ is the inverse of R)
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
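Treating u and r as sets of ordered pairs, the whole hierarchy is a few lines of set algebra. A sketch on invented toy relations (not from the slides):

```python
# u = undercuts, r = rebuts, as sets of ordered (attacker, target) pairs.
u = {("A", "B")}                     # A undercuts B
r = {("B", "A"), ("B", "C")}         # B rebuts A and C

u_inv = {(y, x) for (x, y) in u}     # u with every pair reversed

attacks            = u | r
defeats            = u | (r - u_inv)
strongly_attacks   = (u | r) - u_inv
strongly_undercuts = u - u_inv

# The containments su ⊆ sa ⊆ d ⊆ a hold for these definitions.
print(strongly_undercuts <= strongly_attacks <= defeats <= attacks)  # True
```

For example, (B, A) is in `attacks` but not in `strongly_attacks`, because A undercuts B back.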
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set if some member y of the set attacks x (y → x)
2. argument x is acceptable with respect to a set if every attacker of x is attacked by the set
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
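These definitions translate directly into code. A sketch over an invented attack relation on four arguments (not the figure from the slides):

```python
from itertools import combinations

arguments = {"a", "b", "c", "d"}
attack = {("a", "b"), ("b", "c"), ("d", "c")}   # (attacker, target) pairs

def conflict_free(S):
    return not any((x, y) in attack for x in S for y in S)

def acceptable(x, S):
    """Every attacker of x is itself attacked by some member of S."""
    attackers = {y for (y, t) in attack if t == x}
    return all(any((z, y) in attack for z in S) for y in attackers)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

sets = [set(s) for r in range(len(arguments) + 1)
        for s in combinations(sorted(arguments), r)]
print([sorted(S) for S in sets if admissible(S)])
# [[], ['a'], ['d'], ['a', 'd']]
```

Here {b} is not admissible (its attacker a goes unanswered), while {a, d} is: it is conflict-free and neither member has an attacker.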
132
(Figure: arguments a, b, c, d and the attacks among them.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
66
• "Joint plan" is used to mean "what they both do", not "what they do together" – just the joining of plans; there is no joint goal
• The actions taken by agent k in the joint plan are called k's role, written J_k
• c(J)_k is the cost of k's role in joint plan J
• In a TOD, you cannot do another's task as a side effect of doing yours, or get in their way
• In a TOD, coordinated plans are never worse, as you can just do your original task
• With an SOD, you may get in each other's way
• Don't accept partially completed plans
The state-oriented domain is a bit more powerful than the TOD
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform the tasks, and the cost is the same regardless of which agent performs them)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with
• Looking for a state in the intersection of the goals
• Possibilities:
  – Both can be achieved, at a gain to both (e.g., travel to the same location and split the cost)
  – Goals may contradict, so there is no mutually acceptable state (e.g., both need a car)
  – Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
  – There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e., we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1 has a cost of 6 for one agent and 2 for the other
• State 2 costs both agents 5
• State 1 is cheaper (overall), but state 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent gets a certain plan
• This is called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
  cost_i(δ) = p·c(J)_i + (1-p)·c(J)_k, where k is i's opponent – the role i plays with probability (1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
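The cost and utility formulas can be sketched directly, using exact fractions. The numbers plugged in are the parcel-delivery deal worked out on the following slides (agent 1's role is ∅, agent 2's role is ab, p = 5/6); the function names are mine:

```python
from fractions import Fraction as F

def expected_cost(role_costs, i, p):
    """cost_i((J, p)) = p*c(J)_i + (1-p)*c(J)_k, with k the opponent."""
    k = 1 - i                       # the opponent in a two-agent deal
    return p * role_costs[i] + (1 - p) * role_costs[k]

def utility(standalone, role_costs, i, p):
    """Standalone cost minus expected cost under the deal."""
    return standalone[i] - expected_cost(role_costs, i, p)

standalone = [F(1), F(3)]   # alone: agent 1 delivers a (1), agent 2 a and b (3)
roles      = [F(0), F(3)]   # deal: role 0 does nothing, role 1 delivers a and b

p = F(5, 6)
print(utility(standalone, roles, 0, p), utility(standalone, roles, 1, p))
# 1/2 1/2  (at p = 5/6 the deal gives both agents equal utility)
```

At any other p the utilities diverge, which is why the slides solve for p explicitly.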
72
Parcel Delivery Domain (assuming do not have to return home)
(Figure: distribution point at distance 1 from city a and from city b; the figure marks a distance 2 between a and b.)
Cost function: c(∅) = 0, c(a) = 1, c(b) = 1, c(ab) = 3
Utility for agent 1 (originally delivering to a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
…
Utility for agent 2 (originally delivering to a and b):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
…
73
Consider deal 3 with probability
• [(∅, ab); p] means agent 1 does ∅ with probability p and ab with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p, so p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
74
Try again with other choice in negotiation set
• [(a, b); p] means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• An all-or-nothing deal (one agent does everything), i.e., a mixed deal m = [(T_A ∪ T_B, ∅); p], such that NS(m) = max over deals d of NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one which maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero-sum) in various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2 (both goals are compatible)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but with unequal roles
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise, continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
• Assign the (2, 6) split p of the time
• Utility for agent 1 = p(1) + (1-p)(-3) – he loses utility if he takes the cost-6 role for a benefit of 3
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, pulled out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as he helped with the swap)
• So with ½ probability each: ½(12) + ½(-4) = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
bull rdquoDomains where agents assign a worth to each
potential state (of the environment) which captures
its desirability for the agentrdquo (Rosenschein amp Zlotkin 1994)
bull agentrsquos goal is to bring about the state of the environment with
highest value
bull we assume that the collection of agents have available a set of
joint plans ndash a joint plan is executed by several different agents
bull Note ndash not rdquoall or nothingrdquo ndash but how close you got to goal
85
Worth-oriented Domain Definition
bull Can be defined as a tuple
EAgJc
bull E set of possible envirinment states
bull Ag set of possible agents
bull J set of possible joint plans
bull C cost of executing the plan
86
Worth Oriented Domain
bull Rates the acceptability of final statesbull Allows partially completed goalsbull Negotiation a joint plan schedules and goal relaxation May
reach a state that might be a little worse that the ultimate objective
bull Example ndash Multi-agent Tile world (like airport shuttle) ndash isnrsquot just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
bull If you want to pay for some software then you might consider
several attributes of the software such as the price quality and
support ndash multiple set of attributes
bull You may be willing to pay more if the quality is above a given limit
ie you canrsquot get it cheaper without compromising on quality
Pareto Optimal ndash Need to find the price for acceptable quality and
support (without compromising on some attributes)
88
How can we calculate Utility
bull Weighting each attribute
ndash Utility = Price60 + quality15 + support25
bull Ratingranking each attribute
ndash Price 1 quality 2 support 3
bull Using constraints on an attribute
ndash Price[5100] quality[0-10] support[1-5]
ndash Try to find the pareto optimum
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task Oriented Domainbull the cost of the union of sum of the costs of the separate
sets ndash adds to a sub-costbull for finite XY in T c(X U Y) lt= c(X) + c(Y))bull Example of subadditive
ndash Deliver to one saves distance to other (in a tree arrangement)
bull Example of subadditive TOD (= rather than lt)ndash deliver in opposite directions ndashdoing both saves nothing
bull Not subadditive doing both actually costs more than the sum of the pieces Say electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
bull We call producible phantom tasks decoy tasks (no risk of being discovered) Only unproducible phantom tasks are called phantom tasks
bull Example bull Need to pick something up at store (Can think
of something for them to pick up but if you are the one assigned you wonrsquot bother to make the trip)
bull Need to deliver empty letter (no good but deliverer wonrsquot discover lie)
92
Incentive compatible Mechanism
bull L there exists a beneficial lie in some encounterbull T There exists no beneficial liebull TP Truth is dominant if the penalty for lying is stiff
enough
93
Explanation of arrow
bull If it is never beneficial in a mixed deal encounter to use a phntom lie (with penalties) then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domainbull We have 2 tasks X and Y where X is a subset of Ybull Another set of task Z is introduced
ndash c(X U Z) - c(X) gt= c(Y U Z) - c(Y)
95
Tentative Explanation of Previous Chart
bull I think Arrows show reasons we know this fact (diagonal arrows are between domains) Rule beginning is a fixed point
bull For example What is true of a phantom task may be true for a decoy task in same domain as a phantom is just a decoy task we donrsquot have to create
bull Similarly what is true for a mixed deal may be true for an all or nothing deal (in the same domain) as a mixed deal is an all or nothing deal where one choice is empty The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of Characterization of Relationship: implied relationship between cells; implied relationship with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util: ({a,b,c}, 1/2) = 5
• Lie: p = 1/2 (as utility is the same)
• Expected util (for 1): ({a,b,c}, 1/2) = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie)
(figure: delivery graph with edge labels 1, 4, 4, 1)
103
• FP2: in Subadditive TOD, any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (as if the other person delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below
(the apparent benefit is no different than the real benefit).
Under truth: the utilities are 4/2, and someone has to get the better deal (under a pure deal), JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

1's tasks  U(1)   2's tasks  U(2) seems  U(2) (act)
a          2      bc         4           4
b          4      ac         2           2
bc         2      a          4           2
ab         0      c          6           6
105
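The utilities in the table are just standalone cost minus assigned-deal cost. A sketch with an assumed cost table (stand-in values, not the slide's actual graph):

```python
# Hypothetical costs for task subsets (stand-ins, not the slide's graph)
cost = {frozenset(): 0,
        frozenset("a"): 1, frozenset("b"): 1, frozenset("c"): 1,
        frozenset("ab"): 2, frozenset("ac"): 2, frozenset("bc"): 2,
        frozenset("abc"): 3}

def utility(own_tasks, assigned):
    """Pure-deal utility: cost of doing your own tasks alone
    minus the cost of the tasks the deal assigns you."""
    return cost[frozenset(own_tasks)] - cost[frozenset(assigned)]

# Agent 1 originally owes {a, b}; a deal assigns it only {a}
print(utility("ab", "a"))   # 2 - 1 = 1
print(utility("ab", "ab"))  # 0: the deal changes nothing
```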
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals, "Phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility has risen from 3 to 4
• Truth: p = 1/2, so utility for agent 1 is (a, b), 1/2: 1/2(4) + 1/2(2) = 3
• Lie: (b, ca) is the logical division, as there is no percentage
• Util for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals, "Decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b,c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(-1) + (1-p)6 = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as
• p(-3) + (1-p)6 = p(4) + (1-p)(-5)
• Utility(p=9/14) is p(-1) + (1-p)6 = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p=11/18) is p(-1) + (1-p)6 = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
107
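The FP6 arithmetic can be checked exactly with the slide's payoff numbers; the helper `equalizing_p` is my own name for the solve-for-p step:

```python
from fractions import Fraction

def equalizing_p(u1_d, u1_n, u2_d, u2_n):
    """p that equalizes expected utilities when agent 1 delivers with
    probability p:  p*u1_d + (1-p)*u1_n = p*u2_d + (1-p)*u2_n."""
    # rearranged: p*(u1_d - u1_n - u2_d + u2_n) = u2_n - u1_n
    return Fraction(u2_n - u1_n, u1_d - u1_n - u2_d + u2_n)

# Truth: agent 1's payoffs (-1, 6); agent 2's (4, -3)
p_truth = equalizing_p(-1, 6, 4, -3)   # 9/14
# Decoy lie shifts the apparent payoffs to (-3, 6) and (4, -5)
p_lie = equalizing_p(-3, 6, 4, -5)     # 11/18
# Agent 1's *real* expected utility always uses the true payoffs (-1, 6)
real = lambda p: p * (-1) + (1 - p) * 6
print(p_truth, real(p_truth))  # 9/14 3/2
print(p_lie, real(p_lie))      # 11/18 31/18  -> lying helped
```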
Postmen – return to post office
(figure: three delivery graphs)
Concave
Subadditive (h is decoy)
Phantom
Phantom
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals, "Hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals, "Hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((fae),(bcd), 1/2) = 4 (save going to two)
• Under the lie, divide as (efdcab), p: you always win and I always lose. Since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie: (abcdef), p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• Utility is actually
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned c,d,e,f (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going 5 places (which is what I was assigned originally). Zero utility for that.
110
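The FP8 numbers check out the same way. One plausible reading of the slide's "(abcdef)p" deal is assumed below:

```python
from fractions import Fraction

# Assumed reading: agent 1 delivers a,b,c with probability p
# (apparent payoff 4, else 0); agent 2's apparent payoffs are 2 and 6.
# Equalize: p*4 + (1-p)*0 = p*2 + (1-p)*6  ->  8p = 6
p = Fraction(6, 8)  # 3/4
# Agent 1's real payoff on a win is 6, not 4: the hidden letter to a
# lies on the route it delivers anyway.
real_utility = p * 6 + (1 - p) * 0
truth_utility = 4
print(p, real_utility)  # 3/4 9/2, i.e. 4.5 > 4
```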
Modular
111
Conclusion
– In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
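The retrieve/select steps can be sketched as a nearest-neighbour lookup over stored cases. Everything here (the case contents, the `overlap` similarity) is an illustrative assumption, not PERSUADER's actual memory:

```python
def retrieve(cases, query, similarity):
    """Return the stored case most similar to the query
    (the retrieve/select steps of the CBR process).
    `cases` maps case-id -> dict of attributes."""
    return max(cases, key=lambda cid: similarity(cases[cid], query))

def overlap(case, query):
    # toy similarity: number of matching attribute values
    return sum(1 for k in case if query.get(k) == case[k])

# Hypothetical labor-negotiation precedents
cases = {
    "case1": {"industry": "steel",   "issue": "wages",    "outcome": "3% raise"},
    "case2": {"industry": "transit", "issue": "wages",    "outcome": "4% raise"},
    "case3": {"industry": "transit", "issue": "pensions", "outcome": "match"},
}
query = {"industry": "transit", "issue": "wages"}
best = retrieve(cases, query, overlap)
print(best)  # case2: matches on both attributes
```

The retrieved case's solution would then be adapted and evaluated, per the remaining process steps.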
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repair for a past failure is reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g. reduced travel costs).
(figure: worth function for the first agent – 0 near 0900, rising to 100 at 1600, ticks at 9, 12, 16)
Ref: Rosenschein & Zlotkin, 1994
122
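The sketched worth curve can be written as a simple function. The linear shape is an assumption; the slide's sketch only fixes the endpoints (0 near 0900, 100 at 1600):

```python
def worth(hour, peak=16, earliest=9, max_worth=100):
    """Hypothetical linear worth curve for the first agent:
    worth 100 at 16:00, falling toward 0 at 9:00."""
    if hour < earliest or hour > peak:
        return 0
    return max_worth * (hour - earliest) / (peak - earliest)

# The 1200 slot is sub-optimal but still carries positive worth
print(worth(16), worth(12), worth(9))
```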
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(figure: utility of Agent i and Agent j vs. number of negotiation rounds – the curves meet at the point of acceptance)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
(figure: utility of Agent i and Agent j vs. number of negotiation rounds – the curves never meet)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept that B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
– grounds ⊆ database, and
– the sentence can be proved from the grounds
(we give reasons for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is true unless proven false
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked by a set of arguments if some member y of the set attacks x (y → x)
2. Argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set
3. An argument set is conflict free if none attack each other
4. A set is admissible if it is conflict free and each argument is acceptable (any attackers are attacked)
132
(figure: attack graph over arguments a, b, c, d)
Which sets of arguments can be true? c is always attacked.
d is always acceptable.
133
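Admissibility can be computed by brute force over subsets. Since the slide's attack graph is only in the figure, the relation below is an assumed reconstruction (a and b attack each other, both attack c, and c attacks d):

```python
from itertools import chain, combinations

def admissible_sets(args, attacks):
    """All admissible sets of an abstract argument system:
    conflict-free, and every attacker of a member is itself attacked
    by some member.  `attacks` holds (x, y) pairs: x attacks y."""
    def conflict_free(s):
        return not any((x, y) in attacks for x in s for y in s)

    def acceptable(a, s):
        # every attacker x of a is attacked by some defender d in s
        return all(any((d, x) in attacks for d in s)
                   for x in args if (x, a) in attacks)

    subsets = chain.from_iterable(combinations(args, r)
                                  for r in range(len(args) + 1))
    return [set(s) for s in subsets
            if conflict_free(s) and all(acceptable(a, s) for a in s)]

# Assumed attack relation for the a, b, c, d example
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}
result = admissible_sets("abcd", attacks)
print(result)  # c never appears: it is always attacked
```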
An Example Abstract Argument System
67
Assumptions of SOD
1. Agents will maximize expected utility (will prefer a 51% chance of getting $100 to a sure $50)
2. An agent cannot commit himself (as part of the current negotiation) to behavior in a future negotiation
3. Interagent comparison of utility: common utility units
4. Symmetric abilities (all can perform tasks, and the cost is the same regardless of the agent performing)
5. Binding commitments
6. No explicit utility transfer (no "money" that can be used to compensate one agent for a disadvantageous agreement)
68
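Assumption 1 can be illustrated with a two-line expected-utility calculation:

```python
from fractions import Fraction

def expected_utility(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

lottery = [(Fraction(51, 100), 100), (Fraction(49, 100), 0)]
sure_thing = [(Fraction(1), 50)]
print(expected_utility(lottery), expected_utility(sure_thing))  # 51 50
```

An expected-utility maximizer prefers the lottery, since 51 > 50, despite the 49% chance of nothing.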
Achievement of Final State
• The goal of each agent is represented as a set of states that they would be happy with
• Looking for a state in the intersection of goals
• Possibilities:
– Both can be achieved, at a gain to both (e.g. travel to the same location and split the cost)
– Goals may contradict, so there is no mutually acceptable state (e.g. both need a car)
– Can find a common state, but perhaps it cannot be reached with the primitive operations in the domain (could both travel together, but may need to know how to pick up another)
– There might be a reachable state which satisfies both, but it may be too expensive – unwilling to expend the effort (i.e. we could save a bit if we car-pooled, but it is too complicated for so little gain)
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1: a cost of 6 for one agent and 2 for the other
• State 2: costs both agents 5
• State 1 is cheaper (overall) but state 2 is more equal. How can we get cooperation (as why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent would get a certain plan
• Called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both.
71
Cost
• If δ = (J, p) is a deal, then cost_i(δ) = p·c(J_i) + (1-p)·c(J_k), where k is i's opponent – the role i plays with (1-p) probability
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
(figure: distribution point with cost-1 roads to city a and city b, and a cost-2 road between them)
Cost function: c(∅)=0, c(a)=1, c(b)=1, c(ab)=3
Utility for agent 1 (orig. a):
1. Utility1(a, b) = 0
2. Utility1(b, a) = 0
3. Utility1(ab, ∅) = -2
4. Utility1(∅, ab) = 1
...
Utility for agent 2 (orig. ab):
1. Utility2(a, b) = 2
2. Utility2(b, a) = 2
3. Utility2(ab, ∅) = 3
4. Utility2(∅, ab) = 0
...
73
Consider deal 3 with probability
• (ab, ∅): p means agent 1 does ∅ with probability p and ab with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p => p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
74
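The p = 5/6 result can be verified with exact arithmetic:

```python
from fractions import Fraction

# Deal 3: with probability p agent 1 delivers nothing (utility 1,
# agent 2 then gets 0); with probability 1-p agent 1 delivers both
# (utility -2, agent 2 then gets 3).  Fairness = equal expected utility:
# p*1 + (1-p)*(-2) = p*0 + (1-p)*3  ->  3p - 2 = 3 - 3p  ->  6p = 5
p = Fraction(5, 6)
u1 = p * 1 + (1 - p) * (-2)
u2 = p * 0 + (1 - p) * 3
print(p, u1, u2)  # 5/6 1/2 1/2
```

Both agents end up with expected utility 1/2, which is why the lottery makes the unequal deal acceptable.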
Try again with the other choice in the negotiation set
• (a, b): p means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All or nothing deal (one does everything), such that for the mixed deal m = [(T_A ∪ T_B, ∅), p], NS(m) = max NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use the one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
77
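The "try dividing 10" exercise, done by brute force:

```python
def best_split(total):
    """Among integer zero-sum splits of `total`, pick the one that
    maximizes the product of utilities (the Nash bargaining choice)."""
    splits = [(u, total - u) for u in range(total + 1)]
    return max(splits, key=lambda s: s[0] * s[1])

print(best_split(10))  # (5, 5): the even split maximizes the product
```

Maximizing the product (rather than the sum, which is constant here) is what pushes the mechanism toward the equal division.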
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; Agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick up costs 1 and a set down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; Agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white. Similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but with unequal roles
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign (2, 6) p of the time
• Utility for agent 1 = p(1) + (1-p)(-3) (loses utility if it takes cost 6 for benefit 3)
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
81
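The p = 7/8 computation, checked exactly:

```python
from fractions import Fraction

# With probability p agent 1 takes the light role (payoff 1; agent 2
# gets 0), otherwise the heavy role (payoff -3; agent 2 gets 4).
# Equalize: p*1 + (1-p)*(-3) = p*0 + (1-p)*4  ->  4p - 3 = 4 - 4p
p = Fraction(7, 8)
u1 = p * 1 + (1 - p) * (-3)
u2 = p * 0 + (1 - p) * 4
print(p, u1, u2)  # 7/8 1/2 1/2
```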
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win. Could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the person with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as he helped with the swap)
• So with 1/2 probability each: 1/2(12) + 1/2(-4) = 4
• If they could have both been satisfied, assume the cost for each is 24. Then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple:
⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – Multi-agent Tile world (like an airport shuttle): it isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price: [5,100], quality: [0,10], support: [1,5]
– Try to find the Pareto optimum
89
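The weighted-attribute option can be sketched directly. The offer's attribute scores below are made-up illustrative numbers:

```python
def weighted_utility(offer, weights):
    """Linear additive multi-attribute utility (weights sum to 1)."""
    return sum(weights[attr] * score for attr, score in offer.items())

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}
# Attribute scores normalized to [0, 100]; values are hypothetical
offer = {"price": 80, "quality": 60, "support": 40}
print(weighted_utility(offer, weights))  # ~67
```

Comparing two offers then reduces to comparing their weighted scores, which is what makes the tradeoffs explicit.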
Incomplete Information
• Don't know the tasks of others in TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets
• For finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment.
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
• Need to pick something up at a store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
– c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
bull I think Arrows show reasons we know this fact (diagonal arrows are between domains) Rule beginning is a fixed point
bull For example What is true of a phantom task may be true for a decoy task in same domain as a phantom is just a decoy task we donrsquot have to create
bull Similarly what is true for a mixed deal may be true for an all or nothing deal (in the same domain) as a mixed deal is an all or nothing deal where one choice is empty The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals – "Hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((f,a / e,b,c,d) : 1/2) = 4 (each saves going to two places)
• Under the lie, dividing as ((e,f / d,c,a,b) : p) cannot work ("you always win and I always lose"): since the work is the same, swapping cannot help, and in a mixed deal the choices must be unbalanced
• Try again under the lie: ((a,b / c,d,e,f) : p)
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6, so p = 3/4
• Utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: the 1/4 of the time I am assigned c,d,e,f, I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (what I was assigned originally) – zero utility for that branch
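The p = 3/4 and 4.5 on this slide can be verified mechanically (a check, not part of the original deck):

```python
from fractions import Fraction

# Indifference equation from the slide: p*4 + (1-p)*0 = p*2 + (1-p)*6
# => 4p = -4p + 6  =>  p = 6/8 = 3/4
p = Fraction(6, 8)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6

# A1's real expected utility: 6 when he wins the toss (prob p), 0 when he
# loses (he must still deliver to hidden node a himself), vs 4 under truth.
u_lie = p * 6 + (1 - p) * 0    # 9/2 = 4.5
u_truth = 4
```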
110
Modular
[Figure: delivery graph for the modular examples Ex3 and Ex4]
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides for the present negotiation (as in a court of law – cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
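The retrieve/select steps can be sketched in a few lines of Python. This is illustrative only: the case attributes (`wage_increase`, `pensions`) and solutions are invented, not taken from PERSUADER.

```python
# Hypothetical precedent cases from past labor negotiations.
cases = [
    {"features": {"wage_increase": 4, "pensions": 1}, "solution": "3% raise, keep pension"},
    {"features": {"wage_increase": 8, "pensions": 0}, "solution": "6% raise, defer pension"},
]

def similarity(a, b):
    # Negative L1 distance over the shared numeric features
    return -sum(abs(a[k] - b[k]) for k in a)

def retrieve(problem):
    # Retrieve + select: the most conceptually similar precedent case
    return max(cases, key=lambda c: similarity(problem, c["features"]))

best = retrieve({"wage_increase": 7, "pensions": 0})
```

A real CBR system would then adapt `best["solution"]` to the current dispute (the construct/evaluate/modify steps).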
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures ("intentional reminding")
  – Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
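A minimal sketch of this selection rule (maximize joint payoff, break ties by minimizing the payoff difference); the proposals and utilities below are invented for illustration:

```python
# Candidate compromises with (company utility, union utility) pairs.
proposals = {
    "A": (6, 2),
    "B": (5, 3),
    "C": (4, 4),
}

def score(utils):
    company, union = utils
    # Lexicographic: joint payoff first, then smaller payoff difference.
    return (company + union, -abs(company - union))

best = max(proposals, key=lambda name: score(proposals[name]))
```

All three invented proposals have joint payoff 8, so the rule falls back to the difference criterion and selects the equal split.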
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory behavior
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
Worth function for first agent
[Figure: worth rises from 0 at 09:00, through 12:00, to 100 at 16:00]
(Ref: Rosenschein & Zlotkin, 1994)
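A sketch of the two worth functions in Python. The linear 0-to-100 shape over 09:00–16:00 is an assumption read off the figure, and the second agent's mirror-image curve is likewise an assumption:

```python
# Worth functions for the meeting-time example (hours on a 9..16 scale).
def worth_agent1(hour):
    # Prefers later in the day: 0 at 09:00 rising to 100 at 16:00
    return round(100 * (hour - 9) / (16 - 9))

def worth_agent2(hour):
    # Prefers earlier in the day: mirror image of agent 1
    return round(100 * (16 - hour) / (16 - 9))

# Worth of a few candidate times to each agent:
joint = {t: (worth_agent1(t), worth_agent2(t)) for t in [9, 12, 16]}
```

A mid-day compromise such as 12:00 gives each agent roughly half of their maximum worth, which is the flexibility the slide describes.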
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility of Agent i and Agent j versus number of negotiation rounds; the two curves meet at the point of acceptance]
123
Utility Graphs - no agreement
• No agreement
• Agent j finds the offer unacceptable
[Figure: utility of Agent i and Agent j versus number of negotiation rounds; the curves never cross, so no agreement is reached]
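Both graphs can be reproduced with a toy concession model (all numbers are illustrative, not from the slides): each round both agents lower their demanded utility by a fixed step; they agree once the demands fit the available total, and negotiation fails if either agent would have to go below its reservation utility.

```python
def negotiate(demand_i, demand_j, step, reserve_i, reserve_j, total=100):
    for rounds in range(1, 50):
        demand_i -= step
        demand_j -= step
        if demand_i < reserve_i or demand_j < reserve_j:
            return None            # an agent finds the offer unacceptable
        if demand_i + demand_j <= total:
            return rounds          # point of acceptance
    return None

converges = negotiate(90, 80, step=10, reserve_i=0, reserve_j=0)    # agrees
fails = negotiate(90, 80, step=10, reserve_i=60, reserve_j=60)      # no deal
```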
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
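A minimal sketch of checking such an argument, assuming the database is restricted to facts and Horn-style rules (the particular formulae below are invented, borrowing the cheese example from the next slide):

```python
def provable(sentence, grounds):
    # Facts are strings; rules are (premises, conclusion) tuples.
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if isinstance(g, tuple)]
    changed = True
    while changed:                      # forward chaining to a fixpoint
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return sentence in facts

database = {"milk_good", "cheese_from_milk",
            (("milk_good", "cheese_from_milk"), "cheese_good")}
grounds = {"milk_good", "cheese_from_milk",
           (("milk_good", "cheese_from_milk"), "cheese_good")}

# An argument requires Grounds ⊆ Database and Sentence provable from Grounds:
is_argument = grounds <= database and provable("cheese_good", grounds)
```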
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  – A attacks B ≡ A u B or A r B
  – A defeats B ≡ A u B or (A r B and not B u A)
  – A strongly attacks B ≡ A attacks B and not B u A
  – A strongly undercuts B ≡ A u B and not B u A
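These definitions translate directly into code over pair relations; the concrete undercut/rebut relations below are invented examples:

```python
# Ordered pairs (x, y) mean "x undercuts y" / "x rebuts y".
u = {("a", "b")}
r = {("b", "a"), ("a", "c")}

def attacks(x, y):            return (x, y) in u or (x, y) in r
def defeats(x, y):            return (x, y) in u or ((x, y) in r and (y, x) not in u)
def strongly_attacks(x, y):   return attacks(x, y) and (y, x) not in u
def strongly_undercuts(x, y): return (x, y) in u and (y, x) not in u
```

Note how b's rebuttal of a does not defeat a, because a undercuts b back.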
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
(so su ⊆ sa ⊆ d ⊆ a)
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is in (true) unless proven false
131
Admissible Arguments – mutually defensible
1. An argument x is attacked (by a set S) if some y attacks x and no member of S attacks y
2. An argument x is acceptable (w.r.t. S) if every attacker of x is attacked by S
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: example attack graph over arguments a, b, c, d]
Which sets of arguments can be "in"? c is always attacked; d is always acceptable.
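The admissibility definitions can be run mechanically on a hypothetical reconstruction of the figure's graph (the exact edges are not recoverable from the text, so assume: a and b attack each other, both attack c, and c attacks d):

```python
from itertools import combinations

attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}
args = {"a", "b", "c", "d"}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # every attacker of x is attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for (y, x2) in attacks if x2 == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

admissible_sets = [set(c) for n in range(len(args) + 1)
                   for c in combinations(sorted(args), n)
                   if admissible(set(c))]
```

On this assumed graph, {a, d} and {b, d} come out admissible, no admissible set contains c, and d alone is not admissible (it cannot defend itself against c) — matching the slide's remarks that c is always attacked while d is acceptable once defended.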
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove the loser and the winner changes (notice c is always ahead of the removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming they do not have to return home – like U-Haul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction, move towards the middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a; agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to distribution point)
- Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement?
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly?
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – each is helped by the joint plan
- Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
- Choices
- Compromise continued
- Example: conflict
- Example: semi-cooperative
- Example: semi-cooperative, cont.
- Negotiation Domains: Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non-incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
-
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
  • Grounds ⊆ Database
  • Sentence can be proved from Grounds
(we give reasons for our conclusions)
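As a sketch (not from the slides), this check can be mechanized for a database of facts plus Horn rules; `entails` and `is_argument` are illustrative names:

```python
# A sketch of checking an argument (Grounds, Sentence) against a database
# of atomic facts plus Horn rules; entails/is_argument are illustrative names.

def entails(rules, facts, goal):
    """Forward-chain over Horn rules, each given as (set_of_premises, conclusion)."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in known and premises <= known:
                known.add(conclusion)
                changed = True
    return goal in known

def is_argument(db_facts, db_rules, grounds, sentence):
    # Grounds must come from the (possibly inconsistent) database,
    # and the sentence must be provable from the grounds alone.
    return set(grounds) <= set(db_facts) and entails(db_rules, grounds, sentence)

facts = {"milk_good", "cheese_from_milk"}
rules = [({"milk_good", "cheese_from_milk"}, "cheese_good")]
print(is_argument(facts, rules, facts, "cheese_good"))   # True
```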
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
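The derived notions above can be written directly as predicates over the two base relations; the sketch below takes u (undercuts) and r (rebuts) as sets of ordered pairs, with illustrative names:

```python
# Sketch of the derived attack relations, with u (undercuts) and r (rebuts)
# given as sets of ordered pairs (attacker, attacked).

def attacks(u, r, a, b):
    return (a, b) in u or (a, b) in r

def defeats(u, r, a, b):
    return (a, b) in u or ((a, b) in r and (b, a) not in u)

def strongly_attacks(u, r, a, b):
    return attacks(u, r, a, b) and (b, a) not in u

def strongly_undercuts(u, a, b):
    return (a, b) in u and (b, a) not in u

u = {("A", "B")}    # A undercuts B
r = {("B", "A")}    # B rebuts A
print(strongly_attacks(u, r, "A", "B"))   # True: A attacks B, B does not undercut A
print(defeats(u, r, "B", "A"))            # False: B rebuts A, but A undercuts B
```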
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is in (true) unless proven otherwise
131
Admissible Arguments – mutually defensible
1. argument x is attacked (w.r.t. a set of arguments S) if some attacker y of x (y → x) is itself not attacked by any member of S
2. argument x is acceptable (w.r.t. S) if every attacker of x is attacked by S
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
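These definitions can be checked mechanically. The sketch below enumerates the admissible sets of a small attack graph (the graph itself is an assumed example, not necessarily the one pictured on the next slide):

```python
# Sketch of Dung-style abstract argumentation: atk is a set of pairs (x, y)
# meaning "x attacks y".
from itertools import combinations

def conflict_free(S, attacks):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S, attacks):
    """x is acceptable w.r.t. S if every attacker of x is attacked by S."""
    attackers = {y for (y, z) in attacks if z == x}
    return all(any((s, y) in attacks for s in S) for y in attackers)

def admissible(S, attacks):
    return conflict_free(S, attacks) and all(acceptable(x, S, attacks) for x in S)

# Assumed example: a and b attack each other, both attack c, c attacks d.
atk = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}
args = {"a", "b", "c", "d"}
sets = [set(c) for n in range(len(args) + 1) for c in combinations(sorted(args), n)]
print([sorted(S) for S in sets if admissible(S, atk)])
# [[], ['a'], ['b'], ['a', 'd'], ['b', 'd']] -- c never appears; d is defensible
```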
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked;
d is always acceptable.
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove loser, winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms, Protocols, Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved?
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home – like Uhaul)
- Dominant Deals
- Negotiation Set: Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – One direction, move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a, agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement?
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly?
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – each is helped by joint plan
- Examples: Compromise – both can succeed, but worse for both than if other agent weren't there
- Choices
- Compromise continued
- Example: conflict
- Example: semi-cooperative
- Example: semi-cooperative, cont.
- Negotiation Domains: Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility?
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise: Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods: Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods: Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition: Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
69
What if choices don't benefit others fairly?
• Suppose there are two states that satisfy both agents
• State 1: costs one agent 6 and the other agent 2
• State 2: costs both agents 5
• State 1 is cheaper overall, but State 2 is more equal. How can we get cooperation (why should one agent agree to do more)?
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that each agent would get a certain plan
• Called a mixed deal – a deal with probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
costᵢ(δ) = p·c(Jᵢ) + (1−p)·c(Jₖ), where k is i's opponent – the role i plays with probability (1−p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
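As a small sketch of these two formulas (function names are illustrative), using the parcel-delivery cost function from the next slide:

```python
# Sketch of the mixed-deal cost and utility formulas.
# A deal is (J, p): agent i performs its own role J[i] with probability p
# and its opponent's role with probability 1 - p.

def cost(c, J, i, p):
    k = 1 - i                          # i's opponent in a two-agent deal
    return p * c[J[i]] + (1 - p) * c[J[k]]

def utility(c, J, i, p, alone):
    # utility = cost of achieving the goal alone - expected cost under the deal
    return c[alone[i]] - cost(c, J, i, p)

# Parcel-delivery costs: c({}) = 0, c({a}) = c({b}) = 1, c({a,b}) = 3
c = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
alone = (frozenset("a"), frozenset("ab"))      # the agents' original task sets
J = (frozenset(), frozenset("ab"))             # deal: agent 1 delivers nothing
print(utility(c, J, 0, 1.0, alone))            # 1.0, matching Utility1({}, {a,b}) = 1
```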
72
Parcel Delivery Domain (assuming do not have to return home)
[Figure: a distribution point connected to city a and city b at cost 1 each; the cities are 2 apart]
Cost function: c(∅) = 0, c({a}) = 1, c({b}) = 1, c({a,b}) = 3
Utility for agent 1 (originally assigned a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = −2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally assigned {a,b}):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with probability
• (∅, {a,b})_p means agent 1 does ∅ with probability p and {a,b} with probability (1−p)
• What should p be to be fair to both (equal utility)?
• utility for agent 1: (1−p)(−2) + p·1
• utility for agent 2: (1−p)(3) + p·0
• (1−p)(−2) + p·1 = (1−p)(3) + p·0
• −2 + 2p + p = 3 − 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair
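The fair probability can be solved in general by equating the two agents' expected utilities; `fair_p` is an illustrative helper, with the utilities taken from the slide:

```python
# Solve p*u1_a + (1-p)*u1_b = p*u2_a + (1-p)*u2_b for p, where agent 1 gets
# u1_a with probability p and u1_b otherwise (agent 2 correspondingly).

def fair_p(u1_a, u1_b, u2_a, u2_b):
    denom = u1_a - u1_b - u2_a + u2_b
    if denom == 0:
        return None                # no single p equalizes the utilities
    return (u2_b - u1_b) / denom

# Deal (empty, {a,b})_p: agent 1 gets 1 with prob p, -2 otherwise; agent 2 gets 0 / 3.
print(fair_p(1, -2, 0, 3))   # 0.8333... = 5/6
```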
74
Try again with other choice in negotiation set
• ({a}, {b})_p means agent 1 does a with probability p and b with probability (1−p)
• What should p be to be fair to both (equal utility)?
• utility for agent 1: (1−p)(0) + p·0 = 0
• utility for agent 2: (1−p)(2) + p·2 = 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair? Agent 1's utility is 0 and agent 2's is 2 under either assignment, so no mixture changes them.
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅); p] such that NS(m) = max NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of the utilities (as this is a fairer division). Try dividing a total utility of 10 (zero sum) in various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
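The suggested exercise can be run directly: for a fixed total of 10 split between two agents, the product of utilities peaks at the equal division.

```python
# Dividing a total utility of 10 between two agents: the product of the
# two utilities is maximized at the equal split (5, 5).

splits = [(u, 10 - u) for u in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best, best[0] * best[1])   # (5, 5) 25
```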
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; Agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; Agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2) – but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth of 6 to the final goal, we can use probability to make it "fair"
• Assign the (2, 6) role split with probability p
• Utility for agent 1 = p(1) + (1−p)(−3)   (loses utility if it spends 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1−p)(4)
• Solving for p by setting the utilities equal: 4p − 3 = 4 − 4p ⇒ p = 7/8
• Thus we can take an unfair division and make it fair
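The arithmetic above can be checked with exact fractions:

```python
# Checking the slide's arithmetic: with worths 3 (A1) and 6 (A2), equalize
# p*1 + (1-p)*(-3) = p*0 + (1-p)*4, i.e. 4p - 3 = 4 - 4p, so 8p = 7.
from fractions import Fraction

p = Fraction(7, 8)
u1 = p * 1 + (1 - p) * (-3)
u2 = p * 0 + (1 - p) * 4
print(p, u1, u2)   # 7/8 1/2 1/2 -- both agents expect utility 1/2
```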
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50–50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Examplesemi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent's utility: 26 − 4 − 10 = 12
• Losing agent's utility: −4 (as he helped with the swap)
• So with probability 1/2 each: 12(1/2) + (−4)(1/2) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (removing the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation covers a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective
• Example – multi-agent tile world (like an airport shuttle): it isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute:
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
– price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
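The weighted-attribute scheme is a one-liner; the sketch below uses the slide's weights, with the added assumption that each attribute has been normalized to [0, 1] (for price, higher = cheaper):

```python
# Weighted-attribute utility with the slide's weights (60/15/25).
# Assumption: each attribute score is pre-normalized to [0, 1].

WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(offer):
    return sum(WEIGHTS[attr] * offer[attr] for attr in WEIGHTS)

print(utility({"price": 0.8, "quality": 0.5, "support": 0.6}))  # about 0.705
```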
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
  • Hiding letters
  • Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is less than or equal to the sum of the costs of the separate sets:
• for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with equality (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces, e.g. electrical power costs where I go above a threshold and have to buy new equipment
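The condition can be brute-force checked over a small task set; `is_subadditive` is an illustrative helper, and the two cost tables are made-up examples of a tree-like domain and a non-subadditive one:

```python
# Brute-force check that a cost function satisfies c(X | Y) <= c(X) + c(Y).
from itertools import chain, combinations

def powerset(tasks):
    s = list(tasks)
    return [frozenset(comb) for comb in
            chain.from_iterable(combinations(s, n) for n in range(len(s) + 1))]

def is_subadditive(tasks, c):
    subs = powerset(tasks)
    return all(c[x | y] <= c[x] + c[y] for x in subs for y in subs)

tree = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 2}
print(is_subadditive("ab", tree))   # True: sharing the route saves distance
far = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
print(is_subadditive("ab", far))    # False: doing both costs more than the pieces
```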
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at a store (you can invent something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y
• When another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• The arrows appear to show the reasons we know each fact (diagonal arrows are between domains); the rule at the beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modularity encourages truth-telling more than the other domain classes
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X) for X ⊆ Y
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
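Concavity can also be checked exhaustively on a small domain; `is_concave` is an illustrative helper, and the two cost tables are made-up examples (a modular cost, which is concave, and a loop-like cost, which is not):

```python
# Brute-force concavity check: c(Y | Z) - c(Y) <= c(X | Z) - c(X) whenever X <= Y.
from itertools import chain, combinations

def powerset(tasks):
    s = list(tasks)
    return [frozenset(comb) for comb in
            chain.from_iterable(combinations(s, n) for n in range(len(s) + 1))]

def is_concave(tasks, c):
    subs = powerset(tasks)
    return all(c[y | z] - c[y] <= c[x | z] - c[x]
               for x in subs for y in subs if x <= y for z in subs)

modular = {s: len(s) for s in powerset("ab")}   # c(X) = |X| is modular, hence concave
print(is_concave("ab", modular))                # True
loop = {frozenset(): 0, frozenset("a"): 1, frozenset("b"): 1, frozenset("ab"): 3}
print(is_concave("ab", loop))                   # False: Z={b} adds 2 to Y={a} but only 1 to X={}
```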
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• Agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table characterizing the relationships (implied relationships between cells, and with the same domain attribute)
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, under any optimal negotiation mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Example: if A1 hides his letter to c, his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected utility: ({a,b,c}, ∅)_1/2 = 5
• If he lies, p = 1/2 (as the apparent utility is the same)
• Expected utility (for agent 1): ({a,b,c}, ∅)_1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in a subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out)
• FP3: in a concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit):

  A1 tasks | U(1) | A2 tasks | U(2) (seems) | U(2) (actual)
  a        |  2   | bc       |  4           |  4
  b        |  4   | ac       |  2           |  2
  bc       |  2   | a        |  4           |  2
  ab       |  0   | c        |  6           |  6

Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
(We assume some way of deciding who gets the better deal that is fair over time.)
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 is ({a,b}, ∅)_1/2 = ½(4) + ½(2) = 3
• Lie: ({b}, {c,a}) is the logical division, so no probability is needed. The utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h")
• Example 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 31/18 ≈ 1.72 (if A1 delivers, he doesn't actually deliver to h)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as:
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3) ⇒ 14p = 9
• If he invents task h, p = 11/18, as:
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5) ⇒ 18p = 11
• Utility(p = 9/14) = p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) = p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
107
Postmen ndash return to postoffice
[Figure: three delivery graphs – one concave, one subadditive (h is the decoy task), one with a phantom letter]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, under any ONM over pure deals, a "hide" lie can be beneficial (you think I have fewer tasks, so the increased load appears to cost more than it really does)
• Example 3 (from the next slide): A1 hides his letter to node b
• Deal ({e}, {b}): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR under the lie
• Deal ({b}, {e}): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial
• Example 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under the truth: Util({f,a,e}, {b,c,d})_1/2 = 4 (each saves going to two nodes)
• Under the lie, dividing as ({e,f,d}, {c,a,b})_p: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie with ({a,b}, {c,d,e,f})_p:
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6 ⇒ p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when A1 is assigned {c,d,e,f} (1/4 of the time), he STILL has to deliver to node a (after completing the agreed-upon deliveries), so he ends up going to 5 places – which is what he was assigned originally: zero utility there
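Checking the FP8 arithmetic with exact fractions:

```python
# Under the hide lie, the split must satisfy p*4 + (1-p)*0 = p*2 + (1-p)*6,
# giving p = 6/8 = 3/4, and an actual expected utility for agent 1 of
# (3/4)*6 + (1/4)*0 = 4.5 > 4 under truth.
from fractions import Fraction

p = Fraction(6 - 0, (4 - 0) - (2 - 6))
actual = p * 6 + (1 - p) * 0
print(p, float(actual))   # 3/4 4.5
```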
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when each protocol is appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reasons for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
-
70
Mixed deal
• Instead of picking the plan that is unfair to one agent (but better overall), use a lottery
• Assign a probability that one agent gets a certain plan
• Called a mixed deal – a deal with a probability. Compute the probability so that the expected utility is the same for both
71
Cost
• If δ = (J, p) is a deal, then
costi(δ) = p·c(Ji) + (1-p)·c(Jk), where k is i's opponent – the role i plays with probability (1-p)
• Utility is simply the difference between the cost of achieving the goal alone and the expected cost under the joint plan
• For the postman example:
72
Parcel Delivery Domain (assuming do not have to return home)
[Diagram: distribution point connected to city a and city b, each edge of cost 1; an edge of cost 2 joins a and b]
Cost function: c(∅)=0, c({a})=1, c({b})=1, c({a,b})=3
Utility for agent 1 (originally delivers a):
1. Utility1({a}, {b}) = 0
2. Utility1({b}, {a}) = 0
3. Utility1({a,b}, ∅) = -2
4. Utility1(∅, {a,b}) = 1
…
Utility for agent 2 (originally delivers a and b):
1. Utility2({a}, {b}) = 2
2. Utility2({b}, {a}) = 2
3. Utility2({a,b}, ∅) = 3
4. Utility2(∅, {a,b}) = 0
…
73
Consider deal 3 with probability
• ({a,b}, ∅):p means agent 1 delivers nothing with probability p and delivers {a,b} with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p => p = 5/6
• If agent 1 does no deliveries 5/6 of the time, the deal is fair
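The arithmetic above can be checked mechanically; a small sketch using exact rationals (utilities taken from this example):

```python
from fractions import Fraction

def expected_utilities(p):
    # With probability p agent 1 delivers nothing; with (1-p) it delivers {a,b}.
    u1 = p * 1 + (1 - p) * (-2)   # agent 1: utility 1 if idle, -2 if delivering all
    u2 = p * 0 + (1 - p) * 3      # agent 2: utility 0 if delivering all, 3 if idle
    return u1, u2

# Solve p(1) + (1-p)(-2) = (1-p)(3) + p(0)  =>  3p - 2 = 3 - 3p  =>  p = 5/6
p = Fraction(5, 6)
u1, u2 = expected_utilities(p)
assert u1 == u2 == Fraction(1, 2)  # both agents expect utility 1/2
```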
74
Try again with other choice in negotiation set
• ({a}, {b}):p means agent 1 delivers a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair? Agent 1 gets utility 0 and agent 2 gets utility 2 under either assignment, so no lottery between them changes the split
75
Mixed deal
• All-or-nothing deal (one agent does everything): the mixed deal m = [(TA ∪ TB, ∅) : p] such that NS(m) = max over deals d of NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose to use one which maximizes the product of the utilities (as that is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer
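The product-maximization claim can be sketched directly by enumerating integer splits of the total of 10 suggested above:

```python
# Enumerate integer splits of a fixed total utility of 10 and find the
# split that maximizes the product u1 * u2 (the Nash bargaining product).
splits = [(u1, 10 - u1) for u1 in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
assert best == (5, 5)  # the even split maximizes the product (5 * 5 = 25)
```

Extreme splits such as (10, 0) have product 0, which is why maximizing the product penalizes unfair divisions.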
77
Examples: Cooperative – Each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; Agent 2 wants white in 2 (both goals are compatible)
• Assume a pick up costs 1 and a set down costs 1
• Mutually beneficial – each can pick up at the same time, costing each 2 – a win, as neither had to move the other block out of the way
• If done by one agent, the cost would be four – so the utility to each is 2
78
Examples: Compromise – Both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1 but not on the table; Agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one goal alone is two; the cost for achieving both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns worth (utility) 3 and A2 assigns worth (utility) 6 to the final goal, we could use probability to make it "fair"
• Assign the (2, 6) split (A1 takes the cost-2 role) p of the time
• Utility for agent 1 = p(1) + (1-p)(-3) – it loses utility if it pays cost 6 for benefit 3
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus I can take an unfair division and make it fair
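The p = 7/8 result can be verified directly (payoffs as derived above from worths 3 and 6 and role costs 2 and 6):

```python
from fractions import Fraction

# A1 (worth 3): cost-2 role gives utility 3-2 = 1; cost-6 role gives 3-6 = -3.
# A2 (worth 6): cost-6 role gives 6-6 = 0; cost-2 role gives 6-2 = 4.
# A1 takes the cheap role with probability p; equalize expected utilities:
# p(1) + (1-p)(-3) = p(0) + (1-p)(4)  =>  4p - 3 = 4 - 4p  =>  p = 7/8
p = Fraction(7, 8)
u1 = p * 1 + (1 - p) * (-3)
u2 = p * 0 + (1 - p) * 4
assert u1 == u2 == Fraction(1, 2)  # both expect utility 1/2
```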
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win. Could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way – as his utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 2 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal alone costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, then flip a coin to see who gets his way for the rest
83
Example semi-cooperative cont
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as it helped with the swap)
• So with ½ probability each: 12·½ + (-4)·½ = 4
• If they could have both been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation covers a joint plan, schedules, and goal relaxation. The agents may reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle) – worth isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality – Pareto optimal. Need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
– Try to find the Pareto optimum
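The weighted-attribute scheme above can be sketched in a few lines (the 60/15/25 weights come from the slide; the offer scores are made-up values normalized to [0, 1]):

```python
# Weighted-attribute utility: weights from the slide, scores illustrative.
weights = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    """scores: attribute -> score normalized to [0, 1]."""
    return sum(weights[a] * scores[a] for a in weights)

offer = {"price": 0.8, "quality": 0.5, "support": 0.6}
print(round(utility(offer), 3))  # 0.705
```

Note that with raw (unnormalized) attribute values the weights would silently favor whichever attribute has the largest scale, which is why the scores are normalized first.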
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for a lie
• Possible lies:
– False information
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – combining adds only a sub-cost
• For finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of strictly subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with equality (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where exceeding a threshold means I have to buy new equipment
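The subadditivity condition is easy to test on a small cost table. The costs below are illustrative, mimicking the tree-like delivery layout mentioned above where a shared path is saved:

```python
from itertools import combinations

# Illustrative cost table: delivering to both a and b (cost 3) is cheaper
# than delivering to each separately (2 + 2), because a path is shared.
cost = {frozenset(): 0,
        frozenset({"a"}): 2,
        frozenset({"b"}): 2,
        frozenset({"a", "b"}): 3}

def is_subadditive(cost):
    # Check c(X | Y) <= c(X) + c(Y) for every pair of task sets in the table.
    return all(cost[x | y] <= cost[x] + cost[y]
               for x, y in combinations(cost, 2))

print(is_subadditive(cost))  # True
```

Raising c({a, b}) above 4 would break the condition, matching the "exceeding a threshold" counterexample in the last bullet.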
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (you can invent something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
– c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know this fact (diagonal arrows are between domains); the rule at the arrow's start is a fixed point
• For example: what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats – is the postman domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as the agent was going that way anyway) but adds 2 to its superset Y (as the agent was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• Agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util: ({a,b,c}, ∅):1/2 = 5
• Lie: p = 1/2 (as the apparent utility is the same)
• Expected util (for 1): ({a,b,c}, ∅):1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
[Diagram: delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in a Subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out)
• FP3: in a Concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie – the benefits are shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 tasks | U(1) | Agent 2 tasks | U(2) apparent | U(2) actual
a             | 2    | bc            | 4             | 4
b             | 4    | ac            | 2             | 2
bc            | 2    | a             | 4             | 2
ab            | 0    | c             | 6             | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial
• Example from the next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so the utility for agent 1 is (a, b):½ = ½(4) + ½(2) = 3
• Lie: (b, ca) is the logical division, as no probability is involved
• Util for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, under any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h")
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if A1 delivers, he doesn't actually deliver to h)
• If he tells the truth, p (probability of agent 1 delivering all) = 9/14, as:
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
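The two probabilities and the liar's true expected utilities can be recomputed exactly. The payoff numbers are taken from this example; `solve_p` is an illustrative helper that solves the linear equal-utility equation:

```python
from fractions import Fraction

def solve_p(a1_all, a1_none, a2_none, a2_all):
    # Solve p*a1_all + (1-p)*a1_none = p*a2_none + (1-p)*a2_all for p:
    # a1_none + p*(a1_all - a1_none) = a2_all + p*(a2_none - a2_all)
    return Fraction(a2_all - a1_none,
                    (a1_all - a1_none) - (a2_none - a2_all))

p_truth = solve_p(-1, 6, 4, -3)   # equalize apparent utilities under truth
p_lie = solve_p(-3, 6, 4, -5)     # equalize apparent utilities under the decoy lie

def true_util(p):
    # Agent 1's real payoffs are -1 (deliver all) and 6 (deliver none).
    return p * (-1) + (1 - p) * 6

assert p_truth == Fraction(9, 14) and true_util(p_truth) == Fraction(3, 2)
assert p_lie == Fraction(11, 18) and true_util(p_lie) == Fraction(31, 18)
```

Since 31/18 ≈ 1.72 > 1.5, the decoy shifts the lottery in agent 1's favor even though his real costs never changed.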
107
Postmen – return to post office
Concave
Subadditive (h is the decoy)
Phantom
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, under any ONM over pure deals, a "hide" lie can be beneficial (you think I have fewer tasks, so an increased load appears to cost me more than it really does)
• Ex 3 (from the next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((faebcd):1/2) = 4 (each saves going to two nodes)
• Under the lie, divide as (efdcab):p – you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie: (abcdef):p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• Utility is actually:
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that
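Checking the arithmetic for the hide lie (apparent and actual payoffs taken from this example):

```python
from fractions import Fraction

# Apparent utilities under the hide lie:
# agent 1 seems to get 4 with probability p and 0 otherwise;
# agent 2 gets 2 with probability p and 6 otherwise.
# Equalize: p*4 + (1-p)*0 = p*2 + (1-p)*6  =>  8p = 6  =>  p = 3/4
p = Fraction(3, 4)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6

# Agent 1's p-branch is really worth 6 (the hidden letter to a was on his
# route anyway), so the actual expected utility beats the truthful 4.
actual = p * 6 + (1 - p) * 0
assert actual == Fraction(9, 2)  # 4.5 > 4
```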
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• A program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– The repair for a past failure is reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30 more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, …
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Graph: worth function for the first agent, rising from 0 to 100 over the hours 9, 12, 16]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility of Agent i and Agent j vs. number of negotiation rounds; the curves meet at the point of acceptance]
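The convergence picture can be sketched with a toy concession loop. The pie size of 10, the starting demands, and the fixed concession step are illustrative assumptions, not from the slides:

```python
# Minimal concession sketch: both agents concede a fixed step per round;
# agreement is reached once their demands fit within the available surplus.
def negotiate(u_i, u_j, step=1.0):
    """u_i, u_j: each agent's utility demand for its own current offer."""
    rounds = 0
    while u_i + u_j > 10:   # demands still incompatible (illustrative pie of 10)
        u_i -= step         # agent i concedes
        u_j -= step         # agent j concedes
        rounds += 1
    return rounds, u_i, u_j

print(negotiate(9.0, 8.0))  # (4, 5.0, 4.0): agreement after 4 rounds
```

If neither agent conceded (step = 0 for both), the loop would never terminate, which is the "no agreement" picture on the next slide.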
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility of Agent i and Agent j vs. number of negotiation rounds; the curves never meet, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database, and
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts, →a = attacks):
– A attacks B = A →u B or A →r B
– A defeats B = A →u B or (A →r B and not B →u A)
– A strongly attacks B = A →a B and not B →u A
– A strongly undercuts B = A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u - →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) - →u⁻¹
Defeats = →d = →u ∪ (→r - →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x)
2. Argument x is acceptable with respect to S if every attacker of x is attacked by some member of S
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
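The definitions above can be run on a small example; the four arguments and the attack relation below are illustrative:

```python
from itertools import combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "c"), ("d", "c")}  # (x, y) means x attacks y

def conflict_free(s):
    # No member of s attacks another member of s.
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # Every attacker y of x is itself attacked by some member of s.
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [set(c) for r in range(len(args) + 1)
           for c in combinations(sorted(args), r)]
print([sorted(s) for s in subsets if admissible(s)])
# [[], ['a'], ['d'], ['a', 'd']]
```

Note that {a, c} is not admissible: a defends c against b, but nothing in the set attacks c's other attacker d.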
132
[Argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
71
Cost
bull If = (Jp) is a deal then
costi() = pc(J)i + (1-p)c(J)k where k is irsquos opponent -the role i plays with (1-p) probability
bull Utility is simply difference between cost of achieving goal alone and expected utility of joint plan
bull For postman Example
72
Parcel Delivery Domain (assuming do not have to return home)
Distribution Point
city a city b
1 1
Cost functionc()=0c(a)=1c(b)=1c(ab)=3
Utility for agent 1 (org a)
1 Utility1(a b) = 0
2 Utility1(b a) = 0
3 Utility1(a b ) = -2
4 Utility1( a b) = 1
hellip
Utility for agent 2 (org ab)
1 Utility2(a b) = 2
2 Utility2(b a) = 2
3 Utility2(a b ) = 3
4 Utility2( a b) = 0
hellip
2
73
Consider deal 3 with probability
bull (ab)p means agent 1 does with p probabilty and ab with (1-p) probabilty
bull What should p be to be fair to both (equal utility)bull (1-p)(-2) + p1 = utility for agent 1bull (1-p)(3) + p0 = utility for agent 2bull (1-p)(-2) + p1= (1-p)(3) + p0 bull -2+2p+p = 3-3p =gt p=56bull If agent 1 does no deliveries 56 of the time it is
fair
74
Try again with other choice in negotiation set
bull (ab)p means agent 1 does a with p probabilty and b with (1-p) probabilty
bull What should p be to be fair to both (equal utility)
bull (1-p)(0) + p0 = utility for agent 1bull (1-p)(2) + p2 = utility for agent 2bull 0=2 no solutionbull Can you see why we canrsquot use a p to
make this fair
75
Mixed deal
bull All or nothing deal (one does everything) such that ndash mixed deal m = [(TATB )p] NS (m) = maxNS(d)
bull Mixed deal makes the solution space of deals continuous rather than discrete as it was before
76
bull A symmetric mechanism is in equilibrium if no one is motivated to change strategies We choose to use one which maximizes the product of utilities (as is a fairer division) Try dividing a total utility of 10 (zero sum) various ways to see when product is maximized
bull We may flip between choices even if both are the same just to avoid possible bias ndash like switching goals in soccer
77
Examples CooperativeEach is helped by joint plan
bull Slotted blocks world initially white block is at 1 and black block at 2 Agent 1 wants black in 1 Agent 2 wants white in 2 (Both goals are compatible)
bull Assume pick up is cost 1 and set down is onebull Mutually beneficial ndash each can pick up at the
same time costing each 2 ndash Win ndash as didnrsquot have to move other block out of the way
bull If done by one cost would be four ndash so utility to each is 2
78
Examples CompromiseBoth can succeed but worse for both
than if other agent werenrsquot therebull Slotted blocks world initially white block is at 1 and black block
at 2 two gray blocks at 3 Agent 1 wants black in 1 but not on table Agent 2 wants white in 2 but not directly on table
bull Alone agent 1 could just pick up black and place on white Similarly for agent 2 But would undo others goal
bull But together all blocks must be picked up and put down Best plan one agent picks up black while other agent rearranges (cost 6 for one 2 for other)
bull Can both be happy but unequal roles
79
Choices
bull Maybe each goal doesnrsquot need to be achieved Cost for one is two Cost for both averages four
bull If both value it the same flip a coin to decide who does most of the work p=12
bull What if we donrsquot value the goal the same way Canrsquot really look at utility in same way as the other personrsquos goals changes the original plan
80
Compromise continuedbull Who should get to do the easier role bull If you value it more shouldnrsquot you do more of the work to achieve a
common goal What does this mean if partnerroommate doesnrsquot value a clean house or a good meal
bull Look at worth If A1 assigns worth (utility) of 3 and A2 assigns worth (utility) of 6 to final goal we could use probability to make it ldquofairrdquo
bull Assign (26) p of the timebull Utilty for agent 1= p(1) + (1-p)(-3) loses utilty if takes 6 for benefit 3bull Utility for agent 2 = p(0) + (1-p)4bull Solving for p by setting utitlies equalbull 4p-3 = 4-4pbull p = 78bull Thus I can take an unfair division and make it fair
81
Example conflictbull I want black on white (in slot 1)bull You want white on black (in slot 1)bull Canrsquot both win Could flip a coin to decide who
wins Better than both losing Weightings on coin neednrsquot be 50-50
bull May make sense to have person with highest worth get his way ndash as utility is greater (Would accomplish his goal alone) Efficient but not fair
bull What if we could transfer half of the gained utility to the other agent This is not normally allowed but could work out well
82
Examplesemi-cooperative
bull Both agents want contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
bull Both have (possibly) conflicting goals for other slots
bull To accomplish one Agentrsquos goal by oneself is 26 8 for each swap and 10 for rest (pulling numbers out of the air)
bull Cooperative swap is 4 (pulling numbers out of air)
bull Idea work together to swap and then flip coin to see who gets his way for rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with ½ probability each: ½(12) + ½(−4) = 4
• If they could have both been satisfied (assume the cost for each is then 24), the utility is 26 − 24 = 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other).
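Using the slide's made-up numbers, the gamble's expected value can be checked directly (a sketch; the variable names are mine):

```python
from fractions import Fraction

goal_worth = 26   # worth of one agent fully achieving its goal
coop_swap = 4     # shared cost of the cooperative swap
solo_rest = 10    # cost for the coin-flip winner to finish the rest alone

win = goal_worth - coop_swap - solo_rest       # 12
lose = -coop_swap                              # -4: helped swap, goal unmet
gamble = Fraction(1, 2) * win + Fraction(1, 2) * lose
both_satisfied = goal_worth - 24               # if both goals cost 24 each

print(gamble, both_satisfied)  # 4 2
```

The expected utility of the gamble (4) is double the utility of satisfying both goals (2), which is the slide's point about risking the goal.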
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume the collection of agents has available a set of joint plans; a joint plan is executed by several different agents.
• Note: not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain: Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩ where:
  – E: set of possible environment states
  – Ag: set of possible agents
  – J: set of possible joint plans
  – c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation: a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example: multi-agent Tileworld (like an airport shuttle); worth isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as price, quality, and support: a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
  – Utility = 0.60·price + 0.15·quality + 0.25·support
• Rating/ranking each attribute:
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  – Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum
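The weighted-attribute scheme is just a linear combination of scores. A minimal sketch, assuming each attribute has been normalized to a score in [0, 1] (the offer's scores are hypothetical; the 60/15/25 weights mirror the slide):

```python
def weighted_utility(offer, weights):
    """Linear additive utility: sum of weight * normalized attribute score."""
    return sum(weights[attr] * score for attr, score in offer.items())

weights = {"price": 0.60, "quality": 0.15, "support": 0.25}  # slide's weights
offer = {"price": 0.8, "quality": 0.5, "support": 0.4}       # hypothetical scores
u = weighted_utility(offer, weights)
print(round(u, 3))  # 0.655
```

A linear combination is only one common choice; it cannot express "pay more only if quality exceeds a limit", which is why the slide also lists rankings and constraints.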
89
Incomplete Information
• Don't know the tasks of others in a TOD.
• Solution:
  – Exchange missing information
  – Penalty for a lie
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union of two task sets is at most the sum of their separate costs.
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive TOD (= rather than <): deliveries in opposite directions; doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
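On a small task set, subadditivity can be brute-force checked against every pair of subsets. A sketch with toy cost functions of my own (a "farthest point on a line" delivery cost, and a threshold cost like the slide's power example):

```python
from itertools import combinations

def all_subsets(tasks):
    return [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]

def is_subadditive(tasks, cost):
    """Check c(X U Y) <= c(X) + c(Y) for every pair of subsets."""
    subs = all_subsets(tasks)
    return all(cost(x | y) <= cost(x) + cost(y) for x in subs for y in subs)

# Toy delivery cost: drive out to the farthest requested point (counts once).
farthest = lambda s: max(s, default=0)
# Threshold cost: above a total of 3, new equipment adds a fixed 5.
threshold = lambda s: sum(s) + (5 if sum(s) > 3 else 0)

print(is_subadditive({1, 2, 3}, farthest))   # True
print(is_subadditive({2, 3}, threshold))     # False
```

The threshold function fails because c({2, 3}) = 10 while c({2}) + c({3}) = 5, matching the slide's "not subadditive" power example.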
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Example: need to pick something up at a store. (I can think of something for them to pick up, but if I am the one assigned, I won't bother to make the trip.)
• Example: need to deliver an empty letter (no good, but the deliverer won't discover the lie).
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth telling is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
  – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
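Since the concavity condition quantifies over all X ⊆ Y and all Z, it can also be brute-force checked on a small example (the toy "farthest point" cost is mine, not the slide's):

```python
from itertools import combinations

def is_concave(tasks, cost):
    """Check c(Y U Z) - c(Y) <= c(X U Z) - c(X) for all X subset of Y, all Z."""
    subs = [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]
    return all(cost(y | z) - cost(y) <= cost(x | z) - cost(x)
               for x in subs for y in subs if x <= y
               for z in subs)

# 'Farthest point' delivery cost: the marginal cost of adding Z can only
# shrink as the base set grows, so this cost is concave.
cost = lambda s: max(s, default=0)
print(is_concave({1, 2, 3}, cost))  # True
```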
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule at the beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as sub-additive is a superclass of concave and a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice: modular encourages truth telling more than the others.
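Modularity is an exact equality, so costs that are independent per task (like the fax domain discussed on a later slide) satisfy it, while the "farthest point" cost does not. A brute-force check, with hypothetical per-destination prices:

```python
from itertools import combinations

def is_modular(tasks, cost):
    """Check c(X U Y) == c(X) + c(Y) - c(X ∩ Y) for every pair of subsets."""
    subs = [frozenset(s) for r in range(len(tasks) + 1)
            for s in combinations(tasks, r)]
    return all(cost(x | y) == cost(x) + cost(y) - cost(x & y)
               for x in subs for y in subs)

conn = {"a": 3, "b": 5, "c": 2}           # hypothetical per-destination costs
fax = lambda s: sum(conn[t] for t in s)   # independent costs just add up
print(is_modular(set(conn), fax))         # True

farthest = lambda s: max(s, default=0)    # shared travel breaks the equality
print(is_modular({1, 2}, farthest))       # False
```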
97
For subadditive domain
98
Attributes of task system: Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost a set of tasks Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y. Expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all the shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies sub-additivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes: Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of Characterization of Relationship: implied relationship between cells; implied relationship with the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial.
• Ex: A1 hides a letter to c; his utility doesn't increase.
• If he tells the truth: p = ½, and the expected utility of the deal (abc) at ½ is 5.
• Lie: p = ½ (as the apparent utility is the same). Expected utility (for 1) of (abc) at ½ = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter).
103
• FP2: in a Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out).
• FP3: in a Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost; hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown below (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 takes | U(1) | Agent 2 takes | Seeming U(2) | Actual U(2)
a             | 2    | bc            | 4            | 4
b             | 4    | ac            | 2            | 2
bc            | 2    | a             | 4            | 2
ab            | 0    | c             | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = ½, so the utility for agent 1 of the deal (a, b) at ½ is ½(4) + ½(2) = 3.
• Lie: (bca) is the logical division, as there is no percentage. Util for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h).
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72. (If I deliver, I don't deliver h.)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as:
  p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9.
• If he invents task h, p = 11/18, as:
  p(−3) + (1−p)(6) = p(4) + (1−p)(−5), so 18p = 11.
• Utility (p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5.
• Utility (p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72.
• SO: lying helped.
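Each of the two equal-utility probabilities in FP6 comes from solving one linear equation, so exact arithmetic confirms the slide's numbers (the helper names are mine):

```python
from fractions import Fraction

def equal_util_p(u1_all, u1_none, u2_all, u2_none):
    """p such that p*u1_all + (1-p)*u1_none == p*u2_all + (1-p)*u2_none."""
    return Fraction(u2_none - u1_none,
                    (u1_all - u1_none) - (u2_all - u2_none))

p_truth = equal_util_p(-1, 6, 4, -3)      # 9/14
p_decoy = equal_util_p(-3, 6, 4, -5)      # 11/18, computed from the faked costs

real = lambda p: p * (-1) + (1 - p) * 6   # agent 1's TRUE payoffs either way
print(p_truth, real(p_truth))             # 9/14 3/2
print(p_decoy, real(p_decoy))             # 11/18 31/18
```

The decoy shifts the agreed probability in agent 1's favor, raising his real expected utility from 3/2 to 31/18.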
107
Postmen – return to post office
(Figure panels: Concave; Subadditive, where h is the decoy; Phantom.)
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over Pure deals, a "hide" lie can be beneficial (as you think I have less, so the increased load appears to cost more than it really does).
• Ex3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial.
• Ex4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under the truth: Util(fae, bcd) at ½ = 4 (each saves going to two nodes).
• Under the lie, divide as (efd, cab) at p: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with (abcdef) at p:
  p(4) + (1−p)(0) = p(2) + (1−p)(6); 4p = −4p + 6; p = 3/4.
• The utility is actually ¾(6) + ¼(0) = 4.5.
• Note: when I get assigned cdef (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally): zero utility for that.
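The FP8 arithmetic also checks out with exact fractions (a sketch of the slide's calculation, using its numbers):

```python
from fractions import Fraction

# Apparent utilities under the hide lie: agent 1 gets 4 or 0, agent 2 gets
# 2 or 6. Solve p*4 + (1-p)*0 == p*2 + (1-p)*6 for the mixing probability p.
p = Fraction(6 - 0, (4 - 0) - (2 - 6))   # 3/4
truth_utility = 4                        # slide: utility of telling the truth
lie_utility = p * 6 + (1 - p) * 0        # real payoff: the hidden letter means
                                         # the 'win' branch is worth 6, not 4
print(p, lie_utility)                    # 3/4 9/2
```

So the hide lie yields 9/2 = 4.5 > 4, as claimed.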
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
• TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of a proposal
  – Generation of a counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law: cite previous decisions).
• Process:
  – Retrieve appropriate precedent cases from memory.
  – Select the most appropriate case.
  – Construct an appropriate solution.
  – Evaluate the solution for applicability to the current case.
  – Modify the solution appropriately.
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures: intentional reminding
  – The repair for a past failure is reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Get feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increase the payoff of the rejecting party by a greater amount than the reduction in payoff for the agreeing parties.
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Figure: worth function for the first agent, rising from 0 to 100 over meeting times from 9:00 to 16:00.)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
(Figure: utility vs. number of negotiation rounds for Agent i and Agent j; the curves meet at the point of acceptance.)
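The convergence picture can be sketched as a toy loop in which both agents shave their demands each round until the demands are jointly feasible. This is only an illustration of the graph, not the full monotonic concession protocol; the pie size, starting demands, and concession step are all invented:

```python
from fractions import Fraction

def rounds_to_agreement(demand_i, demand_j, concede=Fraction(1, 10), limit=100):
    """Both agents lower their demanded share each round; agreement is reached
    once the demands fit inside a total pie of 1. Returns the round number,
    or None if no agreement occurs within the limit (the 'no agreement' graph)."""
    for rnd in range(1, limit + 1):
        demand_i -= concede
        demand_j -= concede
        if demand_i + demand_j <= 1:
            return rnd
    return None

print(rounds_to_agreement(Fraction(1), Fraction(1)))  # 5
```

With a concession step of 0 (nobody concedes), the same function returns None, which corresponds to the "no agreement" graph on the next slide.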
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.
(Figure: utility vs. number of negotiation rounds for Agent i and Agent j; the curves never meet.)
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified. Why did the agent pay so much for the car?
  – Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
  1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
  2. Emotional: "How would you feel if it happened to you?"
  3. Visceral: a participant stamps their feet and shows the strength of their feelings.
  4. Kisceral: appeals to the intuitive. "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae;
  – Sentence is a logical formula known as the conclusion;
  – Grounds is a set of logical formulae such that:
    • Grounds ⊆ Database, and
    • Sentence can be proved from Grounds.
  (We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B: A undercuts B, or A rebuts B.
  – A defeats B: A undercuts B, or (A rebuts B and B does not undercut A).
  – A strongly attacks B: A attacks B and B does not undercut A.
  – A strongly undercuts B: A undercuts B and B does not undercut A.
129
Proposition: Hierarchy of attacks
• Undercuts = u
• Strongly undercuts = su = u − u⁻¹
• Strongly attacks = sa = (u ∪ r) − u⁻¹
• Defeats = d = u ∪ (r − u⁻¹)
• Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments).
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: true unless proven false.
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set of arguments S if some member y of S attacks x (y → x).
2. Argument x is acceptable (with respect to S) if every attacker of x is attacked.
3. An argument set is conflict free if none of its members attack each other.
4. A set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked).
132
(Figure: an attack graph over arguments a, b, c, d.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
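The in/out rule above can be computed as a fixpoint over the attack graph. A sketch on a hypothetical graph over a, b, c, d (the slide's actual edges are not shown, so this chain is illustrative: d attacks c, c attacks b, b attacks a):

```python
def grounded_labels(args, attacks):
    """Label arguments IN if every attacker is OUT, and OUT if some attacker
    is IN (a skeptical, grounded labelling; undecidable arguments stay
    unlabelled, e.g. in odd attack cycles)."""
    label = {}
    changed = True
    while changed:
        changed = False
        for x in args:
            if x in label:
                continue
            attackers = [y for (y, z) in attacks if z == x]
            if all(label.get(y) == "OUT" for y in attackers):
                label[x] = "IN"     # unattacked, or all attackers defeated
                changed = True
            elif any(label.get(y) == "IN" for y in attackers):
                label[x] = "OUT"    # has an undefeated attacker
                changed = True
    return label

attacks = {("d", "c"), ("c", "b"), ("b", "a")}   # hypothetical edges
labels = grounded_labels({"a", "b", "c", "d"}, attacks)
print(sorted(labels.items()))
# [('a', 'OUT'), ('b', 'IN'), ('c', 'OUT'), ('d', 'IN')]
```

Here d is unattacked (always acceptable) and c is attacked by d (always out), matching the slide's observations for its own graph.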
133
An Example Abstract Argument System
Compromise continuedbull Who should get to do the easier role bull If you value it more shouldnrsquot you do more of the work to achieve a
common goal What does this mean if partnerroommate doesnrsquot value a clean house or a good meal
bull Look at worth If A1 assigns worth (utility) of 3 and A2 assigns worth (utility) of 6 to final goal we could use probability to make it ldquofairrdquo
bull Assign (26) p of the timebull Utilty for agent 1= p(1) + (1-p)(-3) loses utilty if takes 6 for benefit 3bull Utility for agent 2 = p(0) + (1-p)4bull Solving for p by setting utitlies equalbull 4p-3 = 4-4pbull p = 78bull Thus I can take an unfair division and make it fair
81
Example conflictbull I want black on white (in slot 1)bull You want white on black (in slot 1)bull Canrsquot both win Could flip a coin to decide who
wins Better than both losing Weightings on coin neednrsquot be 50-50
bull May make sense to have person with highest worth get his way ndash as utility is greater (Would accomplish his goal alone) Efficient but not fair
bull What if we could transfer half of the gained utility to the other agent This is not normally allowed but could work out well
82
Examplesemi-cooperative
bull Both agents want contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
bull Both have (possibly) conflicting goals for other slots
bull To accomplish one Agentrsquos goal by oneself is 26 8 for each swap and 10 for rest (pulling numbers out of the air)
bull Cooperative swap is 4 (pulling numbers out of air)
bull Idea work together to swap and then flip coin to see who gets his way for rest
83
Example semi-cooperative cont
bull Winning agent utility 26-4-10 = 12bull Losing agent utility -4 (as helped with swap)bull So with frac12 probability 1212 -412 = 4bull If they could have both been satisfied assume
cost for each is 24 Then utility is 2bull Note they double their utility if they are willing
to risk not achieving the goalbull Note kept just the joint part of the plan that was
more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
bull rdquoDomains where agents assign a worth to each
potential state (of the environment) which captures
its desirability for the agentrdquo (Rosenschein amp Zlotkin 1994)
bull agentrsquos goal is to bring about the state of the environment with
highest value
bull we assume that the collection of agents have available a set of
joint plans ndash a joint plan is executed by several different agents
bull Note ndash not rdquoall or nothingrdquo ndash but how close you got to goal
85
Worth-oriented Domain Definition
bull Can be defined as a tuple
EAgJc
bull E set of possible envirinment states
bull Ag set of possible agents
bull J set of possible joint plans
bull C cost of executing the plan
86
Worth Oriented Domain
bull Rates the acceptability of final statesbull Allows partially completed goalsbull Negotiation a joint plan schedules and goal relaxation May
reach a state that might be a little worse that the ultimate objective
bull Example ndash Multi-agent Tile world (like airport shuttle) ndash isnrsquot just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
bull If you want to pay for some software then you might consider
several attributes of the software such as the price quality and
support ndash multiple set of attributes
bull You may be willing to pay more if the quality is above a given limit
ie you canrsquot get it cheaper without compromising on quality
Pareto Optimal ndash Need to find the price for acceptable quality and
support (without compromising on some attributes)
88
How can we calculate Utility
bull Weighting each attribute
ndash Utility = Price60 + quality15 + support25
bull Ratingranking each attribute
ndash Price 1 quality 2 support 3
bull Using constraints on an attribute
ndash Price[5100] quality[0-10] support[1-5]
ndash Try to find the pareto optimum
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task Oriented Domainbull the cost of the union of sum of the costs of the separate
sets ndash adds to a sub-costbull for finite XY in T c(X U Y) lt= c(X) + c(Y))bull Example of subadditive
ndash Deliver to one saves distance to other (in a tree arrangement)
bull Example of subadditive TOD (= rather than lt)ndash deliver in opposite directions ndashdoing both saves nothing
bull Not subadditive doing both actually costs more than the sum of the pieces Say electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
bull We call producible phantom tasks decoy tasks (no risk of being discovered) Only unproducible phantom tasks are called phantom tasks
bull Example bull Need to pick something up at store (Can think
of something for them to pick up but if you are the one assigned you wonrsquot bother to make the trip)
bull Need to deliver empty letter (no good but deliverer wonrsquot discover lie)
92
Incentive compatible Mechanism
bull L there exists a beneficial lie in some encounterbull T There exists no beneficial liebull TP Truth is dominant if the penalty for lying is stiff
enough
93
Explanation of arrow
bull If it is never beneficial in a mixed deal encounter to use a phntom lie (with penalties) then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domainbull We have 2 tasks X and Y where X is a subset of Ybull Another set of task Z is introduced
ndash c(X U Z) - c(X) gt= c(Y U Z) - c(Y)
95
Tentative Explanation of Previous Chart
bull I think Arrows show reasons we know this fact (diagonal arrows are between domains) Rule beginning is a fixed point
bull For example What is true of a phantom task may be true for a decoy task in same domain as a phantom is just a decoy task we donrsquot have to create
bull Similarly what is true for a mixed deal may be true for an all or nothing deal (in the same domain) as a mixed deal is an all or nothing deal where one choice is empty The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth: p = 1/2
• Expected util: (abc):1/2 = 5
• Under the lie: p = 1/2 (as the declared utility is the same)
• Expected util (for 1): (abc):1/2 = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
103
• FP2: in Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out)
• FP3: in Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (a smaller increase in cost is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost - hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown in the table (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) - just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
1    U(1)   2     U(2) seems   U(2) (act)
a    2      bc    4            4
b    4      ac    2            2
bc   2      a     4            2
ab   0      c     6            6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so utility for agent 1 is (ab):½ = ½(4) + ½(2) = 3
• Lie: (b : ca) is the logical division, as pure deals carry no probability
• Util for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as the lie changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
• p(-1) + (1-p)6 = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as
• p(-3) + (1-p)6 = p(4) + (1-p)(-5)
• Utility(p=9/14) is p(-1) + (1-p)6 = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p=11/18) is p(-1) + (1-p)6 = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO - lying helped
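Both probabilities come from equating the agents' declared expected utilities. A small exact-arithmetic check (a sketch using only the payoff numbers on the slide; the helper `solve_p` is ours) reproduces them:

```python
from fractions import Fraction

def solve_p(lhs, rhs):
    # Solve p*lhs[0] + (1-p)*lhs[1] == p*rhs[0] + (1-p)*rhs[1] for p.
    (a, b), (c, d) = lhs, rhs
    return Fraction(d - b, (a - b) - (c - d))

p_truth = solve_p((-1, 6), (4, -3))   # truthful declared payoffs
p_lie = solve_p((-3, 6), (4, -5))     # payoffs with the decoy letter to h
assert p_truth == Fraction(9, 14) and p_lie == Fraction(11, 18)

# Agent 1's real payoffs are -1 (deliver everything) and 6 (deliver nothing):
util = lambda p: p * (-1) + (1 - p) * 6
assert util(p_truth) == Fraction(21, 14)   # 1.5 under truth
assert util(p_lie) == Fraction(31, 18)     # about 1.72 under the decoy lie
```

The decoy shifts the agreed probability from 9/14 to 11/18 while agent 1's real payoffs are unchanged, which is exactly why the lie helps.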
107
Postmen - return to post office
Concave
Subadditive(h is decoy)
Phantom
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, for any ONM over Pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost me more than it really does)
• Ex3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 - UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util(fae : bcd):1/2 = 4 (each saves going to two nodes)
• Under the lie, dividing as (ef : dcab):p cannot work (you always win and I always lose; since the work is the same, swapping cannot help - in a mixed deal the choices must be unbalanced)
• Try again under the lie: (ab : cdef):p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually
• ¾(6) + ¼(0) = 4.5
• Note: when I get assigned cdef (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) - zero utility there
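The p = 3/4 comes from equating the declared expected utilities, while A1's real payoff uses 6 instead of the declared 4. A quick check with the slide's numbers:

```python
from fractions import Fraction

# Declared utilities under the lie: A1 gets 4 if assigned ab, 0 if assigned
# cdef; A2 gets 2 and 6 respectively.  Fairness equates the declared values:
#   p*4 + (1-p)*0 = p*2 + (1-p)*6  =>  8p = 6  =>  p = 3/4
p = Fraction(3, 4)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6  # both sides equal 3

# A1's real utility: being assigned ab is actually worth 6 (the hidden
# letter to a gets delivered as part of the deal); being assigned cdef is
# worth 0 (he still has to visit a afterwards).
real_utility = p * 6 + (1 - p) * 0
assert real_utility == Fraction(9, 2)  # 4.5 > 4, so hiding paid off
```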
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law - citing previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused; reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes the joint payoff
– Minimizes the payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Changing the importance attached to an issue
– Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory - 30% more proposals
• Without argumentation - fewer proposals and better solutions
• No failure avoidance - more proposals with objections
• No preference analysis - oscillatory behavior
• No feedback - communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Chart: worth function for the first agent - worth ranges from 0 to 100 across meeting times 9:00, 12:00, 16:00)
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
(Graph: utility vs. number of negotiation rounds over time - Agent i's and Agent j's offers converge to a point of acceptance)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
(Graph: utility vs. number of negotiation rounds over time - Agent i's and Agent j's offer curves never meet)
124
Argumentation
bull The process of attempting to convince others of
something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified - why did the agent pay so much for the car?
• Positions cannot be changed - initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• Four modes of argument (Gilbert, 1994):
1. Logical - "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional - "How would you feel if it happened to you?"
3. Visceral - participants stamp their feet and show the strength of their feelings
4. Kisceral - appeals to the intuitive - doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B = A undercuts B or A rebuts B
– A defeats B = A undercuts B or (A rebuts B and B does not undercut A)
– A strongly attacks B = A attacks B and B does not undercut A
– A strongly undercuts B = A undercuts B and B does not undercut A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
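Treating u (undercuts) and r (rebuts) as sets of ordered pairs, the set algebra above can be checked mechanically. The example relations below are made up for illustration:

```python
# Attack relations as sets of ordered pairs (attacker, target); u = undercut,
# r = rebut.  The example pairs are hypothetical.
def inverse(rel):
    return {(y, x) for (x, y) in rel}

u = {("a", "b"), ("b", "c")}
r = {("c", "b"), ("d", "a")}

su = u - inverse(u)            # strongly undercuts
sa = (u | r) - inverse(u)      # strongly attacks
d = u | (r - inverse(u))       # defeats
a = u | r                      # attacks

# The hierarchy implied by the definitions: su is contained in sa,
# sa in d, and d in a.
assert su <= sa <= d <= a
```

The containment chain holds for any choice of u and r, which is the point of the proposition on the slide.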
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all of its attackers are defeated
• Assumption: an argument is in (true) unless proven otherwise
131
Admissible Arguments - mutually defensible
1. argument x is attacked by a set S if some member y of S attacks x (y → x)
2. argument x is acceptable with respect to S if every attacker of x is attacked by S
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
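These definitions translate directly into code. The attack graph below is a hypothetical example, not the one on the following slide:

```python
# Dung-style admissibility, following the four definitions above.
# Hypothetical attack graph: a -> b, b -> c, c -> d.
attacks = {("a", "b"), ("b", "c"), ("c", "d")}

def conflict_free(S, attacks):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S, attacks):
    # every attacker y of x must itself be attacked by some member of S
    return all(any((z, y) in attacks for z in S)
               for (y, t) in attacks if t == x)

def admissible(S, attacks):
    return conflict_free(S, attacks) and all(acceptable(x, S, attacks) for x in S)

assert admissible({"a", "c"}, attacks)      # a defends c against b
assert not admissible({"b", "c"}, attacks)  # b attacks c: not conflict-free
```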
132
(Diagram: arguments a, b, c, d connected by attack arrows)
Which sets of arguments can be true? c is always attacked; d is always acceptable
133
An Example Abstract Argument System
73
Consider deal 3 with probability
• (∅ : ab):p means agent 1 does nothing with probability p, and does ab with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(-2) + p(1) = utility for agent 1
• (1-p)(3) + p(0) = utility for agent 2
• (1-p)(-2) + p(1) = (1-p)(3) + p(0)
• -2 + 2p + p = 3 - 3p ⇒ p = 5/6
• If agent 1 does no deliveries 5/6 of the time, it is fair
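The fair p can also be found by a brute-force scan with exact fractions, as a check on the algebra (payoffs taken from the slide):

```python
from fractions import Fraction

u1 = lambda p: (1 - p) * (-2) + p * 1   # agent 1: -2 if it delivers ab, 1 if idle
u2 = lambda p: (1 - p) * 3 + p * 0      # agent 2's utility for the same outcomes

# Scan p = 0/60, 1/60, ..., 60/60 for the point where the utilities are equal.
fair = [Fraction(k, 60) for k in range(61)
        if u1(Fraction(k, 60)) == u2(Fraction(k, 60))]
assert fair == [Fraction(5, 6)]
```

At p = 5/6 both agents expect utility 1/2, so the mixed deal equalizes the outcome.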
74
Try again with the other choice in the negotiation set
• (a : b):p means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2 - no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• An all-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅) : p] such that NS(m) = max NS(d)
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of the utilities (as it is a fairer division). Try dividing a total utility of 10 (zero-sum) various ways to see when the product is maximized
• We may flip between choices even if both are the same, just to avoid possible bias - like switching goals in soccer
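The slide's exercise - split a fixed total utility of 10 and watch the product - can be done by enumeration:

```python
# Enumerate integer splits of a total utility of 10 and find the split
# that maximizes the product of the two utilities.
splits = [(utility, 10 - utility) for utility in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
assert best == (5, 5)  # the even split maximizes the product
```

The product criterion (the Nash bargaining product) peaks at the even split, which is why it reads as the fairer division.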
77
Examples: Cooperative - each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2; Agent 1 wants black in 1, Agent 2 wants white in 2 (both goals are compatible)
• Assume a pick-up costs one and a set-down costs one
• Mutually beneficial - each can pick up at the same time, costing each 2 - a win, as neither had to move the other's block out of the way
• If done by one agent the cost would be four, so the utility to each is 2
78
Examples: Compromise - both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially white is at 1, black at 2, and two gray blocks at 3; Agent 1 wants black in 1 but not on the table, Agent 2 wants white in 2 but not directly on the table
• Alone, agent 1 could just pick up black and place it on white (similarly for agent 2), but that would undo the other's goal
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other)
• Both can be happy, but the roles are unequal
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if your partner/roommate doesn't value a clean house or a good meal?
• Look at worth: if A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we can use probability to make it "fair"
• Assign (2, 6) - agent 1 taking the cost-2 role - p of the time
• Utility for agent 1 = p(1) + (1-p)(-3) (it loses utility if it pays 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus we can take an unfair division and make it fair
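Setting the two expected utilities equal is a one-line linear solve. A sketch with the slide's payoffs (the helper name `fair_p` is ours):

```python
from fractions import Fraction

def fair_p(a1_hi, a1_lo, a2_hi, a2_lo):
    # Solve p*a1_hi + (1-p)*a1_lo == p*a2_hi + (1-p)*a2_lo for p.
    return Fraction(a2_lo - a1_lo, (a1_hi - a1_lo) - (a2_hi - a2_lo))

# Slide's payoffs: agent 1 gets 1 or -3; agent 2 gets 0 or 4.
p = fair_p(1, -3, 0, 4)
assert p == Fraction(7, 8)

# Both agents then expect the same utility, 1/2:
assert p * 1 + (1 - p) * (-3) == p * 0 + (1 - p) * 4 == Fraction(1, 2)
```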
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• We can't both win. We could flip a coin to decide who wins - better than both losing. The weightings on the coin needn't be 50-50
• It may make sense to let the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of the two slots swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air)
• A cooperative swap costs 4 (again, numbers out of the air)
• Idea: work together on the swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12
• Losing agent utility: -4 (as he helped with the swap)
• So with probability ½ each: ½(12) + ½(-4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest worth
• We assume that the collection of agents has available a set of joint plans - a joint plan is executed by several different agents
• Note - not "all or nothing" - but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: the set of possible environment states
• Ag: the set of possible agents
• J: the set of possible joint plans
• c: the cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation covers a joint plan, schedules, and goal relaxation; the agents may reach a state that is a little worse than the ultimate objective
• Example - multi-agent Tileworld (like an airport shuttle) - worth isn't just a specific state but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support - a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal - need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price: [5, 100], quality: [0, 10], support: [1, 5]
– Try to find the Pareto optimum
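A minimal sketch of the weighted scheme. The normalization to [0, 1] and the direction of the price score (cheaper is better) are our assumptions; the slide only gives the weights and the constraint ranges:

```python
# Weighted-attribute utility using the slide's 60/15/25 weights.
def utility(price, quality, support, weights=(0.60, 0.15, 0.25)):
    # Normalize each attribute to [0, 1] using the slide's constraint ranges.
    # Price is a cost, so a cheaper offer scores higher.
    price_score = (100 - price) / (100 - 5)      # price in [5, 100]
    quality_score = quality / 10                 # quality in [0, 10]
    support_score = (support - 1) / (5 - 1)      # support in [1, 5]
    wp, wq, ws = weights
    return wp * price_score + wq * quality_score + ws * support_score

# A cheaper, better-supported offer dominates at these weights:
assert utility(20, 8, 4) > utility(80, 9, 2)
```

Changing the weights shifts which tradeoffs the agent is willing to make, which is the point of the weighting approach.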
89
Incomplete Information
• We don't know the tasks of others in a TOD
• Solution:
– Exchange the missing information
– Penalty for lies
• Possible lies:
– False information
  • Hiding letters
  • Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union ≤ the sum of the costs of the separate sets - it adds a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– Deliveries in opposite directions - doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces - say, electrical power costs where I go above a threshold and have to buy new equipment
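The tree example can be made concrete with a toy cost table for deliveries on a path 0 - a - b, where the agent starts at 0 and returns home (the numbers are illustrative):

```python
# Toy postman costs on the path 0 - a - b (start at 0, return home).
cost = {
    frozenset(): 0,
    frozenset("a"): 2,      # out to a and back
    frozenset("b"): 4,      # out to b (past a) and back
    frozenset("ab"): 4,     # one trip covers both stops
}

def subadditive(c):
    sets = list(c)
    return all(c[x | y] <= c[x] + c[y] for x in sets for y in sets)

assert subadditive(cost)
# Strictly cheaper than making the two trips separately:
assert cost[frozenset("ab")] < cost[frozenset("a")] + cost[frozenset("b")]
```

The strict inequality on the last line is the tree case; deliveries in opposite directions would give equality instead.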
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• TP: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks, Z, is introduced:
– c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the start of an arrow is a fixed point
• For example: what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
74
Try again with other choice in negotiation set
• (a,b):p means agent 1 does a with probability p and b with probability (1-p)
• What should p be to be fair to both (equal utility)?
• (1-p)(0) + p(0) = utility for agent 1
• (1-p)(2) + p(2) = utility for agent 2
• 0 = 2: no solution
• Can you see why we can't use a p to make this fair?
75
Mixed deal
• All-or-nothing deal (one agent does everything), such that for a mixed deal m = [(T_A, T_B):p], NS(m) = max NS(d)
• A mixed deal makes the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose one which maximizes the product of utilities (as it is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even if both are the same, just to avoid possible bias – like switching goals in soccer.
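The product-maximizing claim is easy to check directly; a small sketch using the slide's total of 10:

```python
# Dividing a fixed total utility of 10 between two agents: the product of
# utilities is maximized by the even split, which is why the product is
# used as a fairness criterion.
splits = [(a, 10 - a) for a in range(11)]
best = max(splits, key=lambda s: s[0] * s[1])
print(best)  # (5, 5)
```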
77
Examples: Cooperative – Each is helped by joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; Agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win – as neither had to move the other's block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – Both can succeed, but worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; Agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• But together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Can both be happy, but with unequal roles?
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign (2,6) p of the time.
• Utility for agent 1 = p(1) + (1-p)(-3) – loses utility if it takes cost 6 for benefit 3.
• Utility for agent 2 = p(0) + (1-p)(4).
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus I can take an unfair division and make it fair.
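The p = 7/8 step can be verified with exact arithmetic (the worths 3 and 6 and both utility expressions are taken from the slide):

```python
from fractions import Fraction

# At p = 7/8, the two agents' expected utilities coincide.
p = Fraction(7, 8)
u1 = p * 1 + (1 - p) * (-3)   # agent 1: 4p - 3
u2 = p * 0 + (1 - p) * 4      # agent 2: 4 - 4p
print(u1, u2)  # 1/2 1/2 – the randomized assignment equalizes expected utility
```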
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• Can't both win. Could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way – as the utility is greater. (He would accomplish his goal alone.) Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together to swap, then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12.
• Losing agent utility: -4 (as he helped with the swap).
• So with probability 1/2 each: 1/2(12) + 1/2(-4) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing" – but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. May reach a state that might be a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle) – it isn't just a specific state, but the value of work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a multiple set of attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality. Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
– Utility = price×60% + quality×15% + support×25%
• Rating/ranking each attribute:
– price: 1, quality: 2, support: 3
• Using constraints on an attribute:
– price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
– Try to find the Pareto optimum
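A sketch of the weighted-attribute option; the weights are from the slide, while the 0–1 attribute scores and their normalization are illustrative assumptions:

```python
def utility(price_score, quality_score, support_score):
    """Weighted-sum utility; each score is assumed already normalized to [0, 1]."""
    return 0.60 * price_score + 0.15 * quality_score + 0.25 * support_score

# Hypothetical offer: cheap (0.8), mid quality (0.5), weak support (0.4).
print(utility(0.8, 0.5, 0.4))  # ≈ 0.655
```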
89
Incomplete Information
• Don't know the tasks of others in a TOD.
• Solution:
– Exchange missing information
– Penalty for lying
• Possible lies:
– False information
  • Hiding letters
  • Phantom letters
– Not carrying out a commitment
90
Subadditive Task-Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a "sub-cost".
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
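The subadditivity condition can be checked mechanically; a sketch over a made-up cost table (not from the slides):

```python
from itertools import combinations

cost = {
    frozenset("a"): 3,
    frozenset("b"): 4,
    frozenset("ab"): 5,   # one trip covers both – cheaper than 3 + 4
}

def is_subadditive(cost):
    """Check c(X ∪ Y) <= c(X) + c(Y) for every pair whose union cost is known."""
    return all(cost[x | y] <= cost[x] + cost[y]
               for x, y in combinations(cost, 2) if x | y in cost)

print(is_subadditive(cost))  # True
```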
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Example: you need to pick something up at the store. (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
• Example: you need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive-compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task-Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
– c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains). The rule at the beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth-telling more than the others.
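The modularity equality can be verified directly for a cost function built from independent per-connection costs (as in the fax domain mentioned later); the connection costs here are made up:

```python
from itertools import chain, combinations

conn_cost = {"a": 2, "b": 3, "c": 1}   # hypothetical per-connection costs

def c(tasks):
    return sum(conn_cost[t] for t in tasks)

def subsets(items):
    return [frozenset(s) for s in
            chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))]

# Check c(X ∪ Y) == c(X) + c(Y) - c(X ∩ Y) over all pairs of subsets.
modular = all(c(x | y) == c(x) + c(y) - c(x & y)
              for x in subsets("abc") for y in subsets("abc"))
print(modular)  # True
```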
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats – is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as it was going that way anyway), but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of Characterization of Relationships: implied relationships between cells; implied relationships with the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Ex: A1 hides the letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2.
• Expected util: (abc):1/2 = 5
• Lie: p = 1/2 (as the utility is the same).
• Expected util (for 1): (abc):1/2 = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie).
103
• FP2: in a subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other person delivers the phantom, you are found out).
• FP3: in a concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
1's tasks | U(1) | 2's tasks | U(2) seems | U(2) (actual)
a         | 2    | bc        | 4          | 4
b         | 4    | ac        | 2          | 2
bc        | 2    | a         | 4          | 2
ab        | 0    | c         | 6          | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is (ab):1/2 = 1/2(4) + 1/2(2) = 3.
• Lie: (bc, a) is the logical division, as there is no percentage.
• The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, for any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as it changes the probability. (If you deliver, I make you deliver to h.)
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b,c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72. (If I deliver, I don't deliver h.)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
• p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
• p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped.
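The probabilities above fall out of equating the two agents' expected utilities; a sketch with exact fractions (the payoff numbers are taken from the slide):

```python
from fractions import Fraction as F

def solve_p(u1_all, u1_none, u2_none, u2_all):
    """p such that p*u1_all + (1-p)*u1_none == p*u2_none + (1-p)*u2_all."""
    return F(u1_none - u2_all, (u1_none - u2_all) + (u2_none - u1_all))

p_truth = solve_p(-1, 6, 4, -3)        # truthful encounter
p_lie = solve_p(-3, 6, 4, -5)          # with the decoy letter to h
u1 = lambda p: p * (-1) + (1 - p) * 6  # agent 1's true expected utility
print(p_truth, u1(p_truth))  # 9/14 3/2
print(p_lie, u1(p_lie))      # 11/18 31/18 – the decoy raised agent 1's payoff
```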
107
Postmen ndash return to postoffice
Concave
Subadditive (h is decoy)
Phantom
108
Non-incentive compatible fixed points
• FP7: in a modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost more than it really does).
• Ex3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Ex4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util(faebcd):1/2 = 4 (save going to two).
• Under the lie, divide as (efdcab):p (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie: (abcdef):p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• The utility is actually:
• 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef 1/4 of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
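The p = 3/4 step above can be checked the same way (both sides of the slide's equation, then A1's actual payoff under the lie):

```python
from fractions import Fraction as F

p = F(3, 4)
lhs = p * 4 + (1 - p) * 0       # left side of p(4) + (1-p)(0) = p(2) + (1-p)(6)
rhs = p * 2 + (1 - p) * 6       # right side
utility = p * 6 + (1 - p) * 0   # 6 when A1 wins the toss, 0 otherwise
print(lhs == rhs, utility)      # True 9/2 – i.e. 4.5, beating the truthful 4
```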
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – cite previous decisions).
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
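A minimal sketch of that retrieve-then-adapt loop; the case base, the similarity measure (a shared issue), and the construction rule are all illustrative assumptions:

```python
cases = [  # hypothetical precedent cases
    {"issue": "wages", "increase": 0.04, "outcome": "accepted"},
    {"issue": "wages", "increase": 0.09, "outcome": "rejected"},
    {"issue": "benefits", "increase": 0.02, "outcome": "accepted"},
]

def retrieve(issue):
    """Retrieve precedents that share the conceptual feature 'issue'."""
    return [c for c in cases if c["issue"] == issue]

def propose(issue):
    """Construct a proposal from accepted precedents; rejected ones are avoided."""
    accepted = [c["increase"] for c in retrieve(issue) if c["outcome"] == "accepted"]
    return max(accepted) if accepted else None

print(propose("wages"))  # 0.04 – the largest increase a precedent accepted
```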
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– The repair for a past failure is reused, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
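The two compromise properties can be combined lexicographically; a sketch over a made-up payoff table:

```python
settlements = {        # hypothetical settlements: (union payoff, company payoff)
    "A": (5, 3),
    "B": (4, 4),
    "C": (6, 1),
}

def compromise(settlements):
    """Maximize joint payoff first; break ties by minimizing the payoff difference."""
    return min(settlements,
               key=lambda s: (-sum(settlements[s]),
                              abs(settlements[s][0] - settlements[s][1])))

print(compromise(settlements))  # 'B' – ties A on joint payoff (8), but with difference 0
```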
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Changing the importance attached to an issue
– Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory condition.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth (0–100) increasing with meeting time, over axis ticks 9, 12, 16]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Figure: utility vs. number of negotiations over time – Agent i's and Agent j's offer curves converge at the point of acceptance]
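The converging picture can be sketched as a loop in which each agent concedes a fixed step per round until the offers cross; the step sizes are illustrative assumptions:

```python
def negotiate(demand_i=10.0, demand_j=0.0, step_i=1.0, step_j=1.5, max_rounds=100):
    """Agent i lowers its demand, agent j raises its offer, until they cross."""
    for round_no in range(1, max_rounds + 1):
        demand_i -= step_i
        demand_j += step_j
        if demand_j >= demand_i:          # point of acceptance
            return round_no, (demand_i + demand_j) / 2
    return None                           # no agreement within max_rounds

print(negotiate())  # (4, 6.0)
```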
123
Utility Graphs - no agreement
• No agreement.
• Agent j finds the offer unacceptable.
[Figure: utility vs. number of negotiations over time – Agent i's and Agent j's offer curves fail to cross]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings.
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic-Based Argumentation
• Basic form of an argument:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
  • Grounds ⊆ Database
  • Sentence can be proved from Grounds
(we give reasons for our conclusions)
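A toy sketch of the (Sentence, Grounds) idea, using string atoms and pairs ("A", "B") for the implication A → B; the database and the forward-chaining check are illustrative assumptions:

```python
database = {"milk_good", ("milk_good", "cheese_good")}  # one atom, one implication

def provable(sentence, grounds):
    """Can `sentence` be derived from `grounds` by repeated modus ponens?"""
    known = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if isinstance(g, tuple)]
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return sentence in known

grounds = {"milk_good", ("milk_good", "cheese_good")}
# An argument: the grounds are drawn from the database and prove the sentence.
print(grounds <= database and provable("cheese_good", grounds))  # True
```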
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B ≡ A u B or A r B
– A defeats B ≡ A u B or (A r B and not B u A)
– A strongly attacks B ≡ A a B and not B u A
– A strongly undercuts B ≡ A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption – true unless proven false.
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked if there is some attacker y (y → x) that no member of the set attacks.
2. Argument x is acceptable if every attacker of x is attacked.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
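These definitions can be run on a small assumed attack graph (the slide's actual figure isn't reproduced here, so the edges are illustrative): a and b attack each other, both attack c, and c attacks d:

```python
from itertools import chain, combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is counter-attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for y in args if (y, x) in attacks)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

candidates = chain.from_iterable(combinations(sorted(args), r) for r in range(5))
admissible_sets = [sorted(s) for s in candidates if admissible(set(s))]
print(admissible_sets)  # [[], ['a'], ['b'], ['a', 'd'], ['b', 'd']]
```

Note that c appears in no admissible set (it is always attacked), while d can be defended, matching the remark on the example slide.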
132
[Figure: example argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove loser, winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms, Protocols, Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanism Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home – like U-Haul)
- Dominant Deals
- Negotiation Set: Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction, move towards the middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a, agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to dist. point)
- Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – Each is helped by joint plan
- Examples: Compromise – Both can succeed but worse for both than if other agent weren't there
- Choices
- Compromise continued
- Example: conflict
- Example: semi-cooperative
- Example: semi-cooperative cont.
- Negotiation Domains: Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes – Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non-incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise: Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods: Case-Based Reasoning
- Case-Based Reasoning
- Negotiation Methods: Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs – convergence
- Utility Graphs – no agreement
- Argumentation
- Slide 125
- Logic-Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition: Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
75
Mixed deal
• All-or-nothing deal (one agent does everything): a mixed deal m = [(T_A ∪ T_B, ∅); p] such that NS(m) = max NS(d)
• Mixed deals make the solution space of deals continuous, rather than discrete as it was before
76
• A symmetric mechanism is in equilibrium if no one is motivated to change strategies. We choose the one that maximizes the product of the utilities (as this is a fairer division). Try dividing a total utility of 10 (zero sum) various ways to see when the product is maximized.
• We may flip between choices even when both are equally good, just to avoid possible bias – like switching goals in soccer.
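The product-maximizing criterion can be checked with a short sketch (not part of the original deck; Python used purely for illustration):

```python
# For a fixed total utility of 10 split between two agents, the product
# u1 * u2 is maximized at the even split -- the "fairer division" the
# symmetric mechanism above selects.
def product_of_utilities(u1, u2):
    return u1 * u2

splits = [(u1, 10 - u1) for u1 in range(0, 11)]
best = max(splits, key=lambda s: product_of_utilities(*s))
print(best)  # (5, 5)
```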
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; Agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. Win – neither had to move the other block out of the way.
• If done by one agent, the cost would be four – so the utility to each is 2.
78
Examples: Compromise – both can succeed, but each does worse than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; Agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. Cost for one is two; cost for both averages four.
• If both value the goal the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really compare utilities directly, as the other agent's goals change the original plan.
80
Compromise, continued
• Who should get the easier role?
• If you value the goal more, shouldn't you do more of the work to achieve the common goal? What does this mean when a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth of 6 to the final goal, we can use probability to make the division "fair".
• Assign the (cost 2, cost 6) roles with probability p.
• Utility for agent 1 = p(1) + (1-p)(-3)  (it loses utility if it takes on cost 6 for a benefit of 3)
• Utility for agent 2 = p(0) + (1-p)(4)
• Setting the utilities equal: 4p - 3 = 4 - 4p, so p = 7/8.
• Thus an unfair division can be made fair.
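The slide's p = 7/8 can be verified directly (a sketch using exact rationals, not part of the original deck):

```python
from fractions import Fraction

# The slide's two expected utilities as linear functions of p:
# U1(p) = p*1 + (1-p)*(-3) = 4p - 3;  U2(p) = p*0 + (1-p)*4 = 4 - 4p.
# The fair p sets them equal: 4p - 3 = 4 - 4p  =>  8p = 7.
p = Fraction(7, 8)
U1 = p * 1 + (1 - p) * -3
U2 = p * 0 + (1 - p) * 4
assert U1 == U2  # both agents expect the same utility
print(p, U1)  # 7/8 1/2
```

Note both agents end up with an expected utility of 1/2 under this randomization.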
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to let the agent with the highest worth get its way, as the utility gained is greater (it would accomplish its goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of two slots swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• Accomplishing one agent's goal alone costs 26: 8 for each swap and 10 for the rest (numbers pulled out of the air).
• A cooperative swap costs 4 (also pulled out of the air).
• Idea: work together on the swap, then flip a coin to see who gets its way on the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12.
• Losing agent utility: -4 (as it helped with the swap).
• So with probability 1/2 each: (1/2)(12) + (1/2)(-4) = 4.
• If both could have been satisfied, assume the cost for each is 24; then the utility is 2 each.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (removing the need to satisfy the other).
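The arithmetic above can be checked in a few lines (the costs are the slide's assumed numbers; this sketch is not from the original deck):

```python
# Semi-cooperative gamble: cooperate on the swap, coin-flip the rest.
solo_cost, coop_swap, rest_cost = 26, 4, 10
winner = solo_cost - coop_swap - rest_cost   # 12: goal met at reduced cost
loser = -coop_swap                           # -4: paid for the swap, goal unmet
expected = 0.5 * winner + 0.5 * loser        # 4.0
both_satisfied = solo_cost - 24              # 2: utility if both goals are met
print(expected, both_satisfied)  # 4.0 2
```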
84
Negotiation Domains Worth-oriented
bull rdquoDomains where agents assign a worth to each
potential state (of the environment) which captures
its desirability for the agentrdquo (Rosenschein amp Zlotkin 1994)
bull agentrsquos goal is to bring about the state of the environment with
highest value
bull we assume that the collection of agents have available a set of
joint plans ndash a joint plan is executed by several different agents
bull Note ndash not rdquoall or nothingrdquo ndash but how close you got to goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle) – worth isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a multiple set of attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality. Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility
• Weighting each attribute:
  – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
  – price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  – price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  – Try to find the Pareto optimum.
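The weighted-sum approach can be sketched as follows (the 60/15/25 weights are from the slide; the attribute scores below are illustrative, on a 0–10 scale):

```python
# Weighted-attribute utility: each attribute contributes its score times
# its weight; the weights sum to 1.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

offer = {"price": 7, "quality": 9, "support": 5}
print(utility(offer))  # 0.6*7 + 0.15*9 + 0.25*5, i.e. about 6.8
```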
89
Incomplete Information
• Agents don't know the tasks of the others in a TOD.
• Solution:
  – Exchange the missing information
  – Penalty for lying
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task-Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – the union adds up to a sub-cost:
  – for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of strictly subadditive:
  – delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD with = rather than <:
  – deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where going above a threshold forces me to buy new equipment.
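The subadditivity condition is easy to test mechanically. A minimal sketch (not from the deck), using a toy cost function where each distinct delivery costs 1, which is subadditive because shared tasks are only paid for once:

```python
from itertools import combinations

def cost(tasks):
    # Illustrative cost: one unit per distinct task in the set.
    return len(tasks)

def is_subadditive(task_sets):
    # Check c(X U Y) <= c(X) + c(Y) for every pair of task sets.
    return all(cost(x | y) <= cost(x) + cost(y)
               for x, y in combinations(task_sets, 2))

sets = [frozenset("ab"), frozenset("bc"), frozenset("c")]
print(is_subadditive(sets))  # True
```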
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
  • Need to pick something up at the store (you can invent something for them to pick up, but if you are the one assigned, you won't bother to make the trip).
  • Need to deliver an empty letter (no good to anyone, but the deliverer won't discover the lie).
92
Incentive compatible Mechanism
bull L there exists a beneficial lie in some encounterbull T There exists no beneficial liebull TP Truth is dominant if the penalty for lying is stiff
enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task-Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
  – c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• Arrows show the reasons we know each fact (diagonal arrows go between domains). A rule's starting point is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modularity encourages truth-telling more than the other attributes.
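The modular identity holds whenever per-task costs are independent, as in the Fax Domain mentioned later; a sketch with made-up per-task costs (not from the deck):

```python
# Modularity check: c(X U Y) == c(X) + c(Y) - c(X n Y) for an additive
# cost function, where each task has an independent cost.
task_cost = {"a": 3, "b": 1, "c": 2}

def c(tasks):
    return sum(task_cost[t] for t in tasks)

X, Y = {"a", "b"}, {"b", "c"}
assert c(X | Y) == c(X) + c(Y) - c(X & Y)  # 6 == 4 + 3 - 1
print("modular identity holds")
```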
97
For subadditive domain
98
Attributes of task systems – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• You might expect it to add more to the subset (as it is smaller).
• At your seats: is the postmen domain concave? (No, unless restricted to trees.)
• Example: Y is all the shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (the agent was going that way anyway), but adds 2 to its superset Y (the agent was going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
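The marginal-cost inequality above can be illustrated with a toy concave cost function (my choice, not the slides': cost saturates at 2, so marginal costs shrink as the set grows):

```python
# Concavity test: for X a subset of Y, the marginal cost of Z on Y must
# not exceed the marginal cost of Z on X.
def c(tasks):
    return min(len(tasks), 2)  # illustrative: cost capped at 2

X, Y, Z = {"a"}, {"a", "b"}, {"c"}
assert X <= Y
marginal_on_Y = c(Y | Z) - c(Y)   # 0: Y already pays the cap
marginal_on_X = c(X | Z) - c(X)   # 1
print(marginal_on_Y <= marginal_on_X)  # True
```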
99
Examples of task systems
Database Queries
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive-Compatible Fixed Points (FP) (return home)
FP1: in a subadditive TOD, in any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial.
• Example: if A1 hides its letter to c, its utility doesn't increase.
• If it tells the truth, p = 1/2.
• Expected utility: [(abc, ∅); 1/2] = 5.
• Under the lie, p = 1/2 (as the apparent utility is the same).
• Expected utility (for 1): [(abc, ∅); 1/2] = 1/2(0) + 1/2(2) = 1 (as it still has to deliver the hidden letter).
[Graph: delivery network for the FP1 example, with edge costs 1, 4, 4, 1.]
103
• FP2: in a subadditive TOD, in any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out).
• FP3: in a concave TOD, in any ONM over mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so the probabilities would be assigned to reflect the assumed extra work).
• FP4: in a modular TOD, in any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown in the table (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like here. The lie makes no difference.
(Assuming we have some way of deciding who gets the better deal that is fair over time.)

Agent 1's tasks | U(1) | Agent 2's tasks | U(2) apparent | U(2) actual
a               | 2    | bc              | 4             | 4
b               | 4    | ac              | 2             | 2
bc              | 2    | a               | 4             | 2
ab              | 0    | c               | 6             | 6
105
Non-incentive compatible fixed points
• FP5: in a concave TOD, in any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from the next slide): A1 creates a phantom letter at node c; its utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 of [(a, b); 1/2] is 1/2(4) + 1/2(2) = 3.
• Lie: (bc, a) is the logical division, with no probability split needed. Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a subadditive TOD, in any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h too").
• Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); its utility rises from 1.5 to about 1.72. (If A1 ends up delivering, it simply doesn't deliver to h.)
• If it tells the truth, p (of agent 1 delivering all) = 9/14, since
  p(-1) + (1-p)(6) = p(4) + (1-p)(-3), i.e. 14p = 9.
• If it invents task h, p = 11/18, since
  p(-3) + (1-p)(6) = p(4) + (1-p)(-5).
• Utility(p = 9/14) = p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) = p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
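The FP6 numbers can be re-derived mechanically; p is chosen so that both agents' expected utilities under the all-or-nothing deal are equal (a sketch with exact rationals, not part of the deck; the per-outcome utilities are the slide's values):

```python
from fractions import Fraction as F

def equal_util_p(u1_all, u1_none, u2_all, u2_none):
    # Solve p*u1_all + (1-p)*u1_none == p*u2_all + (1-p)*u2_none for p.
    return F(u2_none - u1_none, (u1_all - u1_none) - (u2_all - u2_none))

p_truth = equal_util_p(-1, 6, 4, -3)   # truthful declaration
p_lie = equal_util_p(-3, 6, 4, -5)     # with the decoy letter to h
u = lambda p: p * F(-1) + (1 - p) * 6  # agent 1's real utility either way
print(p_truth, p_lie, u(p_truth), u(p_lie))  # 9/14 11/18 3/2 31/18
```

The lie shifts p from 9/14 to 11/18, raising agent 1's real expected utility from 3/2 to 31/18.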
107
Postmen – return to the post office
[Figure: example graphs labeled "Concave", "Subadditive (h is the decoy)", and "Phantom".]
108
Non incentive compatible fixed points
• FP7: in a modular TOD, in any ONM over pure deals, "hide" lies can be beneficial (you think I have fewer tasks, so an increased load appears to cost me more than it really does).
• Ex3 (from the next slide): A1 hides its letter to node b.
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. Unfair (under the lie).
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a modular TOD, in any ONM over mixed deals, "hide" lies can be beneficial.
• Ex4: A1 hides its letter to node a. A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util([(fae, bcd); 1/2]) = 4 (each agent saves going to two nodes).
• Under the lie, dividing as [(efd, cab); p]: one side always wins and the other always loses, and since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with [(ab, cdef); p]:
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6, so p = 3/4.
• The utility is actually 3/4(6) + 1/4(0) = 4.5.
• Note: when I am assigned cdef (1/4 of the time), I still have to deliver to node a after completing my agreed deliveries, so I end up going to 5 places – which is what I was assigned originally. Zero utility in that case.
110
[Figure: postmen example graph for the modular case, labeled "Modular".]
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or create cooperative situations.
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor-relations domain.
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of a proposal
  – Generation of a counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – citing previous decisions).
• Process:
  – Retrieve appropriate precedent cases from memory.
  – Select the most appropriate case.
  – Construct an appropriate solution.
  – Evaluate the solution for applicability to the current case.
  – Modify the solution appropriately.
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  – Minimizes the need for information exchange.
  – Avoids problems by reasoning from past failures: intentional reminding.
  – Repairs for past failures are reused, reducing computation.
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Derives an overall utility curve from the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory – 30% more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory behavior.
• No feedback – communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to meetings at 15:00 hrs, 14:00 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Graph: worth function for the first agent – worth rises from 0 to 100 as the meeting time moves from 9:00 through 12:00 to 16:00.]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's utility curves converge over time to a point of acceptance.]
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.
[Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's utility curves never meet.]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified. Why did the agent pay so much for the car?
  – Positions cannot be changed. Initially I wanted a car with a sunroof, but I changed my preference during the buying process.
125
• Four modes of argument (Gilbert, 1994):
  1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
  2. Emotional – "How would you feel if it happened to you?"
  3. Visceral – a participant stamps their feet to show the strength of their feelings.
  4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic-Based Argumentation
• Basic form of an argument:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae;
  – Sentence is a logical formula, known as the conclusion;
  – Grounds is a set of logical formulae such that Grounds ⊆ Database and Sentence can be proved from Grounds.
  (We give reasons for our conclusions.)
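A toy sketch of the (Sentence, Grounds) form (my own illustration: the "database" is a set of strings, and "proof" is limited to a single modus-ponens step, which is enough for the milk/cheese example on the next slide):

```python
# An argument is acceptable if its grounds come from the database and the
# sentence follows from the grounds (here: directly, or by one
# modus-ponens step: p and "p -> q" yield q).
database = {"milk_good", "milk_good -> cheese_good", "sky_blue"}

def argument(sentence, grounds):
    assert grounds <= database, "grounds must be drawn from the database"
    derived = set(grounds)
    for g in grounds:
        if " -> " in g:
            p, q = g.split(" -> ")
            if p in grounds:
                derived.add(q)
    return sentence in derived

print(argument("cheese_good", {"milk_good", "milk_good -> cheese_good"}))  # True
```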
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  – A attacks B ≡ A →u B or A →r B
  – A defeats B ≡ A →u B, or (A →r B and not B →u A)
  – A strongly attacks B ≡ A attacks B and not B →u A
  – A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition: Hierarchy of attacks
• Undercuts = u
• Strongly undercuts = su = u - u⁻¹
• Strongly attacks = sa = (u ∪ r) - u⁻¹
• Defeats = d = u ∪ (r - u⁻¹)
• Attacks = a = u ∪ r
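The hierarchy can be computed directly by treating each attack relation as a set of (attacker, attacked) pairs; the u and r relations below are toy data of my own, not from the slides:

```python
# u = undercuts, r = rebuts, as sets of (attacker, attacked) pairs.
u = {("a", "b")}
r = {("b", "a"), ("c", "d")}
inv = lambda rel: {(y, x) for x, y in rel}  # the inverse relation

attacks = u | r
defeats = u | (r - inv(u))                 # rebuts answered by an undercut drop out
strongly_attacks = (u | r) - inv(u)
strongly_undercuts = u - inv(u)
print(defeats)  # b's rebut of a is removed, since a undercuts b
```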
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments).
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated.
• Assumption: an argument is true unless proven false.
131
Admissible Arguments ndash mutually defensible
1. A set S attacks argument x if some member y of S has y → x.
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if no two of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
[Figure: abstract argument graph over arguments a, b, c, d.]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
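These definitions are mechanical enough to enumerate. The attack graph below is my assumption, not the slide's figure (a and b attack each other, both attack c, and c attacks d); it reproduces the claims that c is always attacked and that d is acceptable when defended:

```python
from itertools import chain, combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def attackers(x):
    return {y for (y, t) in attacks if t == x}

def conflict_free(S):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    # every attacker of x is itself attacked by some member of S
    return all(attackers(z) & S for z in attackers(x))

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

all_subsets = chain.from_iterable(
    combinations(sorted(args), k) for k in range(len(args) + 1))
admissible_sets = [set(s) for s in all_subsets if admissible(set(s))]
print(admissible_sets)  # never contains c; d appears only with a defender
```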
133
An Example Abstract Argument System
76
bull A symmetric mechanism is in equilibrium if no one is motivated to change strategies We choose to use one which maximizes the product of utilities (as is a fairer division) Try dividing a total utility of 10 (zero sum) various ways to see when product is maximized
bull We may flip between choices even if both are the same just to avoid possible bias ndash like switching goals in soccer
77
Examples CooperativeEach is helped by joint plan
bull Slotted blocks world initially white block is at 1 and black block at 2 Agent 1 wants black in 1 Agent 2 wants white in 2 (Both goals are compatible)
bull Assume pick up is cost 1 and set down is onebull Mutually beneficial ndash each can pick up at the
same time costing each 2 ndash Win ndash as didnrsquot have to move other block out of the way
bull If done by one cost would be four ndash so utility to each is 2
78
Examples CompromiseBoth can succeed but worse for both
than if other agent werenrsquot therebull Slotted blocks world initially white block is at 1 and black block
at 2 two gray blocks at 3 Agent 1 wants black in 1 but not on table Agent 2 wants white in 2 but not directly on table
bull Alone agent 1 could just pick up black and place on white Similarly for agent 2 But would undo others goal
bull But together all blocks must be picked up and put down Best plan one agent picks up black while other agent rearranges (cost 6 for one 2 for other)
bull Can both be happy but unequal roles
79
Choices
bull Maybe each goal doesnrsquot need to be achieved Cost for one is two Cost for both averages four
bull If both value it the same flip a coin to decide who does most of the work p=12
bull What if we donrsquot value the goal the same way Canrsquot really look at utility in same way as the other personrsquos goals changes the original plan
80
Compromise continuedbull Who should get to do the easier role bull If you value it more shouldnrsquot you do more of the work to achieve a
common goal What does this mean if partnerroommate doesnrsquot value a clean house or a good meal
bull Look at worth If A1 assigns worth (utility) of 3 and A2 assigns worth (utility) of 6 to final goal we could use probability to make it ldquofairrdquo
bull Assign (26) p of the timebull Utilty for agent 1= p(1) + (1-p)(-3) loses utilty if takes 6 for benefit 3bull Utility for agent 2 = p(0) + (1-p)4bull Solving for p by setting utitlies equalbull 4p-3 = 4-4pbull p = 78bull Thus I can take an unfair division and make it fair
81
Example conflictbull I want black on white (in slot 1)bull You want white on black (in slot 1)bull Canrsquot both win Could flip a coin to decide who
wins Better than both losing Weightings on coin neednrsquot be 50-50
bull May make sense to have person with highest worth get his way ndash as utility is greater (Would accomplish his goal alone) Efficient but not fair
bull What if we could transfer half of the gained utility to the other agent This is not normally allowed but could work out well
82
Examplesemi-cooperative
bull Both agents want contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
bull Both have (possibly) conflicting goals for other slots
bull To accomplish one Agentrsquos goal by oneself is 26 8 for each swap and 10 for rest (pulling numbers out of the air)
bull Cooperative swap is 4 (pulling numbers out of air)
bull Idea work together to swap and then flip coin to see who gets his way for rest
83
Example semi-cooperative cont
bull Winning agent utility 26-4-10 = 12bull Losing agent utility -4 (as helped with swap)bull So with frac12 probability 1212 -412 = 4bull If they could have both been satisfied assume
cost for each is 24 Then utility is 2bull Note they double their utility if they are willing
to risk not achieving the goalbull Note kept just the joint part of the plan that was
more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
bull rdquoDomains where agents assign a worth to each
potential state (of the environment) which captures
its desirability for the agentrdquo (Rosenschein amp Zlotkin 1994)
bull agentrsquos goal is to bring about the state of the environment with
highest value
bull we assume that the collection of agents have available a set of
joint plans ndash a joint plan is executed by several different agents
bull Note ndash not rdquoall or nothingrdquo ndash but how close you got to goal
85
Worth-oriented Domain Definition
bull Can be defined as a tuple
EAgJc
bull E set of possible envirinment states
bull Ag set of possible agents
bull J set of possible joint plans
bull C cost of executing the plan
86
Worth Oriented Domain
bull Rates the acceptability of final statesbull Allows partially completed goalsbull Negotiation a joint plan schedules and goal relaxation May
reach a state that might be a little worse that the ultimate objective
bull Example ndash Multi-agent Tile world (like airport shuttle) ndash isnrsquot just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
bull If you want to pay for some software then you might consider
several attributes of the software such as the price quality and
support ndash multiple set of attributes
bull You may be willing to pay more if the quality is above a given limit
ie you canrsquot get it cheaper without compromising on quality
Pareto Optimal ndash Need to find the price for acceptable quality and
support (without compromising on some attributes)
88
How can we calculate Utility
bull Weighting each attribute
ndash Utility = Price60 + quality15 + support25
bull Ratingranking each attribute
ndash Price 1 quality 2 support 3
bull Using constraints on an attribute
ndash Price[5100] quality[0-10] support[1-5]
ndash Try to find the pareto optimum
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task Oriented Domainbull the cost of the union of sum of the costs of the separate
sets ndash adds to a sub-costbull for finite XY in T c(X U Y) lt= c(X) + c(Y))bull Example of subadditive
ndash Deliver to one saves distance to other (in a tree arrangement)
bull Example of subadditive TOD (= rather than lt)ndash deliver in opposite directions ndashdoing both saves nothing
bull Not subadditive doing both actually costs more than the sum of the pieces Say electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
bull We call producible phantom tasks decoy tasks (no risk of being discovered) Only unproducible phantom tasks are called phantom tasks
bull Example bull Need to pick something up at store (Can think
of something for them to pick up but if you are the one assigned you wonrsquot bother to make the trip)
bull Need to deliver empty letter (no good but deliverer wonrsquot discover lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task-Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at an arrow's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on whether it concerns truth (never helps) or a lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which is in turn a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice that modular encourages truth telling more than the others.
97
For subadditive domain
98
Attributes of a task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y.
• Expect it to add more to the subset (as it is smaller).
• At your seats – is the postmen domain concave? (No, unless restricted to trees.)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
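The containment claims (modular ⇒ concave ⇒ subadditive) can be spot-checked by brute force on a small task set. The cost function below is a toy stand-in for the fax domain (each task costs one unit, independently), not a proof of the general implications:

```python
from itertools import chain, combinations

def subsets(tasks):
    """All subsets of the task set, as frozensets."""
    return [frozenset(s) for s in
            chain.from_iterable(combinations(tasks, r) for r in range(len(tasks) + 1))]

def is_subadditive(tasks, c):
    ss = subsets(tasks)
    return all(c(x | y) <= c(x) + c(y) for x in ss for y in ss)

def is_concave(tasks, c):
    ss = subsets(tasks)
    # for every X subset of Y and every Z: c(Y∪Z) - c(Y) <= c(X∪Z) - c(X)
    return all(c(y | z) - c(y) <= c(x | z) - c(x)
               for x in ss for y in ss if x <= y for z in ss)

def is_modular(tasks, c):
    ss = subsets(tasks)
    return all(c(x | y) == c(x) + c(y) - c(x & y) for x in ss for y in ss)

fax_cost = len   # toy fax-like cost: each task costs 1, independently
modular = is_modular("abc", fax_cost)
concave = is_concave("abc", fax_cost)
subadd = is_subadditive("abc", fax_cost)
```

For this additive cost all three properties hold, as the hierarchy predicts.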
99
Examples of task systems
Database Queries
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange results of queries and sub-queries.
The Fax Domain
• Agents are sending faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of characterization; relationships implied between cells; implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial.
• Ex.: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2; expected utility ((abc), 1/2) = 5
• Under the lie: p = 1/2 (as the declared utility is the same); expected utility (for 1) ((abc), 1/2) = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
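The bookkeeping in examples like this follows one pattern: a mixed or all-or-nothing deal ((D1, D2), p) is valued as the agent's stand-alone cost minus the expected cost of what the deal assigns it. A minimal sketch with hypothetical numbers (not the graph from the slide):

```python
def expected_utility(stand_alone_cost, cost_d1, cost_d2, p):
    """Utility of mixed deal ((D1, D2), p): stand-alone cost minus expected cost."""
    return stand_alone_cost - (p * cost_d1 + (1 - p) * cost_d2)

# Hypothetical numbers: the agent's own plan costs 3; the deal assigns it
# chores costing 3 or 1 with equal probability, so expected cost is 2.
u = expected_utility(3, 3, 1, 0.5)
```

A hidden task shows up as extra cost on top of whatever the deal assigns, which is why hiding fails to pay off here.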
103
• FP2: in a Subadditive TOD, for any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out).
• FP3: in a Concave TOD, for any ONM over mixed deals, no "decoy" lie is beneficial (a smaller increase in cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a Modular TOD, for any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 gets   U(1)   Agent 2 gets   U(2) (seems)   U(2) (actual)
a              2      bc             4              4
b              4      ac             2              2
bc             2      a              4              2
ab             0      c              6              6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = 1/2, so the utility for agent 1 is ((a)(b), 1/2) = 1/2(4) + 1/2(2) = 3.
• Lie: (b, ca) is the logical division, as there is no probability split.
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h").
• Ex. 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 ("if I deliver, I don't deliver h").
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as
• p(−1) + (1−p)(6) = p(4) + (1−p)(−3), i.e. 14p = 9
• If he invents task h, p = 11/18, as
• p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
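The two indifference equations above can be recomputed with exact arithmetic. The solver below just isolates p from "p·u1_deliver + (1−p)·u1_stay = p·u2_deliver + (1−p)·u2_stay":

```python
from fractions import Fraction as F

def solve_p(u1_deliver, u1_stay, u2_deliver, u2_stay):
    """Solve p*u1_deliver + (1-p)*u1_stay == p*u2_deliver + (1-p)*u2_stay for p."""
    return F(u2_stay - u1_stay, u1_deliver - u1_stay - u2_deliver + u2_stay)

p_truth = solve_p(-1, 6, 4, -3)   # truth: 14p = 9
p_lie = solve_p(-3, 6, 4, -5)     # decoy letter to h: 18p = 11

# Agent 1's REAL payoff always uses the truthful utilities (-1, 6).
real_util = lambda p: p * F(-1) + (1 - p) * 6
```

Evaluating the true payoff at the lie-shifted probability reproduces the slide's 21/14 = 1.5 versus 31/18 ≈ 1.72, confirming that the decoy lie helped.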
107
Postmen – return to post office
Concave
Subadditive (h is the decoy)
Phantom
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over pure deals, a "hide" lie can be beneficial (as you think I have fewer tasks, an increased load appears to cost me more than it really does).
• Ex. 3 (from next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a Modular TOD, for any ONM over mixed deals, "hide" lies can be beneficial.
• Ex. 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((fae), (bcd), 1/2) = 4 (saves going to two nodes).
• Under the lie, dividing as ((efd), (cab), p) fails: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with ((ab), (cdef), p):
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned (cdef), 1/4 of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
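Checking the FP8 arithmetic with exact fractions: solve the indifference equation for p, then compare the lie's declared payoff with agent 1's real payoff (the hidden letter to node a makes the "win" branch really worth 6, not the declared 4):

```python
from fractions import Fraction as F

# p*4 + (1-p)*0 == p*2 + (1-p)*6  =>  8p = 6
p = F(6, 8)

declared = p * 4 + (1 - p) * 0   # what the deal appears to give agent 1
actual = p * 6 + (1 - p) * 0     # real payoff: the win branch saves 6
```

The real expected utility 3/4 · 6 = 4.5 exceeds the truthful deal's 4, so the hide lie paid off.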
110
Modular
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents
– Company
– Union
– Mediator
• Tasks
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs used for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff
– Changing the importance attached to an issue
– Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30 more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory behavior
• No feedback – communication overhead increased by 23
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
[Graph: worth function for the first agent – worth from 0 to 100 against meeting time, with marks at 9:00, 12:00 and 16:00, peaking at 16:00]
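A minimal sketch of such a worth function; the linear shape and the 9:00–16:00 range are assumptions for illustration, not the slide's exact curve:

```python
def worth_first_agent(hour, lo=9, hi=16, max_worth=100):
    """Worth of a meeting at `hour`, rising linearly to a peak at 16:00."""
    if hour <= lo:
        return 0
    if hour >= hi:
        return max_worth
    return max_worth * (hour - lo) / (hi - lo)

# Conceding from 16:00 to 14:00 gives up some worth, possibly in exchange
# for other payoffs such as reduced travel costs.
concession_cost = worth_first_agent(16) - worth_first_agent(14)
```

The function makes the trade explicit: each hour of flexibility costs the agent a quantifiable amount of worth.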
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility vs. number of negotiation rounds – agent i's curve falls while agent j's rises as each concedes over time, meeting at the point of acceptance]
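The convergence picture can be sketched as a loop: each agent concedes a fixed step per round, and they agree once the offers cross. The step sizes and starting offers below are hypothetical:

```python
def negotiate(offer_i, offer_j, step=1.0, max_rounds=100):
    """offer_i starts low and rises; offer_j starts high and falls."""
    for rounds in range(1, max_rounds + 1):
        offer_i += step            # agent i concedes
        offer_j -= step            # agent j concedes
        if offer_i >= offer_j:     # offers cross: point of acceptance
            return rounds, (offer_i + offer_j) / 2
    return None, None              # no agreement within max_rounds

rounds, agreement = negotiate(0.0, 10.0)
```

The no-agreement graph on the next slide corresponds to the case where the loop exhausts max_rounds (or an agent stops conceding) before the curves cross.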
123
Utility Graphs - no agreement
• No agreement: agent j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds – the agents' curves stop short of crossing, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – a participant stamps their feet and shows the strength of their feelings.
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
• Grounds ⊆ Database, and
• Sentence can be proved from Grounds
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B ≡ A u B or A r B
– A defeats B ≡ A u B, or (A r B and not B u A)
– A strongly attacks B ≡ A a B and not B u A
– A strongly undercuts B ≡ A u B and not B u A
129
Proposition Hierarchy of attacks
• Undercuts = u
• Strongly undercuts = su = u − u⁻¹
• Strongly attacks = sa = (u ∪ r) − u⁻¹
• Defeats = d = u ∪ (r − u⁻¹)
• Attacks = a = u ∪ r
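Treating each relation as a set of ordered (attacker, attacked) pairs makes these definitions directly executable. The undercut and rebut relations below are hypothetical, chosen only to exercise the set operations:

```python
def inverse(rel):
    return {(y, x) for (x, y) in rel}

def derived(u, r):
    """Derived attack relations from undercuts u and rebuts r."""
    return {
        "attacks": u | r,                           # a = u ∪ r
        "defeats": u | (r - inverse(u)),            # d = u ∪ (r − u⁻¹)
        "strongly_attacks": (u | r) - inverse(u),   # sa = (u ∪ r) − u⁻¹
        "strongly_undercuts": u - inverse(u),       # su = u − u⁻¹
    }

# Hypothetical relations over arguments A, B, C
u = {("A", "B"), ("B", "A"), ("A", "C")}   # undercuts
r = {("C", "A")}                           # rebuts
rels = derived(u, r)
```

On any u and r, the proposition's hierarchy su ⊆ sa ⊆ d ⊆ a holds by construction, which the assertions below spot-check for this instance.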
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all of its attackers are defeated.
• Assumption – an argument is true unless proven false.
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set S if some member y of S attacks x (y → x).
2. Argument x is acceptable with respect to S if every attacker of x is attacked by S.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
[Argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
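These definitions can be checked by brute force on a small system. The attack relation below is an assumption for illustration (not the figure's exact edges); under it, c is attacked by b, and d, being unattacked, is acceptable with respect to every set:

```python
from itertools import chain, combinations

ATTACKS = {("a", "b"), ("b", "a"), ("b", "c")}   # assumed edges
ARGS = {"a", "b", "c", "d"}

def set_attacks(s, x):
    return any((y, x) in ATTACKS for y in s)

def conflict_free(s):
    return not any((x, y) in ATTACKS for x in s for y in s)

def acceptable(x, s):
    attackers = {y for y in ARGS if (y, x) in ATTACKS}
    return all(set_attacks(s, y) for y in attackers)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

all_subsets = chain.from_iterable(
    combinations(sorted(ARGS), r) for r in range(len(ARGS) + 1))
admissible_sets = [set(s) for s in all_subsets if admissible(set(s))]
```

With these edges, {c} alone is not admissible (nothing defends it from b), but {a, c} is: a counter-attacks c's attacker.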
133
An Example Abstract Argument System
77
Examples: Cooperative – each is helped by the joint plan
• Slotted blocks world: initially the white block is at 1 and the black block at 2. Agent 1 wants black in 1; agent 2 wants white in 2. (Both goals are compatible.)
• Assume a pick-up costs 1 and a set-down costs 1.
• Mutually beneficial – each can pick up at the same time, costing each 2. A win, as neither had to move the other block out of the way.
• If done by one agent, the cost would be 4, so the utility to each is 2.
78
Examples: Compromise – both can succeed, but it is worse for both than if the other agent weren't there
• Slotted blocks world: initially the white block is at 1, the black block at 2, and two gray blocks at 3. Agent 1 wants black in 1 but not on the table; agent 2 wants white in 2 but not directly on the table.
• Alone, agent 1 could just pick up black and place it on white (and similarly for agent 2), but that would undo the other's goal.
• Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other).
• Both can be happy, but the roles are unequal.
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is 2; the cost for both averages 4.
• If both value it the same, flip a coin to decide who does most of the work: p = 1/2.
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign A1 the cost-2 role (and A2 the cost-6 role) p of the time.
• Utility for agent 1 = p(1) + (1−p)(−3): it loses utility if it spends 6 to gain a benefit of 3.
• Utility for agent 2 = p(0) + (1−p)(4).
• Solving for p by setting the utilities equal:
• 4p − 3 = 4 − 4p
• p = 7/8
• Thus I can take an unfair division and make it fair.
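The p = 7/8 above can be recomputed exactly by solving the equal-expected-utility condition, with the per-role utilities taken from the slide:

```python
from fractions import Fraction as F

def fair_p(u1_easy, u1_hard, u2_when1easy, u2_when1hard):
    """p with which A1 gets the easy role so both expected utilities match."""
    return F(u2_when1hard - u1_hard,
             u1_easy - u1_hard - u2_when1easy + u2_when1hard)

p = fair_p(1, -3, 0, 4)
expected = p * 1 + (1 - p) * (-3)   # both agents end up with this value
```

At p = 7/8 each agent expects a utility of 1/2, so the unequal division has been made fair in expectation.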
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap costs 4 (pulling numbers out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as it helped with the swap)
• So with probability 1/2 each: 1/2(12) + 1/2(−4) = 4
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other).
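Checking that arithmetic (the costs are the slide's own made-up numbers):

```python
from fractions import Fraction as F

goal_worth = 26    # worth = cost of achieving your goal alone
swap_cost = 4      # cooperative swap
rest_cost = 10     # finishing your own remaining goals

win = goal_worth - swap_cost - rest_cost   # 12: you got your way on the rest
lose = -swap_cost                          # -4: helped swap, goal unmet
gamble = F(1, 2) * win + F(1, 2) * lose    # expected utility of the coin flip

both_satisfied = goal_worth - 24           # 2: if satisfying both costs 24 each
```

The coin flip yields an expected utility of 4, double the guaranteed 2 from satisfying both, which is exactly the "double their utility by risking the goal" point.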
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation. May reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle): it isn't just reaching a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
bull If you want to pay for some software then you might consider
several attributes of the software such as the price quality and
support ndash multiple set of attributes
bull You may be willing to pay more if the quality is above a given limit
ie you canrsquot get it cheaper without compromising on quality
Pareto Optimal ndash Need to find the price for acceptable quality and
support (without compromising on some attributes)
88
How can we calculate Utility
bull Weighting each attribute
ndash Utility = Price60 + quality15 + support25
bull Ratingranking each attribute
ndash Price 1 quality 2 support 3
bull Using constraints on an attribute
ndash Price[5100] quality[0-10] support[1-5]
ndash Try to find the pareto optimum
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task Oriented Domainbull the cost of the union of sum of the costs of the separate
sets ndash adds to a sub-costbull for finite XY in T c(X U Y) lt= c(X) + c(Y))bull Example of subadditive
ndash Deliver to one saves distance to other (in a tree arrangement)
bull Example of subadditive TOD (= rather than lt)ndash deliver in opposite directions ndashdoing both saves nothing
bull Not subadditive doing both actually costs more than the sum of the pieces Say electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
bull We call producible phantom tasks decoy tasks (no risk of being discovered) Only unproducible phantom tasks are called phantom tasks
bull Example bull Need to pick something up at store (Can think
of something for them to pick up but if you are the one assigned you wonrsquot bother to make the trip)
bull Need to deliver empty letter (no good but deliverer wonrsquot discover lie)
92
Incentive compatible Mechanism
bull L there exists a beneficial lie in some encounterbull T There exists no beneficial liebull TP Truth is dominant if the penalty for lying is stiff
enough
93
Explanation of arrow
bull If it is never beneficial in a mixed deal encounter to use a phntom lie (with penalties) then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domainbull We have 2 tasks X and Y where X is a subset of Ybull Another set of task Z is introduced
ndash c(X U Z) - c(X) gt= c(Y U Z) - c(Y)
95
Tentative Explanation of Previous Chart
bull I think Arrows show reasons we know this fact (diagonal arrows are between domains) Rule beginning is a fixed point
bull For example What is true of a phantom task may be true for a decoy task in same domain as a phantom is just a decoy task we donrsquot have to create
bull Similarly what is true for a mixed deal may be true for an all or nothing deal (in the same domain) as a mixed deal is an all or nothing deal where one choice is empty The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
- Cases are organized and retrieved according to conceptual similarities
- Advantages:
  - Minimizes the need for information exchange
  - Avoids problems by reasoning from past failures (intentional reminding)
  - Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
- A from-scratch planning method
- Based on multi-attribute utility theory
- Gets an overall utility curve out of the individual ones
- Expresses the tradeoffs an agent is willing to make
- Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
118
Persuasive argumentation
- Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument
- Increasing payoff:
  - Changing the importance attached to an issue
  - Changing the utility value of an issue
119
Narrowing differences
- Gets feedback from the rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
- Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
- Without memory: 30% more proposals
- Without argumentation: fewer proposals and better solutions
- No failure avoidance: more proposals with objections
- No preference analysis: oscillatory condition
- No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to
meet later in the day, while the second wishes to meet earlier in the
day. Both prefer today to tomorrow. While the first agent assigns the
highest worth to a meeting at 1600 hrs, she also assigns
progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ...
By showing flexibility and accepting a sub-optimal time, an agent
can accept a lower worth, which may have other payoffs (e.g.,
reduced travel costs).
Worth function for the first agent: [figure: worth ranges from 0 to 100 over meeting times 9:00, 12:00, 16:00, peaking at 16:00]
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
- Each agent concedes in every round of negotiation
- Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's offer curves converge over time to a point of acceptance]
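The convergence pictured above can be sketched as a toy loop. Everything here (the shared utility scale of 10, the starting demands, the step size, the acceptance rule) is invented for illustration, not taken from the slides:

```python
# Minimal sketch of the concession dynamic: each agent starts at its ideal
# utility demand and concedes a fixed step per round, until one agent finds
# the other's standing offer acceptable.

def negotiate(u_i, u_j, step, max_rounds=100):
    """Demands are utilities on a shared scale of 10; an agent accepts when
    what the other leaves on the table covers its current demand."""
    for round_no in range(1, max_rounds + 1):
        if 10 - u_i >= u_j or 10 - u_j >= u_i:
            return round_no, (u_i, u_j)   # point of acceptance
        u_i -= step   # both concede every round
        u_j -= step
    return None   # no agreement within max_rounds

result = negotiate(u_i=9.0, u_j=8.0, step=0.5)
```

With these numbers the demands cross after 8 rounds; if neither agent conceded (step 0), the loop would exhaust `max_rounds` and return `None`, which is the "no agreement" picture on the next slide.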
123
Utility Graphs - no agreement
- No agreement: Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds; the two agents' offer curves never meet]
124
Argumentation
- The process of attempting to convince others of something
- Why argument-based negotiation? Game-theoretic approaches have limitations:
  - Positions cannot be justified. Why did the agent pay so much for the car?
  - Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
- 4 modes of argument (Gilbert 1994):
  1. Logical: "If you accept A, and accept that A implies B, then you must accept that B"
  2. Emotional: "How would you feel if it happened to you?"
  3. Visceral: the participant stamps their feet and shows the strength of their feelings
  4. Kisceral: appeals to the intuitive ("doesn't this seem reasonable?")
126
Logic Based Argumentation
- Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  - Database is a (possibly inconsistent) set of logical formulae
  - Sentence is a logical formula known as the conclusion
  - Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
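A tiny sketch of the (Sentence, Grounds) idea: grounds are a subset of a possibly inconsistent database, from which the conclusion follows by repeated modus ponens. The fact names and rule encoding are invented for illustration:

```python
# Facts are strings; rules are (premises, conclusion) tuples.

def provable(sentence, grounds):
    """Forward-chain over the grounds and test whether sentence is derived."""
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if isinstance(g, tuple)]
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return sentence in facts

# The database is inconsistent (milk both good and, when fatty, not good),
# but the chosen grounds still support the conclusion.
database = {"milk_is_good", (("milk_is_good",), "cheese_is_good"),
            "milk_is_fatty", (("milk_is_fatty",), "not_milk_is_good")}
grounds = {"milk_is_good", (("milk_is_good",), "cheese_is_good")}

assert grounds <= database                   # condition 1: Grounds ⊆ Database
assert provable("cheese_is_good", grounds)   # condition 2: Sentence provable
```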
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
- Undercut (invalidate a premise): milk isn't good for you if it is fatty
- Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
- Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  - A attacks B ≡ A u B or A r B
  - A defeats B ≡ A u B, or (A r B and not B u A)
  - A strongly attacks B ≡ A a B and not B u A
  - A strongly undercuts B ≡ A u B and not B u A
129
Proposition: Hierarchy of attacks
- Undercuts = u
- Strongly undercuts = su = u - u⁻¹
- Strongly attacks = sa = (u ∪ r) - u⁻¹
- Defeats = d = u ∪ (r - u⁻¹)
- Attacks = a = u ∪ r
130
Abstract Argumentation
- Concerned with the overall structure of the argument (rather than the internals of individual arguments)
- Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are
- An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
- An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
- Assumption: true unless proven false
131
Admissible Arguments ndash mutually defensible
Admissible Arguments: mutually defensible
1. An argument x is attacked (with respect to a set) if some y with y → x is attacked by no member of the set
2. An argument x is acceptable if every attacker of x is attacked
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked;
d is always acceptable
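The admissibility test can be run mechanically over an attack graph. The attack relation below is invented (the slide's figure is not reproduced here): a and b attack each other, both attack c, and c attacks d.

```python
# Enumerate the admissible sets of a small abstract argument system.
from itertools import combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is itself attacked by some member of s."""
    return all(any((z, y) in attacks for z in s)
               for (y, x2) in attacks if x2 == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [set(c) for r in range(len(args) + 1)
           for c in combinations(sorted(args), r)]
print([sorted(s) for s in subsets if admissible(s)])
# → [[], ['a'], ['b'], ['a', 'd'], ['b', 'd']]
```

Note that c never appears in an admissible set (it is attacked by both a and b, which no set containing c can counter-attack), while d is defended by whichever of a or b is present.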
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox: remove the loser and the winner changes (notice c is always ahead of the removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming we do not have to return home, like U-Haul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol: one direction, move towards the middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy: a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to dist. point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Example: semi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen: return to the post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER: case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments: mutually defensible
- Slide 132
- An Example Abstract Argument System
78
Examples: Compromise. Both can succeed, but worse for both than if the other agent weren't there
- Slotted blocks world: initially the white block is at 1 and the black block at 2, with two gray blocks at 3. Agent 1 wants black in 1, but not on the table. Agent 2 wants white in 2, but not directly on the table
- Alone, agent 1 could just pick up black and place it on white; similarly for agent 2. But each would undo the other's goal
- Together, all blocks must be picked up and put down. Best plan: one agent picks up black while the other agent rearranges (cost 6 for one, 2 for the other)
- Both can be happy, but with unequal roles
79
Choices
- Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four
- If both value it the same, flip a coin to decide who does most of the work: p = 1/2
- What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan
80
Compromise, continued
- Who should get to do the easier role?
- If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
- Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair"
- Assign (2,6) p of the time
- Utility for agent 1 = p(1) + (1-p)(-3): it loses utility if it takes cost 6 for benefit 3
- Utility for agent 2 = p(0) + (1-p)(4)
- Solving for p by setting the utilities equal: 4p - 3 = 4 - 4p, so p = 7/8
- Thus I can take an unfair division and make it fair
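The p = 7/8 step above can be checked by solving the same equal-utility equation exactly (utility numbers from the slide; the helper name is ours):

```python
# Choose the coin-flip probability p so both agents' expected utilities match.
from fractions import Fraction

def fair_p(u1_win, u1_lose, u2_win, u2_lose):
    """p that equalizes p*u_win + (1-p)*u_lose for the two agents."""
    return Fraction(u2_lose - u1_lose,
                    (u1_win - u1_lose) - (u2_win - u2_lose))

# Agent 1's utilities are 1 / -3, agent 2's are 0 / 4, per the slide.
p = fair_p(1, -3, 0, 4)
assert p == Fraction(7, 8)

# With p = 7/8, both agents expect the same utility (1/2 each).
assert p * 1 + (1 - p) * (-3) == p * 0 + (1 - p) * 4 == Fraction(1, 2)
```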
81
Example: conflict
- I want black on white (in slot 1)
- You want white on black (in slot 1)
- We can't both win. We could flip a coin to decide who wins: better than both losing. The weightings on the coin needn't be 50-50
- It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair
- What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
- Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
- Both have (possibly) conflicting goals for the other slots
- To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
- A cooperative swap costs 4 (pulling numbers out of the air)
- Idea: work together to swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
- Winning agent utility: 26 - 4 - 10 = 12
- Losing agent utility: -4 (as it helped with the swap)
- So with 1/2 probability each way: 1/2(12) + 1/2(-4) = 4
- If they could both have been satisfied, assume the cost for each is 24; then the utility is 2
- Note: they double their utility if they are willing to risk not achieving the goal
- Note: they kept just the joint part of the plan that was more efficient, and gambled on the rest (to remove the need to satisfy the other)
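The "gamble doubles the utility" claim above is a two-line expected-value check (all costs are the slide's made-up numbers):

```python
# Compare gambling on the coin flip against the fully cooperative plan.
solo_cost, coop_swap_cost, rest_cost = 26, 4, 10

win_utility = solo_cost - coop_swap_cost - rest_cost   # 12: goal met cheaply
lose_utility = -coop_swap_cost                         # -4: helped swap, goal unmet
gamble = 0.5 * win_utility + 0.5 * lose_utility

both_satisfied = solo_cost - 24   # 2: each pays 24 so both goals are met

assert gamble == 4.0 and both_satisfied == 2
# Expected utility 4 vs. 2: risking the goal doubles the expected payoff.
```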
84
Negotiation Domains: Worth-oriented
- "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
- An agent's goal is to bring about the state of the environment with the highest value
- We assume that the collection of agents has available a set of joint plans; a joint plan is executed by several different agents
- Note: not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
- Can be defined as a tuple ⟨E, Ag, J, c⟩
- E: set of possible environment states
- Ag: set of possible agents
- J: set of possible joint plans
- c: cost of executing a plan
86
Worth Oriented Domain
- Rates the acceptability of final states
- Allows partially completed goals
- Negotiation over a joint plan, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
- Example: multi-agent Tileworld (like an airport shuttle); it isn't just a specific state that counts, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
- If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support: a set of multiple attributes
- You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
- Pareto optimal: need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
- Weighting each attribute:
  - Utility = price×60% + quality×15% + support×25%
- Rating/ranking each attribute:
  - Price: 1, quality: 2, support: 3
- Using constraints on an attribute:
  - Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
  - Try to find the Pareto optimum
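The weighted-attribute scheme above is easy to sketch. The 60/15/25 weights are the slide's; the candidate offers and their normalized scores are invented for illustration:

```python
# Weighted-attribute utility: scores are assumed pre-normalized to [0, 1],
# with higher always better (so the "price" score means cheapness).
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

offer_a = {"price": 0.9, "quality": 0.5, "support": 0.4}  # cheap, mediocre
offer_b = {"price": 0.6, "quality": 0.9, "support": 0.8}  # pricier, better

# With a 60% weight on price, the cheap offer wins despite lower quality.
assert utility(offer_a) > utility(offer_b)
```

Changing the weights changes the winner, which is exactly the "tradeoffs an agent is willing to make" from the preference-analysis slide.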
89
Incomplete Information
- Don't know the tasks of others in a TOD
- Solution:
  - Exchange the missing information
  - Penalty for a lie
- Possible lies:
  - False information:
    - Hiding letters
    - Phantom letters
  - Not carrying out a commitment
90
Subadditive Task-Oriented Domain
- The cost of the union is at most the sum of the costs of the separate sets: it adds to a sub-cost
- For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
- Example of subadditive:
  - Delivering to one saves distance to the other (in a tree arrangement)
- Example of subadditive TOD with = rather than <:
  - Deliveries in opposite directions: doing both saves nothing
- Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs where I get above a threshold and have to buy new equipment
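The subadditivity condition can be verified exhaustively on a toy cost function. The delivery distances below are invented tree-style numbers, not from the slides:

```python
# Check c(X ∪ Y) <= c(X) + c(Y) for every pair of task subsets.
from itertools import combinations

# Cost of visiting delivery nodes on a path P - a - b (distances from the
# post office P: a = 1, b = 2); delivering to b covers a along the way.
def cost(tasks):
    return 0 if not tasks else 2 * max({"a": 1, "b": 2}[t] for t in tasks)

tasks = {"a", "b"}
subsets = [set(c) for r in range(len(tasks) + 1)
           for c in combinations(sorted(tasks), r)]
assert all(cost(x | y) <= cost(x) + cost(y) for x in subsets for y in subsets)
# e.g. cost({a, b}) = 4 <= cost({a}) + cost({b}) = 2 + 4
```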
91
Decoy task
- We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
- Examples:
  - Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  - Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
- L: there exists a beneficial lie in some encounter
- T: there exists no beneficial lie
- T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
- If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task-Oriented Domain
- We have 2 task sets X and Y, where X is a subset of Y
- Another set of tasks Z is introduced:
  - c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
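The concavity condition can be checked the same way as subadditivity. The path-graph costs below are invented: the point is that Z adds no more cost on top of the superset Y than on top of the subset X:

```python
# Check c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y) for X ⊆ Y on a toy cost function.

def cost(tasks):
    # Distances from the start on a path graph; farthest node dominates.
    dist = {"a": 1, "b": 2, "c": 3}
    return 0 if not tasks else 2 * max(dist[t] for t in tasks)

X, Y, Z = {"a"}, {"a", "b"}, {"c"}
assert X <= Y

added_to_X = cost(X | Z) - cost(X)   # 6 - 2 = 4
added_to_Y = cost(Y | Z) - cost(Y)   # 6 - 4 = 2
assert added_to_X >= added_to_Y      # concave: Z is cheaper on top of the superset
```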
95
Tentative Explanation of Previous Chart
- I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the beginning is a fixed point
- For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
- Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
- The relationships can also go between domains, as subadditive is a superclass of concave and a superclass of modular
96
Modular TOD
- c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
- Notice: modular encourages truth-telling more than the others
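The modularity identity holds exactly when costs are independent per task, as in the Fax Domain discussed below. The per-destination prices here are invented:

```python
# Check c(X ∪ Y) == c(X) + c(Y) - c(X ∩ Y) for independent per-task costs.
PRICE = {"a": 3, "b": 5, "c": 2}

def cost(tasks):
    # Each destination has its own independent price (fax-style costs).
    return sum(PRICE[t] for t in tasks)

X, Y = {"a", "b"}, {"b", "c"}
assert cost(X | Y) == cost(X) + cost(Y) - cost(X & Y)   # 10 == 8 + 7 - 5
```

With a path-graph cost like the one used for the concavity check, the identity generally fails, which is why the postmen domains are not modular.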
97
For subadditive domain
98
Attributes of a task system: Concavity
- c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
- The cost that a task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
- You would expect it to add more to the subset (as it is smaller)
- At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
- Concavity implies sub-additivity
- Modularity implies concavity
99
Examples of task systems
Database Queries
- Agents have access to a common DB, and each has to carry out a set of queries
- Agents can exchange the results of queries and sub-queries
The Fax Domain
- Agents are sending faxes to locations on a telephone network
- Multiple faxes can be sent once the connection is established with the receiving node
- The agents can exchange messages to be faxed
100
Attributes-Modularity
- c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
- The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
- Only the Fax Domain is modular (as costs are independent)
- Modularity implies concavity
101
3-dimensional table of the characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute
- L means lying may be beneficial
- T means telling the truth is always beneficial
- T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over All-or-Nothing deals, "hiding" lies are not beneficial
- Ex: A1 hides a letter to c; his utility doesn't increase
- If he tells the truth, p = 1/2
- Expected util: (abc):1/2 = 5
- Under the lie, p = 1/2 (as the apparent utility is the same)
- Expected util (for 1): (abc):1/2 = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie)
103
- FP2: in Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out)
- FP3: in Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (as a smaller added cost is assumed, probabilities would be assigned to reflect the assumed extra work)
- FP4: in Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost, so it is hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal), JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks | U(1) | Agent 2's tasks | U(2) seems | U(2) actual
a               | 2    | bc              | 4          | 4
b               | 4    | ac              | 2          | 2
bc              | 2    | a               | 4          | 2
ab              | 0    | c               | 6          | 6
105
Non-incentive compatible fixed points
- FP5: in Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial
- Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
- Truth: p = 1/2, so the utility for agent 1 is (ab):1/2 = 1/2(4) + 1/2(2) = 3
- Lie: (b, ca) is the logical division, as there is no probability involved
- Util for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
- FP6: in Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability (if you deliver, I make you deliver to h)
- Ex2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
- If he tells the truth, p (of agent 1 delivering all) = 9/14, as p(-1) + (1-p)(6) = p(4) + (1-p)(-3), i.e., 14p = 9
- If he invents task h, p = 11/18, as p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
- Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
- Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 = 1.72
- SO: lying helped
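The FP6 arithmetic above can be re-derived exactly with rationals (utility numbers from the slide; the helper name is ours):

```python
# p is chosen so both agents are indifferent between "agent 1 delivers all"
# and "agent 2 delivers all"; the decoy lie only shifts p in agent 1's favor.
from fractions import Fraction

def indifference_p(u1_a, u1_b, u2_a, u2_b):
    """Solve p*u1_a + (1-p)*u1_b == p*u2_a + (1-p)*u2_b for p."""
    return Fraction(u2_b - u1_b, (u1_a - u1_b) - (u2_a - u2_b))

p_truth = indifference_p(-1, 6, 4, -3)   # apparent utilities under truth
p_lie = indifference_p(-3, 6, 4, -5)     # apparent utilities with decoy h
assert p_truth == Fraction(9, 14) and p_lie == Fraction(11, 18)

# Agent 1's real utilities are always (-1, 6); only p changes under the lie.
u_truth = p_truth * -1 + (1 - p_truth) * 6
u_lie = p_lie * -1 + (1 - p_lie) * 6
assert u_truth == Fraction(21, 14) and u_lie == Fraction(31, 18)  # 1.5 < ~1.72
```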
107
Postmen: return to the post office
[Figures: a Concave example; a Subadditive example (h is the decoy); a Phantom example]
108
Non-incentive compatible fixed points
- FP7: in Modular TOD, for any ONM over Pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does)
- Ex3 (from the next slide): A1 hides his letter to node b
- (e,b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie)
- (b,e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
- So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise: Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
79
Choices
• Maybe each goal doesn't need to be achieved. The cost for one is two; the cost for both averages four.
• If both value it the same, flip a coin to decide who does most of the work (p = 1/2).
• What if we don't value the goal the same way? We can't really look at utility in the same way, as the other person's goals change the original plan.
80
Compromise continued
• Who should get to do the easier role?
• If you value it more, shouldn't you do more of the work to achieve a common goal? What does this mean if a partner/roommate doesn't value a clean house or a good meal?
• Look at worth. If A1 assigns a worth (utility) of 3 and A2 assigns a worth (utility) of 6 to the final goal, we could use probability to make it "fair".
• Assign the (2,6) split p of the time.
• Utility for agent 1 = p(1) + (1-p)(-3) – it loses utility if it takes a cost of 6 for a benefit of 3.
• Utility for agent 2 = p(0) + (1-p)(4)
• Solving for p by setting the utilities equal:
• 4p - 3 = 4 - 4p
• p = 7/8
• Thus I can take an unfair division and make it fair.
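The fairness arithmetic above can be checked directly. A minimal sketch (the payoffs 1/-3 and 0/4 are the slide's; the variable names are mine):

```python
from fractions import Fraction

# At p = 7/8 both agents' expected utilities should coincide.
p = Fraction(7, 8)
u1 = p * 1 + (1 - p) * (-3)   # agent 1: 4p - 3
u2 = p * 0 + (1 - p) * 4      # agent 2: 4 - 4p
print(u1, u2)                 # 1/2 1/2 - the unfair division is now fair
```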
81
Example: conflict
• I want black on white (in slot 1).
• You want white on black (in slot 1).
• We can't both win. We could flip a coin to decide who wins – better than both losing. The weightings on the coin needn't be 50-50.
• It may make sense to have the person with the highest worth get his way, as the utility is greater (he would accomplish his goal alone). Efficient, but not fair.
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but it could work out well.
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate).
• Both have (possibly) conflicting goals for the other slots.
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air).
• A cooperative swap is 4 (pulling numbers out of the air).
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest.
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 - 4 - 10 = 12.
• Losing agent utility: -4 (as it helped with the swap).
• So, with probability ½ each: 12(½) + (-4)(½) = 4.
• If they could both have been satisfied, assume the cost for each is 24; then the utility is 2.
• Note: they double their utility if they are willing to risk not achieving the goal.
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other).
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994).
• An agent's goal is to bring about the state of the environment with the highest value.
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents.
• Note – not "all or nothing", but how close you got to the goal.
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states.
• Allows partially completed goals.
• Negotiation covers a joint plan, schedules, and goal relaxation. We may reach a state that is a little worse than the ultimate objective.
• Example – multi-agent Tileworld (like an airport shuttle) – it isn't just a specific state, but the value of the work accomplished.
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as the price, quality and support – a set of multiple attributes.
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality.
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes).
88
How can we calculate Utility?
• Weighting each attribute:
 – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute:
 – Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
 – Price ∈ [5, 100], quality ∈ [0, 10], support ∈ [1, 5]
 – Try to find the Pareto optimum.
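As a sketch, the weighted-sum scheme above might look like this (the 60/15/25 weights are the slide's; the sample offers and the convention that all scores are pre-normalized to [0, 1] are illustrative assumptions):

```python
# Weighted-sum utility over multiple attributes (60% price, 15% quality,
# 25% support). Scores are assumed normalized to [0, 1], with price scored
# so that cheaper offers get higher scores.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    return sum(WEIGHTS[attr] * scores[attr] for attr in WEIGHTS)

offer_a = {"price": 0.8, "quality": 0.5, "support": 0.4}  # cheap, low quality
offer_b = {"price": 0.6, "quality": 0.9, "support": 0.7}  # pricier, better
print(utility(offer_a))  # ~0.655
print(utility(offer_b))  # ~0.67
```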
89
Incomplete Information
• We don't know the tasks of others in a TOD.
• Solution:
 – Exchange missing information
 – Penalty for lying
• Possible lies:
 – False information
  • Hiding letters
  • Phantom letters
 – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – the union adds up to a sub-cost:
 – for finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
 – Delivering to one saves distance to the other (in a tree arrangement).
• Example of subadditive TOD (= rather than <):
 – Deliveries in opposite directions – doing both saves nothing.
• Not subadditive: doing both actually costs more than the sum of the pieces. Say, electrical power costs, where I get above a threshold and have to buy new equipment.
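A brute-force check of the subadditivity condition, using a made-up star-shaped delivery map (every city one unit from the post office, so a trip over a set of cities costs 2 per distinct city – purely an illustrative cost model):

```python
from itertools import combinations

def cost(cities):
    # Star graph: an out-and-back trip to each distinct city costs 2.
    return 2 * len(cities)

task_sets = [frozenset("a"), frozenset("ab"), frozenset("bc"), frozenset("abc")]

def is_subadditive(cost, sets):
    # c(X U Y) <= c(X) + c(Y) for every pair of finite task sets
    return all(cost(x | y) <= cost(x) + cost(y)
               for x, y in combinations(sets, 2))

print(is_subadditive(cost, task_sets))  # True: sharing deliveries never hurts
```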
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks.
• Examples:
 • Need to pick something up at the store. (You can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip.)
 • Need to deliver an empty letter. (No good, but the deliverer won't discover the lie.)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter.
• T: there exists no beneficial lie.
• T/P: truth is dominant if the penalty for lying is stiff enough.
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters).
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y.
• Another set of tasks Z is introduced:
 – c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
 (Z adds at least as much cost to the subset X as it does to the superset Y.)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know a fact (the diagonal arrows are between domains). A rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, and concave is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular domains encourage truth telling more than the others.
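The modularity identity is easy to confirm for an independent-cost domain such as the fax example later in the deck; the per-destination fees here are invented for illustration:

```python
# Fax-style domain: each destination has an independent connection fee,
# so the cost of a task set is the sum over its distinct destinations.
FEES = {"a": 3, "b": 1, "c": 2}

def cost(dests):
    return sum(FEES[d] for d in dests)

X, Y = frozenset("ab"), frozenset("bc")
modular = cost(X | Y) == cost(X) + cost(Y) - cost(X & Y)
print(modular)  # True: c(X U Y) = c(X) + c(Y) - c(X ∩ Y)
```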
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y. Expect it to add more to the subset (as it is smaller).
• At your seats – is the postmen domain concave? (No, unless restricted to trees.)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway), but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity.
• Modularity implies concavity.
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries.
• Agents can exchange the results of queries and sub-queries.
The Fax Domain
• Agents send faxes to locations on a telephone network.
• Multiple faxes can be sent once the connection is established with the receiving node.
• The agents can exchange messages to be faxed.
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection.
• Only the Fax Domain is modular (as costs are independent).
• Modularity implies concavity.
101
3-dimensional table of Characterization of Relationships
(Arrows indicate implied relationships between cells and implied relationships within the same domain attribute.)
• L means lying may be beneficial.
• T means telling the truth is always beneficial.
• T/P refers to lies which are not beneficial because they may always be discovered.
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial.
• Ex: A1 hides his letter to c; his utility doesn't increase.
• If he tells the truth: p = 1/2; expected util for the deal (abc) at 1/2 is 5.
• Lie: p = 1/2 (as the utility is the same); expected util (for 1) for (abc) at 1/2 is ½(0) + ½(2) = 1 (as he has to deliver the lie).
103
• FP2: in a Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (as, if the other person delivers the phantom, you are found out).
• FP3: in a Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work).
• FP4: in a Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win).
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 | U(1) | Agent 2 | U(2) (seems) | U(2) (actual)
a       | 2    | bc      | 4            | 4
b       | 4    | ac      | 2            | 2
bc      | 2    | a       | 4            | 2
ab      | 0    | c       | 6            | 6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial.
• Example from the next slide: A1 creates a phantom letter at node c; his utility has risen from 3 to 4.
• Truth: p = ½, so the utility for agent 1 is (ab) at ½: ½(4) + ½(2) = 3.
• Lie: (b, ca) is the logical division, as there is no percentage. The utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4.
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h).
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72. (If I deliver, I don't deliver h.)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
 p(-1) + (1-p)(6) = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
 p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped.
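The two indifference probabilities above are just solutions of linear equations; a small helper (the function name is mine) reproduces them exactly:

```python
from fractions import Fraction

def indifference_p(a1, b1, a2, b2):
    """p such that p*a1 + (1-p)*b1 == p*a2 + (1-p)*b2."""
    return Fraction(b2 - b1, (a1 - b1) - (a2 - b2))

# Truth: p(-1) + (1-p)(6) = p(4) + (1-p)(-3)
p_truth = indifference_p(-1, 6, 4, -3)   # 9/14
# Decoy letter to h: p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
p_lie = indifference_p(-3, 6, 4, -5)     # 11/18
# Agent 1's true expected utility p(-1) + (1-p)(6) in each case:
print(p_truth * (-1) + (1 - p_truth) * 6)  # 3/2   (= 1.5)
print(p_lie * (-1) + (1 - p_lie) * 6)      # 31/18 (~1.72): lying helped
```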
107
Postmen – return to post office
[Figure: three delivery graphs – concave; subadditive (h is the decoy); phantom]
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over Pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does).
• Ex 3 (from the next slide): A1 hides his letter to node b.
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie).
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial.
• Ex 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((fae, bcd), ½) = 4 (each saves going to two).
• Under the lie, divide as (efd, cab) with probability p? You always win and I always lose; since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with (abc, def) and probability p:
 p(4) + (1-p)(0) = p(2) + (1-p)(6)
 4p = -4p + 6
 p = 3/4
• The utility is actually:
 (3/4)(6) + (1/4)(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that.
110
[Figure: delivery graph for the modular example]
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate.
– TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or to create cooperative situations.
• Techniques required:
 – Representing and maintaining belief models
 – Reasoning about other agents' beliefs
 – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain.
• Agents:
 – Company
 – Union
 – Mediator
• Tasks:
 – Generation of a proposal
 – Generation of a counter-proposal based on feedback from the dissenting party
 – Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – cite previous decisions).
• Process:
 – Retrieve appropriate precedent cases from memory
 – Select the most appropriate case
 – Construct an appropriate solution
 – Evaluate the solution for applicability to the current case
 – Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
 – Minimizes the need for information exchange
 – Avoids problems by reasoning from past failures (intentional reminding)
 – The repair for a past failure is reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method.
• Based on multi-attribute utility theory.
• Gets an overall utility curve out of individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
 – Maximizes joint payoff
 – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
 – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
 – Change the importance attached to an issue
 – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
 – Objectionable issues
 – Reason for rejection
 – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory – 30 more proposals.
• Without argumentation – fewer proposals and better solutions.
• No failure avoidance – more proposals with objections.
• No preference analysis – oscillatory condition.
• No feedback – communication overhead increased by 23.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).

[Graph: worth function for the first agent – worth from 0 to 100 across meeting times 9, 12, 16]

Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
[Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's offers converge to the point of acceptance]
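The convergence picture can be mimicked with a toy concession loop (the step sizes and the acceptance rule are illustrative assumptions, not part of the protocol's definition):

```python
from fractions import Fraction

def negotiate(step_i=Fraction(1, 10), step_j=Fraction(3, 20)):
    # Offers measured as utility to agent i: i starts at its ideal (1),
    # j's opening offer is worthless to i (0). Each round both concede.
    offer_i, offer_j = Fraction(1), Fraction(0)
    rounds = 0
    while offer_j < offer_i:   # j's offer still worse for i than i's own
        offer_i -= step_i
        offer_j += step_j
        rounds += 1
    return rounds

print(negotiate())  # 4 rounds to the point of acceptance
```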
123
Utility Graphs - no agreement
• No agreement.
[Graph: utility vs. number of negotiation rounds – the two curves never meet; Agent j finds the offer unacceptable]
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
 – Positions cannot be justified. Why did the agent pay so much for the car?
 – Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert, 1994):
 1. Logical – "If you accept A, and accept that A implies B, then you must accept B."
 2. Emotional – "How would you feel if it happened to you?"
 3. Visceral – the participant stamps their feet and shows the strength of their feelings.
 4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• The basic form of argumentation:
 Database ⊢ (Sentence, Grounds)
 where:
 – Database is a (possibly inconsistent) set of logical formulae;
 – Sentence is a logical formula known as the conclusion;
 – Grounds is a set of logical formulae such that:
  • Grounds ⊆ Database;
  • Sentence can be proved from Grounds.
(We give reasons for our conclusions.)
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature:
 – A attacks B = A undercuts B or A rebuts B
 – A defeats B = A undercuts B, or (A rebuts B and not B undercuts A)
 – A strongly attacks B = A attacks B and not B undercuts A
 – A strongly undercuts B = A undercuts B and not B undercuts A
129
Proposition: Hierarchy of attacks
 Undercuts = u
 Strongly undercuts = su = u - u⁻¹
 Strongly attacks = sa = (u ∪ r) - u⁻¹
 Defeats = d = u ∪ (r - u⁻¹)
 Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments).
• Write x → y to indicate:
 – "argument x attacks argument y";
 – "x is a counterexample of y";
 – "x is an attacker of y";
 where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption – an argument is true unless proven false.
131
Admissible Arguments ndash mutually defensible
1. A set of arguments S attacks an argument x if some member y of S attacks x (y → x).
2. An argument x is acceptable (with respect to a set) if every attacker of x is attacked.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
132
[Figure: argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
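The definitions above are easy to run on a small graph. The attack arrows live in the slide's figure, which did not survive the export, so the relation below is an assumption chosen to match the notes (a and b attack each other, both attack c, and c attacks d):

```python
from itertools import combinations

ARGS = {"a", "b", "c", "d"}
# Hypothetical attack relation (the slide's figure is not reproduced here):
ATTACKS = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(s):
    return not any((x, y) in ATTACKS for x in s for y in s)

def acceptable(x, s):
    # every attacker of x is attacked by some member of s
    return all(any((z, y) in ATTACKS for z in s)
               for y in ARGS if (y, x) in ATTACKS)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

for r in range(len(ARGS) + 1):
    for combo in combinations(sorted(ARGS), r):
        if admissible(set(combo)):
            print(set(combo) or "{}")
# c appears in no admissible set; d only alongside a defender (a or b)
```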
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
80
Compromise continuedbull Who should get to do the easier role bull If you value it more shouldnrsquot you do more of the work to achieve a
common goal What does this mean if partnerroommate doesnrsquot value a clean house or a good meal
bull Look at worth If A1 assigns worth (utility) of 3 and A2 assigns worth (utility) of 6 to final goal we could use probability to make it ldquofairrdquo
bull Assign (26) p of the timebull Utilty for agent 1= p(1) + (1-p)(-3) loses utilty if takes 6 for benefit 3bull Utility for agent 2 = p(0) + (1-p)4bull Solving for p by setting utitlies equalbull 4p-3 = 4-4pbull p = 78bull Thus I can take an unfair division and make it fair
81
Example conflictbull I want black on white (in slot 1)bull You want white on black (in slot 1)bull Canrsquot both win Could flip a coin to decide who
wins Better than both losing Weightings on coin neednrsquot be 50-50
bull May make sense to have person with highest worth get his way ndash as utility is greater (Would accomplish his goal alone) Efficient but not fair
bull What if we could transfer half of the gained utility to the other agent This is not normally allowed but could work out well
82
Examplesemi-cooperative
bull Both agents want contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
bull Both have (possibly) conflicting goals for other slots
bull To accomplish one Agentrsquos goal by oneself is 26 8 for each swap and 10 for rest (pulling numbers out of the air)
bull Cooperative swap is 4 (pulling numbers out of air)
bull Idea work together to swap and then flip coin to see who gets his way for rest
83
Example semi-cooperative cont
bull Winning agent utility 26-4-10 = 12bull Losing agent utility -4 (as helped with swap)bull So with frac12 probability 1212 -412 = 4bull If they could have both been satisfied assume
cost for each is 24 Then utility is 2bull Note they double their utility if they are willing
to risk not achieving the goalbull Note kept just the joint part of the plan that was
more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
bull rdquoDomains where agents assign a worth to each
potential state (of the environment) which captures
its desirability for the agentrdquo (Rosenschein amp Zlotkin 1994)
bull agentrsquos goal is to bring about the state of the environment with
highest value
bull we assume that the collection of agents have available a set of
joint plans ndash a joint plan is executed by several different agents
bull Note ndash not rdquoall or nothingrdquo ndash but how close you got to goal
85
Worth-oriented Domain Definition
bull Can be defined as a tuple
EAgJc
bull E set of possible envirinment states
bull Ag set of possible agents
bull J set of possible joint plans
bull C cost of executing the plan
86
Worth Oriented Domain
bull Rates the acceptability of final statesbull Allows partially completed goalsbull Negotiation a joint plan schedules and goal relaxation May
reach a state that might be a little worse that the ultimate objective
bull Example ndash Multi-agent Tile world (like airport shuttle) ndash isnrsquot just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
bull If you want to pay for some software then you might consider
several attributes of the software such as the price quality and
support ndash multiple set of attributes
bull You may be willing to pay more if the quality is above a given limit
ie you canrsquot get it cheaper without compromising on quality
Pareto Optimal ndash Need to find the price for acceptable quality and
support (without compromising on some attributes)
88
How can we calculate Utility
bull Weighting each attribute
ndash Utility = Price60 + quality15 + support25
bull Ratingranking each attribute
ndash Price 1 quality 2 support 3
bull Using constraints on an attribute
ndash Price[5100] quality[0-10] support[1-5]
ndash Try to find the pareto optimum
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task Oriented Domainbull the cost of the union of sum of the costs of the separate
sets ndash adds to a sub-costbull for finite XY in T c(X U Y) lt= c(X) + c(Y))bull Example of subadditive
ndash Deliver to one saves distance to other (in a tree arrangement)
bull Example of subadditive TOD (= rather than lt)ndash deliver in opposite directions ndashdoing both saves nothing
bull Not subadditive doing both actually costs more than the sum of the pieces Say electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
bull We call producible phantom tasks decoy tasks (no risk of being discovered) Only unproducible phantom tasks are called phantom tasks
bull Example bull Need to pick something up at store (Can think
of something for them to pick up but if you are the one assigned you wonrsquot bother to make the trip)
bull Need to deliver empty letter (no good but deliverer wonrsquot discover lie)
92
Incentive compatible Mechanism
bull L there exists a beneficial lie in some encounterbull T There exists no beneficial liebull TP Truth is dominant if the penalty for lying is stiff
enough
93
Explanation of arrow
bull If it is never beneficial in a mixed deal encounter to use a phntom lie (with penalties) then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domainbull We have 2 tasks X and Y where X is a subset of Ybull Another set of task Z is introduced
ndash c(X U Z) - c(X) gt= c(Y U Z) - c(Y)
95
Tentative Explanation of Previous Chart
bull I think Arrows show reasons we know this fact (diagonal arrows are between domains) Rule beginning is a fixed point
bull For example What is true of a phantom task may be true for a decoy task in same domain as a phantom is just a decoy task we donrsquot have to create
bull Similarly what is true for a mixed deal may be true for an all or nothing deal (in the same domain) as a mixed deal is an all or nothing deal where one choice is empty The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over A-or-N deals – "hiding" lies are not beneficial
• Ex: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth: p = ½. Expected util: ⟨(abc)⟩:½ = 5
• Lie: p = ½ (as the utility is the same). Expected util (for 1): ⟨(abc)⟩:½ = ½(0) + ½(2) = 1 (as he has to deliver the lie)
103
• FP2: in Subadditive TOD, any ONM over Mixed deals – every "phantom" lie has a positive probability of being discovered (as, if the other person delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals – no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals – no "decoy" lie is beneficial (modular tends to add exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie – benefits are shown (the apparent benefit is no different than the real benefit).
Under truth: the utilities are 4,2, and someone has to get the better deal (under a pure deal), JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 gets   U(1)   Agent 2 gets   U(2) (seems)   U(2) (actual)
a              2      bc             4              4
b              4      ac             2              2
bc             2      a              4              2
ab             0      c              6              6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals – "phantom" lies can be beneficial
• Example from next slide: A1 creates a phantom letter at node c; his utility has risen from 3 to 4
• Truth: p = ½, so utility for agent 1 is ⟨(ab)⟩:½ = ½(4) + ½(2) = 3
• Lie: (bca) is the logical division, as there is no probability split. Util for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals – "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
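The two probability equations above can be verified with exact fractions; `equalizing_p` is a hypothetical helper that solves the slide's p equations:

```python
from fractions import Fraction

def equalizing_p(a, b, c, d):
    """Solve p*a + (1-p)*b == p*c + (1-p)*d for p
    (the mixed-deal probability that equalizes the two sides)."""
    a, b, c, d = map(Fraction, (a, b, c, d))
    return (d - b) / ((a - b) - (c - d))

def util_agent1(p):
    # Agent 1's real expected utility: delivers everything with probability p
    # (utility -1), otherwise gets utility 6.
    return p * Fraction(-1) + (1 - p) * Fraction(6)

p_truth = equalizing_p(-1, 6, 4, -3)  # 9/14 under truth
p_decoy = equalizing_p(-3, 6, 4, -5)  # 11/18 after inventing task h
print(p_truth, util_agent1(p_truth))  # 9/14 3/2
print(p_decoy, util_agent1(p_decoy))  # 11/18 31/18 -> lying helped
```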
107
Postmen – return to post office
Concave
Subadditive (h is decoy)
Phantom
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals – "hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex3 (from next slide): A1 hides his letter to node b
• ⟨e, b⟩: utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• ⟨b, e⟩: utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals – "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util ⟨(fae),(bcd)⟩:½ = 4 (saves going to two)
• Under the lie, divide as ⟨(efd),(cab)⟩:p? You always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie: ⟨(abcdef)⟩:p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6
• p = 3/4
• Utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef 1/4 of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going 5 places (which is what I was assigned originally). Zero utility for that.
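The p = 3/4 calculation above, checked with exact fractions:

```python
from fractions import Fraction

# Mixed deal <(abcdef)>:p from the slide, after A1 hides the letter to a.
# Equalize the two sides: p*4 + (1-p)*0 == p*2 + (1-p)*6, i.e. 8p = 6.
p = Fraction(6 - 0, (4 - 0) + (6 - 2))  # 3/4
real_utility = p * 6 + (1 - p) * 0       # A1's real expected utility
print(p, real_utility)  # 3/4 9/2, i.e. 4.5 > 4, so hiding helped
```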
110
Modular
111
Conclusion
– In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of a proposal
– Generation of a counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repairs for past failures are reused: reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, etc. By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
Worth function for first agent (figure: worth rises from 0 to 100 over meeting times 9, 12, 16).
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Figure: utility vs. number of negotiations; Agent i's and Agent j's offers converge at the point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
(Figure: utility vs. number of negotiations for Agent i and Agent j.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical – "If you accept A and accept A implies B, then you must accept that B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database, and
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for bones
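The two attack types can be sketched on the cheese example; the argument representation and the "not ..." string convention here are my own illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Argument:
    premises: frozenset  # the grounds the argument rests on
    conclusion: str

def undercuts(a, b):
    """a undercuts b: a's conclusion negates one of b's premises."""
    return any(a.conclusion == "not " + p for p in b.premises)

def rebuts(a, b):
    """a rebuts b: a's conclusion negates b's conclusion."""
    return a.conclusion == "not " + b.conclusion

cheese = Argument(frozenset({"milk is good for you", "cheese is made from milk"}),
                  "cheese is good for you")
fatty = Argument(frozenset({"milk is fatty"}), "not milk is good for you")
bones = Argument(frozenset({"cheese is bad for bones"}), "not cheese is good for you")

print(undercuts(fatty, cheese), rebuts(bones, cheese))  # True True
```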
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set S if some member y of S attacks x (y → x)
2. argument x is acceptable (w.r.t. S) if every attacker of x is attacked by S
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
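The four definitions above can be sketched as a brute-force admissibility check. The attack graph below is hypothetical (the slide's figure is not reproduced here), chosen so that d is unattacked and c is attacked by that unattacked argument:

```python
from itertools import combinations

def conflict_free(S, attacks):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S, attacks):
    """Every attacker of x is itself attacked by some member of S."""
    attackers = {y for (y, z) in attacks if z == x}
    return all(any((s, y) in attacks for s in S) for y in attackers)

def admissible(S, attacks):
    return conflict_free(S, attacks) and all(acceptable(x, S, attacks) for x in S)

# Hypothetical attack graph: a -> b, b -> c, d -> c.
attacks = {("a", "b"), ("b", "c"), ("d", "c")}
args = ["a", "b", "c", "d"]
all_sets = [set(cmb) for r in range(len(args) + 1) for cmb in combinations(args, r)]
print([sorted(S) for S in all_sets if admissible(S, attacks)])
# [[], ['a'], ['d'], ['a', 'd']] -- c can never be defended
```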
132
(Figure: an argument graph over arguments a, b, c, d.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove loser, winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home – like U-Haul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction: move towards the middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a, agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to dist. point)
- Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
81
Example: conflict
• I want black on white (in slot 1)
• You want white on black (in slot 1)
• Can't both win; could flip a coin to decide who wins – better than both losing; the weightings on the coin needn't be 50-50
• It may make sense to have the agent with the highest worth get his way, as the utility is greater (he would accomplish his goal alone) – efficient, but not fair
• What if we could transfer half of the gained utility to the other agent? This is not normally allowed, but could work out well
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• A cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, and then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative, cont.
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as he helped with the swap)
• So, with probability ½ each: ½(12) + ½(−4) = 4
• If they could have both been satisfied: assume the cost for each is 24; then utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
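The coin-flip arithmetic above can be checked in a few lines; the 26, 4, 10, and 24 costs are the slide's own made-up numbers:

```python
goal_cost_alone = 26   # cost to accomplish one agent's goal solo (8 + 8 + 10)
coop_swap_cost = 4     # the shared swap, done jointly
rest_cost = 10         # finishing your own goal after winning the flip

win_utility = goal_cost_alone - coop_swap_cost - rest_cost  # 12
lose_utility = -coop_swap_cost                              # -4: helped swap, got nothing
expected = 0.5 * win_utility + 0.5 * lose_utility           # 4.0

both_satisfied = goal_cost_alone - 24  # 2: utility if both goals could be met
print(expected, both_satisfied)        # the gamble doubles expected utility
```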
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest worth
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing" – but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩:
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle) – it isn't just a specific state, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality, and support – a multiple set of attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility?
• Weighting each attribute
– Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price: [5, 100], quality: [0, 10], support: [1, 5]
– Try to find the Pareto optimum
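A minimal sketch of the weighted and constraint forms above (the function names are mine, and the weighted version assumes attribute scores normalized to [0, 1]):

```python
def weighted_utility(price, quality, support):
    """Weighted attributes: 60% price, 15% quality, 25% support."""
    return 0.60 * price + 0.15 * quality + 0.25 * support

def feasible(price, quality, support):
    """Constraint form: price in [5, 100], quality in [0, 10], support in [1, 5]."""
    return 5 <= price <= 100 and 0 <= quality <= 10 and 1 <= support <= 5

print(weighted_utility(1.0, 1.0, 1.0))        # 1.0 (weights sum to 100%)
print(feasible(50, 8, 3), feasible(3, 8, 3))  # True False
```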
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for a lie
• Possible lies:
– False information:
• Hiding letters
• Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union ≤ the sum of the costs of the separate sets – it adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive: delivering to one saves the distance to the other (in a tree arrangement)
• Example of a subadditive TOD with = rather than <: deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces; say, electrical power costs, where I get above a threshold and have to buy new equipment
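A brute-force subadditivity check over a toy cost function (the single-road delivery costs are invented for illustration):

```python
from itertools import chain, combinations

def subsets(tasks):
    tasks = list(tasks)
    return [frozenset(s) for s in
            chain.from_iterable(combinations(tasks, r) for r in range(len(tasks) + 1))]

def is_subadditive(tasks, c):
    """c(X | Y) <= c(X) + c(Y) for all finite X, Y over the task set."""
    ss = subsets(tasks)
    return all(c(x | y) <= c(x) + c(y) for x in ss for y in ss)

# Toy delivery costs down a single road: a trip costs as much as its farthest
# stop, so combining deliveries can only save distance (subadditive).
stops = {"a": 1, "b": 2, "c": 3}
trip = lambda s: max((stops[t] for t in s), default=0)
print(is_subadditive(stops, trip))  # True
```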
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• TP: truth telling is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two sets of tasks X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); each rule's beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty; the direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as sub-additive is a superclass of concave and a superclass of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home – like U-Haul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individually Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction: move towards the middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a, agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to distribution point)
- Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples: Compromise. Both can succeed, but worse for both than if the other agent weren't there
- Choices
- Compromise continued
- Example conflict
- Example: semi-cooperative
- Example: semi-cooperative (cont.)
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
82
Example: semi-cooperative
• Both agents want the contents of slots 1 and 1 swapped (and it is more efficient to cooperate)
• Both have (possibly) conflicting goals for the other slots
• To accomplish one agent's goal by oneself costs 26: 8 for each swap and 10 for the rest (pulling numbers out of the air)
• The cooperative swap costs 4 (pulling numbers out of the air)
• Idea: work together to swap, then flip a coin to see who gets his way for the rest
83
Example: semi-cooperative (cont.)
• Winning agent utility: 26 − 4 − 10 = 12
• Losing agent utility: −4 (as helped with swap)
• So with ½ probability each: 12(½) + (−4)(½) = 4
• If they could have both been satisfied: assume the cost for each is 24; then utility is 2
• Note: they double their utility if they are willing to risk not achieving the goal
• Note: they kept just the joint part of the plan that was more efficient and gambled on the rest (to remove the need to satisfy the other)
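The arithmetic above can be checked directly; a small sketch using the slide's pulled-out-of-the-air costs:

```python
# Expected utility of gambling on a coin flip after the cooperative swap,
# versus the guaranteed compromise; all costs come from the slide's example.
ALONE_COST, SWAP_COST, REST_COST = 26, 4, 10

win = ALONE_COST - SWAP_COST - REST_COST   # 12: winner gets his way on the rest
lose = -SWAP_COST                          # -4: loser only helped with the swap
gamble = 0.5 * win + 0.5 * lose            # 4.0 expected utility

both_satisfied = ALONE_COST - 24           # 2: assumed cost 24 if both satisfied
# Gambling doubles the expected utility (4 vs. 2) at the risk of ending at -4.
```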
84
Negotiation Domains: Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiation over a joint plan, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle) – it isn't just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, you might consider several attributes of the software, such as price, quality, and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
– Utility = price(60%) + quality(15%) + support(25%)
• Rating/ranking each attribute
– Price: 1, quality: 2, support: 3
• Using constraints on an attribute
– Price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
– Try to find the Pareto optimum
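A sketch of the weighted-attribute approach. Only the 60/15/25 weights come from the slide; the normalization of each attribute to a [0, 1] score and the sample offers are assumptions:

```python
# Linear additive utility over attribute scores normalized to [0, 1],
# where higher is always better (a low price becomes a high price score).
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}

def utility(scores):
    return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS)

offer_a = {"price": 0.9, "quality": 0.5, "support": 0.4}  # cheap, mediocre
offer_b = {"price": 0.4, "quality": 0.9, "support": 0.9}  # pricey, high quality
# With price weighted at 60%, offer_a scores higher despite its lower quality.
```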
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
– Exchange missing information
– Penalty for lies
• Possible lies:
– False information
  • Hiding letters
  • Phantom letters
– Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – it adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
– Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
– Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces; say electrical power costs, where I get above a threshold and have to buy new equipment
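The subadditivity condition can be checked exhaustively on a toy tree-shaped delivery domain; the node layout and the edge-counting cost function are invented for illustration:

```python
from itertools import combinations

# Tree: depot -> a -> b, and depot -> c. Each task's cost is the set of
# edges on its path from the depot; shared edges are only paid once.
PATH = {"a": {"da"}, "b": {"da", "ab"}, "c": {"dc"}}

def cost(tasks):
    edges = set()
    for t in tasks:
        edges |= PATH[t]
    return len(edges)

def is_subadditive(all_tasks):
    """Check c(X ∪ Y) <= c(X) + c(Y) for every pair of subsets."""
    subsets = [frozenset(s) for n in range(len(all_tasks) + 1)
               for s in combinations(all_tasks, n)]
    return all(cost(x | y) <= cost(x) + cost(y)
               for x in subsets for y in subsets)

# Delivering to both a and b costs 2 edges, not 1 + 2: the shared edge is saved.
```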
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at a store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know each fact (diagonal arrows are between domains); the rule at the beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty; the direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as sub-additive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others
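The modularity identity can be verified on a toy fax-like domain where each destination has an independent connection cost; the cost numbers are invented:

```python
# Modularity: c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y) holds when each task's
# cost is independent of the others, as in a fax-style domain.
CONN = {"a": 3, "b": 1, "c": 2}   # invented per-destination connection costs

def cost(tasks):
    return sum(CONN[t] for t in tasks)

x, y = frozenset({"a", "b"}), frozenset({"b", "c"})
lhs = cost(x | y)                      # c({a,b,c}) = 6
rhs = cost(x) + cost(y) - cost(x & y)  # 4 + 3 - 1 = 6
```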
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that a set of tasks Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats – is the postmen domain concave? (no, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as it was going that way anyway) but adds 2 to its superset Y (as it was going around the loop)
• Concavity implies sub-additivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of Characterization of Relationships: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over All-or-Nothing deals, "hiding" lies are not beneficial
• Ex: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth: p = ½; expected utility of ⟨(abc); ½⟩ = 5
• Lie: p = ½ (as the apparent utility is the same); expected utility (for 1) of ⟨(abc); ½⟩ = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph for the FP1 example, with edge costs 1, 4, 4, 1]
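The FP1 comparison can be sketched as a one-line expected-utility function; the payoff numbers are the slide's:

```python
# Expected utility of an all-or-nothing deal <D; p>: with probability p the
# agent performs the whole task set D, otherwise it performs nothing.
def expected_utility(p, util_if_perform, util_if_not):
    return p * util_if_perform + (1 - p) * util_if_not

truth_value = 5                            # from the slide's truthful deal
lie_value = expected_utility(0.5, 0, 2)    # hiding the letter: 0.5*0 + 0.5*2
# Hiding the letter drops agent 1's expected utility from 5 to 1.
```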
103
• FP2: in a Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (as if the other person delivers the phantom, you are found out)
• FP3: in a Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth: the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 tasks   U(1)   Agent 2 tasks   U(2) (seems)   U(2) (actual)
a               2      bc              4              4
b               4      ac              2              2
bc              2      a               4              2
ab              0      c               6              6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial
• Example from next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½, so utility for agent 1 is ⟨(ab); ½⟩ = ½(4) + ½(2) = 3
• Lie: (bc, a) is the logical division, as there is no percentage; utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over All-or-Nothing deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth: p (of agent 1 delivering all) = 9/14, as p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h: p = 11/18, as p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
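The probabilities above come from equating the two sides' expected utilities, which are linear in p. A sketch with exact arithmetic, using the payoff numbers from the slide's example:

```python
from fractions import Fraction

def indifference_p(u1_deliver, u1_not, u2_deliver, u2_not):
    """Solve p*u1_deliver + (1-p)*u1_not == p*u2_deliver + (1-p)*u2_not for p."""
    num = Fraction(u2_not - u1_not)
    den = Fraction(u1_deliver - u1_not - (u2_deliver - u2_not))
    return num / den

p_truth = indifference_p(-1, 6, 4, -3)  # 9/14
p_lie = indifference_p(-3, 6, 4, -5)    # 11/18

def utility(p):                          # agent 1's true utility: p(-1) + (1-p)(6)
    return p * -1 + (1 - p) * 6
# utility(p_truth) = 3/2 and utility(p_lie) = 31/18, so the decoy lie pays.
```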
107
Postmen – return to post office
[Figures: three example delivery graphs – Concave; Subadditive (h is the decoy); Phantom]
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over Pure deals, "hide" lies can be beneficial (as you think I have less, so an increased load will cost more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util ⟨(faebcd); ½⟩ = 4 (saves going to two)
• Under the lie, divide as ⟨(efdcab); p⟩ (you always win and I always lose); since the work is the same, swapping cannot help – in a mixed deal the choices must be unbalanced
• Try again under the lie: ⟨(ab),(cdef); p⟩
• p(4) + (1−p)(0) = p(2) + (1−p)(6), so 4p = −4p + 6, p = ¾
• Utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned cdef (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that
110
[Figure: modular postmen-domain example]
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposals
– Generation of counter-proposals based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00hrs, she also assigns progressively smaller worths to a meeting at 15:00hrs, 14:00hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rises from 0 at 9:00 to 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
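A sketch of the first agent's worth function; the figure only fixes the endpoints (worth 0 at 9:00, 100 at 16:00), so the linear shape in between is an assumption:

```python
def worth_first_agent(hour):
    """Worth of a meeting at `hour` (24h clock) for the first agent:
    0 outside 9:00-16:00, rising linearly to a peak of 100 at 16:00."""
    if hour < 9 or hour > 16:
        return 0.0
    return 100.0 * (hour - 9) / (16 - 9)

# Conceding from 16:00 to 14:00 costs the agent roughly 29 worth points,
# which may be outweighed by other payoffs such as reduced travel costs.
```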
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offers converge to a point of acceptance]
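The convergence pattern can be sketched with a minimal mutual-concession loop; the pie size, starting demands, and step sizes are all invented:

```python
def negotiate(demand_i=100.0, demand_j=100.0, step_i=10.0, step_j=15.0,
              pie=100.0, max_rounds=50):
    """Both agents concede a fixed step each round until their demands fit
    the pie; returns (rounds_used, agreement_reached)."""
    for rounds in range(1, max_rounds + 1):
        if demand_i + demand_j <= pie:   # offers have crossed: agreement
            return rounds, True
        demand_i -= step_i               # each agent concedes
        demand_j -= step_j
    return max_rounds, False             # demands never became feasible

# With these numbers agreement is reached in round 5; with zero-size steps
# (no concession) the loop instead mirrors the no-agreement graph.
```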
123
Utility Graphs - no agreement
• No agreement – Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offers never cross]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified – why did the agent pay so much for the car?
– Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
  • grounds ⊆ database, and
  • sentence can be proved from grounds
(we give reasons for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
83
Example semi-cooperative cont
bull Winning agent utility 26-4-10 = 12bull Losing agent utility -4 (as helped with swap)bull So with frac12 probability 1212 -412 = 4bull If they could have both been satisfied assume
cost for each is 24 Then utility is 2bull Note they double their utility if they are willing
to risk not achieving the goalbull Note kept just the joint part of the plan that was
more efficient and gambled on the rest (to remove the need to satisfy the other)
84
Negotiation Domains Worth-oriented
bull rdquoDomains where agents assign a worth to each
potential state (of the environment) which captures
its desirability for the agentrdquo (Rosenschein amp Zlotkin 1994)
bull agentrsquos goal is to bring about the state of the environment with
highest value
bull we assume that the collection of agents have available a set of
joint plans ndash a joint plan is executed by several different agents
bull Note ndash not rdquoall or nothingrdquo ndash but how close you got to goal
85
Worth-oriented Domain Definition
bull Can be defined as a tuple
EAgJc
bull E set of possible envirinment states
bull Ag set of possible agents
bull J set of possible joint plans
bull C cost of executing the plan
86
Worth Oriented Domain
bull Rates the acceptability of final statesbull Allows partially completed goalsbull Negotiation a joint plan schedules and goal relaxation May
reach a state that might be a little worse that the ultimate objective
bull Example ndash Multi-agent Tile world (like airport shuttle) ndash isnrsquot just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
bull If you want to pay for some software then you might consider
several attributes of the software such as the price quality and
support ndash multiple set of attributes
bull You may be willing to pay more if the quality is above a given limit
ie you canrsquot get it cheaper without compromising on quality
Pareto Optimal ndash Need to find the price for acceptable quality and
support (without compromising on some attributes)
88
How can we calculate Utility
bull Weighting each attribute
ndash Utility = Price60 + quality15 + support25
bull Ratingranking each attribute
ndash Price 1 quality 2 support 3
bull Using constraints on an attribute
ndash Price[5100] quality[0-10] support[1-5]
ndash Try to find the pareto optimum
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task Oriented Domainbull the cost of the union of sum of the costs of the separate
sets ndash adds to a sub-costbull for finite XY in T c(X U Y) lt= c(X) + c(Y))bull Example of subadditive
ndash Deliver to one saves distance to other (in a tree arrangement)
bull Example of subadditive TOD (= rather than lt)ndash deliver in opposite directions ndashdoing both saves nothing
bull Not subadditive doing both actually costs more than the sum of the pieces Say electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
bull We call producible phantom tasks decoy tasks (no risk of being discovered) Only unproducible phantom tasks are called phantom tasks
bull Example bull Need to pick something up at store (Can think
of something for them to pick up but if you are the one assigned you wonrsquot bother to make the trip)
bull Need to deliver empty letter (no good but deliverer wonrsquot discover lie)
92
Incentive compatible Mechanism
bull L there exists a beneficial lie in some encounterbull T There exists no beneficial liebull TP Truth is dominant if the penalty for lying is stiff
enough
93
Explanation of arrow
bull If it is never beneficial in a mixed deal encounter to use a phntom lie (with penalties) then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domainbull We have 2 tasks X and Y where X is a subset of Ybull Another set of task Z is introduced
ndash c(X U Z) - c(X) gt= c(Y U Z) - c(Y)
95
Tentative Explanation of Previous Chart
bull I think Arrows show reasons we know this fact (diagonal arrows are between domains) Rule beginning is a fixed point
bull For example What is true of a phantom task may be true for a decoy task in same domain as a phantom is just a decoy task we donrsquot have to create
bull Similarly what is true for a mixed deal may be true for an all or nothing deal (in the same domain) as a mixed deal is an all or nothing deal where one choice is empty The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over All-or-Nothing deals, "hiding" lies are not beneficial
• Ex: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util of ({a,b,c}, 1/2) = 5
• Under the lie, p = 1/2 (as the declared utilities are the same)
• Expected util (for A1) of ({a,b,c}, 1/2) = ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
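The arithmetic of a mixed deal is just a probability-weighted average. A small sketch of the lie case above (the utilities 0 and 2 with p = 1/2 are the example's numbers):

```python
from fractions import Fraction

def expected_util(p, util_if_assigned, util_if_not):
    """Expected utility of a mixed deal executed with probability p."""
    return p * util_if_assigned + (1 - p) * util_if_not

# A1 under the hiding lie: worth 0 when he draws the full delivery
# (he still has to deliver the hidden letter) and 2 otherwise
print(expected_util(Fraction(1, 2), 0, 2))  # 1
```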
103
• FP2: in a Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom task, you are found out)
• FP3: in a Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 gets   U(1)   Agent 2 gets   U(2) (seems)   U(2) (actual)
a              2      bc             4              4
b              4      ac             2              2
bc             2      a              4              2
ab             0      c              6              6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = ½; utility for agent 1 of the deal ({a}, {b}) at p = ½ is ½(4) + ½(2) = 3
• Lie: ({b}, {c,a}) is the logical division, as a pure deal has no probability split
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful), as the decoy changes the probability (if you deliver, I make you deliver to h as well)
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if A1 is the one who delivers, he simply doesn't deliver to h)
• If he tells the truth, p (probability of agent 1 delivering all) = 9/14, as
  p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as
  p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
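The indifference probabilities above can be solved mechanically: setting agent 1's expected utility equal to agent 2's, p·a + (1−p)·b = p·c + (1−p)·d, gives p = (d−b)/((a−b)−(c−d)). A sketch with the example's payoffs, using exact fractions:

```python
from fractions import Fraction

def fair_p(a, b, c, d):
    """Solve p*a + (1-p)*b == p*c + (1-p)*d for p.
    a, b: agent 1's utility if he delivers everything / if he doesn't;
    c, d: agent 2's utility in the same two outcomes."""
    a, b, c, d = (Fraction(x) for x in (a, b, c, d))
    return (d - b) / ((a - b) - (c - d))

p_truth = fair_p(-1, 6, 4, -3)   # truthful encounter
p_decoy = fair_p(-3, 6, 4, -5)   # with the decoy letter to h

def a1_util(p):
    # agent 1's TRUE payoffs are always -1 / 6, decoy or not
    return p * (-1) + (1 - p) * 6

print(p_truth, a1_util(p_truth))  # 9/14 21/14  (= 1.5)
print(p_decoy, a1_util(p_decoy))  # 11/18 31/18 (~ 1.72)
```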
107
Postmen – return to post office
Concave
Subadditive (h is decoy)
Phantom
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over Pure deals, a "hide" lie can be beneficial (as you think I have fewer tasks, an increased load appears to cost me more than it really does)
• Ex3 (from next slide): A1 hides his letter to node b
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – unfair under the lie
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util of ({f,a,e}, {b,c,d}) at p = ½ is 4 (saves going to two nodes)
• Under the lie, divide as ({e,f,d,c}, {a,b}) with probability p (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under the lie with ({a,b}, {c,d,e,f}) and probability p:
  p(4) + (1−p)(0) = p(2) + (1−p)(6)
  4p = −4p + 6, so p = 3/4
• The utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility in that branch
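The p = 3/4 step above, checked with exact fractions (the payoffs are the slide's):

```python
from fractions import Fraction

p = Fraction(3, 4)

# indifference equation from the slide: p*4 + (1-p)*0 == p*2 + (1-p)*6
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6  # both sides equal 3

# A1's real utility under the hide lie: when assigned the other side he
# still has to deliver the hidden letter, so that branch is worth 0 to him
actual = p * 6 + (1 - p) * 0
print(actual)  # 9/2, i.e. 4.5 > 4 under truth
```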
110
Modular
111
Conclusion
• In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposal
  – Generation of counter-proposal based on feedback from dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate solution for applicability to current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes need for information exchange
  – Avoids problems by reasoning from past failures: intentional reminding
  – Repair for past failure is reused: reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00hrs, she also assigns progressively smaller worths to a meeting at 15:00hrs, 14:00hrs, etc. By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
[Graph: worth function for the first agent – worth ranges from 0 to 100 over meeting times 9:00, 12:00, 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
[Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's curves converge over time to a point of acceptance]
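A toy sketch of the convergence picture (all numbers are invented): both agents start with incompatible demands and concede a fixed step each round until the demands fit the available surplus:

```python
def negotiate(total=10.0, demand_i=9.0, demand_j=9.0, step=1.0):
    """Each agent concedes every round; agreement once demands are compatible."""
    rounds = 0
    while demand_i + demand_j > total:
        demand_i -= step      # agent i concedes
        demand_j -= step      # agent j concedes
        rounds += 1
    return rounds, demand_i, demand_j

print(negotiate())  # (4, 5.0, 5.0): acceptance after four rounds of concession
```

If neither agent's concessions ever make the demands compatible, there is no point of acceptance, which is the "no agreement" picture on the next slide.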
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds – Agent i's and Agent j's curves never meet, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
  1. Logical – "If you accept A and accept A implies B, then you must accept that B"
  2. Emotional – "How would you feel if it happened to you?"
  3. Visceral – participant stamps their feet and shows the strength of their feelings
  4. Kisceral – appeals to the intuitive: doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
• Database is a (possibly inconsistent) set of logical formulae
• Sentence is a logical formula known as the conclusion
• Grounds is a set of logical formulae such that:
  – Grounds ⊆ Database
  – Sentence can be proved from Grounds
(we give reasons for our conclusions)
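A minimal propositional sketch of this structure (the database contents are invented; rules are (premise, conclusion) pairs chained by modus ponens):

```python
# facts are strings; rules are (premise, conclusion) pairs
DATABASE = {'milk_good', ('milk_good', 'cheese_good')}

def derivable(sentence, grounds):
    """Forward-chain modus ponens over the grounds."""
    known = {g for g in grounds if isinstance(g, str)}
    changed = True
    while changed:
        changed = False
        for g in grounds:
            if isinstance(g, tuple):
                premise, conclusion = g
                if premise in known and conclusion not in known:
                    known.add(conclusion)
                    changed = True
    return sentence in known

def is_argument(sentence, grounds):
    """(Sentence, Grounds) is an argument if Grounds is drawn from the
    database and Sentence can be proved from Grounds."""
    return set(grounds) <= DATABASE and derivable(sentence, grounds)

print(is_argument('cheese_good', {'milk_good', ('milk_good', 'cheese_good')}))  # True
```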
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts, →a = attacks):
  – A attacks B ≡ A →u B or A →r B
  – A defeats B ≡ A →u B or (A →r B and not B →u A)
  – A strongly attacks B ≡ A →a B and not B →u A
  – A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
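These identities are plain set algebra over pairs. A sketch with an invented undercut/rebut relation, checking that the hierarchy holds (u⁻¹ is the inverse relation):

```python
def inverse(rel):
    return {(y, x) for (x, y) in rel}

# hypothetical relations over arguments: pairs of (attacker, attacked)
u = {('a', 'b'), ('b', 'a'), ('c', 'd')}   # undercuts
r = {('a', 'd')}                           # rebuts

attacks            = u | r
defeats            = u | (r - inverse(u))
strongly_attacks   = (u | r) - inverse(u)
strongly_undercuts = u - inverse(u)

# the hierarchy: su is contained in sa and in d, all contained in a
print(strongly_undercuts <= strongly_attacks <= attacks,
      strongly_undercuts <= defeats <= attacks)  # True True
```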
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments – mutually defensible
1. Argument x is attacked by a set of arguments if some member y of the set attacks x (y → x)
2. Argument x is acceptable with respect to a set if every attacker of x is attacked by some member of the set
3. An argument set is conflict free if none of its members attack each other
4. A set is admissible if it is conflict free and each of its arguments is acceptable (any attackers are attacked)
132
[Graph: attack relations among arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
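The definitions above can be checked by brute force. The slide's graph isn't reproduced here, so this sketch invents an attack relation consistent with the remarks (a and b attack each other, d attacks c): c never appears in an admissible set, while d is acceptable to every set:

```python
from itertools import combinations

ARGS = ('a', 'b', 'c', 'd')
# hypothetical attack relation consistent with the slide's remarks
ATTACKS = {('a', 'b'), ('b', 'a'), ('d', 'c')}

def conflict_free(s):
    return not any((x, y) in ATTACKS for x in s for y in s)

def acceptable(x, s):
    """Every attacker of x is itself attacked by some member of s."""
    return all(any((z, y) in ATTACKS for z in s)
               for y in ARGS if (y, x) in ATTACKS)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

candidates = [set(c) for n in range(len(ARGS) + 1)
              for c in combinations(ARGS, n)]
print([sorted(s) for s in candidates if admissible(s)])
# [[], ['a'], ['b'], ['d'], ['a', 'd'], ['b', 'd']]
```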
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove loser, winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home – like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction, move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a, agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Example: semi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
84
Negotiation Domains Worth-oriented
• "Domains where agents assign a worth to each potential state (of the environment), which captures its desirability for the agent" (Rosenschein & Zlotkin, 1994)
• An agent's goal is to bring about the state of the environment with the highest value
• We assume that the collection of agents has available a set of joint plans – a joint plan is executed by several different agents
• Note – not "all or nothing", but how close you got to the goal
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiates a joint plan, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
• Example – multi-agent Tileworld (like an airport shuttle): it isn't just reaching a specific state that counts, but the value of the work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
  – Utility = price·60% + quality·15% + support·25%
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – Price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
  – Try to find the Pareto optimum
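The weighting scheme above as a sketch (the 60/15/25 weights are the slide's; the 0–10 attribute scores are invented and assumed to be on comparable scales, with higher = better):

```python
def utility(price, quality, support, weights=(0.60, 0.15, 0.25)):
    """Weighted additive multi-attribute utility."""
    return sum(w * x for w, x in zip(weights, (price, quality, support)))

# two hypothetical offers scored per attribute on a 0-10 scale
cheap_offer = utility(price=8, quality=5, support=6)   # ~7.05
solid_offer = utility(price=6, quality=9, support=9)   # ~7.2
print(cheap_offer < solid_offer)  # True
```

Because the weights sum to 1, the result stays on the same 0–10 scale as the individual scores.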
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
  – Exchange missing information
  – Penalty for lying
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – they add to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  – Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces. Say electrical power costs, where I get above a threshold and have to buy new equipment
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
  – Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  – Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
  – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the start of an arrow is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove loser, winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms, Protocols, Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanism Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession-based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved?
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home – like U-Haul)
- Dominant Deals
- Negotiation Set: Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction, move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a, agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to distribution point)
- Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement?
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly?
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – each is helped by joint plan
- Examples: Compromise – both can succeed, but worse for both than if other agent weren't there
- Choices
- Compromise continued
- Example: conflict
- Example: semi-cooperative
- Example: semi-cooperative cont.
- Negotiation Domains: Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility?
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes – Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise: Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods: Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods: Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition: Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
85
Worth-oriented Domain Definition
• Can be defined as a tuple ⟨E, Ag, J, c⟩
• E: set of possible environment states
• Ag: set of possible agents
• J: set of possible joint plans
• c: cost of executing a plan
86
Worth Oriented Domain
• Rates the acceptability of final states
• Allows partially completed goals
• Negotiates a joint plan, schedules, and goal relaxation; may reach a state that is a little worse than the ultimate objective
• Example – Multi-agent Tileworld (like an airport shuttle): worth isn't just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as price, quality, and support – multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e., you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute
  – Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute
  – Price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
  – Try to find the Pareto optimum
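The weighted-attribute scheme above can be sketched in a few lines of Python. The weights and bounds are the slide's; the candidate offers are made-up examples, and each attribute is normalized to a 0–1 score (inverted for price, so that cheaper is better) before weighting:

```python
# Weights from the slide: 60% price, 15% quality, 25% support.
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}
# Constraints from the slide: price in [5,100], quality in [0,10], support in [1,5].
BOUNDS = {"price": (5, 100), "quality": (0, 10), "support": (1, 5)}

def score(offer):
    total = 0.0
    for attr, w in WEIGHTS.items():
        lo, hi = BOUNDS[attr]
        frac = (offer[attr] - lo) / (hi - lo)   # normalize to [0, 1]
        if attr == "price":                      # cheaper is better
            frac = 1.0 - frac
        total += w * frac
    return total

# Hypothetical software offers:
offers = [
    {"price": 40, "quality": 8, "support": 3},
    {"price": 90, "quality": 9, "support": 5},
    {"price": 20, "quality": 3, "support": 1},
]
best = max(offers, key=score)
```

A Pareto filter would additionally drop any offer that another offer beats on every attribute; the weighted sum is the simpler, lossier summary.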
89
Incomplete Information
• Don't know the tasks of others in a TOD
• Solution:
  – Exchange missing information
  – Penalty for lying
• Possible lies:
  – False information (hiding letters, phantom letters)
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – the union adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD where = holds rather than <:
  – Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces, e.g., electrical power costs where usage above a threshold forces buying new equipment
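The subadditivity condition can be verified exhaustively for a small cost function. A sketch, assuming a made-up one-level delivery tree where each distinct stop costs a round trip from the depot (so shared stops are paid for only once):

```python
from itertools import chain, combinations

# Hypothetical round-trip distances from the depot to each destination.
DIST = {"a": 3, "b": 2, "c": 5}

def cost(tasks):
    # one round trip per distinct destination; duplicates cost nothing extra
    return sum(2 * DIST[t] for t in set(tasks))

def subsets(items):
    return chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))

# Check subadditivity: c(X ∪ Y) <= c(X) + c(Y) for all finite X, Y.
ok = all(cost(set(x) | set(y)) <= cost(x) + cost(y)
         for x in subsets(DIST) for y in subsets(DIST))
```

Disjoint sets hit the slide's equality case (deliveries "in opposite directions" save nothing), while overlapping sets are strictly cheaper than the sum.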
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
  – Need to pick something up at the store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  – Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
  – c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show reasons we know each fact (diagonal arrows are between domains); the rule at the beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty; the direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats – is the postmen domain concave? (No, unless restricted to trees)
• Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies sub-additivity; modularity implies concavity
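Concavity and modularity can be checked in the same exhaustive style as subadditivity. The additive per-stop cost below (made-up distances, every stop independent) is modular, and therefore also concave and subadditive:

```python
from itertools import chain, combinations

DIST = {"a": 3, "b": 2, "c": 5}   # hypothetical per-stop distances
ALL = set(DIST)

def cost(tasks):
    return sum(2 * DIST[t] for t in set(tasks))

def subsets(items):
    items = list(items)
    return [set(s) for s in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

def is_concave():
    # c(Y ∪ Z) − c(Y) <= c(X ∪ Z) − c(X) whenever X ⊆ Y
    return all(cost(y | z) - cost(y) <= cost(x | z) - cost(x)
               for x in subsets(ALL) for y in subsets(ALL) if x <= y
               for z in subsets(ALL))

def is_modular():
    # c(X ∪ Y) == c(X) + c(Y) − c(X ∩ Y)
    return all(cost(x | y) == cost(x) + cost(y) - cost(x & y)
               for x in subsets(ALL) for y in subsets(ALL))
```

A cost function with "shared loop" savings like the slide's polygon example would pass the subadditivity check but fail `is_concave()`.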
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of Characterization of Relationship: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over all-or-nothing deals, "hiding" lies are not beneficial
• Ex: A1 hides its letter to c; its utility doesn't increase
• If it tells the truth: p = 1/2; expected utility of ⟨(abc), 1/2⟩ = 5
• Lie: p = 1/2 (as the declared utility is the same); expected utility (for agent 1) of ⟨(abc), 1/2⟩ = ½(0) + ½(2) = 1 (as it still has to deliver the hidden letter)
[Figure: delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in a Subadditive TOD, under any ONM over mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out)
• FP3: in a Concave TOD, under any ONM over mixed deals, no "decoy" lie is beneficial (less added cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, under any ONM over pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

Agent 1 gets | U(1) | Agent 2 gets | U(2) (seems) | U(2) (actual)
a            |  2   | bc           |      4       |      4
b            |  4   | ac           |      2       |      2
bc           |  2   | a            |      4       |      2
ab           |  0   | c            |      6       |      6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial
• Example (next slide): A1 creates a phantom letter at node c; its utility rises from 3 to 4
• Truth: p = ½, so utility for agent 1 of ⟨(ab), ½⟩ = ½(4) + ½(2) = 3
• Lie: the deal (b, ca) is the logical division, as no probability is involved; utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability (if you deliver, I make you deliver to h)
• Ex 2 (next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); its utility rises from 1.5 to 1.72 (if I deliver, I don't actually deliver to h)
• If A1 tells the truth, p (the probability of agent 1 delivering all) = 9/14, as:
  – p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If A1 invents task h, p = 11/18, as:
  – p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
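The probabilities above come from making both agents indifferent between the two sides of the mixed deal. A sketch with exact fractions reproduces the slide's arithmetic (the payoff numbers −1/6/4/−3 and −3/6/4/−5 are the slide's):

```python
from fractions import Fraction

def equalizing_p(a, b, c, d):
    # Solve p*a + (1-p)*b == p*c + (1-p)*d for p.
    return Fraction(d - b, a - b - c + d)

def agent1_true_utility(p):
    # Agent 1's TRUE payoffs: -1 if it delivers everything, 6 otherwise.
    return p * Fraction(-1) + (1 - p) * Fraction(6)

p_truth = equalizing_p(-1, 6, 4, -3)   # declared payoffs under truth
p_lie   = equalizing_p(-3, 6, 4, -5)   # declared payoffs with decoy letter to h

u_truth = agent1_true_utility(p_truth)
u_lie   = agent1_true_utility(p_lie)   # lying shifts p in agent 1's favor
```

The lie lowers the probability that agent 1 is the one who delivers everything, so its true expected utility rises from 21/14 to 31/18.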
107
Postmen – return to post office
[Figures: a concave example; a subadditive example (h is a decoy); a phantom example]
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, under any ONM over pure deals, "hide" lies can be beneficial (you think I have less, so an increased load appears to cost more than it really does)
• Ex 3 (next slide): A1 hides its letter to node b
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – unfair (under the lie)
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
• Ex 4: A1 hides its letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util⟨(fae, bcd), ½⟩ = 4 (each saves going to two nodes)
• Under the lie, dividing as ⟨(efd, cab), p⟩: you always win and I always lose; since the work is the same, swapping cannot help – in a mixed deal the choices must be unbalanced
• Try again under the lie with ⟨(abc, def), p⟩:
  – p(4) + (1−p)(0) = p(2) + (1−p)(6)
  – 4p = −4p + 6, so p = 3/4
• Utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility for that
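The same indifference computation reproduces p = 3/4 and the 4.5 (the declared payoffs 4/0 and 2/6, and agent 1's true payoff of 6 for the (abc) side, are taken from the slide):

```python
from fractions import Fraction

def equalizing_p(a, b, c, d):
    # p that makes both agents indifferent: p*a + (1-p)*b == p*c + (1-p)*d
    return Fraction(d - b, a - b - c + d)

# Declared (post-lie) payoffs for deal <(abc), (def)> with probability p:
# agent 1 seems to get 4 or 0, agent 2 gets 2 or 6.
p = equalizing_p(4, 0, 2, 6)

# Agent 1's TRUE payoff for the (abc) side is 6, because the hidden
# letter to a gets delivered "for free" along the way.
u_lie = p * 6 + (1 - p) * 0
u_truth = 4   # the slide's utility for telling the truth
```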
110
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposal
  – Generation of counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change the importance attached to an issue
  – Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory behavior
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rises from 0 at 0900 to 100 at 1600]
Ref: Rosenschein & Zlotkin, 1994
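A worth function like the first agent's can be sketched as a piecewise-linear ramp. The 0900–1600 range, the 0–100 scale, and the linear shape are read off the figure, so treat them as illustrative:

```python
def worth_agent1(hour, lo=9.0, hi=16.0, max_worth=100.0):
    """Worth of a meeting at the given hour for the first agent:
    0 at or before 9:00, rising linearly to 100 at 16:00."""
    if hour <= lo:
        return 0.0
    if hour >= hi:
        return max_worth
    return max_worth * (hour - lo) / (hi - lo)
```

The second agent's function would mirror this (highest worth early in the day); conceding means accepting an hour where your own worth curve is lower.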
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds for Agent i and Agent j; the curves meet at the point of acceptance]
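The convergence picture can be mimicked with a toy single-issue concession loop. The step sizes and starting offers are made-up, and this is plain fixed-step concession, not the full Zeuthen strategy:

```python
def negotiate(offer_i=10.0, offer_j=0.0, step=1.0, max_rounds=100):
    """Offers are points on a single issue; agent i wants high, agent j low.
    Returns (round, agreement point) once the offers cross, else None."""
    for rnd in range(1, max_rounds + 1):
        offer_i -= step           # agent i concedes downward
        offer_j += step           # agent j concedes upward
        if offer_j >= offer_i:    # offers cross: point of acceptance
            return rnd, (offer_i + offer_j) / 2
    return None                   # conflict deal: no agreement
```

A step of zero (no concession) models the no-agreement graph on the next slide: the curves never meet.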
123
Utility Graphs - no agreement
• No agreement – Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds for Agent i and Agent j; the curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
  1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
  2. Emotional – "How would you feel if it happened to you?"
  3. Visceral – a participant stamps their feet and shows the strength of their feelings
  4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
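A sketch of finding every (Grounds, Sentence) argument by brute force, with Horn-clause forward chaining standing in for "can be proved". The database entries are illustrative (they echo the milk/cheese example on the next slide):

```python
from itertools import chain, combinations

# Database entries are Horn clauses: (premises, conclusion).
DATABASE = [
    ((), "milk_is_good"),
    ((), "cheese_is_made_from_milk"),
    (("milk_is_good", "cheese_is_made_from_milk"), "cheese_is_good"),
    ((), "milk_is_fatty"),   # an unrelated (potentially conflicting) formula
]

def proves(grounds, sentence):
    # Forward chaining: fire every clause whose premises already hold.
    facts = set()
    while True:
        new = {c for (ps, c) in grounds
               if c not in facts and all(p in facts for p in ps)}
        if not new:
            return sentence in facts
        facts |= new

def arguments_for(sentence, database):
    # Every Grounds ⊆ Database from which Sentence can be proved.
    subs = chain.from_iterable(
        combinations(database, r) for r in range(len(database) + 1))
    return [list(g) for g in subs if proves(g, sentence)]

args = arguments_for("cheese_is_good", DATABASE)
```

Every returned grounds set contains the two facts and the rule; the unrelated clause is optional, which is why more than one argument exists for the same conclusion.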
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you

Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B: A u B or A r B
  – A defeats B: A u B, or (A r B and not B u A)
  – A strongly attacks B: A a B and not B u A
  – A strongly undercuts B: A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
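These definitions are plain set algebra over pairs, so the hierarchy su ⊆ sa ⊆ d ⊆ a can be checked directly. The example relations u and r below are made up just to exercise the definitions:

```python
def inverse(rel):
    # u⁻¹: flip each pair
    return {(b, a) for (a, b) in rel}

u = {("A", "B"), ("B", "A"), ("C", "D")}   # undercuts (hypothetical)
r = {("E", "B")}                            # rebuts (hypothetical)

attacks            = u | r                  # a  = u ∪ r
defeats            = u | (r - inverse(u))   # d  = u ∪ (r − u⁻¹)
strongly_attacks   = (u | r) - inverse(u)   # sa = (u ∪ r) − u⁻¹
strongly_undercuts = u - inverse(u)         # su = u − u⁻¹

# The proposition's hierarchy, as chained subset tests:
hierarchy_holds = strongly_undercuts <= strongly_attacks <= defeats <= attacks
```

Because A and B undercut each other, neither strongly undercuts the other; only the one-sided C → D undercut survives into the stronger relations.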
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
86
Worth Oriented Domain
bull Rates the acceptability of final statesbull Allows partially completed goalsbull Negotiation a joint plan schedules and goal relaxation May
reach a state that might be a little worse that the ultimate objective
bull Example ndash Multi-agent Tile world (like airport shuttle) ndash isnrsquot just a specific state but the value of work accomplished
87
Worth-oriented Domains and Multiple Attributes
bull If you want to pay for some software then you might consider
several attributes of the software such as the price quality and
support ndash multiple set of attributes
bull You may be willing to pay more if the quality is above a given limit
ie you canrsquot get it cheaper without compromising on quality
Pareto Optimal ndash Need to find the price for acceptable quality and
support (without compromising on some attributes)
88
How can we calculate Utility
bull Weighting each attribute
ndash Utility = Price60 + quality15 + support25
bull Ratingranking each attribute
ndash Price 1 quality 2 support 3
bull Using constraints on an attribute
ndash Price[5100] quality[0-10] support[1-5]
ndash Try to find the pareto optimum
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task Oriented Domainbull the cost of the union of sum of the costs of the separate
sets ndash adds to a sub-costbull for finite XY in T c(X U Y) lt= c(X) + c(Y))bull Example of subadditive
ndash Deliver to one saves distance to other (in a tree arrangement)
bull Example of subadditive TOD (= rather than lt)ndash deliver in opposite directions ndashdoing both saves nothing
bull Not subadditive doing both actually costs more than the sum of the pieces Say electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
bull We call producible phantom tasks decoy tasks (no risk of being discovered) Only unproducible phantom tasks are called phantom tasks
bull Example bull Need to pick something up at store (Can think
of something for them to pick up but if you are the one assigned you wonrsquot bother to make the trip)
bull Need to deliver empty letter (no good but deliverer wonrsquot discover lie)
92
Incentive compatible Mechanism
bull L there exists a beneficial lie in some encounterbull T There exists no beneficial liebull TP Truth is dominant if the penalty for lying is stiff
enough
93
Explanation of arrow
bull If it is never beneficial in a mixed deal encounter to use a phntom lie (with penalties) then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domainbull We have 2 tasks X and Y where X is a subset of Ybull Another set of task Z is introduced
ndash c(X U Z) - c(X) gt= c(Y U Z) - c(Y)
95
Tentative Explanation of Previous Chart
bull I think Arrows show reasons we know this fact (diagonal arrows are between domains) Rule beginning is a fixed point
bull For example What is true of a phantom task may be true for a decoy task in same domain as a phantom is just a decoy task we donrsquot have to create
bull Similarly what is true for a mixed deal may be true for an all or nothing deal (in the same domain) as a mixed deal is an all or nothing deal where one choice is empty The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposal
  – Generation of counter-proposal based on feedback from dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate solution for applicability to current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Repair for a past failure is reused – reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
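A compromise with those two properties can be selected with a simple lexicographic rule: maximize joint payoff first, then break ties by minimizing the payoff difference. A sketch with hypothetical (company, union) payoffs – the proposals and numbers are invented:

```python
# Hypothetical payoffs (company, union) for candidate contract proposals.
proposals = {
    "wage+2%": (60, 40),
    "wage+4%": (45, 55),
    "wage+3%+benefits": (50, 50),
}

def compromise(props):
    # Maximize joint payoff; break ties by minimizing the payoff difference.
    return max(props, key=lambda k: (sum(props[k]),
                                     -abs(props[k][0] - props[k][1])))

print(compromise(proposals))   # wage+3%+benefits
```

All three proposals tie at a joint payoff of 100, so the most balanced one wins.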
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change importance attached to an issue
  – Change utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases payoff of rejecting party by a greater amount than it reduces payoff of agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
Worth function for first agent:
[Graph: worth rises from 0 at 9:00, through 12:00, to 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
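The first agent's worth function can be sketched as a piecewise-linear curve; the linear shape between 9:00 and 16:00 is an assumption read off the graph above:

```python
def worth(hour, lo=9.0, hi=16.0, max_worth=100.0):
    """Worth of a meeting time for the first agent: 0 at 9:00,
    rising (assumed linearly) to 100 at 16:00."""
    if hour <= lo:
        return 0.0
    if hour >= hi:
        return max_worth
    return max_worth * (hour - lo) / (hi - lo)

print(worth(16), worth(12.5), worth(9))   # 100.0 50.0 0.0
```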
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Graph: utility vs. number of negotiation rounds; Agent_i's and Agent_j's offer curves converge over time to a point of acceptance]
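The convergence pattern in the graph can be simulated with a toy concession loop; the starting utilities and step sizes are invented for illustration:

```python
def negotiate(u_i=100, u_j=0, step_i=10, step_j=10, max_rounds=50):
    """Each round both agents concede; agreement when the offers meet.
    Returns (round, agreed utility level), or None if no agreement."""
    for rnd in range(1, max_rounds + 1):
        u_i -= step_i          # agent i lowers its demand
        u_j += step_j          # agent j raises its offer
        if u_j >= u_i:         # curves cross: point of acceptance
            return rnd, (u_i + u_j) / 2
    return None

print(negotiate())   # (5, 50.0)
```

Setting `step_j=0` models the no-agreement graph on the next slide: agent j never concedes, so the curves never cross.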
123
Utility Graphs - no agreement
• No agreement – Agent_j finds the offer unacceptable
[Graph: utility vs. number of negotiation rounds; Agent_i's and Agent_j's offer curves never cross]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet to show the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: an argument is a pair (Sentence, Grounds) over a Database, where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that Grounds ⊆ Database and Sentence can be proved from Grounds
  (we give reasons for our conclusions)
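A minimal executable sketch of this form: arguments are (Grounds, Sentence) pairs, and "can be proved" is checked with naive forward chaining over a tiny Horn-rule fragment. The representation (strings for atoms, pairs for rules) is invented for illustration:

```python
def proves(grounds, sentence):
    """Naive forward chaining: grounds mixes atomic facts (strings)
    with Horn rules (frozenset_of_premises, conclusion)."""
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if not isinstance(g, str)]
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return sentence in facts

# The milk/cheese argument from the next slide:
database = {
    "milk_is_good",
    "cheese_is_made_from_milk",
    (frozenset({"milk_is_good", "cheese_is_made_from_milk"}),
     "cheese_is_good"),
}

print(proves(database, "cheese_is_good"))   # True
```

Here the grounds happen to be the whole database; in general an argument cites only the subset it actually needs.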
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
  – A attacks B = A undercuts B or A rebuts B
  – A defeats B = A undercuts B or (A rebuts B and not (B undercuts A))
  – A strongly attacks B = A attacks B and not (B undercuts A)
  – A strongly undercuts B = A undercuts B and not (B undercuts A)
129
Proposition: hierarchy of attacks
  Undercuts          = u
  Strongly undercuts = su = u - u^-1
  Strongly attacks   = sa = (u ∪ r) - u^-1
  Defeats            = d  = u ∪ (r - u^-1)
  Attacks            = a  = u ∪ r
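Treating u (undercuts) and r (rebuts) as sets of ordered pairs, these definitions are direct set algebra. The sample relations below are invented:

```python
# u = undercut pairs, r = rebut pairs (invented sample relations).
u = {("a", "b"), ("b", "a"), ("c", "d")}
r = {("d", "c")}

def inverse(rel):
    """u^-1: reverse every pair."""
    return {(y, x) for (x, y) in rel}

attacks            = u | r
defeats            = u | (r - inverse(u))
strongly_attacks   = (u | r) - inverse(u)
strongly_undercuts = u - inverse(u)

print(sorted(strongly_undercuts))   # [('c', 'd')]
```

The mutual undercuts between a and b drop out of the "strong" relations, which is exactly what subtracting u^-1 does.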
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked (by a set of arguments) if some member y of the set attacks x (y → x)
2. argument x is acceptable (with respect to the set) if every attacker of x is attacked
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: example attack graph over arguments a, b, c, d]
Which sets of arguments can be 'in'? c is always attacked; d is always acceptable.
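Admissibility for a small graph can be checked by brute force. The slide's attack relation did not survive extraction, so the relation below is an assumed stand-in consistent with the remarks (c can never be defended; d is unattacked):

```python
from itertools import combinations

args = {"a", "b", "c", "d"}
# Assumed attack relation (the original figure was lost): a->b, b->c, a->c.
attacks = {("a", "b"), ("b", "c"), ("a", "c")}

def conflict_free(s):
    return not any((x, y) in attacks for x in s for y in s)

def acceptable(x, s):
    # every attacker of x must itself be attacked by some member of s
    return all(any((z, y) in attacks for z in s)
               for (y, t) in attacks if t == x)

def admissible(s):
    return conflict_free(s) and all(acceptable(x, s) for x in s)

subsets = [frozenset(c) for n in range(5)
           for c in combinations(sorted(args), n)]
good = [sorted(s) for s in subsets if admissible(s)]
print(good)   # [[], ['a'], ['d'], ['a', 'd']]
```

Under this relation c appears in no admissible set, while d (having no attackers) appears freely.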
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
87
Worth-oriented Domains and Multiple Attributes
• If you want to pay for some software, then you might consider several attributes of the software, such as the price, quality and support – a set of multiple attributes
• You may be willing to pay more if the quality is above a given limit, i.e. you can't get it cheaper without compromising on quality
• Pareto optimal – need to find the price for acceptable quality and support (without compromising on some attributes)
88
How can we calculate Utility
• Weighting each attribute:
  – Utility = price × 60% + quality × 15% + support × 25%
• Rating/ranking each attribute:
  – Price: 1, quality: 2, support: 3
• Using constraints on an attribute:
  – Price ∈ [5,100], quality ∈ [0,10], support ∈ [1,5]
  – Try to find the Pareto optimum
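A sketch of the weighted-attribute scheme, assuming the 60/15/25 split is a set of normalized weights and each attribute is scored on a common 0-10 scale (both offers are hypothetical):

```python
WEIGHTS = {"price": 0.60, "quality": 0.15, "support": 0.25}   # sum to 1

def utility(offer):
    """Weighted-sum utility over attribute scores (0-10 per attribute)."""
    return sum(WEIGHTS[attr] * score for attr, score in offer.items())

offer_a = {"price": 8, "quality": 5, "support": 4}
offer_b = {"price": 6, "quality": 9, "support": 7}

best = max((offer_a, offer_b), key=utility)
print(round(utility(offer_a), 2), round(utility(offer_b), 2))   # 6.55 6.7
```

Despite the heavy weight on price, offer_b's quality and support scores pull it ahead overall.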
89
Incomplete Information
• Don't know tasks of others in TOD
• Solution:
  – Exchange missing information
  – Penalty for lying
• Possible lies:
  – False information
    • Hiding letters
    • Phantom letters
  – Not carrying out a commitment
90
Subadditive Task Oriented Domain
• The cost of the union ≤ the sum of the costs of the separate sets – it adds to a sub-cost
• For finite X, Y in T: c(X ∪ Y) ≤ c(X) + c(Y)
• Example of subadditive:
  – Delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <):
  – Deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces – say, electrical power costs where I get above a threshold and have to buy new equipment
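The subadditivity condition can be checked mechanically for a small cost table; the delivery costs below are invented to satisfy it:

```python
# Toy cost table: cost of visiting each set of stops (invented numbers).
cost = {
    frozenset(): 0,
    frozenset("a"): 2,
    frozenset("b"): 3,
    frozenset("ab"): 4,   # visiting both is cheaper than 2 + 3
}

def is_subadditive(c):
    """Check c(X ∪ Y) <= c(X) + c(Y) for all pairs in the table."""
    return all(c[x | y] <= c[x] + c[y] for x in c for y in c)

print(is_subadditive(cost))   # True
```

Raising the joint cost above 5 (the over-threshold power example) would make the check fail.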
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered); only unproducible phantom tasks are called phantom tasks
• Examples:
  – Need to pick something up at a store (you can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
  – Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed-deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed-deal encounter (which is just a subset of the mixed-deal encounters)
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced:
  – c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows are between domains); the rule at the beginning is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty; the direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
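A brute-force check that an additive (fax-like) cost function is modular, and that it then also satisfies the concavity condition defined earlier – illustrating "modularity implies concavity". The per-task prices are invented:

```python
from itertools import combinations

PRICE = {"a": 1, "b": 2, "c": 3}      # independent per-task costs (invented)

def c(tasks):
    """Additive cost, like faxing to independent numbers."""
    return sum(PRICE[t] for t in tasks)

subsets = [frozenset(s) for n in range(4) for s in combinations("abc", n)]

# Modularity: c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y) for all pairs.
modular = all(c(x | y) == c(x) + c(y) - c(x & y)
              for x in subsets for y in subsets)

# Concavity: c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X) whenever X ⊆ Y.
concave = all(c(y | z) - c(y) <= c(x | z) - c(x)
              for x in subsets for y in subsets if x <= y
              for z in subsets)

print(modular, concave)   # True True
```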
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost that tasks Z add to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization of relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over A-or-N deals: "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected util: ((abc) : 1/2) = 5
• Lie: p = 1/2 (as the utility is the same)
• Expected util (for 1): ((abc) : 1/2) = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie)
[Figure: delivery graph for the example, with edge costs]
103
• FP2: in Subadditive TOD, any ONM over Mixed deals: every "phantom" lie has a positive probability of being discovered (as, if the other person delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals: no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals: no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – JUST LIKE IN THIS CASE. The lie makes no difference.
(I'm assuming we have some way of deciding who gets the better deal that is fair over time.)

Agent 1 | U(1) | Agent 2 | U(2) | Seeming U(2) (actual)
a       | 2    | bc      | 4    | 4
b       | 4    | ac      | 2    | 2
bc      | 2    | a       | 4    | 2
ab      | 0    | c       | 6    | 6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals: "phantom" lies can be beneficial
• Example from next slide: A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so utility for agent 1 is ((ab) : 1/2) = 1/2(4) + 1/2(2) = 3
• Lie: (b, ca) is the logical division, as there is no probability in a pure deal
• Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals: "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
88
How can we calculate Utility
bull Weighting each attribute
ndash Utility = Price60 + quality15 + support25
bull Ratingranking each attribute
ndash Price 1 quality 2 support 3
bull Using constraints on an attribute
ndash Price[5100] quality[0-10] support[1-5]
ndash Try to find the pareto optimum
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task Oriented Domainbull the cost of the union of sum of the costs of the separate
sets ndash adds to a sub-costbull for finite XY in T c(X U Y) lt= c(X) + c(Y))bull Example of subadditive
ndash Deliver to one saves distance to other (in a tree arrangement)
bull Example of subadditive TOD (= rather than lt)ndash deliver in opposite directions ndashdoing both saves nothing
bull Not subadditive doing both actually costs more than the sum of the pieces Say electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
bull We call producible phantom tasks decoy tasks (no risk of being discovered) Only unproducible phantom tasks are called phantom tasks
bull Example bull Need to pick something up at store (Can think
of something for them to pick up but if you are the one assigned you wonrsquot bother to make the trip)
bull Need to deliver empty letter (no good but deliverer wonrsquot discover lie)
92
Incentive compatible Mechanism
bull L there exists a beneficial lie in some encounterbull T There exists no beneficial liebull TP Truth is dominant if the penalty for lying is stiff
enough
93
Explanation of arrow
bull If it is never beneficial in a mixed deal encounter to use a phntom lie (with penalties) then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domainbull We have 2 tasks X and Y where X is a subset of Ybull Another set of task Z is introduced
ndash c(X U Z) - c(X) gt= c(Y U Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows go between domains); the rule at the start of an arrow is a fixed point
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is just a mixed deal in which one agent's share is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular domains encourage truth-telling more than the other classes
97
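The modular identity can be sketched with a toy fax-domain cost model, assuming each task faxes to its own destination and each connection has a fixed, independent cost (the names and costs here are invented):

```python
from itertools import combinations

conn_cost = {"x": 3, "y": 1, "z": 2}   # assumed per-destination connection costs

def cost(tasks):
    # Independent connection costs simply add up.
    return sum(conn_cost[t] for t in tasks)

subsets = [set(s) for r in range(len(conn_cost) + 1)
           for s in combinations(conn_cost, r)]

# Modularity: c(X U Y) = c(X) + c(Y) - c(X n Y) for every pair of subsets.
modular = all(cost(X | Y) == cost(X) + cost(Y) - cost(X & Y)
              for X in subsets for Y in subsets)
print("modular:", modular)   # independent costs satisfy inclusion-exclusion
```

Because the costs are independent per task, the identity is just inclusion-exclusion; a shared resource (like the common road in the parcel model) is exactly what breaks modularity down to mere subadditivity.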
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that a task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• We expect it to add more to the subset (as the subset is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon. Adding Z adds 0 to X (as the agent was going that way anyway) but adds 2 to its superset Y (as the agent was going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of characterization of relationships (implied relationships between cells; implied relationships within the same domain attribute)
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, under any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex.: A1 hides his letter to c; his utility does not increase
• If he tells the truth: p = 1/2, and his expected utility from the deal ((abc); 1/2) is 5
• If he lies: p = 1/2 (as the apparent utility is the same), but his expected utility (for agent 1) from ((abc); 1/2) is 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
103
• FP2: in a Subadditive TOD, under any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out)
• FP3: in a Concave TOD, under any ONM over Mixed deals, no "decoy" lie is beneficial (a smaller cost increase is assumed, so the probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, under any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1's tasks   U(1)   Agent 2's tasks   U(2) (seems)   U(2) (actual)
a                 2      bc                4              4
b                 4      ac                2              2
bc                2      a                 4              2
ab                0      c                 6              6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, under any ONM over Pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the expected utility for agent 1 under ((ab); 1/2) is 1/2(4) + 1/2(2) = 3
• Lie: (bc, a) is the logical division (pure deals have no probabilities); agent 1's utility is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, under any ONM over A-or-N deals, "decoy" lies can be beneficial (and are not harmful), as the decoy changes the probability: "if you deliver, I make you deliver to h as well"
• Ex. 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to about 1.72 (if A1 delivers, he does not actually deliver to h)
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, as p(-1) + (1-p)6 = p(4) + (1-p)(-3), i.e. 14p = 9
• If he invents task h, p = 11/18, as p(-3) + (1-p)6 = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)6 = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)6 = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
107
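The probabilities on this slide fall out of simple indifference equations, and the arithmetic can be reproduced exactly. This is a sketch: the helper `indifference_p` is my name, not part of the slides' formalism, and the utility numbers are the ones quoted above:

```python
from fractions import Fraction

# In an all-or-nothing deal, p is chosen so both sides of the slide's
# indifference equation are equal:
#   p*u1_all + (1-p)*u1_none == p*u2_all + (1-p)*u2_none
def indifference_p(u1_all, u1_none, u2_all, u2_none):
    return Fraction(u1_none - u2_none,
                    (u2_all - u2_none) - (u1_all - u1_none))

def u1(p, u_all, u_none):
    return p * u_all + (1 - p) * u_none

p_truth = indifference_p(-1, 6, 4, -3)    # truthful encounter -> 9/14
p_decoy = indifference_p(-3, 6, 4, -5)    # decoy task h invented -> 11/18
assert p_truth == Fraction(9, 14) and p_decoy == Fraction(11, 18)

# Agent 1's REAL utility uses his true payoffs (-1 and 6) in both cases.
print(u1(p_truth, -1, 6))   # 3/2
print(u1(p_decoy, -1, 6))   # 31/18 -> the decoy lie helped
```

Using `Fraction` keeps the 9/14 and 11/18 exact rather than rounding them through floats.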
Postmen – return to post office
(Figures: a concave example; a subadditive example where h is the decoy; a phantom example)
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, under any ONM over Pure deals, a "hide" lie can be beneficial (you think I have fewer tasks, so an increased load appears to cost me more than it really does)
• Ex. 3 (from next slide): A1 hides his letter to node b
• Deal (e, b): utility for A1 (under the lie) is 0, utility for A2 (under the lie) is 4 – unfair under the lie
• Deal (b, e): utility for A1 (under the lie) is 2, utility for A2 (under the lie) is 2
• So I get sent to b – but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, under any ONM over Mixed deals, "hide" lies can be beneficial
• Ex. 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((fae, bcd); 1/2) = 4 (each saves going to two nodes)
• Under the lie, dividing as ((efd, cab); p) means you always win and I always lose; since the work is the same, swapping cannot help – in a mixed deal the choices must be unbalanced
• Try again under the lie with ((ab, cdef); p):
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6, so p = 3/4
• A1's utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I am assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places – which is what I was assigned originally. Zero utility in that case.
110
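The FP8 numbers can be checked the same way; a small sketch of the arithmetic, using the apparent utilities (4 and 0) to balance the deal and the real utilities (6 and 0) to score it:

```python
from fractions import Fraction

# Agent 1 hides his letter to node a; the deal ((ab, cdef); p) is balanced
# using his APPARENT utilities (4 and 0), while his REAL utilities are 6 and 0.
# Indifference: p*4 + (1-p)*0 == p*2 + (1-p)*6  ->  8p = 6
p = Fraction(6, 8)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6

real_utility = p * 6 + (1 - p) * 0    # real gain when assigned (ab)
print(real_utility)                   # 9/2 = 4.5 > 4, so hiding paid off
```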
Modular
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Changing the importance attached to an issue
  – Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30 more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, and so on. By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
(Figure: worth function for the first agent – worth 0 at 9:00, rising to 100 at 16:00, with 12:00 marked in between.)
Ref: Rosenschein & Zlotkin, 1994
122
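A sketch of such a worth function, assuming a linear rise between the endpoints shown in the figure (the exact shape is not given on the slide, so the interpolation is an assumption):

```python
# Hypothetical worth function for the first agent: 0 at 9:00, rising
# linearly to 100 at 16:00 (linearity is assumed for illustration).
def worth_first_agent(hour):
    if hour <= 9:
        return 0.0
    if hour >= 16:
        return 100.0
    return 100.0 * (hour - 9) / (16 - 9)

for h in (9, 12, 15, 16):
    print(h, round(worth_first_agent(h), 1))
```

This matches the slide's description: the worth peaks at 16:00 and shrinks progressively for 15:00, 14:00, and earlier slots.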
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's curves decline over time and meet at the point of acceptance.)
123
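The convergence picture can be caricatured in a few lines. This is an illustrative toy only – the fixed concession size and the joint-surplus threshold are invented, and it is not the MCP/Zeuthen formalism from earlier slides:

```python
# Toy concession loop: each agent starts from its best offer and concedes
# a fixed amount per round; they agree once their combined demands fit the
# assumed available joint surplus of 10.
def negotiate(u_i, u_j, concession=1.0):
    """u_i, u_j: utility each agent's current offer gives ITSELF."""
    rounds = 0
    while u_i + u_j > 10:
        u_i -= concession   # agent i concedes
        u_j -= concession   # agent j concedes
        rounds += 1
    return rounds, (u_i, u_j)

print(negotiate(9.0, 7.0))   # demands shrink until they are compatible
```

If the demands never shrink (no concession), the loop never exits – the "no agreement" case on the next slide.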
Utility Graphs - no agreement
• No agreement is reached: Agent j finds the offer unacceptable
(Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's curves decline over time but never meet.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• The basic form of an argument over a Database is a pair (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    • Grounds ⊆ Database
    • Sentence can be proved from Grounds
(we give reasons for our conclusions)
127
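A minimal sketch of checking that (Sentence, Grounds) forms an argument, assuming sentences are encoded as Horn-style rules (the encoding, names, and example facts are invented for illustration; this is not the slides' logical formalism):

```python
# A rule is (conclusion, premises); a fact is a rule with no premises.

def provable(sentence, rules):
    """Forward-chain over Horn-style rules until a fixed point."""
    known = set()
    changed = True
    while changed:
        changed = False
        for concl, premises in rules:
            if concl not in known and all(p in known for p in premises):
                known.add(concl)
                changed = True
    return sentence in known

def is_argument(sentence, grounds, database):
    # Grounds must come from the database, and must prove the sentence.
    return set(grounds) <= set(database) and provable(sentence, grounds)

database = [
    ("milk_good", ()),                                   # fact
    ("cheese_from_milk", ()),                            # fact
    ("cheese_good", ("milk_good", "cheese_from_milk")),  # rule
    ("milk_bad", ()),           # the database may be inconsistent
]
print(is_argument("cheese_good", database[:3], database))
```

Note the database happily contains both "milk_good" and "milk_bad"; an argument only commits to the grounds it actually uses, which is what the attack relations on the next slides exploit.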
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B = A u B or A r B
  – A defeats B = A u B or (A r B and not B u A)
  – A strongly attacks B = A a B and not B u A
  – A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument system (rather than the internals of individual arguments)
• Writing x → y indicates:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is true unless proven false
131
Admissible Arguments – mutually defensible
1. Argument x is attacked (with respect to a set S) if some y attacks x (y → x) and no member of S attacks y
2. Argument x is acceptable (with respect to S) if every attacker of x is itself attacked by S
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
(Figure: an attack graph over arguments a, b, c, d.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
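These definitions are easy to check mechanically. A sketch, with an assumed attack graph reconstructed to match the description above (a and b attack each other, both attack c, and nothing attacks d – the slide's actual figure is not available):

```python
from itertools import combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}  # assumed graph

def conflict_free(S):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    # Every attacker of x is itself attacked by some member of S.
    return all(any((z, y) in attacks for z in S)
               for (y, t) in attacks if t == x)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

sets = [set(c) for r in range(len(args) + 1)
        for c in combinations(sorted(args), r)]
for S in filter(admissible, sets):
    print(sorted(S))
```

On this graph the admissible sets are {}, {a}, {b}, {d}, {a, d}, and {b, d}: c appears in none of them (it is always attacked and never defensible), while d appears freely (it has no attackers, so it is always acceptable).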
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove loser, winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms, Protocols, Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanism Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession-based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single-issue negotiation
- Multiple-issue negotiation
- How many agents are involved?
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home – like U-Haul)
- Dominant Deals
- Negotiation Set: Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individually Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction, move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a, agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to distribution point)
- Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement?
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly?
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – each is helped by joint plan
- Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
- Choices
- Compromise continued
- Example: conflict
- Example: semi-cooperative
- Example: semi-cooperative, cont.
- Negotiation Domains: Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility?
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes – Modularity
- 3-dimensional table of Characterization of Relationships
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non-incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise: Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods: Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods: Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs – convergence
- Utility Graphs – no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition: Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
89
Incomplete Information
bull Donrsquot know tasks of others in TODbull Solution
ndash Exchange missing informationndash Penalty for lie
bull Possible liesndash False information
bull Hiding lettersbull Phantom letters
ndash Not carry out a commitment
90
Subadditive Task Oriented Domainbull the cost of the union of sum of the costs of the separate
sets ndash adds to a sub-costbull for finite XY in T c(X U Y) lt= c(X) + c(Y))bull Example of subadditive
ndash Deliver to one saves distance to other (in a tree arrangement)
bull Example of subadditive TOD (= rather than lt)ndash deliver in opposite directions ndashdoing both saves nothing
bull Not subadditive doing both actually costs more than the sum of the pieces Say electrical power costs where I get above a threshold and have to buy new equipment
91
Decoy task
bull We call producible phantom tasks decoy tasks (no risk of being discovered) Only unproducible phantom tasks are called phantom tasks
bull Example bull Need to pick something up at store (Can think
of something for them to pick up but if you are the one assigned you wonrsquot bother to make the trip)
bull Need to deliver empty letter (no good but deliverer wonrsquot discover lie)
92
Incentive compatible Mechanism
bull L there exists a beneficial lie in some encounterbull T There exists no beneficial liebull TP Truth is dominant if the penalty for lying is stiff
enough
93
Explanation of arrow
bull If it is never beneficial in a mixed deal encounter to use a phntom lie (with penalties) then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domainbull We have 2 tasks X and Y where X is a subset of Ybull Another set of task Z is introduced
ndash c(X U Z) - c(X) gt= c(Y U Z) - c(Y)
95
Tentative Explanation of Previous Chart
bull I think Arrows show reasons we know this fact (diagonal arrows are between domains) Rule beginning is a fixed point
bull For example What is true of a phantom task may be true for a decoy task in same domain as a phantom is just a decoy task we donrsquot have to create
bull Similarly what is true for a mixed deal may be true for an all or nothing deal (in the same domain) as a mixed deal is an all or nothing deal where one choice is empty The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – One direction, move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a, agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – each is helped by joint plan
- Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
- Choices
- Compromise continued
- Example: conflict
- Example: semi-cooperative
- Example: semi-cooperative, cont.
- Negotiation Domains: Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
90
Subadditive Task Oriented Domain
• The cost of the union is at most the sum of the costs of the separate sets – combining tasks adds a "sub" cost
• For finite X, Y in T: c(X ∪ Y) <= c(X) + c(Y)
• Example of subadditive: delivering to one saves distance to the other (in a tree arrangement)
• Example of subadditive TOD (= rather than <): deliveries in opposite directions – doing both saves nothing
• Not subadditive: doing both actually costs more than the sum of the pieces, e.g. electrical power costs where usage above a threshold forces buying new equipment
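Subadditivity can be checked exhaustively for a small assumed cost function. The sketch below models a star-shaped (tree) delivery domain where the agent travels each needed branch out and back; the branch names and lengths are made up.

```python
from itertools import combinations

BRANCH = {"a": 1, "b": 2, "c": 3}  # assumed branch lengths from the depot

def cost(tasks):
    # visit each needed branch, going out and back
    return 2 * sum(BRANCH[t] for t in set(tasks))

subsets = [frozenset(s) for r in range(4) for s in combinations("abc", r)]
# subadditivity: c(X ∪ Y) <= c(X) + c(Y) for every pair of task sets
assert all(cost(X | Y) <= cost(X) + cost(Y) for X in subsets for Y in subsets)
```

On this tree the inequality is actually an equality for disjoint branches; sharing a branch is what makes the union strictly cheaper.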
91
Decoy task
• We call producible phantom tasks decoy tasks (no risk of being discovered). Only unproducible phantom tasks are called phantom tasks
• Examples:
• Need to pick something up at the store (can think of something for them to pick up, but if you are the one assigned, you won't bother to make the trip)
• Need to deliver an empty letter (no good, but the deliverer won't discover the lie)
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing mixed deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
  – c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y)
95
Tentative Explanation of Previous Chart
• I think arrows show reasons we know each fact (diagonal arrows are between domains). Each rule's beginning is a fixed point
• For example: what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
• The relationships can also go between domains, as sub-additive is a superclass of concave and a superclass of modular
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
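A modular cost function can be sketched in the spirit of the fax domain: each destination has an independent connection cost, so a shared destination is only paid for once. The destinations and costs below are assumed for illustration.

```python
CONNECT = {"x": 3, "y": 5, "z": 2}  # assumed per-destination connection costs

def cost(tasks):
    # independent per-destination costs simply add
    return sum(CONNECT[t] for t in tasks)

X, Y = frozenset("xy"), frozenset("yz")
# modularity: c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
assert cost(X | Y) == cost(X) + cost(Y) - cost(X & Y)  # 10 == 8 + 7 - 5
```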
97
For subadditive domain
98
Attributes of task system – Concavity
• c(Y ∪ Z) – c(Y) <= c(X ∪ Z) – c(X), for X a subset of Y
• The cost task set Z adds to the set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At seats – is the postmen domain concave? (no, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop)
• Concavity implies sub-additivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes – Modularity
• c(X ∪ Y) = c(X) + c(Y) – c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
101
3-dimensional table of Characterization of Relationship: implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides letter to c; his utility doesn't increase
• If he tells truth: p = 1/2; expected util of deal (abc) : 1/2 is 5
• Lie: p = 1/2 (as the utility appears the same); expected util (for 1) of (abc) : 1/2 is ½(0) + ½(2) = 1 (as he still has to deliver the lie)
[Figure: delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (as if the other person delivers the phantom, you are found out)
• FP3: in Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (as less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under lie – benefits are as shown (the apparent benefit is no different than the real benefit)
Under truth: the utilities are 4/2, and someone has to get the better deal (under a pure deal) – JUST LIKE IN THIS CASE. The lie makes no difference
I'm assuming we have some way of deciding who gets the better deal that is fair over time

1    U(1)   2    U(2) (seems)   U(2) (actual)
a    2      bc   4              4
b    4      ac   2              2
bc   2      a    4              2
ab   0      c    6              6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial
• Example from next slide: A1 creates a phantom letter at node c; his utility has risen from 3 to 4
• Truth: p = ½, so utility for agent 1 of (ab) : ½ is ½(4) + ½(2) = 3
• Lie: (bca) is the logical division, as no probability is involved. Util for agent 1 is 6 (original cost) – 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, for any ONM over A-or-N deals, "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells truth: p (of agent 1 delivering all) = 9/14, as p(-1) + (1-p)6 = p(4) + (1-p)(-3), i.e. 14p = 9
• If he invents task h: p = 11/18, as p(-3) + (1-p)6 = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)6 = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)6 = -11/18 + 42/18 = 31/18 ≈ 1.72
• SO – lying helped
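The two indifference points above can be reproduced with exact arithmetic. The helper below just solves p·a + (1−p)·b = p·c + (1−p)·d for p; the payoff values are the ones on the slide.

```python
from fractions import Fraction as F

def indifference_p(a, b, c, d):
    # p*a + (1-p)*b == p*c + (1-p)*d  =>  p = (d - b) / ((a - b) - (c - d))
    return F(d - b, (a - b) - (c - d))

p_truth = indifference_p(-1, 6, 4, -3)  # 9/14
p_lie = indifference_p(-3, 6, 4, -5)    # 11/18

# agent 1's TRUE payoff is -1 when he delivers everything, 6 otherwise
u_truth = p_truth * -1 + (1 - p_truth) * 6  # 21/14 = 1.5
u_lie = p_lie * -1 + (1 - p_lie) * 6        # 31/18 ≈ 1.72
assert u_lie > u_truth  # the decoy lie helped
```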
107
Postmen – return to post office
[Figure: delivery graphs labeled Concave, Subadditive (h is decoy), and Phantom]
108
Non incentive compatible fixed points
• FP7: in Modular TOD, for any ONM over Pure deals, "hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex3 (from next slide): A1 hides his letter to node b
• (e, b): utility for A1 (under lie) is 0; utility for A2 (under lie) is 4 – UNFAIR (under lie)
• (b, e): utility for A1 (under lie) is 2; utility for A2 (under lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial
• Ex4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: util of deal (faebcd) : ½ is 4 (save going to two)
• Under lie: divide as (efdcab) : p (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced
• Try again under lie: (abcdef) : p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6, so p = 3/4
• Utility is actually ¾(6) + ¼(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going 5 places (which is what I was assigned originally) – zero utility there
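The p = 3/4 step above, checked with exact fractions (payoffs as on the slide):

```python
from fractions import Fraction as F

# solve p*4 + (1-p)*0 == p*2 + (1-p)*6:  4p = 6 - 4p  =>  p = 3/4
p = F(3, 4)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6

# agent 1's actual payoff under the lie: 6 when he wins the toss, 0 when he
# loses (he must still make the hidden delivery)
u_lie = p * 6 + (1 - p) * 0
assert u_lie == F(9, 2)  # 4.5 > 4, the truthful utility
```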
110
Modular
111
Conclusion
– In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of Multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposal
  – Generation of counter-proposal based on feedback from dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate solution for applicability to current case
  – Modify the solution appropriately
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Repair for past failure is reused; reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
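A minimal sketch of the multi-attribute utility idea: per-issue utility curves combined into one overall value by importance weights. The issues, weights, and curves below are invented for illustration and are not PERSUADER's actual model.

```python
WEIGHTS = {"wage": 0.6, "pension": 0.4}  # assumed importance weights, sum to 1

def utility(offer, curves):
    # weighted sum of per-issue utilities
    return sum(WEIGHTS[i] * curves[i](offer[i]) for i in WEIGHTS)

# toy per-issue curves for one party: higher wage/pension is better
union_curves = {"wage": lambda w: w / 10.0, "pension": lambda p: p / 5.0}
offer = {"wage": 8, "pension": 3}
print(utility(offer, union_curves))  # ≈ 0.72
```

Raising a weight models "change importance attached to an issue"; reshaping a curve models "changing utility value of an issue".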
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change importance attached to an issue
  – Change utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases payoff of rejecting party by a greater amount than it reduces payoff for agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
2 agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth which may have other payoffs (e.g., reduced travel costs)
[Figure: worth function for the first agent – worth from 0 to 100 over meeting times 9, 12, 16]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs – convergence
• Each agent concedes in every round of negotiation
• Eventually reach an agreement
[Figure: utility vs. number of negotiations over time; Agent i's and Agent j's curves meet at the point of acceptance]
123
Utility Graphs – no agreement
• No agreement
• Agent j finds offer unacceptable
[Figure: utility vs. number of negotiations over time for Agent i and Agent j; no point of acceptance]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A and accept A implies B, then you must accept that B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
where:
  Database is a (possibly inconsistent) set of logical formulae
  Sentence is a logical formula known as the conclusion
  Grounds is a set of logical formulae such that:
    grounds ⊆ database
    sentence can be proved from grounds
(we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  – A attacks B = A u B or A r B
  – A defeats B = A u B or (A r B and not B u A)
  – A strongly attacks B = A a B and not B u A
  – A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
  Undercuts          = u
  Strongly undercuts = su = u − u⁻¹
  Strongly attacks   = sa = (u ∪ r) − u⁻¹
  Defeats            = d = u ∪ (r − u⁻¹)
  Attacks            = a = u ∪ r
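The proposition's containments are easy to verify mechanically if relations are modeled as sets of ordered pairs; the u (undercut) and r (rebut) relations below are invented toy data.

```python
u = {("a", "b"), ("c", "d")}  # undercuts
r = {("b", "a"), ("d", "e")}  # rebuts

def inv(R):
    # the inverse relation R⁻¹
    return {(y, x) for (x, y) in R}

attacks = u | r                      # a  = u ∪ r
defeats = u | (r - inv(u))           # d  = u ∪ (r − u⁻¹)
strongly_attacks = (u | r) - inv(u)  # sa = (u ∪ r) − u⁻¹
strongly_undercuts = u - inv(u)      # su = u − u⁻¹

# the hierarchy: each notion is contained in the next weaker one
assert strongly_undercuts <= strongly_attacks <= defeats <= attacks
```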
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
92
Incentive compatible Mechanism
• L: there exists a beneficial lie in some encounter
• T: there exists no beneficial lie
• T/P: truth is dominant if the penalty for lying is stiff enough
93
Explanation of arrow
• If it is never beneficial in a mixed deal encounter to use a phantom lie (with penalties), then it is certainly never beneficial to do so in an all-or-nothing deal encounter (which is just a subset of the mixed deal encounters)
94
Concave Task Oriented Domain
• We have 2 task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
– c(X ∪ Z) - c(X) ≥ c(Y ∪ Z) - c(Y)
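The concavity condition above can be checked by brute force over all subsets; a minimal sketch with a hypothetical cost function (the task names and cost are invented for illustration):

```python
from itertools import combinations

def is_concave(cost, tasks):
    """Check c(X ∪ Z) - c(X) >= c(Y ∪ Z) - c(Y) for all X ⊆ Y and all Z."""
    subsets = [frozenset(s) for r in range(len(tasks) + 1)
               for s in combinations(sorted(tasks), r)]
    return all(cost(X | Z) - cost(X) >= cost(Y | Z) - cost(Y)
               for Y in subsets for X in subsets if X <= Y
               for Z in subsets)

# Hypothetical cost: number of distinct tasks (modular, hence concave).
count_cost = len
print(is_concave(count_cost, {"a", "b", "c"}))  # True
```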
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows go between domains). Each rule's beginning is a fixed point.
• For example, what is true of a phantom task may be true of a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as an all-or-nothing deal is a mixed deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lies (sometimes help).
• The relationships can also go between domains, as sub-additive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system - Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X)
• The cost that task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• Expect it to add more to the subset (as it is smaller)
• At your seats: is the postmen domain concave? (No, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies sub-additivity
• Modularity implies concavity
99
Examples of task systems
Database Queries
• Agents have access to a common DB and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries

The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• The agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
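A small sketch of the modularity identity with a hypothetical fax-like cost function, where each destination has an independent connection charge (consistent with "costs are independent" above):

```python
# Hypothetical fax-domain cost: one fixed connection charge per destination.
PRICE = {"a": 3, "b": 1, "c": 2}

def cost(dests):
    return sum(PRICE[d] for d in dests)

X, Y = {"a", "b"}, {"b", "c"}
# Modularity: c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
lhs = cost(X | Y)                      # 3 + 1 + 2 = 6
rhs = cost(X) + cost(Y) - cost(X & Y)  # 4 + 3 - 1 = 6
print(lhs == rhs)  # True
```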
101
3-dimensional table of Characterization of Relationship – implied relationships between cells, and implied relationships with the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over A-or-N deals: "hiding" lies are not beneficial
• Ex: A1 hides his letter to c – his utility doesn't increase
• If he tells the truth: p = 1/2
• Expected util: (abc) : 1/2 = 5
• Lie: p = 1/2 (as utility is the same)
• Expected util (for 1): (abc) : 1/2 = 1/2(0) + 1/2(2) = 1 (as he has to deliver the lie)
[figure: delivery graph with edge costs 1, 4, 4, 1]
103
• FP2: in Subadditive TOD, any ONM over Mixed deals: every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals: no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals: no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under the lie, the benefits are as shown
(the apparent benefit is no different from the real benefit)
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – JUST LIKE IN THIS CASE. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
1    U(1)   2    U(2) (seems)   U(2) (actual)
a    2      bc   4              4
b    4      ac   2              2
bc   2      a    4              2
ab   0      c    6              6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals: "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so utility for agent 1 is (ab) : 1/2 = 1/2(4) + 1/2(2) = 3
• Lie: ⟨(bc), (a)⟩ is the logical division, as there is no percentage
• Utility for agent 1 is 6 (original cost) - 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over A-or-N deals: "decoy" lies can be beneficial (not harmful) (as it changes the probability: if you deliver, I make you deliver to h)
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b,c is worse for agent 1 than it is); his utility has risen from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as:
p(-1) + (1-p)6 = p(4) + (1-p)(-3), so 14p = 9
• If he invents task h, p = 11/18, as:
p(-3) + (1-p)6 = p(4) + (1-p)(-5)
• Utility(p = 9/14) is p(-1) + (1-p)6 = -9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(-1) + (1-p)6 = -11/18 + 42/18 = 31/18 ≈ 1.72
• So – lying helped
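The indifference equations above can be checked mechanically with exact rational arithmetic; a small sketch:

```python
from fractions import Fraction

def solve_p(a1_deliver, a1_other, a2_deliver, a2_other):
    """Solve p*a1_deliver + (1-p)*a1_other == p*a2_deliver + (1-p)*a2_other."""
    # p(a1_deliver - a1_other - a2_deliver + a2_other) = a2_other - a1_other
    num = a2_other - a1_other
    den = a1_deliver - a1_other - a2_deliver + a2_other
    return Fraction(num, den)

# Truth: p(-1) + (1-p)6 = p(4) + (1-p)(-3)
p_truth = solve_p(-1, 6, 4, -3)       # 9/14
# Decoy lie: p(-3) + (1-p)6 = p(4) + (1-p)(-5)
p_lie = solve_p(-3, 6, 4, -5)         # 11/18
u = lambda p: p * (-1) + (1 - p) * 6  # agent 1's true utility at probability p
print(p_truth, u(p_truth))            # 9/14 3/2
print(p_lie, u(p_lie))                # 11/18 31/18
```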
107
Postmen ndash return to postoffice
[figures: postman graphs for the Concave, Subadditive (h is the decoy), and Phantom examples]
108
Non incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals: "hide" lies can be beneficial (as you think I have less, so the increased load will cost more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• (e)(b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR (under the lie)
• (b)(e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals: "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util ⟨(fae), (bcd)⟩ : 1/2 = 4 (saves going to two)
• Under the lie, divide as ⟨(efd), (cab)⟩ : p – you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie: ⟨(ab), (cdef)⟩ : p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• Utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places (which is what I was assigned originally) – zero utility from that
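A quick check of the Ex 4 arithmetic (the probability is chosen to equalize the two agents' expected utilities under the lie):

```python
from fractions import Fraction

# Mixed deal ((ab),(cdef)) : p from Ex 4 - choose p so both agents have
# equal expected utility under the hidden-letter lie:
#   p*4 + (1-p)*0 = p*2 + (1-p)*6  =>  8p = 6
p = Fraction(6, 8)            # 3/4
u_lie = p * 6 + (1 - p) * 0   # A1's true utility: wins 6 with prob 3/4
print(p, float(u_lie))        # 3/4 4.5
```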
110
[figure: postman graph for the Modular domain examples]
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate solution for applicability to current case
– Modify the solution appropriately
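The retrieve–select–adapt process above can be sketched as follows (the case structure and similarity measure are illustrative assumptions, not PERSUADER's actual representation):

```python
from dataclasses import dataclass

@dataclass
class Case:
    issues: dict    # e.g. {"wage_increase": 4.0, "vacation_days": 12}
    solution: dict  # the settlement that worked in this precedent

def similarity(a: dict, b: dict) -> int:
    """Toy similarity: number of shared issue names (a stand-in for
    retrieval by conceptual similarity)."""
    return len(a.keys() & b.keys())

def cbr_negotiate(problem: dict, memory: list) -> dict:
    # 1-2. retrieve precedents and select the most similar case
    best = max(memory, key=lambda c: similarity(problem, c.issues))
    # 3-5. construct a solution from the precedent, then adapt it by
    # carrying over issues the precedent did not cover
    solution = dict(best.solution)
    for issue, value in problem.items():
        solution.setdefault(issue, value)
    return solution

memory = [Case({"wage_increase": 3.0}, {"wage_increase": 3.5}),
          Case({"wage_increase": 4.0, "vacation_days": 10},
               {"wage_increase": 3.8, "vacation_days": 12})]
print(cbr_negotiate({"wage_increase": 5.0, "vacation_days": 15}, memory))
```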
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures: intentional reminding
– Repair for past failure is used: reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
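A minimal sketch of the additive multi-attribute combination implied by "gets an overall utility curve out of individual ones" (the weights and per-issue curves are invented for illustration):

```python
# Additive multi-attribute utility: overall utility is a weighted sum
# of the individual per-issue utility curves (weights sum to 1).
weights = {"wage_increase": 0.7, "vacation_days": 0.3}
curves = {
    "wage_increase": lambda pct: min(pct / 5.0, 1.0),    # saturates at 5%
    "vacation_days": lambda days: min(days / 20.0, 1.0)  # saturates at 20 days
}

def overall_utility(offer: dict) -> float:
    return sum(weights[i] * curves[i](v) for i, v in offer.items())

offer = {"wage_increase": 4.0, "vacation_days": 10}
print(overall_utility(offer))  # 0.7*0.8 + 0.3*0.5 ≈ 0.71
```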
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change importance attached to an issue
– Change utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases payoff of rejecting party by a greater amount than it reduces payoff for agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
Worth function for first agent:
[graph: worth from 0 to 100 over meeting times 9:00, 12:00, 16:00]
Ref: Rosenschein & Zlotkin, 1994
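The first agent's worth function can be sketched as a simple linear ramp; the exact shape is an assumption (the slide only shows worth rising from 0 to 100 between 9:00 and 16:00):

```python
def worth_agent1(hour: float) -> float:
    """Worth of a meeting at `hour` for the late-preferring agent:
    0 at 9:00, rising linearly to 100 at 16:00 (illustrative shape)."""
    if not 9 <= hour <= 16:
        return 0.0
    return 100.0 * (hour - 9) / (16 - 9)

for h in (9, 12, 16):
    print(h, round(worth_agent1(h), 1))
```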
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[graph: utility of Agenti and Agentj over time (no. of negotiations), converging to the point of acceptance]
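The converging pattern can be simulated; a toy sketch (pie size and concession steps are invented) in which each agent concedes a fixed amount per round until the demands are jointly feasible:

```python
def negotiate(pie=10.0, step_i=1.0, step_j=1.0):
    """Each agent starts by demanding the whole pie for itself and
    concedes `step` per round; they agree once the two demands are
    jointly feasible (their sum no longer exceeds the pie)."""
    demand_i = demand_j = pie
    rounds = 0
    while demand_i + demand_j > pie:
        demand_i -= step_i
        demand_j -= step_j
        rounds += 1
    return rounds, demand_i, demand_j

print(negotiate())  # (5, 5.0, 5.0): equal steps meet in the middle
```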
123
Utility Graphs - no agreement
• No agreement
• Agentj finds the offer unacceptable
[graph: utility of Agenti and Agentj over time (no. of negotiations) – the curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified. Why did the agent pay so much for the car?
– Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• 4 modes of argument (Gilbert 1994):
1. Logical: "If you accept A and accept that A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive – doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
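A toy sketch of this (Grounds, Sentence) structure over propositional Horn clauses; the clause encoding and the forward-chaining prover are illustrative assumptions:

```python
# An argument is (grounds, conclusion) over a database of Horn clauses;
# each clause is (premises, consequent), and facts have no premises.
def entails(clauses, goal):
    """Simple forward chaining over Horn clauses."""
    known = set()
    changed = True
    while changed:
        changed = False
        for premises, consequent in clauses:
            if consequent not in known and all(p in known for p in premises):
                known.add(consequent)
                changed = True
    return goal in known

def is_argument(database, grounds, sentence):
    # Grounds must come from the database and must prove the sentence.
    return set(grounds) <= set(database) and entails(grounds, sentence)

db = [((), "milk_good"), (("milk_good",), "cheese_good"), ((), "cheese_bad")]
arg = [((), "milk_good"), (("milk_good",), "cheese_good")]
print(is_argument(db, arg, "cheese_good"))  # True
```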
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
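The hierarchy can be checked by computing the relations as sets of ordered pairs; the undercut and rebut relations below are hypothetical:

```python
# Attack relations as sets of ordered pairs (X, Y), read "X ... Y".
u = {("a", "b"), ("b", "c")}   # undercuts (hypothetical)
r = {("c", "a"), ("a", "b")}   # rebuts (hypothetical)

inv_u = {(y, x) for x, y in u}  # u⁻¹, the inverse undercut relation

attacks = u | r                      # a  = u ∪ r
defeats = u | (r - inv_u)            # d  = u ∪ (r - u⁻¹)
strongly_attacks = (u | r) - inv_u   # sa = (u ∪ r) - u⁻¹
strongly_undercuts = u - inv_u       # su = u - u⁻¹

# The hierarchy: each stronger notion is contained in the weaker ones.
print(strongly_undercuts <= defeats <= attacks)  # True
```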
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked if there is some y with y → x that no member attacks
2. argument x is acceptable if every attacker of x is attacked
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each argument is acceptable (any attackers are attacked)
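A brute-force sketch of these definitions (the attack graph is a hypothetical example, not necessarily the one in the slide's figure):

```python
from itertools import combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c")}  # hypothetical

def attackers(x):
    return {y for (y, t) in attacks if t == x}

def conflict_free(S):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    """Every attacker of x is itself attacked by some member of S."""
    return all(any((z, y) in attacks for z in S) for y in attackers(x))

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

subsets = [set(c) for r in range(len(args) + 1)
           for c in combinations(sorted(args), r)]
print([sorted(S) for S in subsets if admissible(S)])
# [[], ['a'], ['b'], ['d'], ['a', 'd'], ['b', 'd']]
```

With this invented graph, c appears in no admissible set while d belongs to every maximal one, mirroring the note that c is always attacked and d is always acceptable.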
132
[figure: argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
94
Concave Task Oriented Domain
• We have two task sets X and Y, where X is a subset of Y
• Another set of tasks Z is introduced
– c(X ∪ Z) − c(X) ≥ c(Y ∪ Z) − c(Y)
95
Tentative Explanation of Previous Chart
• I think the arrows show the reasons we know each fact (diagonal arrows go between domains). Each rule begins at a fixed point.
• For example, what is true of a phantom task may be true for a decoy task in the same domain, as a phantom is just a decoy task we don't have to create.
• Similarly, what is true for a mixed deal may be true for an all-or-nothing deal (in the same domain), as a mixed deal is an all-or-nothing deal where one choice is empty. The direction of the relationship may depend on truth (never helps) or lie (sometimes helps).
• The relationships can also go between domains, as subadditive is a superclass of concave, which in turn is a superclass of modular.
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• Notice modular encourages truth telling more than the others
97
For subadditive domain
98
Attributes of task system - Concavity
• c(Y ∪ Z) − c(Y) ≤ c(X ∪ Z) − c(X)
• The cost that tasks Z add to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• We expect Z to add more to the subset (as it is smaller)
• At seats – is the postmen domain concave? (no, unless restricted to trees)
Example: Y is all shaded/blue nodes; X is the nodes in the polygon.
Adding Z adds 0 to X (as we were going that way anyway) but adds 2 to its superset Y (as we were going around the loop).
• Concavity implies subadditivity
• Modularity implies concavity
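The concavity condition can be checked mechanically for small task systems. A minimal sketch, assuming an illustrative cost of one unit per distinct destination (a stand-in, not one of the lecture's domains):

```python
from itertools import chain, combinations

def subsets(tasks):
    """All subsets of a finite task set, as frozensets."""
    s = list(tasks)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

def is_concave(tasks, cost):
    """Check c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X) for all X ⊆ Y and all Z."""
    for X in subsets(tasks):
        for Y in subsets(tasks):
            if not X <= Y:
                continue
            for Z in subsets(tasks):
                if cost(Y | Z) - cost(Y) > cost(X | Z) - cost(X):
                    return False
    return True

# One unit of cost per distinct destination: modular, hence concave
print(is_concave({'a', 'b', 'c'}, len))  # True
```

The triple loop over subsets is exponential, so this is only practical for toy task sets.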
99
Examples of task systems
Database Queries
• Agents have access to a common DB and each has to carry out a set of queries
• Agents can exchange results of queries and sub-queries
The Fax Domain
• Agents are sending faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• Agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) − c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as costs are independent)
• Modularity implies concavity
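The modularity identity can be verified the same way. A small sketch; the per-destination cost mimics the fax domain's independent costs, and the discounted cost is an invented counterexample:

```python
from itertools import chain, combinations

def subsets(tasks):
    """All subsets of a finite task set, as frozensets."""
    s = list(tasks)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

def is_modular(tasks, cost):
    """Check c(X ∪ Y) == c(X) + c(Y) - c(X ∩ Y) for every pair of subsets."""
    return all(cost(X | Y) == cost(X) + cost(Y) - cost(X & Y)
               for X in subsets(tasks) for Y in subsets(tasks))

# Independent per-destination costs (fax-domain style): modular
print(is_modular({'a', 'b', 'c'}, len))  # True
# A cost with a "shared trip" discount breaks modularity
print(is_modular({'a', 'b'}, lambda s: {0: 0, 1: 2, 2: 3}[len(s)]))  # False
```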
101
3-dimensional table of Characterization of Relationship: implied relationship between cells; implied relationship with same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, any Optimal Negotiation Mechanism (ONM) over All-or-Nothing deals, "hiding" lies are not beneficial
• Ex: A1 hides a letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2
• Expected utility of the deal (abc) at p = 1/2 is 5
• Under the lie, p = 1/2 (as the declared utility is the same)
• Expected utility (for 1) of (abc) at p = 1/2 is 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
(Figure: delivery graph; edge costs 1, 4, 4, 1.)
103
• FP2: in Subadditive TOD, any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent delivers the phantom, you are found out)
• FP3: in Concave TOD, any ONM over Mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are shown below (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4/2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.

1    U(1)   2    U(2) seems   U(2) (act)
a    2      bc   4            4
b    4      ac   2            2
bc   2      a    4            2
ab   0      c    6            6
105
Non-incentive compatible fixed points
• FP5: in Concave TOD, any ONM over Pure deals, "phantom" lies can be beneficial
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so utility for agent 1 is (ab) at 1/2 = 1/2(4) + 1/2(2) = 3
• Under the lie, (bca) is the logical division, as there is no probability split
• Utility for agent 1 is 6 (original cost) − 2 (deal cost) = 4
106
• FP6: in Subadditive TOD, any ONM over All-or-Nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability (if you deliver, I make you deliver to h)
• Ex 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72 (if I deliver, I don't deliver h)
• If he tells the truth, p (of agent 1 delivering all) = 9/14, as
p(−1) + (1−p)(6) = p(4) + (1−p)(−3), so 14p = 9
• If he invents task h, p = 11/18, as
p(−3) + (1−p)(6) = p(4) + (1−p)(−5)
• Utility(p = 9/14) is p(−1) + (1−p)(6) = −9/14 + 30/14 = 21/14 = 1.5
• Utility(p = 11/18) is p(−1) + (1−p)(6) = −11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
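The probabilities in FP6 come from equating the two agents' expected utilities in the all-or-nothing deal. A sketch of that linear solve with exact arithmetic; the utility numbers are the slide's, the function names are mine:

```python
from fractions import Fraction

def equalizing_p(u1_deliver, u1_free, u2_free, u2_deliver):
    """Solve p*u1_deliver + (1-p)*u1_free = p*u2_free + (1-p)*u2_deliver
    for p, the probability that agent 1 delivers everything."""
    num = u2_deliver - u1_free
    den = (u1_deliver - u1_free) + (u2_deliver - u2_free)
    return Fraction(num, den)

def u(p):
    """Agent 1's actual expected utility, using his true costs (-1 and 6)."""
    return p * (-1) + (1 - p) * 6

# Truthful declaration: p(-1) + (1-p)(6) = p(4) + (1-p)(-3)
p_true = equalizing_p(-1, 6, 4, -3)
# With the invented decoy h: p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
p_lie = equalizing_p(-3, 6, 4, -5)
print(p_true, u(p_true))  # 9/14 3/2
print(p_lie, u(p_lie))    # 11/18 31/18
```

Note that under the lie the equalizing p is computed from the declared (lied-about) costs, while agent 1's actual payoff still uses his true costs, which is exactly why the decoy helps.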
107
Postmen – return to post office
(Figures: the concave example; the subadditive example, where h is the decoy; the phantom example.)
108
Non-incentive compatible fixed points
• FP7: in Modular TOD, any ONM over Pure deals, "hide" lies can be beneficial (you think I have less, so an increased load will cost more than it really does)
• Ex 3 (from next slide): A1 hides his letter to node b
• Deal (e, b): utility for A1 (under lie) is 0; utility for A2 (under lie) is 4 – UNFAIR (under lie)
• Deal (b, e): utility for A1 (under lie) is 2; utility for A2 (under lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under truth: Util of (faebcd) at p = 1/2 is 4 (save going to two)
• Under the lie, divide as (efdcab) at p (you always win and I always lose). Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie: (abcdef) at p
• p(4) + (1−p)(0) = p(2) + (1−p)(6)
• 4p = −4p + 6, so p = 3/4
• Utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned cdef (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going 5 places (which is what I was assigned originally) – zero utility to that
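The p = 3/4 in FP8 is just the indifference equation from the slide, solvable exactly; a quick check:

```python
from fractions import Fraction

# Indifference equation under the lie: p*4 + (1-p)*0 = p*2 + (1-p)*6
p = Fraction(3, 4)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6  # both sides equal 3

# A1's actual utility under the lie (true savings are 6 or 0):
print(p * 6 + (1 - p) * 0)  # 9/2, i.e. 4.5 > 4 from telling the truth
```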
110
Modular
111
Conclusion
– In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of Multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents
– Company
– Union
– Mediator
• Tasks
– Generation of proposal
– Generation of counter-proposal based on feedback from dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – cite previous decisions)
• Process
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate solution for applicability to current case
– Modify the solution appropriately
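The retrieval step can be sketched with a crude similarity measure; the case names and the shared-issue-count similarity are my illustrative assumptions, not PERSUADER's actual indexing scheme:

```python
def retrieve_case(memory, current_issues):
    """Retrieve the precedent sharing the most issues with the current
    dispute -- a crude stand-in for retrieval by conceptual similarity."""
    return max(memory, key=lambda case: len(case["issues"] & current_issues))

# Hypothetical case memory of past labor disputes
memory = [
    {"name": "transit strike", "issues": {"wages", "pensions"}},
    {"name": "nurses dispute", "issues": {"wages", "hours", "safety"}},
]
print(retrieve_case(memory, {"hours", "safety"})["name"])  # nurses dispute
```

The remaining steps (select, construct, evaluate, modify) would adapt the retrieved case's settlement to the current dispute.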
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repair for past failure is reused; reduces computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise
– Maximizes joint payoff
– Minimizes payoff difference
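The two criteria above suggest a simple lexicographic ranking of candidate compromises; a sketch, where the wage proposals and the two linear utility functions are invented for illustration:

```python
def best_compromise(proposals, u_union, u_company):
    """Rank proposals by joint payoff, breaking ties by the smaller
    payoff difference (the two compromise criteria listed above)."""
    return max(proposals,
               key=lambda p: (u_union(p) + u_company(p),
                              -abs(u_union(p) - u_company(p))))

def u_union(wage):
    return wage            # union prefers higher wages

def u_company(wage):
    return 20 - wage       # company prefers lower wages

# All three wages give joint payoff 20, so the tie-break picks
# the proposal with equal payoffs
print(best_compromise([10, 12, 14], u_union, u_company))  # 10
```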
118
Persuasive argumentation
• Argumentation goals
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff
– Change importance attached to an issue
– Change utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases payoff of the rejecting party by a greater amount than it reduces payoff for the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
Worth function for first agent
(Graph: worth rising from 0 to 100 over the day; time axis ticks at 9, 12, 16.)
Ref Rosenschein amp Zlotkin 1994
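The first agent's worth curve can be modeled as a piecewise-linear function; a minimal sketch, assuming the curve rises linearly from 0 at 0900 hrs to 100 at 1600 hrs (the exact shape is not recoverable from the slide):

```python
def worth(hour, lo=9.0, hi=16.0, max_worth=100.0):
    """Hypothetical piecewise-linear worth for the first agent:
    0 before `lo`, rising linearly to `max_worth` at `hi`."""
    if hour <= lo:
        return 0.0
    if hour >= hi:
        return max_worth
    return max_worth * (hour - lo) / (hi - lo)

print(worth(16), worth(12.5), worth(9))  # 100.0 50.0 0.0
```

Accepting a 1400 hrs meeting instead of 1600 hrs then costs the agent `worth(16) - worth(14)` in this attribute, to be traded off against gains elsewhere.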
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
(Graph: utility vs. number of negotiation rounds; Agent i's and Agent j's curves converge to the point of acceptance.)
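The converging graph can be reproduced with a toy concession loop; this is a sketch under my own simplifying assumptions (one-dimensional offers, fixed concession step), not the full monotonic concession protocol:

```python
from fractions import Fraction

def monotonic_concession(step=Fraction(1, 10), max_rounds=100):
    """Toy model of the converging graphs: agents bargain over x in [0,1],
    agent i's utility is x and agent j's is 1-x. Each round both concede
    by `step`; they agree once j's offer is acceptable to i."""
    offer_i, offer_j = Fraction(1), Fraction(0)   # each starts at its ideal
    for round_no in range(1, max_rounds + 1):
        if offer_j >= offer_i:        # i values j's offer at least as much
            return round_no, offer_j  # point of acceptance
        offer_i -= step               # i concedes
        offer_j += step               # j concedes
    return None                       # no agreement (the next slide's graph)

print(monotonic_concession())  # (6, Fraction(1, 2))
```

With a step too small to bridge the gap within `max_rounds`, the function returns `None`, matching the no-agreement graph on the next slide.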
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
(Graph: utility vs. number of negotiation rounds; Agent i's and Agent j's curves never meet.)
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified. Why did the agent pay so much for the car?
– Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed preference during the buying process.
125
• 4 modes of argument (Gilbert 1994):
1. Logical: "If you accept A and accept A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive – doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
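The two conditions on an argument (grounds drawn from the database, conclusion provable from the grounds) can be sketched for a propositional fragment; the Horn-clause encoding and forward chaining are my simplification of "can be proved":

```python
def derivable(grounds, goal):
    """Forward chaining over facts (strings) and Horn rules, encoded as
    (premises, head) tuples, drawn from the grounds."""
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if isinstance(g, tuple)]
    changed = True
    while changed:
        changed = False
        for premises, head in rules:
            if head not in facts and all(p in facts for p in premises):
                facts.add(head)
                changed = True
    return goal in facts

database = {"milk_good", (("milk_good",), "cheese_good")}
grounds = database   # here the argument cites the whole database

# An argument (Sentence, Grounds): grounds ⊆ database, conclusion provable
print(grounds <= database, derivable(grounds, "cheese_good"))  # True True
```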
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked by a set if some member y of the set attacks x (y → x)
2. argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
(Example argument graph with arguments a, b, c, d.)
Which sets of arguments can be true? c is always attacked; d is always acceptable.
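The in/out rule from the previous slide can be computed by iterating to a fixed point. A sketch: the attack edges below are hypothetical (the slide's own figure is not recoverable), chosen so that d is unattacked and c is attacked, matching the stated conclusions:

```python
def grounded_labels(arguments, attacks):
    """Label arguments 'in'/'out' per the rule above: 'in' when every
    attacker is 'out', 'out' when some attacker is 'in'."""
    attackers = {x: {a for (a, b) in attacks if b == x} for x in arguments}
    label = {}
    changed = True
    while changed:
        changed = False
        for x in arguments:
            if x in label:
                continue
            if all(label.get(a) == "out" for a in attackers[x]):
                label[x] = "in"      # no attackers, or all attackers out
                changed = True
            elif any(label.get(a) == "in" for a in attackers[x]):
                label[x] = "out"     # some undefeated attacker
                changed = True
    return label

# Hypothetical attack chain: d -> c -> b -> a
attacks = {("d", "c"), ("c", "b"), ("b", "a")}
labels = grounded_labels({"a", "b", "c", "d"}, attacks)
print(labels["d"], labels["c"], labels["b"], labels["a"])  # in out in out
```

Arguments caught in cycles (e.g. mutual attacks) would be left unlabeled by this loop, which is consistent with "which sets can be true" having several answers.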
133
An Example Abstract Argument System
95
Tentative Explanation of Previous Chart
bull I think Arrows show reasons we know this fact (diagonal arrows are between domains) Rule beginning is a fixed point
bull For example What is true of a phantom task may be true for a decoy task in same domain as a phantom is just a decoy task we donrsquot have to create
bull Similarly what is true for a mixed deal may be true for an all or nothing deal (in the same domain) as a mixed deal is an all or nothing deal where one choice is empty The direction of the relationship may depend on truth (never helps) or lie (sometimes helps)
bull The relationships can also go between domains as sub-additive is a superclass of concave and a super class of modular
96
Modular TODbull c(X U Y) = c(X) + c(Y) - c(X Y)bull Notice modular encourages truth telling more than others
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
• FP8: in Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a. A1's utility becomes 4.5 > 4 (the utility of telling the truth)
• Under truth: Util((fae, bcd), ½) = 4 (each agent saves going to two nodes)
• Under the lie, divide as ((ef, dcab), p)? Then you always win and I always lose; since the work is the same, swapping cannot help, and in a mixed deal the choices must be unbalanced
• Try again under the lie with the deal ((abcdef), p):
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6, so p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned c, d, e, f (¼ of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places - which is what I was assigned originally. Zero utility for that
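The p = 3/4 step can be verified the same way; a small sketch with the slide's figures:

```python
from fractions import Fraction

# FP8 example: under the lie, the mixed deal ((abcdef), p) must equalize the
# *declared* utilities:  p(4) + (1-p)(0) = p(2) + (1-p)(6)  =>  8p = 6.
p = Fraction(3, 4)
assert p * 4 + (1 - p) * 0 == p * 2 + (1 - p) * 6

# Agent 1's *actual* utility: the favourable branch is really worth 6 to him,
# and the unfavourable one 0 (he still detours to the hidden node a).
u_lie = p * 6 + (1 - p) * 0
u_truth = Fraction(4)          # utility of the honest deal ((fae, bcd), 1/2)
print(u_lie, u_lie > u_truth)  # 9/2 True  -> 4.5 beats 4
```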
110
[Figure: modular domain example]
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER - case study
• A program to resolve problems in the labor relations domain
• Agents:
  - Company
  - Union
  - Mediator
• Tasks:
  - Generation of proposals
  - Generation of counter-proposals based on feedback from the dissenting party
  - Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law - citing previous decisions)
• Process:
  - Retrieve appropriate precedent cases from memory
  - Select the most appropriate case
  - Construct an appropriate solution
  - Evaluate the solution for applicability to the current case
  - Modify the solution appropriately
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  - Minimizes the need for information exchange
  - Avoids problems by reasoning from past failures (intentional reminding)
  - Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the trade-offs an agent is willing to make
• Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
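As an illustration of the multi-attribute step, a minimal sketch (the attributes, weights, and proposal scores below are hypothetical, not taken from PERSUADER):

```python
# Additive multi-attribute utility: each side weights each attribute,
# and the compromise maximizes joint payoff, then minimizes the payoff gap.
def utility(weights, scores):
    return sum(w * scores[attr] for attr, w in weights.items())

company = {"wage": 0.7, "pension": 0.3}   # hypothetical weights
union   = {"wage": 0.4, "pension": 0.6}

proposals = {   # per-proposal attribute scores in [0, 1] for each side
    "A": ({"wage": 0.9, "pension": 0.2}, {"wage": 0.3, "pension": 0.5}),
    "B": ({"wage": 0.6, "pension": 0.6}, {"wage": 0.5, "pension": 0.6}),
}

def rank(name):
    c, u = proposals[name]
    uc, uu = utility(company, c), utility(union, u)
    return (uc + uu, -abs(uc - uu))   # joint payoff first, smaller gap second

best = max(proposals, key=rank)
print(best)   # "B": slightly lower joint payoff gap, higher joint total
```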
118
Persuasive argumentation
• Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  - Change the importance attached to an issue
  - Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory - 30 more proposals
• Without argumentation - fewer proposals and better solutions
• No failure avoidance - more proposals with objections
• No preference analysis - oscillatory condition
• No feedback - communication overhead increased by 23
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she also assigns progressively smaller worths to a meeting at 15:00, 14:00, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent - worth rises from 0 at 9:00 to 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiations - Agent i's and Agent j's curves converge to the point of acceptance]
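A toy simulation of this convergence picture (the step sizes and the acceptance rule are illustrative assumptions, not part of any particular protocol):

```python
# Two aspiration levels converge as both agents concede each round;
# agreement is declared once the levels cross.
def negotiate(u_i, u_j, step_i, step_j, max_rounds=100):
    for n in range(max_rounds):
        if u_i <= u_j:              # curves have crossed: point of acceptance
            return n, (u_i + u_j) / 2
        u_i -= step_i               # agent i concedes (its demand falls)
        u_j += step_j               # agent j's utility for the standing offer rises
    return None                     # no agreement within max_rounds

print(negotiate(u_i=10.0, u_j=2.0, step_i=0.5, step_j=0.5))  # (8, 6.0)
print(negotiate(u_i=10.0, u_j=2.0, step_i=0.0, step_j=0.0))  # None: nobody concedes
```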
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiations - Agent i's and Agent j's curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified - why did the agent pay so much for the car?
• Positions cannot be changed - initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert, 1994):
1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet and shows the strength of their feelings
4. Kisceral: appeals to the intuitive - "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  An argument is a pair (Sentence, Grounds), where:
  - Database is a (possibly inconsistent) set of logical formulae;
  - Sentence is a logical formula known as the conclusion;
  - Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database, and
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
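A minimal sketch of checking that a pair (Sentence, Grounds) is an argument, with the database restricted to atoms and Horn rules for simplicity (the milk/cheese facts anticipate the next slide; the encoding is an assumption of this sketch):

```python
# Facts are strings; rules are (body_atoms, head) pairs.
def derives(grounds, conclusion):
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if not isinstance(g, str)]
    changed = True
    while changed:                      # naive forward chaining
        changed = False
        for body, head in rules:
            if set(body) <= facts and head not in facts:
                facts.add(head)
                changed = True
    return conclusion in facts

database = {"milk_good", "cheese_from_milk",
            (("milk_good", "cheese_from_milk"), "cheese_good")}

def is_argument(conclusion, grounds):
    # Grounds must come from the database, and must prove the conclusion.
    return set(grounds) <= database and derives(grounds, conclusion)

print(is_argument("cheese_good", database))        # True
print(is_argument("cheese_good", {"milk_good"}))   # False: grounds too weak
```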
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  - A attacks B ≡ A →u B or A →r B
  - A defeats B ≡ A →u B or (A →r B and not B →u A)
  - A strongly attacks B ≡ A →a B and not B →u A
  - A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u - →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) - →u⁻¹
Defeats = →d = →u ∪ (→r - →u⁻¹)
Attacks = →a = →u ∪ →r
130
Abstract Argumentation
• Concerned with the overall structure of the argument system (rather than the internals of individual arguments)
• Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption: an argument is in (true) unless proven false
131
Admissible Arguments ndash mutually defensible
1. Argument x is attacked (by a set) if some y with y → x is not itself attacked by any member of the set
2. Argument x is acceptable if every attacker of x is attacked
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are counter-attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be "in"? c is always attacked; d is always acceptable
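These definitions can be checked by brute force over all subsets. A small sketch; the attack relation below is a hypothetical stand-in for the slide's figure, chosen so that c is always attacked and d can always be defended:

```python
from itertools import combinations

def conflict_free(S, attacks):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S, attacks):
    # every attacker y of x must itself be attacked by some member of S
    return all(any((z, y) in attacks for z in S)
               for (y, t) in attacks if t == x)

def admissible(S, attacks):
    return conflict_free(S, attacks) and all(acceptable(x, S, attacks) for x in S)

# Hypothetical graph: a and b attack each other, both attack c, c attacks d.
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}
args = ["a", "b", "c", "d"]

adm = [set(S) for r in range(len(args) + 1)
       for S in combinations(args, r) if admissible(set(S), attacks)]
print(adm)  # c never appears; d is defended whenever a or b is in the set
```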
133
An Example Abstract Argument System
96
Modular TOD
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• Notice that modular domains encourage truth-telling more than the other classes
97
For subadditive domain
98
Attributes of task systems - Concavity
• c(Y ∪ Z) - c(Y) ≤ c(X ∪ Z) - c(X), for X ⊆ Y
• The cost task set Z adds to a set of tasks Y cannot be greater than the cost Z adds to a subset X of Y
• You might expect Z to add more to the subset (as it is smaller)
• At your seats: is the postman domain concave? (No, unless restricted to trees)
• Example: Y is all shaded (blue) nodes; X is the nodes in the polygon. Adding Z adds 0 to X (the agent was going that way anyway) but adds 2 to its superset Y (the agent was going around the loop)
• Concavity implies sub-additivity
• Modularity implies concavity
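A brute-force concavity check over a small task set (the two cost functions are toy assumptions, not the postman domain):

```python
from itertools import combinations

def is_concave(tasks, cost):
    # Check: for all X ⊆ Y and any Z,  c(Y ∪ Z) - c(Y) <= c(X ∪ Z) - c(X)
    subsets = [frozenset(s) for r in range(len(tasks) + 1)
               for s in combinations(sorted(tasks), r)]
    return all(cost(Y | Z) - cost(Y) <= cost(X | Z) - cost(X)
               for Y in subsets for X in subsets if X <= Y
               for Z in subsets)

tasks = {"a", "b", "c"}
# One unit of cost per task, no shared travel: modular, hence concave.
print(is_concave(tasks, len))                                    # True
# A surcharge that hits only the full set breaks concavity.
print(is_concave(tasks, lambda S: len(S) if len(S) < 3 else 4))  # False
```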
99
Examples of task systems
Database Queries
• Agents have access to a common DB, and each has to carry out a set of queries
• Agents can exchange the results of queries and sub-queries
The Fax Domain
• Agents send faxes to locations on a telephone network
• Multiple faxes can be sent once the connection is established with the receiving node
• Agents can exchange messages to be faxed
100
Attributes-Modularity
• c(X ∪ Y) = c(X) + c(Y) - c(X ∩ Y)
• The cost of the combination of two sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
• Only the Fax Domain is modular (as connection costs are independent)
• Modularity implies concavity
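And the corresponding modularity check (the per-destination connection costs are made up, in the spirit of the fax domain):

```python
from itertools import combinations

def is_modular(tasks, cost):
    # Check: c(X ∪ Y) == c(X) + c(Y) - c(X ∩ Y) for all X, Y
    subsets = [frozenset(s) for r in range(len(tasks) + 1)
               for s in combinations(sorted(tasks), r)]
    return all(cost(X | Y) == cost(X) + cost(Y) - cost(X & Y)
               for X in subsets for Y in subsets)

# Fax-domain-style costs: one fixed connection cost per destination,
# independent of the rest of the set -> modular by construction.
conn = {"a": 3, "b": 1, "c": 2}
print(is_modular(set(conn), lambda S: sum(conn[t] for t in S)))    # True
# Travel-style costs with shared legs are typically not modular:
print(is_modular({"a", "b"}, lambda S: 2 * len(S) - len(S) // 2))  # False
```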
101
3-dimensional table characterizing the relationships: implied relationships between cells, and implied relationships within the same domain attribute
• L means lying may be beneficial
• T means telling the truth is always beneficial
• T/P refers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over A-or-N deals, "hiding" lies are not beneficial
• Ex: A1 hides his letter to c; his utility doesn't increase
• If he tells the truth, p = ½; expected utility of ((abc), ½) is 5
• Under the lie, p = ½ (as the declared utility is the same); expected utility (for agent 1) of ((abc), ½) is ½(0) + ½(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph for the FP1 example (edge costs 1, 4, 4, 1)]
103
• FP2: in Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out)
• FP3: in Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost - hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) - just like in this case. The lie makes no difference.
(Assume we have some way of deciding who gets the better deal that is fair over time.)
Agent 1 takes | U(1) | Agent 2 takes | U(2) (seems) | U(2) (actual)
a             | 2    | bc            | 4            | 4
b             | 4    | ac            | 2            | 2
bc            | 2    | a             | 4            | 2
ab            | 0    | c             | 6            | 6
105
97
For subadditive domain
98
Attributesof task system-Concavity
bullc(YU Z) ndashc(Y) lec(XU Z) ndashc(X)bullThe cost of tasks Z adds to set of tasks Y cannot be greater than the cost Z add to a subset of Y bullExpect it to add more to subset (as is smaller)
bullAt seats ndash is postmen doman concave (no unless restricted to trees)
Example Y is all shadedblue nodes X is nodes in polygon
adding Z adds 0 to X (as was going that way anyway) but adds 2 to its superset Y (as was going around loop)
bull Concavity implies sub-additivitybullModularity implies concavity
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
• FP8: in Modular TOD, any ONM over Mixed deals: "Hide" lies can be beneficial.
• Ex4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util((f,a,e), (b,c,d)) with p = 1/2 is 4 (saves going to two).
• Under the lie, divide as ((e,f), (d,c,a,b)) with probability p: you always win and I always lose. Since the work is the same, swapping cannot help; in a mixed deal the choices must be unbalanced.
• Try again under the lie with ((a,b), (c,d,e,f)) and probability p:
  p(4) + (1-p)(0) = p(2) + (1-p)(6)
  4p = -4p + 6, so p = 3/4
• Utility is actually (3/4)(6) + (1/4)(0) = 4.5
• Note: when I am assigned (c,d,e,f), 1/4 of the time, I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally); zero utility for that.
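The p = 3/4 step above can be verified with the same kind of arithmetic; an illustrative check using exact fractions:

```python
from fractions import Fraction as F

# Mixed deal ((a,b), (c,d,e,f)) under the hide lie: choose p so the two
# apparent expected utilities are equal: p*4 + (1-p)*0 = p*2 + (1-p)*6
p = F(6 - 0, (4 - 0) - (2 - 6))    # = 6/8 = 3/4

# A1's actual utility: with probability 3/4 he wins (utility 6); with
# probability 1/4 he loses and must still deliver the hidden letter to a,
# ending up with zero utility
u_actual = p * 6 + (1 - p) * 0
print(p, u_actual)    # 3/4 and 9/2, i.e. 4.5 > 4, so hiding the letter paid off
```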
110
(Figure: modular postmen-domain graph for Ex4.)
111
Conclusion
- In order to use Negotiation Protocols, it is necessary to know when the protocols are appropriate.
- TODs cover an important set of multi-agent interactions.
112
113
MAS Compromise: negotiation process for conflicting goals
• Identify potential interactions.
• Modify intentions to avoid harmful interactions or to create cooperative situations.
• Techniques required:
  - Representing and maintaining belief models
  - Reasoning about other agents' beliefs
  - Influencing other agents' intentions and beliefs
114
PERSUADER - case study
• A program to resolve problems in the labor relations domain.
• Agents:
  - Company
  - Union
  - Mediator
• Tasks:
  - Generation of a proposal
  - Generation of a counter-proposal based on feedback from the dissenting party
  - Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law - cite previous decisions).
• Process:
  - Retrieve appropriate precedent cases from memory
  - Select the most appropriate case
  - Construct an appropriate solution
  - Evaluate the solution for applicability to the current case
  - Modify the solution appropriately
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities.
• Advantages:
  - Minimizes the need for information exchange
  - Avoids problems by reasoning from past failures (intentional reminding)
  - Repairs for past failures are reused, which reduces computation
117
Negotiation Methods: Preference Analysis
• A from-scratch planning method.
• Based on multi-attribute utility theory.
• Builds an overall utility curve out of the individual ones.
• Expresses the tradeoffs an agent is willing to make.
• Properties of the proposed compromise:
  - Maximizes joint payoff
  - Minimizes payoff difference
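A minimal sketch of the multi-attribute utility idea (the attribute names, weights, and normalizations here are hypothetical, not taken from PERSUADER): the overall utility curve is a weighted sum of per-attribute utilities, which makes an agent's tradeoffs explicit:

```python
def overall_utility(weights, utilities, option):
    """Additive multi-attribute utility: sum of w_i * u_i(option)."""
    return sum(w * u(option) for w, u in zip(weights, utilities))

# Hypothetical labor-negotiation attributes, each normalized to [0, 1]
u_wage = lambda o: o["raise_pct"] / 10          # a 0-10% wage raise
u_vacation = lambda o: o["vacation_days"] / 30  # 0-30 vacation days

offer = {"raise_pct": 5, "vacation_days": 15}
u = overall_utility([0.7, 0.3], [u_wage, u_vacation], offer)
print(round(u, 6))   # the union here weights wages far more than vacation
```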
118
Persuasive argumentation
• Argumentation goals:
  - Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  - Change the importance attached to an issue
  - Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  - Objectionable issues
  - Reason for rejection
  - Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties.
120
Experiments
• Without memory: 30% more proposals.
• Without argumentation: fewer proposals and better solutions.
• No failure avoidance: more proposals with objections.
• No preference analysis: oscillatory behavior.
• No feedback: communication overhead increased by 23%.
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she also assigns progressively smaller worths to a meeting at 15:00, 14:00, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
(Figure: worth function for the first agent - worth rises from 0 at 09:00, through 12:00, to 100 at 16:00.)
Ref: Rosenschein & Zlotkin, 1994
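The first agent's worth function can be written down directly; this sketch assumes the falloff between the endpoints is linear, which the slide only suggests:

```python
def worth_first_agent(hour):
    """Worth of a meeting time for the first agent: 0 at 09:00 rising
    linearly to 100 at 16:00 (linearity is an assumption)."""
    if not 9 <= hour <= 16:
        return 0.0
    return 100.0 * (hour - 9) / (16 - 9)

print(worth_first_agent(16))   # 100.0 (her most preferred slot)
print(worth_first_agent(12))   # about 42.9: a sub-optimal but acceptable time
```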
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation.
• Eventually they reach an agreement.
(Figure: utility of Agent i and Agent j versus the number of negotiation rounds; the curves meet at the point of acceptance.)
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable.
(Figure: utility of Agent i and Agent j versus the number of negotiation rounds; the curves never meet.)
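Both utility graphs can be reproduced with a toy model (the numbers are hypothetical): each agent concedes a fixed amount per round, and the offers either cross (point of acceptance) or never meet (no agreement):

```python
def negotiate(seller_offer, buyer_offer, seller_step, buyer_step, max_rounds=100):
    """Single-issue price negotiation by monotonic concession.
    Returns (round, price) at the point of acceptance, or None on conflict."""
    for rnd in range(1, max_rounds + 1):
        if buyer_offer >= seller_offer:             # offers cross: accept
            return rnd, (buyer_offer + seller_offer) / 2
        seller_offer -= seller_step                 # each agent concedes every round
        buyer_offer += buyer_step
    return None                                     # no agreement reached

print(negotiate(100, 60, 5, 5))   # (5, 80.0): the curves converge and cross
print(negotiate(100, 60, 0, 0))   # None: neither concedes, so no agreement
```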
124
Argumentation
• The process of attempting to convince others of something.
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  - Positions cannot be justified. Why did the agent pay so much for the car?
  - Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• Four modes of argument (Gilbert, 1994):
  1. Logical: "If you accept A, and accept that A implies B, then you must accept B."
  2. Emotional: "How would you feel if it happened to you?"
  3. Visceral: a participant stamps their feet and shows the strength of their feelings.
  4. Kisceral: appeals to the intuitive - "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  - Database is a (possibly inconsistent) set of logical formulae;
  - Sentence is a logical formula, known as the conclusion;
  - Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database, and
    2. Sentence can be proved from Grounds.
(We give reasons for our conclusions.)
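A tiny executable reading of this (the propositions are hypothetical strings, and the provability relation is approximated by forward chaining over Horn rules):

```python
def closure(facts, rules):
    """Forward-chain Horn rules (premises, conclusion) to a fixpoint."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and all(p in derived for p in premises):
                derived.add(conclusion)
                changed = True
    return derived

# Grounds drawn from the database: two facts plus one rule
grounds_facts = {"milk_is_good", "cheese_made_from_milk"}
grounds_rules = [(("milk_is_good", "cheese_made_from_milk"), "cheese_is_good")]

# The argument (Sentence, Grounds): the sentence follows from the grounds
print("cheese_is_good" in closure(grounds_facts, grounds_rules))   # True
```

This is the milk-and-cheese argument of the next slide written as (Sentence, Grounds).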
127
Attacking Arguments
• Milk is good for you.
• Cheese is made from milk.
• Therefore, cheese is good for you.
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty.
• Rebut (contradict the conclusion): cheese is bad for your bones.
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  - A attacks B  =  (A u B) or (A r B)
  - A defeats B  =  (A u B) or (A r B and not B u A)
  - A strongly attacks B  =  (A attacks B) and not (B u A)
  - A strongly undercuts B  =  (A u B) and not (B u A)
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u⁻¹
Strongly attacks = sa = (u ∪ r) - u⁻¹
Defeats = d = u ∪ (r - u⁻¹)
Attacks = a = u ∪ r
(Here ⁻¹ denotes the inverse relation.)
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments).
• Write x → y to indicate:
  - "argument x attacks argument y"
  - "x is a counterexample of y"
  - "x is an attacker of y"
  where we are not actually concerned with what x and y are.
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what.
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated.
• Assumption: an argument is in unless proven otherwise.
131
Admissible Arguments - mutually defensible
1. Argument x is attacked by a set if some y with y → x is not itself attacked by any member of the set.
2. Argument x is acceptable (with respect to a set) if every attacker of x is attacked by the set.
3. An argument set is conflict-free if none of its members attack each other.
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked).
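These definitions can be run directly. A brute-force sketch over a small hypothetical attack graph (a → b, b → c, c → d; not the figure from the next slide):

```python
from itertools import combinations

def conflict_free(S, attack):
    """No member of S attacks another member of S."""
    return not any((x, y) in attack for x in S for y in S)

def set_attacks(S, x, attack):
    """Some member of S attacks argument x."""
    return any((y, x) in attack for y in S)

def acceptable(x, S, attack, args):
    """Every attacker of x is attacked by S."""
    return all(set_attacks(S, y, attack) for y in args if (y, x) in attack)

def admissible_sets(args, attack):
    """Enumerate all conflict-free sets whose members are all acceptable."""
    found = []
    for r in range(len(args) + 1):
        for combo in combinations(sorted(args), r):
            S = set(combo)
            if conflict_free(S, attack) and all(acceptable(x, S, attack, args) for x in S):
                found.append(S)
    return found

args = {"a", "b", "c", "d"}
attack = {("a", "b"), ("b", "c"), ("c", "d")}
print(admissible_sets(args, attack))   # the empty set, {a}, and {a, c}
```

Note that {c} alone is not admissible: its attacker b is only attacked once a joins the set.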
132
(Figure: attack graph over arguments a, b, c, d.)
Which sets of arguments can be in? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox - remove loser, winner changes (notice c is always ahead of the removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms, Protocols, Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanism Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming they do not have to return home - like U-Haul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol - one direction, move towards the middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy - a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a; agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to distribution point)
- Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative - each is helped by the joint plan
- Examples: Compromise - both can succeed, but worse for both than if the other agent weren't there
- Choices
- Compromise continued
- Example conflict
- Example: semi-cooperative
- Example: semi-cooperative, continued
- Negotiation Domains: Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments - mutually defensible
- Slide 132
- An Example Abstract Argument System
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
99
Examples of task systems
Database Queries
bullAgents have to access to a common DB and each has to carry out aset of queriesbullAgents can exchange results of queries and sub-queries
The Fax DomainbullAgents are sending faxes to locations on a telephone networkbullMultiple faxes can be sent once the connection is established with receiving nodebullThe Agents can exchange message to be faxed
100
Attributes-Modularity
bull c(XU Y) = c(X) + c(Y) ndashc(XcapY)
bull bullThe cost of the combination of 2 sets of tasks is exactly the sum of their individual costs minus the cost of their intersection
bull Only Fax Domain is modular (as costs are independent)
bull Modularity implies concavity
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
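The retrieve/select steps above can be sketched as a nearest-precedent lookup. This is a toy illustration only: the feature names, the overlap-based similarity measure, and the flat case representation are invented, not PERSUADER's actual memory organization.

```python
def similarity(case_a, case_b):
    """Fraction of shared features on which two cases agree."""
    keys = case_a.keys() & case_b.keys()
    return sum(case_a[k] == case_b[k] for k in keys) / len(keys)

def retrieve(memory, current):
    """Select the precedent case most similar to the current dispute."""
    return max(memory, key=lambda case: similarity(case, current))

# Invented precedent cases (features and outcomes are illustrative)
memory = [
    {"industry": "transit", "issue": "wages", "outcome": "3% raise"},
    {"industry": "transit", "issue": "benefits", "outcome": "status quo"},
    {"industry": "steel", "issue": "wages", "outcome": "2% raise"},
]
precedent = retrieve(memory, {"industry": "transit", "issue": "wages"})
```

The remaining steps (construct, evaluate, modify) would then adapt `precedent["outcome"]` to the current case.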
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repair for a past failure is reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
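A minimal sketch of this idea, with invented weights and payoffs: each agent's utility is additive over attributes, and among candidate deals the one maximizing joint payoff is proposed, breaking ties toward the smallest payoff difference.

```python
def utility(weights, scores):
    """Additive multi-attribute utility: sum of weight * attribute score."""
    return sum(w * scores[attr] for attr, w in weights.items())

def best_compromise(payoffs):
    """payoffs: list of (u1, u2) pairs, one per candidate deal.
    Maximize joint payoff; among ties, minimize the payoff difference."""
    return max(payoffs, key=lambda p: (p[0] + p[1], -abs(p[0] - p[1])))

# Invented example: the union weights wages more heavily than pensions
union_weights = {"wages": 7, "pension": 3}
offer = {"wages": 9, "pension": 2}        # union's per-attribute scores
u_union = utility(union_weights, offer)   # 7*9 + 3*2 = 69

# Two deals tie on joint payoff (110); the balanced one is chosen
chosen = best_compromise([(69, 40), (55, 55), (60, 50)])
```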
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Changing the importance attached to an issue
– Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
Worth function for first agent:
[Figure: worth rises from 0 at 0900 to 100 at 1600, with axis ticks at 9, 12, and 16]
Ref: Rosenschein & Zlotkin, 1994
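One possible encoding of the first agent's worth function, assuming the linear shape suggested by the sketch (worth 0 at 0900 rising to 100 at 1600); the exact curve is not recoverable from the slide, so this is illustrative.

```python
def worth_agent1(hour):
    """Worth of a meeting at a given hour (9..16) for the first agent,
    assuming a linear rise from 0 at 0900 to 100 at 1600."""
    if not 9 <= hour <= 16:
        return 0
    return round(100 * (hour - 9) / (16 - 9))
```

Conceding from 1600 to, say, 1400 costs the agent `worth_agent1(16) - worth_agent1(14)` points of worth, which it may trade against other payoffs such as reduced travel costs.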
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's curves converge over time to a point of acceptance]
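The two utility graphs can be reproduced with a toy concession loop (all numbers are illustrative): positive concession rates make the curves cross at a point of acceptance, while zero concession rates give the no-agreement case on the next slide.

```python
def rounds_to_agreement(offer_i, demand_j, concede_i, concede_j, max_rounds=100):
    """Each round both agents concede; agreement once i's offer meets j's demand."""
    for rounds in range(1, max_rounds + 1):
        offer_i += concede_i        # agent i raises its offer
        demand_j -= concede_j       # agent j lowers its demand
        if offer_i >= demand_j:
            return rounds           # point of acceptance
    return None                     # curves never meet: no agreement

converges = rounds_to_agreement(10, 90, 10, 10)  # curves meet at 50/50
stalls = rounds_to_agreement(10, 90, 0, 0)       # neither agent concedes
```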
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
• Positions cannot be justified – why did the agent pay so much for the car?
• Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
• Grounds ⊆ Database
• Sentence can be proved from Grounds
(we give reasons for our conclusions)
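A toy rendering of this scheme, assuming a database of facts plus Horn rules (a simplification of full first-order logic): an argument pairs a conclusion with grounds drawn from the database, from which the conclusion is derivable by forward chaining. The milk/cheese formulae are invented stand-ins for the next slide's example.

```python
def derivable(sentence, grounds):
    """Forward chaining over grounds: facts are strings, rules are
    ((premise, ...), conclusion) pairs."""
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if not isinstance(g, str)]
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return sentence in facts

def is_argument(sentence, grounds, database):
    """(Sentence, Grounds) is an argument iff Grounds is a subset of the
    Database and Sentence can be proved from Grounds."""
    return grounds <= database and derivable(sentence, grounds)

database = {"milk_is_good", (("milk_is_good",), "cheese_is_good")}
ok = is_argument("cheese_is_good", database, database)
```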
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts, →a = attacks):
– A attacks B ≡ A →u B or A →r B
– A defeats B ≡ A →u B or (A →r B and not B →u A)
– A strongly attacks B ≡ A →a B and not B →u A
– A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition: Hierarchy of attacks
Undercuts = →u
Strongly undercuts = →su = →u − →u⁻¹
Strongly attacks = →sa = (→u ∪ →r) − →u⁻¹
Defeats = →d = →u ∪ (→r − →u⁻¹)
Attacks = →a = →u ∪ →r
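The derived notions can be treated as set algebra on relations of ordered pairs: ⁻¹ is the inverse relation, and each derived notion is a union/difference of u, r, and their inverses. The example relations below are invented for illustration.

```python
def inverse(rel):
    """Inverse of a binary relation given as a set of ordered pairs."""
    return {(b, a) for (a, b) in rel}

def derived(u, r):
    """Derived attack notions from undercut (u) and rebut (r) relations."""
    return {
        "attacks":            u | r,
        "defeats":            u | (r - inverse(u)),
        "strongly_attacks":   (u | r) - inverse(u),
        "strongly_undercuts": u - inverse(u),
    }

u = {("A", "B"), ("B", "A"), ("C", "D")}   # A and B undercut each other
r = {("D", "C")}                           # D rebuts C
rels = derived(u, r)
```

Note how the mutual undercut between A and B drops out of the "strong" notions, since each is undercut back by the other.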
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: true unless proven false
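A sketch of the in/out rule: starting from unattacked arguments, label an argument in when all its attackers are out, and out when some attacker is in. On acyclic attack graphs this labels every argument (the grounded labelling); the three-argument chain below is an invented example.

```python
def grounded_labels(arguments, attacks):
    """Iteratively apply the in/out rule until no more labels change."""
    attackers = {x: {a for (a, b) in attacks if b == x} for x in arguments}
    labels = {}
    changed = True
    while changed:
        changed = False
        for x in arguments:
            if x in labels:
                continue
            if all(labels.get(a) == "out" for a in attackers[x]):
                labels[x] = "in"    # every attacker already defeated
                changed = True
            elif any(labels.get(a) == "in" for a in attackers[x]):
                labels[x] = "out"   # has an undefeated attacker
                changed = True
    return labels

# chain: a attacks b, b attacks c
labels = grounded_labels({"a", "b", "c"}, {("a", "b"), ("b", "c")})
```

Here a is unattacked (in), so b is defeated (out), which in turn reinstates c (in).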
131
Admissible Arguments – mutually defensible
1. argument x is attacked (by a set) if some attacker y of x (y → x) is not itself attacked by any member of the set
2. argument x is acceptable if every attacker of x is attacked
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: example argument graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
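The figure's attack graph is not recoverable, so this sketch assumes an illustrative relation with the stated properties (a and b attack each other, both b and d attack c, nothing attacks d): c then appears in no admissible set, while d, being unattacked, is always acceptable.

```python
from itertools import combinations

def attackers_of(x, attacks):
    return {a for (a, b) in attacks if b == x}

def conflict_free(s, attacks):
    return not any((a, b) in attacks for a in s for b in s)

def admissible(s, attacks):
    """Conflict-free, and every attacker of a member is attacked by the set."""
    return conflict_free(s, attacks) and all(
        any((defender, y) in attacks for defender in s)
        for x in s for y in attackers_of(x, attacks))

args = ["a", "b", "c", "d"]
attacks = {("a", "b"), ("b", "a"), ("b", "c"), ("d", "c")}  # assumed graph
admissible_sets = [set(s) for n in range(len(args) + 1)
                   for s in combinations(args, n)
                   if admissible(set(s), attacks)]
```

Under this assumed graph the admissible sets are {}, {a}, {b}, {d}, {a,d}, and {b,d}.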
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove loser, winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home – like U-Haul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction, move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a, agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to dist. point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – each is helped by joint plan
- Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
- Choices
- Compromise continued
- Example conflict
- Example: semi-cooperative
- Example: semi-cooperative, cont.
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
101
3-dimensional table of Characterization of Relationship Implied relationship between cells Implied relationship with same domain attribute
bull L means lying may be beneficial
bull T means telling the truth is always beneficial
bull TPrefers to lies which are not beneficial because they may always be discovered
102
Incentive Compatible Fixed Points (FP) (return home)
FP1 in SubadditiveTOD any Optimal Negotiation Mechanism (ONM) over A-or-N deals ldquohidingrdquo lies are not beneficial
bull ExA1hides letter to c his utility doesnrsquot increase
bull If he tells truth p=12 bull Expected util (abc)12 = 5bull Lie p=12 (as utility is same)bull Expected util (for 1) (abc)12 = frac12(0)
+ frac12(2) = 1 (as has to deliver the lie)
1
44
1
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, and so on. By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rises from 0 at 0900 hrs to 100 at 1600 hrs]
Ref: Rosenschein & Zlotkin, 1994
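The first agent's worth function can be written down directly. The sketched graph only labels worth 0 at 0900 and 100 at 1600, so the linear interpolation between those points is an assumption, as is the name `worth_agent1`.

```python
# A piecewise-linear worth function consistent with the sketched graph:
# worth 0 at 0900, rising to 100 at the first agent's preferred time, 1600.

def worth_agent1(hour):
    """Worth (0-100) the first agent assigns to a meeting at `hour` today."""
    if not 9 <= hour <= 16:
        return 0
    return round(100 * (hour - 9) / (16 - 9))

print([worth_agent1(h) for h in (9, 12, 15, 16)])  # [0, 43, 86, 100]
```

Conceding from 1600 to 1500 costs the agent about 14 worth points, which she may accept if the concession buys something else (e.g., reduced travel costs).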
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility of each agent's current offer vs. number of negotiation rounds – agent i's and agent j's curves converge to a point of acceptance]
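The convergence pattern can be simulated with a toy price negotiation: the seller starts high, the buyer starts low, and each concedes a fixed amount per round until the offers meet. All numbers and the fixed-concession strategy are illustrative assumptions.

```python
# Toy monotonic concession: both agents concede every round; the curves of
# their offers converge, and agreement is reached where they cross.

def negotiate(offer_i, offer_j, concede_i=10, concede_j=10):
    """Seller i starts high, buyer j starts low. Return (price, rounds)."""
    rounds = 0
    while offer_i > offer_j:      # a gap remains, so both concede again
        offer_i -= concede_i
        offer_j += concede_j
        rounds += 1
    return (offer_i + offer_j) / 2, rounds

print(negotiate(100, 20))         # (60.0, 4): agreement at the crossing point
```

If either agent refused to concede (a zero step), the loop would never terminate: that is the "no agreement" case on the next slide, where the utility curves never meet.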
123
Utility Graphs - no agreement
• No agreement: agent j finds agent i's offer unacceptable
[Figure: utility vs. number of negotiation rounds – the two agents' curves never meet, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified. Why did the agent pay so much for the car?
  – Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• Four modes of argument (Gilbert, 1994):
  1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
  2. Emotional – "How would you feel if it happened to you?"
  3. Visceral – the participant stamps their feet and shows the strength of their feelings
  4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:

  Database ⊢ (Sentence, Grounds)

  where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
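A toy propositional rendering of the (Sentence, Grounds) scheme: the database is a list of Horn clauses, the grounds are a subset of it, and a naive forward chainer checks that the sentence follows from the grounds. The formulae are invented to echo the milk/cheese example on the next slide.

```python
# Toy logic-based argumentation: rules are Horn clauses (head, body);
# proves() forward-chains until the conclusion is derived or nothing changes.

def proves(grounds, sentence):
    """True if `sentence` can be derived from `grounds` by forward chaining."""
    facts = {head for head, body in grounds if not body}
    changed = True
    while changed:
        changed = False
        for head, body in grounds:
            if head not in facts and all(b in facts for b in body):
                facts.add(head)
                changed = True
    return sentence in facts

database = [
    ("milk_good", ()),                            # fact
    ("from_milk", ()),                            # fact
    ("cheese_good", ("from_milk", "milk_good")),  # rule: body -> head
    ("milk_fatty", ()),                           # database may hold more (even conflicts)
]
grounds = database[:3]
assert all(g in database for g in grounds)        # Grounds is a subset of Database
print(proves(grounds, "cheese_good"))             # True
```

The pair `("cheese_good", grounds)` is then an argument: a conclusion together with the reasons offered for it.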
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B = A u B or A r B
  – A defeats B = A u B, or (A r B and not B u A)
  – A strongly attacks B = A a B and not B u A
  – A strongly undercuts B = A u B and not B u A
129
Proposition: hierarchy of attacks (as binary relations over arguments; R⁻¹ denotes the inverse of R)
  Undercuts = u
  Strongly undercuts = su = u - u⁻¹
  Strongly attacks = sa = (u ∪ r) - u⁻¹
  Defeats = d = u ∪ (r - u⁻¹)
  Attacks = a = u ∪ r
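Since each notion is a set of ordered pairs of arguments, the proposition above translates directly into set algebra. The example pairs below are invented; the inclusion chain at the end is exactly the stated hierarchy.

```python
# Derived attack relations as set algebra over pairs of arguments
# (u = undercuts, r = rebuts), following the proposition above.

def inverse(rel):
    return {(b, a) for a, b in rel}

u = {("A", "B"), ("C", "D"), ("D", "C")}   # undercuts (C and D undercut each other)
r = {("E", "F")}                           # rebuts

attacks            = u | r
defeats            = u | (r - inverse(u))
strongly_attacks   = (u | r) - inverse(u)
strongly_undercuts = u - inverse(u)

# the hierarchy: su is contained in sa, sa in d, d in a
assert strongly_undercuts <= strongly_attacks <= defeats <= attacks
print(sorted(strongly_undercuts))          # [('A', 'B')]
```

The mutual undercut between C and D drops out of the "strong" relations (subtracting u⁻¹ removes symmetric conflicts), while the one-way attack of A on B survives.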
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is in unless proven otherwise (true unless proven false)
131
Admissible Arguments ndash mutually defensible
1. An argument x is attacked (with respect to a set S) if some y with y → x is itself attacked by no member of S
2. An argument x is acceptable (with respect to S) if every attacker of x is attacked by some member of S
3. A set of arguments is conflict-free if no member attacks another member
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are themselves attacked)
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
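The definitions can be checked by brute force on a small attack graph. The edge set below is an assumed reconstruction of the figure (a and b attack each other and both attack c; c attacks d), chosen so that c is always attacked and d is always defensible, as the slide states.

```python
# Brute-force enumeration of the admissible sets of a small abstract
# argument system.
from itertools import combinations

args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def attackers(x):
    return {y for (y, t) in attacks if t == x}

def conflict_free(S):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    # every attacker of x is itself attacked by some member of S
    return all(any((z, y) in attacks for z in S) for y in attackers(x))

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

sets = [set(c) for n in range(len(args) + 1)
        for c in combinations(sorted(args), n)]
result = [sorted(S) for S in sets if admissible(S)]
print(result)  # [[], ['a'], ['b'], ['a', 'd'], ['b', 'd']]
```

Under these assumed edges, c never appears in an admissible set (it is always attacked), while both maximal admissible sets contain d, whose only attacker c is defended against by whichever of a or b is in.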
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove loser, winner changes (notice c is always ahead of the removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming we do not have to return home – like U-Haul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction: move towards the middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent 1 delivered to a; agent 2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain, Example 2 (don't return to distribution point)
- Parcel Delivery Domain, Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement?
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly?
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – each is helped by the joint plan
- Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
- Choices
- Compromise continued
- Example conflict
- Example: semi-cooperative
- Example: semi-cooperative, cont.
- Negotiation Domains: Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
102
Incentive Compatible Fixed Points (FP) (return home)
FP1: in a Subadditive TOD, for any Optimal Negotiation Mechanism (ONM) over All-or-Nothing deals, "hiding" lies are not beneficial.
• Ex: A1 hides the letter to c; his utility doesn't increase
• If he tells the truth, p = 1/2; expected utility of the deal (abc) at p = 1/2 is 5
• If he lies, p = 1/2 as well (the apparent utility is the same)
• Expected utility (for agent 1) of (abc) at p = 1/2 = 1/2(0) + 1/2(2) = 1 (as he still has to deliver the hidden letter)
[Figure: delivery graph for the hiding example, with edge costs]
103
• FP2: in a Subadditive TOD, for any ONM over Mixed deals, every "phantom" lie has a positive probability of being discovered (if the other agent is assigned the phantom delivery, you are found out)
• FP3: in a Concave TOD, for any ONM over Mixed deals, no "decoy" lie is beneficial (less increased cost is assumed, so probabilities would be assigned to reflect the assumed extra work)
• FP4: in a Modular TOD, for any ONM over Pure deals, no "decoy" lie is beneficial (modular tends to add the exact cost – hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c.
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under the truth, the utilities are 4 and 2, and someone has to get the better deal (under a pure deal) – just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
Agent 1 gets   U(1)   Agent 2 gets   U(2) (seems)   U(2) (actual)
a              2      bc             4              4
b              4      ac             2              2
bc             2      a              4              2
ab             0      c              6              6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, for any ONM over Pure deals, "phantom" lies can be beneficial
• Example (from the next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4
• Truth: p = 1/2, so the utility for agent 1 of the deal (ab) at p = 1/2 is 1/2(4) + 1/2(2) = 3
• Lie: (b, ca) is the logical division, as a pure deal has no probability; the utility for agent 1 is 6 (original cost) – 2 (deal cost) = 4
106
• FP6: in a Subadditive TOD, for any ONM over All-or-Nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h")
• Ex 2 (from the next slide): A1 lies with a decoy letter to h (trying to make agent 2 think that picking up b and c is worse for agent 1 than it really is); his utility rises from 1.5 to 1.72 ("if I deliver, I don't deliver h")
• If he tells the truth, p (the probability of agent 1 delivering everything) = 9/14, since p(-1) + (1-p)(6) = p(4) + (1-p)(-3) gives 14p = 9
• If he invents task h, p = 11/18, since p(-3) + (1-p)(6) = p(4) + (1-p)(-5)
• Utility at p = 9/14 is p(-1) + (1-p)(6) = -9/14 + 30/14 = 21/14 = 1.5
• Utility at p = 11/18 is p(-1) + (1-p)(6) = -11/18 + 42/18 = 31/18 ≈ 1.72
• So lying helped
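The FP6 arithmetic can be verified with exact fractions. The payoff coefficients are the slide's own; `solve_p` just solves the apparent-indifference equation p·a1 + (1-p)·b1 = p·a2 + (1-p)·b2 for p.

```python
# Checking the FP6 numbers: p is the probability that agent 1 delivers
# everything, fixed by equating the agents' apparent expected utilities.
from fractions import Fraction as F

def solve_p(a1, b1, a2, b2):
    """Solve p*a1 + (1-p)*b1 == p*a2 + (1-p)*b2 for p, exactly."""
    return F(b2 - b1, a1 - b1 - a2 + b2)

p_truth = solve_p(-1, 6, 4, -3)             # 9/14
p_decoy = solve_p(-3, 6, 4, -5)             # 11/18
# Agent 1's *true* payoffs are unchanged by the decoy: -1 if he delivers, 6 if not.
u_truth = p_truth * -1 + (1 - p_truth) * 6  # 21/14 = 1.5
u_decoy = p_decoy * -1 + (1 - p_decoy) * 6  # 31/18, about 1.72
print(p_truth, p_decoy, u_truth, u_decoy)
```

The decoy shifts the agreed probability from 9/14 to 11/18, raising agent 1's true expected utility from 1.5 to about 1.72, which is exactly why the lie pays.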
107
Postmen – return to post office
[Figure: two delivery graphs – a Concave example with a phantom letter, and a Subadditive example where h is the decoy]
108
Non incentive compatible fixed points
• FP7: in a Modular TOD, for any ONM over Pure deals, a "hide" lie can be beneficial (you think I have less, so an increased load appears to cost me more than it really does)
• Ex 3 (from the next slide): A1 hides his letter to node b
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4 – UNFAIR under the lie
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e)
109
• FP8: in a Modular TOD, for any ONM over Mixed deals, "hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's utility is 4.5 > 4 (the utility of telling the truth)
• Under the truth: Util((fae), (bcd)) at p = 1/2 is 4 (each saves going to two nodes)
• Under the lie, dividing as ((efd), (cab)) with some probability cannot help: one side always wins and the other always loses, and since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under the lie with the division (abcdef): p(4) + (1-p)(0) = p(2) + (1-p)(6), so 4p = -4p + 6 and p = 3/4
• The utility is actually 3/4(6) + 1/4(0) = 4.5
• Note: when I get assigned c, d, e, f (1/4 of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries), so I end up going to 5 places – which is what I was assigned originally. Zero utility from that.
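A quick check of the FP8 arithmetic, again with exact fractions. The equation and payoffs are the slide's: p comes from the apparent indifference p·4 + (1-p)·0 = p·2 + (1-p)·6, but taking the (abcdef) side is really worth 6 to agent 1, since the hidden letter to a was on his route anyway.

```python
# Checking FP8: the hide lie skews the mixed-deal probability in agent 1's favor.
from fractions import Fraction as F

p = F(6, 8)                      # 8p = 6  =>  p = 3/4
u_lie = p * 6 + (1 - p) * 0      # true expected utility under the lie
u_truth = 4                      # utility of telling the truth (from the slide)
assert u_lie > u_truth
print(p, u_lie)                  # 3/4 9/2
```

So the lie yields 9/2 = 4.5 against 4 under the truth, confirming that hiding the letter to a is beneficial here.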
110
[Figure: Modular TOD example graph]
111
Conclusion
• In order to use negotiation protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• A program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of a proposal
  – Generation of a counter-proposal based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law – citing previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
103
bull FP2 in SubadditiveTOD any ONM over Mixed deals every ldquophantomrdquo lie has a positive probability of being discovered (as if other person delivers phantom you are found out)
bull FP3 in Concave TOD any ONM over Mixed deals no ldquodecoyrdquo lie is beneficial (as less increased cost is assumed so probabilities would be assigned to reflect the assumed extra work)
bull FP4 in Modular TOD any ONM over Pure deals no ldquodecoyrdquo lie is beneficial (modular tends to add exact cost ndash hard to win)
104
FP4
Suppose agent 2 lies about having a delivery to c
Under Lie ndash benefits are shown
(the apparent benefit is no different than the real benefit)
Under truth The uitlities are 42 and someone has to get the better deal (under a pure deal) JUST LIKE IN THIS CASE The lie makes no difference
Irsquom assuming we have some way of deciding who gets the better deal that is fair over time
1 U(1) 2 U(2)
Seems
U(2)
(act)
a 2 bc 4 4
b 4 ac 2 2
bc 2 a 4 2
ab 0 c 6 6
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
104
FP4
Suppose agent 2 lies about having a delivery to c
Under the lie, the benefits are as shown (the apparent benefit is no different from the real benefit).
Under truth: the utilities are 4 and 2, and someone has to get the better deal (under a pure deal), just like in this case. The lie makes no difference.
I'm assuming we have some way of deciding who gets the better deal that is fair over time.
A1 gets  U(1)   A2 gets  U(2) (seems)  U(2) (actual)
a        2      bc       4             4
b        4      ac       2             2
bc       2      a        4             2
ab       0      c        6             6
105
Non-incentive compatible fixed points
• FP5: in a Concave TOD, under any ONM over pure deals, "phantom" lies can be beneficial.
• Example (from next slide): A1 creates a phantom letter at node c; his utility rises from 3 to 4.
• Truth: p = ½, so agent 1's utility for the deal (a, b):½ is ½(4) + ½(2) = 3.
• Lie: (bca) is the logical division, as no probability is needed. Agent 1's utility is 6 (original cost) − 2 (deal cost) = 4.
106
• FP6: in a Subadditive TOD, under any ONM over all-or-nothing deals, "decoy" lies can be beneficial (not harmful), as the lie changes the probability ("if you deliver, I make you deliver to h").
• Ex. 2 (from next slide): A1 lies with a decoy letter to h (trying to make agent 2 think picking up b, c is worse for agent 1 than it is); his utility rises from 1.5 to 1.72. (If I deliver, I don't actually deliver to h.)
• If he tells the truth, p (the probability of agent 1 delivering all) = 9/14, as
p(−1) + (1−p)6 = p(4) + (1−p)(−3), so 14p = 9.
• If he invents task h, p = 11/18, as
p(−3) + (1−p)6 = p(4) + (1−p)(−5).
• Utility(p = 9/14) is p(−1) + (1−p)6 = −9/14 + 30/14 = 21/14 = 1.5.
• Utility(p = 11/18) is p(−1) + (1−p)6 = −11/18 + 42/18 = 31/18 ≈ 1.72.
• So lying helped.
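The two indifference equations above can be solved mechanically. A sketch using the slide's utility numbers (the deal structure itself is on the next slide): `fair_p` finds the probability p that makes the all-or-nothing split equally attractive to both agents, and we then evaluate agent 1's real expected utility:

```python
from fractions import Fraction

def fair_p(a1, b1, a2, b2):
    # Solve p*a1 + (1-p)*b1 == p*a2 + (1-p)*b2 for p.
    return Fraction(b1 - b2, (a2 - a1) + (b1 - b2))

# Truth: agent 1 gets -1 if he delivers everything, 6 if agent 2 does;
# agent 2 gets 4 and -3 respectively.
p_true = fair_p(-1, 6, 4, -3)              # 9/14
u_true = p_true * -1 + (1 - p_true) * 6    # 21/14 = 1.5

# The decoy letter at h changes the *claimed* utilities to (-3, 6)
# and (4, -5), but agent 1's real utilities stay (-1, 6).
p_lie = fair_p(-3, 6, 4, -5)               # 11/18
u_lie = p_lie * -1 + (1 - p_lie) * 6       # 31/18, about 1.72

print(p_true, u_true, p_lie, u_lie)
```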
107
Postmen – return to post office
[Figures: the Concave example (Phantom letter) and the Subadditive example (h is the decoy)]
108
Non-incentive compatible fixed points
• FP7: in a Modular TOD, under any ONM over pure deals, a "hide" lie can be beneficial (as you think I have less, so an increased load will cost me more than it really does).
• Ex. 3 (from next slide): A1 hides his letter to node b.
• Deal (e, b): utility for A1 (under the lie) is 0; utility for A2 (under the lie) is 4. UNFAIR (under the lie).
• Deal (b, e): utility for A1 (under the lie) is 2; utility for A2 (under the lie) is 2.
• So I get sent to b, but I really needed to go there anyway, so my utility is actually 4 (as I don't go to e).
109
• FP8: in a Modular TOD, under any ONM over mixed deals, "hide" lies can be beneficial.
• Ex. 4: A1 hides his letter to node a.
• A1's utility is 4.5 > 4 (the utility of telling the truth).
• Under truth: Util(faebcd):½ = 4 (each saves going to two places).
• Under the lie, divide as (efdcab):p; you always win and I always lose. Since the work is the same, swapping cannot help: in a mixed deal the choices must be unbalanced.
• Try again under the lie with (abcdef):p:
p(4) + (1−p)(0) = p(2) + (1−p)(6)
4p = −4p + 6, so p = ¾.
• The utility is actually ¾(6) + ¼(0) = 4.5.
• Note: when I get assigned c, d, e, f (¼ of the time), I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally): zero utility for that.
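The arithmetic in that second try can be checked directly (numbers taken from the slide; this is a verification sketch, not the general mechanism):

```python
from fractions import Fraction

# Indifference equation for the (abcdef):p division under the hide lie:
#   p*4 + (1-p)*0 = p*2 + (1-p)*6   =>   4p = -4p + 6   =>   8p = 6
p = Fraction(6, 8)

u_apparent = p * 4              # what the deal seems to give A1
u_actual = p * 6 + (1 - p) * 0  # real utility once the hidden letter is counted
print(p, u_apparent, u_actual)  # hiding (4.5) beats telling the truth (4)
```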
110
[Figure: the Modular TOD example (hide lies)]
111
Conclusion
– In order to use negotiation protocols, it is necessary to know when the protocols are appropriate
– TODs cover an important set of multi-agent interactions
112
113
MAS Compromise Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from the dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (as in a court of law: cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate the solution for applicability to the current case
– Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes the need for information exchange
– Avoids problems by reasoning from past failures (intentional reminding)
– Repairs for past failures are reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Change the importance attached to an issue
– Change the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory: 30% more proposals
• Without argumentation: fewer proposals and better solutions
• No failure avoidance: more proposals with objections
• No preference analysis: oscillatory condition
• No feedback: communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, ... By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent, rising from 0 at 0900 hrs to 100 at 1600 hrs]
Ref: Rosenschein & Zlotkin, 1994
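The first agent's worth curve can be sketched as a simple function: 0 at 0900 hrs rising to 100 at the preferred 1600 hrs slot. The linear shape is an assumption; the slide only fixes the endpoints:

```python
def worth_first_agent(hour):
    """Worth of a meeting time for the first agent: 0 at 9:00, rising
    to the maximum of 100 at 16:00 (linear interpolation assumed)."""
    lo, hi = 9, 16
    hour = max(lo, min(hi, hour))          # clamp to the day's range
    return 100 * (hour - lo) / (hi - lo)

print(worth_first_agent(16), worth_first_agent(12.5), worth_first_agent(9))
```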
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's offer curves cross at the point of acceptance]
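The convergence picture can be mimicked with a toy concession loop (all numbers illustrative, not from the slides): each agent starts at its ideal utility and concedes a fixed step per round until the offers cross:

```python
def concession_rounds(offer_i=10.0, offer_j=0.0, step_i=1.0, step_j=1.5):
    """Toy monotonic-concession run on a shared utility scale.
    Returns (rounds, agreed value) once the offers meet or cross."""
    rounds = 0
    while offer_i > offer_j:   # gap still open: both agents concede
        offer_i -= step_i
        offer_j += step_j
        rounds += 1
    return rounds, (offer_i + offer_j) / 2

print(concession_rounds())
```

With these step sizes the gap of 10 closes by 2.5 per round, so the offers meet after 4 rounds at a shared value of 6.0.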
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds; Agent i's and Agent j's offer curves never cross]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified: why did the agent pay so much for the car?
– Positions cannot be changed: initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• Four modes of argument (Gilbert, 1994):
1. Logical: "If you accept A and accept that A implies B, then you must accept B"
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet, showing the strength of their feelings
4. Kisceral: appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds)
where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
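A minimal sketch of this (Sentence, Grounds) view, using Horn-clause forward chaining to stand in for "can be proved from" (the encoding and the proof procedure are assumptions; the slide does not fix a proof system):

```python
# Facts are 1-tuples; rules (premise, conclusion) are 2-tuples.
database = {
    ("milk_is_good",),                    # fact
    ("milk_is_good", "cheese_is_good"),   # milk_is_good -> cheese_is_good
    ("milk_is_fatty",),                   # another, possibly conflicting fact
}

def proves(grounds, sentence):
    """Forward chaining: does `sentence` follow from `grounds`?"""
    known = {c[0] for c in grounds if len(c) == 1}
    changed = True
    while changed:
        changed = False
        for c in grounds:
            if len(c) == 2 and c[0] in known and c[1] not in known:
                known.add(c[1])
                changed = True
    return sentence in known

def argument(db, grounds, sentence):
    """(Sentence, Grounds) is an argument iff grounds are in db
    and the sentence can be proved from the grounds."""
    return grounds <= db and proves(grounds, sentence)

g = {("milk_is_good",), ("milk_is_good", "cheese_is_good")}
print(argument(database, g, "cheese_is_good"))  # True
```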
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
105
Non-incentive compatible fixed points
bull FP5 in Concave TOD any ONM over Pure deals ldquoPhantomrdquo lies can be beneficial
bull Example from next slideA1creates Phantom letter at node c his utility has risen from 3 to 4
bull Truth p = frac12 so utility for agent 1 is (ab) frac12 = frac12(4) + frac12(2) = 3
bull Lie (bca) is logical division as no percentbull Util for agent 1 is 6 (org cost) ndash 2(deal cost) = 4
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
106
bull FP6 in SubadditiveTOD any ONM over A-or-N deals ldquoDecoyrdquo lies can be beneficial (not harmful) (as it changes the probability If you deliver I make you deliver to h)
bull Ex2 (from next slide)A1lies with decoy letter to h (trying to make agent 2 think picking up bc is worse for agent 1 than it is) his utility has rised from 15 to 172 (If I deliver I donrsquot deliver h)
bull If tells truth p (of agent 1 delivering all) = 914 as bull p(-1) + (1-p)6 = p(4) + (1-p)(-3) 14p=9bull If invents task h p=1118 asbull p(-3) + (1-p)6 = p(4) + (1-p)(-5)bull Utility(p=914) is p(-1) + (1-p)6 = -914 +3014 = 2114 =
15bull Utility(p=1118) is p(-1) + (1-p)6 = -1118 +4218 = 3118
= 172bull SO ndash lying helped
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
• In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
• TODs cover an important set of multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
– Representing and maintaining belief models
– Reasoning about other agents' beliefs
– Influencing other agents' intentions and beliefs
114
PERSUADER ndash case study
• Program to resolve problems in the labor relations domain
• Agents:
– Company
– Union
– Mediator
• Tasks:
– Generation of proposal
– Generation of counter-proposal based on feedback from dissenting party
– Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
– Retrieve appropriate precedent cases from memory
– Select the most appropriate case
– Construct an appropriate solution
– Evaluate solution for applicability to current case
– Modify the solution appropriately
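The retrieve/select steps above amount to a nearest-neighbor lookup over past cases. A minimal sketch; the case names, features, and Jaccard similarity measure are all invented for illustration (PERSUADER's actual case representation is far richer):

```python
def similarity(case, problem):
    """Jaccard overlap between feature sets (an illustrative measure)."""
    a, b = set(case["features"]), set(problem["features"])
    return len(a & b) / len(a | b)

def retrieve_best(memory, problem):
    """Retrieve the precedent case most similar to the current problem."""
    return max(memory, key=lambda case: similarity(case, problem))

memory = [
    {"name": "wage dispute", "features": {"wages", "overtime"},
     "solution": "3% raise phased over two years"},
    {"name": "benefits dispute", "features": {"pension", "health"},
     "solution": "employer matches contributions"},
]
problem = {"features": {"wages", "overtime"}}
best = retrieve_best(memory, problem)  # the precedent to adapt for this case
```

The selected case's solution would then be evaluated and modified (the remaining steps of the process) rather than used verbatim.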
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
• Advantages:
– Minimizes need for information exchange
– Avoids problems by reasoning from past failures ("intentional reminding")
– Repair for a past failure is reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
– Maximizes joint payoff
– Minimizes payoff difference
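One simple reading of the two compromise properties: score each candidate contract for each agent with a weighted per-attribute utility, then pick the candidate that scores best on joint payoff minus payoff difference. The attributes, weights, ideal levels, and candidate contracts below are all invented for illustration:

```python
def utility(contract, weights, ideal):
    # 1 minus the weighted distance from the agent's ideal attribute levels
    return 1 - sum(w * abs(contract[a] - ideal[a]) for a, w in weights.items())

candidates = [
    {"wages": 0.8, "security": 0.2},
    {"wages": 0.5, "security": 0.5},
    {"wages": 0.2, "security": 0.8},
]
# (weights, ideal levels) per agent -- invented values
union   = ({"wages": 0.7, "security": 0.3}, {"wages": 1.0, "security": 1.0})
company = ({"wages": 0.6, "security": 0.4}, {"wages": 0.0, "security": 1.0})

def score(c):
    u1, u2 = utility(c, *union), utility(c, *company)
    return (u1 + u2) - abs(u1 - u2)   # reward joint payoff, penalize the gap

best = max(candidates, key=score)
```

With these numbers the balanced 0.5/0.5 contract wins: it has a slightly lower joint payoff than the extremes but no payoff difference between the parties.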
118
Persuasive argumentation
• Argumentation goals:
– Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
– Changing importance attached to an issue
– Changing utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00 hrs, she also assigns progressively smaller worths to a meeting at 15:00 hrs, 14:00 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Graph: worth function for the first agent – worth rises from 0 to 100 over the day, with x-axis ticks at 9, 12, and 16 hrs]
Ref Rosenschein amp Zlotkin 1994
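The worth curve in the figure can be modeled as a simple piecewise-linear function: worth 0 at 9:00 rising to 100 at 16:00. The endpoints come from the slide; the linear interpolation between them is an assumption:

```python
def worth(hour, lo=9.0, hi=16.0):
    """Worth (0..100) of a meeting time for the first agent."""
    if hour <= lo:
        return 0.0
    if hour >= hi:
        return 100.0
    # linear interpolation between the endpoints (assumed shape)
    return 100.0 * (hour - lo) / (hi - lo)
```

So accepting 15:00 instead of 16:00 costs this agent some worth, which she may trade away for other payoffs such as reduced travel costs.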
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
[Graph: utility of Agent_i and Agent_j vs. number of negotiation rounds; the two curves converge to a point of acceptance]
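The convergence picture can be reproduced with a toy monotonic-concession model: each side concedes a fixed step per round, and the point of acceptance is where the offers cross. All the numbers below are illustrative:

```python
def negotiate(seller_start, buyer_start, seller_step, buyer_step, max_rounds=100):
    """Return (round, agreed price) at the point of acceptance, or None."""
    s, b = seller_start, buyer_start
    for rnd in range(1, max_rounds + 1):
        if b >= s:                 # offers have crossed: agreement
            return rnd, (s + b) / 2
        s -= seller_step           # each agent concedes every round
        b += buyer_step
    return None                    # curves never cross: no agreement

rounds, price = negotiate(100, 20, 10, 8)   # concessions converge
deadlock = negotiate(100, 20, 0, 0)         # nobody concedes: no agreement
```

Setting the concession steps to zero also reproduces the no-agreement graph on the following slide, where the offer curves never meet.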
123
Utility Graphs - no agreement
• No agreement: Agent_j finds the offer unacceptable
[Graph: utility of Agent_i and Agent_j vs. number of negotiation rounds; the curves never cross, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified – why did the agent pay so much for the car?
– Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
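A minimal propositional sketch of the (Sentence, Grounds) idea: the grounds must be a subset of the database, and the sentence must be derivable from the grounds alone. The atoms and rule encoding here are invented; real systems use richer logics:

```python
def proves(grounds, sentence):
    """Forward-chain from the facts in `grounds` using its rules.
    Facts are strings; rules are (premises, conclusion) tuples."""
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if isinstance(g, tuple)]
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return sentence in facts

database = {
    "milk_is_good", "cheese_from_milk", "milk_is_fatty",
    (("milk_is_good", "cheese_from_milk"), "cheese_is_good"),
}
grounds = {"milk_is_good", "cheese_from_milk",
           (("milk_is_good", "cheese_from_milk"), "cheese_is_good")}

# An argument for "cheese_is_good": grounds drawn from the database, and
# the conclusion provable from those grounds.
argument_ok = grounds <= database and proves(grounds, "cheese_is_good")
```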
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it's fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (→u = undercuts, →r = rebuts, →a = attacks):
– A attacks B ≡ A →u B or A →r B
– A defeats B ≡ A →u B or (A →r B and not B →u A)
– A strongly attacks B ≡ A →a B and not B →u A
– A strongly undercuts B ≡ A →u B and not B →u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
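These definitions are plain set operations on the undercut and rebut relations, so the hierarchy can be checked directly. Relations are sets of ordered pairs; the example pairs are invented:

```python
def inv(rel):
    """Inverse of a binary relation given as a set of (x, y) pairs."""
    return {(y, x) for (x, y) in rel}

# Tiny invented example: u = undercut pairs, r = rebut pairs.
u = {("A", "B"), ("B", "A"), ("C", "B")}
r = {("C", "D")}

attacks            = u | r
defeats            = u | (r - inv(u))
strongly_attacks   = (u | r) - inv(u)
strongly_undercuts = u - inv(u)

# The hierarchy: each relation is contained in the next.
assert strongly_undercuts <= strongly_attacks <= defeats <= attacks
```

Here A and B undercut each other, so neither strongly undercuts the other; only C's unanswered attacks survive into the stronger relations.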
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is "out" if it has an undefeated attacker, and "in" if all its attackers are defeated
• Assumption – true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked by a set S if some member y of S attacks x (y → x)
2. argument x is acceptable with respect to S if every attacker of x is attacked by S
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
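Definitions 1–4 can be checked directly on a small graph. The attack relation below is invented for illustration (it is not the figure from the slide):

```python
# "x attacks y" pairs over four invented arguments a, b, c, d.
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def attackers(x):
    return {y for (y, z) in attacks if z == x}

def conflict_free(S):
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    # every attacker of x is itself attacked by some member of S
    return all(any((z, y) in attacks for z in S) for y in attackers(x))

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

assert admissible({"a", "d"})   # d's attacker c is attacked by a
assert not admissible({"d"})    # d alone cannot defend itself against c
assert not admissible({"c"})    # c's attackers go unanswered
```

This mirrors the example on the next slide: an argument like c that is always attacked can never join an admissible set, while one like d is fine as long as some member defends it.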
132
[Figure: example attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
107
Postmen ndash return to postoffice
Concave
Subadditive(h is decoy)
Phantom
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
108
Non incentive compatible fixed points
bull FP7 in Modular TOD any ONM over Pure deals ldquoHiderdquo lie can be beneficial (as you think I have less so increase load will cost more than it realy does)
bull Ex3 (from next slide) A1 hides his letter node bbull (eb) = utility for A1 (under lie) is 0 = utility for A2 (under lie) is 4 UNFAIR (under lie)
bull (be) = utility for A1 (under lie) is 2 = utility for A2 (under lie) is 2bull So I get sent to b but I really needed to go there
anyway so my utility is actually 4 (as I donrsquot go to e)
109
bull FP8in Modular TOD any ONM over Mixed deals ldquoHiderdquo lies can be beneficial
bull Ex4 A1 hides his letter to node abull A1rsquos Utility is 45 gt 4 (Utility of telling the truth)bull Under truth Util(faebcd)12 = 4 (save going to two)bull Under lie divide as (efdcab)p (you always win and I always lose
Since work is same swapping cannot help In a mixed deal the choices must be unbalanced
bull Try again under lie (abcdef)pbull p(4) + (1-p)(0) = p(2) + (1-p)(6)bull 4p = -4p + 6 bull p = 34 bull Utility is actuallybull 34(6) + 14(0) = 45bull Note when I get assigned cdef frac14 of the time I STILL have to
deliver to node a (after completing by agreed upon deliveries) So I end up going 5 places (which is what I was assigned originally) Zero utility to that
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – each is helped by joint plan
- Examples: Compromise – both can succeed, but worse for both than if the other agent weren't there
- Choices
- Compromise continued
- Example conflict
- Example: semi-cooperative
- Example: semi-cooperative, cont.
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
-
109
• FP8: in Modular TOD, any ONM over Mixed deals – "Hide" lies can be beneficial
• Ex 4: A1 hides his letter to node a
• A1's Utility is 4.5 > 4 (Utility of telling the truth)
• Under truth: Util((f,a,e,b,c,d) : 1/2) = 4 (save going to two)
• Under lie, divide as (e,f,d,c,a,b) : p (you always win and I always lose).
  Since the work is the same, swapping cannot help. In a mixed deal the choices must be unbalanced.
• Try again under lie: (a,b,c,d,e,f) : p
• p(4) + (1-p)(0) = p(2) + (1-p)(6)
• 4p = -4p + 6
• p = 3/4
• Utility is actually 3/4 · 6 + 1/4 · 0 = 4.5
• Note: when I get assigned c,d,e,f (1/4 of the time) I STILL have to deliver to node a (after completing my agreed-upon deliveries). So I end up going to 5 places (which is what I was assigned originally). Zero utility to that.
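The indifference calculation above is easy to check mechanically — a small sketch (the utility values 4/0 and 2/6 come from the slide; the helper function is mine):

```python
# Solve p*u1_hi + (1-p)*u1_lo = p*u2_hi + (1-p)*u2_lo for p:
# the probability that makes both agents indifferent in the mixed deal.
def indifference_p(u1_hi, u1_lo, u2_hi, u2_lo):
    return (u2_lo - u1_lo) / ((u1_hi - u1_lo) - (u2_hi - u2_lo))

# Slide's numbers: one side gets 4 or 0, the other 2 or 6.
p = indifference_p(4, 0, 2, 6)      # 0.75, i.e. p = 3/4
lie_utility = p * 6 + (1 - p) * 0   # the liar's true expected utility
print(p, lie_utility)               # 0.75 4.5
```

This reproduces the slide's arithmetic: p = 3/4, and the lie yields 4.5 > 4.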
110
Modular
111
Conclusion
– In order to use Negotiation Protocols, it is necessary to know when protocols are appropriate
– TODs cover an important set of Multi-agent interactions
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposal
  – Generation of counter proposal based on feedback from dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case Based Reasoning
• Uses past negotiation experiences as guides to present negotiation (like in a court of law – cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate solution for applicability to current case
  – Modify the solution appropriately
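A minimal sketch of the retrieve/select steps, with invented cases and a naive feature-overlap similarity (PERSUADER's actual case representation and retrieval are richer):

```python
# Each past case is a dict of features plus the outcome that settled it.
cases = [
    {"industry": "steel", "issue": "wages", "outcome": "3% raise"},
    {"industry": "transit", "issue": "pensions", "outcome": "phased plan"},
]
current = {"industry": "steel", "issue": "wages"}

def similarity(query, case):
    # count the features on which the query and the stored case agree
    return sum(1 for k in query if case.get(k) == query[k])

# Retrieve + select: the most similar precedent becomes the starting
# solution, to be evaluated and modified for the current dispute.
best = max(cases, key=lambda c: similarity(current, c))
print(best["outcome"])  # 3% raise
```

The remaining steps (evaluate, modify) adapt `best` to the current case rather than reusing it verbatim.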
116
Case Based Reasoning
• Cases organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Repair for past failure is reused; reduces computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Gets an overall utility curve out of individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
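A toy version of this scoring, with invented issues, weights, and options: each agent's utility is a weighted sum over issues, and the compromise maximizes joint payoff, breaking ties by minimizing the payoff difference.

```python
options = {"A": {"pay": 0.8, "hours": 0.2},
           "B": {"pay": 0.5, "hours": 0.6}}
weights = {"union":   {"pay": 0.7, "hours": 0.3},
           "company": {"pay": 0.2, "hours": 0.8}}

def utility(agent, opt):
    # additive multi-attribute utility: weighted sum over issues
    return sum(weights[agent][i] * v for i, v in options[opt].items())

def score(opt):
    u1, u2 = utility("union", opt), utility("company", opt)
    return (u1 + u2, -abs(u1 - u2))  # joint payoff first, then fairness

best = max(options, key=score)
print(best)  # B: higher joint payoff and smaller payoff difference
```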
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Change importance attached to an issue
  – Change utility value of an issue
119
Narrowing differences
• Gets feedback from rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases payoff of the rejecting party by a greater amount than it reduces payoff for the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g. reduced travel costs).
[Figure: worth function for first agent – worth rises from 0 to 100 over times 0900, 1200, 1600 hrs]
Ref: Rosenschein & Zlotkin, 1994
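The first agent's worth function can be sketched as a simple interpolation — assuming the curve is linear between the figure's endpoints (0 at 0900, 100 at 1600), which the plot suggests but does not state:

```python
def worth(hour, lo=9.0, hi=16.0):
    # clamp to the negotiable window, then interpolate 0..100 linearly
    hour = max(lo, min(hi, hour))
    return 100.0 * (hour - lo) / (hi - lo)

print(worth(16), worth(12.5), worth(9))  # 100.0 50.0 0.0
```

Conceding from 1600 to 1230 would cost this agent half her worth; whether that is acceptable depends on the other payoffs mentioned above.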
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually reach an agreement
[Figure: utility vs. number of negotiations – Agenti's and Agentj's offers converge over time to a point of acceptance]
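The converging curves can be reproduced with a toy concession loop (the integer step sizes are illustrative, chosen to avoid floating-point noise):

```python
def negotiate(demand_i=100, offer_j=0, step_i=10, step_j=15, rounds=50):
    # each round both agents concede; stop when the offers cross
    for r in range(rounds):
        if demand_i <= offer_j:
            return r, demand_i        # point of acceptance
        demand_i -= step_i            # agent i lowers its demand
        offer_j += step_j             # agent j raises its offer
    return None                       # deadline reached: no agreement

print(negotiate())  # (4, 60): agreement in round 4 at utility 60
```

Shrinking `rounds` (or the step sizes) produces the no-agreement case on the next slide: the curves never meet before the deadline.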
123
Utility Graphs - no agreement
• No agreement
• Agentj finds offer unacceptable
[Figure: utility vs. number of negotiations – Agenti's and Agentj's curves never meet; negotiation ends without agreement]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
  1. Logical – "If you accept A and accept A implies B, then you must accept that B"
  2. Emotional – "How would you feel if it happened to you?"
  3. Visceral – participant stamps their feet and shows the strength of their feelings
  4. Kisceral – appeals to the intuitive – doesn't this seem reasonable?
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds), where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    • Grounds ⊆ Database
    • Sentence can be proved from Grounds
  (we give reasons for our conclusions)
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if fatty
• Rebut (contradict conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts):
  – A attacks B ≡ A u B or A r B
  – A defeats B ≡ A u B or (A r B and not B u A)
  – A strongly attacks B ≡ A attacks B and not B u A
  – A strongly undercuts B ≡ A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
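Treating u and r as sets of ordered pairs makes these derived relations directly computable — a sketch with a two-argument example (the pair names are mine):

```python
u = {("x", "y")}   # x undercuts y
r = {("y", "x")}   # y rebuts x

def inv(rel):
    # rel⁻¹: reverse every pair in the relation
    return {(b, a) for (a, b) in rel}

attacks            = u | r               # a  = u ∪ r
defeats            = u | (r - inv(u))    # d  = u ∪ (r − u⁻¹)
strongly_attacks   = (u | r) - inv(u)    # sa = (u ∪ r) − u⁻¹
strongly_undercuts = u - inv(u)          # su = u − u⁻¹

print(defeats)  # {('x', 'y')}: y's rebuttal is cancelled by x's undercut
```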
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned as to what x, y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: true unless proven false
131
Admissible Arguments ndash mutually defensible
1. argument x is attacked (w.r.t. a set S) if some y attacks x (y → x) and no member of S attacks y
2. argument x is acceptable if every attacker of x is attacked
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
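These definitions can be checked by brute force on a small graph. The attack arrows below are made up (the slide's own a–d diagram is in the figure), but the enumeration shows the idea:

```python
from itertools import combinations

args = {"a", "b", "c", "d"}
attack = {("a", "b"), ("b", "c"), ("c", "d")}   # (x, y) means x attacks y

def conflict_free(S):
    return not any((x, y) in attack for x in S for y in S)

def acceptable(x, S):
    # every attacker of x is itself attacked by some member of S
    return all(any((z, y) in attack for z in S)
               for (y, t) in attack if t == x)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

subsets = [set(c) for n in range(len(args) + 1)
           for c in combinations(sorted(args), n)]
print([sorted(S) for S in subsets if admissible(S)])
# [[], ['a'], ['a', 'c']] for the chain a -> b -> c -> d
```

For the chain, {a, c} is admissible because a defends c against its attacker b; {c} alone is not, since nothing in it attacks b.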
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
110
Modular
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
111
Conclusion
ndash 1048698In order to use Negotiation Protocols it is necessary to know when protocols are appropriate
ndash 1048698TODrsquoscover an important set of Multi-agent interaction
112
113
MAS Compromise Negotiation process for conflicting goals
bull Identify potential interactionsbull Modify intentions to avoid harmful interactions or
create cooperative situations
bull Techniques requiredndash Representing and maintaining belief modelsndash Reasoning about other agents beliefsndash Influencing other agents intentions and beliefs
114
PERSUADER ndash case study
bull Program to resolve problems in labor relations domainbull Agents
ndash Companyndash Unionndash Mediator
bull Tasksndash Generation of proposalndash Generation of counter proposal based on feedback from
dissenting partyndash Persuasive argumentation
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
bull Cases organized and retrieved according to conceptual similarities
bull Advantagesndash Minimizes need for information exchangendash Avoids problems by reasoning from past failures Intentional
remindingndash Repair for past failure is used Reduces computation
117
Negotiation Methods Preference Analysis
bull From scratch planning methodbull Based on multi attribute utility theorybull Gets a overall utility curve out of individual onesbull Expresses the tradeoffs an agent is willing to makebull Property of the proposed compromise
ndash Maximizes joint payoffndash Minimizes payoff difference
118
Persuasive argumentation
bull Argumentation goalsndash Ways that an agentrsquos beliefs and behaviors can be affected by
an argument
bull Increasing payoffndash Change importance attached to an issuendash Changing utility value of an issue
119
Narrowing differences
bull Gets feedback from rejecting partyndash Objectionable issuesndash Reason for rejectionndash Importance attached to issues
bull Increases payoff of rejecting party by greater amount than reducing payoff for agreed parties
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
112
113
MAS Compromise: Negotiation process for conflicting goals
• Identify potential interactions
• Modify intentions to avoid harmful interactions or create cooperative situations
• Techniques required:
  – Representing and maintaining belief models
  – Reasoning about other agents' beliefs
  – Influencing other agents' intentions and beliefs
114
PERSUADER – case study
• Program to resolve problems in the labor-relations domain
• Agents:
  – Company
  – Union
  – Mediator
• Tasks:
  – Generation of proposals
  – Generation of counter-proposals based on feedback from the dissenting party
  – Persuasive argumentation
115
Negotiation Methods: Case-Based Reasoning
• Uses past negotiation experiences as guides to the present negotiation (as in a court of law – cite previous decisions)
• Process:
  – Retrieve appropriate precedent cases from memory
  – Select the most appropriate case
  – Construct an appropriate solution
  – Evaluate the solution for applicability to the current case
  – Modify the solution appropriately
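The retrieve–select–reuse process above can be sketched as follows. The case representation and the similarity measure here are invented for illustration; they are not PERSUADER's actual data structures.

```python
def similarity(case, current):
    """Fraction of the current dispute's features that a stored case matches."""
    shared = sum(1 for f in current if case["features"].get(f) == current[f])
    return shared / len(current)

def propose_from_cases(case_base, current):
    """Retrieve precedents, select the most similar, reuse (and repair) its solution."""
    best = max(case_base, key=lambda c: similarity(c, current))
    solution = dict(best["solution"])            # construct solution from the precedent
    if best.get("failed"):                       # reason from past failures:
        solution.update(best.get("repair", {}))  # apply the stored repair
    return solution
```

Because the repair recorded with a failed precedent is reused directly, the agent avoids recomputing a fix it has already found once.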
116
Case-Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – Repairs for past failures are reused, reducing computation
117
Negotiation Methods: Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Derives an overall utility curve from the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
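A minimal sketch of the additive multi-attribute utility model and the compromise criterion above. The attribute names, weights, and candidate contracts are illustrative assumptions, not PERSUADER's own.

```python
def utility(contract, weights, scores):
    """Additive multi-attribute utility: weighted sum of per-attribute scores."""
    return sum(weights[attr] * scores[attr][contract[attr]] for attr in weights)

def pick_compromise(contracts, agents):
    """Prefer the contract with the highest joint payoff, breaking ties in
    favor of the smallest payoff difference between the agents."""
    def key(contract):
        payoffs = [utility(contract, w, s) for (w, s) in agents]
        return (sum(payoffs), -(max(payoffs) - min(payoffs)))
    return max(contracts, key=key)
```

Sorting lexicographically by (joint payoff, negated payoff gap) captures both properties: the sum dominates, and the gap only decides between contracts with equal joint payoff.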
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Changing the importance attached to an issue
  – Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 16:00, she also assigns progressively smaller worths to a meeting at 15:00, 14:00, … By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent – worth rises from 0 at 9:00, through 12:00, to 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
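The worth curve in the figure can be modeled as a simple ramp. The endpoints (0 at 9:00, 100 at 16:00) come from the figure; the linear shape between them is an assumption for the sketch.

```python
def worth_first_agent(hour, lo=9.0, hi=16.0, peak=100.0):
    """Worth of a meeting at `hour` for the first agent: 0 at 9:00,
    rising (assumed linearly) to 100 at her preferred time of 16:00."""
    hour = max(lo, min(hi, hour))          # clamp to the plotted range
    return peak * (hour - lo) / (hi - lo)
```

With this curve, conceding from 16:00 to 14:00 costs the agent about 29 worth points, which she may accept if the side payoffs (e.g., reduced travel costs) compensate.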
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offers converge over time to the point of acceptance]
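The converging-offers picture can be illustrated with a toy pie-splitting model, which is an invention for this sketch rather than the protocol from the slides: both agents start by demanding the whole pie and concede a fixed amount each round until their demands are compatible.

```python
def concession_rounds(demand_i=1.0, demand_j=1.0, step=0.1):
    """Run rounds of mutual concession over shares of a unit pie; stop at
    the point of acceptance, where the two demands become compatible."""
    rounds = 0
    while demand_i + demand_j > 1.0 + 1e-9:
        demand_i -= step                    # each agent concedes every round
        demand_j -= step
        rounds += 1
    return rounds, demand_i, demand_j
```

Because both demands fall monotonically, the sum must eventually drop below the pie size, mirroring the guaranteed convergence shown in the graph.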
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Figure: utility vs. number of negotiation rounds – Agent i's and Agent j's offers never meet, so no agreement is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• Four modes of argument (Gilbert 1994):
  1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
  2. Emotional – "How would you feel if it happened to you?"
  3. Visceral – the participant stamps their feet and shows the strength of their feelings
  4. Kisceral – appeals to the intuitive: "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
  Database ⊢ (Sentence, Grounds)
  where:
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database, and
    2. Sentence can be proved from Grounds
  (we give reasons for our conclusions)
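A minimal sketch of this check, under a simplifying assumption: the database holds atomic facts (strings) and Horn rules encoded as (premises, conclusion) tuples, and "can be proved" is stood in for by forward chaining. A real argumentation system would use a full theorem prover.

```python
def closure(formulas):
    """All facts derivable by forward chaining from the atomic facts and
    Horn rules (encoded as (premises_tuple, conclusion) pairs) in `formulas`."""
    facts = {f for f in formulas if isinstance(f, str)}
    rules = [f for f in formulas if isinstance(f, tuple)]
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def is_argument(database, sentence, grounds):
    """(Sentence, Grounds) is an argument over Database iff
    Grounds is a subset of Database and Sentence is provable from Grounds."""
    return set(grounds) <= set(database) and sentence in closure(grounds)
```

The grounds are exactly the reason we give for the conclusion: dropping any formula the proof needs makes the pair fail the check.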
127
Attacking Arguments
An example argument:
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B ≡ A u B or A r B
  – A defeats B ≡ A u B, or (A r B and not B u A)
  – A strongly attacks B ≡ A a B and not B u A
  – A strongly undercuts B ≡ A u B and not B u A
129
Proposition Hierarchy of attacks
Viewing each notion as a binary relation, where u⁻¹ denotes the inverse of u:
  Undercuts = u
  Strongly undercuts = su = u − u⁻¹
  Strongly attacks = sa = (u ∪ r) − u⁻¹
  Defeats = d = u ∪ (r − u⁻¹)
  Attacks = a = u ∪ r
These form a hierarchy: su ⊆ sa ⊆ d ⊆ a.
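Treating u and r as sets of (attacker, target) pairs, these definitions can be checked mechanically. A small sketch; the example relations in the test are invented.

```python
def inverse(rel):
    """Inverse of a binary relation given as a set of (x, y) pairs."""
    return {(y, x) for (x, y) in rel}

def derived_attacks(u, r):
    """Build the derived attack relations from undercuts u and rebuts r."""
    attacks = u | r                          # a = u ∪ r
    defeats = u | (r - inverse(u))           # d = u ∪ (r − u⁻¹)
    strongly_attacks = (u | r) - inverse(u)  # sa = (u ∪ r) − u⁻¹
    strongly_undercuts = u - inverse(u)      # su = u − u⁻¹
    return attacks, defeats, strongly_attacks, strongly_undercuts
```

Note how subtracting u⁻¹ removes exactly the "B undercuts A back" cases, which is what the "not B u A" clauses on the previous slide express.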
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of individual arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption – an argument is in unless proven otherwise
131
Admissible Arguments – mutually defensible
Relative to a set of arguments S:
1. argument x is attacked by S if some member y of S attacks x (y → x)
2. argument x is acceptable with respect to S if every attacker of x is attacked by S
3. S is conflict-free if no two arguments in it attack each other
4. S is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: example attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
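These definitions can be computed directly by brute force over subsets. Since the attack graph from the slide's figure is lost, the test below uses a classic three-argument chain (a → b → c) as a stand-in example.

```python
from itertools import combinations

def attacked_by(s, x, attacks):
    """x is attacked by set s if some member of s attacks x."""
    return any((y, x) in attacks for y in s)

def acceptable(x, s, attacks, args):
    """x is acceptable w.r.t. s if every attacker of x is attacked by s."""
    return all(attacked_by(s, y, attacks) for y in args if (y, x) in attacks)

def admissible_sets(args, attacks):
    """Enumerate the conflict-free sets whose members are all acceptable."""
    result = []
    for k in range(len(args) + 1):
        for combo in combinations(sorted(args), k):
            s = set(combo)
            conflict_free = not any((x, y) in attacks for x in s for y in s)
            if conflict_free and all(acceptable(x, s, attacks, args) for x in s):
                result.append(s)
    return result
```

In the chain a → b → c, the set {a, c} is admissible because a defends c against b, while {c} alone is not: c cannot counter its attacker by itself.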
133
An Example Abstract Argument System
115
Negotiation Methods Case Based Reasoning
bull Uses past negotiation experiences as guides to present negotiation (like in court of law ndash cite previous decisions)
bull Processndash Retrieve appropriate precedent cases from memoryndash Select the most appropriate casendash Construct an appropriate solutionndash Evaluate solution for applicability to current casendash Modify the solution appropriately
116
Case Based Reasoning
• Cases are organized and retrieved according to conceptual similarities
• Advantages:
  – Minimizes the need for information exchange
  – Avoids problems by reasoning from past failures (intentional reminding)
  – The repair for a past failure is reused, reducing computation
117
Negotiation Methods Preference Analysis
• From-scratch planning method
• Based on multi-attribute utility theory
• Builds an overall utility curve out of the individual ones
• Expresses the tradeoffs an agent is willing to make
• Properties of the proposed compromise:
  – Maximizes joint payoff
  – Minimizes payoff difference
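A minimal sketch of such a preference analysis, assuming simple additive (linear) multi-attribute utilities; the issues, weights, and candidate offers are invented:

```python
# Each agent weights the issues differently; an offer assigns each issue
# a settlement level in [0, 1].

def utility(offer, weights):
    """Additive multi-attribute utility."""
    return sum(weights[issue] * value for issue, value in offer.items())

agent1 = {"wage": 0.7, "pension": 0.3}
agent2 = {"wage": 0.4, "pension": 0.6}

offers = [
    {"wage": 1.0, "pension": 0.0},
    {"wage": 0.6, "pension": 0.6},
    {"wage": 0.0, "pension": 1.0},
]

def score(offer):
    u1, u2 = utility(offer, agent1), utility(offer, agent2)
    # maximize joint payoff; break ties toward the smaller payoff difference
    return (u1 + u2, -abs(u1 - u2))

best = max(offers, key=score)
print(best)  # {'wage': 0.6, 'pension': 0.6}
```

The middle offer wins: it has the highest joint payoff and zero payoff difference, which is exactly the compromise property the slide describes.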
118
Persuasive argumentation
• Argumentation goals:
  – Ways that an agent's beliefs and behaviors can be affected by an argument
• Increasing payoff:
  – Changing the importance attached to an issue
  – Changing the utility value of an issue
119
Narrowing differences
• Gets feedback from the rejecting party:
  – Objectionable issues
  – Reason for rejection
  – Importance attached to issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoff of the agreeing parties
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to
meet later in the day, while the second wishes to meet earlier in the
day. Both prefer today to tomorrow. While the first agent assigns the
highest worth to a meeting at 16:00, she also assigns progressively
smaller worths to a meeting at 15:00, 14:00, … By showing flexibility
and accepting a sub-optimal time, an agent can accept a lower worth,
which may have other payoffs (e.g., reduced travel costs).
Worth function for the first agent:
[Figure: worth is 0 at 9:00, rising to 100 at 16:00]
Ref: Rosenschein & Zlotkin, 1994
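The worth functions can be sketched as follows. The linear shape between 9:00 and 16:00 is an assumption; the slide only fixes the endpoints (0 at 9:00, 100 at 16:00 for the first agent), and the second agent's mirror-image function is likewise invented:

```python
# Worth functions for the meeting-time example.

def worth_agent1(hour):
    """First agent: worth rises linearly from 0 at 9:00 to 100 at 16:00."""
    if not 9 <= hour <= 16:
        return 0
    return round(100 * (hour - 9) / (16 - 9))

def worth_agent2(hour):
    """Second agent (assumed mirror image): prefers earlier meetings."""
    if not 9 <= hour <= 16:
        return 0
    return round(100 * (16 - hour) / (16 - 9))

# A flexible agent can accept a sub-optimal time that still has
# positive worth for both parties.
for h in range(9, 17):
    print(h, worth_agent1(h), worth_agent2(h))
```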
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility of Agent_i and Agent_j against the number of negotiation rounds; the two curves meet at the point of acceptance]
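The convergence picture can be simulated with a toy concession loop; the starting demands, the concession step, and the size of the "pie" are invented numbers:

```python
# Both agents concede a fixed amount each round until their demands
# become jointly feasible - the point of acceptance.

def negotiate(demand_i=100, demand_j=100, step=10, pie=100):
    """Return (rounds, final demand_i, final demand_j)."""
    rounds = 0
    while demand_i + demand_j > pie:
        demand_i -= step  # Agent_i concedes
        demand_j -= step  # Agent_j concedes
        rounds += 1
    return rounds, demand_i, demand_j

print(negotiate())  # (5, 50, 50): agreement after 5 rounds at a 50/50 split
```

If neither agent ever concedes enough for the demands to fit the pie, the loop never terminates, which is the "no agreement" picture on the next slide.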
123
Utility Graphs - no agreement
• No agreement: Agent_j finds the offer unacceptable
[Figure: utility of Agent_i and Agent_j against the number of negotiation rounds; the two curves never meet]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
  – Positions cannot be justified – why did the agent pay so much for the car?
  – Positions cannot be changed – initially I wanted a car with a sun roof, but I changed my preference during the buying process
125
• Four modes of argument (Gilbert, 1994):
1. Logical – "If you accept A, and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – the participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive: "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: Database ⊢ (Sentence, Grounds), where
  – Database is a (possibly inconsistent) set of logical formulae
  – Sentence is a logical formula known as the conclusion
  – Grounds is a set of logical formulae such that:
    1. Grounds ⊆ Database
    2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
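A toy illustration of checking the two conditions on Grounds. Real logic-based argumentation uses full logical proof; here formulae are restricted to atoms (strings) and Horn-style rules (premises, conclusion) so the sketch stays short:

```python
# Forward-chaining check that Sentence can be proved from Grounds,
# with Grounds drawn from a possibly inconsistent Database.

def proves(grounds, sentence):
    """Facts are strings; rules are (tuple_of_premises, conclusion) pairs."""
    facts = {g for g in grounds if isinstance(g, str)}
    rules = [g for g in grounds if not isinstance(g, str)]
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return sentence in facts

database = ["milk_is_good", "cheese_from_milk", "not_cheese_is_good",
            (("milk_is_good", "cheese_from_milk"), "cheese_is_good")]
grounds = ["milk_is_good", "cheese_from_milk",
           (("milk_is_good", "cheese_from_milk"), "cheese_is_good")]

assert all(g in database for g in grounds)   # condition 1: Grounds ⊆ Database
print(proves(grounds, "cheese_is_good"))     # condition 2 holds: True
```

Note the database is inconsistent overall (it also contains the contrary fact), yet the argument is still well-formed: its grounds alone prove the conclusion.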
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Cheese is good for you

Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature (u = undercuts, r = rebuts, a = attacks):
  – A attacks B ≡ A u B or A r B
  – A defeats B ≡ A u B or (A r B and not B u A)
  – A strongly attacks B ≡ A a B and not B u A
  – A strongly undercuts B ≡ A u B and not B u A
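With u and r represented as sets of (attacker, attacked) pairs, the derived notions become direct set tests. The two example pairs are taken from the milk/cheese slide; representing relations as pair sets is an implementation choice, not part of the original definitions:

```python
# Base attack relations as sets of (attacker, attacked) pairs.
u = {("milk_fatty", "milk_good")}            # undercuts a premise
r = {("cheese_bad_bones", "cheese_good")}    # rebuts a conclusion

def attacks(A, B):            return (A, B) in u or (A, B) in r
def defeats(A, B):            return (A, B) in u or ((A, B) in r and (B, A) not in u)
def strongly_attacks(A, B):   return attacks(A, B) and (B, A) not in u
def strongly_undercuts(A, B): return (A, B) in u and (B, A) not in u

print(attacks("milk_fatty", "milk_good"))          # True
print(defeats("cheese_bad_bones", "cheese_good"))  # True
```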
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u − u⁻¹
Strongly attacks = sa = (u ∪ r) − u⁻¹
Defeats = d = u ∪ (r − u⁻¹)
Attacks = a = u ∪ r
(where u⁻¹ is the inverse of the relation u)
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate:
  – "argument x attacks argument y"
  – "x is a counterexample of y"
  – "x is an attacker of y"
  where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all its attackers are defeated
• Assumption: an argument is in unless proven otherwise
131
Admissible Arguments – mutually defensible
1. argument x is attacked by a set S if some member y of S attacks x (y → x)
2. argument x is acceptable with respect to S if every attacker of x is attacked by S
3. an argument set is conflict-free if none of its members attack each other
4. a set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
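These four definitions translate almost verbatim into code. The attack graph below is hypothetical (it is not the a–d example on the next slide, whose edges are not given in the text):

```python
# An abstract argument system: a set of (attacker, attacked) pairs.
attacks = {("a", "b"), ("b", "c"), ("d", "c")}

def attacked_by(S, x):
    """Set S attacks argument x if some member of S attacks x."""
    return any((y, x) in attacks for y in S)

def acceptable(x, S):
    """x is acceptable w.r.t. S if S attacks every attacker of x."""
    return all(attacked_by(S, y) for (y, z) in attacks if z == x)

def conflict_free(S):
    """No member of S attacks another member of S."""
    return not any((y, z) in attacks for y in S for z in S)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

print(admissible({"a", "d"}))  # True: unattacked arguments defend themselves
print(admissible({"b"}))       # False: b's attacker a is not attacked by {b}
```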
132
[Figure: attack graph over arguments a, b, c, d]
Which sets of arguments can be true? c is always attacked; d is always acceptable.
133
An Example Abstract Argument System
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
119
Narrowing differences
• Gets feedback from the rejecting party:
– Objectionable issues
– Reason for rejection
– Importance attached to the issues
• Increases the payoff of the rejecting party by a greater amount than it reduces the payoffs of the parties that had agreed
120
Experiments
• Without memory – 30% more proposals
• Without argumentation – fewer proposals and better solutions
• No failure avoidance – more proposals with objections
• No preference analysis – oscillatory condition
• No feedback – communication overhead increased by 23%
121
Multiple Attribute Example
Two agents are trying to set up a meeting. The first agent wishes to meet later in the day, while the second wishes to meet earlier in the day. Both prefer today to tomorrow. While the first agent assigns the highest worth to a meeting at 1600 hrs, she also assigns progressively smaller worths to a meeting at 1500 hrs, 1400 hrs, …
By showing flexibility and accepting a sub-optimal time, an agent can accept a lower worth, which may have other payoffs (e.g., reduced travel costs).
[Figure: worth function for the first agent, rising from 0 to a maximum of 100 at 1600; x-axis marks at 9, 12, 16]
Ref: Rosenschein & Zlotkin, 1994
122
Utility Graphs - convergence
• Each agent concedes in every round of negotiation
• Eventually they reach an agreement
[Figure: utility of Agent i and Agent j plotted against the number of negotiation rounds; the curves converge to the point of acceptance]
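The convergence picture can be mimicked with a toy one-dimensional sketch (all numbers here are arbitrary illustrative values, not from the slides): each agent concedes a fixed amount per round until the offers cross.

```python
# Toy sketch of the convergence graph: both agents concede every round,
# so their offers (a single price here) move toward each other until
# they overlap, i.e. each finds the other's offer acceptable.
offer_i, offer_j = 10.0, 50.0      # agent i bids low, agent j asks high
step_i, step_j = 4.0, 5.0          # concession per round (arbitrary)

round_no = 0
while offer_i < offer_j:           # offers have not crossed yet
    offer_i += step_i              # agent i concedes upward
    offer_j -= step_j              # agent j concedes downward
    round_no += 1

print(round_no, offer_i, offer_j)  # prints: 5 30.0 25.0
```

Because both offers move monotonically toward each other, agreement is guaranteed in finitely many rounds; the no-agreement graph on the next slide corresponds to concession steps too small (or zero) for the curves ever to meet.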
123
Utility Graphs - no agreement
• No agreement: Agent j finds the offer unacceptable
[Figure: utility of Agent i and Agent j plotted against the number of negotiation rounds; no point of acceptance is reached]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiation? Game-theoretic approaches have limitations:
– Positions cannot be justified. Why did the agent pay so much for the car?
– Positions cannot be changed. Initially I wanted a car with a sun roof, but I changed my preference during the buying process.
125
• Four modes of argument (Gilbert 1994):
1. Logical: "If you accept A and accept that A implies B, then you must accept B."
2. Emotional: "How would you feel if it happened to you?"
3. Visceral: the participant stamps their feet to show the strength of their feelings.
4. Kisceral: appeals to the intuitive. "Doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation: an argument over the Database is a pair (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
1. Grounds ⊆ Database, and
2. Sentence can be proved from Grounds
(we give reasons for our conclusions)
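As a toy illustration of the (Sentence, Grounds) form, here is a minimal sketch using a propositional Horn-clause encoding (the representation and all names are my own, much simpler than the slide's general logical formulae): it checks that the grounds come from the database and that the conclusion is derivable from the grounds alone.

```python
# A rule is (premises, conclusion); a fact is ((), conclusion).
def derivable(sentence, grounds):
    """Forward-chain over Horn rules in `grounds` until a fixpoint."""
    known = set()
    changed = True
    while changed:
        changed = False
        for premises, conclusion in grounds:
            if conclusion not in known and all(p in known for p in premises):
                known.add(conclusion)
                changed = True
    return sentence in known

def is_argument(sentence, grounds, database):
    # Grounds must be drawn from the (possibly inconsistent) database,
    # and the sentence must be provable from the grounds alone.
    return grounds <= database and derivable(sentence, grounds)

database = {
    ((), "milk_good"),
    ((), "cheese_from_milk"),
    (("milk_good", "cheese_from_milk"), "cheese_good"),
    ((), "milk_fatty"),   # extra, potentially conflicting, information
}
grounds = {
    ((), "milk_good"),
    ((), "cheese_from_milk"),
    (("milk_good", "cheese_from_milk"), "cheese_good"),
}
print(is_argument("cheese_good", grounds, database))  # prints: True
```

Note that the database may be inconsistent (it also records that milk is fatty); an argument only needs a consistent-looking subset of it as grounds, which is exactly what makes the attacks on the next slides possible.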
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if it is fatty
• Rebut (contradict the conclusion): cheese is bad for your bones
128
Attacking arguments
• Derived notions of attack used in the literature:
– A attacks B ≡ A undercuts B or A rebuts B
– A defeats B ≡ A undercuts B, or (A rebuts B and B does not undercut A)
– A strongly attacks B ≡ A attacks B and B does not undercut A
– A strongly undercuts B ≡ A undercuts B and B does not undercut A
129
Proposition: Hierarchy of attacks
Writing u for the undercut relation, r for the rebut relation, and X⁻¹ for the inverse of a relation X:
Undercuts: u
Strongly undercuts: su = u − u⁻¹
Strongly attacks: sa = (u ∪ r) − u⁻¹
Defeats: d = u ∪ (r − u⁻¹)
Attacks: a = u ∪ r
Hence su ⊆ sa ⊆ d ⊆ a.
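These relations can be computed mechanically once undercut and rebut are given as sets of ordered pairs. A small sketch (the example pairs are made up for illustration) that also checks the hierarchy:

```python
# Derived attack relations over undercut (u) and rebut (r),
# encoded as sets of ordered (attacker, attacked) pairs.
def inverse(rel):
    return {(b, a) for (a, b) in rel}

def derived(u, r):
    return {
        "attacks":            u | r,                  # a = u ∪ r
        "defeats":            u | (r - inverse(u)),   # d = u ∪ (r − u⁻¹)
        "strongly_attacks":   (u | r) - inverse(u),   # sa = (u ∪ r) − u⁻¹
        "strongly_undercuts": u - inverse(u),         # su = u − u⁻¹
    }

u = {("A", "B"), ("B", "A"), ("C", "B")}   # hypothetical undercuts
r = {("B", "C"), ("C", "D")}               # hypothetical rebuts
rel = derived(u, r)

# The hierarchy of the proposition: su ⊆ sa ⊆ d ⊆ a
assert (rel["strongly_undercuts"] <= rel["strongly_attacks"]
        <= rel["defeats"] <= rel["attacks"])
print(sorted(rel["defeats"]))  # prints: [('A', 'B'), ('B', 'A'), ('C', 'B'), ('C', 'D')]
```

For instance, B rebuts C but C undercuts B back, so (B, C) is an attack yet not a defeat, which is exactly the gap between a and d.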
130
Abstract Argumentation
• Concerned with the overall structure of the argument (rather than the internals of arguments)
• Write x → y to indicate any of:
– "argument x attacks argument y"
– "x is a counterexample of y"
– "x is an attacker of y"
where we are not actually concerned with what x and y are
• An abstract argument system is a collection of arguments together with a relation "→" saying what attacks what
• An argument is out if it has an undefeated attacker, and in if all of its attackers are defeated
• Assumption: an argument is true unless proven false
131
Admissible Arguments – mutually defensible
1. An argument x is attacked (with respect to a set of arguments) if there is some y with y → x that no member of the set attacks
2. An argument x is acceptable if every attacker of x is attacked
3. An argument set is conflict-free if none of its members attack each other
4. A set is admissible if it is conflict-free and each of its arguments is acceptable (any attackers are attacked)
132
[Figure: argument graph over the four arguments a, b, c, d]
Which sets of arguments can be true? In this graph, c is always attacked and d is always acceptable.
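Definitions 1–4 can be checked by brute force on a small framework. The attack edges below are an assumed reconstruction of the four-node figure (a and b attack each other, both attack c, and c attacks d), chosen so that c is always attacked while d can always be defended; the code enumerates every admissible set.

```python
from itertools import combinations

# Brute-force admissible sets of a small abstract argument system.
# Attack edges are hypothetical: a<->b, a->c, b->c, c->d.
args = {"a", "b", "c", "d"}
attacks = {("a", "b"), ("b", "a"), ("a", "c"), ("b", "c"), ("c", "d")}

def conflict_free(S):
    # no member of S attacks another member of S
    return not any((x, y) in attacks for x in S for y in S)

def acceptable(x, S):
    # every attacker y of x is itself attacked by some member of S
    return all(any((z, y) in attacks for z in S)
               for (y, t) in attacks if t == x)

def admissible(S):
    return conflict_free(S) and all(acceptable(x, S) for x in S)

sets = [set(c) for n in range(len(args) + 1)
               for c in combinations(sorted(args), n)]
for S in filter(admissible, sets):
    print(sorted(S))   # prints: [], ['a'], ['b'], ['a', 'd'], ['b', 'd']
```

With these edges, c never appears in an admissible set (it always has an unattacked attacker, a or b), while d is defended by whichever of a or b is accepted, matching the slide's observations.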
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox – remove loser, winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms, Protocols, Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanism Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession-based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single-issue negotiation
- Multiple-issue negotiation
- How many agents are involved?
- Negotiation Domains: Task-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home – like UHaul)
- Dominant Deals
- Negotiation Set: Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol – one direction, move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent1 delivered to a, agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to dist. point)
- Parcel Delivery Domain Example 2 (Zeuthen works here; both concede on equal risk)
- What bothers you about the previous agreement?
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly?
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples: Cooperative – each is helped by joint plan
- Examples: Compromise – both can succeed, but worse for both than if other agent weren't there
- Choices
- Compromise continued
- Example: conflict
- Example: semi-cooperative
- Example: semi-cooperative cont.
- Negotiation Domains: Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility?
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non-incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise: Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods: Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods: Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs – convergence
- Utility Graphs – no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition: Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
-
120
Experiments
bull Without Memory ndash 30 more proposalsbull Without argumentation ndash fewer proposals and
better solutionsbull No failure avoidance ndash more proposals with
objectionsbull No preference analysis ndash Oscillatory conditionbull No feedback ndash communication overhead
increased by 23
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
121
Multiple Attribute Example
2 agents are trying to set up a meeting The first agent wishes to
meet later in the day while the second wishes to meet earlier in the
day Both prefer today to tomorrow While the first agent assigns
highest worth to a meeting at 1600hrs she also assigns
progressively smaller worths to a meeting at 1500hrs 1400hrshellip
By showing flexibility and accepting a sub-optimal time an agent
can accept a lower worth which may have other payoffs (eg
reduced travel costs)
Worth function for first agent
0
100
9 12 16
Ref Rosenschein amp Zlotkin 1994
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
122
Utility Graphs - convergence
bull Each agent concedes in every round of negotiation
bull Eventually reach an agreement
time
Utility
No of negotiations
Agentj
Agenti
Point of acceptance
123
Utility Graphs - no agreement
bullNo agreement
Agentj finds offer unacceptable
time
Utility
Agentj
Agenti
No of negotiations
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy – a refinement of the monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain: recall agent1 delivered to a, agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (don't return to dist. point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices don't benefit others fairly?
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples: Compromise – Both can succeed, but worse for both than if the other agent weren't there
- Choices
- Compromise continued
- Example conflict
- Example: semi-cooperative
- Example: semi-cooperative, cont.
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen – return to post office
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER – case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments – mutually defensible
- Slide 132
- An Example Abstract Argument System
123
Utility Graphs - no agreement
• No agreement: Agentj finds the offer unacceptable
[Figure: utility of Agenti and Agentj plotted against the number of negotiations]
124
Argumentation
• The process of attempting to convince others of something
• Why argument-based negotiations? Game-theoretic approaches have limitations:
• Positions cannot be justified – Why did the agent pay so much for the car?
• Positions cannot be changed – Initially I wanted a car with a sun roof, but I changed preference during the buying process
125
• 4 modes of argument (Gilbert 1994):
1. Logical – "If you accept A and accept that A implies B, then you must accept B"
2. Emotional – "How would you feel if it happened to you?"
3. Visceral – participant stamps their feet and shows the strength of their feelings
4. Kisceral – appeals to the intuitive – "doesn't this seem reasonable?"
126
Logic Based Argumentation
• Basic form of argumentation:
Database ⊢ (Sentence, Grounds), where:
– Database is a (possibly inconsistent) set of logical formulae
– Sentence is a logical formula known as the conclusion
– Grounds is a set of logical formulae such that:
  – Grounds ⊆ Database
  – Sentence can be proved from Grounds
(we give reasons for our conclusions)
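As a rough illustration of the Database ⊢ (Sentence, Grounds) scheme, the sketch below builds arguments over a tiny Horn-clause database; the encoding and the names (`derivable`, `arguments_for`) are invented for this example, not part of any standard formalism.

```python
# Build arguments (Grounds, Sentence) where Grounds ⊆ Database and the
# Sentence is provable from Grounds by forward chaining over Horn clauses.
from itertools import chain, combinations

# Facts are (head, ()) and rules are (head, (body...)).
DATABASE = [
    ("milk_good", ()),                                    # milk is good for you
    ("cheese_from_milk", ()),                             # cheese is made from milk
    ("cheese_good", ("milk_good", "cheese_from_milk")),   # rule: both imply cheese_good
]

def derivable(grounds):
    """Forward-chain: return every sentence provable from `grounds`."""
    known = set()
    changed = True
    while changed:
        changed = False
        for head, body in grounds:
            if head not in known and all(b in known for b in body):
                known.add(head)
                changed = True
    return known

def arguments_for(sentence, database):
    """All (grounds, sentence) pairs with grounds ⊆ database proving sentence."""
    subsets = chain.from_iterable(
        combinations(database, r) for r in range(len(database) + 1))
    return [(set(g), sentence) for g in subsets if sentence in derivable(g)]

args = arguments_for("cheese_good", DATABASE)
```

Here only the subset containing both facts and the rule proves "cheese_good", so exactly one argument is produced.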
127
Attacking Arguments
• Milk is good for you
• Cheese is made from milk
• Therefore, cheese is good for you
Two fundamental kinds of attack:
• Undercut (invalidate a premise): milk isn't good for you if fatty
• Rebut (contradict the conclusion): cheese is bad for bones
128
Attacking arguments
• Derived notions of attack used in the literature (where u = undercuts, r = rebuts, a = attacks):
– A attacks B = A u B or A r B
– A defeats B = A u B or (A r B and not B u A)
– A strongly attacks B = A a B and not B u A
– A strongly undercuts B = A u B and not B u A
129
Proposition: Hierarchy of attacks
Undercuts: u
Strongly undercuts: su = u − u⁻¹
Strongly attacks: sa = (u ∪ r) − u⁻¹
Defeats: d = u ∪ (r − u⁻¹)
Attacks: a = u ∪ r
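Reading the operators as set union, set difference, and relational inverse, the hierarchy can be computed directly over attack relations represented as sets of (attacker, target) pairs; this reconstruction of the garbled notation is an assumption.

```python
# Derived attack relations from undercut (u) and rebut (r) pairs.
def inverse(rel):
    """Relational inverse: swap each (x, y) to (y, x)."""
    return {(y, x) for (x, y) in rel}

def derived_attacks(u, r):
    """u = undercut pairs, r = rebut pairs; returns the derived notions."""
    return {
        "attacks":            u | r,                     # a = u ∪ r
        "defeats":            u | (r - inverse(u)),      # d = u ∪ (r − u⁻¹)
        "strongly_attacks":   (u | r) - inverse(u),      # sa = (u ∪ r) − u⁻¹
        "strongly_undercuts": u - inverse(u),            # su = u − u⁻¹
    }

# Toy example: A undercuts B, and B rebuts A back.
u = {("A", "B")}
r = {("B", "A")}
rels = derived_attacks(u, r)
```

In this toy case B's rebuttal of A counts as an attack but not a defeat, because A undercuts B in return.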
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
124
Argumentation
bull The process of attempting to convince others of
something
bull Why argument-based negotiationsgame-theoretic
approaches have limitations
bull Positions cannot be justified ndash Why did the agent pay so
much for the car
bull Positions cannot be changed ndash Initially I wanted a car with a
sun roof But I changed preference during the buying
process
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
125
bull 4 modes of argument (Gilbert 1994)
1 Logical - rdquoIf you accept A and accept A implies
B then you must accept that Brdquo
2 Emotional - rdquoHow would you feel if it happened
to yourdquo
3 Visceral - participant stamps their feet and show
the strength of their feelings
4 Kisceral - Appeals to the intuitive ndash doesnrsquot this
seem reasonable
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
126
Logic Based Argumentation
bull Basic form of argumentation
Database (SentenceGrounds)Where
Database is a (possibly inconsistent) set of logical formulae
Sentence is a logical formula know as the conclusion
Grounds is a set of logical formula
grounds database
sentence can be proved from grounds
(we give reason for our conclusions)
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System
- Slide 1
- Voting
- Slide 4
- Slide 5
- Slide 6
- Slide 7
- Borda Paradox ndash remove loser winner changes (notice c is always ahead of removed item)
- Strategic (insincere) voters
- Typical Competition Mechanisms
- Negotiation
- Mechanisms Protocols Strategies
- Slide 13
- Protocol
- Game Theory
- Mechanisms Design
- Attributes not universally accepted
- Negotiation Protocol
- Thought Question
- Negotiation Process 1
- Negotiation Process 2
- Many types of interactive concession based methods
- Jointly Improving Direction method
- Typical Negotiation Problems
- Complex Negotiations
- Single issue negotiation
- Multiple Issue negotiation
- How many agents are involved
- Negotiation DomainsTask-oriented
- Task-oriented Domain Definition
- Formalization of TOD
- Redistribution of Tasks
- Examples of TOD
- Possible Deals
- Figure deals knowing union must be ab
- Utility Function for Agents
- Parcel Delivery Domain (assuming do not have to return home ndash like Uhaul)
- Dominant Deals
- Negotiation Set Space of Negotiation
- Utility Function for Agents (example from previous slide)
- Individual Rational for Both (eliminate any choices that are negative for either)
- Pareto Optimal Deals
- Negotiation Set
- Negotiation Set illustrated
- Negotiation Set in Task-oriented Domains
- Slide 46
- The Monotonic Concession Protocol ndash One direction move towards middle
- Condition to Consent an Agreement
- The Monotonic Concession Protocol
- Negotiation Strategy
- The Zeuthen Strategy ndash a refinement of monotonic protocol
- The Zeuthen Strategy
- Willingness to Risk Conflict
- Risk Evaluation
- Slide 55
- The Risk Factor
- Slide 57
- About MCP and Zeuthen Strategies
- Parcel Delivery Domain recall agent1 delivered to a agent2 delivered to a and b
- Conflict Deal
- Parcel Delivery Domain Example 2 (donrsquot return to dist point)
- Parcel Delivery Domain Example 2 (Zeuthen works here both concede on equal risk)
- What bothers you about the previous agreement
- Nash Equilibrium
- State Oriented Domain
- Slide 66
- Assumptions of SOD
- Achievement of Final State
- What if choices donrsquot benefit others fairly
- Mixed deal
- Cost
- Parcel Delivery Domain (assuming do not have to return home)
- Consider deal 3 with probability
- Try again with other choice in negotiation set
- Slide 75
- Slide 76
- Examples Cooperative Each is helped by joint plan
- Examples Compromise Both can succeed but worse for both than if other agent werenrsquot there
- Choices
- Compromise continued
- Example conflict
- Examplesemi-cooperative
- Example semi-cooperative cont
- Negotiation Domains Worth-oriented
- Worth-oriented Domain Definition
- Worth Oriented Domain
- Worth-oriented Domains and Multiple Attributes
- How can we calculate Utility
- Incomplete Information
- Subadditive Task Oriented Domain
- Decoy task
- Incentive compatible Mechanism
- Explanation of arrow
- Concave Task Oriented Domain
- Tentative Explanation of Previous Chart
- Modular TOD
- For subadditive domain
- Slide 98
- Examples of task systems
- Attributes-Modularity
- 3-dimensional table of Characterization of Relationship
- Incentive Compatible Fixed Points (FP) (return home)
- Slide 103
- FP4
- Non-incentive compatible fixed points
- Slide 106
- Postmen ndash return to postoffice
- Non incentive compatible fixed points
- Slide 109
- Slide 110
- Conclusion
- Slide 112
- MAS Compromise Negotiation process for conflicting goals
- PERSUADER ndash case study
- Negotiation Methods Case Based Reasoning
- Case Based Reasoning
- Negotiation Methods Preference Analysis
- Persuasive argumentation
- Narrowing differences
- Experiments
- Multiple Attribute Example
- Utility Graphs - convergence
- Utility Graphs - no agreement
- Argumentation
- Slide 125
- Logic Based Argumentation
- Attacking Arguments
- Attacking arguments
- Proposition Hierarchy of attacks
- Abstract Argumentation
- Admissible Arguments ndash mutually defensible
- Slide 132
- An Example Abstract Argument System
-
127
Attacking Arguments
bull Milk is good for you
bull Cheese is made from milk
bull Cheese is good for you
Two fundamental kinds of attack
bull Undercut (invalidate premise) milk isnrsquot good for you if fatty
bull Rebut (contradict conclusion) Cheese is bad for bones
128
Attacking arguments
bull Derived notions of attack used in Literature
ndash A attacks B = A u B or A r B
ndash A defeats B = A u B or (A r B and not B u A)
ndash A strongly attacks B = A a B and not B u A
ndash A strongly undercuts B = A u B and not B u A
129
Proposition Hierarchy of attacks
Undercuts = u
Strongly undercuts = su = u - u -1
Strongly attacks = sa = (u r ) - u -1
Defeats = d = u ( r - u -1)
Attacks = a = u r
130
Abstract Argumentationbull Concerned with the overall structure of the argument
(rather than internals of arguments)bull Write x y indicates
ndash ldquoargument x attacks argument yrdquondash ldquox is a counterexample of yrdquondash ldquox is an attacker of yrdquo
where we are not actually concerned as to what x y arebull An abstract argument system is a collection or
arguments together with a relation ldquordquo saying what attacks what
bull An argument is out if it has an undefeated attacker and in if all its attackers are defeated
bull Assumption ndash true unless proven false
131
Admissible Arguments ndash mutually defensible
1 argument x is attacked if no member attacks y and yx
2 argument x is acceptable if every attacker of x is attacked
3 argument set is conflict free if none attack each other
4 set is admissible if conflict free and each argument is acceptable (any attackers are attacked)
132
a
b
cd
Which sets of arguments can be true c is always attacked
d is always accpetable
133
An Example Abstract Argument System