TRANSCRIPT
Class Discussion: "Analyzing the MAC-level Behavior of Wireless Networks in the Wild"

Discussion Guided by Jerry Sussman
Critique Guidance

• Critique Instructions:
  – Critique the paper, not me!
  – All students should read the paper before class
  – Critiques are due prior to the following week's class
  – The discussion leader MUST use slides to guide the discussion
  – Critiques should be organized / structured per the website
  – This is a 300-level course… there should be a lot of discussion.
Critique Guidance

• (10%) State the problem the paper is trying to solve.
• (20%) State the main contribution of the paper: solving a new problem, proposing a new algorithm, or presenting a new evaluation (analysis). If a new problem, why was the problem important? Is the problem still important today? Will the problem be important tomorrow? If a new algorithm or new evaluation (analysis), what are the improvements over previous algorithms or evaluations? How did they come up with the new algorithm or evaluation?
• (15%) Summarize the (at most) 3 key main ideas (each in 1 sentence).
• (30%) Critique the main contribution:
  – Rate the significance of the paper on a scale of 5 (breakthrough), 4 (significant contribution), 3 (modest contribution), 2 (incremental contribution), 1 (no contribution or negative contribution). Explain your rating in a sentence or two.
  – Rate how convincing the methodology is: how do the authors justify the solution approach or evaluation? Do the authors use arguments, analyses, experiments, simulations, or a combination of them? Do the claims and conclusions follow from the arguments, analyses, or experiments? Are the assumptions realistic (at the time of the research)? Are the assumptions still valid today? Are the experiments well designed? Are there different experiments that would be more convincing? Are there other alternatives the authors should have considered? (And, of course, is the paper free of methodological errors?)
  – What is the most important limitation of the approach?
• (15%) What lessons should researchers and builders take away from this work? What (if any) questions does this work leave open?
• (10%) Propose your improvement on the same problem.
• Note: this template is meant as a starting point, not a constraint. Use your judgment and creativity. The advice linked from the class resources page can be helpful.
Agenda

• Authors
• Summary
• Background
• Wit
• Theory Behind Wit
• Implementation of Wit
• Wit Evaluation
• Inference versus Additional Monitors
• Application in Live Environment
• Conclusion
Authors

• Ratul Mahajan, Microsoft
• Maya Rodrig, University of Washington
• David Wetherall, University of Washington
• John Zahorjan, University of Washington

Funding: NSF
Presented at SIGCOMM '06, September 11-15, 2006, Pisa, Italy
Summary First

• The paper documents Wit:
  – A passive wireless analysis tool
  – Analyzes MAC-level behavior on wireless networks
• The paper assesses Wit's performance:
  – Based on real & simulated data
• The authors tested Wit against a live wireless network
Why Is Wit Needed?

• ???
Why Is Wit Needed?

– Understand how live networks communicate in different situations:
  • Highly loaded environments
  • Low-load environments
  • Interfering wireless LANs, etc.
– Critical to knowing how to improve the performance of wireless LANs.
Background

• Measurement-driven analysis of live networks:
  – Critical to understanding live network performance
  – Critical to improving performance
• Measurement-driven refers to:
  – Part measured / collected data
  – Part 'generated' data
Background

• Wireless measurement-driven analysis
  – At the time of publication, lacking in:
    • Software collection/analysis tools
    • Performance data from wireless networks
  – Reasons:
    • Based on Simple Network Management Protocol (SNMP) logs from APs
    • AP logs:
      » Low fidelity (i.e., coarse logs) of the AP side
      » No data from the client view
    • Packet traces from wired hosts next to the AP:
      » Traces omit wireless retransmissions
Background

• Unrealistic solution:
  – Instrument the entire wireless network
    • Proven successful in controlled environments
    • Unrealistic and not a match for commercial application
• Only realistic solution:
  – Obtain a trace via passive monitoring
    • 1 or more nodes declared "monitors"
    • Monitors placed in the vicinity of the wireless network
    • Record attributes of all transmissions
  – Trivial to deploy
Background

• Problems with "passive monitoring":
  – Data / traces may be incomplete
    • Packets dropped due to weak signal
    • Packets dropped due to collisions
  – Difficult to know which packets are missing from a monitor
  – Monitor stations can't determine whether the destination properly received packets
    • Important for determining reception probability
Background

• This paper is trying to:
  – Find a way to assemble an accurate trace of the wireless environment for analysis
    • Use data from multiple monitoring stations
    • Determine missing packets
    • Re-create missing packets
    • Combine into a single trace file
  – Determine network performance
    • How often do clients retransmit their packets?
    • Determine loss effects between two nodes
    • Effect of increased load on the network
Background

• The authors attempt to solve the problem with Wit:
  – The paper presents Wit, a tool for measurement-driven analysis.
  – Wit has 3 modules which solve the key problems identified earlier
Wit
Why Is Wit Needed?

• Quantify wireless network performance
• Estimate # of competing stations
• Assist in diagnosing wireless network problems
Wit Core Processing Steps

1. Merging procedure
2. Packet reconstruction
3. Determination of network performance
Merging Procedure {1st Core Processing Step}

• Combine incomplete traces from multiple, independent monitors
• Provides a complete trace for follow-on steps
• Based upon collected data
  – Not inferred or reconstructed
Packet Reconstruction {2nd Core Processing Step}

• Reconstructs packets not captured by any monitor
  – Strong inference engine
  – Determines if a packet was received at its destination
  – Again, provides a more complete trace for the follow-on step
Determination of Network Performance {3rd Core Processing Step}

• Wit calculates network performance
  – Input: constructed trace
  – Output:
    • Typical simple network measurements
    • Packet reception probabilities
    • Estimated number of nodes contending for the medium
      – Not previously achieved, according to the authors
Passive Monitoring Pipeline

[Diagram: clients and monitors produce incomplete views/traces → Merge (combine incomplete views into one consistent view) → Infer (determine and replace missing packets to form a complete trace) → Derive Measurements (derive network-level measurements).]
Wit Evaluation

• After developing Wit, the authors faced the evaluation task
  – Used a mix of real and simulated data
  – Used Wit at the SIGCOMM 2004 conference
    • Multi-monitor traces captured
    • Uncovered MAC-layer characteristics of the environment
      – The network was dominated by periods of low contention during which the medium was poorly utilized, even though APs had packets waiting to transmit
        » Suggests the 802.11 MAC is tuned for high traffic levels that are uncommon on real networks.
    • The authors claim this can't be obtained by other methods
Now for the Theory Behind the Wit Phases

{Implementation of the phases will follow…}
3 Core Phases

• Merging of Traces
• Inferring Missing Information
• Deriving Measurements / Performance
Merging of Traces

[Diagram: clients and monitors produce incomplete views/traces → Merge (combine incomplete views into one consistent view).]
Merging of Traces

• Input:
  – A number of packet traces
  – 1 trace per monitor
  – Timestamps reflect each monitor's local packet-receive time
Merging of Traces

• Output:
  – A single, consistent timeline for all packets observed
    • Identify and eliminate duplicates
    • Assign coherent timestamps to all packets, independent of monitor
    • Timestamp accuracy of a few microseconds is required
Merging of Traces

• Timing, the critical element:
  – Only a few packets carry information guaranteed to be unique over a few milliseconds
  – The only way to distinguish duplicates is by time
  – Accurate timestamps are vital to creating the merged trace
  – Reference packets are the key
Merging of Traces

• Three-Step Merging Process
  1. Identify the reference packets common to both monitors
     – Beacons generated by APs serve as references
       » Contain a unique source MAC address
       » Contain the 64-bit value of a local, microsecond-resolution timer
Merging of Traces

• Three-Step Merging Process
  2. Use reference timestamps to translate the time coordinates (see the sketch below)
     – Pair up two reference timestamps across the two traces
     – The time interval of the secondary trace is altered to match the baseline trace
     – A constant is added to align the two traces between the two individual reference points
     – This resizing / alignment process adjusts for clock drift and alignment bias between the two monitors
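A minimal Python sketch of this translation step, assuming hypothetical structures (this is not the authors' code): each trace entry is a packet dict with a local 'ts_us' timestamp, and refs pairs up the secondary and baseline timestamps of shared reference beacons.

def translate_times(secondary, refs):
    """Map a secondary monitor's packets onto the baseline clock.

    secondary: packet dicts, each with a local 'ts_us' timestamp.
    refs: sorted (t_secondary_us, t_baseline_us) pairs for reference
          beacons observed in both traces.  Each interval between
          consecutive references is linearly rescaled so that both
          endpoints line up, absorbing clock offset and drift.
    """
    out = []
    for pkt in secondary:
        t = pkt["ts_us"]
        for (s0, b0), (s1, b1) in zip(refs, refs[1:]):
            if s0 <= t <= s1:
                frac = (t - s0) / (s1 - s0)   # position in interval
                out.append(dict(pkt, ts_us=b0 + frac * (b1 - b0)))
                break
        # Packets outside every reference interval cannot be aligned
        # reliably; this sketch simply drops them.
    return out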
Merging of Traces

• Three-Step Merging Process
  3. Identify and remove duplicates
     • Identified by matching:
       – Packet type
       – Same source
       – Same destination
       – Timestamps that differ by less than ½ of the minimum time to transmit a packet

Note: The code for this would be straightforward (a sketch follows below); however, I suspect much time was spent reviewing the data and proving that the code/scheme worked.
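Echoing the note above, a sketch of the duplicate test under assumed field names (type, src, dst, ts_us); MIN_TX_US stands in for the minimum time needed to transmit any packet and is not a value taken from the paper.

MIN_TX_US = 28.0  # hypothetical placeholder, microseconds

def is_duplicate(p, q):
    """Two aligned-trace entries describe the same transmission if they
    match on type, source, and destination, and their timestamps differ
    by less than half the minimum packet transmit time."""
    return (p["type"] == q["type"]
            and p["src"] == q["src"]
            and p["dst"] == q["dst"]
            and abs(p["ts_us"] - q["ts_us"]) < MIN_TX_US / 2)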
Merging of Traces

• Waterfall Merging Process (sketched below)
  – Merge two traces
  – Then merge the third trace into the baseline trace
• The approach is not the most time-efficient
• The approach provides improved precision:
  – New reference points are continually added
  – It is easier to find a set of shared reference points as more monitor traces are merged
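A sketch of the waterfall order, reusing translate_times and is_duplicate from the earlier sketches; find_references is an assumed helper (passed in as a parameter) that returns shared beacon timestamp pairs for two traces.

def waterfall_merge(traces, find_references):
    """Fold monitor traces into one baseline, one trace at a time."""
    baseline = traces[0]
    for trace in traces[1:]:
        refs = find_references(baseline, trace)  # shared beacon pairs
        aligned = translate_times(trace, refs)   # onto baseline clock
        merged = sorted(baseline + aligned, key=lambda p: p["ts_us"])
        # Keep the first copy of any adjacent duplicate pair.
        baseline = [p for i, p in enumerate(merged)
                    if i == 0 or not is_duplicate(merged[i - 1], p)]
    return baseline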
3 Core Phases

• Merging of Traces
• Inferring Missing Information
• Deriving Measurements / Performance
Inferring Missing Information

[Diagram: clients and monitors produce incomplete views/traces → Merge (combine incomplete views into one consistent view) → Infer (determine and replace missing packets to form a complete trace).]
Inferring Missing Information

• Two fundamental purposes:
  1. Infer missing packets from the collected & merged data
  2. Estimate whether packets were received by their destination
• The authors claim this is new
Inferring Missing Information

• Key technique:
  – A transmitted packet implies useful information about the packets its sender must have received
  – Example:
    • An AP sends an ASSOCIATION RESPONSE only if it recently received an ASSOCIATION REQUEST.
    • If the merged trace contains the response but not the request, then we know the request was successfully sent and received
    – Also, the sender and destination of the missing request are known from the response packet.
Inferring Missing Information

• Processing the merged trace:
  – Scan and process each packet
    • Classify each packet type
    • Generate markers
      – Ex: end of an ongoing conversation
• Formal-language approach (FSM); a toy sketch of one rule follows:
  – Infer packet reception
  – Infer missing packets
  – Construct packets as required
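A toy sketch of the association example above; the actual tool drives inference with a formal-language / finite-state-machine description of 802.11 exchanges, while this linear scan illustrates only the flavor of one rule. All field names are assumptions.

# Observed packet type -> earlier packet type it implies was received.
IMPLIES = {"ASSOC_RESPONSE": "ASSOC_REQUEST"}

def infer_missing(merged, window=8):
    """Scan the merged trace; when a response appears without its
    implied request among the last `window` packets, synthesize the
    missing request and mark it as received."""
    out = []
    for pkt in merged:
        implied = IMPLIES.get(pkt["type"])
        seen = any(p["type"] == implied and p["src"] == pkt["dst"]
                   and p["dst"] == pkt["src"] for p in out[-window:])
        if implied and not seen:
            # Endpoints of the missing request are read off the observed
            # response; it is placed at the response's time as an
            # approximation.
            out.append({"type": implied, "src": pkt["dst"],
                        "dst": pkt["src"], "ts_us": pkt["ts_us"],
                        "inferred": True, "received": True})
        out.append(pkt)
    return out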
3 Core Phases

• Merging of Traces
• Inferring Missing Information
• Deriving Measurements / Performance
Deriving Measurements / Performance

[Diagram: clients and monitors produce incomplete views/traces → Merge (combine incomplete views into one consistent view) → Infer (determine and replace missing packets to form a complete trace) → Derive Measurements (derive network-level measurements).]
Deriving Measurements / Performance

• The merged trace can be mined in many ways to study detailed behavior:
  – Packet reception probability (sketched below)
  – Estimate the number of stations competing for the medium at each snapshot in time
    – Requires access to per-station 'state'
    – Uses randomly selected backoff values
    – Uses DATA & DATA-retry packets
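As one illustration of the kind of mining this slide describes, a hedged sketch of per-pair reception probability computed from the per-packet 'received' annotation that the inference phase attaches; the field names are assumptions, not the tool's actual schema.

from collections import defaultdict

def reception_probability(trace):
    """Per (source, destination) fraction of data transmissions that
    the inference phase marked as received."""
    sent = defaultdict(int)
    recv = defaultdict(int)
    for pkt in trace:
        if pkt["type"] in ("DATA", "DATA_RETRY"):
            key = (pkt["src"], pkt["dst"])
            sent[key] += 1
            if pkt.get("received"):
                recv[key] += 1
    return {key: recv[key] / sent[key] for key in sent}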
Now for the Implementation of Wit
Wit Implementation

• Wit is implemented in 3 components:
  – halfWit
  – nitWit
  – dimWit
• half, nit, & dim correspond to the three pipeline phases discussed earlier
Wit Implementation

• halfWit
  – Merge phase
    • First, insert all traces into a database
    • The database is used to merge data as defined earlier
    • The database is also used to pass the final merged trace to nitWit
    • Uses a merge-sort methodology (see the sketch below)
      – Traces handled like queues
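The "merge-sort with traces as queues" idea maps naturally onto a k-way merge; this is a sketch of that idea, not the database-backed implementation the slide describes.

import heapq

def kway_merge(aligned_traces):
    """Merge per-monitor traces (each already sorted by timestamp)
    into one stream ordered by time; duplicate removal happens in a
    later pass, as in the merge procedure described earlier."""
    return list(heapq.merge(*aligned_traces,
                            key=lambda pkt: pkt["ts_us"]))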
Wit Implementation

• nitWit
  – Inference phase
    • nitWit takes the output of halfWit
    • Determines and recreates missing packets
    • Annotates captured and inferred packets
      – The critical annotation for each packet is whether it was received.
      – Retry packet fields are tracked

Note: The original implementation did not 'merge' captured and inferred packets because of exact-timing uncertainty. This differs from the theory write-up section.
Wit Implementation

• dimWit
  – Derived-measures component
    • dimWit takes the output of nitWit
    • Produces summary network information
    • Produces the number of contenders in the network
    • Implemented to analyze tens of millions of packets in a few minutes.
Wit Evaluation
Wit Evaluation

• Purpose of the evaluation:
  – Understand how well each phase works
  – Key questions to be evaluated:
    • Quality of time synchronization?
    • Quality of the merged product?
    • Accuracy of inferences?
    • Fraction of missing packets inferred?
    • Accuracy of the number of contenders?
    • Greater improvement from more monitors or from more inference?
Wit Evaluation

• The reality of this type of evaluation:
  – Comparing against ground truth is unrealistic
    • Too much detail
    • Unrealistic to create an absolutely controlled environment
  – Reduced to simulation as the primary validation method
Wit Evaluation

• Simulated environment:
  – 2 access points (APs)
  – 40 clients randomly distributed on a grid
  – Packet simulator (a toy reception model is sketched below)
    • Reception probability based on:
      – Signal strength
      – Transmission rate
      – Existing packets in the environment
      – Random bit errors
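Purely to make the listed factors concrete, a toy reception model; the functional form and every constant here are invented for illustration and are not the simulator the authors built.

import random

def packet_received(snr_db, rate_mbps, overlapping, length_bits,
                    ber=1e-5):
    """Toy model: decode probability grows with SNR and shrinks with
    transmission rate; any overlapping transmission counts as a
    collision; random bit errors accumulate over the packet length."""
    if overlapping:                       # collision -> loss, in this toy
        return False
    p_decode = min(1.0, max(0.0, (snr_db - rate_mbps / 4) / 20.0))
    p_clean = (1.0 - ber) ** length_bits  # no random bit errors
    return random.random() < p_decode * p_clean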
Wit Evaluation

• Simulated environment (continued):
  – 10 randomly distributed monitors
  – Detailed logs of simulated packet generation and simulated packet collection.
Wit Evaluation

• Merging
  – Check correctness & characterize the quality of time synchronization
    • The basis for waterfall merging
• Inference
  – Check the ability to infer packet reception statuses and missing packets
• Estimating contenders
  – Run dimWit on merged traces and compare against the simulator logs
Wit Evaluation

• Results
  – Results here are limited to the highest-priority end result: the contenders.
    • In the worst-case simulation, with 90% of packets captured, dimWit is within ±1 in 87% of cases
    • In a smaller simulation, with 98% of packets captured, estimates are within ±1 in 95% of cases
  • Closer study reveals:
    – High error values tend to correspond to cases with a high number of contenders
Inference Versus Additional Monitors
Inference Versus Additional Monitors

• Both more inference and more monitors increase the quality of results
• Can't increase both in real life
• Which has more bang for the buck?
  – Tests show diminishing returns as the number of monitors increases
  – An expected result
Applying To Live Environment
Applying To Live Environment

• SIGCOMM 2004 conference wireless environment
  – 4 days
  – 550 attendees
  – Large / busy setting
  – 5 access points
  – Channels 1 and 11
  – Internet via DSL access lines
  – Interfering wireless networks:
    • A number of transient wireless networks
    • Hotel wireless network
    • Private wireless network on Ch 6
  – Monitoring 24/7 during the conference
Applying To Live Environment

• Results:
  – A successful merged trace was produced for each channel
  – One monitor didn't have enough references in common with the merged trace, so it was excluded
    • Lesson learned: placement of monitors matters
  – Significant overlap in what each monitor 'hears'
  – Additional monitors increase the number of unique packets in each trace
    • True even when two monitors are right next to each other
    • Therefore, even a dense array of monitors will miss packets
Applying To Live Environment

• Results:
  – nitWit inferred that 80% of unicast packets were received by their destination
  – nitWit inferred that 90% of total packets were captured by the monitors
  – dimWit determined that the uplink to the AP was more reliable than the downlink
  – The medium was inefficiently utilized
  – Reception probability did not decrease with contention
  – Performance was stable at high contention levels
Concluding Remarks

• The Wit implementation provides live wireless data not previously available
• Measurement-driven analysis, implemented by Wit, was successfully evaluated
• Further study is warranted
  – Will lead to increased efficiency of wireless LANs