© 2015 Goodrich and Tamassia CS 315 1
Minimum Spanning Trees
Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, 2015
Aside: Another Graph Search Algorithm
- The A* Search Algorithm
  - Sometimes referred to as a "best-first" (a.k.a. "informed search") algorithm
  - Like Dijkstra's algorithm, it maintains a tree of paths from a source, but in this case there is a designated goal vertex
    - In fact it is considered an extension of Dijkstra's Algorithm
  - The tree is grown based on a heuristic estimate of the best path from the current tree to the goal
  - Developed in 1968 by Stanford researchers working on robotic path planning
- You are responsible for learning it on your own: https://en.wikipedia.org/wiki/A*_search_algorithm
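Since the slide leaves A* for self-study, here is a minimal Python sketch of the idea described above: a best-first search that grows a tree of paths from the source, ordered by path cost plus a heuristic estimate to the goal. The `graph` and `h` representations are illustrative choices, not from the slides.

```python
import heapq

def a_star(graph, h, source, goal):
    """A* search sketch: `graph` maps u -> {v: weight}, `h` maps a vertex
    to a heuristic estimate of its distance to `goal` (assumed admissible,
    i.e., never overestimating). Returns the source-to-goal path as a list
    of vertices, or None if the goal is unreachable."""
    # Priority queue of (f, g, vertex, parent) with f(v) = g(v) + h(v)
    pq = [(h(source), 0, source, None)]
    parent = {}
    dist = {}
    while pq:
        f, g, u, par = heapq.heappop(pq)
        if u in dist:
            continue  # stale entry: u was already finalized via a cheaper path
        dist[u] = g
        parent[u] = par
        if u == goal:
            # Reconstruct the path by walking parent pointers back to source
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        for v, w in graph[u].items():
            if v not in dist:
                heapq.heappush(pq, (g + w + h(v), g + w, v, u))
    return None
```

With h ≡ 0 this degenerates to Dijkstra's algorithm; an admissible heuristic steers the search toward the goal while preserving optimality.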
Application: Connecting a Network
- Suppose the remote mountain country of Vectoria has been given a major grant to install a large Wi-Fi tower at the center of each of its mountain villages.
- Communication cables can run from the main Internet access point to a village tower, and cables can also run between pairs of towers.
- The challenge is to interconnect all the towers and the Internet access point as cheaply as possible.
Application: Connecting a Network
- We can model this problem using a graph, G, where each vertex in G is the location of a Wi-Fi Internet access point or village tower, and an edge in G is a possible cable we could run between two such vertices.
- Each edge in G could then be given a weight equal to the cost of running the cable that that edge represents.
- Thus, we are interested in finding a connected acyclic subgraph of G that includes all the vertices of G and has minimum total cost.
- Using the language of graph theory, we are interested in finding a minimum spanning tree (MST) of G.
Minimum Spanning Trees
- Spanning subgraph
  - Subgraph of a graph G containing all the vertices of G
- Spanning tree
  - Spanning subgraph that is itself a (free) tree
- Minimum spanning tree (MST)
  - Spanning tree of a weighted graph with minimum total edge weight
- Applications
  - Communications networks
  - Transportation networks
[Figure: example weighted graph on the airports ORD, PIT, ATL, STL, DEN, DFW, and DCA.]
Cycle Property
- Cycle Property:
  - Let T be a minimum spanning tree of a weighted graph G
  - Let e be an edge of G that is not in T, and let C be the cycle formed by e with T
  - For every edge f of C, weight(f) ≤ weight(e)
- Proof: by contradiction
  - If weight(f) > weight(e), we can get a spanning tree of smaller weight by replacing f with e
[Figure: a weighted graph with tree T, a non-tree edge e, and the cycle C that e forms with T, with f an edge of C. Replacing f with e yields a better spanning tree.]
Partition Property
- Partition Property:
  - Consider a partition of the vertices of G into subsets U and V
  - Let e be an edge of minimum weight across the partition
  - There is a minimum spanning tree of G containing edge e
- Proof:
  - Let T be an MST of G
  - If T does not contain e, consider the cycle C formed by e with T, and let f be an edge of C across the partition
  - By the cycle property, weight(f) ≤ weight(e)
  - Thus, weight(f) = weight(e)
  - We obtain another MST by replacing f with e
[Figure: a graph whose vertices are partitioned into sets U and V, with e a minimum-weight edge across the partition and f another crossing edge of the cycle. Replacing f with e yields another MST.]
Prim-Jarnik's Algorithm
- Similar to Dijkstra's algorithm (though of course it has a different goal)
- We pick an arbitrary vertex s and we grow the MST as a cloud of vertices, starting from s
- We store with each vertex v a label d(v) representing the smallest weight of an edge connecting v to a vertex in the cloud
- At each step:
  - We add to the cloud the vertex u outside the cloud with the smallest distance label
  - We update the labels of the vertices adjacent to u
Prim-Jarnik Pseudo-code
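The pseudo-code figure does not survive in this transcript; the following Python sketch follows the cloud-growing description on the previous slide, using a binary heap with lazy deletion in place of an adaptable priority queue (the adjacency-map representation is an illustrative assumption, not the textbook's):

```python
import heapq

def prim_jarnik_mst(graph, s):
    """Prim-Jarnik sketch: `graph` maps each vertex u to {v: weight} and is
    assumed connected and undirected. Grows the MST as a cloud of vertices
    starting from s, always adding the outside vertex with the smallest
    label d(v). Returns the MST edges as (u, v, weight) tuples."""
    in_cloud = set()
    mst_edges = []
    # Entries are (d(v), v, u), where (u, v) is the cheapest known edge
    # connecting v to the cloud; the source has d(s) = 0 and no parent.
    pq = [(0, s, None)]
    while pq:
        d, v, u = heapq.heappop(pq)
        if v in in_cloud:
            continue  # stale entry: v already joined the cloud via a cheaper edge
        in_cloud.add(v)
        if u is not None:
            mst_edges.append((u, v, d))
        # Update the labels of vertices adjacent to v
        for w, weight in graph[v].items():
            if w not in in_cloud:
                heapq.heappush(pq, (weight, w, v))
    return mst_edges
```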
Example

[Figure: four snapshots of Prim-Jarnik's algorithm on a graph with vertices A-F; the source's label starts at 0, the other labels start at ∞ and decrease as the cloud grows.]
Example (contd.)

[Figure: two final snapshots of Prim-Jarnik's algorithm on the same graph, ending with the complete MST.]
Analysis
- Graph operations
  - We cycle through the incident edges once for each vertex
- Label operations
  - We set/get the distance, parent and locator labels of vertex z O(deg(z)) times
  - Setting/getting a label takes O(1) time
- Priority queue operations
  - Each vertex is inserted once into and removed once from the priority queue, where each insertion or removal takes O(log n) time
  - The key of a vertex w in the priority queue is modified at most deg(w) times, where each key change takes O(log n) time
- Prim-Jarnik's algorithm runs in O((n + m) log n) time provided the graph is represented by the adjacency list structure
  - Recall that Σv deg(v) = 2m
- The running time is O(m log n) since the graph is connected
Kruskal's Approach
- Maintain a partition of the vertices into clusters
  - Initially, single-vertex clusters
  - Keep an MST for each cluster
  - Merge "closest" clusters and their MSTs
- A priority queue stores the edges outside clusters
  - Key: weight
  - Element: edge
- At the end of the algorithm
  - One cluster and one MST
Kruskal’s Algorithm
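The pseudo-code figure does not survive in the transcript; here is a minimal Python sketch of the approach just outlined, with a priority queue of edges keyed on weight and clusters merged smaller-into-larger (the list-based cluster representation is an illustrative choice, not the textbook's):

```python
import heapq

def kruskal_mst(n, edges):
    """Kruskal sketch: vertices are 0..n-1, `edges` is a list of
    (weight, u, v) tuples. Edges are extracted by increasing weight from a
    priority queue; an edge is accepted if it connects distinct clusters.
    Returns the MST edges as (u, v, weight) tuples."""
    pq = list(edges)
    heapq.heapify(pq)                 # priority queue keyed on weight
    cluster = list(range(n))          # cluster[v] = id of v's cluster
    members = {v: [v] for v in range(n)}
    mst_edges = []
    while pq and len(mst_edges) < n - 1:
        weight, u, v = heapq.heappop(pq)
        a, b = cluster[u], cluster[v]
        if a != b:                    # distinct clusters: accept the edge
            mst_edges.append((u, v, weight))
            if len(members[a]) < len(members[b]):
                a, b = b, a           # always merge the smaller cluster into the larger
            for w in members[b]:
                cluster[w] = a
            members[a].extend(members.pop(b))
    return mst_edges
```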
Example of Kruskal's Algorithm

[Figure: four snapshots of Kruskal's algorithm on a graph with vertices A-H and edge weights 1-11, accepting edges in increasing order of weight.]
Example (contd.)

[Figure: remaining snapshots of Kruskal's algorithm (four steps, then two steps), ending with the complete MST.]
Data Structure for Kruskal's Algorithm
- The algorithm maintains a forest of trees
- A priority queue extracts the edges by increasing weight
- An edge is accepted if it connects distinct trees
- We need a data structure that maintains a partition, i.e., a collection of disjoint sets, with operations:
  - makeSet(u): create a set consisting of u
  - find(u): return the set storing u
  - union(A, B): replace sets A and B with their union
List-based Partition
- Each set is stored in a sequence
- Each element has a reference back to the set
  - operation find(u) takes O(1) time, and returns the set of which u is a member
  - in operation union(A, B), we move the elements of the smaller set to the sequence of the larger set and update their references
  - the time for operation union(A, B) is min(|A|, |B|)
- Whenever an element is processed, it goes into a set of size at least double, hence each element is processed at most log n times
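The list-based partition just described might be sketched as follows; the method names mirror the slide's makeSet/find/union operations, and the dictionary of back-references is an illustrative implementation choice:

```python
class ListPartition:
    """List-based partition sketch: each set is a Python list, and each
    element keeps a reference back to the list that stores it, so find is
    O(1) and union costs min(|A|, |B|) element moves."""

    def __init__(self):
        self._set_of = {}          # element -> the list (set) that stores it

    def make_set(self, u):
        """Create a set consisting of u."""
        self._set_of[u] = [u]

    def find(self, u):
        """Return the set storing u, in O(1) time."""
        return self._set_of[u]

    def union(self, a, b):
        """Replace sets a and b with their union: move the elements of the
        smaller set into the larger and update their back-references."""
        if len(a) < len(b):
            a, b = b, a
        for u in b:
            self._set_of[u] = a
        a.extend(b)
        return a
```

Because union always moves the smaller set, an element's set at least doubles in size each time the element is moved, which gives the log n bound quoted above.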
Partition-Based Implementation
- Partition-based version of Kruskal's Algorithm
  - Cluster merges as unions
  - Cluster locations as finds
- Running time O((n + m) log n)
  - Priority Queue operations: O(m log n)
  - Union-Find operations: O(n log n)
Baruvka's Algorithm
- Like Kruskal's Algorithm, Baruvka's algorithm grows many clusters at once and maintains a forest T
- Each iteration of the while loop halves the number of connected components in forest T
- The running time is O(m log n)

Algorithm BaruvkaMST(G):
    T ← V  {just the vertices of G}
    while T has fewer than n − 1 edges do
        for each connected component C in T do
            Let edge e be the smallest-weight edge from C to another component in T
            if e is not already in T then
                Add edge e to T
    return T
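The pseudo-code above might be sketched in Python as follows (assuming a connected graph with distinct edge weights, so each component's smallest outgoing edge is unambiguous; the edge-list representation is an illustrative choice):

```python
def baruvka_mst(n, edges):
    """Baruvka sketch: vertices are 0..n-1, `edges` is a list of
    (weight, u, v) tuples with distinct weights on a connected graph.
    Each round, every connected component of T adds the smallest-weight
    edge leaving it, at least halving the number of components."""
    comp = list(range(n))                  # comp[v] = id of v's component
    mst_edges = []
    while len(mst_edges) < n - 1:
        best = {}                          # component -> its cheapest outgoing edge
        for weight, u, v in edges:
            if comp[u] != comp[v]:
                for c in (comp[u], comp[v]):
                    if c not in best or weight < best[c][0]:
                        best[c] = (weight, u, v)
        for weight, u, v in set(best.values()):
            if comp[u] != comp[v]:         # components may have merged this round
                mst_edges.append((u, v, weight))
                old, new = comp[v], comp[u]
                comp = [new if c == old else c for c in comp]
    return mst_edges
```

The O(n) relabeling per merge keeps the sketch simple; a production version would use a union-find structure as in Kruskal's algorithm.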
Example of Baruvka's Algorithm (animated)

Slide by Matt Stallmann included with permission.

[Figure: animated snapshots of Baruvka's algorithm on a weighted graph; in each round every component selects its smallest-weight outgoing edge.]