Adjacency List

Node | Pointers
 1   | 2 | 3 | 5 | null
 2   | 3 | 4 | null
 3   | 2 | 4 | null
 4   | null
 5   | 2 | 3 | 4 | null
The nodes are visited in the order: 1 2 3 4 5
Why?
Depth First Search continually searches deeper into the graph until it comes upon nodes it has already visited. It then backtracks to continue searching for nodes it hasn't yet visited.
// template for depth-first search
void dfs( Vertex v )
{
    v.visited = true;
    for each w in adjacency list of v
        if( !w.visited )
            dfs( w );
}

Let's trace the dfs algorithm on this graph:
stack frame | current vertex | current adjacency list | visited
dfs(1)      | 1              | 2,3,5                  | 1
            | 2              | 3,5                    | 1
dfs(2)      | 2              | 3,4                    | 1,2
            | 3              | 4                      | 1,2
dfs(3)      | 3              | 2,4                    | 1,2,3
            | 2              | 4                      | 1,2,3
            | 4              | null                   | 1,2,3
dfs(4)      | 4              | null                   | 1,2,3,4
dfs(3)      | 4              | null                   | 1,2,3,4
dfs(2)      | 3              | 4                      | 1,2,3,4
            | 4              | null                   | 1,2,3,4
dfs(1)      | 2              | 3,5                    | 1,2,3,4
            | 3              | 5                      | 1,2,3,4
            | 5              | null                   | 1,2,3,4
dfs(5)      | 5              | 2,3,4                  | 1,2,3,4,5
            | 2              | 3,4                    | 1,2,3,4,5
            | 3              | 4                      | 1,2,3,4,5
            | 4              | null                   | 1,2,3,4,5
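Here is a minimal runnable Java sketch of the dfs template, assuming the graph is stored as the 1-indexed adjacency lists from the table above (the class and variable names are just for illustration). Running it prints the vertices in the order 1 2 3 4 5, matching the trace.

// Self-contained version of the dfs template above.
// The adjacency lists encode the example graph; the visit order is printed.
import java.util.*;

public class DfsDemo {
    static List<List<Integer>> adj;   // adj.get(v) = adjacency list of v
    static boolean[] visited;

    static void dfs(int v) {
        visited[v] = true;
        System.out.print(v + " ");
        for (int w : adj.get(v))
            if (!visited[w])
                dfs(w);
    }

    public static void main(String[] args) {
        int n = 5;
        adj = new ArrayList<>();
        for (int i = 0; i <= n; i++) adj.add(new ArrayList<>());
        adj.get(1).addAll(Arrays.asList(2, 3, 5));
        adj.get(2).addAll(Arrays.asList(3, 4));
        adj.get(3).addAll(Arrays.asList(2, 4));
        // vertex 4 has no outgoing edges
        adj.get(5).addAll(Arrays.asList(2, 3, 4));

        visited = new boolean[n + 1];
        dfs(1);                       // prints: 1 2 3 4 5
    }
}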
The nodes are visited in the order: 1 2 3 5 4
Why?
Breadth First Search is the same as the level-order search we performed on trees. You place the initial node on a queue and mark it VISITED, then continue to dequeue nodes until none are left. Each time a node is dequeued, you place its successor nodes on the queue, marking each one VISITED as it is enqueued; nodes already marked VISITED are skipped.
// pseudo code for a Breadth First Search
void bfs( Vertex v )
{
    Queue q = new Queue();
    v.visited = true;
    q.enqueue( v );
    while( !q.isEmpty() )
    {
        v = q.dequeue();
        for each w in adjacency list of v
        {
            if( !w.visited )
            {
                w.visited = true;
                q.enqueue( w );
            }
        }
    }
}
Let's trace the bfs algorithm on this graph:
current vertex | queue | current adjacency list | visited
1              | 1     | -                      | 1
1              | -     | 2,3,5                  | 1
2              | 2     | 3,5                    | 1,2
3              | 2,3   | 5                      | 1,2,3
5              | 2,3,5 | -                      | 1,2,3,5
2              | 3,5   | 3,4                    | 1,2,3,5
3              | 3,5   | 4                      | 1,2,3,5
4              | 3,5,4 | -                      | 1,2,3,5,4
3              | 5,4   | 2,4                    | 1,2,3,5,4
2              | 5,4   | 4                      | 1,2,3,5,4
4              | 5,4   | -                      | 1,2,3,5,4
5              | 4     | 2,3,4                  | 1,2,3,5,4
2              | 4     | 3,4                    | 1,2,3,5,4
3              | 4     | 4                      | 1,2,3,5,4
4              | 4     | -                      | 1,2,3,5,4
4              | -     | -                      | 1,2,3,5,4
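Here is a minimal runnable Java sketch of the bfs pseudo code, again assuming the 1-indexed adjacency lists from the table (the names are illustrative only). It prints the vertices in the order 1 2 3 5 4, matching the trace.

// Self-contained version of the bfs pseudo code above.
// Nodes are marked visited as they are enqueued, exactly as in the trace.
import java.util.*;

public class BfsDemo {
    public static void main(String[] args) {
        int n = 5;
        List<List<Integer>> adj = new ArrayList<>();
        for (int i = 0; i <= n; i++) adj.add(new ArrayList<>());
        adj.get(1).addAll(Arrays.asList(2, 3, 5));
        adj.get(2).addAll(Arrays.asList(3, 4));
        adj.get(3).addAll(Arrays.asList(2, 4));
        // vertex 4 has no outgoing edges
        adj.get(5).addAll(Arrays.asList(2, 3, 4));

        boolean[] visited = new boolean[n + 1];
        Queue<Integer> q = new LinkedList<>();
        visited[1] = true;
        q.add(1);
        while (!q.isEmpty()) {
            int v = q.remove();
            System.out.print(v + " ");     // prints: 1 2 3 5 4
            for (int w : adj.get(v)) {
                if (!visited[w]) {
                    visited[w] = true;
                    q.add(w);
                }
            }
        }
    }
}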
1. Find the indegree of all vertices.
2. Place each vertex of indegree 0 on a queue.
3. While the queue is not empty:
      Get the first vertex in the queue.
      Output this vertex for the topological ordering.
      Decrease by 1 the indegree of every vertex it has an edge to.
      Place any vertices whose indegree has become 0 on the queue.
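Here is a runnable Java sketch of this indegree/queue algorithm. The example graph used earlier contains a cycle (2 -> 3 -> 2), so this sketch uses a small hypothetical DAG instead; the class name and edges are just for illustration.

// Topological sort by repeatedly removing indegree-0 vertices.
import java.util.*;

public class TopoSort {
    public static void main(String[] args) {
        int n = 4;
        List<List<Integer>> adj = new ArrayList<>();
        for (int i = 0; i <= n; i++) adj.add(new ArrayList<>());
        adj.get(1).addAll(Arrays.asList(2, 3));   // 1 -> 2, 1 -> 3
        adj.get(2).add(4);                        // 2 -> 4
        adj.get(3).add(4);                        // 3 -> 4

        // 1. find the indegree of all vertices
        int[] indegree = new int[n + 1];
        for (int v = 1; v <= n; v++)
            for (int w : adj.get(v))
                indegree[w]++;

        // 2. place vertices of indegree 0 on the queue
        Queue<Integer> q = new LinkedList<>();
        for (int v = 1; v <= n; v++)
            if (indegree[v] == 0)
                q.add(v);

        // 3. remove a vertex, output it, and lower the indegree of
        //    every vertex it points to
        while (!q.isEmpty()) {
            int v = q.remove();
            System.out.print(v + " ");            // prints: 1 2 3 4
            for (int w : adj.get(v))
                if (--indegree[w] == 0)
                    q.add(w);
        }
    }
}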
Informal Proof: With N vertices, there are N * (N-1) possible edges, one from each of the N vertices to each of the N-1 other vertices. But since exactly half of these are duplicates (each edge is counted once from each end), there are a total of (N*(N-1)) / 2 edges.
Proof By Induction:
Base case: N = 1, no edges and the formula holds.
Induction Step: Assume true for N vertices, show true for N+1 vertices.
By Induction: Given that a graph with N vertices has (N*(N-1)) / 2 edges, show that a graph with N+1 vertices has ((N+1)*N) / 2 edges. If we add a vertex to a graph with N vertices, we have to add a total of N new edges (one to each existing vertex). So this graph will have the number of edges in a graph with N vertices (which we know by the inductive hypothesis) plus N more edges:
N*(N-1)/2 + N = (N*(N-1) + 2N) / 2
              = (N^2 - N + 2N) / 2
              = (N^2 + N) / 2
              = ((N+1)*N) / 2
The method you should describe is a TreeSort, and the complexity is O(n log n).
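One way to realize a TreeSort in Java is to insert every element into a balanced binary search tree and then read the elements back with an in-order walk; n insertions at O(log n) each plus a linear walk gives O(n log n). The sketch below uses java.util.TreeMap (a red-black tree) with counts to handle duplicates; it is only one possible realization, not necessarily the one intended.

// TreeSort sketch: insert every element into a balanced search tree,
// then an in-order walk returns them in sorted order.
import java.util.*;

public class TreeSort {
    public static void treeSort(int[] a) {
        TreeMap<Integer, Integer> tree = new TreeMap<>();       // balanced BST
        for (int x : a)                                         // n * O(log n)
            tree.merge(x, 1, Integer::sum);                     // count duplicates
        int i = 0;
        for (Map.Entry<Integer, Integer> e : tree.entrySet())   // in-order walk
            for (int c = 0; c < e.getValue(); c++)
                a[i++] = e.getKey();
    }

    public static void main(String[] args) {
        int[] a = { 35, 24, 31, 15, 22, 12, 3, 13 };
        treeSort(a);
        System.out.println(Arrays.toString(a));   // [3, 12, 13, 15, 22, 24, 31, 35]
    }
}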
A[0:n-1].
If the graph is stored as an adjacency matrix, then each entry in the matrix can hold a list of edges rather than a single value. If the graph is stored as an adjacency list, then each parallel edge simply appears as a separate entry in that vertex's list.
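As a tiny illustration (the names here are hypothetical), an adjacency-list multigraph in Java just stores a parallel edge as another list entry:

// Multigraph sketch: a parallel edge is simply another entry in the list.
import java.util.*;

public class Multigraph {
    public static void main(String[] args) {
        int n = 3;
        List<List<Integer>> adj = new ArrayList<>();
        for (int i = 0; i <= n; i++) adj.add(new ArrayList<>());
        adj.get(1).add(2);
        adj.get(1).add(2);                // second (parallel) edge 1 -> 2
        adj.get(2).add(3);
        System.out.println(adj.get(1));   // prints [2, 2]
    }
}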
Index | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
Value | 35 | 24 | 31 | 15 | 22 | 12 | 3 | 13 |
YES!
Recall that in a max heap, every parent is larger than (or equal to) its children.
Also recall that for any element in array position i, the left child is in position 2i, the right child is in the cell after the left child (2i + 1), and the parent is in position floor(i/2).
Here is the heap:
              35
            /    \
          24      31
         /  \    /  \
       15   22  12   3
      /
    13
The method below returns true or false depending upon whether the array is heap ordered or not (assume min heap ordering).
// returns true if A is heap ordered (min heap)
// A is an int array indexed from 1, n is the number of items, i is the
// index of the subtree being checked
public boolean isheap( int A[], int n, int i )
{
    if (2*i > n)                // reached a leaf node
        return true;
    else if (2*i == n)          // node has only a left child
        return A[i] <= A[2*i];
    // if heap condition satisfied by node and its children, recurse on the
    // two children
    else if ((A[i] <= A[2*i]) && (A[i] <= A[2*i+1]))
        return (isheap(A, n, 2*i) && isheap(A, n, 2*i+1));
    else
        return false;           // heap condition not satisfied
}
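As a quick sanity check, here is a hypothetical driver for isheap, using the 1-indexed convention above (A[0] is unused); the array shown is min-heap ordered, so the call prints true.

// A is a min-heap-ordered array with 8 items, stored in A[1..8].
int[] A = { 0, 3, 12, 13, 15, 22, 24, 31, 35 };
System.out.println( isheap( A, 8, 1 ) );    // prints true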
The algorithm is MergeSort
To analyze the cost of mergesort, we note that the merge step is a linear operation (O(N)) in the number of elements to sort. For N = 1 the time for mergesort is constant, and we can represent this as T(1). Otherwise, the time to mergesort N items is the time to do 2 recursive mergesorts of size N/2, plus the merge step (which is linear). We can express these facts as:
T(1) = 1
T(N) = 2 * T( N/2 ) + N

The above is a recurrence relation, expressing the recursive timing nature of this problem. To solve such a recurrence relation we do the following (we will assume N is a power of 2 so that our splits of the array are always into evenly balanced halves). Divide both sides by N:

T(N)/N = T(N/2)/(N/2) + 1

Since the expression above is valid for any power of 2, we can express the time to sort an array of N/2 elements as:

T(N/2)/(N/2) = T(N/4)/(N/4) + 1

and continuing with successive powers of 2...

T(N/4)/(N/4) = T(N/8)/(N/8) + 1
        ...  = ...
      T(2)/2 = T(1)/1 + 1

Now, add up all these equations. Notice that almost all the terms on the left hand sides of the equations also appear on the right hand sides of the equations. Cancelling these terms, we are left with the following:

T(N)/N = T(1)/1 + log N

The log N term is simply the sum of all the 1's on the right hand sides of the equations. Because we are splitting by powers of 2, there are log N equations total. Hence the sum of the 1's is log N.
Now, multiply through by N:
T(N) = N * T(1) + N * log N
     = N + N * log N
     = O(N log N)
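For reference, here is a standard top-down mergesort sketch in Java (not necessarily the exact version from class); the two recursive calls on halves plus the linear merge are exactly what the recurrence T(N) = 2 * T(N/2) + N describes.

// Top-down mergesort sketch: sort each half recursively, then merge.
// The merge is O(N), giving the recurrence T(N) = 2 * T(N/2) + N.
import java.util.*;

public class MergeSortDemo {
    public static void mergeSort(int[] a, int lo, int hi) {    // sorts a[lo..hi]
        if (lo >= hi) return;                                  // 0 or 1 element
        int mid = (lo + hi) / 2;
        mergeSort(a, lo, mid);
        mergeSort(a, mid + 1, hi);
        merge(a, lo, mid, hi);
    }

    private static void merge(int[] a, int lo, int mid, int hi) {
        int[] tmp = new int[hi - lo + 1];
        int i = lo, j = mid + 1, k = 0;
        while (i <= mid && j <= hi)                 // take the smaller front item
            tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i <= mid) tmp[k++] = a[i++];         // copy any leftovers
        while (j <= hi)  tmp[k++] = a[j++];
        System.arraycopy(tmp, 0, a, lo, tmp.length);
    }

    public static void main(String[] args) {
        int[] a = { 35, 24, 31, 15, 22, 12, 3, 13 };
        mergeSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));     // [3, 12, 13, 15, 22, 24, 31, 35]
    }
}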
We can see that the selection sort for an array of size N takes N + (N-1) + (N-2) + ... + 1 iterations. Can we sum these up? This is just the sum of the first N positive integers, and the formula for this sum is:
SumOf_{i=1 to N} i = (N * (N + 1))/2

We can prove this formula by induction. First, we need to establish a base case of N = 1.
Base Case: If N = 1, then
SumOf_{i=1 to 1} i = (1 * (1 + 1))/2 = 1

So the formula works for the case N = 1, since 1 is the sum of the first integer in our sequence. To show it works for all cases, we assume that the formula is true for some N, and show it is then true for N + 1. This is the Inductive Step. Once we do this, we can be sure the formula will always be true, by just extending the base case one or more times using the induction step.
Induction Step: Assume the formula is true for i = 1 to N:
SumOf_{i=1 to N} i = (N * (N + 1))/2

Now, show it also must be true for the case of i = 1 to N+1:
SumOf_{i=1 to N+1} i = (SumOf_{i=1 to N} i) + (N + 1)

By the induction assumption, the first term on the right hand side is simply the sum of the first N integers, which we know to be equal to (N * (N + 1))/2, and so
SumOf_{i=1 to N+1} i = (N*(N+1))/2 + N + 1
                     = (N^2 + N + 2N + 2)/2
                     = ((N + 1)(N + 2))/2

which shows that the formula holds for input of size N+1.
What this tells us is that selection sort is O(N^2), since it takes (N * (N + 1))/2 iterations. Of course we drop lower order terms and constants to get this result.
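For reference, here is a selection sort sketch in Java; each pass scans the unsorted tail of the array, so across all passes the work is proportional to N + (N-1) + ... + 1, which is the sum we just analyzed.

// Selection sort sketch: on pass i, find the smallest remaining element
// and swap it into position i. Each pass scans the unsorted tail of the
// array, so the total work is on the order of N + (N-1) + ... + 1 = O(N^2).
import java.util.*;

public class SelectionSortDemo {
    public static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++)
                if (a[j] < a[min])
                    min = j;                  // remember smallest seen so far
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] a = { 35, 24, 31, 15, 22, 12, 3, 13 };
        selectionSort(a);
        System.out.println(Arrays.toString(a));   // [3, 12, 13, 15, 22, 24, 31, 35]
    }
}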