06-08-2013, 02:22 PM
Greedy Algorithms
Optimization problems
An optimization problem is one in which you want to find, not just a solution, but the best solution
A “greedy algorithm” sometimes works well for optimization problems
A greedy algorithm works in phases. At each phase:
You take the best you can get right now, without regard for future consequences
You hope that by choosing a local optimum at each step, you will end up at a global optimum
Example: Counting money
Suppose you want to count out a certain amount of money, using the fewest possible bills and coins
A greedy algorithm to do this would be:
At each step, take the largest possible bill or coin that does not overshoot
Example: To make $6.39, you can choose:
a $5 bill
a $1 bill, to make $6
a 25¢ coin, to make $6.25
a 10¢ coin, to make $6.35
four 1¢ coins, to make $6.39
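The steps above can be sketched as a short program. This is a minimal sketch, not code from the slides; it works in cents to avoid floating-point error, and the `greedy_change` name is made up here:

```python
# Greedy change-making (a sketch). Amounts are in cents so that
# $6.39 is the exact integer 639.
def greedy_change(amount, denominations):
    """At each step, take the largest denomination that does not overshoot."""
    counts = {}
    for d in sorted(denominations, reverse=True):
        counts[d], amount = divmod(amount, d)  # how many of d fit, and the remainder
    return {d: c for d, c in counts.items() if c > 0}

# US denominations: $5 and $1 bills, then 25c, 10c, 5c, 1c coins
us = [500, 100, 25, 10, 5, 1]
print(greedy_change(639, us))  # {500: 1, 100: 1, 25: 1, 10: 1, 1: 4}
```

The output matches the slide's hand trace: one $5 bill, one $1 bill, one quarter, one dime, and four pennies.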
For US money, the greedy algorithm always gives the optimum solution
A failure of the greedy algorithm
In some (fictional) monetary system, “krons” come in 1 kron, 7 kron, and 10 kron coins
Using a greedy algorithm to count out 15 krons, you would get
A 10 kron piece
Five 1 kron pieces, for a total of 15 krons
This requires six coins
A better solution would be to use two 7 kron pieces and one 1 kron piece
This only requires three coins
The greedy algorithm results in a solution, but not in an optimal solution
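The kron counterexample is easy to check by comparing the greedy count against an exact answer. This sketch (function names are mine) computes the optimum by dynamic programming:

```python
# Greedy vs. optimal change for the fictional kron coins {1, 7, 10}
def greedy_count(amount, coins):
    """Number of coins the greedy strategy uses."""
    n = 0
    for c in sorted(coins, reverse=True):
        n += amount // c
        amount %= c
    return n

def optimal_count(amount, coins):
    """Fewest coins, by dynamic programming: best[i] = fewest coins summing to i."""
    best = [0] + [float('inf')] * amount
    for i in range(1, amount + 1):
        best[i] = min(best[i - c] + 1 for c in coins if c <= i)
    return best[amount]

print(greedy_count(15, [1, 7, 10]))   # 6  (one 10-kron piece + five 1-kron pieces)
print(optimal_count(15, [1, 7, 10]))  # 3  (two 7-kron pieces + one 1-kron piece)
```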
Traveling salesman
A salesman must visit every city (starting from city A), and wants to cover the least possible distance
He can revisit a city (and reuse a road) if necessary
He does this by using a greedy algorithm: from wherever he is, he goes to the nearest city he has not yet visited
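That strategy is the nearest-neighbor heuristic, sketched below. The distance matrix is made up for illustration; like the kron example, the tour it produces is not guaranteed to be the shortest one:

```python
# Nearest-neighbor heuristic (a sketch): from the current city, always
# move to the closest city not yet visited.
def nearest_neighbor_tour(dist, start=0):
    """dist[i][j] is the distance between cities i and j; returns a visit order."""
    n = len(dist)
    tour, unvisited = [start], set(range(n)) - {start}
    while unvisited:
        here = tour[-1]
        nxt = min(unvisited, key=lambda c: dist[here][c])  # greedy choice
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]
print(nearest_neighbor_tour(dist))  # [0, 1, 3, 2]
```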
Analysis
A greedy algorithm typically makes (approximately) n choices for a problem of size n
(The first or last choice may be forced)
Hence the expected running time is:
O(n · choice(n)), where choice(n) is the cost of making one choice among n objects
Counting: Must find the largest usable coin from among k sizes of coin (k is a constant), an O(k) = O(1) operation
Therefore, coin counting is O(n)
Huffman: Must sort n values before making n choices
Therefore, Huffman is O(n log n) + O(n) = O(n log n)
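A compact way to see the Huffman analysis is a heap-based sketch: each greedy choice (merging the two lowest-frequency trees) costs O(log n) heap work, done n − 1 times. This is a sketch, not the slides' code; frequencies and names are made up:

```python
# Huffman coding sketch: greedily merge the two lowest-frequency trees.
# Heap entries are (frequency, tie-breaker, {symbol: code-so-far}).
import heapq

def huffman_codes(freqs):
    heap = [(f, i, {s: ''}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # cheapest tree
        f2, _, c2 = heapq.heappop(heap)   # second cheapest tree
        merged = {s: '0' + code for s, code in c1.items()}
        merged.update({s: '1' + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_codes({'a': 45, 'b': 13, 'c': 12, 'd': 5})
print(codes)  # lower-frequency symbols get longer codes
```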
Minimum spanning tree: At each new node, must include new edges and keep them sorted, which is O(n log n) overall
Therefore, MST is O(n log n) + O(n) = O(n log n)
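The MST step can likewise be sketched with a heap standing in for the sorted edge list; this is Prim's algorithm in minimal form, with a made-up example graph:

```python
# Prim's algorithm sketch: greedily add the cheapest edge that reaches
# a node not yet in the tree, keeping candidate edges in a heap.
import heapq

def prim_mst_weight(adj, start=0):
    """adj: {node: [(weight, neighbor), ...]} for an undirected graph."""
    seen, total = {start}, 0
    heap = list(adj[start])
    heapq.heapify(heap)
    while heap and len(seen) < len(adj):
        w, v = heapq.heappop(heap)
        if v in seen:
            continue            # edge leads back into the tree; skip it
        seen.add(v)
        total += w
        for edge in adj[v]:     # include the new node's edges
            heapq.heappush(heap, edge)
    return total

graph = {0: [(1, 1), (4, 2)],
         1: [(1, 0), (2, 2), (6, 3)],
         2: [(4, 0), (2, 1), (3, 3)],
         3: [(6, 1), (3, 2)]}
print(prim_mst_weight(graph))  # 6  (edges 0-1, 1-2, 2-3)
```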
Analysis of Dijkstra’s algorithm I
Assume that the average out-degree of a node is some constant k
Initially,
Mark the given node as known (path length is zero)
This takes O(1) (constant) time
For each out-edge, set the distance in each neighboring node equal to the cost (length) of the out-edge, and set its predecessor to the initially given node
If each node refers to a list of k adjacent node/edge pairs, this takes O(k) = O(1) time, that is, constant time
Notice that this operation takes longer if we have to extract a list of names from a hash table
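The initialization described above, followed by the repeated greedy step, can be sketched with a heap; this is a minimal Dijkstra sketch, not the slides' code, and the example graph is made up:

```python
# Dijkstra sketch: mark the start as known at distance 0, then repeatedly
# take the cheapest reachable node and relax each of its out-edges.
import heapq

def dijkstra(adj, start):
    """adj: {node: [(cost, neighbor), ...]}; returns shortest distances from start."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue  # a shorter path to u was already found
        for cost, v in adj[u]:
            nd = d + cost
            if nd < dist.get(v, float('inf')):
                dist[v] = nd            # set/update distance and re-queue
                heapq.heappush(heap, (nd, v))
    return dist

adj = {'A': [(2, 'B'), (5, 'C')],
       'B': [(1, 'C')],
       'C': []}
print(dijkstra(adj, 'A'))  # {'A': 0, 'B': 2, 'C': 3}
```

With adjacency stored as a list of (cost, neighbor) pairs per node, the initial relaxation of the start node's k out-edges is the O(k) = O(1) step the slide describes.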