25-07-2014, 11:09 AM
Algorithm Design Techniques: Greedy Algorithms
Algorithm Design.ppt (Size: 516.5 KB / Downloads: 12)
Introduction
Algorithm Design Techniques
Design of algorithms
Algorithms commonly used to solve problems
Greedy, Divide and Conquer, Dynamic Programming, Randomized, Backtracking
General approach
Examples
Time and space complexity (where appropriate)
Greedy Algorithms
Choose the best option during each phase
Dijkstra, Prim, Kruskal
Making change
Choose the largest bill at each round
Does this always work?
Are there denominations for which greedy does not work?
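A minimal sketch of the largest-bill rule (function name is mine, not from the slides). With US-style denominations it happens to be optimal, but with denominations {1, 3, 4} and amount 6 greedy returns three coins where two suffice:

```python
def greedy_change(amount, denominations):
    """Greedy change-making: repeatedly take the largest
    denomination that still fits until the amount is exhausted."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins if amount == 0 else None

print(greedy_change(63, [1, 5, 10, 25]))  # [25, 25, 10, 1, 1, 1] -- optimal here
print(greedy_change(6, [1, 3, 4]))        # [4, 1, 1] -- but [3, 3] is optimal
```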
Scheduling
Greedy-choice property: if the shortest job does not go first, moving it to the front delays each job that was ahead of it by only the shortest job's length, while the shortest job itself finishes earlier by the total length of those jobs; total completion time cannot increase
Optimal substructure: if shortest job is removed from optimal solution, remaining solution for n-1 jobs is optimal
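The exchange argument can be checked numerically; a sketch (names are mine, not from the slides) that computes total completion time and verifies shortest-job-first beats every other ordering on a small instance:

```python
from itertools import permutations

def total_completion_time(jobs):
    """Sum of completion times when jobs run in the given order."""
    elapsed = total = 0
    for t in jobs:
        elapsed += t          # this job finishes at time `elapsed`
        total += elapsed
    return total

jobs = [3, 8, 6]
sjf = total_completion_time(sorted(jobs))   # shortest job first: 3, 6, 8
assert all(sjf <= total_completion_time(p) for p in permutations(jobs))
```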
Huffman Codes
Goal: find a full binary tree of minimum cost where characters are stored in the leaves
Cost of tree: sum across all characters of the frequency of the character times its depth in the tree
frequently occurring characters should be near the top of the tree (small depth)
Huffman’s Algorithm
How do we produce a code?
Maintain a forest of trees
weight of a tree is the sum of the frequencies of the leaves
start with C trees to represent each character
weight of each is frequency of that character
Until there is only 1 tree
choose the 2 trees with the smallest weights and merge them by creating a new root and making each tree a right or left subtree
Running time – O(C log C) with a heap (C − 1 merges, each O(log C))
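The forest-merging loop can be sketched with a binary heap (this rendering is mine, not code from the slides); each tree is represented as a dict mapping a character to its code so far, and merging two trees prepends one bit to every codeword:

```python
import heapq
from itertools import count

def huffman_codes(freq):
    """Huffman's greedy algorithm: repeatedly merge the two
    lightest trees in the forest until one tree remains.
    freq: dict mapping character -> frequency."""
    tiebreak = count()  # unique key so equal-weight trees never compare dicts
    # Forest: one single-leaf tree per character, weighted by its frequency.
    heap = [(w, next(tiebreak), {ch: ""}) for ch, w in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Greedy step: pop the two smallest weights, merge under a new root.
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in left.items()}
        merged.update({ch: "1" + code for ch, code in right.items()})
        heapq.heappush(heap, (w1 + w2, next(tiebreak), merged))
    return heap[0][2]
```

For example, on frequencies {a: 45, b: 13, c: 12, d: 16, e: 9, f: 5} the resulting code has total cost 224, and 'a' (the most frequent character) gets a one-bit codeword.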
Optimality Proof – Idea
Greedy choice property: given x and y -- characters with lowest frequency in alphabet C, there exists an optimal prefix code for C in which the codewords for x and y have the same length and differ only in the last bit
Take an arbitrary optimal prefix code and modify it to make it a tree representing another optimal prefix code such that x and y are sibling leaves of max depth
On-line Algorithms
On-line algorithms cannot guarantee an optimal solution
Problem: the algorithm cannot know when the input will end
Example: M small items of size ½ − ε followed by M large items of size ½ + ε
An optimal packing fits them into M bins, one large and one small item per bin
So if all the small items come first, the algorithm should place them in M separate bins to leave room for the large items
But if the input is only the M small items, it has then used twice as many bins as necessary
There are inputs that force any on-line bin-packing algorithm to use at least 4/3 times the optimal number of bins.
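To see the difficulty concretely, here is a sketch of on-line first fit (integer sizes keep the arithmetic exact; the choice of first fit is mine, while the slide's lower bound applies to every on-line algorithm). On M small items just under half capacity followed by M large items just over half, it uses 3M/2 bins against an optimal M:

```python
def first_fit(items, capacity=100):
    """On-line first fit: place each item, as it arrives, into the
    first open bin with enough room; open a new bin if none fits."""
    bins = []
    for size in items:
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)
                break
        else:
            bins.append([size])
    return bins

M = 10
items = [49] * M + [51] * M          # sizes out of a capacity of 100
assert len(first_fit(items)) == 15   # optimal packing needs only M = 10 bins
```

First fit pairs the small items (M/2 bins) and is then forced to open a fresh bin for every large item, illustrating how an on-line algorithm commits before it sees the rest of the input.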
Off-line Bin Packing
Sort items (in decreasing order) first for easier placement of large items
Apply first fit or best fit algorithm
First fit on the sorted list .8, .7, .5, .4, .3, .2, .1 yields bins (.8, .2), (.7, .3), (.5, .4, .1)
Let M be the optimal number of bins required to pack a list I of items. Then first fit decreasing never uses more than (11/9)M + 4 bins.
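The off-line variant just sorts before packing; a sketch (my code, with the item sizes scaled by 10 to keep the arithmetic exact) that reproduces the three-bin example above:

```python
def first_fit_decreasing(items, capacity=10):
    """Off-line heuristic: sort items in decreasing order, then run
    first fit, so the large items claim bins before the small ones."""
    bins = []
    for size in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)
                break
        else:
            bins.append([size])
    return bins

# .8 .7 .5 .4 .3 .2 .1 as integers out of a capacity of 10:
print(first_fit_decreasing([8, 7, 5, 4, 3, 2, 1]))  # [[8, 2], [7, 3], [5, 4, 1]]
```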