11-03-2014, 03:08 PM
Huffman Coding
Huffman Coding.ppt (Size: 743.5 KB / Downloads: 20)
A simple example
Suppose we have a message built from an alphabet of 5 symbols.
How can we encode this message in 0s and 1s so that the coded message has minimum length (for transmission or storage)?
With 5 distinct symbols, a fixed-length code needs at least 3 bits per symbol (since 2^2 = 4 < 5 ≤ 2^3 = 8).
For a simple fixed-length encoding of a 10-symbol message,
the length of the code is 10 × 3 = 30 bits.
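The arithmetic can be checked with a short sketch (the 10-character message and its 5-symbol alphabet below are hypothetical stand-ins for the slide's example):

```python
import math

message = "aaaabbbcde"  # hypothetical 10-character message over 5 symbols
alphabet = sorted(set(message))

# Fixed-length encoding: every symbol gets the same number of bits
bits_per_symbol = math.ceil(math.log2(len(alphabet)))  # 5 symbols -> 3 bits
total_bits = bits_per_symbol * len(message)            # 10 * 3 = 30 bits
print(bits_per_symbol, total_bits)
```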
Source Coding Theorem
There exists a variable-length encoding C of an ensemble X such that the average length of an encoded symbol, L(C, X), satisfies H(X) ≤ L(C, X) < H(X) + 1, where H(X) is the entropy of X.
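The entropy bound can be illustrated numerically; the example message here is an assumption for demonstration, not part of the original slides:

```python
import math
from collections import Counter

message = "aaaabbbcde"  # hypothetical example message
n = len(message)
probs = [count / n for count in Counter(message).values()]

# Shannon entropy H(X) in bits per symbol
entropy = -sum(p * math.log2(p) for p in probs)

# The source coding theorem guarantees some code C with
#   H(X) <= L(C, X) < H(X) + 1
print(f"H(X) = {entropy:.3f} bits/symbol")
```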
The Huffman coding algorithm produces optimal symbol codes
The Huffman Coding algorithm- History
In 1951, David Huffman and his MIT information theory classmates were given the choice of a term paper or a final exam.
Huffman hit upon the idea of using a frequency-sorted binary tree and quickly proved this method the most efficient.
In doing so, the student outdid his professor, who had worked with information theory inventor Claude Shannon to develop a similar code.
Huffman built the tree from the bottom up instead of from the top down
Disadvantages of the Huffman Code
Changing ensemble
If the ensemble changes, the frequencies and probabilities change, and so the optimal coding changes
e.g. in text compression symbol frequencies vary with context
Re-computing the Huffman code means running through the entire file in advance
The code itself must also be saved or transmitted along with the data
Does not consider ‘blocks of symbols’
e.g. after ‘strings_of_ch’ the next nine symbols ‘aracters_’ are predictable, but bits are still used for them without conveying any new information
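The "changing ensemble" point can be made concrete by comparing the total coded length a Huffman procedure produces for two different frequency profiles over the same alphabet (the counts below are made up for illustration). This sketch computes only the total cost, using the fact that it equals the sum of the combined weights at each merge:

```python
import heapq

def huffman_cost(freqs):
    """Total coded length (sum of freq * codeword depth) for a Huffman code.

    Equals the sum of combined weights over all merges of the two
    lowest-frequency bins.
    """
    heap = list(freqs)
    heapq.heapify(heap)
    cost = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        cost += a + b
        heapq.heappush(heap, a + b)
    return cost

# Same five symbols, two hypothetical contexts with different frequencies:
print(huffman_cost([4, 3, 1, 1, 1]))  # optimal total bits in one context
print(huffman_cost([1, 1, 1, 1, 6]))  # a different context needs a different code
```

Because the two profiles yield different optimal totals, a code tuned for one context is no longer optimal in the other.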
Huffman Coding Algorithm for Image Compression
Step 1. Build a Huffman tree by sorting the histogram and successively combining the two bins with the lowest counts until only one bin remains.
Step 2. Encode the Huffman tree and save it along with the coded values.
Step 3. Encode the residual image.
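Step 1 and Step 3 can be sketched as follows; this is a minimal illustration with a hypothetical 8-bin histogram, not the presentation's implementation:

```python
import heapq
from itertools import count

def huffman_codes(hist):
    """Build a Huffman code from a histogram {symbol: frequency}.

    Repeatedly merges the two lowest-frequency bins (Step 1), prefixing
    '0' to the codes in one bin and '1' to the codes in the other.
    """
    tiebreak = count()  # unique tag so the heap never compares dicts
    heap = [(freq, next(tiebreak), {sym: ""}) for sym, freq in hist.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

# Hypothetical histogram of an 8-level residual image
hist = {0: 40, 1: 25, 2: 15, 3: 10, 4: 5, 5: 3, 6: 1, 7: 1}
codes = huffman_codes(hist)

# Step 3: encode pixel values by concatenating their codewords
encoded = "".join(codes[px] for px in [0, 0, 1, 3])
print(codes, encoded)
```

Frequent bins get short codewords and rare bins long ones, which is why the coded residual image is smaller than a fixed-length encoding.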