HuffmanCoding

Rumour has it that a professor once told his class that they'd get an A+ in the course if they could invent a better compression method than dictionary compression(?), believing the problem to be impossible. Mr Huffman was sitting in the audience, actually achieved it, and came away with both 'Huffman Coding' and an A+ :)

:This sounds like an UrbanLegend to me. Huffman? was not the one who invented entropy-driven compression; Shannon? and Fano? did that. Huffman?'s contribution was an algorithm that constructs an optimal dictionary from a set of weights for a set of symbols, resulting in the minimum possible length of the encoded data stream. --AristotlePagaltzis
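
As a rough illustration of that construction, here is a Python sketch (the function and variable names are my own choosing) that repeatedly merges the two lightest subtrees and then reads the codewords off the branches of the finished tree:

import heapq
from itertools import count

def huffman_code(weights):
    """Build a prefix-free code table from a {symbol: weight} dict."""
    tiebreak = count()  # keeps heap entries comparable when weights are equal
    heap = [(w, next(tiebreak), sym) for sym, w in weights.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)    # the two lightest subtrees...
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next(tiebreak), (left, right)))  # ...merged
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: branch
            walk(node[0], prefix + '0')
            walk(node[1], prefix + '1')
        else:                                # leaf: record the codeword
            codes[node] = prefix or '0'      # single-symbol edge case
    walk(heap[0][2], '')
    return codes

# Weights taken from the example string below (11 A's, 1 B, 2 C's):
print(huffman_code({'A': 11, 'B': 1, 'C': 2}))

This prints a code with a 1-bit codeword for A and 2-bit codewords for B and C; the exact 0/1 assignments may differ from the table below, but any such assignment is equally optimal.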

HuffmanCoding is a PrefixFreeCode?, which means that no codeword is a prefix of another one.

An example HuffmanCoding might be: A = 0, B = 10, C = 11

AAAAAACAACAABA

would be encoded as

00000011 00110010 0

for a total of 17 bits, compared to the 28 bits required for a straightforward 2-bits-per-character encoding.
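
To see the prefix-free property doing the work, here is a small Python sketch (function names are my own) that encodes and decodes the string above with that table; the decoder can emit a symbol as soon as its buffer matches a codeword, precisely because no codeword is a prefix of another:

CODE = {'A': '0', 'B': '10', 'C': '11'}   # the table from the example above

def encode(text, code=CODE):
    """Concatenate the codeword of each character."""
    return ''.join(code[ch] for ch in text)

def decode(bits, code=CODE):
    """Read bits left to right, emitting a symbol whenever the buffer
    matches a codeword -- unambiguous because the code is prefix-free."""
    reverse = {v: k for k, v in code.items()}
    out, buffer = [], ''
    for bit in bits:
        buffer += bit
        if buffer in reverse:
            out.append(reverse[buffer])
            buffer = ''
    return ''.join(out)

bits = encode('AAAAAACAACAABA')
print(len(bits))     # 17, versus 14 * 2 = 28 bits for a fixed 2-bit code
print(decode(bits))  # AAAAAACAACAABA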

ArithmeticEncoding, which was invented later, can even use fractional ('partial') bits per character. It is not in common use since it has been covered by IBM patents.
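
To illustrate the fractional-bit idea, here is a toy Python sketch using exact fractions (real arithmetic coders use fixed-precision integer arithmetic and stream their output as they go; the function names, and the choice of probabilities as simply the symbol frequencies of the example string above, are my own assumptions). The message narrows [0, 1) down to a small sub-interval, and the output is the shortest binary fraction that lands inside it:

from fractions import Fraction
from math import ceil

# Assumed probabilities: the symbol frequencies of AAAAAACAACAABA (11 A, 1 B, 2 C).
PROBS = {'A': Fraction(11, 14), 'B': Fraction(1, 14), 'C': Fraction(2, 14)}

def intervals(probs):
    """Give each symbol a half-open slice of [0, 1) proportional to its probability."""
    table, low = {}, Fraction(0)
    for sym, p in sorted(probs.items()):
        table[sym] = (low, p)
        low += p
    return table

def encode(message, probs):
    """Narrow [0, 1) once per symbol, then return the bit count and value of
    the shortest binary fraction inside the final interval."""
    table = intervals(probs)
    low, width = Fraction(0), Fraction(1)
    for sym in message:
        start, p = table[sym]
        low, width = low + width * start, width * p
    bits = 1
    while True:
        n = ceil(low * 2 ** bits)              # smallest multiple of 2**-bits that is >= low
        if Fraction(n, 2 ** bits) < low + width:
            return bits, Fraction(n, 2 ** bits)
        bits += 1

def decode(value, length, probs):
    """Undo the narrowing: find the slice `value` falls in, emit that symbol, rescale."""
    table = intervals(probs)
    out = []
    for _ in range(length):
        for sym, (start, p) in table.items():
            if start <= value < start + p:
                out.append(sym)
                value = (value - start) / p
                break
    return ''.join(out)

msg = 'AAAAAACAACAABA'
nbits, value = encode(msg, PROBS)
print(nbits, decode(value, len(msg), PROBS) == msg)   # fewer bits than the 17 above, and True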