Rumour has it that a professor once told his class they'd get an A+ in the course if they could invent a compression method better than dictionary compression(?), believing the problem to be impossible. DavidHuffman was sitting in the audience and actually achieved it, creating 'Huffman Coding' and earning his A+ :)

HuffmanCoding is a __PrefixFreeCode__?, which means that no codeword is a prefix of any other codeword.

An example HuffmanCoding might be |0| A |11| B |10| C

AAAAAACAACAABA

would be encoded as

00000010 00100011 0

for a total of 17 bits, compared to the 28 bits required for a straightforward 2-bits-per-character encoding. Because the code is prefix-free, the bit stream can be decoded unambiguously without any separators between codewords.
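The construction and use of such a code can be sketched in Python. This is a minimal illustration using the standard heapq module; the function names are my own, not from any particular library:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code (symbol -> bitstring) for the symbols in text."""
    freq = Counter(text)
    # Heap entries: (frequency, tie-breaker, tree). A tree is either a
    # bare symbol or a (left, right) pair of subtrees.
    heap = [(f, i, s) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: only one distinct symbol
        return {heap[0][2]: '0'}
    tie = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least frequent trees.
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tie, (t1, t2)))
        tie += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + '0')
            walk(tree[1], prefix + '1')
        else:
            codes[tree] = prefix
    walk(heap[0][2], '')
    return codes

def encode(text, codes):
    return ''.join(codes[c] for c in text)

def decode(bits, codes):
    inv = {v: k for k, v in codes.items()}
    out, cur = [], ''
    for b in bits:
        cur += b
        if cur in inv:  # prefix-freeness makes this greedy match unambiguous
            out.append(inv[cur])
            cur = ''
    return ''.join(out)
```

Running huffman_code on the sample string AAAAAACAACAABA gives A a 1-bit code and B and C 2-bit codes, since the optimal tree merges the two rarest symbols first.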

ArithmeticEncoding, which was invented later, can even use fractional bits per character. For a long time it was not in common use because it was encumbered by IBM patents (since expired).
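The fractional-bits idea can be sketched as follows: the whole message is narrowed down to a single subinterval of [0, 1), and any number inside that interval identifies the message. This toy version uses exact fractions instead of the bit-by-bit renormalization a real coder needs, and all the names are illustrative, not a real library API:

```python
from fractions import Fraction

def intervals(probs):
    """Map each symbol to its cumulative-probability interval [lo, hi)."""
    cum, c = {}, Fraction(0)
    for s, p in probs.items():
        cum[s] = (c, c + p)
        c += p
    return cum

def arithmetic_encode(text, probs):
    """Narrow [0, 1) once per symbol; any x in [low, high) encodes text."""
    cum = intervals(probs)
    low, high = Fraction(0), Fraction(1)
    for ch in text:
        lo, hi = cum[ch]
        width = high - low
        low, high = low + width * lo, low + width * hi
    return low, high

def arithmetic_decode(x, probs, n):
    """Recover n symbols from a number x produced by arithmetic_encode."""
    cum = intervals(probs)
    out = []
    for _ in range(n):
        for s, (lo, hi) in cum.items():
            if lo <= x < hi:
                out.append(s)
                x = (x - lo) / (hi - lo)  # rescale and continue
                break
    return ''.join(out)
```

A likely message ends up in a wide interval (few bits needed to name a point in it), an unlikely one in a narrow interval, which is how it beats whole-bit codes like Huffman's.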

See also http://datacompression.info/Huffman.shtml

This sounds like an UrbanLegend to me. DavidHuffman was not the one who invented entropy-driven compression; Claude Elwood Shannon and RobertMFano did that. DavidHuffman's contribution was to come up with an algorithm that, given a set of weights for a set of symbols, creates an optimal dictionary resulting in the minimum possible length of the encoded data stream. --AristotlePagaltzis

It's possibly a bit of an UrbanLegend, but he did design it for a term paper. I think anyone who invents an efficient compression method such as HuffmanCoding for a term paper deserves an A+ no matter what :) --PerryLorier

I seem to recall that the professor of that class gave students the option of a mix of papers and exam, or they could "bet" their whole course grade on a single paper. DavidHuffman chose the paper option and came up with it very shortly before the paper was due... -- JohnMcPherson
