This module introduces practical entropy coding techniques, such as Huffman Coding, Run-length Coding (RLC) and Arithmetic Coding.

In the module Use of Laplacian PDFs in Image Compression we assumed that ideal entropy coding was used in order to calculate the bit rates for the coded data. In practice we must use real codes, and we shall now see how this affects the compression performance.

There are three main techniques for achieving entropy coding:

  • Huffman Coding - one of the simplest variable length coding schemes.
  • Run-length Coding (RLC) - very useful for binary data containing long runs of ones or zeros.
  • Arithmetic Coding - a relatively new variable length coding scheme that can combine the best features of Huffman and run-length coding, and also adapt to data with non-stationary statistics.
We shall concentrate on the Huffman and RLC methods for simplicity. Interested readers may find out more about Arithmetic Coding in chapters 12 and 13 of the JPEG Book.

First we consider the change in compression performance if simple Huffman Coding is used to code the subimages of the 4-level Haar transform.

The calculation of entropy in our earlier discussion of entropy assumed that each message with probability $p_i$ could be represented by a word of length $-\log_2 p_i$ bits. Huffman codes require the word lengths $l_i$ to be integers and assume that the $p_i$ are adjusted to become:

$\hat{p}_i = 2^{-l_i}$

where the $l_i$ are integers, chosen subject to the constraint that $\sum_i \hat{p}_i \le 1$ (to guarantee that sufficient uniquely decodable code words are available) and such that the mean Huffman word length (Huffman entropy), $\hat{H} = \sum_i p_i l_i$, is minimised.
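
As a concrete check of these definitions, here is a minimal Python sketch (the probabilities $p_i$ and word lengths $l_i$ below are assumed example values, not data from the text) that forms the adjusted probabilities $\hat{p}_i = 2^{-l_i}$, verifies that $\sum_i \hat{p}_i \le 1$, and compares the ideal entropy with the Huffman entropy $\hat{H}$:

    from math import log2

    # Assumed example: probabilities p_i and integer Huffman word lengths l_i.
    p = [0.5, 0.25, 0.125, 0.125]
    l = [1, 2, 3, 3]

    p_hat = [2 ** -li for li in l]                 # adjusted probabilities p^_i = 2^(-l_i)
    assert sum(p_hat) <= 1 + 1e-12                 # constraint: sum_i p^_i <= 1 (Kraft inequality)

    H     = -sum(pi * log2(pi) for pi in p)        # ideal entropy, bits per symbol
    H_hat =  sum(pi * li for pi, li in zip(p, l))  # mean Huffman word length (Huffman entropy)
    print(H, H_hat)                                # both 1.75 here, since each p_i = 2^(-l_i)

When the $p_i$ are not exact negative powers of two, $\hat{H}$ exceeds the ideal entropy, and this gap is the performance loss examined below.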

We can use the probability histograms which generated the entropy plots in figures of level 1 energies, level 2 energies, level 3 energies and level 4 energies to calculate the Huffman entropies $\hat{H}$ for each subimage and compare these with the true entropies to see the loss in performance caused by using real Huffman codes.

An algorithm for finding the optimum code sizes $l_i$ is recommended in the JPEG specification [the JPEG Book, Appendix A, Annex K.2, fig. K.1]; a Matlab M-file to implement it is given in the M-file code.
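
The sketch below is not the Annex K.2 routine or the M-file itself; it is a hedged Python illustration of the same goal, obtaining optimum integer code sizes $l_i$ by repeatedly merging the two least-probable entries of the histogram:

    import heapq

    def huffman_code_lengths(probs):
        """Return optimum integer code sizes l_i for the given probabilities."""
        if len(probs) == 1:
            return [1]                             # a lone symbol still needs 1 bit
        # Heap entries: (subtree probability, unique tie-breaker, symbols in subtree)
        heap = [(pr, i, [i]) for i, pr in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        while len(heap) > 1:
            p1, _,  s1 = heapq.heappop(heap)       # two least-probable subtrees
            p2, t2, s2 = heapq.heappop(heap)
            for sym in s1 + s2:                    # every symbol in the merged subtree
                lengths[sym] += 1                  # moves one level deeper in the code tree
            heapq.heappush(heap, (p1 + p2, t2, s1 + s2))
        return lengths

    print(huffman_code_lengths([0.6, 0.25, 0.1, 0.05]))   # assumed histogram -> [1, 2, 3, 3]

Note that even the 0.6-probability state receives a full 1-bit word, which anticipates the efficiency loss discussed below.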

Comparison of entropies (columns 1, 3, 5) and Huffman coded bit rates (columns 2, 4, 6) for the original (columns 1 and 2) and transformed (columns 3 to 6) Lenna images. In columns 5 and 6, the zero-amplitude state is run-length encoded to produce many states with probabilities < 0.5.

Numerical results used in the figure - entropies and bit rates of subimages for Qstep = 15:

Column:               1        2        3        4        5        6
Level 4                                 0.0264   0.0265   0.0264   0.0266
                                        0.0220   0.0222   0.0221   0.0221
                                        0.0186   0.0187   0.0185   0.0186
                                        0.0171   0.0172   0.0171   0.0173
Level 3                                 0.0706   0.0713   0.0701   0.0705
                                        0.0556   0.0561   0.0557   0.0560
                                        0.0476   0.0482   0.0466   0.0471
Level 2                                 0.1872   0.1897   0.1785   0.1796
                                        0.1389   0.1413   0.1340   0.1353
                                        0.1096   0.1170   0.1038   0.1048
Level 1                                 0.4269   0.4566   0.3739   0.3762
                                        0.2886   0.3634   0.2691   0.2702
                                        0.2012   0.3143   0.1819   0.1828
Untransformed image   3.7106   3.7676
Totals:               3.7106   3.7676   1.6103   1.8425   1.4977   1.5071

The figure above shows the results of applying this algorithm to the probability histograms, and the table lists the same results numerically for ease of analysis. Columns 1 and 2 compare the ideal entropy with the mean word length or bit rate from using a Huffman code (the Huffman entropy) for the case of the untransformed image where the original pels are quantized with Qstep = 15. We see that the increase in bit rate from using the real code is:

$\frac{3.7676}{3.7106} - 1 = 1.5\%$

But when we do the same for the 4-level transformed subimages, we get columns 3 and 4. Here we see that real Huffman codes require an increase in bit rate of:

$\frac{1.8425}{1.6103} - 1 = 14.4\%$

Comparing the results for each subimage in columns 3 and 4, we see that most of the increase in bit rate arises in the three level-1 subimages at the bottom of the columns. This is because each of the probability histograms for these subimages (see the level 1 energies figure) contains one probability that is greater than 0.5. Huffman codes cannot allocate a word length of less than 1 bit to a given event, and so they start to lose efficiency rapidly when $-\log_2 p_i$ becomes less than 1, i.e. when $p_i > 0.5$.
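
To make these figures concrete, the following short Python check recomputes the two percentage increases from the table totals and shows how the ideal word length $-\log_2 p_i$ falls below the 1-bit Huffman minimum once $p_i > 0.5$ (the probabilities in the loop are illustrative values, not taken from the Lenna histograms):

    from math import log2

    # Bit-rate increases quoted above, from the totals row of the table.
    orig_increase  = 3.7676 / 3.7106 - 1       # ~1.5 % for the untransformed image
    trans_increase = 1.8425 / 1.6103 - 1       # ~14.4 % for the 4-level Haar subimages
    print(f"{orig_increase:.1%}  {trans_increase:.1%}")

    # Once one state has p_i > 0.5, its ideal word length -log2(p_i) drops below
    # the 1-bit minimum that any Huffman code must still spend on it.
    for p in (0.5, 0.7, 0.9):                  # illustrative probabilities (assumed)
        print(f"p = {p}:  ideal {-log2(p):.2f} bits,  Huffman word >= 1 bit")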
