Parity codes

Paulo E. D. Pinto; Fábio Protti; Jayme L. Szwarcfiter

RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications (2005)

  • Volume: 39, Issue: 1, pages 263-278
  • ISSN: 0988-3754

Abstract

Motivated by a problem posed by Hamming in 1980, we define even codes. They are Huffman-type prefix codes with the additional property of being able to detect the occurrence of an odd number of 1-bit errors in the message. We characterize optimal even codes and describe a simple method for constructing such optimal codes. Further, we compare optimal even codes with Huffman codes for equal frequencies. We show that the maximum encoding in an optimal even code is at most two bits larger than the maximum encoding in a Huffman tree. Moreover, it is always possible to choose an optimal even code such that this difference drops to 1 bit. We compare average sizes and show that the average size of an encoding in an optimal even tree is at least 1/3 and at most 1/2 of a bit larger than that of a Huffman tree. These values represent the overhead in the encoding sizes for having the ability to detect an odd number of errors in the message. Finally, we discuss the case of arbitrary frequencies and describe some results for this situation.
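
The abstract does not spell out the construction itself, but the detection property it describes is easy to illustrate. The sketch below is a minimal Python illustration, assuming the usual definition behind such parity codes: every codeword contains an even number of 1s, so a valid encoding (a concatenation of codewords) always has even parity, and flipping any odd number of bits makes the overall parity odd. The code table is a hypothetical toy example chosen by hand, not one produced by the paper's optimal construction.

# Minimal sketch (not the paper's algorithm): a hand-made prefix code in which
# every codeword has an even number of 1s, plus the parity check that detects
# any odd number of 1-bit errors in the encoded message.

TOY_EVEN_CODE = {   # prefix-free; each codeword has an even count of 1s (assumed definition)
    "a": "0",
    "b": "11",
    "c": "101",
    "d": "1001",
}
DECODE_TABLE = {v: k for k, v in TOY_EVEN_CODE.items()}

def encode(message: str) -> str:
    """Concatenate the codewords of the message's symbols."""
    return "".join(TOY_EVEN_CODE[s] for s in message)

def odd_error_detected(bits: str) -> bool:
    """A valid encoding has an even number of 1s, so an odd number of
    flipped bits leaves the received string with odd parity."""
    return bits.count("1") % 2 == 1

def decode(bits: str) -> str:
    """Standard prefix decoding, applied after the parity check passes."""
    symbols, buffer = [], ""
    for bit in bits:
        buffer += bit
        if buffer in DECODE_TABLE:
            symbols.append(DECODE_TABLE[buffer])
            buffer = ""
    if buffer:
        raise ValueError("trailing bits do not form a codeword")
    return "".join(symbols)

if __name__ == "__main__":
    sent = encode("badcab")
    assert not odd_error_detected(sent)
    assert decode(sent) == "badcab"
    # Flip one bit (an odd number of errors): the parity check flags it.
    corrupted = sent[:3] + ("0" if sent[3] == "1" else "1") + sent[4:]
    assert odd_error_detected(corrupted)

Note that this global parity check only detects the presence of an odd number of errors; it does not locate or correct them, which matches the guarantee stated in the abstract.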

How to cite


Pinto, Paulo E. D., Protti, Fábio, and Szwarcfiter, Jayme L. "Parity codes." RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications 39.1 (2005): 263-278. <http://eudml.org/doc/244903>.

@article{Pinto2005,
abstract = {Motivated by a problem posed by Hamming in 1980, we define even codes. They are Huffman-type prefix codes with the additional property of being able to detect the occurrence of an odd number of 1-bit errors in the message. We characterize optimal even codes and describe a simple method for constructing such optimal codes. Further, we compare optimal even codes with Huffman codes for equal frequencies. We show that the maximum encoding in an optimal even code is at most two bits larger than the maximum encoding in a Huffman tree. Moreover, it is always possible to choose an optimal even code such that this difference drops to 1 bit. We compare average sizes and show that the average size of an encoding in an optimal even tree is at least $1/3$ and at most $1/2$ of a bit larger than that of a Huffman tree. These values represent the overhead in the encoding sizes for having the ability to detect an odd number of errors in the message. Finally, we discuss the case of arbitrary frequencies and describe some results for this situation.},
author = {Pinto, Paulo E. D. and Protti, Fábio and Szwarcfiter, Jayme L.},
journal = {RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications},
keywords = {Hamming-Huffman code; even code; prefix code; optimal even code},
language = {eng},
number = {1},
pages = {263-278},
publisher = {EDP-Sciences},
title = {Parity codes},
url = {http://eudml.org/doc/244903},
volume = {39},
year = {2005},
}

TY - JOUR
AU - Pinto, Paulo E. D.
AU - Protti, Fábio
AU - Szwarcfiter, Jayme L.
TI - Parity codes
JO - RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications
PY - 2005
PB - EDP-Sciences
VL - 39
IS - 1
SP - 263
EP - 278
AB - Motivated by a problem posed by Hamming in 1980, we define even codes. They are Huffman-type prefix codes with the additional property of being able to detect the occurrence of an odd number of 1-bit errors in the message. We characterize optimal even codes and describe a simple method for constructing such optimal codes. Further, we compare optimal even codes with Huffman codes for equal frequencies. We show that the maximum encoding in an optimal even code is at most two bits larger than the maximum encoding in a Huffman tree. Moreover, it is always possible to choose an optimal even code such that this difference drops to 1 bit. We compare average sizes and show that the average size of an encoding in an optimal even tree is at least $1/3$ and at most $1/2$ of a bit larger than that of a Huffman tree. These values represent the overhead in the encoding sizes for having the ability to detect an odd number of errors in the message. Finally, we discuss the case of arbitrary frequencies and describe some results for this situation.
LA - eng
KW - Hamming-Huffman code; even code; prefix code; optimal even code
UR - http://eudml.org/doc/244903
ER -

References

  [1] N. Faller, An Adaptive Method for Data Compression, in Record of the 7th Asilomar Conference on Circuits, Systems and Computers, Naval Postgraduate School, Monterey, CA (1973) 593–597. Zbl 0304.68038
  [2] R.G. Gallager, Variations on a Theme by Huffman. IEEE Trans. Inform. Theory 24 (1978) 668–674. Zbl 0399.94012
  [3] R.W. Hamming, Coding and Information Theory. Prentice Hall (1980). Zbl 0431.94001 MR 555735
  [4] D.A. Huffman, A Method for the Construction of Minimum-Redundancy Codes. Proc. of the IRE 40 (1952) 1098–1101.
  [5] D.E. Knuth, The Art of Computer Programming. Addison-Wesley (1973). Zbl 1127.68068 MR 378456
  [6] D.E. Knuth, Dynamic Huffman Coding. J. Algorithms 6 (1985) 163–180. Zbl 0606.94007
  [7] E.S. Laber, Um algoritmo eficiente para construção de códigos de prefixo com restrição de comprimento [An efficient algorithm for constructing length-restricted prefix codes]. Master Thesis, PUC-RJ, Rio de Janeiro (1997).
  [8] L.L. Larmore and D.S. Hirschberg, A Fast Algorithm for Optimal Length-Limited Huffman Codes. J. ACM 37 (1990) 464–473. Zbl 0699.68070
  [9] R.L. Milidiú, E.S. Laber and A.A. Pessoa, Improved Analysis of the FGK Algorithm. J. Algorithms 28 (1999) 195–211. Zbl 1111.94328
  [10] R.L. Milidiú and E.S. Laber, The Warm-up Algorithm: A Lagrangean Construction of Length Restricted Huffman Codes. SIAM J. Comput. 30 (2000) 1405–1426. Zbl 0987.94020
  [11] R.L. Milidiú and E.S. Laber, Improved Bounds on the Inefficiency of Length Restricted Codes. Algorithmica 31 (2001) 513–529. Zbl 1012.94008
  [12] A. Turpin and A. Moffat, Practical Length-Limited Coding for Large Alphabets. Comput. J. 38 (1995) 339–347.
  [13] P.E.D. Pinto, F. Protti and J.L. Szwarcfiter, A Huffman-Based Error Detection Code, in Proc. of the Third International Workshop on Experimental and Efficient Algorithms (WEA 2004), Angra dos Reis, Brazil, 2004. Lect. Notes Comput. Sci. 3059 (2004) 446–457.
  [14] E.S. Schwartz, An Optimum Encoding with Minimal Longest Code and Total Number of Digits. Inform. Control 7 (1964) 37–44. Zbl 0116.35303
