Bounds on Data Compression Ratio with a Given Tolerable Error Probability

  • Ilan Sadeh
Abstract

The paper treats data compression from the viewpoint of probability theory, where a certain error probability is tolerable. We obtain bounds on the minimal rate, given an error probability, for block coding of general stationary ergodic sources. An application of the theory of large deviations provides numerical methods to compute, for memoryless sources, the minimal compression rate given a tolerable error probability. Interesting connections between Cramér's functions and Shannon's theory of lossy coding are found.
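The large-deviations connection mentioned above can be illustrated with a small numerical sketch (an illustration of Cramér's theorem, not the paper's own method): for a memoryless Bernoulli(p) source, Cramér's rate function is the Legendre transform of the log-moment-generating function, I(x) = sup_t [tx − log E[e^{tX}]], and for Bernoulli sources it coincides with the Kullback–Leibler divergence D(x ‖ p) that governs source-coding error exponents. The function names and the grid-search parameters below are illustrative choices.

```python
import math

def cramer_rate(x, p, t_lo=-20.0, t_hi=20.0, steps=20000):
    """Numerically evaluate Cramer's rate function
    I(x) = sup_t [t*x - log E[exp(t*X)]] for X ~ Bernoulli(p),
    by grid search over t (illustrative, not optimized)."""
    best = 0.0  # the supremum is attained and is >= 0 (take t = 0)
    for i in range(steps + 1):
        t = t_lo + (t_hi - t_lo) * i / steps
        log_mgf = math.log(1.0 - p + p * math.exp(t))  # log-MGF of Bernoulli(p)
        best = max(best, t * x - log_mgf)
    return best

def kl_bernoulli(x, p):
    """Closed form: for Bernoulli sources, I(x) equals the
    Kullback-Leibler divergence D(x || p)."""
    return x * math.log(x / p) + (1 - x) * math.log((1 - x) / (1 - p))
```

For example, `cramer_rate(0.7, 0.5)` agrees with `kl_bernoulli(0.7, 0.5)` to within the grid resolution, and both vanish at x = p, matching the fact that typical sequences incur no large-deviation cost.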

References
1. Algoet, P.H. & Cover, T.M. (1988). A sandwich proof of the Shannon-McMillan-Breiman theorem. Annals of Probability 16: 899–909.
2. Arimoto, S. (1973). On the converse to the coding theorem for discrete memoryless channels. IEEE Transactions on Information Theory IT-19: 357–359.
3. Barron, A.R. (1985). The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem. Annals of Probability 13: 1292–1303.
4. Bender, C.M. & Orszag, S.A. (1987). Advanced mathematical methods for scientists and engineers. New York: McGraw-Hill.
5. Blahut, R.E. (1972). Computation of channel capacity and rate distortion functions. IEEE Transactions on Information Theory IT-18: 460–473.
6. Blahut, R.E. (1987). Principles and practice of information theory. Reading, MA: Addison-Wesley.
7. Bleistein, N. & Handelsman, R.A. (1975). Asymptotic expansions of integrals. New York: Holt, Rinehart and Winston.
8. Breiman, L. (1957). The individual ergodic theorem of information theory. Annals of Mathematical Statistics 28: 809–811 (corrected in 31: 809–810).
9. Covo, Y. (1992). Error bounds for noiseless channels by an asymptotic large deviations theory. M.Sc. thesis, Tel-Aviv University.
10. Covo, Y. & Schuss, Z. (1991). Error bounds for noiseless channels by an asymptotic large deviations theory. Preliminary report.
11. Csiszar, I. & Longo, G. (1971). On the error exponent for source coding and for testing simple statistical hypotheses. First published in the Hungarian Academy of Sciences, Budapest.
12. Davisson, L.D., Longo, G., & Sgarro, A. (1981). The error exponent for the noiseless encoding of finite ergodic Markov sources. IEEE Transactions on Information Theory IT-27: 431–438.
13. Dueck, G. & Korner, J. (1979). Reliability function of a discrete memoryless channel at rates above capacity. IEEE Transactions on Information Theory IT-25: 82–85.
14. Gardiner, C.W. (1985). Handbook of stochastic methods for physics, chemistry and the natural sciences. Springer-Verlag.
15. Gray, R.M. (1975). Sliding block source coding. IEEE Transactions on Information Theory IT-21: 357–368.
16. Gray, R.M. (1990). Entropy and information theory. Springer-Verlag.
17. Gray, R.M., Neuhoff, D.L., & Omura, J.K. (1975). Process definitions of distortion rate functions and source coding theorems. IEEE Transactions on Information Theory IT-21: 524–532.
18. Knessl, C., Matkowsky, B.J., Schuss, Z., & Tier, C. (1985). An asymptotic theory of large deviations for Markov jump processes. SIAM Journal on Applied Mathematics 46(6): 1006–1028.
19. Longo, G. & Sgarro, A. (1979). The source coding theorem revisited: A combinatorial approach. IEEE Transactions on Information Theory IT-25: 544–548.
20. Mackenthun, K.M. & Pursley, M.B. (1978). Variable rate universal block source coding subject to a fidelity constraint. IEEE Transactions on Information Theory IT-24(3): 340–360.
21. Marton, K. (1974). Error exponent for source coding with a fidelity criterion. IEEE Transactions on Information Theory IT-20: 197–199.
22. Omura, J. (1973). A coding theorem for discrete time sources. IEEE Transactions on Information Theory IT-19: 490–498.
23. Orey, S. (1985). On the Shannon-Perez-Moy theorem. Contemporary Mathematics 41: 319–327.
24. Ornstein, D.S. & Shields, P.C. (1990). Universal almost sure data compression. Annals of Probability 18: 441–452.
25. Sadeh, I. (1996). Universal data compression algorithm based on approximate string matching. Probability in the Engineering and Informational Sciences 10: 465–486.
26. Shannon, C.E. (1948). A mathematical theory of communication. Bell System Technical Journal 27: 379–423, 623–656.
27. Shannon, C.E. (1959). Coding theorems for a discrete source with a fidelity criterion. IRE National Conv. Rec. Part 4: 142–163.
28. Ziv, J. (1972). Coding of sources with unknown statistics. Part 1: Probability of encoding error; Part 2: Distortion relative to a fidelity criterion. IEEE Transactions on Information Theory IT-18: 384–394.
Probability in the Engineering and Informational Sciences
  • ISSN: 0269-9648
  • EISSN: 1469-8951