Noiseless Coding Theorems on New Generalized Useful Information Measure of order α and β type

Authors

  • Ashiq Hussain Bhat
  • Mirza Abdul Khalique Baig, University of Kashmir

Keywords:

Shannon’s entropy, codeword length, useful information measure, Kraft inequality, Hölder’s inequality, Huffman codes, Shannon-Fano codes, Noiseless coding theorem

Abstract

In this paper we define a new generalized useful average code-word length of order α and type β and discuss its relationship with a new generalized useful information measure of order α and type β. Lower and upper bounds on the proposed average code-word length, in terms of the proposed information measure, are derived for a discrete noiseless channel. The measures defined in this communication are not only new; several well-known measures already existing in the literature of useful information and coding theory arise as particular cases of them. The noiseless coding theorems for the discrete channel proved in this paper are verified by applying Huffman and Shannon-Fano coding schemes to empirical data. The important properties of the proposed average code-word length have also been studied.
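As a rough illustration of the classical machinery behind this kind of verification, the sketch below (not taken from the paper) applies the un-parameterized limiting forms of these quantities to made-up data: Huffman and Shannon-Fano-type code-word lengths, the Kraft inequality check, and the utility-weighted ("useful") mean code-word length together with its entropy lower bound. The probabilities, utilities, and all names in the code are illustrative assumptions; the paper's order α and type β measures generalize these classical forms.

```python
# A minimal sketch, NOT the paper's own computation: it illustrates, on made-up
# data, the classical (un-parameterized) quantities that the order-alpha, type-beta
# measures generalize -- the Kraft inequality, Huffman and Shannon-Fano-type code
# lengths, and the utility-weighted ('useful') mean code-word length with its
# entropy lower bound.  All probabilities, utilities, and names are illustrative.
import heapq
from math import ceil, log2

p = [0.40, 0.30, 0.15, 0.10, 0.05]   # illustrative source probabilities
u = [5.0, 4.0, 3.0, 2.0, 1.0]        # illustrative utilities of the source symbols

def huffman_lengths(probs):
    """Binary Huffman code-word lengths for the given probability list."""
    heap = [(q, [i]) for i, q in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        q1, ids1 = heapq.heappop(heap)
        q2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:          # each merge adds one bit to the merged symbols
            lengths[i] += 1
        heapq.heappush(heap, (q1 + q2, ids1 + ids2))
    return lengths

# Utility-weighted distribution q_i = u_i p_i / sum_j u_j p_j.
w = sum(ui * pi for ui, pi in zip(u, p))
q = [ui * pi / w for ui, pi in zip(u, p)]

huff = huffman_lengths(p)                 # Huffman code built from p
sf = [ceil(-log2(qi)) for qi in q]        # Shannon-Fano-type lengths built from q

def kraft_sum(lengths):
    """Kraft sum; uniquely decodable binary codes satisfy sum(2^-l_i) <= 1."""
    return sum(2.0 ** -l for l in lengths)

def useful_mean_length(lengths):
    """Utility-weighted ('useful') mean code-word length: sum(u p l) / sum(u p)."""
    return sum(ui * pi * li for ui, pi, li in zip(u, p, lengths)) / w

# Entropy of q: a lower bound on the useful mean length of any code obeying Kraft.
H_q = -sum(qi * log2(qi) for qi in q)

for name, lengths in (("Huffman", huff), ("Shannon-Fano-type", sf)):
    print(f"{name:18s} lengths={lengths} Kraft={kraft_sum(lengths):.4f} "
          f"L={useful_mean_length(lengths):.4f} >= H(q)={H_q:.4f}")
```

For the data above, the Huffman code meets the Kraft inequality with equality and both schemes respect the entropy lower bound, mirroring the kind of empirical check described in the abstract for the generalized measures.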

References

Belis M, Guiasu S. A quantitative-qualitative measure of information in cybernetic systems. IEEE Transactions on Information Theory, 1968; 14:593-594

Bhaker U.S, Hooda D.S. Mean value characterization of ‘useful’ information measure. Tamkang Journal of Mathematics, 1993; 24:283-294

Bhat A.H, Baig M.A.K. Some coding theorems on generalized Rényi’s entropy of order α and type β. International Journal of Applied Mathematics and Information Sciences Letters, 2016; 5:1-5

Campbell L.L. A coding theorem and Rényi’s entropy. Information and Control, 1965; 8:423-429

Guiasu S, Picard C.F. Borne inférieure de la longueur de certains codes. C.R. Académie des Sciences, Paris, 1971; 273:248-251

Gurdial, Pessoa F. On ‘useful’ information of order α. Journal of Combinatorics, Information and System Sciences, 1977; 2:158-162

Hartley R.V.L. Transmission of information. Bell System Technical Journal, 1928; 7: 535-563

Hooda D.S, Bhaker U.S. A generalized ‘useful’ information measure and coding theorems. Soochow Journal of Mathematics, 1997; 23:53–62

Jain P, Tuteja R.K. On coding theorems connected with ‘useful’ entropy of order β. International Journal of Mathematics and Mathematical Sciences, 1989; 12:193-198

Khan A.B, Bhat B.A, Pirzada S. Some results on a generalized ‘useful’ information measure. Journal of Inequalities in Pure and Applied Mathematics, 2005; 6:117

Kraft L.G. A device for quantizing, grouping, and coding amplitude-modulated pulses. M.S. Thesis, Department of Electrical Engineering, MIT, Cambridge, 1949

Longo G. A noiseless coding theorem for sources having utilities. SIAM Journal of Applied Mathematics, 1976; 30:739-748

Mitter J, Mathur Y.D. Comparison of entropies of power distributions. ZAMM, 1972; 52:239-240

Nyquist H. Certain factors affecting telegraph speed. Bell System Technical Journal, 1924; 3:324-346

Nyquist H. Certain topics in telegraph transmission theory. Journal of the American Institute of Electrical Engineers, 1928; 47:617

Rényi A. On measures of entropy and information. In: Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, 1961; 1:547-561

Shannon C.E. A mathematical theory of communication. Bell System Technical Journal, 1948; 27:379-423, 623-656

Sharma B.D, Man Mohan, Mitter J. On measures of ‘useful’ information. Information and Control, 1978; 39:323-336

Taneja H.C, Hooda D.S, Tuteja R.K. Coding theorems on a generalized ‘useful’ information. Soochow Journal of Mathematics, 1985; 11:123-131

Published

2017-01-02

How to Cite

Noiseless Coding Theorems on New Generalized Useful Information Measure of order α and β type. (2017). Asian Journal of Fuzzy and Applied Mathematics, 4(6). https://ajouronline.com/index.php/AJFAM/article/view/4017
