American Journal of Theoretical and Applied Statistics

Information Theoretic Models for Dependence Analysis and Missing Data Estimation

Received: 03 February 2013    Published: 10 March 2013

Abstract

In the present communication, an information theoretic dependence measure is defined using the maximum entropy principle; it quantifies the amount of dependence among the attributes in a contingency table. A relation between the information theoretic measure of dependence and the chi-square statistic is discussed, and a generalization of the dependence measure is also studied. Finally, Yates' method and maximum entropy estimation of missing data in the design of experiments are described and illustrated with practical problems involving empirical data.
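
The two pieces of the abstract can be sketched concretely. Assuming the two-way case with only the row and column marginals as constraints, the maximum-entropy joint distribution is the independence distribution p_i. p_.j, so the resulting dependence measure reduces to the mutual information I = sum_ij p_ij log(p_ij / (p_i. p_.j)), and the statistic 2nI (the likelihood-ratio statistic) is asymptotically equivalent to Pearson's chi-square under independence. The Python sketch below illustrates this relation on an invented table of counts; the function names and data are illustrative assumptions, not taken from the paper.

    import numpy as np

    def mutual_information(table):
        # Mutual information of a two-way contingency table of counts.
        n = table.sum()
        p = table / n                       # joint cell probabilities
        pr = p.sum(axis=1, keepdims=True)   # row marginals p_i.
        pc = p.sum(axis=0, keepdims=True)   # column marginals p_.j
        mask = p > 0                        # skip empty cells (0 log 0 = 0)
        return float((p[mask] * np.log(p[mask] / (pr * pc)[mask])).sum())

    def pearson_chi_square(table):
        # Classical chi-square statistic against the independence hypothesis.
        n = table.sum()
        expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
        return float(((table - expected) ** 2 / expected).sum())

    # Invented 2x3 table of counts, purely for illustration.
    table = np.array([[30.0, 20.0, 10.0],
                      [10.0, 20.0, 30.0]])
    n = table.sum()
    I = mutual_information(table)
    # 2nI and Pearson's chi-square agree asymptotically under independence;
    # here 2nI is about 20.9 against a chi-square of 20.
    print(f"I = {I:.4f}, 2nI = {2 * n * I:.2f}, chi-square = {pearson_chi_square(table):.2f}")

For the missing-data part, Yates' (1933) single-missing-value formula for a randomized block design with r blocks and t treatments estimates the missing cell as x = (rB + tT - G) / ((r - 1)(t - 1)), where B, T and G are the block, treatment and grand totals over the observed values. A minimal sketch follows; the totals are hypothetical, not the paper's empirical examples.

    def yates_missing_value(block_total, treatment_total, grand_total, r, t):
        # Yates' estimate of one missing observation in an r-block,
        # t-treatment randomized block design; all totals exclude the
        # missing cell.
        return (r * block_total + t * treatment_total - grand_total) / ((r - 1) * (t - 1))

    # Hypothetical totals for a 4-block, 5-treatment experiment.
    x = yates_missing_value(block_total=110.0, treatment_total=90.0,
                            grand_total=520.0, r=4, t=5)
    print(f"estimated missing value: {x:.2f}")   # 30.83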

DOI 10.11648/j.ajtas.20130202.12
Published in American Journal of Theoretical and Applied Statistics (Volume 2, Issue 2, March 2013)
Page(s) 15-20
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2013. Published by Science Publishing Group

Keywords

Maximum Entropy Principle, Contingency Table, Chi-Square Statistic, Lagrange Multipliers, Dependence Measure

References
[1] Burg, J.P. (1970). "The relationship between maximum entropy spectra and maximum likelihood spectra." In D.G. Childers (ed.), Modern Spectrum Analysis, pp. 130-131.
[2] Havrda, J. and Charvát, F. (1967). "Quantification method of classification processes: concept of structural a-entropy." Kybernetika, 3, 30-35.
[3] Kapur, J.N. and Kesavan, H.K. (1992). Entropy Optimization Principles with Applications. Academic Press, San Diego.
[4] Soofi, E.S. and Gokhale, D.V. (1997). "Information theoretic methods for categorical data." In Advances in Econometrics, JAI Press, Greenwich.
[5] Watanabe, S. (1969). Knowing and Guessing. John Wiley, New York.
[6] Watanabe, S. (1981). "Pattern recognition as a quest for minimum entropy." Pattern Recognition, 13, 381-387.
[7] Yates, F. (1933). "The analysis of replicated experiments when the field results are incomplete." Empire Journal of Experimental Agriculture, 1, 129-142.
Author Information
  • D. S. Hooda: Department of Mathematics, Jaypee University of Engineering and Technology, A.B. Road, Raghogarh, Distt. Guna 473226 (M.P.), India

  • Permil Kumar: Department of Statistics, University of Jammu, Jammu, India

Cite This Article
  • APA Style

    Hooda, D. S., & Kumar, P. (2013). Information Theoretic Models for Dependence Analysis and Missing Data Estimation. American Journal of Theoretical and Applied Statistics, 2(2), 15-20. https://doi.org/10.11648/j.ajtas.20130202.12


    ACS Style

    Hooda, D. S.; Kumar, P. Information Theoretic Models for Dependence Analysis and Missing Data Estimation. Am. J. Theor. Appl. Stat. 2013, 2(2), 15-20. doi: 10.11648/j.ajtas.20130202.12


    AMA Style

    Hooda DS, Kumar P. Information Theoretic Models for Dependence Analysis and Missing Data Estimation. Am J Theor Appl Stat. 2013;2(2):15-20. doi: 10.11648/j.ajtas.20130202.12


  • @article{10.11648/j.ajtas.20130202.12,
      author = {D. S. Hooda and Permil Kumar},
      title = {Information Theoretic Models for Dependence Analysis and Missing Data Estimation},
      journal = {American Journal of Theoretical and Applied Statistics},
      volume = {2},
      number = {2},
      pages = {15-20},
      doi = {10.11648/j.ajtas.20130202.12},
      url = {https://doi.org/10.11648/j.ajtas.20130202.12},
      eprint = {https://download.sciencepg.com/pdf/10.11648.j.ajtas.20130202.12},
      abstract = {In the present communication, an information theoretic dependence measure is defined using the maximum entropy principle; it quantifies the amount of dependence among the attributes in a contingency table. A relation between the information theoretic measure of dependence and the chi-square statistic is discussed, and a generalization of the dependence measure is also studied. Finally, Yates' method and maximum entropy estimation of missing data in the design of experiments are described and illustrated with practical problems involving empirical data.},
      year = {2013}
    }
    


  • TY  - JOUR
    T1  - Information Theoretic Models for Dependence Analysis and Missing Data Estimation
    AU  - D. S. Hooda
    AU  - Permil Kumar
    Y1  - 2013/03/10
    PY  - 2013
    N1  - https://doi.org/10.11648/j.ajtas.20130202.12
    DO  - 10.11648/j.ajtas.20130202.12
    T2  - American Journal of Theoretical and Applied Statistics
    JF  - American Journal of Theoretical and Applied Statistics
    JO  - American Journal of Theoretical and Applied Statistics
    SP  - 15
    EP  - 20
    PB  - Science Publishing Group
    SN  - 2326-9006
    UR  - https://doi.org/10.11648/j.ajtas.20130202.12
    AB  - In the present communication, an information theoretic dependence measure is defined using the maximum entropy principle; it quantifies the amount of dependence among the attributes in a contingency table. A relation between the information theoretic measure of dependence and the chi-square statistic is discussed, and a generalization of the dependence measure is also studied. Finally, Yates' method and maximum entropy estimation of missing data in the design of experiments are described and illustrated with practical problems involving empirical data.
    VL  - 2
    IS  - 2
    ER  - 

