Information Theory and Statistics
Courier Corporation, Jul 7, 1997. 399 pages.
This highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems, plus references, a glossary, and an appendix. Reprint of the 1968 second, revised edition.
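For context, the central logarithmic measure the book develops is the discrimination information, now usually called the Kullback-Leibler divergence. A minimal statement, assuming the book's standard notation (hypotheses H₁, H₂ with generalized densities f₁, f₂ with respect to a common measure λ):

I(1:2) = ∫ f₁(x) log [f₁(x) / f₂(x)] dλ(x),    J(1, 2) = I(1:2) + I(2:1),

where I(1:2) is the mean information per observation from f₁ for discriminating in favor of H₁ against H₂, and J(1, 2) is the symmetric divergence between the two hypotheses.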
Contents
INEQUALITIES OF INFORMATION THEORY | 37
LIMITING PROPERTIES | 70
MULTINOMIAL POPULATIONS | 109
POISSON POPULATIONS | 142
CONTINGENCY TABLES | 187
MULTIVARIATE ANALYSIS: THE MULTIVARIATE LINEAR HYPOTHESIS | 253
OTHER HYPOTHESES | 297
Problems | 341
logₑ n and n log n for values of n from 1 through 1000 | 367
APPENDIX | 389
Common terms and phrases
absolutely continuous analysis in table analysis of variance asymptotically distributed B₁ binomial canonical correlation chapter 9 coefficients Component due component in table conditional homogeneity conjugate distribution contingency table continuous with respect convex convex function corollary 3.2 covariance matrix defined degrees of freedom discrimination efficiency distributed as χ² divergence dy(y equality equations example fi(x Fisher given H₁ H₂(R independent observations Information D.F. information theory Ĵ(H₁ k₁ k₂ lemma linear discriminant function log f(x m₁ M₂(T minimum discrimination information multinomial distribution N₁ N₂ noncentral normal distribution notation Note null hypothesis H₂ P₁ parameters partitioning pᵢⱼₖ Poisson distribution Poisson population Prob probability measure problem r₁ regression sample Show specified sufficient statistic Suppose theorem 2.1 Total unbiased estimates variates x₁ χ²-distribution xᵢⱼₖ log y₁ μ₁ Σ Σ σ² Σα ΣΣ Στ