Information Theory and Statistics. Courier Corporation, July 7, 1997. 399 pages. A highly useful text that studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems, references, a glossary, and an appendix. Reprint of the 1968 second, revised edition.
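The "logarithmic measure of information" at the heart of the book is the discrimination information, now usually called the Kullback-Leibler divergence. A minimal sketch of that quantity for discrete distributions (the function name and example values are ours, not the book's):

```python
import math

def discrimination_information(p, q):
    """Kullback's discrimination information in nats:
    I(1:2) = sum_i p_i * log(p_i / q_i).
    Terms with p_i == 0 contribute nothing, by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: how much a slightly biased coin is distinguishable from a fair one.
p = [0.6, 0.4]   # hypothesis H1
q = [0.5, 0.5]   # hypothesis H2
print(discrimination_information(p, q))  # small positive value; 0 iff p == q
```

The quantity is nonnegative and vanishes only when the two distributions coincide, which is what makes it usable as a measure of evidence for one hypothesis against another.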
Contents

Properties of Information ... 12
Inequalities of Information Theory ... 36
Limiting Properties ... 70
Two Samples ... 100
Poisson Populations ... 153
Multivariate Normal Populations ... 179
Linear Hypothesis ... 211
One-Way Classification, k Categories ... 219
Two-Partition Subhypothesis ... 225
Example ... 231
Multivariate Analysis: The Multivariate Linear Hypothesis ... 253
Other Hypotheses ... 297
Linear Discriminant Functions ... 342
References ... 353
Log n and n log n for values of n from 1 through 1000 ... 367
Appendix ... 389