Catalogue


Information-theoretic methods for estimating complicated probability distributions / Zhi Zong.
edition
1st ed.
imprint
Boston : Elsevier, 2006.
description
xvii, 299 p. : ill. ; 24 cm. + 1 CD-ROM
ISBN
0444527966, 9780444527967
format(s)
Book
More Details
author
Zong, Zhi.
catalogue key
6028190
 
Includes bibliographical references and index.
A Look Inside
Summaries
Bowker Data Service Summary
Exact estimation of the probability distribution of a random variable is very important. There have been constant efforts to find appropriate methods to determine complicated distributions based on random samples. This book documents the latest research in the subject.
Main Description
Mixing various disciplines frequently produces something profound and far-reaching; cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, and has led to the recent development of information-theory-based methods for estimating complicated probability distributions.

Estimating the probability distribution of a random variable is a fundamental task in many fields besides statistics, such as reliability, probabilistic safety analysis (PSA), machine learning, pattern recognition, image processing, neural networks and quality control. Simple distribution forms such as the Gaussian, exponential or Weibull distributions are often employed to represent the distributions of the random variables under consideration, as we are taught in universities. In engineering, physical and social science applications, however, the distributions of many random variables or random vectors are so complicated that they do not fit these simple forms at all.

Exact estimation of the probability distribution of a random variable is therefore very important. Take stock market prediction as an example: the Gaussian distribution is often used to model the fluctuations of stock prices. If such fluctuations are not normally distributed and we nevertheless represent them with a normal distribution, how can we expect our prediction of the stock market to be correct? Reliability engineering is another case that illustrates the need for exact estimation: failure to estimate the probability distributions under consideration accurately may lead to disastrous designs.

There have been constant efforts to find appropriate methods for determining complicated distributions from random samples, but the topic has never been systematically discussed in detail in a book or monograph. The present book is intended to fill that gap and documents the latest research on the subject.

Determining a complicated distribution is not simply a multiple of the workload needed to determine a simple one; it turns out to be a much harder task. Two important mathematical tools that lie beyond traditional mathematical statistics, function approximation and information theory, are used. Several methods for distribution estimation built on these two tools are detailed in this book; the author has applied them to many cases over several years. They are superior in the following senses: (1) no prior information about the form of the distribution is necessary, as it is determined automatically from the sample; (2) the sample size may be large or small; (3) they are particularly suitable for computers. It is the rapid development of computing technology that makes fast estimation of complicated distributions possible. The methods provided herein demonstrate the significant cross-influences between information theory and statistics, and show how shortcomings of traditional statistics can be overcome by information theory.

Key Features:
- Density functions determined automatically from samples
- Free of assumed density forms
- Computationally efficient methods suitable for PCs
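As a rough illustration of the workflow the description sketches (a density determined automatically from a sample by a basis expansion plus information-theoretic model selection), the following Python sketch fits a piecewise-constant density, an order-1 analogue of the B-spline expansions listed in the table of contents, by maximum likelihood and picks the number of basis functions with AIC. This is not the book's own code; all function names and parameters here are illustrative assumptions.

```python
import numpy as np

def fit_piecewise_density(sample, m, lo, hi):
    """Maximum-likelihood fit of a piecewise-constant pdf on m equal bins over [lo, hi]."""
    edges = np.linspace(lo, hi, m + 1)
    counts, _ = np.histogram(sample, bins=edges)
    width = (hi - lo) / m
    probs = counts / counts.sum()          # ML estimates of the bin probabilities
    density = probs / width                # pdf value on each bin
    # Log-likelihood of the sample; empty bins contain no sample points, so they contribute 0.
    nonzero = counts > 0
    loglik = float(np.sum(counts[nonzero] * np.log(density[nonzero])))
    return edges, density, loglik

def select_bins_by_aic(sample, candidates=range(2, 41)):
    """Choose the number of bins (m - 1 free parameters) that minimizes AIC."""
    lo, hi = float(sample.min()), float(sample.max())
    best = None
    for m in candidates:
        edges, density, loglik = fit_piecewise_density(sample, m, lo, hi)
        aic = -2.0 * loglik + 2.0 * (m - 1)
        if best is None or aic < best[0]:
            best = (aic, m, edges, density)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A bimodal sample that no single Gaussian, exponential or Weibull form fits well.
    sample = np.concatenate([rng.normal(-2.0, 0.5, 600), rng.normal(1.5, 1.0, 400)])
    aic, m, edges, density = select_bins_by_aic(sample)
    print(f"AIC-selected number of bins: {m} (AIC = {aic:.1f})")
```

The book's methods go well beyond this sketch (higher-order B-splines, entropy estimators suited to small samples, Bayesian priors, and maximum-entropy formulations); the sketch only shows the basic sample-in, density-out idea with automatic model selection.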
Table of Contents
Preface p. v
Acknowledgment p. vii
Contents p. viii
List of Tables p. xiv
List of figures p. xv
Randomness and probability p. 1
Randomness p. 2
Random phenomena p. 2
Sample space and random events p. 2
Probability p. 4
Probability defined on events p. 4
Conditional probability p. 6
Independence p. 8
Random variable p. 9
Random variable and distributions p. 9
Vector random variables and joint distribution p. 12
Conditional distribution p. 14
Expectations p. 16
Typical distribution p. 19
Concluding remarks p. 22
Inference and statistics p. 25
Sampling p. 26
Sampling distributions for small samples p. 28
Sampling distributions for large samples p. 30
Chebyshev's inequality p. 31
The law of large numbers p. 32
The central limit theorem p. 33
Estimation p. 34
Estimation p. 34
Sampling error p. 35
Properties of estimators p. 37
Maximum Likelihood Estimator p. 39
The Maximum Likelihood Method (M-L Method) p. 39
The Asymptotic Distribution of the M-L Estimator p. 40
Hypothesis testing p. 45
Definitions p. 45
Testing procedures p. 47
Concluding remarks p. 48
Random numbers and their applications p. 49
Simulating random numbers from a uniform distribution p. 50
Quality of random number generators p. 53
Randomness test p. 54
Uniformity test p. 56
Independence test p. 58
Visual testing p. 58
Simulating random numbers from specific distributions p. 59
Simulating random numbers for general CDF p. 61
Simulating vector random numbers p. 64
Concluding remarks p. 66
Approximation and B-spline functions p. 67
Approximation and best approximation p. 69
Polynomial basis p. 72
B-splines p. 77
Definitions p. 77
B-spline basis sets p. 81
Linear independence of B-spline functions p. 82
Properties of B-splines p. 82
Two-dimensional B-splines p. 87
Concluding remarks p. 87
Disorder, entropy and entropy estimation p. 89
Disorder and entropy p. 89
Entropy of finite schemes p. 92
Axioms of entropy p. 94
Kullback information and model uncertainty p. 97
Estimation of entropy based on large samples p. 105
Asymptotically unbiased estimators of four basic entropies p. 107
Asymptotically unbiased estimator of TSE and AIC p. 114
Entropy estimation based on small sample p. 118
Model selection p. 119
Model selection based on large samples p. 120
Model selection based on small samples p. 126
Concluding remarks p. 128
Estimation of 1-D complicated distributions based on large samples p. 129
General problems about pdf approximation p. 130
B-spline approximation of a continuous pdf p. 132
Estimation p. 135
Estimation from sample data p. 135
Estimation from a histogram p. 137
Model selection p. 140
Numerical examples p. 144
Concluding Remarks p. 156
Non-linear programming problem and the uniqueness of the solution p. 159
Estimation of 2-D complicated distributions based on large samples p. 163
B-Spline Approximation of a 2-D pdf p. 164
Estimation p. 167
Estimation from sample data p. 167
Computation acceleration p. 169
Estimation from a histogram p. 170
Model selection p. 173
Numerical examples p. 174
Concluding remarks p. 186
Estimation of 1-D complicated distribution based on small samples p. 189
Statistical influence of small sample on estimation p. 190
Construction of smooth Bayesian priors p. 192
Analysis of statistical fluctuations p. 192
Smooth prior distribution of combination coefficients p. 194
Bayesian estimation of complicated pdf p. 198
Bayesian point estimate p. 198
Determination of parameter ω² p. 200
Calculating b and determinant of FᵀF p. 203
Numerical examples p. 204
Application to discrete random distributions p. 209
Concluding remarks p. 210
Characterization of the method p. 210
Comparison with the methods presented in Chapter 6 p. 211
Comments on Bayesian approach p. 211
Estimation of 2-D complicated distribution based on small samples p. 213
Statistical influence of small samples on estimation p. 213
Construction of smooth 2-D Bayesian priors p. 216
Analysis of statistical fluctuations p. 216
Smooth prior distribution of combination coefficients p. 217
Formulation of Bayesian estimation of complicated pdf p. 219
Bayesian point estimate p. 219
Determination of parameter ω² p. 221
Householder Transform p. 223
Numerical examples p. 225
Application to discrete random distributions p. 228
Concluding remarks p. 229
Householder transform p. 230
Tridiagonalization of a real symmetric matrix p. 230
Finding eigenvalues of a tridiagonal matrix by bisection method p. 234
Determining the determinant of a matrix by its eigenvalues p. 235
Estimation of the membership function p. 237
Introduction p. 237
Fuzzy experiment and fuzzy sample p. 242
How large is large? p. 242
Fuzzy data in physical sciences p. 242
B-spline Approximation of the membership functions p. 244
ME analysis p. 247
Numerical Examples p. 248
Concluding Remarks p. 253
Proof of uniqueness of the optimum solution p. 255
Estimation of distribution by use of the maximum entropy method p. 259
Maximum entropy p. 260
Formulation of the maximum entropy method p. 265
B-spline representation of φᵢ(x) p. 268
Optimization solvers p. 270
Asymptotically unbiased estimate of λᵢ p. 271
Model selection p. 272
Numerical Examples p. 273
Concluding Remarks p. 279
Code specifications p. 281
Plotting B-splines of order 3 p. 281
Files in directory B-spline p. 281
Specification p. 281
Random number generation by ARM p. 282
Files in the directory of random p. 282
Specifications p. 282
Estimating 1-D distribution using B-splines p. 283
Files in the directory shh1 p. 283
Specifications p. 283
Estimation of 2-D distribution: large sample p. 284
Files in the directory shd2 p. 284
Specifications p. 284
Estimation of 1-D distribution from a histogram p. 285
Files in the directory shh1 p. 285
Specifications p. 285
Estimation of 2-D distribution from a histogram p. 286
Files in the directory shh1 p. 286
Specifications p. 286
Estimation of 2-D distribution using RBF p. 287
Files in the directory shr2 p. 287
Specifications p. 287
Bibliography p. 289
Index p. 295
Table of Contents provided by Ingram. All Rights Reserved.

This information is provided by a service that aggregates data from review sources and other sources often consulted by libraries and readers. The University does not edit this information and merely includes it as a convenience for users; it does not warrant that reviews are accurate. As with any review, users should approach reviews critically and, where deemed necessary, consult multiple review sources. Any concerns or questions about particular reviews should be directed to the reviewer and/or publisher.

