World Scientific Publishing, Modern Physics Letters A, Vol. 18, No. 35, pp. 2439-2450
DOI: 10.1142/s0217732303012672
It is common to model random errors in a classical measurement by the normal (Gaussian) distribution, a choice justified by the central limit theorem. In quantum theory, the analogous hypothesis is that the matrix elements of the error in an observable are normally distributed. We derive the probability distribution this hypothesis implies for the outcome of a measurement: exactly for the case of 2x2 matrices, and in the steepest-descent approximation in general. Because of the phenomenon of 'level repulsion', the resulting probability distributions differ markedly from the Gaussian.

Comment: Based on a talk at "Spacetime and Fundamental Interactions: Quantum Aspects", a conference honoring A. P. Balachandran's 65th birthday.
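The level repulsion mentioned in the abstract can be seen numerically. The sketch below is not the paper's calculation; it is a minimal, hypothetical illustration assuming the 2x2 case with a real symmetric error matrix whose independent entries are Gaussian (a GOE-style ensemble). The eigenvalue spacing distribution then vanishes at zero spacing, i.e. nearly degenerate outcomes are suppressed, unlike what a naive one-dimensional Gaussian error model would suggest.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_error_matrix():
    """Hypothetical model: real symmetric 2x2 matrix with independent
    normally distributed entries (GOE-style normalization)."""
    a, b = rng.normal(size=2)          # diagonal entries
    c = rng.normal() / np.sqrt(2)      # symmetric off-diagonal entry
    return np.array([[a, c], [c, b]])

# Sample the spacing between the two eigenvalues many times.
spacings = np.array([
    np.diff(np.linalg.eigvalsh(random_error_matrix()))[0]
    for _ in range(20000)
])

# Level repulsion: the probability of a spacing much smaller than the
# mean is strongly suppressed (the spacing density vanishes linearly
# at zero), whereas an uncorrelated model would put its maximum there.
mean_spacing = spacings.mean()
frac_small = np.mean(spacings < 0.1 * mean_spacing)
print(f"mean spacing ~ {mean_spacing:.3f}, "
      f"P(spacing < 0.1 * mean) ~ {frac_small:.4f}")
```

For this ensemble the spacing follows the Wigner surmise, so the fraction of spacings below a tenth of the mean comes out below one percent, a direct numerical signature of the repulsion the abstract refers to.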