Published in

World Scientific Publishing, Modern Physics Letters A, 18(33n35), pp. 2439-2450

DOI: 10.1142/s0217732303012672


A Theory of Errors in Quantum Measurement

Journal article published in 2003 by S. G. Rajeev
This paper is available in a repository.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

It is common to model random errors in a classical measurement by the normal (Gaussian) distribution, because of the central limit theorem. In quantum theory, the analogous hypothesis is that the matrix elements of the error in an observable are normally distributed. We obtain the probability distribution this implies for the outcome of a measurement, exactly for the case of 2x2 matrices and in the steepest-descent approximation in general. Due to the phenomenon of 'level repulsion', the probability distributions obtained are quite different from the Gaussian.

Comment: Based on a talk at "Spacetime and Fundamental Interactions: Quantum Aspects", a conference honoring A. P. Balachandran's 65th birthday.
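The level-repulsion effect the abstract describes can be illustrated numerically. The sketch below (an illustration of the general phenomenon, not the paper's derivation) samples 2x2 Hermitian "error" matrices whose independent matrix elements are normally distributed, and measures how often the two eigenvalues nearly coincide. For such random Hermitian matrices the spacing density vanishes at zero, so almost no normalized spacings fall near 0, in sharp contrast with a Gaussian model of the error in the outcome itself. All variable names and the 0.05 threshold are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hermitian_2x2(sigma=1.0, n=1):
    """Sample n random 2x2 Hermitian matrices whose independent matrix
    elements are normally distributed: real Gaussian diagonal entries and
    a complex Gaussian off-diagonal entry (Hermiticity fixes the other)."""
    a = rng.normal(0.0, sigma, n)               # (0,0) entry, real
    d = rng.normal(0.0, sigma, n)               # (1,1) entry, real
    re = rng.normal(0.0, sigma / np.sqrt(2), n) # off-diagonal, complex
    im = rng.normal(0.0, sigma / np.sqrt(2), n)
    H = np.zeros((n, 2, 2), dtype=complex)
    H[:, 0, 0] = a
    H[:, 1, 1] = d
    H[:, 0, 1] = re + 1j * im
    H[:, 1, 0] = re - 1j * im
    return H

# Eigenvalue spacing s = lambda_max - lambda_min for many samples,
# normalized so the mean spacing is 1.
H = random_hermitian_2x2(n=200_000)
eigs = np.linalg.eigvalsh(H)      # shape (n, 2), sorted ascending
s = eigs[:, 1] - eigs[:, 0]
s = s / s.mean()

# Level repulsion: the spacing density behaves like s^2 near s = 0 for
# complex Hermitian matrices, so the fraction of spacings below 0.05 is
# tiny. A folded Gaussian with the same mean would put roughly 3% of its
# mass there.
frac_small = np.mean(s < 0.05)
print(f"fraction of normalized spacings below 0.05: {frac_small:.2e}")
```

Running this shows the near-degenerate outcomes are strongly suppressed, which is the qualitative reason the outcome distribution derived in the paper differs so markedly from a Gaussian.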