Nuisance variable

In probability theory, a nuisance variable is a random variable which is fundamental to the probabilistic model, but which is of no particular interest in itself.

Having obtained the joint conditional distribution of all of the unknown random variables given the known variables, for example by applying Bayes' theorem, one must then marginalise over the nuisance variables to obtain the conditional distribution for just the quantities of interest.

If this marginalisation is not possible analytically, it may involve extensive computation, often carried out using Markov chain Monte Carlo techniques.
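As a concrete sketch of marginalisation, the following example (with made-up data and flat priors, chosen purely for illustration) evaluates the unnormalised joint posterior of a normal mean μ and standard deviation σ on a grid, then sums over the σ axis to obtain the marginal posterior of μ alone:

```python
import numpy as np

# Hypothetical data, assumed drawn from a normal distribution with
# unknown mean mu (the quantity of interest) and unknown standard
# deviation sigma (the nuisance variable).
rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=50)

# Grids over both unknowns (flat priors assumed for illustration).
mu_grid = np.linspace(3.0, 7.0, 200)
sigma_grid = np.linspace(0.5, 5.0, 200)
mu, sigma = np.meshgrid(mu_grid, sigma_grid, indexing="ij")

# Joint log-likelihood of a normal sample on the (mu, sigma) grid.
loglik = (-len(data) * np.log(sigma)
          - ((data[:, None, None] - mu) ** 2).sum(axis=0)
            / (2.0 * sigma ** 2))

# Unnormalised joint posterior; marginalise by summing over sigma.
joint = np.exp(loglik - loglik.max())
marginal_mu = joint.sum(axis=1)

# Normalise the marginal so it integrates to one over the mu grid.
dmu = mu_grid[1] - mu_grid[0]
marginal_mu /= marginal_mu.sum() * dmu

mu_hat = mu_grid[np.argmax(marginal_mu)]  # posterior mode for mu
```

The grid sum plays the role that an analytic integral or an MCMC sampler would play in practice; for models with many nuisance variables, grids become infeasible and sampling methods are used instead.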

In statistics, a nuisance parameter is any parameter which is not of immediate interest, but which must nonetheless be accounted for in the analysis of the parameters which are of interest. The classic example of a nuisance parameter is the variance σ² of a normal distribution, when the mean μ is of primary interest.

Nuisance parameters are often variances, but not always; for example in an errors-in-variables model, the unknown true location of each observation is a nuisance parameter. In general, any parameter which intrudes on the analysis of another may be considered a nuisance parameter. Also, a parameter may cease to be a "nuisance" if it becomes the object of study, as the variance of a distribution may be.

In some cases, it is possible to formulate methods that circumvent nuisance parameters. The t-test is especially useful here because the null distribution of the t-statistic does not depend on the unknown variance, so the test can be applied without knowing σ². However, in other cases no such circumvention is known.
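This circumvention can be illustrated with a minimal sketch (the sample values below are hypothetical): the one-sample t-statistic standardises the mean by the sample standard deviation, so rescaling the data, which changes the unknown variance, leaves the statistic unchanged.

```python
import math

def t_statistic(sample, mu0):
    """One-sample t-statistic for testing mean == mu0.

    The unknown true variance enters only through the sample
    standard deviation, which cancels any change of scale, so the
    statistic's null distribution does not depend on that variance.
    """
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    return (mean - mu0) / math.sqrt(var / n)

sample = [4.8, 5.1, 5.3, 4.9, 5.6]
t1 = t_statistic(sample, 5.0)
# Rescale the data (and the hypothesised mean) by a factor of 10,
# which multiplies the unknown variance by 100 ...
t2 = t_statistic([10.0 * x for x in sample], 50.0)
# ... yet the t-statistic is unchanged: the nuisance scale cancels.
```

In practice one would use a library routine such as a standard statistics package's one-sample t-test; the hand-written version here only makes the cancellation of the scale explicit.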

The use of Bayesian probability methods, as described above, can provide a consistent, principled manner of handling nuisance parameters. However, Bayesian methods are still controversial in some quarters.