In statistics, propagation of uncertainty (or propagation of error) is the effect of variables' uncertainties (or errors) on the uncertainty of a function based on them. Mainly, the variables are measured in an experiment, and have uncertainties due to measurement limitations (e.g. instrument precision) which propagate to the result.

The uncertainty is usually defined by the absolute error: a variable that is likely to take values in x±Δx is said to have an uncertainty (or margin of error) of Δx. In other words, for a measured value x, it is probable that the true value lies in the interval [x−Δx, x+Δx]. Uncertainties can also be defined by the relative error Δx/x, which is usually written as a percentage. In many cases it is assumed that the difference between a measured value and the true value is normally distributed, with the standard deviation of the distribution being the uncertainty of the measurement.

This article explains how to calculate the uncertainty of a function if the variables' uncertainties are known.

General formula
Let $$f(x_1,x_2,...,x_n)$$ be a function which depends on $$n$$ variables $$x_1,x_2,...,x_n$$. The uncertainty of each variable is given by $$\Delta x_j$$:


 * $$x_j \pm \Delta x_j\, .$$

If the variables are uncorrelated, we can calculate the uncertainty Δf of f that results from the uncertainties of the variables:


 * $$\Delta f = \Delta f \left(x_1, x_2, ..., x_n, \Delta x_1, \Delta x_2, ..., \Delta x_n \right) = \left( \sum_{i=1}^n \left(\frac{\partial f}{\partial x_i}\Delta x_i \right)^2 \right)^{1/2} \, ,$$

where $$\frac{\partial f}{\partial x_j}$$ denotes the partial derivative of $$f$$ with respect to the $$j$$-th variable $$x_j$$.
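As a numerical sketch of this formula (not part of the original text), the sum in quadrature can be evaluated with finite-difference partial derivatives; the helper name `propagate`, the step size `h`, and the example values are illustrative choices.

```python
import math

def propagate(f, x, dx, h=1e-6):
    """Uncorrelated propagation: Delta_f = sqrt(sum_i (df/dx_i * dx_i)^2),
    with each partial derivative estimated by a central finite difference."""
    total = 0.0
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        dfdxi = (f(xp) - f(xm)) / (2 * h)  # numerical partial derivative
        total += (dfdxi * dx[i]) ** 2
    return math.sqrt(total)

# Example: f = x1 * x2 with x1 = 3 +/- 0.1 and x2 = 4 +/- 0.2, so
# analytically Delta_f = sqrt((4*0.1)^2 + (3*0.2)^2) = sqrt(0.52)
df = propagate(lambda x: x[0] * x[1], [3.0, 4.0], [0.1, 0.2])
```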

If the variables are correlated, the covariance between variable pairs, $$C_{i,k} := \operatorname{cov}(x_i, x_k)$$, enters the formula with a double sum over all pairs $$(i,k)$$:


 * $$\Delta f = \left( \sum_{i=1}^n \sum_{k=1}^n \left(\frac{\partial f}{\partial x_i}\frac{\partial f}{\partial x_k}C_{i,k} \right) \right)^{1/2}\, ,$$

where $$C_{i,i} = \operatorname{var}(x_i) = (\Delta x_i)^2$$.
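Written with the covariance matrix C, the double sum is the quadratic form gᵀCg of the gradient g. A short NumPy sketch (illustrative, assuming NumPy is available; the function name is not from the article) makes this concrete:

```python
import numpy as np

def propagate_corr(grad, cov):
    """Correlated propagation:
    Delta_f = sqrt(sum_{i,k} (df/dx_i)(df/dx_k) C_{i,k}) = sqrt(g^T C g)."""
    g = np.asarray(grad, dtype=float)
    C = np.asarray(cov, dtype=float)
    return float(np.sqrt(g @ C @ g))

# Example: f = x1 + x2, so the gradient is (1, 1); with var(x1) = 0.01,
# var(x2) = 0.04, cov(x1, x2) = 0.01, Delta_f^2 = 0.01 + 0.04 + 2*0.01 = 0.07
df = propagate_corr([1.0, 1.0], [[0.01, 0.01],
                                 [0.01, 0.04]])
```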

After calculating $$\Delta f$$, we can say that the value of the function with its uncertainty is:


 * $$f \pm \Delta f \, .$$

Example formulas
This table shows the uncertainty of simple functions, resulting from uncorrelated variables A, B, C with uncertainties ΔA, ΔB, ΔC, and a precisely known constant c.


{| border="1" cellpadding="8" cellspacing="0" style="text-align:center"
! style="background:#ffdead;" | Function !! style="background:#ffdead;" | Absolute error !! style="background:#ffdead;" | Variance
|-
| $$X = A \pm B \,$$ || $$\Delta X = \Delta A + \Delta B$$ || $$\sigma_X^2 = \sigma_A^2 + \sigma_B^2 \,$$<br>non-independent: $$\sigma_X^2 = \sigma_A^2 + \sigma_B^2 + 2\operatorname{cov}(A,B)$$, where $$\operatorname{cov}(A,B)$$ is the covariance
|-
| $$X = cA \,$$ || $$\Delta X = c \cdot \Delta A$$ || $$\sigma_X = c \cdot \sigma_A \,$$
|-
| $$X = A \cdot B \,$$ || $$\frac{\Delta X}{X} = \frac{\Delta A}{A} + \frac{\Delta B}{B} + \frac{\Delta A \cdot \Delta B}{A \cdot B} \approx \frac{\Delta A}{A} + \frac{\Delta B}{B}$$ || $$\left(\frac{\sigma_X}{X}\right)^2 = \left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 + \left(\frac{\sigma_A \cdot \sigma_B}{A \cdot B}\right)^2 \approx \left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2$$<br>non-independent: $$\sigma_X^2 = B^2\sigma_A^2 + A^2\sigma_B^2 + 2A \cdot B \cdot E_{11} + 2A \cdot E_{12} + 2B \cdot E_{21} + E_{22} - E_{11}^2$$, where $$E_{ij} = E\left((\Delta A)^i \cdot (\Delta B)^j\right)$$ and $$\Delta A = a - A$$
|-
| $$X = A \cdot B \cdot C \,$$ || $$\frac{\Delta X}{X} = \frac{\Delta A}{A} + \frac{\Delta B}{B} + \frac{\Delta C}{C} + \frac{\Delta A \cdot \Delta B}{A \cdot B} + \frac{\Delta A \cdot \Delta C}{A \cdot C} + \frac{\Delta B \cdot \Delta C}{B \cdot C} + \frac{\Delta A \cdot \Delta B \cdot \Delta C}{A \cdot B \cdot C} \approx \frac{\Delta A}{A} + \frac{\Delta B}{B} + \frac{\Delta C}{C}$$ || $$\left(\frac{\sigma_X}{X}\right)^2 = \left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 + \left(\frac{\sigma_C}{C}\right)^2 + \left(\frac{\sigma_A \cdot \sigma_B}{A \cdot B}\right)^2 + \left(\frac{\sigma_A \cdot \sigma_C}{A \cdot C}\right)^2 + \left(\frac{\sigma_B \cdot \sigma_C}{B \cdot C}\right)^2 + \left(\frac{\sigma_A \cdot \sigma_B \cdot \sigma_C}{A \cdot B \cdot C}\right)^2 \approx \left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 + \left(\frac{\sigma_C}{C}\right)^2$$
|-
| $$X = A^i \cdot B^j$$ || $$\frac{\Delta X}{X} = |i|\frac{\Delta A}{A} + |j|\frac{\Delta B}{B}$$ || $$\left(\frac{\sigma_X}{X}\right)^2 = \left(i\frac{\sigma_A}{A}\right)^2 + \left(j\frac{\sigma_B}{B}\right)^2$$, or equivalently $$\sigma_X^2 = \left(i \cdot A^{i-1} \cdot B^j\right)^2 \cdot \sigma_A^2 + \left(j \cdot A^i \cdot B^{j-1}\right)^2 \cdot \sigma_B^2$$
|-
| $$X = \ln(A) \,$$ || $$\Delta X = \frac{\Delta A}{A}$$ || $$\sigma_X = \frac{\sigma_A}{A}$$
|-
| $$X = e^A \,$$ || $$\Delta X = e^A \cdot \Delta A$$ || $$\frac{\sigma_X}{X} = \sigma_A \,$$
|}
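The table entries can be sanity-checked by simulation. The sketch below (sample size, seed, and the particular values of A and B are arbitrary choices, not from the article) draws independent normal samples and compares the sample variance of X = A·B with the product-rule prediction:

```python
import math
import random

random.seed(0)
A, sA = 10.0, 0.5   # A = 10.0 +/- 0.5 (illustrative values)
B, sB = 5.0, 0.3    # B = 5.0 +/- 0.3
N = 100_000

# Sample variance of X = A*B for independent normal A and B
samples = [random.gauss(A, sA) * random.gauss(B, sB) for _ in range(N)]
mean = sum(samples) / N
var = sum((s - mean) ** 2 for s in samples) / (N - 1)

# Table prediction: (sigma_X/X)^2 = (sA/A)^2 + (sB/B)^2 + (sA*sB/(A*B))^2
X = A * B
pred = X ** 2 * ((sA / A) ** 2 + (sB / B) ** 2 + (sA * sB / (A * B)) ** 2)
ratio = var / pred  # should be close to 1
```

For independent normal variables the full product formula (including the cross term) is exact, so the simulated and predicted variances agree up to Monte Carlo noise.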

Partial derivatives
Given $$X=f(A, B, C, \cdots)$$:

{| border="1" cellpadding="8" cellspacing="0" style="text-align:center"
! style="background:#ffdead;" | Absolute error !! style="background:#ffdead;" | Variance
|-
| $$\Delta X=\left |\frac{\partial f}{\partial A}\right |\cdot \Delta A+\left |\frac{\partial f}{\partial B}\right |\cdot \Delta B+\left |\frac{\partial f}{\partial C}\right |\cdot \Delta C+\cdots$$ || $$\sigma_X^2=\left (\frac{\partial f}{\partial A}\sigma_A\right )^2+\left (\frac{\partial f}{\partial B}\sigma_B\right )^2+\left (\frac{\partial f}{\partial C}\sigma_C\right )^2+\cdots$$
|}

Example calculation: Inverse tangent function
We can calculate the uncertainty propagation for the inverse tangent function as an example of using partial derivatives to propagate error.

Define


 * $$f(\theta) = \arctan{\theta}$$,

and let $$\sigma_{\theta}$$ denote the absolute uncertainty on our measurement of $$\theta$$.

The partial derivative of $$f(\theta)$$ with respect to $$\theta$$ is


 * $$\frac{\partial f}{\partial \theta} = \frac{1}{1+\theta^2}$$.

Therefore, our propagated uncertainty is


 * $$\sigma_{f} = \frac{\sigma_{\theta}}{1+\theta^2}$$,

where $$\sigma_{f}$$ is the absolute propagated uncertainty.
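In code, this propagation is a one-liner; the function name and example values below are illustrative choices, not from the article:

```python
import math

def arctan_with_uncertainty(theta, sigma_theta):
    """Return (arctan(theta), propagated uncertainty sigma_theta / (1 + theta^2))."""
    return math.atan(theta), sigma_theta / (1 + theta ** 2)

# theta = 1.0 +/- 0.05 (illustrative): f = pi/4, sigma_f = 0.05 / 2 = 0.025
f, sf = arctan_with_uncertainty(1.0, 0.05)
```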

Example application: Resistance measurement
A practical application is an experiment in which one measures current, I, and voltage, V, on a resistor in order to determine the resistance, R, using Ohm's law, $$R = V / I.$$

Given the measured variables with uncertainties, I±ΔI and V±ΔV, the uncertainty in the computed quantity, ΔR, is


 * $$\Delta R = \left( \left(\frac{\Delta V}{I}\right)^2+\left(\frac{V}{I^2}\Delta I\right)^2\right)^{1/2} = R\sqrt{\left(\frac{\Delta V}{V}\right)^2+\left(\frac{\Delta I}{I}\right)^2}.$$

Thus, in this simple case, the relative error ΔR/R is simply the square root of the sum of the squares of the two relative errors of the measured variables.
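A small helper applying this rule (the function name and measurement values are illustrative, not from the article):

```python
import math

def resistance_with_uncertainty(V, dV, I, dI):
    """R = V / I, with the relative errors of V and I added in quadrature:
    dR = R * sqrt((dV/V)^2 + (dI/I)^2)."""
    R = V / I
    dR = R * math.sqrt((dV / V) ** 2 + (dI / I) ** 2)
    return R, dR

# V = 12.0 +/- 0.1 V and I = 2.0 +/- 0.05 A give R = 6 ohms
R, dR = resistance_with_uncertainty(12.0, 0.1, 2.0, 0.05)
```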