Taylor Expansion
A quantity which is calculated using quantities with known uncertainties will have an uncertainty based upon the uncertainty of the quantities used in the calculation.
To determine the uncertainty in a quantity which is a function of other quantities, you can consider the dependence of these quantities in terms of a Taylor expansion.
The Taylor series expansion of a function f(x) about the point a is given as
[math]f(x) = f(a) + \left . f^{\prime}(x)\right |_{x=a} \frac{(x-a)}{1!} + \left . f^{\prime \prime}(x)\right |_{x=a} \frac{(x-a)^2}{2!} + \cdots[/math]
- [math]= \left . \sum_{n=0}^{\infty} f^{(n)}(x)\right |_{x=a} \frac{(x-a)^n}{n!}[/math]
For small values of x ([math]x \ll 1[/math]) we can expand the function about [math]a = 0[/math]; for example
[math]\sqrt{1+x} = \left . \sqrt{1+x}\right |_{x=0} + \left . \frac{1}{2}(1+x)^{-1/2}\right |_{x=0} \frac{x}{1!}+ \left . \frac{1}{2}\frac{-1}{2}(1+x)^{-3/2} \right |_{x=0} \frac{x^2}{2!} + \cdots[/math]
- [math]=1 + \frac{x}{2} - \frac{x^2}{8} + \cdots[/math]
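As a quick numerical check (not part of the original notes), the short C++ program below compares [math]\sqrt{1+x}[/math] with the truncated series [math]1 + \frac{x}{2} - \frac{x^2}{8}[/math] for a few small values of x; the agreement improves rapidly as x becomes small.
// Sketch: compare sqrt(1+x) with its second-order Taylor expansion about x = 0.
#include <cmath>
#include <cstdio>

int main() {
    const double xs[] = {0.001, 0.01, 0.1, 0.5};
    for (double x : xs) {
        double exact  = std::sqrt(1.0 + x);
        double series = 1.0 + x / 2.0 - x * x / 8.0;
        std::printf("x = %5.3f  sqrt(1+x) = %.6f  series = %.6f  diff = %.2e\n",
                    x, exact, series, exact - series);
    }
    return 0;
}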
The Taylor expansion of a function of two variables [math](x , y)[/math] about the averages of the two variables [math](\bar {x} , \bar{y} )[/math] is given by
[math]f(x, y)=f(\bar {x}, \bar{y})+(x-\bar {x}) \frac{\partial f}{\partial x}\bigg |_{(x = \bar {x}, y = \bar{y})} +(y-\bar{y}) \frac{\partial f}{\partial y}\bigg |_{(x = \bar {x}, y = \bar{y})}[/math]
or
[math]f(x, y)-f(\bar {x}, \bar{y})=(x-\bar {x}) \frac{\partial f}{\partial x}\bigg |_{(x = \bar {x}, y = \bar{y})} +(y-\bar{y}) \frac{\partial f}{\partial y}\bigg |_{(x = \bar {x}, y = \bar{y})}[/math]
The average of the function is
- [math]f(\bar {x}, \bar{y}) \equiv \frac{\sum f(x,y)_i}{N}[/math]
The term
[math]\delta f = f(x, y)-f(\bar {x}, \bar{y})[/math]
represents a small fluctuation [math](\delta f)[/math] of the function [math]f[/math] from its average [math]f(\bar {x}, \bar{y})[/math]. If we ignore higher-order terms in the Taylor expansion (this means the fluctuations are small), then we can write the variance using the definition as
- [math]\sigma^2 = \frac{\sum \left [ f(x,y)_i - f(\bar {x}, \bar{y})\right ]^2}{N}[/math]
- [math]= \frac{\sum \left [(x_i-\bar {x}) \frac{\partial f}{\partial x}+(y_i-\bar{y}) \frac{\partial f}{\partial y}\right ]^2}{N}[/math]
- [math]= \frac{\sum (x_i-\bar {x})^2 \left ( \frac{\partial f}{\partial x}\right )^2}{N} + \frac{\sum (y_i-\bar {y})^2 \left ( \frac{\partial f}{\partial y}\right )^2}{N} + 2 \frac{\sum (x_i-\bar {x}) \left ( \frac{\partial f}{\partial x} \right ) (y_i-\bar {y}) \left ( \frac{\partial f}{\partial y}\right )}{N} [/math]
- [math]\sigma^2 = \sigma_x^2 \left ( \frac{\partial f}{\partial x}\right )^2 + \sigma_y^2\left ( \frac{\partial f}{\partial y}\right )^2 + 2 \sigma_{x,y}^2 \left ( \frac{\partial f}{\partial x} \right ) \left ( \frac{\partial f}{\partial y}\right ) [/math]
where
- [math]\sigma_{x,y}^2 = \frac{\sum (x_i-\bar {x}) (y_i-\bar {y}) }{N} \equiv[/math] Covariance
The above can be generalized to functions of more than two variables.
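The sketch below (an illustration, not from the original notes) checks the propagation formula numerically for the arbitrary choice [math]f(x,y) = x y[/math] with simulated measurements: the variance of f computed directly from the sample is compared with [math]\sigma_x^2 \left ( \frac{\partial f}{\partial x}\right )^2 + \sigma_y^2\left ( \frac{\partial f}{\partial y}\right )^2 + 2 \sigma_{x,y}^2 \frac{\partial f}{\partial x} \frac{\partial f}{\partial y}[/math] evaluated at the sample means.
// Illustration only: numerical check of first-order error propagation for f(x,y) = x*y.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    std::mt19937 gen(7);
    std::normal_distribution<double> genx(10.0, 0.2), geny(5.0, 0.1); // simulated measurements
    const int N = 100000;
    std::vector<double> x(N), y(N), f(N);
    double xbar = 0, ybar = 0, fbar = 0;
    for (int i = 0; i < N; ++i) {
        x[i] = genx(gen); y[i] = geny(gen); f[i] = x[i] * y[i];
        xbar += x[i]; ybar += y[i]; fbar += f[i];
    }
    xbar /= N; ybar /= N; fbar /= N;

    double sx2 = 0, sy2 = 0, sxy = 0, sf2 = 0;   // variances, covariance, variance of f
    for (int i = 0; i < N; ++i) {
        sx2 += (x[i] - xbar) * (x[i] - xbar);
        sy2 += (y[i] - ybar) * (y[i] - ybar);
        sxy += (x[i] - xbar) * (y[i] - ybar);
        sf2 += (f[i] - fbar) * (f[i] - fbar);
    }
    sx2 /= N; sy2 /= N; sxy /= N; sf2 /= N;

    // df/dx = y and df/dy = x, evaluated at the means
    double propagated = sx2 * ybar * ybar + sy2 * xbar * xbar + 2.0 * sxy * ybar * xbar;
    std::printf("direct variance of f = %.5f\n", sf2);
    std::printf("propagated variance  = %.5f\n", propagated);
    return 0;
}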
Instrumental and Statistical Uncertainties
http://www.physics.uoguelph.ca/~reception/2440/StatsErrorsJuly26-06.pdf
Counting Experiment Example
The table below reports 8 measurements of the coincidence rate observed by two scintillators detecting cosmic rays. The scintillators are placed a distance (x) away from each other in order to detect cosmic rays falling on the earth's surface. The time and observed coincidence counts are reported in separate columns, as well as the angle made by the normal to the detector with the earth's surface.
Date | Time (hrs) | [math]\theta[/math] | Coincidence Counts | Mean Coinc/Hr | [math]\sigma_{Poisson} = \sqrt{\mbox{Mean Counts/Hr}}[/math] | [math]\left | \sigma \right |[/math] from Mean
9/12/07 | 20.5 | 30 | 2233 | 109 | 10.4 | 1
9/14/07 | 21 | 30 | 1582 | 75 | 8.7 | 2
10/3/07 | 21 | 30 | 2282 | 100 | 10.4 | 1
10/4/07 | 21 | 30 | 2029 | 97 | 9.8 | 0.1
10/15/07 | 21 | 30 | 2180 | 100 | 10 | 0.6
10/18/07 | 21 | 30 | 2064 | 99 | 9.9 | 0.1
10/23/07 | 21 | 30 | 2003 | 95 | 9.7 | 0.2
10/26/07 | 21 | 30 | 1943 | 93 | 9.6 | 0.5
The average count rate for a given trial is given in the 5th column by dividing column 4 by column 2.
One can expect a Poisson parent distribution because the probability of a cosmic ray interacting with the scintillator is low. The variance of the measurement in each trial is related to the counting rate by
- [math]\sigma^2 = \mu =[/math] average counting rate
as a result of the assumption that the parent distribution is Poisson. The value of this [math]\sigma[/math] is shown in column 6.
- Is the Poisson distribution the parent distribution in this experiment?
To try and answer the above question let's determine the mean and variance of the data:
- [math]\bar{x} =\frac{\sum_{i=1}^{8} x_i}{8} = 97.44[/math]
- [math]s = \sqrt{\frac{\sum_{i=1}^{8} (x_i-\bar{x})^2}{8-1}} = 10.8[/math]
where [math]x_i[/math] is the mean coincidence rate (Coinc/Hr) of trial [math]i[/math].
If the parent population is Poisson then we would expect [math]\sigma = \sqrt{\bar{x}} = \sqrt{97.44} \approx 9.9[/math], which is comparable to the sample standard deviation of 10.8.
If you approximate the Poisson distribution by a Gaussian, then the probability that any one measurement lies within 1 [math]\sigma[/math] of the mean is 68% (the probability that a Gaussian variate lies within 1 [math]\sigma[/math] of its mean). For a Poisson distribution with a mean of 97, 66% of the data occur within 1 [math]\sigma = \sqrt{97}[/math] of the mean.
root [26] ROOT::Math::poisson_cdf(97-sqrt(97),97)
(double)1.67580969302001004e-01
root [30] 1-2*ROOT::Math::poisson_cdf(97-sqrt(97),97)
(const double)6.64838061395997992e-01
root [28] ROOT::Math::normal_cdf(97-sqrt(97),sqrt(97),97)
(double)1.58655253931457185e-01
root [29] 1-2*ROOT::Math::normal_cdf(97-sqrt(97),sqrt(97),97)
(const double)6.82689492137085630e-01
The 7th column above identifies how many sigma the mean of that trial is from the average [math]\bar{x}[/math].
- Expected number of trials within 1 [math]\sigma[/math] of the mean: [math]0.68 \times 8 \approx 5[/math]
Looks like we have 7/8 trials within 1 [math]\sigma[/math] = 87.5%.
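The numbers quoted above can be reproduced with a short C++ sketch (assuming the counts and live times listed in the table above): it recomputes the hourly rates, their mean and sample standard deviation, the Poisson expectation [math]\sqrt{\bar{x}}[/math], and the distance of each trial from the mean in units of [math]\sigma[/math] (column 7).
// Sketch: reproduce the mean rate, sample sigma, Poisson expectation, and column 7.
#include <cmath>
#include <cstdio>

int main() {
    const double counts[8] = {2233, 1582, 2282, 2029, 2180, 2064, 2003, 1943};
    const double hours[8]  = {20.5,   21,   21,   21,   21,   21,   21,   21};

    double rate[8], mean = 0;
    for (int i = 0; i < 8; ++i) { rate[i] = counts[i] / hours[i]; mean += rate[i]; }
    mean /= 8;

    double s2 = 0;
    for (int i = 0; i < 8; ++i) s2 += (rate[i] - mean) * (rate[i] - mean);
    double s = std::sqrt(s2 / (8 - 1));            // sample standard deviation

    std::printf("mean rate           = %.2f counts/hr\n", mean);   // ~97.44
    std::printf("sample std dev      = %.2f\n", s);                // ~10.8
    std::printf("Poisson expectation = %.2f\n", std::sqrt(mean));  // sqrt(mean) ~ 9.9
    for (int i = 0; i < 8; ++i)
        std::printf("trial %d: %.1f sigma from the mean\n",
                    i + 1, std::fabs(rate[i] - mean) / s);
    return 0;
}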
Error Propagation
As an example, consider the uncertainty in the mean of [math]N[/math] measurements [math]x_1, x_2, \ldots, x_N[/math], each with uncertainty [math]\sigma_{x_i}[/math].
- [math]f = \bar{x} = \frac{\sum x_i}{N}[/math]
- [math]\frac{\partial f}{\partial x_i} = \frac{1}{N}[/math]
- [math]\delta f = \frac{\partial f}{\partial x_1}\sigma_{x_1} + \frac{\partial f}{\partial x_2}\sigma_{x_2} + \cdots + \frac{\partial f}{\partial x_n}\sigma_{x_n}[/math]
- [math]\left ( \delta f \right)^2 = \left ( \frac{\partial f}{\partial x_1}\sigma_{x_1} + \frac{\partial f}{\partial x_2}\sigma_{x_2} + \cdots + \frac{\partial f}{\partial x_n}\sigma_{x_n} \right )^2[/math]
- [math]= \left ( \frac{\partial f}{\partial x_1}\sigma_{x_1} \right )^2 + \left ( \frac{\partial f}{\partial x_2}\sigma_{x_2} \right )^2 + \cdots + \left ( \frac{\partial f}{\partial x_n}\sigma_{x_n} \right )^2 + 2 \left ( \frac{\partial f}{\partial x_1} \right ) \left ( \frac{\partial f}{\partial x_2} \right ) \sigma_{x_1,x_2}^2 + \cdots[/math]
- [math]\sigma_{x_i,x_j}^2 = 0[/math] for independent measurements [math](i \ne j) \Rightarrow[/math] no covariances
- [math]\left ( \delta f \right)^2 = \left ( \frac{\partial f}{\partial x_1}\sigma_{x_1} \right )^2 + \left ( \frac{\partial f}{\partial x_2}\sigma_{x_2} \right )^2 + \cdots \left ( \frac{\partial f}{\partial x_n}\sigma_{x_n} \right )^2 [/math]
- [math] = \left ( \frac{1}{N} \sigma_{x_1} \right )^2 + \left ( \frac{1}{N}\sigma_{x_2} \right )^2 + \cdots \left ( \frac{1}{N}\sigma_{x_n} \right )^2 [/math]
If
- [math] \sigma_i = \sigma[/math]
Then
- [math]\left ( \delta f \right)^2 = \left ( \frac{1}{N} \sigma \right )^2 + \left ( \frac{1}{N}\sigma \right )^2 + \cdots \left ( \frac{1}{N}\sigma \right )^2 [/math]
- [math]=\frac{ \sigma^2}{N}[/math]
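A minimal sketch (with hypothetical measurement values) of the result above: for N equally uncertain measurements, the uncertainty of the mean is [math]\sigma/\sqrt{N}[/math].
// Sketch: standard error of the mean from repeated measurements (hypothetical data).
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> x = {9.8, 10.1, 9.9, 10.3, 10.0, 9.7, 10.2, 10.0}; // hypothetical
    const int N = static_cast<int>(x.size());

    double mean = 0;
    for (double xi : x) mean += xi;
    mean /= N;

    double s2 = 0;
    for (double xi : x) s2 += (xi - mean) * (xi - mean);
    double sigma = std::sqrt(s2 / (N - 1));      // estimate of the common sigma

    std::printf("mean               = %.3f\n", mean);
    std::printf("sigma (single)     = %.3f\n", sigma);
    std::printf("sigma of the mean  = %.3f\n", sigma / std::sqrt(N)); // sigma/sqrt(N)
    return 0;
}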
Example: Table Area
Consider a calculation of a Table's Area
[math]A= L \times W[/math]
This means that the Area (A) is a function of the Length (L) and the Width (W) of the table.
[math]A = f(L,W)[/math]
The Taylor expansion of a function of two variables [math](x_1 , x_2)[/math] about the averages of the two variables [math](\bar {x}_1 , \bar{x}_2 )[/math] is given by
[math]f(x_1, x_2)=f(\bar {x}_1, \bar{x}_2)+(x_1-\bar {x}_1) \frac{\partial f}{\partial x_1}\bigg |_{(x_1 = \bar {x}_1, x_2 = \bar{x}_2)} +(x_2-\bar{x}_2) \frac{\partial f}{\partial x_2}\bigg |_{(x_1 = \bar {x}_1, x_2 = \bar{x}_2)}[/math]
or
[math]f(x_1, x_2)-f(\bar {x}_1, \bar{x}_2)=(x_1-\bar {x}_1) \frac{\partial f}{\partial x_1}\bigg |_{(x_1 = \bar {x}_1, x_2 = \bar{x}_2)} +(x_2-\bar{x}_2) \frac{\partial f}{\partial x_2}\bigg |_{(x_1 = \bar {x}_1, x_2 = \bar{x}_2)}[/math]
The term
[math]f(x_1, x_2)-f(\bar {x}_1, \bar{x}_2)[/math]
represents a small fluctuation of the function from its average [math]f(\bar {x}_1, \bar{x}_2)[/math], provided we ignore higher-order terms in the Taylor expansion (this means the fluctuations are small).
Based on the Definition of Variance
- [math]\sigma^2 = \frac{\sum_{i=1}^{i=N} (x_i - \bar{x})^2}{N}[/math]
We can write the variance of the area
- [math]\sigma^2_A = \frac{\sum_{i=1}^{i=N} (A_i - \bar{A})^2}{N}[/math]
- [math]= \frac{\sum_{i=1}^{i=N} \left [ (L-\bar{L}) \frac{\partial A}{\partial L} \bigg |_{\bar L \bar W} + (W-\bar W) \frac{\partial A}{\partial W} \bigg |_{\bar L \bar W} \right] ^2}{N}[/math]
- [math]= \frac{\sum_{i=1}^{i=N} \left [ (L-\bar{L}) \frac{\partial A}{\partial L} \bigg |_{\bar L \bar W} \right ] ^2}{N} + \frac{\sum_{i=1}^{i=N} \left [ (W-\bar W) \frac{\partial A}{\partial W} \bigg |_{\bar L \bar W} \right] ^2 }{N}[/math]
- [math]+2 \frac{\sum_{i=1}^{i=N} \left [ (L-\bar{L}) (W-\bar W) \frac{\partial A}{\partial L} \bigg |_{\bar L \bar W} \frac{\partial A}{\partial W} \bigg |_{\bar L \bar W} \right]}{N} [/math]
- [math]= \sigma^2_L \left ( \frac{\partial A}{\partial L} \right )^2 +\sigma^2_W \left ( \frac{\partial A}{\partial W} \right )^2 + 2 \sigma^2_{LW} \frac{\partial A}{\partial L} \frac{\partial A}{\partial W} [/math]
where
[math]\sigma^2_{LW} = \frac{\sum_{i=1}^{i=N} (L-\bar{L}) (W-\bar W) }{N}[/math] is defined as the Covariance between [math]L[/math] and [math]W[/math].
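For [math]A = L W[/math] the derivatives are [math]\frac{\partial A}{\partial L} = W[/math] and [math]\frac{\partial A}{\partial W} = L[/math], evaluated at the means, so
- [math]\sigma^2_A = \bar{W}^2 \sigma^2_L + \bar{L}^2 \sigma^2_W + 2 \bar{L} \bar{W} \sigma^2_{LW}[/math]
If the length and width measurements are independent ([math]\sigma^2_{LW} = 0[/math]), dividing by [math]\bar{A}^2 = \bar{L}^2 \bar{W}^2[/math] gives the fractional form
- [math]\left ( \frac{\sigma_A}{\bar{A}} \right )^2 = \left ( \frac{\sigma_L}{\bar{L}} \right )^2 + \left ( \frac{\sigma_W}{\bar{W}} \right )^2[/math]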
Weighted Mean and Variance
If each observable ([math]x_i[/math]) is accompanied by an estimate of the uncertainty in that observable ([math]\delta x_i[/math]), then the weighted mean is defined as
- [math]\bar{x} = \frac{ \sum_{i=1}^{i=n} \frac{x_i}{(\delta x_i)^2}}{\sum_{i=1}^{i=n} \frac{1}{(\delta x_i)^2}}[/math]
The variance of the weighted mean is given by
- [math]\frac{1}{\sigma_{\bar{x}}^2} = \sum_{i=1}^{i=n} \frac{1}{(\delta x_i)^2}[/math]
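A minimal sketch (with hypothetical values for [math]x_i[/math] and [math]\delta x_i[/math]) of the inverse-variance weighted mean and its uncertainty:
// Sketch: weighted mean with weights 1/dx_i^2; uncertainty = 1/sqrt(sum of weights).
#include <cmath>
#include <cstdio>

int main() {
    const double x[]  = {97.4, 95.1, 99.2};   // hypothetical measurements
    const double dx[] = { 2.0,  3.0,  2.5};   // their uncertainties
    const int n = 3;

    double num = 0, den = 0;
    for (int i = 0; i < n; ++i) {
        double w = 1.0 / (dx[i] * dx[i]);     // weight = 1/(dx_i)^2
        num += w * x[i];
        den += w;
    }
    std::printf("weighted mean = %.2f +/- %.2f\n", num / den, 1.0 / std::sqrt(den));
    return 0;
}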
Date | Time (hrs) | [math]\theta[/math] | Coincidence Counts | Coinc/Hour | [math]\sqrt{N}[/math]
9/6/07 | 18 | 45 | 1065 | 59.2 |
9/7/07 | 14.66 | 45 | 881 | 60.1 |
9/9/07 | 43 | 60 | 1558 | 36.23 |
9/12/07 | 20.5 | 330 | 2233 | 108.93 |
9/13/07 | 21 | 315 | 2261 | 107.67 |
9/14/07 | 21 | 330 | 1582 | 75.33 |
9/18/07 | 21 | 300 | 1108 | 52.8 |
9/19/07 | 21 | 300 | 1210 | 57.62 |
9/20/07 | 21 | 300 | 1111 | 52.69 |
9/21/07 | 21 | 300 | 1012 | 57.62 |
9/26/07 | 21 | 315 | 1669 | 79.48 |
9/27/07 | 21 | 315 | 1756 | 83.29 |
9/29/07 | 24.5 | 315 | 2334 | 95.27 |
10/3/07 | 21 | 330 | 2282 | 108.67 |
10/4/07 | 21 | 330 | 2029 | 96.62 |
10/10/07 | 21 | 315 | 1947 | 92.71 |
10/15/07 | 69 | 330 | 2180 | 31.59 |
10/18/07 | 21 | 330 | 2064 | 98.52 |
10/23/07 | 21 | 330 | 2003 | 95.38 |
10/26/07 | 21 | 330 | 1943 | 92.52 |
11/2/07 | 21 | 330 | 2784 | |
11/5/07 | 69 | 330 | 10251 | 148.57 |
11/16/07 | 21 | 30 | 3581 | 170.52 |