The Method of Least Squares: uncertainty analysis for slope

Postby Guest » Mon Sep 14, 2020 4:35 am

Let [tex]m[/tex] be the slope and [tex]b[/tex] the intercept of the fitted line [tex]y = mx + b[/tex]. We work under the assumption that the uncertainties on the [tex]y[/tex] values are much larger than the uncertainties on the [tex]x[/tex] values, so only the [tex]y_i[/tex] carry error.

With the Method of Least Squares these parameters are found to be

[tex]m = \dfrac{nS_{xy} - S_xS_y}{D}[/tex]

[tex]b = \dfrac{S_yS_{xx} - S_xS_{xy}}{D}[/tex]

with

[tex]S_x = \sum^n_{i=1} x_i[/tex]
[tex]S_{xy} = \sum^n_{i=1} x_iy_i[/tex]
[tex]S_{y} = \sum^n_{i=1} y_i[/tex]
[tex]S_{xx} = \sum^n_{i=1} x^2_i[/tex]
[tex]D =
\begin{vmatrix}
S_{xx} & S_{x}\\
S_{x} & n
\end{vmatrix} = nS_{xx} - S_x^2[/tex]
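As a quick numerical sanity check (a Python sketch with made-up data; the x and y values are my own assumptions, not from any text), the closed-form sums above reproduce what numpy.polyfit gives:

```python
import numpy as np

# made-up sample data, roughly on the line y = 2x (assumption for illustration)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

# the sums S_x, S_y, S_xy, S_xx defined above
Sx, Sy = x.sum(), y.sum()
Sxy, Sxx = (x * y).sum(), (x * x).sum()
D = n * Sxx - Sx**2  # determinant of the 2x2 normal-equations matrix

m = (n * Sxy - Sx * Sy) / D
b = (Sy * Sxx - Sx * Sxy) / D

# compare against NumPy's degree-1 least-squares fit
m_ref, b_ref = np.polyfit(x, y, 1)
assert np.isclose(m, m_ref) and np.isclose(b, b_ref)
```

With these particular numbers the sums give m = 1.99 and b = 0.05, matching polyfit to floating-point precision.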

The parameters [tex]m[/tex] and [tex]b[/tex] are affected by uncertainty since the [tex]y_i[/tex] values are. The uncertainties on [tex]m[/tex] and [tex]b[/tex] follow from standard error propagation:

[tex]\sigma^2_m = \sum^n_{i=1} \left( \dfrac{\partial m}{\partial y_i} \right)^2 \sigma^2_{y_i}[/tex]

[tex]\sigma^2_b = \sum^n_{i=1} \left( \dfrac{\partial b}{\partial y_i} \right)^2 \sigma^2_{y_i}[/tex]

So, let's compute these partial derivatives.

[tex]\dfrac{\partial m}{\partial y_i} = \dfrac{1}{D}\,\dfrac{\partial}{\partial y_i}\left( nS_{xy} - S_xS_y \right) = \dfrac{nx_i - S_x}{D}[/tex]

so that

[tex]\sigma^2_m = \sum^n_{i=1} \left( \dfrac{nx_i - S_x}{D} \right)^2 \sigma^2_{y_i}[/tex]

Assuming a common uncertainty [tex]\sigma_{y_i} = \sigma_y[/tex] for all points,

[tex]\sum^n_{i=1} (nx_i - S_x)^2 = n^2S_{xx} - 2nS_x^2 + nS_x^2 = n(nS_{xx} - S_x^2) = nD[/tex]

and therefore

[tex]\sigma^2_m = \dfrac{n\sigma^2_y}{D}[/tex]

which is exactly the result given by the text. My earlier mistake was dropping the sum over [tex]i[/tex] (and the [tex]x_i[/tex] in the derivative), which produced [tex]n^2[/tex] and [tex]D^2[/tex] where only [tex]n[/tex] and [tex]D[/tex] should appear.
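The propagation formula can also be checked numerically (a Python sketch; the x values and the common uncertainty sigma_y are assumptions for illustration) by summing the per-point derivative terms and comparing with [tex]n\sigma^2_y/D[/tex]:

```python
import numpy as np

# made-up x values and a common uncertainty on y (both assumptions)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
n = len(x)
sigma_y = 0.3

Sx = x.sum()
Sxx = (x * x).sum()
D = n * Sxx - Sx**2

# propagate term by term: sum over i of (dm/dy_i)^2 * sigma_y^2
dm_dyi = (n * x - Sx) / D
var_m_sum = np.sum(dm_dyi**2) * sigma_y**2

# closed-form result: n * sigma_y^2 / D
var_m_formula = n * sigma_y**2 / D

assert np.isclose(var_m_sum, var_m_formula)
```

Note that the agreement only appears after the sum over all points; a single squared term [tex](nx_i - S_x)^2\sigma^2_y/D^2[/tex] does not reduce to [tex]n\sigma^2_y/D[/tex] on its own.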
