# The Method of Least Squares: uncertainty analysis for slope


Let $$m$$ be the slope and $$b$$ the intercept of the fitted line $$y = mx + b$$. We work under the assumption that the uncertainties on the values of $$y$$ are much larger than the uncertainties on the values of $$x$$.

With the Method of Least Squares these parameters are found to be

$$m = \dfrac{nS_{xy} - S_xS_y}{D}$$

$$b = \dfrac{S_yS_{xx} - S_xS_{xy}}{D}$$

with

$$S_x = \sum^n_{i=1} x_i$$
$$S_{xy} = \sum^n_{i=1} x_iy_i$$
$$S_{y} = \sum^n_{i=1} y_i$$
$$S_{xx} = \sum^n_{i=1} x^2_i$$
$$D = \begin{vmatrix} S_{xx} & S_{x}\\ S_{x} & n \end{vmatrix}$$
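These closed-form sums map directly onto code. A minimal sketch in Python (the sample points below are made up for illustration):

```python
import numpy as np

def least_squares_fit(x, y):
    """Slope m and intercept b from the sums S_x, S_y, S_xy, S_xx."""
    n = len(x)
    Sx, Sy = x.sum(), y.sum()
    Sxy, Sxx = (x * y).sum(), (x * x).sum()
    D = n * Sxx - Sx * Sx  # determinant of [[Sxx, Sx], [Sx, n]]
    m = (n * Sxy - Sx * Sy) / D
    b = (Sy * Sxx - Sx * Sxy) / D
    return m, b

# Points lying exactly on y = 2x + 1 should be recovered exactly
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0
m, b = least_squares_fit(x, y)
```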

The parameters $$m$$ and $$b$$ are affected by uncertainty, since the $$y_i$$ values are. The uncertainties on $$m$$ and $$b$$ are given by error propagation:

$$\sigma^2_m = \sum^n_{i=1} \left( \dfrac{\partial m}{\partial y_i} \right)^2 \sigma^2_{y_i}$$

$$\sigma^2_b = \sum^n_{i=1} \left( \dfrac{\partial b}{\partial y_i} \right)^2 \sigma^2_{y_i}$$
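These propagation sums can be evaluated directly once the partial derivatives are written out: $$\partial m/\partial y_i = (n x_i - S_x)/D$$ and $$\partial b/\partial y_i = (S_{xx} - S_x x_i)/D$$. A sketch in Python, assuming equal uncertainties `sigma_y` on every $$y_i$$:

```python
import numpy as np

def fit_variances(x, sigma_y):
    """Propagate equal y-uncertainties sigma_y into the variances of
    m and b by summing the squared partial derivatives."""
    n = len(x)
    Sx, Sxx = x.sum(), (x * x).sum()
    D = n * Sxx - Sx * Sx
    dm_dy = (n * x - Sx) / D    # dm/dy_i
    db_dy = (Sxx - Sx * x) / D  # db/dy_i
    var_m = (dm_dy ** 2).sum() * sigma_y ** 2
    var_b = (db_dy ** 2).sum() * sigma_y ** 2
    return var_m, var_b

x = np.array([0.0, 1.0, 2.0, 3.0])
var_m, var_b = fit_variances(x, sigma_y=1.0)
```

Carrying out the sums analytically gives $$\sum_i (\partial m/\partial y_i)^2 = n/D$$ and $$\sum_i (\partial b/\partial y_i)^2 = S_{xx}/D$$.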

So, let's carry out the partial derivative for $$m$$:

$$\sigma^2_m = \left[ \dfrac{1}{D} \dfrac{\partial}{\partial y_i} (nS_{xy} - S_xS_y) \right]^2 \sigma^2_{y_i} = \left[ \dfrac{1}{D} (nS_{x} - S_x) \right]^2 \sigma^2_{y_i} = \dfrac{n^2 \sigma^2_{y_i}}{D^2}$$

The right solution, as given by the text, is $$\sigma^2_m = \dfrac{n \sigma^2_{y_i}}{D}$$.

My result has the same form as the one in the text, except for the values of $$n$$ and $$D$$, which come out squared when they should not be.
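As a numerical cross-check (a sketch in Python; the $$y$$ values and $$\sigma$$ below are made up, and equal uncertainties on every $$y_i$$ are assumed), the derivative $$\partial m/\partial y_i$$ can be estimated by finite differences and the propagated variance compared against the two candidate formulas:

```python
import numpy as np

def slope(x, y):
    """Least-squares slope from the closed-form sums."""
    n = len(x)
    Sx, Sy = x.sum(), y.sum()
    Sxy, Sxx = (x * y).sum(), (x * x).sum()
    D = n * Sxx - Sx * Sx
    return (n * Sxy - Sx * Sy) / D

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])  # made-up data
sigma = 0.1                          # assumed equal uncertainty on every y_i

n = len(x)
D = n * (x * x).sum() - x.sum() ** 2

# Estimate dm/dy_i by perturbing each y_i in turn; m is linear in y,
# so the finite difference is essentially exact.
eps = 1e-6
var_m = 0.0
for i in range(n):
    yp = y.copy()
    yp[i] += eps
    dm_dy = (slope(x, yp) - slope(x, y)) / eps
    var_m += dm_dy ** 2 * sigma ** 2
```

With this data the propagated variance agrees with $$n\sigma^2/D$$ and not with $$n^2\sigma^2/D^2$$.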
Did I make any mistakes?
Guest