Fitting Data

Noisy Data

Mathematica's Random function can be used to create "noisy" data.
Each time the following cell is evaluated, a different picture is produced.

In[1]:=

ϵ = 0.1 ;

nums = Table[Random[Real, {-ϵ, ϵ}],  {k, 100}] ; 

ListPlot[nums, PlotStyle -> {OrangeRed, PointSize[0.02]}] ;

[Graphics:HTMLFiles/6_fitting_data_4.gif]
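The noise is different on every evaluation because Random draws a fresh pseudorandom sequence each time. If you want a reproducible picture (an extra step, not in the original cell), seed the generator first:

SeedRandom[1] ;

nums = Table[Random[Real, {-ϵ, ϵ}],  {k, 100}] ;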

Let's create some noisy data, and then pretend that we have just discovered it somewhere.

In[4]:=

Clear[f, x, xs, ys, noisyYs, data, noisyData] 

f[x_] = x^3 ; 

xs = 0.1 Range[0, 10] ;

ys = Map[f, xs] ; 

noisyYs = Map[# + Random[Real, {-ϵ, ϵ}] &, ys] ; 

data = Transpose[{xs, ys}] ;

noisyData = Transpose[{xs, noisyYs}] ;

Oh, my. Look at this. It's some noisy data.

In[11]:=

<<Graphics`Graphics`

In[12]:=

DisplayTogether[ListPlot[data, PlotStyle -> {ForestGreen, PointSize[0.02]}], ListPlot[noisyData, PlotStyle -> {OrangeRed, PointSize[0.02]}]] ;

[Graphics:HTMLFiles/6_fitting_data_14.gif]
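As a preview of the fitting we do below, Mathematica's built-in Fit function will recover an approximation to the cubic hiding in noisyData; the coefficients will vary from run to run with the noise.

Fit[noisyData, {1, x, x^2, x^3}, x]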

Fitting Data with a Regression Line
Lay 6.5, Example 1, page 420

Find a least-squares line that best fits a set of data points.

In[13]:=

Clear[data, X, x1, x2, y, β, β0, β1, x] ; 

data = {{2, 1}, {5, 2}, {7, 3}, {8, 3}} ;

%//MatrixForm

Out[15]//MatrixForm=

( {{2, 1}, {5, 2}, {7, 3}, {8, 3}} )

Construct the equation X.β=y.
A typical row of X.β is of the form 1*β0 + x*β1

In[16]:=

x1 = {1, 1, 1, 1} ;

x2 = Transpose[data][[1]] ;

X = Transpose[{x1, x2}] ;

β = {β0, β1} ;

y = Transpose[data][[2]] ; 

Print[X//MatrixForm, ".", β//MatrixForm, "=", y//MatrixForm]

( {{1, 2}, {1, 5}, {1, 7}, {1, 8}} )  .  ( {{β0}, {β1}} )  =  ( {{1}, {2}, {3}, {3}} )

Use the normal equations: multiplying both sides of X.β = y by X^T turns the (generally inconsistent) system into the square system X^TX.β = X^Ty, whose solution is the least-squares solution.

In[22]:=

Transpose[X] . X ;

%//MatrixForm

Out[23]//MatrixForm=

( {{4, 22}, {22, 142}} )

In[24]:=

Transpose[X] . y ;

%//MatrixForm

Out[25]//MatrixForm=

( {{9}, {57}} )

Now solve the system X^TXβ =  X^Ty.

In[26]:=

Print[Transpose[X] . X//MatrixForm, ".", β//MatrixForm, " = ", Transpose[X] . y//MatrixForm]

( {{4, 22}, {22, 142}} )  .  ( {{β0}, {β1}} )  =  ( {{9}, {57}} )

In[27]:=

soln = Solve[Transpose[X] . X . β == Transpose[X] . y, {β0, β1}][[1]]

Out[27]=

{β02/7, β15/14}

In[28]:=

β = β/.soln

Out[28]=

{2/7, 5/14}

Interpret the result: The (least squares) line which best fits the data is

In[29]:=

Clear[y, x] ; 

y[x_] = β . {1, x}

Out[30]=

2/7 + (5 x)/14

Now plot the data and its regression line.

In[31]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, DisplayFunction -> Identity] ; 

linePlot = Plot[y[x], {x, 2, 8}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

Show[dotPlot, linePlot, DisplayFunction -> $DisplayFunction] ;

Print["regression line: y = ", β[[1]], " + ", β[[2]], "x."]

[Graphics:HTMLFiles/6_fitting_data_47.gif]

regression line: y = 2/7 + 5/14x.

Fitting Data with a Regression Line
Lay 6.6.1

Find a least-squares line that best fits a set of data points.

In[35]:=

Clear[data, X, x1, x2, y, β, β0, β1, βHat, x] ; 

data = {{0, 1}, {1, 1}, {2, 2}, {3, 2}} ;

%//MatrixForm

Out[37]//MatrixForm=

( {{0, 1}, {1, 1}, {2, 2}, {3, 2}} )

Visualize the data.

In[38]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}] ;

[Graphics:HTMLFiles/6_fitting_data_54.gif]

Construct the equation X.β=y.
A typical row of X.β is of the form 1*β0 + x*β1

In[39]:=

x1 = {1, 1, 1, 1} ;

x2 = Transpose[data][[1]] ;

X = Transpose[{x1, x2}] ;

β = {β0, β1} ;

y = Transpose[data][[2]] ; 

Print[X//MatrixForm, ".", β//MatrixForm, "=", y//MatrixForm]

( {{1, 0}, {1, 1}, {1, 2}, {1, 3}} )  .  ( {{β0}, {β1}} )  =  ( {{1}, {1}, {2}, {2}} )

Use the normal equations.

In[45]:=

Transpose[X] . X ;

%//MatrixForm

Out[46]//MatrixForm=

( {{4, 6}, {6, 14}} )

In[47]:=

Transpose[X] . y ;

%//MatrixForm

Out[48]//MatrixForm=

( {{6}, {11}} )

Now solve the system X^TXβ =  X^Ty.

In[49]:=

Print[Transpose[X] . X//MatrixForm, ".", β//MatrixForm, " = ", Transpose[X] . y//MatrixForm]

( {{4, 6}, {6, 14}} )  .  ( {{β0}, {β1}} )  =  ( {{6}, {11}} )

In[50]:=

soln = Solve[Transpose[X] . X . β == Transpose[X] . y, {β0, β1}][[1]]

Out[50]=

{β09/10, β12/5}

In[51]:=

βHat = β/.soln

Out[51]=

{9/10, 2/5}

Interpret the result: The (least squares) line which best fits the data is

In[52]:=

Clear[y, x] ; 

y[x_] = βHat . {1, x}

Out[53]=

9/10 + (2 x)/5

Now plot the data and its regression line.

In[54]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, DisplayFunction -> Identity] ; 

linePlot = Plot[y[x], {x, 0, 3}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

Show[dotPlot, linePlot, DisplayFunction -> $DisplayFunction] ;

Print["regression line: y = ", βHat[[1]], " + ", βHat[[2]], "x."]

[Graphics:HTMLFiles/6_fitting_data_83.gif]

regression line: y = 9/10 + 2/5x.

Fitting Data with a Regression Line
Lay 6.6.3

Find a least-squares line that best fits a set of data points.

In[58]:=

Clear[data, X, x1, x2, y, β, β0, β1, βHat, x] ; 

data = {{-1, 0}, {0, 1}, {1, 2}, {2, 4}} ;

%//MatrixForm

Out[60]//MatrixForm=

( {{-1, 0}, {0, 1}, {1, 2}, {2, 4}} )

Visualize the data.

In[61]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}] ;

[Graphics:HTMLFiles/6_fitting_data_90.gif]

Construct the equation X.β=y.
A typical row of X.β is of the form 1*β0 + x*β1

In[62]:=

x1 = {1, 1, 1, 1} ;

x2 = Transpose[data][[1]] ;

X = Transpose[{x1, x2}] ;

β = {β0, β1} ;

y = Transpose[data][[2]] ; 

Print[X//MatrixForm, ".", β//MatrixForm, "=", y//MatrixForm]

( {{1, -1}, {1, 0}, {1, 1}, {1, 2}} )  .  ( {{β0}, {β1}} )  =  ( {{0}, {1}, {2}, {4}} )

Use the normal equations.

In[68]:=

Transpose[X] . X ;

%//MatrixForm

Out[69]//MatrixForm=

( {{4, 2}, {2, 6}} )

In[70]:=

Transpose[X] . y ;

%//MatrixForm

Out[71]//MatrixForm=

( {{7}, {10}} )

Now solve the system X^TXβ =  X^Ty.

In[72]:=

Print[Transpose[X] . X//MatrixForm, ".", β//MatrixForm, " = ", Transpose[X] . y//MatrixForm]

( {{4, 2}, {2, 6}} )  .  ( {{β0}, {β1}} )  =  ( {{7}, {10}} )

In[73]:=

soln = Solve[Transpose[X] . X . β == Transpose[X] . y, {β0, β1}][[1]]

Out[73]=

{β011/10, β113/10}

In[74]:=

βHat = β/.soln

Out[74]=

{11/10, 13/10}

Interpret the result: The (least squares) line which best fits the data is

In[75]:=

Clear[y, x] ; 

y[x_] = βHat . {1, x}

Out[76]=

11/10 + (13 x)/10

Now plot the data and its regression line.

In[77]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, DisplayFunction -> Identity] ; 

linePlot = Plot[y[x], {x, -1, 2}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

Show[dotPlot, linePlot, DisplayFunction -> $DisplayFunction] ;

Print["regression line: y = ", βHat[[1]], " + ", βHat[[2]], "x."]

[Graphics:HTMLFiles/6_fitting_data_119.gif]

regression line: y = 11/10 + 13/10x.

Fitting Data with a Quadratic Polynomial
Lay 6.6.7

Find a quadratic polynomial of the form y = β1 x + β2 x^2 that best fits the data.

In[81]:=

Clear[data, X, x1, x2, y, β, β1, β2, βHat, x] ; 

data = {{1, 1.8}, {2, 2.7}, {3, 3.4}, {4, 3.8}, {5, 3.9}} ;

%//MatrixForm

Out[83]//MatrixForm=

( {{1, 1.8}, {2, 2.7}, {3, 3.4}, {4, 3.8}, {5, 3.9}} )

Visualize the data.

In[85]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}] ;

[Graphics:HTMLFiles/6_fitting_data_129.gif]

Construct the linear model y = X.β + ε.
A typical row of X.β is of the form  β1x + β2 x^2

In[86]:=

x1 = Transpose[data][[1]] ;

x2 = x1^2 ;

X = Transpose[{x1, x2}] ;

β = {β1, β2} ;

y = Transpose[data][[2]] ;

ϵ = {ϵ1, ϵ2, ϵ3, ϵ4, ϵ5} ; 

Print[y//MatrixForm, " = ", X//MatrixForm, ".", β//MatrixForm, " + ", ϵ//MatrixForm]

( {{1.8}, {2.7}, {3.4}, {3.8}, {3.9}} )  =  ( {{1, 1 ... 304; +  ( {{ϵ1}, {ϵ2}, {ϵ3}, {ϵ4}, {ϵ5}} )

Use the normal equations.

In[93]:=

Transpose[X] . X ;

%//MatrixForm

Out[94]//MatrixForm=

( {{55, 225}, {225, 979}} )

In[95]:=

Transpose[X] . y ;

%//MatrixForm

Out[96]//MatrixForm=

( {{52.1}, {201.5}} )

Now solve the system X^TXβ =  X^Ty.

In[97]:=

Print[Transpose[X] . X//MatrixForm, ".", β//MatrixForm, " == ", Transpose[X] . y//MatrixForm]

( {{55, 225}, {225, 979}} )  .  ( {{β1}, {β2}} )  ==  ( {{52.1}, {201.5}} )

In[98]:=

soln = Solve[Transpose[X] . X . β == Transpose[X] . y, {β1, β2}][[1]]

Out[98]=

{β11.76037, β2 -0.198758}

In[99]:=

βHat = β/.soln

Out[99]=

{1.76037, -0.198758}

Interpret the result: The (least squares) quadratic polynomial we seek is

In[100]:=

Clear[y, x] ; 

y[x_] = βHat . {x, x^2}

Out[101]=

1.76037 x - 0.198758 x^2

Now plot the data and its least squares polynomial.

In[102]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, DisplayFunction -> Identity] ; 

curvePlot = Plot[y[x], {x, 1, 5}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

Show[dotPlot, curvePlot, DisplayFunction -> $DisplayFunction] ;

Print["regression polynomial: y[x] = ", y[x]]

[Graphics:HTMLFiles/6_fitting_data_160.gif]

regression polynomial: y[x] = 1.76037 x - 0.198758 x^2
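As a cross-check (not in Lay), Fit with the basis {x, x^2} and no constant term reproduces the same quadratic.

Fit[data, {x, x^2}, x]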

Fitting Data with a Trigonometric Sum
Lay 6.6.9

Find a function of the form y = A Cos[x] + B Sin[x] that best fits the data.

In[106]:=

Clear[data, X, x1, x2, y, β, A, B, βHat, ϵ] ; 

data = {{1, 7.9}, {2, 5.4}, {3, -.9}} ;

%//MatrixForm

Out[108]//MatrixForm=

( {{1, 7.9}, {2, 5.4}, {3, -0.9}} )

Visualize the data.

In[109]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}] ;

[Graphics:HTMLFiles/6_fitting_data_167.gif]

Construct the linear model y = X.β + ε.
A typical row of X.β is of the form A Cos[x] + B Sin[x]

In[110]:=

x1 = Map[Cos, Transpose[data][[1]]] ;

x2 = Map[Sin, Transpose[data][[1]]] ;

X = Transpose[{x1, x2}] ;

β = {A, B} ;

y = Transpose[data][[2]] ;

ϵ = {ϵ1, ϵ2, ϵ3} ; 

Print[y//MatrixForm, " = ", X//MatrixForm, ".", β//MatrixForm, " + ", ϵ//MatrixForm]

( {{7.9}, {5.4}, {-0.9}} )  =  ( {{Cos[1], Sin[1]},  ... {A}, {B}} )  +  ( {{ϵ1}, {ϵ2}, {ϵ3}} )

Use the normal equations.

In[117]:=

Transpose[X] . X ;

%//MatrixForm

Out[118]//MatrixForm=

( {{Cos[1]^2 + Cos[2]^2 + Cos[3]^2, Cos[1] Sin[1] + Cos[2] Sin[2] + Cos[3] Sin[3]}, {Cos[1] Sin[1] + Cos[2] Sin[2] + Cos[3] Sin[3], Sin[1]^2 + Sin[2]^2 + Sin[3]^2}} )

In[119]:=

Transpose[X] . y ;

%//MatrixForm

Out[120]//MatrixForm=

( {{2.91219}, {11.4308}} )

Now solve the system X^TXβ =  X^Ty.

In[121]:=

Print[Transpose[X] . X//MatrixForm, ".", β//MatrixForm, " == ", Transpose[X] . y//MatrixForm]

( {{Cos[1]^2 + Cos[2]^2 + Cos[3]^2, Cos[1] Sin[1] + Cos[2] Sin[2] + Cos[3] Sin[3]}, {Cos[1] Sin[1] + Cos[2] Sin[2] + Cos[3] Sin[3], Sin[1]^2 + Sin[2]^2 + Sin[3]^2}} )  .  ( {{A}, {B}} )  ==  ( {{2.91219}, {11.4308}} )

In[122]:=

soln = Solve[Transpose[X] . X . β == Transpose[X] . y, {A, B}][[1]]

Out[122]=

{A2.34212, B7.4475}

In[123]:=

βHat = β/.soln

Out[123]=

{2.34212, 7.4475}

Interpret the result: The (least squares) trigonometric sum we seek is

In[124]:=

Clear[y, x] ; 

y[x_] = βHat . {Cos[x], Sin[x]}

Out[125]=

2.34212 Cos[x] + 7.4475 Sin[x]

Now plot the data and its least squares fit.

In[126]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, DisplayFunction -> Identity] ; 

curvePlot = Plot[y[x], {x, 0, 5}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

Show[dotPlot, curvePlot, DisplayFunction -> $DisplayFunction] ;

Print["regression polynomial: y[x] = ", y[x]]

[Graphics:HTMLFiles/6_fitting_data_197.gif]

regression polynomial: y[x] = 2.34212 Cos[x] + 7.4475 Sin[x]

Fitting Data with a Sum of Exponentials
Lay 6.6.10

Find a function of the form y = M_A e^(-.02 t) + M_B e^(-.07 t) that best fits the data.

In[130]:=

Clear[data, X, x1, x2, y, β, A, B, βHat, ϵ, f, g, t] ; 

data = {{10, 21.34}, {11, 20.68}, {12, 20.05}, {14, 18.87}, {15, 18.3}} ;

%//MatrixForm

Out[132]//MatrixForm=

( {{10, 21.34}, {11, 20.68}, {12, 20.05}, {14, 18.87}, {15, 18.3}} )

Visualize the data.

In[133]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}] ;

[Graphics:HTMLFiles/6_fitting_data_208.gif]

Construct the linear model y = X.β + ε.
A typical row of X.β is of the form Ma Exp[-.02 t] + Mb Exp[-.07 t]

In[134]:=

f[t_] = Exp[-.02 t] ;

g[t_] = Exp[-.07 t] ; 

x1 = Map[f, Transpose[data][[1]]] ;

x2 = Map[g, Transpose[data][[1]]] ;

X = Transpose[{x1, x2}] ;

β = {Ma, Mb} ;

y = Transpose[data][[2]] ;

ϵ = {ϵ1, ϵ2, ϵ3, ϵ4, ϵ5} ; 

Print[y//MatrixForm, " = ", X//MatrixForm, ".", β//MatrixForm, " + ", ϵ//MatrixForm]

( {{21.34}, {20.68}, {20.05}, {18.87}, {18.3}} )  =  (ᡝ ... 62370;)  +  ( {{ϵ1}, {ϵ2}, {ϵ3}, {e4}, {e5}} )

Use the normal equations.

In[143]:=

Transpose[X] . X ;

%//MatrixForm

Out[144]//MatrixForm=

( {{3.05316, 1.66064}, {1.66064, 0.910667}} )

In[145]:=

Transpose[X] . y ;

%//MatrixForm

Out[146]//MatrixForm=

( {{77.6583}, {42.314}} )

Now solve the system X^TXβ =  X^Ty.

In[147]:=

Print[Transpose[X] . X//MatrixForm, ".", β//MatrixForm, " == ", Transpose[X] . y//MatrixForm]

( {{3.05316, 1.66064}, {1.66064, 0.910667}} )  .  ( {{Ma}, {Mb}} )  ==  ( {{77.6583}, {42.314}} )

In[148]:=

soln = Solve[Transpose[X] . X . β == Transpose[X] . y, {Ma, Mb}][[1]]

Out[148]=

{Ma19.9411, Mb10.1015}

In[149]:=

βHat = β/.soln

Out[149]=

{19.9411, 10.1015}

Interpret the result: The (least squares) function we seek is

In[150]:=

Clear[y, x] ; 

y[t_] = βHat . {f[t], g[t]}

Out[151]=

10.1015 ^(-0.07 t) + 19.9411 ^(-0.02 t)

Now plot the data and its least squares fit.

In[152]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, DisplayFunction -> Identity] ; 

curvePlot = Plot[y[t], {t, 10, 15}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

Show[dotPlot, curvePlot, DisplayFunction -> $DisplayFunction] ;

Print["regression function: y[x] = ", y[x]]

[Graphics:HTMLFiles/6_fitting_data_240.gif]

regression function: y[t] = 10.1015 E^(-0.07 t) + 19.9411 E^(-0.02 t)
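As before, Fit with the two exponentials as basis functions gives the same coefficients directly (a cross-check, not part of the exercise):

Fit[data, {Exp[-.02 t], Exp[-.07 t]}, t]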

Orbit of a Comet
Lay 6.6.11

Find a function of the form r = β + e (r Cos[ϑ]) that best fits the astronomical data.

In[156]:=

Clear[data, y, X, β, βHat, x, ϵ, r, e] ; 

data = {{.88, 3.}, {1.1, 2.3}, {1.42, 1.65}, {1.77, 1.25}, {2.14, 1.01}} ;

%//MatrixForm

Out[158]//MatrixForm=

( {{0.88, 3.}, {1.1, 2.3}, {1.42, 1.65}, {1.77, 1.25}, {2.14, 1.01}} )

Visualize the data.

In[159]:=

dotPlot = ListPlot[data, AxesLabel {ϑ, r}, PlotStyle {Red, PointSize[.02]}] ;

[Graphics:HTMLFiles/6_fitting_data_247.gif]

Construct the linear model y = X.β + ε.
A typical row of X.β is of the form  β*1 + e* (r Cos[ϑ])

In[160]:=

x1 = {1, 1, 1, 1, 1} ;

rs = Transpose[data][[2]] ;

ϑs = Transpose[data][[1]] ;

cosϑs = Cos[ϑs] ;

x2 = rs cosϑs ;

X = Transpose[{x1, x2}] ; 

βvec = {β, e} ;

y = rs ;

ϵ = {ϵ1, ϵ2, ϵ3, ϵ4, ϵ5} ; 

Print[y//MatrixForm, " = ", X//MatrixForm, ".", βvec//MatrixForm, " + ", ϵ//MatrixForm]

( {{3.}, {2.3}, {1.65}, {1.25}, {1.01}} )  =  ( {{1, ... 304; +  ( {{ϵ1}, {ϵ2}, {ϵ3}, {ϵ4}, {ϵ5}} )

Use the normal equations.

In[170]:=

Transpose[X] . X ;

%//MatrixForm

Out[171]//MatrixForm=

( {{5., 2.41088}, {2.41088, 5.16101}} )

In[172]:=

Transpose[X] . y ;

%//MatrixForm

Out[173]//MatrixForm=

( {{9.21}, {7.68388}} )

Now solve the system X^TXβ =  X^Ty.

In[174]:=

Print[Transpose[X] . X//MatrixForm, ".", βvec//MatrixForm, " = ", Transpose[X] . y//MatrixForm]

( {{5., 2.41088}, {2.41088, 5.16101}} )  .  ( {{β}, {e}} )  =  ( {{9.21}, {7.68388}} )

In[175]:=

soln = Solve[Transpose[X] . X . βvec == Transpose[X] . y, {β, e}][[1]]

Out[175]=

{β1.45093, e0.811053}

In[176]:=

βHat = {β, e} = {β, e}/.soln

Out[176]=

{1.45093, 0.811053}

Interpret the result:
Since 0 < e < 1, the orbit is elliptic.
The (least squares) function we seek is

In[177]:=

Clear[r, ϑ] ; 

r[ϑ_] = β/(1 - e Cos[ϑ])

Out[178]=

1.45093/(1 - 0.811053 Cos[ϑ])

Now plot the data and its least squares approximation.

In[179]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, AxesLabel -> {ϑ, r}, DisplayFunction -> Identity] ; 

curvePlot = Plot[r[ϑ], {ϑ, .8, 2.2}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

Show[dotPlot, curvePlot, DisplayFunction -> $DisplayFunction] ;

Print["least squares approximation: r[ϑ] = ", r[ϑ]]

[Graphics:HTMLFiles/6_fitting_data_280.gif]

least squares approximation: r[ϑ] = 1.45093/(1 - 0.811053 Cos[ϑ])

In[183]:=

Print["When ϑ = 4.6, the comet will have radial coordinates (r,ϑ) = (", r[4.6] , ",", 4.6, ")."]

When ϑ = 4.6, the comet will have radial coordinates (r,ϑ) = (1.32995, 4.6).

Let's plot that.

In[184]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, AxesLabel -> {ϑ, r}, DisplayFunction -> Identity] ; 

curvePlot = Plot[r[ϑ], {ϑ, .8, 5}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

extraDot = ListPlot[{{4.6, r[4.6]}}, PlotStyle -> {IndianRed, PointSize[.02]}, AxesLabel -> {ϑ, r}, DisplayFunction -> Identity] ; 

Show[dotPlot, curvePlot, extraDot, DisplayFunction -> $DisplayFunction] ;

[Graphics:HTMLFiles/6_fitting_data_288.gif]

Super!
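Since r[ϑ] is the polar equation of the orbit and 0 < e < 1, we can also trace the whole ellipse in Cartesian coordinates (an extra picture, not part of the exercise):

ParametricPlot[{r[ϑ] Cos[ϑ], r[ϑ] Sin[ϑ]}, {ϑ, 0, 2 Pi}, AspectRatio -> Automatic] ;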

Blood Pressure vs. Weight
Lay 6.6.12

Find a function of the form p = β0 + β1 ln(w) that best fits the data.

In[188]:=

Clear[data, y, X, β, β0, β1, βHat, x, ϵ, w, p] ; 

data = {{44, 91}, {61, 98}, {81, 103}, {113, 110}, {131, 112}} ;

%//MatrixForm

Out[190]//MatrixForm=

( {{44, 91}, {61, 98}, {81, 103}, {113, 110}, {131, 112}} )

Visualize the data.

In[191]:=

dotPlot = ListPlot[data, AxesLabel {w, p}, PlotStyle {Red, PointSize[.02]}] ;

[Graphics:HTMLFiles/6_fitting_data_294.gif]

Construct the linear model y = X.β + ε.
A typical row of X.β is of the form  β0*1 + β1*Log[w]

In[192]:=

x1 = {1, 1, 1, 1, 1} ;

ps = Transpose[data][[2]] ;

ws = Transpose[data][[1]] ;

x2 = Map[Log, ws] ;

X = Transpose[{x1, x2}] ; 

βvec = {β0, β1} ;

y = ps ;

ϵ = {ϵ1, ϵ2, ϵ3, ϵ4, ϵ5} ; 

Print[y//MatrixForm, " = ", X//MatrixForm, ".", βvec//MatrixForm, " + ", ϵ//MatrixForm]

( {{91}, {98}, {103}, {110}, {112}} )  =  ( {{1, Log ... 304; +  ( {{ϵ1}, {ϵ2}, {ϵ3}, {ϵ4}, {ϵ5}} )

Use the normal equations.

In[201]:=

Transpose[X] . X ;

%//MatrixForm

Out[202]//MatrixForm=

( {{5, Log[44] + Log[61] + Log[81] + Log[113] + Log[131]}, {Log[44] + Log[61] + Log[81] + Log[113] + Log[131], Log[44]^2 + Log[61]^2 + Log[81]^2 + Log[113]^2 + Log[131]^2}} )

In[203]:=

Transpose[X] . y ;

%//MatrixForm

Out[204]//MatrixForm=

( {{514}, {91 Log[44] + 98 Log[61] + 103 Log[81] + 110 Log[113] + 112 Log[131]}} )

Now solve the system X^TXβ =  X^Ty.

In[205]:=

Print[Transpose[X] . X//N//MatrixForm, ".", βvec//MatrixForm, " = ", Transpose[X] . y//N//MatrixForm]

( {{5., 21.8921}, {21.8921, 96.6463}} )  .  ( {{β0}, {β1}} )  =  ( {{514.}, {2265.89}} )

In[206]:=

soln = Solve[Transpose[X] . X . βvec == Transpose[X] . y, {β0, β1}][[1]]//N

Out[206]=

{β017.9243, β119.385}

In[207]:=

βHat = {β0, β1} = {β0, β1}/.soln

Out[207]=

{17.9243, 19.385}

Interpret the result: The (least squares) function we seek is

In[208]:=

Clear[y, x] ; 

f[w_] = 1 ;

g[w_] = Log[w] ; 

y[w_] = βHat . {f[w], g[w]}

Out[211]=

17.9243 + 19.385 Log[w]

Now plot the data and its least squares approximation.

In[212]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, AxesLabel -> {w, p}, DisplayFunction -> Identity] ; 

curvePlot = Plot[y[w], {w, 40, 135}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

Show[dotPlot, curvePlot, DisplayFunction -> $DisplayFunction] ;

Print["least squares approximation: y[w] = ", y[w]]

[Graphics:HTMLFiles/6_fitting_data_328.gif]

least squares approximation: y[w] = 17.9243 + 19.385 Log[w]

In[216]:=

Print["The systolic blood pressure of a healthy child weighing 100 lbs should be about ", y[100] , " mm."]

The systolic blood pressure of a healthy child weighing 100 lbs should be about 107.196 mm.

Let's plot that.

In[217]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, AxesLabel -> {w, p}, DisplayFunction -> Identity] ; 

curvePlot = Plot[y[w], {w, 40, 135}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

extraDot = ListPlot[{{100, y[100]}}, PlotStyle -> {ForestGreen, PointSize[.03]}, AxesLabel -> {w, p}, DisplayFunction -> Identity] ; 

Show[dotPlot, curvePlot, extraDot, DisplayFunction -> $DisplayFunction] ;

[Graphics:HTMLFiles/6_fitting_data_336.gif]

Super!
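The fitted model can also be read in reverse (an extra step, not in Lay): since p = β0 + β1 Log[w], the weight at which the curve predicts a given pressure p is w = Exp[(p - β0)/β1]. For example, the predicted pressure reaches 120 mm Hg at a weight of

Exp[(120 - βHat[[1]])/βHat[[2]]]

which is a bit under 200 lbs.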

Takeoff Performance of an Airplane
Lay 6.6.13

Find a cubic polynomial that best fits the data.

In[221]:=

Clear[data, y, X, β, β0, β1, β2, β3, βHat, x, ϵ, w, p] ; 

ts = Range[0, 12] ;

hs = {0, 8.8, 29.9, 62., 104.7, 159.1, 222., 294.5, 380.4, 471.1, 571.7, 686.8, 809.2} ;

data = Transpose[{ts, hs}] ;

%//MatrixForm

Out[225]//MatrixForm=

( {{0, 0}, {1, 8.8}, {2, 29.9}, {3, 62.}, {4, 104.7}, {5, 159.1}, {6, 222.}, {7, 294.5}, {8, 380.4}, {9, 471.1}, {10, 571.7}, {11, 686.8}, {12, 809.2}} )

Visualize the data.

In[226]:=

dotPlot = ListPlot[data, AxesLabel {"t (sec)", "h (ft)"}, PlotStyle {Red, PointSize[.02]}] ;

[Graphics:HTMLFiles/6_fitting_data_344.gif]

Construct the linear model y = X.β + ε.
A typical row of X.β is of the form  β0*1 + β1*t + β2 t^2 + β3 t^3

In[227]:=

x1 = {1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1} ;

x2 = ts ;

x3 = x2^2 ;

x4 = x2 x3 ;

X = Transpose[{x1, x2, x3, x4}] ; 

βvec = {β0, β1, β2, β3} ;

y = hs ;

ϵ = {ϵ0, ϵ1, ϵ2, ϵ3, ϵ4, ϵ5, ϵ6, ϵ7, ϵ8, ϵ9, ϵ10, ϵ11, ϵ12} ; 

Print[y//MatrixForm, " = ", X//MatrixForm, ".", βvec//MatrixForm, " + ", ϵ//MatrixForm]

( {{0}, {8.8}, {29.9}, {62.}, {104.7}, {159.1}, {222.}, {294.5}, {380.4}, {471.1}, { ... {ϵ6}, {ϵ7}, {ϵ8}, {ϵ9}, {ϵ10}, {ϵ11}, {ϵ12}} )

Use the normal equations.

In[236]:=

Transpose[X] . X ;

%//MatrixForm

Out[237]//MatrixForm=

( {{13, 78, 650, 6084}, {78, 650, 6084, 60710}, {650, 6084, 60710, 630708}, {6084, 60710, 630708, 6735950}} )

In[238]:=

Transpose[X] . y ;

%//MatrixForm

Out[239]//MatrixForm=

( {{3800.2}, {35127.7}, {348064.}, {3.5998*10^6}} )

Now solve the system X^TXβ =  X^Ty.

In[240]:=

Print[Transpose[X] . X//N//MatrixForm, ".", βvec//MatrixForm, " = ", Transpose[X] . y//N//MatrixForm]

( {{13., 78., 650., 6084.}, {78., 650., 6084., 60710.}, {650., 6084., 60710., 630708 ... #62370;)  =  ( {{3800.2}, {35127.7}, {348064.}, {3.5998*10^6}} )

In[241]:=

soln = Solve[Transpose[X] . X . βvec == Transpose[X] . y, {β0, β1, β2, β3}][[1]]//N

Out[241]=

{β0 -0.855769, β14.70249, β25.55537, β3 -0.0273601}

In[242]:=

βHat = {β0, β1, β2, β3} = {β0, β1, β2, β3}/.soln

Out[242]=

{-0.855769, 4.70249, 5.55537, -0.0273601}

Interpret the result: The (least squares) function we seek is

In[243]:=

Clear[y, t] ; 

y[t_] = βHat . {1, t, t^2, t^3}

Out[244]=

-0.855769 + 4.70249 t + 5.55537 t^2 - 0.0273601 t^3

Now plot the data and its least squares approximation.

In[245]:=

dotPlot = ListPlot[data, PlotStyle -> {Red, PointSize[.02]}, AxesLabel -> {"t (sec)", "h (ft)"}, DisplayFunction -> Identity] ; 

curvePlot = Plot[y[t], {t, 0, 12}, PlotStyle -> SteelBlue, DisplayFunction -> Identity] ; 

Show[dotPlot, curvePlot, DisplayFunction -> $DisplayFunction] ;

Print["least squares approximation: y[t] = ", y[t]]

[Graphics:HTMLFiles/6_fitting_data_378.gif]

least squares approximation: y[t] =  -0.855769 + 4.70249 t + 5.55537 t^2 - 0.0273601 t^3

In[249]:=

Print["The velocity of the plane at t = 4.5 sec is y'[4.5] = ", y '[4.5] , " ft/sec."]

The velocity of the plane at t = 4.5 sec is y'[4.5] = 53.0387 ft/sec.
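The 53.0387 ft/sec comes from differentiating the fitted cubic; for reference (an extra step, not a separate part of the exercise), the whole velocity function is

y'[t]

which Mathematica reports as approximately 4.70249 + 11.1107 t - 0.0820803 t^2.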


Created by Mathematica  (December 11, 2004)