Engineering Computation Curve Fitting 1

Curve Fitting by Least-Squares Regression and Spline Interpolation

Part 7


Engineering Computation Curve Fitting 2

Curve Fitting:

Given a set of points:

- experimental data

- tabular data

- etc.

Fit a curve (surface) to the points so that we can easily evaluate f(x) at any x of interest.

If x is within the data range: interpolating (generally safe)

If x is outside the data range: extrapolating (often dangerous)


Engineering Computation Curve Fitting 3

  • Curve Fitting:

  • Two main methods will be covered:

  • 1. Least-Squares Regression

    • Function is "best fit" to data.

    • Does not necessarily pass through points.

    • Used for scattered (experimental) data.

    • Can develop models for analysis/design.

  • 2. Interpolation

    • Function passes through all (or most) points.

    • Interpolates values of well-behaved (precise) data or for geometric design.


    Engineering Computation Curve Fitting & Interpolation

    [Figure: a curve fitted through data points, with the region inside the data labeled "interpolation" and the region beyond it labeled "extrapolation"]

    Curve Fitting:

    1. We have discussed Least-Squares Regression, where the function is "best fit" to the points but does not necessarily pass through them.

    2. We now discuss Interpolation & Extrapolation, where the function passes through all (or at least most) points.



Engineering Computation Least Squares Regression 6

    • Curve Fitting by Least-Squares Regression:

    • Objective:

      • Obtain a low-order approximation (curve or surface) that "best fits" the data.

  • Note:

    • Because the order of the approximation is less than the number of data points, the curve or surface cannot pass through all the points.

    • We will need a consistent criterion for determining the "best fit."

  • Typical Usage:

  • Scattered (experimental) data

  • Develop empirical models for analysis/design.


    Engineering Computation Least Squares Regression 7

    [Figure: scatter plot of measured data points (xi, yi) on x-y axes]

    Least-Squares Regression:

    1. In the laboratory, apply x, measure y, tabulate the data.

    2. Plot the data and examine the relationship.



    Engineering Computation Least Squares Regression 9

    • Least-Squares Regression (cont'd):

    • 3. Develop a "model", an approximate relationship between y and x:

    • y = m x + b

    • 4. Use the model to predict or estimate y for any given x.

    • 5. "Best fit" of the data requires:

      • An optimal way of finding the parameters (e.g., the slope and intercept of a straight line).

      • Perhaps optimizing the selection of the model form (i.e., linear, quadratic, exponential, ...).

      • That the magnitudes of the residual errors do not vary in any systematic fashion. [In statistical applications, the residual errors should be independent and identically distributed.]


    Engineering Computation Least Squares Regression 10

    Least-Squares Regression Procedure

    Given: n data points: (x1,y1), (x2,y2), ..., (xn,yn)

    Obtain: "Best fit" curve:

    f(x) = a0 Z0(x) + a1 Z1(x) + a2 Z2(x) + ... + am Zm(x)

    The ai's are the unknown parameters of the model; the Zi's are known functions of x.

    We will focus on two of the many possible types of regression models:

    Simple Linear Regression: Z0(x) = 1 and Z1(x) = x

    General Polynomial Regression: Z0(x) = 1, Z1(x) = x, Z2(x) = x^2, ..., Zm(x) = x^m
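Because the model is linear in the unknown ai's, fitting reduces to building a matrix of basis-function values. A minimal NumPy sketch (an illustration added here, not from the slides) of constructing that matrix for general polynomial regression:

```python
import numpy as np

# Sample x values (made up) and polynomial order m
x = np.array([0.0, 1.0, 2.0, 3.0])
m = 2

# Z[i, j] = Zj(xi) = xi**j, i.e. columns are Z0(x)=1, Z1(x)=x, Z2(x)=x^2
Z = np.vander(x, m + 1, increasing=True)

print(Z)
```

Each row corresponds to one data point and each column to one basis function, so Z has n rows and m+1 columns, matching the dimensions used on the linear-algebra slides below.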


    Example: MATLAB's regress function (Statistics Toolbox):

    b = REGRESS(y,X) returns the vector of regression coefficients, b, in the linear model y = Xb
    (X is an n-by-p matrix; y is the n-by-1 vector of observations).

    [B,BINT,R,RINT,STATS] = REGRESS(y,X,ALPHA) uses the input ALPHA to calculate 100(1 - ALPHA)% confidence intervals for B and for the residual vector R, returned in BINT and RINT respectively. The vector STATS contains the R-square statistic along with the F and p values for the regression.

    >> x = linspace(0,1,20)';
    >> y = 2*x + 1 + 0.1*randn(20,1);
    >> plot(x,y,'.')
    >> xx = [ones(20,1), x];
    >> b = regress(y,xx)

    b =

        1.0115
        1.9941

    >> yy = xx*b;
    >> hold on
    >> plot(x,yy,'k-')
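The same fit can be reproduced without the Statistics Toolbox; here is a rough Python/NumPy equivalent of the MATLAB session above (a sketch, not from the slides; np.linalg.lstsq returns the least-squares coefficients but, unlike regress, no confidence intervals):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 2 * x + 1 + 0.1 * rng.standard_normal(20)   # noisy line, as in the slide

# Design matrix with columns [1, x], matching xx = [ones(20,1), x]
X = np.column_stack([np.ones_like(x), x])

# Solve the least-squares problem y ~ X b
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b)        # approximately [1, 2]: intercept near 1, slope near 2

yy = X @ b      # fitted values, as in yy = xx*b
```

The exact coefficients differ from the MATLAB output because the random noise differs, but both should land near the true intercept 1 and slope 2.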


    Engineering Computation Least Squares Regression: General Procedure 12

    Least Squares Regression (cont'd):

    General Procedure:

    For the ith data point, (xi,yi), we find the set of coefficients for which:

    yi = a0 Z0(xi) + a1 Z1(xi) + ... + am Zm(xi) + ei

    where ei is the residual error, the difference between the reported value and the model:

    ei = yi – a0 Z0(xi) – a1 Z1(xi) – ... – am Zm(xi)

    Our "best fit" will minimize the total sum of the squares of the residuals:

    Sr = Σ ei^2   (summed over i = 1, ..., n)


    Engineering Computation Least Squares Regression: General Procedure 13

    [Figure: at x = xi, the residual ei is the vertical distance between the measured value yi and the modeled value on the fitted curve]

    Our "best fit" will be the function which minimizes the sum of squares of the residuals:

    Sr = Σ ei^2 = Σ (yi – a0 Z0(xi) – ... – am Zm(xi))^2


    Engineering Computation Least Squares Regression: General Procedure 14

    Least Squares Regression (cont'd):

    To minimize this expression with respect to the unknowns a0, a1, ..., am, take the derivatives of Sr and set them to zero:

    ∂Sr/∂ak = –2 Σ Zk(xi) (yi – a0 Z0(xi) – ... – am Zm(xi)) = 0,   k = 0, ..., m


    Engineering Computation Least Squares: Linear Algebra 15

    Least Squares Regression (cont'd):

    In linear algebra form:

    {Y} = [Z]{A} + {E}   or   {E} = {Y} – [Z]{A}

    where: {E} and {Y} are n x 1
           [Z] is n x (m+1)
           {A} is (m+1) x 1

    n = # of points, (m+1) = # of unknowns

    {E}T = [e1 e2 ... en]
    {Y}T = [y1 y2 ... yn]
    {A}T = [a0 a1 a2 ... am]


    Engineering Computation Least Squares: Sum Square Error 16

    Least Squares Regression (cont'd):

    {E} = {Y} – [Z]{A}

    Then

    Sr = {E}T{E} = ({Y} – [Z]{A})T ({Y} – [Z]{A})

       = {Y}T{Y} – {A}T[Z]T{Y} – {Y}T[Z]{A} + {A}T[Z]T[Z]{A}

       = {Y}T{Y} – 2{A}T[Z]T{Y} + {A}T[Z]T[Z]{A}

    Setting ∂Sr/∂ai = 0 for i = 0, ..., m yields:

    ∂Sr/∂{A} = 0 = 2[Z]T[Z]{A} – 2[Z]T{Y}

    or

    [Z]T[Z]{A} = [Z]T{Y}


    Engineering Computation Least Squares: Normal Equations 17

    Least Squares Regression (cont'd):

    [Z]T[Z]{A} = [Z]T{Y} (C&C Eq. 17.25)

    This is the general form of Normal Equations.

    They provide (m+1) equations in (m+1) unknowns.

    (Note that we end up with a system of linear equations.)
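The normal equations can be formed and solved directly with any linear solver; a short NumPy sketch with made-up data (illustrative only, not from the slides):

```python
import numpy as np

# Made-up data, roughly following y = 2x^2 + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 33.0])

# Z has columns Z0=1, Z1=x, Z2=x^2: m+1 = 3 unknowns, n = 5 points
Z = np.vander(x, 3, increasing=True)

# Form and solve the (m+1) x (m+1) normal equations Z^T Z {A} = Z^T {Y}
A = np.linalg.solve(Z.T @ Z, Z.T @ y)
print(A)   # coefficients a0, a1, a2
```

For larger or ill-conditioned problems a QR-based solver such as np.linalg.lstsq is preferred, since forming [Z]T[Z] explicitly squares the condition number of the system.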


    Engineering Computation Least Squares: Simple Linear Regression 18

  • Obtain: "Best fit" curve: f(x) = a0 + a1x

  • from the n equations:

  • y1 = a0 + a1x1 + e1

  • y2 = a0 + a1x2 + e2

  • ...

  • yn = a0 + a1xn + en

  • Or, in matrix form: [Z]T[Z]{A} = [Z]T{Y}


    Engineering Computation Least Squares: Simple Linear Regression 19

    Simple Linear Regression (m = 1):

    The Normal Equations

    [Z]T[Z]{A} = [Z]T{Y}

    upon multiplying out the matrices become the Normal Equations for Linear Regression, C&C Eqs. (17.4-5):

    n a0 + (Σxi) a1 = Σyi
    (Σxi) a0 + (Σxi^2) a1 = Σxiyi

    (This form works well for spreadsheets.)


    Engineering Computation Least Squares: Simple Linear Regression 20

    Simple Linear Regression (m = 1):

    [Z]T[Z]{A} = [Z]T{Y}

    Solving for {A} gives C&C equations (17.6) and (17.7):

    a1 = (n Σxiyi – Σxi Σyi) / (n Σxi^2 – (Σxi)^2)

    a0 = (Σxi^2 Σyi – Σxi Σxiyi) / (n Σxi^2 – (Σxi)^2)


    Engineering Computation Least Squares: Simple Linear Regression 21

    Simple Linear Regression (m = 1):

    [Z]T[Z]{A} = [Z]T{Y}

    A better version of the first normal equation is:

    a0 = mean(y) – a1 mean(x)

    which is easier to compute and numerically more stable, but the 2nd equation (for a1) remains the same:

    a1 = (n Σxiyi – Σxi Σyi) / (n Σxi^2 – (Σxi)^2)
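The closed-form solution for m = 1, using the more stable mean-based intercept, can be sketched as follows (made-up data, not from the slides):

```python
import numpy as np

# Made-up data, roughly y = 2x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.2, 5.9, 8.1, 9.9])

n = len(x)
# Slope: a1 = (n*sum(xy) - sum(x)*sum(y)) / (n*sum(x^2) - (sum(x))^2)
a1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x * x) - np.sum(x) ** 2)
# Intercept via the numerically safer form: a0 = mean(y) - a1*mean(x)
a0 = np.mean(y) - a1 * np.mean(x)
print(a0, a1)   # about 0.19 and 1.95 for this data
```

Note that this line passes through the point (mean(x), mean(y)), which is exactly what the mean-based form of the intercept expresses.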


    ENGRD 241 / CEE 241: Engineering Computation Curve Fitting 22

    Common Nonlinear Relations:

    Objective: Use linear equations for simplicity.

    Remedy: Transform the data into linear form and perform the regression.

    Given: data which appears as:

    • (1) Exponential-like curve: y = a1 e^(b1 x)

      • (e.g., population growth, radioactive decay, attenuation of a transmission line)

      • Can also use: ln(y) = ln(a1) + b1 x
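The logarithmic transform turns the exponential fit into a straight-line regression on (x, ln y); a NumPy sketch with made-up, noise-free data (so the parameters are recovered exactly; not part of the original slides):

```python
import numpy as np

# Made-up data following y = 3 * exp(0.5 x) exactly
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 3.0 * np.exp(0.5 * x)

# Regress ln(y) on x: intercept = ln(a1), slope = b1
X = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
a1, b1 = np.exp(coef[0]), coef[1]
print(a1, b1)   # recovers 3.0 and 0.5
```

As the next slide warns, with noisy data this transform changes the implied error distribution (it weights small y values more heavily), so error analysis should always be done on the untransformed values.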


    ENGRD 241 / CEE 241: Engineering Computation Curve Fitting 23

    • Common Nonlinear Relations:

    • (2) Power-like curve: y = a2 x^b2

    • Linearized form: ln(y) = ln(a2) + b2 ln(x)

    • (3) Saturation growth-rate curve: y = a3 x / (b3 + x)

      • (e.g., population growth under limiting conditions)

    • Be careful about the implied distribution of the errors. Always use the untransformed values for error analysis.

    [Figure: saturation growth-rate curves for a3 = 5 and b3 = 1..10]


    Engineering Computation Goodness of fit 24

    • Major Points in Least-Squares Regression:

    • In all regression models one is solving an overdetermined system of equations, i.e., more equations than unknowns.

    • How good is the fit?

      • Often based on a coefficient of determination, r2


    Engineering Computation Goodness of Fit 25

    r^2 compares the spread of the data about the regression line to the spread of the data about the mean.

    Spread of the data around the mean:

    St = Σ (yi – mean(y))^2

    Spread of the data around the regression line:

    Sr = Σ (yi – f(xi))^2


    Engineering Computation Goodness of Fit 26

    Coefficient of determination:

    r^2 = (St – Sr) / St

    describes how much of the variance is "explained" by the regression equation.

    • Want r^2 close to 1.0.

    • Doesn't work for comparing models with different numbers of parameters.

    • Be careful when using different transformations: always do the analysis on the untransformed data.


    Engineering Computation Standard Error of the Estimate 27

    Precision:

    If the spread of the points around the line is of similar magnitude along the entire range of the data, then one can use

    sy/x = sqrt( Sr / (n – (m+1)) ) = standard error of the estimate

    (the standard deviation in y)

    to describe the precision of the regression estimate (in which m+1 is the number of coefficients calculated for the fit, e.g., m+1 = 2 for linear regression).
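Both goodness-of-fit measures, r^2 and the standard error of the estimate, follow directly from the spread about the mean (St) and the spread about the regression line (Sr); a NumPy sketch with made-up data (illustrative only, not from the slides):

```python
import numpy as np

# Made-up data, roughly y = 2x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.2, 5.9, 8.1, 9.9])

# Fit a straight line (m + 1 = 2 coefficients)
X = np.column_stack([np.ones_like(x), x])
a, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ a

St = np.sum((y - np.mean(y)) ** 2)   # spread of the data about the mean
Sr = np.sum((y - yhat) ** 2)         # spread of the data about the regression line

r2 = (St - Sr) / St                  # coefficient of determination
s_yx = np.sqrt(Sr / (len(x) - 2))    # standard error of the estimate, n - (m+1)
print(r2, s_yx)
```

For this nearly linear data, r^2 comes out very close to 1 and sy/x is small, consistent with a precise fit.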


    Engineering Computation Standard Error of the Estimate 28

    Statistics

    Chapra and Canale in sections PT5.2, 17.1.3 and 17.4.3 discuss the statistical interpretation of least squares regression and some of the associated statistical concepts.

    The statistical theory of least squares regression is elegant, powerful, and widely used in the analysis of real data throughout the sciences.

    See Lecture Notes pages X-14 through X-16.

