3D Geometry for Computer Graphics

The plan today
  • Least squares approach
    • General / Polynomial fitting
    • Linear systems of equations
    • Local polynomial surface fitting
Motivation

  • Given data points Pi = (xi, yi), fit a function y = f(x) that is "close" to the points

[Figure: data points Pi = (xi, yi) in the xy-plane with a fitted curve y = f(x)]

Motivation
  • Local surface fitting to 3D points
Line fitting

  • Orthogonal offset minimization – we already learned this (PCA)

[Figure: a line fitted to 2D points by minimizing orthogonal offsets]

Line fitting

  • y-offset minimization

[Figure: a line fitted to points Pi = (xi, yi) by minimizing vertical (y) offsets]

Line fitting

  • Find a line y = ax + b that minimizes E(a, b) = Σ [yi – (axi + b)]²
  • E(a, b) is quadratic in the unknown parameters a, b
  • Another option would be, for example, the sum of absolute offsets Σ |yi – (axi + b)|
  • But it is not differentiable, hence harder to minimize…

Line fitting – LS minimization

  • To find optimal a, b we differentiate E(a, b):

∂E/∂a = Σ (–2xi)[yi – (axi + b)] = 0

∂E/∂b = Σ (–2)[yi – (axi + b)] = 0

Line fitting – LS minimization

  • We get two linear equations for a, b:

Σ (–2xi)[yi – (axi + b)] = 0

Σ (–2)[yi – (axi + b)] = 0

Line fitting – LS minimization

  • We get two linear equations for a, b:

Σ [xiyi – axi² – bxi] = 0

Σ [yi – axi – b] = 0

Line fitting – LS minimization

  • We get two linear equations for a, b:

(Σ xi²) a + (Σ xi) b = Σ xiyi

(Σ xi) a + (Σ 1) b = Σ yi

Line fitting – LS minimization

  • Solve for a, b using e.g. Gaussian elimination
  • Question: why is this solution a minimum of the error function?

E(a, b) = Σ [yi – (axi + b)]²
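
A minimal numpy sketch (not part of the original slides) of this y-offset line fit: it assembles the 2×2 normal-equation system above and solves it directly; the noisy test data at the end is purely illustrative.

```python
import numpy as np

def fit_line_y_offsets(x, y):
    """Fit y = a*x + b by minimizing the sum of squared y-offsets.

    Assembles the 2x2 normal equations from the slides:
        (sum xi^2) a + (sum xi) b = sum xi*yi
        (sum xi)   a + (n)      b = sum yi
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    M = np.array([[np.sum(x * x), np.sum(x)],
                  [np.sum(x),     len(x)]])
    rhs = np.array([np.sum(x * y), np.sum(y)])
    a, b = np.linalg.solve(M, rhs)  # Gaussian elimination on the 2x2 system
    return a, b

# Illustrative data: noisy samples of y = 2x + 1
xs = np.linspace(0.0, 10.0, 50)
ys = 2.0 * xs + 1.0 + np.random.normal(scale=0.3, size=xs.shape)
print(fit_line_y_offsets(xs, ys))  # approximately (2, 1)
```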

Fitting polynomials

  • Decide on the degree of the polynomial, k
  • Want to fit f(x) = ak x^k + ak-1 x^(k-1) + … + a1 x + a0
  • Minimize:

E(a0, a1, …, ak) = Σ [yi – (ak xi^k + ak-1 xi^(k-1) + … + a1 xi + a0)]²

  • Setting each partial derivative to zero gives, for m = 0, …, k:

∂E/∂am = Σ (–2 xi^m)[yi – (ak xi^k + ak-1 xi^(k-1) + … + a0)] = 0

Fitting polynomials

  • We get a linear system of k+1 equations in k+1 variables
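
A short numpy sketch of the polynomial fit, assuming the standard monomial basis; the helper name and the use of np.linalg.lstsq are illustrative choices, not taken from the slides.

```python
import numpy as np

def fit_polynomial(x, y, k):
    """Least-squares fit of f(x) = ak x^k + ... + a1 x + a0 to points (xi, yi).

    np.vander builds the n x (k+1) design matrix with columns
    x^k, x^(k-1), ..., x, 1; lstsq then solves the associated
    (k+1) x (k+1) system for the coefficients.
    """
    A = np.vander(np.asarray(x, dtype=float), k + 1)
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return coeffs  # [ak, ..., a1, a0]
```
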
General parametric fitting

  • We can use this approach to fit any function f(x)
    • Specified by parameters a, b, c, …
    • As long as the expression f(x) depends linearly on the parameters a, b, c, …

General parametric fitting

  • Want to fit a function fabc…(x) to data points (xi, yi)
    • Define E(a, b, c, …) = Σ [yi – fabc…(xi)]²
    • Solve the linear system ∂E/∂a = 0, ∂E/∂b = 0, ∂E/∂c = 0, …

General parametric fitting

  • It can even be some crazy function of x – all that matters is that it is linear in the parameters
  • Or in general:

f(x) = α1 f1(x) + α2 f2(x) + … + αk fk(x),  where the fj are fixed basis functions
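
A sketch of such general parametric fitting in numpy; the particular basis functions at the bottom are hypothetical, chosen only to illustrate a model that is linear in its parameters.

```python
import numpy as np

def fit_linear_combination(x, y, basis):
    """Fit f(x) = alpha_1*f_1(x) + ... + alpha_k*f_k(x) in the LS sense.

    `basis` is a list of callables f_j; the model only has to be linear
    in the unknown coefficients alpha_j, the f_j themselves can be anything.
    """
    x = np.asarray(x, dtype=float)
    A = np.column_stack([f(x) for f in basis])  # A[i, j] = f_j(x_i)
    alphas, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return alphas

# Hypothetical basis, for illustration only: a*sin(x) + b*exp(-x) + c
basis = [np.sin, lambda t: np.exp(-t), lambda t: np.ones_like(t)]
```
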
Solving linear systems in LS sense

  • Let's look at the problem a little differently:
    • We have data points (xi, yi)
    • We want the function f(x) to go through the points:

∀ i = 1, …, n:  yi = f(xi)

    • Strict interpolation is in general not possible
      • With polynomials: n+1 points define a unique interpolation polynomial of degree n
      • So, if we have 1000 points and want a cubic polynomial, we probably won't find one that passes through all of them…

Solving linear systems in LS sense

  • We have an over-determined linear system (n equations, k unknowns, n > k):

f(x1) = α1 f1(x1) + α2 f2(x1) + … + αk fk(x1) = y1

f(x2) = α1 f1(x2) + α2 f2(x2) + … + αk fk(x2) = y2

…

f(xn) = α1 f1(xn) + α2 f2(xn) + … + αk fk(xn) = yn

Solving linear systems in LS sense

  • In matrix form: Av = y, where A is the n×k matrix with Aij = fj(xi) and v = (α1, …, αk) holds the unknown coefficients
  • More constraints than variables – in general, no exact solution exists
  • We want to find an "approximate solution": the v for which Av is closest to y, i.e. the v that minimizes ||Av – y||²

Finding the LS solution

  • v ∈ R^k
  • Av ∈ R^n
  • As we vary v, Av varies over the linear subspace of R^n spanned by the columns of A:

Av = [A1 | A2 | … | Ak] (v1, v2, …, vk)T = v1 A1 + v2 A2 + … + vk Ak

Finding the LS solution

  • We want to find the Av closest to y:

[Figure: y ∈ R^n and its closest point Av in the subspace of R^n spanned by the columns of A]

Finding the LS solution

  • The vector Av closest to y satisfies:

(Av – y) ⊥ {subspace spanned by the columns of A}

∀ column Ai:  ⟨Ai, Av – y⟩ = 0

∀ i:  AiT (Av – y) = 0

AT (Av – y) = 0

(ATA) v = ATy

These are called the normal equations.

Finding the LS solution

  • We got a square symmetric system (ATA)v = ATy (k×k)
  • If A has full rank (the columns of A are linearly independent) then (ATA) is invertible
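
A small numpy illustration (not from the slides) of the two equivalent routes, assuming A has full column rank; the random matrix and right-hand side are placeholders rather than real data.

```python
import numpy as np

# Illustrative over-determined system Av = y with n = 100 points and k = 3
# basis functions (sizes are arbitrary, chosen only for the demo).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 3))
y = rng.standard_normal(100)

# Normal equations: (A^T A) v = A^T y  (requires full column rank)
v_normal = np.linalg.solve(A.T @ A, A.T @ y)

# Equivalent but numerically more robust: a dedicated least-squares solver
v_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.allclose(v_normal, v_lstsq))  # True
```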
Weighted least squares

  • Sometimes the constraints in the problem also carry weights wi > 0, and instead of Σ [yi – f(xi)]² we minimize:

Σ wi [yi – f(xi)]²
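
One common way to handle such weights (a sketch, assuming the weighted criterion above) is to scale each row of the system by √wi and reuse an ordinary LS solver:

```python
import numpy as np

def weighted_lstsq(A, y, w):
    """Minimize sum_i w_i * ((A v)_i - y_i)^2 with w_i > 0.

    Multiplying each row of A and each entry of y by sqrt(w_i) turns the
    weighted problem into an ordinary least-squares problem.
    """
    sw = np.sqrt(np.asarray(w, dtype=float))
    v, *_ = np.linalg.lstsq(A * sw[:, None], np.asarray(y, dtype=float) * sw,
                            rcond=None)
    return v
```
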
Local surface fitting to 3D points
  • Normals?
  • Lighting?
  • Upsampling?
Local surface fitting to 3D points

  • Locally approximate a polynomial surface from the points

Fitting local polynomial

  • Fit a local polynomial around a point P

[Figure: a point P of the point set, with its reference plane and local X, Y, Z axes]

Fitting local polynomial surface

  • Compute a reference plane that fits the points close to P
  • Use the local basis defined by the normal to the plane!

[Figure: the neighborhood of P with the local x, y, z axes of the reference plane]
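
One common way to obtain such a reference plane and local basis is PCA of the neighborhood of P; the sketch below (not necessarily the method used in the original course) takes the least-variance direction as the normal.

```python
import numpy as np

def local_frame(neighbors):
    """Reference plane for a cluster of 3D points near P, via PCA.

    Returns the centroid and an orthonormal basis (x_axis, y_axis, normal).
    The normal is the direction of least variance; the other two principal
    directions span the reference plane.
    """
    P = np.asarray(neighbors, dtype=float)   # shape (m, 3)
    centroid = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - centroid)   # rows of Vt: principal directions
    x_axis, y_axis, normal = Vt              # last row = least-variance direction
    return centroid, x_axis, y_axis, normal
```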

Fitting local polynomial surface

  • Fit the polynomial z = p(x, y) = ax² + bxy + cy² + dx + ey + f

[Figure: a quadratic patch z = p(x, y) fitted over the local reference plane]

Fitting local polynomial surface

  • Again, solve the system in the LS sense:

ax1² + bx1y1 + cy1² + dx1 + ey1 + f = z1

ax2² + bx2y2 + cy2² + dx2 + ey2 + f = z2

. . .

axn² + bxnyn + cyn² + dxn + eyn + f = zn

  • Minimize Σ ||zi – p(xi, yi)||²
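
A compact numpy sketch of this quadratic patch fit, assuming the neighbor coordinates are already expressed in the local frame of P:

```python
import numpy as np

def fit_quadratic_patch(x, y, z):
    """Fit z = p(x, y) = a x^2 + b x y + c y^2 + d x + e y + f in the LS sense.

    x, y, z are the neighbor coordinates in the local frame of P
    (reference-plane coordinates x, y and height z along the normal).
    """
    x, y, z = (np.asarray(v, dtype=float) for v in (x, y, z))
    A = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs  # (a, b, c, d, e, f)
```
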
Fitting local polynomial surface

  • It is also possible (and better) to add weights:

Σ wi ||zi – p(xi, yi)||²,  wi > 0

  • The weights get smaller as the distance from P (the origin of the local frame) grows.
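
The slides do not specify the weight function; a Gaussian falloff with distance from P is a common choice, sketched here with a hypothetical bandwidth parameter h.

```python
import numpy as np

def distance_weights(x, y, h):
    """Weights that shrink as the distance from P (the local origin) grows.

    A Gaussian falloff is used as one common choice; h is a hypothetical
    bandwidth controlling how quickly a neighbor's influence decays.
    """
    d2 = np.asarray(x, dtype=float) ** 2 + np.asarray(y, dtype=float) ** 2
    return np.exp(-d2 / (h * h))

# Combined with the sketches above:
#   w = distance_weights(x, y, h)
#   A = np.column_stack([x*x, x*y, y*y, x, y, np.ones_like(x)])
#   coeffs = weighted_lstsq(A, z, w)
```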